Tuesday, June 24, 2008

Search engine optimization

From Wikipedia

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit a page, or URL, to the various engines, which would send a spider to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. There, a second program, known as an indexer, extracts information about the page, such as the words it contains and where they are located, any weight for specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results. They also recognized that the higher their site ranked, the more people would click through to it. According to industry analyst Danny Sullivan, the earliest known use of the phrase "search engine optimization" was a spam message posted on Usenet on July 26, 1997.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provided a guide to each page's content. But using meta data to index pages proved unreliable, because the webmaster's choice of keywords in the meta tag was not always an accurate reflection of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags caused pages to rank for irrelevant searches.[3] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.

By relying so much on factors exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be manipulated would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, following links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random surfer.
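The random-surfer idea above can be sketched with a few lines of Python. This is only an illustrative toy, not Google's implementation: the three-page link graph, the 0.85 damping factor, and the fixed iteration count are all assumptions made for the example.

```python
# A minimal power-iteration sketch of the "random surfer" PageRank idea.
# The tiny link graph and the 0.85 damping factor are illustrative
# assumptions, not Google's actual data or algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """Estimate PageRank for a dict mapping page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal probability
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # otherwise the surfer follows one outbound link at random
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# Pages with more (and stronger) inbound links end up with higher scores:
# here C, linked from both A and B, outranks B, linked only from A.
```

Because the scores form a probability distribution over pages, they always sum to 1, which matches the "likelihood of being reached by the random surfer" interpretation in the paragraph above.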

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors such as PageRank and hyperlink analysis were considered, as well as on-page factors, to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaining PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.

To reduce the impact of link schemes, search engines, by 2007, considered a wide range of undisclosed factors for their ranking algorithms. Google says it ranks sites using more than 200 different signals. The three leading search engines, Google, Yahoo and Microsoft's Live Search, do not disclose the algorithms they use to rank pages. Notable SEOs, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.


Wednesday, January 9, 2008

Top 5 Mistakes in Web Site Optimization

Web site optimization is the practice of making your site search-friendly both to search engines and searchers. There are five common mistakes that people tend to make when beginning to optimize their sites for search engines.

1) Web Site Optimization Mistake : Lack of Targeted Keywords and Phrases
A lack of targeted keywords and phrases can sink your site. When a searcher goes to a search engine or directory and types in a keyword or keyword phrase that you would like to be found for, but you don't have those particular keywords and phrases on your site, then that searcher will most likely not be able to find you.

2) Web Site Optimization Mistake : Poorly Written Content
Site content is the absolute number one draw for both search engine spiders and search engine/directory users. If your content is badly written, is never updated, or is not relevant to what you want to be found for, then your site is neither search engine friendly nor user-friendly. In addition, your content absolutely must include targeted keywords and phrases (see mistake number 1). Well-written content is what search engine spiders feed on, and content is the key to high rankings.

3) Web Site Optimization Mistake : Black Hat SEO Techniques
Black Hat web site optimization includes techniques that are unethical and penalized by search engines. These techniques include keyword stuffing, doorway pages, invisible text, and more. These practices can raise your site's ranking in the short term. However, search engines have become very aware of Black Hat SEO techniques and can spot them pretty easily (and ban your site). Don't go for the quick fix; learn how to optimize your site for the long haul.

4) Web Site Optimization Mistake : Using Meta Tags Improperly or Not at All
Meta tags include title tags, keyword tags, and description tags. While not the single most important component in Web site optimization (that spot is reserved for keywords and content), they do have an important part to play. Make sure to include all these tags on every page of your site, and target them for each individual page.
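One way to check the advice above is to scan each page for the three tags mentioned. The sketch below uses Python's standard html.parser to verify that a page has a title, a keywords meta tag, and a description meta tag; the sample page and its content are hypothetical.

```python
# A small sketch that checks a page for the three tags discussed above:
# <title>, a keywords meta tag, and a description meta tag.
# The sample HTML is a made-up example, not a real site.

from html.parser import HTMLParser

class MetaTagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = {"title": False, "keywords": False, "description": False}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            # Count the tag only if it actually carries content.
            if name in ("keywords", "description") and attrs.get("content"):
                self.found[name] = True

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found["title"] = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

page = """<html><head>
<title>Acme Widgets - Blue Widgets and Widget Repair</title>
<meta name="keywords" content="blue widgets, widget repair">
<meta name="description" content="Acme sells and repairs blue widgets.">
</head><body>...</body></html>"""

checker = MetaTagChecker()
checker.feed(page)
missing = [tag for tag, ok in checker.found.items() if not ok]
# An empty `missing` list means all three tags are present on this page.
```

Running a check like this across every page helps catch the "not at all" half of the mistake; targeting the tag text to each individual page still has to be done by hand.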

5) Web Site Optimization Mistake : A Badly Designed Site
Ineffective site design includes a lack of clear navigation, poorly designed frames, and large, load-intensive graphics. Search engine spiders can actually be blocked from crawling sites that are not well designed, so design is a major part of Web site optimization. Plus, if users can't find what they want on your site quickly, they'll find another site.