SEO History

Webmasters and content providers began optimising sites for search engines in the 1990s, as the first search engines were cataloguing the early World Wide Web. Initially, all a webmaster needed to do was submit a page, or URL, to the various search engines, which would send a spider to crawl that page, extract links to other pages from it, and return information found on the page to be indexed. The process involved a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, then extracted information about the page: the words it contained, where those words were located, any weight given to specific words, and all the links the page contained. The extracted links were placed into a scheduler for crawling at a later date.
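
As a rough illustration of that early crawl-and-index pipeline, here is a minimal sketch in standard-library Python. The seed URL, the page limit, and all names in it are illustrative assumptions, not any search engine's actual code; it simply downloads a page, records each word with its position in a simple inverted index, and queues the page's links in a scheduler for a later crawl.

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkAndTextExtractor(HTMLParser):
        """Collects href links and visible words from a page
        (a toy: it does not filter script/style text)."""
        def __init__(self):
            super().__init__()
            self.links = []
            self.words = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            self.words.extend(data.split())

    def crawl(seed_url, max_pages=5):
        """Toy spider: download a page, index its words by position,
        and push discovered links onto a scheduler queue."""
        scheduler = deque([seed_url])   # links awaiting a later crawl
        index = {}                      # word -> list of (url, position)
        seen = set()
        while scheduler and len(seen) < max_pages:
            url = scheduler.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
            except OSError:
                continue  # unreachable pages are simply skipped
            parser = LinkAndTextExtractor()
            parser.feed(html)
            # Indexer step: record each word and where it appears on the page.
            for position, word in enumerate(parser.words):
                index.setdefault(word.lower(), []).append((url, position))
            # Scheduler step: queue extracted links for crawling later.
            for link in parser.links:
                scheduler.append(urljoin(url, link))
        return index

    if __name__ == "__main__":
        # "https://example.com" is a placeholder seed, not a real target.
        idx = crawl("https://example.com")
        print(f"indexed {len(idx)} distinct words")

A real spider would also weight words (for example, favouring those in titles and headings), respect robots.txt, and handle duplicates and errors far more carefully; none of that is shown here.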

Site owners soon recognised the value of having their sites ranked highly and visible in search engine results: the higher a site ranked, the more people clicked through to it. Because early search engines relied so heavily on factors exclusively within a webmaster's control, they suffered from abuse and ranking manipulation. Search engines responded by developing more complex ranking algorithms that took into account additional factors which were more difficult for webmasters to manipulate.

By 2007, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.
