LinkAdage's Take On Google's New Search Engine Patent

Has Google thrown the cyber world a curveball? Let's fill in some blanks and connect a few dots regarding the recently published patent application for Google's latest search algorithm - nicknamed Search Engine 125, after the reference numeral the patent's diagrams assign to the search engine. For those unfamiliar with the inner workings of search engines, each engine uses its own unique formula to determine that all-important ranking for each website. Remember, users who query a search engine rarely look beyond the first page of results, so if you want to increase visitor traffic, step one is to develop your website in a way that matches the major search engines' ranking algorithms. You need to find out what the search engines like and make sure you feed it to them.

Now, over the years, the formulae used by search engines to rank a site have grown more complex. Pre-2000, search engines didn't do much more than count keywords on a page. The more times the words 'limburger cheese' appeared on the site, the higher the site climbed on the limburger cheese search engine results page (SERP). Of course, the key then became to develop SEO text with limburger cheese mentioned in every header, twice in subheads and at least once in every paragraph. Hardly compelling reading, except for the most avid of limburger cheese fans.
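To make that concrete, here's a minimal sketch of what keyword-count ranking amounted to. This is an illustration, not any engine's actual code; the page text and scoring are invented.

```python
# Illustrative pre-2000 "keyword counting" ranker - not any engine's real
# code, just the idea: more mentions of the query, higher rank.

def keyword_score(page_text: str, query: str) -> int:
    """Count how many times the query phrase appears in the page."""
    return page_text.lower().count(query.lower())

pages = {
    "cheese-stuffed.example": "Limburger cheese! Limburger cheese! Limburger cheese!",
    "dairy-review.example": "A thoughtful review of limburger cheese and its history.",
}

# Rank purely by raw count - hence the keyword-stuffed SEO copy of the era.
ranked = sorted(pages, key=lambda url: keyword_score(pages[url], "limburger cheese"),
                reverse=True)
print(ranked)  # the stuffed page wins every time
```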

So, the Google, Yahoo, and MSN search engines moved to improve the quality of their SERPs and provide users with helpful, expert information. Changes were made to the keyword algorithms (the weighting formulae), awarding more points for things like the quality of inbound and outbound links to and from a site. This meant that quality links from a relevant 'authority' site - a highly-prized designation - would move your site up in the SERPs.
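In spirit, the shift looked something like the sketch below. The weights and the 0-to-1 'trust' scores are pure invention on my part - the real formulae are closely guarded - but the point stands: one good link now beat a pile of extra keywords.

```python
# Hypothetical post-keyword-counting score: keyword hits still matter, but
# links from trusted "authority" sites add heavily weighted points.
# Every number here is made up for illustration.

def page_score(keyword_hits: int, inbound_link_trust: list[float]) -> float:
    """inbound_link_trust holds a 0..1 quality score per linking site."""
    KEYWORD_WEIGHT = 1.0
    LINK_WEIGHT = 10.0  # a trusted, relevant link is worth many keyword hits
    return KEYWORD_WEIGHT * keyword_hits + LINK_WEIGHT * sum(inbound_link_trust)

# Two keyword hits plus one authority link beats ten raw keyword hits.
print(page_score(keyword_hits=2, inbound_link_trust=[0.9]))  # 11.0
print(page_score(keyword_hits=10, inbound_link_trust=[]))    # 10.0
```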

Well, Google's patent application for its latest search algorithm was published on March 31, 2005. For those who have no fear of their brains exploding from buzzword overload, do a search on "Patent Application 20050071741" to read the entire document. The application describes "a method for scoring a document comprising: identifying the document; obtaining one or more types of history (sic) data associated with the document; and generating a score for the document based on the one or more types of historical data."
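Stripped of the legalese, the claimed method is a three-step pipeline. Here's a bare-bones sketch of that flow; the function names and placeholder signals are mine, since the application describes what gets scored, not how:

```python
# The three steps of the claim, as a skeleton. Every body below is a
# stand-in - the combination step is precisely the part Google keeps secret.

def identify_document(url: str) -> str:
    # Step 1: identify the document.
    return url

def obtain_historical_data(doc: str) -> dict:
    # Step 2: obtain one or more types of history data for the document.
    return {"inception_date": None, "link_ages": [], "content_changes": 0}

def generate_score(history: dict) -> float:
    # Step 3: generate a score from the historical data (the secret sauce).
    return 0.0

def score_document(url: str) -> float:
    doc = identify_document(url)
    return generate_score(obtain_historical_data(doc))

print(score_document("http://example.com"))
```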

Apparently (or not), Google has determined that the historical data associated with each site is an essential ingredient in delivering the highest-quality results to users who query. And just what kind of historical data are we talking about here? Things like:

* the site's inception date (more likely the date the Search Engine noticed you)
* how frequently documents are added to and removed from the site
* how often sites change over time
* number of visitors over time
* number of repeat visitors
* number of times your site is bookmarked
* how often keyword density is changed
* the rate at which the site's anchor text is revised
* inbound/outbound links - how long they have been in place, and whether they are high-trust (quality) links

The list goes on and on. Factors associated with your domain include how long your site has been registered, whether the domain has expired (ghost sites), and whether the domain is stable - as in not moving from one hosting address to another.
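Gathered into a single record, the site-level and domain-level signals above might look like the following sketch. The field names and types are my own invention; the patent application defines no such schema.

```python
# Hypothetical record of the historical and domain signals discussed above.
# Purely illustrative - the application names the signals, not a data model.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SiteHistory:
    inception_date: date                  # when the engine first noticed the site
    doc_churn_per_month: float            # documents added to / removed from the site
    content_change_rate: float            # how often the site changes over time
    monthly_visitors: list[int] = field(default_factory=list)
    repeat_visitors: int = 0
    bookmark_count: int = 0
    keyword_density_edits: int = 0        # how often keyword density is changed
    anchor_text_revisions: int = 0        # rate of anchor-text rewrites
    link_ages_days: list[int] = field(default_factory=list)
    link_trust_scores: list[float] = field(default_factory=list)
    # Domain-level factors:
    domain_registered_since: date | None = None
    domain_expired: bool = False          # the "ghost site" flag
    hosting_moves: int = 0                # stability: changes of hosting address
```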

Links remain a key component of Search Engine 125. Links have to be relevant to your site, and links to your site gain "SERP power" as they age. Link growth should be slow and steady; a sudden influx of inbound links - especially links that have no relationship to the content of your site - is a surefire way to drop in the SERPs, since Google gives such sites a much lower score.
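One way to picture the 'slow and steady' rule - and again, the saturation curve, the 10x threshold and the 50% penalty below are numbers I invented, not anything from the patent:

```python
# Illustrative link aging and influx detection. All constants are invented.
import math

def link_value(age_days: int, trust: float) -> float:
    """A link's weight grows with age, leveling off after roughly a year."""
    return trust * (1 - math.exp(-age_days / 365))

def influx_penalty(new_links_this_month: int, monthly_average: float) -> float:
    """Discount the link score when growth far exceeds the historical norm."""
    if monthly_average > 0 and new_links_this_month > 10 * monthly_average:
        return 0.5  # suspected link-buying burst: halve the link score
    return 1.0

links = [(400, 0.9), (30, 0.9)]  # (age_days, trust): the old link counts most
score = sum(link_value(age, t) for age, t in links) * influx_penalty(3, 2.5)
print(round(score, 3))
```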

How about data on your visitor traffic? How will Search Engine 125 weigh that? The number of visitors, growth in visitor rates, spikes in visitor rates, the length of each visitor's stay, and the number of bookmarks and favorites listings for your site all enter into Google's new algorithm, according to the patent application.
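As a toy illustration of what 'weighing' traffic history could mean (the 3x-median spike rule is mine, not Google's):

```python
# Toy traffic signals: overall growth plus a crude spike detector.
from statistics import median

def traffic_signals(monthly_visitors: list[int]) -> dict:
    """Summarize a visitor series into growth and spike counts."""
    growth = (monthly_visitors[-1] - monthly_visitors[0]) / max(monthly_visitors[0], 1)
    spike_cutoff = 3 * median(monthly_visitors)
    spikes = sum(1 for v in monthly_visitors if v > spike_cutoff)
    return {"overall_growth": growth, "spike_months": spikes}

print(traffic_signals([1000, 1100, 1250, 1400, 9000]))
# {'overall_growth': 8.0, 'spike_months': 1} - that last month stands out
```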

Another weighting factor is search behavior: the number of searches using a given query word or phrase, a sudden increase or decrease in click-through rates, an exceedingly large number of quick click-throughs (which might indicate 'stale' content) - again, all factors that Google believes will improve the quality of its search results.
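A quick sketch of the 'stale content' idea - the five-second cutoff is an assumption of mine, purely for illustration:

```python
# Illustrative quick-click-through check: visitors who bounce back to the
# results page within seconds may signal stale content. Cutoff is invented.

def stale_content_signal(dwell_times_sec: list[float], quick_cutoff: float = 5.0) -> float:
    """Fraction of clicks abandoned almost immediately (0.0 none, 1.0 all)."""
    if not dwell_times_sec:
        return 0.0
    quick = sum(1 for t in dwell_times_sec if t < quick_cutoff)
    return quick / len(dwell_times_sec)

print(stale_content_signal([2.1, 3.0, 4.5, 40.0, 2.2]))  # 0.8 - likely stale
```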

Other factors are also listed in the patent application. A site with frequent ups and downs in traffic will lose points for untrustworthiness (even if your site sells only seasonal items!). Keyword volatility, shifts in topical focus and other variables will also be employed in Google's never-ending quest to quantify the quality of each site its search engine delivers to users based on their queries.
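To see why seasonal sellers might worry, here's one naive way to measure that bumpiness - a coefficient of variation, which is my illustration, not a formula from the application:

```python
# Naive volatility measure: stdev / mean of monthly traffic. A perfectly
# legitimate seasonal shop scores "volatile" too - the article's exact worry.
from statistics import mean, stdev

def volatility(series: list[float]) -> float:
    """Higher means bumpier traffic, which the patent treats as suspect."""
    return stdev(series) / mean(series)

steady = [100, 105, 98, 102, 101, 99]
seasonal = [20, 25, 30, 400, 380, 22]  # e.g. a Halloween-costume shop
print(round(volatility(steady), 2), round(volatility(seasonal), 2))  # 0.02 1.29
```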

So, okay, where's the mystery? The intrigue? The disinformation? The e-commerce community is abuzz with speculation that Google's well-publicized patent is nothing more than a plant - disinformation intended to keep the competition and SEOs off balance. Why the speculation? Even a quick scan of the patent application reveals large gray areas, vague criteria and downright inconsistencies within Google's proposed ranking factors. For example, sites are penalized for changing content often (untrustworthy) yet rewarded for the frequent addition of new content (freshness). A paradox, you say? Or all part of Google's master plan to feint right while going left?

The object, in the end, is quality search results. That's what Google, Yahoo and the other popular search engines want - that perfect equation, the ideal formula that will deliver high-quality results. For site owners and designers who do, in fact, keep their sites fresh, who offer quality links useful to visitors, and who deliver the information the user is looking for, there's no reason for concern. The owners of link farms, keyword-stuffed sites and cyber garbage dumps, however, should sit up and take notice. In the end, quality search engines will inevitably improve the quality of content available on the Internet.