
Weakest Link


Introduction

If you were expecting to find a website devoted to a disciplinarian lady dressed in black, then you have come to the wrong place.

This is not about that popular television quiz show, Weakest Link, hosted by Anne Robinson.  That was first shown by the BBC in the UK on 14 August 2000, and you can now even play an online version of that game.  It goes without saying that such a topic is not covered by SEO Scoop.

I am sure everyone instantly guesses what we will be talking about.  It clearly must be something to do with Google.  I should quickly clarify that this has only the vaguest connection with an earlier post this week, The Weakest Button - An Open Letter to Matt Cutts, Google.

According to his comment in the Sphinn item on that post, Matt Cutts found the suggested mock-up for the Classic Search buttons interesting.  Whether anyone in Google will find what is written below of interest remains to be seen.

The title was chosen because in the game, Weakest Link, contestants were excluded if they did not measure up to the required standards.  That is the concept that will be developed in this article.

Links

As most people realize, links (short for hyperlinks) are an important factor in Google's search algorithms.  At one point, Google called them backlinks but now seems to use the more precise term, inbound links.  Yahoo uses the term inlinks to mean the same thing.  One good source of information on Google's thinking is the Google Webmaster Central Blog.  As its tagline says, it offers Official News On Crawling And Indexing Sites For The Google Index.

Here is what they say about inbound links.

Inbound links are links from pages on external sites linking back to your site. Inbound links can bring new users to your site, and when the links are merit-based and freely-volunteered as an editorial choice, they're also one of the positive signals to Google about your site's importance. Other signals include things like our analysis of your site's content, its relevance to a geographic location, etc. As many of you know, relevant, quality inbound links can affect your PageRank (one of many factors in our ranking algorithm). And quality links often come naturally to sites with compelling content or offering a unique service.

They have some useful suggestions on how to increase merit-based inbound links.  In summary, you should create unique and compelling content on your web site.

You will find much more detail in that blog post, and they also encourage you to use the Webmaster Tools website to see how well you have done in attracting merit-based inbound links.

Current Google Algorithm

Although the Webmaster Central Blog gives useful information, it is a little general.  How can we develop more precise guidance on what to do?

The Google search approach relies on PageRank technology:

PageRank reflects our view of the importance of web pages by considering more than 500 million variables and 2 billion terms. Pages that we believe are important pages receive a higher PageRank and are more likely to appear at the top of the search results.

PageRank also considers the importance of each page that casts a vote, as votes from some pages are considered to have greater value, thus giving the linked page greater value. We have always taken a pragmatic approach to help improve search quality and create useful products, and our technology uses the collective intelligence of the web to determine a page's importance.
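
Google does not reveal the details of its current implementation, but the core idea from the original PageRank paper is easy to sketch.  The following Python fragment is a minimal illustration only; the tiny link graph, the damping factor and the iteration count are all invented for the example and say nothing about what Google actually runs today.

    # Minimal sketch of the published PageRank idea: each page splits its
    # score equally among its outbound links, so votes from strong pages
    # are worth more.  All data here is invented for illustration.
    links = {
        "A": ["C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C", "A"],
    }

    damping = 0.85                                  # value from the original paper
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}       # start from a uniform score

    for _ in range(50):                             # iterate until the scores settle
        new_pr = {}
        for p in pages:
            # Every page q linking to p passes on a share of its own score,
            # divided equally among all of q's outbound links.
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new_pr[p] = (1 - damping) / len(pages) + damping * incoming
        pr = new_pr

    # C, the most linked-to page in this toy graph, comes out on top.
    print(sorted(pr.items(), key=lambda kv: -kv[1]))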

Getting Quality Votes - Link Juice

A popular term for the value of the vote from another website that provides an inbound link to your website is Link Juice.  Any given web page has only a certain amount of PageRank, and it is distributed across all the links leaving that page.  Thus an authoritative web page with few outbound links provides more Link Juice through each of those links than a weaker web page with many outbound links.  The following three articles from 2007 provide more information on Link Juice and are still valid.
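
As a back-of-the-envelope illustration of how the juice gets divided (all the figures below are invented), compare the per-link share passed on by a strong page with few outbound links and a weak page with many:

    # Hypothetical numbers purely for illustration: the share of PageRank
    # passed through each link is roughly the page's score divided by the
    # number of links leaving it.
    strong_page_pr, strong_outlinks = 8.0, 5        # authoritative page, few links
    weak_page_pr, weak_outlinks = 1.0, 200          # weak page, many links

    juice_per_link_strong = strong_page_pr / strong_outlinks   # 1.6 per link
    juice_per_link_weak = weak_page_pr / weak_outlinks          # 0.005 per link

    # One link from the strong page is worth several hundred from the weak one.
    print(juice_per_link_strong / juice_per_link_weak)          # 320.0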

The problem with this approach is that once everyone knew that the more inbound links the better, such links were created by whatever methods would work.  The vast majority of such links could never be thought of as merit-based inbound links.  The Google solution to partially correct this was to introduce the nofollow tag.

Nofollow

There seems to be some confusion about the nofollow tag, but here is the Google explanation:

How does Google handle nofollowed links?

We don't follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it is important to note that other search engines may handle nofollow in slightly different ways.
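
To make that concrete, here is a small, hypothetical Python sketch of what dropping nofollowed links from a link graph might look like when a crawler parses a page.  The class, the helper names and the sample HTML are inventions for illustration; this is not Google's code.

    from html.parser import HTMLParser

    class LinkGraphParser(HTMLParser):
        """Collects outbound links, skipping any anchor marked rel="nofollow"."""

        def __init__(self):
            super().__init__()
            self.followed_links = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            rel_values = (attrs.get("rel") or "").lower().split()
            if "nofollow" in rel_values:
                return                              # dropped from the link graph entirely
            if attrs.get("href"):
                self.followed_links.append(attrs["href"])

    page_html = """
    <a href="https://example.com/editorial">an editorial link</a>
    <a href="https://example.com/paid" rel="nofollow">a paid link</a>
    """

    parser = LinkGraphParser()
    parser.feed(page_html)
    print(parser.followed_links)                    # only the editorial link remains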

The nofollow tag must be assigned by the owner of the web page, and the decision on when to use it seems to rest on intent.  If the owner of the website has received some benefit, usually cash, for placing the outbound link and the intent is to influence PageRank, then the link should be nofollowed.  This of course then nullifies the PageRank influence.

This paradoxical situation leaves a degree of arbitrariness in its application.  Some websites such as Twitter apply the nofollow tag on all outbound links.  This is why you will see cries from the heart that Google and/or Twitter Need to Ditch Nofollow for All Our Sakes!  In consequence, Julie Joyce provides guidance on How To Avoid The Link Vacuum Effect.  In effect it comes back to developing worthwhile content.

The Present State Of The Web

It is interesting to take a big-picture view of the web under the influences described above.  A whole industry has developed to create links to influence Google rankings.  Armies of individuals arrange for pairs of website owners to exchange reciprocal links.  Or links can arise through multitudes of websites created purely to contain links to other websites.  Although Google has clearly stated that any such links created purely to influence rankings are counter to its Quality Guidelines, this seems to have little effect on the flood.  For reference, here is what those guidelines state on links:

Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

A Disclaimer

Google carefully guards the secrets of its search algorithms.  Accordingly, the following is based purely on speculation and may be completely in error.

It is based on the assumption that almost all web pages are included in the Google databases and have some PageRank value, however small.  It should be noted that this is not the PageRank value displayed by the Toolbar PageRank gauge.  Instead, it is the precise mathematical value used within the algorithms.

It is also assumed that only a small proportion of Web pages are excluded from these databases.

Possible Alternative Algorithm

The assumption is made that the score that measures relevance for a given web page in a given keyword query includes, as one factor, the sum of a very large number of PageRank contributions from inbound links from other web pages.  Even though each inbound link provides a minuscule contribution, the sum from thousands and thousands of inbound links can produce a measurable contribution to relevance.  This is why spammers generate thousands of web pages pointing at target web pages, which can then rank high in keyword rankings.

The alternative that is being suggested here is that most outbound links from most web pages would be assigned a zero PageRank value for algorithmic purposes.  Only those outbound links with a PageRank contribution above a certain threshold value would retain their normal PageRank value in the algorithmic calculations.  These would be a very tiny fraction of all outbound links from all web pages.
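
A tiny numerical sketch may make the suggestion clearer.  The threshold value and the per-link contributions below are invented purely to show the effect of the cutoff:

    # Sketch of the suggested "weakest link" cutoff: per-link PageRank
    # contributions below a threshold are simply treated as zero.
    # The threshold and the contributions are invented for illustration.
    THRESHOLD = 0.01

    # Per-link contributions reaching a target page from its inbound links:
    # two strong editorial links plus ten thousand near-worthless ones.
    contributions = [0.50, 0.12, 0.0004, 0.0002] + [0.0001] * 10_000

    current_score = sum(contributions)                          # every link counts
    alternative_score = sum(c for c in contributions if c >= THRESHOLD)

    print(round(current_score, 3))      # ~1.621: the weak links add up today
    print(round(alternative_score, 3))  # 0.62: only the two strong links survive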

This would mean that the algorithms would ignore (setting as 0) the potential PageRank contributions from the great mass of weak outbound links, such as those created through the link schemes described above.

Benefits Of This Alternative Algorithm

At present the general view is that any link is worth having, even if its contribution is incredibly small.  The more links the merrier.

If this alternative algorithm thinking has any merit and is accepted, then its logic, that the weakest links are out, can be explained and widely publicized.

Given the cutoff arrangement in this alternative algorithm, it would no longer be true that every link has some value.  This simple and clear statement should encourage people to go for content that is valuable and stop wasting their time on links of dubious value.

How To Work Your Links At Present

Even if this alternative algorithm is not accepted, it is probably wise to behave as if it were true.  It is much better to put effort into getting worthwhile inbound links than to go after thousands of possibly dubious links with probably minuscule benefit.

With the present algorithm, actions that go against the quality guidelines may or may not damage the keyword ranking of your web pages in any given period.  If you decide to take the risk, you can always use 'throw-away' domains in case your actions are spotted and your website is penalized.  With the alternative algorithm, it would be very clear that actions that go against the quality guidelines can be shown mathematically to have zero effect.

Conclusion

This suggestion is very speculative.  What are your reactions?  Do you think it would have a beneficial effect?  Please add your comments.  There are no wrong answers.