Inbound


  • The Evolution of PageRank

    PageRank used to be a simple weighting factor for all links, regardless of the topic of the page containing the link. This gave rise to a small industry focused on buying and selling high-PageRank links. However, when anyone can achieve high rankings simply by buying enough links from any website, or by trading links with any unrelated website, PageRank loses its value as a factor for ranking websites accurately.

    As a result, Google has tweaked how it analyzes the value of links. Links are now scored differently, and some links may not count as much as they used to. PageRank as the defining metric for links is becoming less important, while the variations listed below are becoming more important.

    Topic-Sensitive PageRank

    Topic-Sensitive PageRank computes link value based only on incoming links from pages within the result set returned for a given search query (whether that result set contains 100 or 10,000 pages is not known).

    This means that a flower site only gets links counted from other sites related to flowers and gardening – not from sites about mortgage loans, for example.

    By using Topic-sensitive PageRank, Google hopes to filter out irrelevant links that have skewed the value of PageRank in the past.
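    The core idea above – counting only links whose source page is topically relevant – can be sketched in a few lines. This is purely illustrative: the toy link data, the topical set, and the simple one-point-per-link scoring are assumptions for demonstration, not Google's actual formula.

    ```python
    # Hypothetical sketch of topic-sensitive link counting: only inbound
    # links from pages in the topical set contribute to a page's score.
    # All names and data here are made up for illustration.

    def topic_link_score(inbound_links, topical_pages):
        """Count only inbound links whose source page is in the topical set."""
        return sum(1 for source in inbound_links if source in topical_pages)

    # Toy data: pages linking to a hypothetical flower site.
    inbound = ["gardenblog.example", "mortgagedeals.example", "rosecare.example"]
    topical = {"gardenblog.example", "rosecare.example"}  # flower/garden pages

    # The mortgage site's link is ignored; only 2 of the 3 links count.
    print(topic_link_score(inbound, topical))
    ```

    In a real system the scoring would weight each counted link by the linking page's own rank rather than counting links equally, but the filtering step shown here is the part that distinguishes the topic-sensitive approach.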

    PageRank is Being Replaced By LocalRank

    Traditionally, PageRank (PR) was calculated once a month for every page in the Google index. The PR score for a page was pretty static until the next time around that PR was calculated, and was computed based on ALL links that point to a page.

    Contrast this with LocalRank, which computes a link score based only on incoming links from pages that are returned from a given search result set that matches the search query (whether the result set is 100, 1000 or 10,000 pages is not yet known).

    This means that a flower site only gets links counted from other sites related to flowers – not from sites about mortgage loans, for example. This is similar to how the Teoma search engine works.

    Think of LocalRank as a dynamic PageRank score that is computed on the fly for each search query. There is evidence to suggest that, as part of the Florida algorithm, Google is using its newly patented LocalRank system to replace or supplement PageRank in calculating incoming-link scores for a site. PageRank has been abused as more people try to get as many links as possible to their sites – regardless of whether the other site has anything to do with the topic of the linked-to page. By using LocalRank, Google hopes to filter out the irrelevant links that have skewed the value of PageRank in the past.
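    The "computed on the fly" behavior described above can be sketched as follows: after an initial result set is retrieved, each result is re-scored using only links that come from other pages within that same result set, so the score is fresh for every query. The graph, domain names, and equal link weights below are assumptions for illustration, not the patented algorithm's actual details.

    ```python
    # Hypothetical LocalRank-style rescoring: for each page in the result
    # set, count links to it from *other pages in the same result set*.
    # Links from pages outside the result set are ignored entirely.

    def local_rank(result_set, link_graph):
        """link_graph maps each page to the set of pages it links to."""
        scores = {}
        for page in result_set:
            scores[page] = sum(
                1
                for other in result_set
                if other != page and page in link_graph.get(other, set())
            )
        return scores

    results = ["a.example", "b.example", "c.example"]
    links = {
        "a.example": {"b.example"},
        "c.example": {"b.example", "a.example"},
        "mortgage.example": {"b.example"},  # outside the result set: ignored
    }

    # b.example is linked by both a.example and c.example within the set,
    # so it scores highest; the mortgage site's link contributes nothing.
    print(local_rank(results, links))
    ```

    Because the scores depend on which pages happen to be in the result set, they cannot be precomputed once a month the way static PageRank was – which is consistent with the extra processing cost discussed below.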

    LocalRank can have a big impact for sites that currently have lots of incoming links from sites that aren’t related to the linked-to page and whose favorable rankings have been skewed as a result of PR rather than on-page optimization (content, titles, headlines, etc).

    LocalRank requires much more processing power, since a PageRank-like score is computed on the fly for every search query. This may also explain why running the Florida filter test (using excluded words in the search query) brings back pre-Florida results: it invokes the older algorithm, which uses the static PageRank score and requires far less server processing power. For search queries that do not use exclusion words (and perhaps also for “simpler” queries), the new algorithm using LocalRank is used instead.

    TrustRank and the Sandbox

    TrustRank is a variation of PageRank whereby links from sites that are “trusted” by Google carry more weight than other links. This is also related to the Google Sandbox. As you recall, the Google Sandbox is a series of filters applied to new sites that cause them not to rank well – or not rank at all – for anything but very niche, unique keyword phrases, such as their company name.

    TrustRank says that new websites either have to reach a certain age (say, 6–18 months) OR obtain relevant, quality links from authoritative, “highly-trusted” sites to escape the Sandbox. However, links from highly-trusted sites can be very difficult for new sites to get. For this reason, most new sites must be of sufficient age AND the links that point to them must also be of sufficient age and at least “moderately trusted” before a new site can rank well.

    The TrustRank threshold that new sites need to overcome to escape the Sandbox varies by keyword and industry. Gambling and pill sites have a much harder time breaking free from the Sandbox filters than, say, baby blanket sites.

