Successful websites have one very important thing in common, and it doesn’t matter what type of website it is. The site could sell things, offer help with math homework, show home videos, talk about political viewpoints, or provide a non-intrusive way for friends and family to keep up with a loved one who is dying. The one thing every website has to have to continue to exist is visitors.
The vast majority of websites are found by people logging onto their internet service, going to a search engine, and typing in what they want to see. The best known and most used search engine in the world is Google. In 2011 people all over the world submitted more than 4.7 billion queries to Google’s search engine every single day. That’s about 1.7 trillion searches a year. With that many potential visitors, every website manager wants their site to show up in the search results.
But there’s a big problem. Say someone wants to know where to find free help with math homework. There shouldn’t be too many places that provide that specific service, right? A Google search comes up with 21,600,000 hits in less than half a second. Suddenly, where a website appears in that list takes on great importance. Website success becomes closely linked with a site’s rank in the Google search results.
What becomes important is the method that Google uses to rank the sites. When Larry Page, the co-founder of Google, was a Ph.D. student at Stanford, he believed he could design a formula, or algorithm, that would assign a value to a website based on the number and quality of the links connecting it to other websites. This concept came from academic research, where an article or paper’s value was judged by how many other publications cited it. Page and his friend Sergey Brin, Google’s other co-founder, applied this concept to the internet and its interconnectedness through hyperlinks between webpages. They developed BackRub, software that crawled through all the webpages on the internet, looked at the quality and quantity of other webpages that linked to each specific page, and assigned a PageRank to each webpage.
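To make the link-counting idea concrete, here is a minimal sketch of the published PageRank calculation, written in Python. The toy graph, the iteration count, and the variable names are illustrative assumptions for this example only; this is not Google’s data or its production implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start every page with an equal score

    for _ in range(iterations):
        # every page keeps a small base score, then collects shares from its inbound links
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # each link passes along an equal share of its page's current rank
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Toy example: three pages linking to one another
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(toy_web))  # C scores highest because both A and B link to it

The 0.85 damping factor is the value suggested in Page and Brin’s original paper; it models a reader who usually follows links but occasionally jumps to a random page, which keeps scores from piling up in closed loops of pages.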
Over the years Google has improved and adjusted the algorithm it uses to rank websites. As people began to understand the importance of search engine algorithms, a new profession specializing in search engine optimization (SEO) was born. SEO works to improve website ranking by understanding the algorithm and developing strategies designed to take advantage of the formula’s components.
There are many techniques to improve a website’s PageRank. Some of these techniques are considered a part of good web design and ethical marketing. These are called “white-hat” and tend to focus on providing content for web visitors, not just for search engines. When helpful, factual, and quality content is created, the resulting boost to a website’s visitors, as well as its increased search engine rank, is long-lasting. “Black-hat” SEO techniques primarily involve deception, for example, adding keywords or links in the same color as the webpage’s background, making them invisible, or placing them outside the viewable portion of the page. These techniques essentially seek to scam the search engines into falsely inflating the PageRank. They can be used either to raise the rank of a page or to attack the competition by trying to lower theirs.
To rank pages more fairly and to penalize pages that use “black-hat” techniques, Google developed Panda. First released in February 2011, Panda was designed to raise the rank of high-quality websites and lower the ranking of poor-quality sites. Named after its developer, Navneet Panda, the algorithm uses artificial intelligence built from human quality raters’ appraisals of thousands of sites. The raters reviewed characteristics such as quality of content, design, speed, and trustworthiness. These ratings were then given much greater weight than the older PageRank.
A major paradigm shift within Panda is that it ranks entire websites instead of just specific webpages. There is also a focus on the age of a webpage, with older pages ranked lower than newer ones. Critics are concerned that many well-developed, in-depth websites that house “evergreen content,” information that doesn’t often change, are unfairly penalized.
Panda has been updated regularly. The January 2012 update added page layout to the algorithm. It particularly lowered the rank of sites with little to no content “above the fold.” The term comes from newspapers and refers to the importance of the information on the upper half of the front page. Thus, this update targeted webpages that contained little to no quality content in the topmost portion of the page.
The most recent update, released on April 24, 2012, was named Penguin. Its purpose was to shed light and heat on websites that violate Google’s Webmaster Guidelines. It imposes severe penalties on sites with “unnatural link velocity.” In other words, if a site is deploying black-hat SEO techniques, the website owner is notified that the site is not within Google’s guidelines, and it receives Penguin’s web-spam penalty, which significantly lowers its ranking. More than 700,000 websites received a notice from Google informing them that this penalty had been assessed against their site.
Google created two feedback forms for webmasters concerning the Panda, page layout, and Penguin updates. One is for those who believe their rank was unfairly downgraded, and the other is for reporting web spam that still ranks highly.