The work of an online search engine might seem easy to a layman: it does nothing more than find links and spread them across pages of results. That impression is about as wrong as you can get. Running a search engine involves an enormous amount of work, and managing such a huge amount of data is only one part of it. A search engine must constantly refine the methods it uses to order websites in its search results. Are those results produced randomly?
Not at all! When you search for something on Google, the results you see are not only the most relevant to your query; Google is also deeply concerned that they be informative and of high quality. In the past you could end up on a page stuffed with the keyword you searched for but containing no information at all. That is no longer the case, at least on most occasions. Google has to tackle a lot of problems to put the best results on top and push the low-quality ones down.

Lawyers in particular have been chasing keywords since the days of Excite. Once they started getting good at it, Google had a habit of coming along with some new "spam" update, or something or another, that would temporarily stop many overly aggressive law firm sites from ranking.
In time, the SEO company or the solo practitioner would figure it out and catch up. With Hummingbird and Penguin, that all changed. Gone were the days of ranking easily for terms unless you used a paid Google service such as AdWords (pay-per-click advertising). Now bid prices are so high that lawyers are ditching their over-optimized websites and praying they can win back some of their old rankings. Thus the seesaw of mastering each new update, then being frustrated by the next, continues.
How Does Google Produce Search Results?
Google first looks at how relevant a website's content is to the search and presents the results that best match what the user typed. The words a user enters into Google are called keywords or key phrases. One way Google checks the relevance of a website's content to your query is to see how well your keywords match the keywords found in that content. But what if several websites are all awash in the keyword you searched for?
At this stage Google's algorithms assess a website's reputation, which is judged by its PageRank, domain authority, and backlinks. Backlinks are the most important factor here: they are the links on other websites, forums, blogs, article directories, and so on that point back to your website, and clicking one brings a visitor to your site. With powerful servers and a huge index of websites, there was nothing more Google needed to do to serve results. So why does Google keep coming up with new algorithm updates?
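Before answering that, it helps to see the idea behind PageRank, since the reputation system leans on it so heavily. Here is a minimal sketch of a simplified PageRank in Python; the toy link graph, damping factor, and function are illustrative assumptions, not Google's actual implementation.

```python
# Simplified PageRank sketch (illustrative only, not Google's real system).
# A page's score is split evenly across its outbound links; the damping
# factor models a user occasionally jumping to a random page.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:  # dangling page: spread its rank across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

# Toy graph: A and C both link to B, so B ends up with the highest rank.
print(pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))
```

The takeaway: a page's score depends on the scores of the pages linking to it, which is exactly why SEOs became obsessed with acquiring backlinks.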
The Publicly Stated Reason For Algorithm Updates From Google
Rules and regulations exist to keep people in check, and the same logic drives Google's consistent, continuous algorithm updates. Google's criteria for ranking search results were once quite simple, but SEO (search engine optimization) professionals did not let things stay that simple. They used deceptive methods to make websites rise in the rankings and carried out activities that violated Google's guidelines. These came to be known as black hat SEO techniques.
What black hat SEO experts did was stuff web pages with the keywords being searched most often on Google. The unnecessary repetition destroyed the content's quality and made it all but unreadable for humans, yet it was enough to fool the search engine into treating the page as highly relevant to a user's query: technically true, but such a page offered users nothing. SEOs also began creating backlinks in ways that were anything but the recommended methods.
SEOs had discovered that Google considered a website with more links pointing to it more authoritative, and that a backlink from such a website passed authority on to their own. Instead of earning backlinks naturally, professionals started buying them. They also got hold of software that could create thousands of backlinks across websites within seconds, sophisticated enough to solve CAPTCHAs without a problem. In short, Google was being fooled, and it felt the need to do something about the situation and regain its authority.
Google Regains Authority
When Google saw that SEOs were taking shortcuts and trying to game its rankings, it came up with a plan to regain its authority: a string of algorithm updates. This is not to say Penguin was the first; the Panda update and other routine algorithm changes came before Penguin even existed. With Penguin, however, Google set out to completely change how SEO professionals went about their business and how they used methods that violated Google's guidelines to make websites climb the search results.
Google makes changes and modifications to its algorithms all the time, but the big difference with Penguin was that it waddled into the world of SEO professionals to teach them some lessons on link building and backlink generation. The update's main target was link spam: trimming its protruding edges and bringing the practice back within legitimate bounds. Google was quite successful in that aim. Many more Penguin updates followed the first major one, and every new Penguin affected a huge number of websites in the search results.
The Various Link Spamming Methods And Penguin Updates
By rolling out a string of Penguin updates, Google sifted the good from the bad slowly but steadily. Search engine optimization experts know that Google shook the foundations of how they worked before Penguin walked into their lives; they had to revise every link building strategy they had been using. Here is how Google took care of the different types of link spam across that string of updates.
- Penguin 1.0
This was the first tsunami in the world of SEO: by the most widely cited figures, Penguin 1.0 affected about 3.1 percent of English-language search queries. That percentage might seem small at first, but weigh it against the total volume of searches Google handles to see how huge 3.1 percent really was. Some of the biggest companies in the world were caught up in it, J.C. Penney among them. Penguin 1.0 came out of the cold lands of Google's algorithms in April 2012.
One main objective of this update was to thwart the over-optimization of anchor text. Google did not intend to stop websites from using anchor text linking back to their home pages; over-optimization was the target. Companies would accumulate backlinks with the exact same anchor text repeated endlessly: of thousands of backlinks, as many as 80 percent might use an identical anchor across different websites, as the sketch below illustrates.
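As a rough illustration of the pattern Penguin 1.0 punished, this sketch counts the anchor texts in a backlink profile and flags one where a single anchor dominates. The 50 percent threshold and the sample data are made-up assumptions; Google has never published its actual cutoffs.

```python
from collections import Counter

# Hypothetical check for an over-optimized backlink profile: if one anchor
# text accounts for most of the links, the profile looks unnatural.
# The 0.5 threshold is an arbitrary assumption, not a published Google value.

def looks_over_optimized(anchor_texts, threshold=0.5):
    counts = Counter(anchor_texts)
    top_anchor, count = counts.most_common(1)[0]
    share = count / len(anchor_texts)
    return share > threshold, top_anchor, share

backlinks = (["personal injury lawyer"] * 80
             + ["example.com"] * 10
             + ["click here"] * 10)
flagged, anchor, share = looks_over_optimized(backlinks)
print(f"flagged={flagged}, anchor={anchor!r}, share={share:.0%}")
# -> flagged=True, anchor='personal injury lawyer', share=80%
```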
Another thing Google pierced into was links from bad websites. To build domain authority and a better PageRank you need links coming to your site from other websites, but what matters is how good or reputable the linking website is. If the site hosting your backlink is a bad one, you are in line for a Google penalty. Adult websites, known phishing sites, sites carrying viruses or malware, and sites Google has flagged as link spammers are all considered bad websites.
Note that Penguin 1.0 did not target only links coming from those websites to yours; it would punish you too if your own website hosted links pointing to bad websites. The third thing Penguin 1.0 targeted was irrelevant links. If you run a website about mp3 songs and links are pointing to it from a site that sells herbal medicines, you will face the rage of Penguin 1.0.
The web page where your anchor text or link sits should be relevant to your business, and the anchor text should be relevant to the content of the page visitors land on after clicking it.
- Penguin 2.0
It took more than a year for the next major Penguin update to haunt SEOs, who were still recovering from the jolt Penguin 1.0 had given them. They worried about what the update would bring and whether the hammer would hit as hard as it had before. Although the number of affected websites was smaller this time, it was still huge, and the update filtered the SEO process further. The first Penguin update had dealt only with websites' home pages and the links pointing to them; this one focused on the internal links within websites.
Google had already rolled out two smaller Penguin refreshes before this one, but Penguin 2.0 deserves special mention because it was a big update. Most of its particulars matched the first Penguin update, but this time Google dug deeper into the links and overhauled websites' internal linking networks as well.
- Penguin 2.1
With Penguin 2.1, Google hit hard again. This time it went after websites that were creating spun content, using microsites to prop up their rankings, and resorting to tiered link building. The update rewarded contextual links that were natural and original. Contextual links are links that sit naturally within an article and point to a page on another website; if they are not relevant both to the page they appear on and to the page they point to, Google would punish the site.
Google also targeted companies that were creating spun articles to carry anchor text and links. Spun articles are produced from original articles by changing a few words, and they are rarely written by hand. Companies use spinning software that replaces words in the original article with synonyms. Even the best such software cannot produce an article that sounds as natural as the original, yet it could churn out thousands of articles a day; the short sketch below shows why the output reads so badly.
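To see why spun text reads so poorly, here is a deliberately naive sketch of the synonym-substitution approach described above. The synonym table is invented for illustration; real spinning tools were more elaborate, but the core flaw, swapping words with no regard for context or grammar, was the same.

```python
# Deliberately naive "article spinner": swap words for dictionary synonyms
# with no regard for context. The synonym table is invented for illustration.

SYNONYMS = {
    "car": "automobile",
    "accident": "mishap",
    "serious": "grave",
    "lawyer": "attorney",
    "help": "aid",
}

def spin(text):
    words = text.split()
    return " ".join(SYNONYMS.get(w.lower(), w) for w in words)

original = "A serious car accident means you need a lawyer to help"
print(spin(original))
# -> "A grave automobile mishap means you need a attorney to aid"
# Note the broken grammar ("a attorney") and the stilted word choices:
# exactly the unnatural quality Google learned to punish.
```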
The third black hat technique targeted was tiered link building. Here, SEOs would create microsites for their actual websites, then build links to those microsites by posting in spun articles and in blog and forum comments, using every shade of grey method to get the microsites linked and indexed. Google brought the hammer down on these microsites and their tiered links as well, making it far harder to fool the search engine; spammers now had to be much more intelligent and careful in their approach.
Is It Possible To Recover After A Penguin Hit?
Many SEOs came to believe it was impossible to recover from a Penguin punishment, yet there are hundreds of websites where experts have posted case studies of helping sites get back on their feet after a Penguin hit. The most important step is to get rid of all spammy and bad links. This can be done manually, but manual cleanup can take weeks or months. Google has also released a disavow tool that lets webmasters disavow links they consider dangerous to their website's reputation; a minimal example of a disavow file appears below.
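For reference, the disavow tool takes a plain text file listing one URL or domain per line; lines beginning with # are comments, and a domain: prefix disavows every link from that domain. A minimal example, with made-up placeholder domains:

```text
# Example disavow file (all domains/URLs below are fictional placeholders)
# Disavow a single spammy page:
http://spammy-directory.example.com/links/page1.html
# Disavow every link from an entire domain:
domain:bad-link-farm.example.net
```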
Some experts have suggested simply building a website as a website: something made for its visitors rather than for search engines. If you stay fixated on SEO, you will eventually do something that looks fishy in the eyes of Google, and one day you will get hit by another animal from Google's zoo of intimidating species. Use only natural content on your website and get backlinks only from reliable sites. Use anchor text only where it is needed, and make sure various versions of the anchor text link back to your website rather than one exact phrase.
A few unfortunate websites will have to be started from scratch; repairing their devastated structure would take more time than building a new site and starting afresh. When you start over, focus on your business and on legitimate ways of marketing your products, services, and brand. Don't chase SEO strategies blindly, because that can hurt you in the long run. Use social media to win your personal injury website, for example, the attention you had been seeking through SEO.