Attorney Marketing | How and Why Link Schemes Don’t Work Unless You’re Findlaw?

Circle of Legal Trust

I wanted to start with just an opinion here. I think Findlaw is unfairly gaming Google by taking advantage of Google’s new policy of discounting links from sites that are unrelated in content to the target site.

For example, Findlaw sells lawyers “Firm sites” and blogs, and hosts them. All of these are related, since they are all legal sites. Are you following so far? Many bloggers have long complained about Findlaw’s alleged sordid history: we hear stories of them gaming Google by selling links and calling it anything but that. Now we are talking about a very advanced, automated blog network (with no accurate authorship signals, etc.).

Do Paid Attorney Blog Networks Get a Google Pass?

Google has explicitly stated that it hates paid blog networks because those links are not actual votes (discussed below). A vast SEO company like Findlaw, or Scorpion Web Design, can easily create an enormous network of sites (customers pay a lot of money for Firm sites, for example), spin LSI content with in-house content spinners, and then blast that content across the whole network of interlinked sites.

The Network

In other words, you pay them to build you a site, and they add other law firms’ content (actually content written by Findlaw writers) to your site and blogroll. Then they make even more money by charging you extra to be part of the “network.”

You pay them so they can use your blog to make more sales. In my opinion, this is simply selling links, and that is a big no-no. But let’s look at this some more: my research shows that “non-legacy” legal sites are ranking number one on the first page of Google for terms like “Los Angeles personal injury attorney.”

If you or I tried to do this (get people to pay us to build their blogs and then link them all back and forth to each other), we would be blasted into cyberspace. Where is Matt Cutts on this? How can a company gaming the system like this survive in a Panda/Penguin environment?

This raises several questions: does Google give large SEO companies like FL a pass? All the sites are usually on the same server, and from my perspective it is clear these links are not legitimate votes. Most people in the early stages of learning how search engine algorithms work are tempted to test what they have learned. That can mean buying fifty or one hundred domains and linking them all to a main website. Some people come up with a complex plan that works for a while but does not last. Yet the Findlaw blog network seems unaffected.

One company, Dejan SEO, analyzed a large amount of link data from 2005 to 2011 to see how these kinds of domain owners fared. What it found is that attempting to influence Google’s algorithm this way does not work. Trying to manipulate Google’s link graph and its signals is a waste of time; it ends in lost money and Google penalties.

Opting for a link scheme to bolster traffic is a shortcut that is easily discovered, whether by an algorithm or through human review. Yet here it has been working for months, and the internet is so far silent.

These sites rose from obscurity to rank relatively high after Panda and Penguin. That is a bad result, not a “real” vote, and it’s just as crazy as having Wikipedia show up as the first result in almost every organic search.

Here are other methods gamers use that don’t work like they used to.

Simple Content Networks

Content networks make up a large portion of the web and are growing fast; Panda also goes after them. The cost is moderately low when following these basic steps.

  • Register domains (coupons for cheaper pricing can often be found).
  • Purchase cheap hosting with WHM.
  • Batch-install a CMS (one of the most popular is WordPress).

Content is the next step; content can be from:

  • Cheap article writing
  • RSS Feeds
  • Scraping
  • Spinning

Linking the sites is the next step in the plan.

  • Flow PageRank between the sites (see the sketch just after this list).
  • Collect AdSense money, sell links, and link back to your leading site.
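
To show what “flowing PageRank” actually does, here is a minimal sketch in Python, using made-up domain names and the standard simplified PageRank iteration (not Google’s actual implementation), of how a few interlinked network blogs funnel score to the one money site they all point at.

```python
# Minimal PageRank sketch (hypothetical domains) showing how a small
# interlinked network funnels score to one "money" site.

DAMPING = 0.85
ITERATIONS = 50

# Each network blog links to the other blogs and to the money site.
links = {
    "blog-a.example": ["blog-b.example", "blog-c.example", "money-site.example"],
    "blog-b.example": ["blog-a.example", "blog-c.example", "money-site.example"],
    "blog-c.example": ["blog-a.example", "blog-b.example", "money-site.example"],
    "money-site.example": [],  # the target site links out to nobody
}

pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}

for _ in range(ITERATIONS):
    new_rank = {page: (1 - DAMPING) / len(pages) for page in pages}
    for page, outlinks in links.items():
        if not outlinks:  # dangling node: spread its rank evenly
            for other in pages:
                new_rank[other] += DAMPING * rank[page] / len(pages)
        else:
            for target in outlinks:
                new_rank[target] += DAMPING * rank[page] / len(outlinks)
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:22s} {score:.3f}")
```

Run it and the money site ends up with the highest score even though nobody outside the network links to it; that is exactly the kind of “vote” that is not a vote.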

The low setup cost and low skill requirement are why so many low-quality websites have shown up in Google’s results over the past five years. Once a person starts diversifying IP addresses and investing in quality content and design, the cost of operating the domains climbs quickly, and there is still no way to know the plan will be successful long term.

Sophisticated Content Networks

I believe that Findlaw’s SEO tactic is, more or less, just a sophisticated content network. Panda rooted out most of the artificial, low-quality content sites in 2011, but some can still be found ranking decently. One has to wonder whether the unsophisticated schemes have simply become more advanced.

And I think the evidence shows that the Findlaw blog network is just that: a very, very sophisticated content-linking scheme. Some of these content setups have reached an in-between stage where they are more accurate and even contain some helpful information, which makes them harder, but not impossible, to kill off.

Google is committed to ridding the web of these content-farming sites and has expanded the range of signals it uses to determine whether a network of websites is legitimate or a simulated link scheme designed to influence rankings.

But Google makes a lot of money off of Findlaw. Will that influence the spam team’s decisions? Website owners who still attempt link schemes can all but guarantee that at least one of the signals Google uses to ferret out these sites will catch them.

Link Buying Invitations

Networks send out many invitations to buy links, sell links, exchange links, and pay for blog posts. One thing you will see within their pages is talk of PageRank increases. These are exactly the red flags Google looks for when weeding out low-quality content networks.

Domain Information and Google

As a domain registrar, Google can access registration data for any domain it chooses. With that much data, it can compare ownership, contact details, domain naming patterns, and TLD consistency.
Using expired domains for your sites will not help either, because Google can determine ownership: it can tell whether a domain is under new ownership or whether the previous owner restored or re-registered it.

Private registration for all of a person’s domains is not a way around Google either, since it is itself a pattern, and combined with other parts of the scheme it raises red flags.
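
To illustrate how registration data could expose a network, here is a minimal sketch with hypothetical WHOIS-style records (the field names and the data are assumptions of mine, not any real registrar API) that groups domains by registrant email and by naming pattern.

```python
import re
from collections import defaultdict

# Hypothetical WHOIS-style records; in reality this data would come from
# a registrar or a WHOIS lookup service.
records = [
    {"domain": "la-injury-lawyer1.com", "registrant_email": "admin@network.example"},
    {"domain": "la-injury-lawyer2.com", "registrant_email": "admin@network.example"},
    {"domain": "sf-injury-lawyer1.com", "registrant_email": "admin@network.example"},
    {"domain": "independent-firm.com",  "registrant_email": "owner@firm.example"},
]

by_owner = defaultdict(list)
by_pattern = defaultdict(list)

for rec in records:
    by_owner[rec["registrant_email"]].append(rec["domain"])
    # Collapse digits into a crude naming "pattern" so sequential names group together.
    pattern = re.sub(r"\d+", "N", rec["domain"])
    by_pattern[pattern].append(rec["domain"])

for owner, domains in by_owner.items():
    if len(domains) > 2:
        print(f"Shared registrant {owner}: {domains}")

for pattern, domains in by_pattern.items():
    if len(domains) > 1:
        print(f"Shared naming pattern {pattern}: {domains}")
```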

Hosting Information and Google

Content networks tend to share hosting characteristics, and Google can compare server information and C-blocks, including the hosting company, geolocation, and server type. For someone running many sites in a link scheme, this means diversifying name server information and rotating IP addresses to a much higher degree, which is both time-consuming and expensive.
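
Here is a minimal sketch, with made-up IP addresses, of the C-block comparison just described: group sites by their /24 subnet and see which ones sit on the same block.

```python
from collections import defaultdict

# Hypothetical site -> IP mapping; in practice this comes from DNS resolution.
site_ips = {
    "blog-a.example": "192.0.2.10",
    "blog-b.example": "192.0.2.11",
    "blog-c.example": "192.0.2.12",
    "unrelated.example": "198.51.100.7",
}

c_blocks = defaultdict(list)
for site, ip in site_ips.items():
    c_block = ".".join(ip.split(".")[:3])  # first three octets = the C-block (/24)
    c_blocks[c_block].append(site)

for block, sites in c_blocks.items():
    if len(sites) > 1:
        print(f"Sites sharing C-block {block}.0/24: {sites}")
```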

Content and Google

Google observes websites for historical content changes and update frequency, since it can track changes over time. Its index takes a very refined approach to websites.

The Panda updates have consistently weeded out duplicated content, spun articles, and automation. Google knows that natural websites grow gradually over time, whereas a low-quality content site usually generates content fast and then slows down, unless it relies on an automated content scheme.
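
As an illustration of why spun and duplicated articles give themselves away, here is a minimal sketch of word-shingle Jaccard similarity, a common way (not necessarily Google’s) of measuring how much two pieces of text overlap.

```python
def shingles(text: str, size: int = 3) -> set:
    """Return the set of word n-grams (shingles) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: overlap between two shingle sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "Our Los Angeles personal injury attorneys fight for maximum compensation after an accident."
spun     = "Our Los Angeles personal injury lawyers fight for top compensation after a crash."

score = jaccard(shingles(original), shingles(spun))
print(f"Shingle similarity: {score:.2f}")  # a high score suggests spun or duplicated text
```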

There are simple giveaways, too. Google has an extensive database of addresses, businesses, and organizations, and it can compare contact information from Google Places, Google Maps, and other services against the information on sites it suspects of scheming. Google also knows that blog networks usually do not have “About us” pages with staff profiles, contact information, telephone numbers, or local maps.

Other signals alert Google to fake sites, such as the topical consistency and mixture of content. Google can determine the reading level of content and the presence or absence of citations and references, so qualitative analysis is part of how Google looks at a site.

Content type lets Google determine whether a site is commercial, a blog, a forum, a news site, academic, or a social network. Identifying information can be included in the content, but it can also be found in the images and media on a site, including file naming conventions. When the content is inconsistent, Google can flag the site as fake, and it explicitly flags sites with no buy, connect, signup, rates, or subscribe pages, or anything of that nature.
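
To show the kind of crude qualitative check this implies, here is a minimal sketch that estimates a Flesch-style reading-ease score and scans for transactional words; the syllable counter and the word list are rough assumptions of mine, not anything Google has published.

```python
import re

TRANSACTIONAL = {"buy", "subscribe", "signup", "rates", "contact", "call"}

def rough_syllables(word: str) -> int:
    # Very rough heuristic: count groups of vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Standard Flesch formula with the rough syllable count above.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(rough_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

def looks_transactional(text: str) -> bool:
    # A real business page usually has at least one of these words somewhere.
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    return bool(tokens & TRANSACTIONAL)

page = "Call our office today to sign up for a free consultation. Our rates are fair."
print(f"Reading ease: {flesch_reading_ease(page):.1f}")
print(f"Transactional signals present: {looks_transactional(page)}")  # False would be a flag
```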

Google Link Signals

Google’s algorithm remains based on links, so it is safe to say that Google understands links thoroughly. It analyzes internal links, 301 redirects, hidden links, and outbound links. Sending the wrong link signal is a bad choice.

Outbound Links and Google

Outbound links leave a footprint when people try to manipulate rankings. It starts with the anchor text: if you consistently use “exact match phrase” links and never use natural anchors like “click here” or “read more,” your site can be flagged. The placement of the links and the ratio of follow to nofollow links on a page can trigger another flag.
Deviating from standard linking patterns makes links look unnatural and can trip a flag as well.
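
Here is a minimal sketch, with invented link data, of the two outbound-link checks just described: the share of exact-match commercial anchors versus natural anchors, and the follow/nofollow ratio on a page.

```python
# Hypothetical outbound links from one page: (anchor text, is_nofollow)
outbound_links = [
    ("Los Angeles personal injury attorney", False),
    ("Los Angeles personal injury attorney", False),
    ("car accident lawyer Los Angeles", False),
    ("click here", False),
]

NATURAL_ANCHORS = {"click here", "read more", "here", "this page", "website"}

total = len(outbound_links)
exact_match = sum(1 for anchor, _ in outbound_links if anchor not in NATURAL_ANCHORS)
nofollow = sum(1 for _, nf in outbound_links if nf)

print(f"Exact-match/commercial anchors: {exact_match / total:.0%}")
print(f"Nofollow ratio:                 {nofollow / total:.0%}")
# A page that is nearly 100% exact-match and 0% nofollow looks engineered, not editorial.
```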

Google’s Thoughts on Inbound Links

Inbound link signals matter more and more: Google looks at how trustworthy the inbound links are, the topics of the linking pages, and the websites doing the linking. When the inbound links come from forum spam, blog-comment spam, or hacked sites, Google will know, and there is no chance of a long-term linking scheme. Google also looks at link placement velocity, spikes in link placement and removal, and the quantity and diversity of inbound links.
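
For link placement velocity, here is a minimal sketch (hypothetical monthly backlink counts) that flags months where new links appear, or disappear, far faster than a site’s normal pace.

```python
# Hypothetical count of new inbound links discovered per month.
new_links_per_month = {
    "2012-01": 12, "2012-02": 15, "2012-03": 14,
    "2012-04": 380,  # sudden spike: links placed in bulk
    "2012-05": 16, "2012-06": 2,  # and then a drop-off
}

counts = sorted(new_links_per_month.values())
baseline = counts[len(counts) // 2]  # median as a crude baseline

for month, count in new_links_per_month.items():
    if baseline and (count > 5 * baseline or count < baseline / 5):
        print(f"{month}: {count} new links (baseline ~{baseline}) -> velocity flag")
```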

Using Related Websites

This is how I believe Findlaw is benefiting. Look at it like this: the available link metrics help Google form a picture of your website. That way, it can understand how your domains relate, using cross-site interlinking patterns, cascading PageRank flow, and PageRank sources that share common elements. Google wants to see your content on related sites, but Google is also interested in how you got that content there.

Merely paying to be part of a content network is a bad idea. Look at whether the network is trying to manipulate your link weight with a certain percentage of nofollow links, dofollow links, anchor text, and so on. In my opinion, many do it, but it is a dangerous shortcut, so be careful.
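
And here is a minimal sketch of the cross-site interlinking pattern itself: given a hypothetical link graph, count how many of a candidate group’s links stay inside the group, and how many are reciprocal, compared with links out to the rest of the web.

```python
# Hypothetical link graph: domain -> set of domains it links to.
graph = {
    "blog-a.example": {"blog-b.example", "blog-c.example", "money-site.example"},
    "blog-b.example": {"blog-a.example", "blog-c.example", "money-site.example"},
    "blog-c.example": {"blog-a.example", "blog-b.example", "money-site.example"},
    "money-site.example": {"blog-a.example"},
    "outsider.example": {"money-site.example"},
}

candidate_group = {"blog-a.example", "blog-b.example", "blog-c.example", "money-site.example"}

internal = external = reciprocal = 0
for site in candidate_group:
    for target in graph.get(site, set()):
        if target in candidate_group:
            internal += 1
            if site in graph.get(target, set()):
                reciprocal += 1  # counted once per direction, so divide by 2 below
        else:
            external += 1

print(f"Internal links: {internal}, external links: {external}, reciprocal pairs: {reciprocal // 2}")
# A group whose links are almost all internal and reciprocal looks like a scheme, not citations.
```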

The Technical Elements and the Site Architecture

Sites that are not created manually leave a footprint in various ways and across several technologies. When Google looks for low-quality content farms, the elements it examines include consistency in the CMS platforms used, the themes and plug-ins, the page extensions (such as .htm, .html, .php, or .aspx), the URL structures, and the URL rewriting rules.

Even when a site is set up manually, there can still be a fair amount of recycling, which is common at the template level, in file naming conventions, and in CSS classes. Looking at the footer is often a way to tell whether a website is part of a network, since the footer is regularly duplicated or overlooked during coding.
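
Here is a minimal sketch, with invented HTML snippets, of footer fingerprinting: hash each site’s footer markup and see which sites share an identical build.

```python
import hashlib

# Hypothetical footer HTML pulled from each site's rendered pages.
footers = {
    "blog-a.example": '<div class="site-footer v2">Network Sites | Powered by the same theme</div>',
    "blog-b.example": '<div class="site-footer v2">Network Sites | Powered by the same theme</div>',
    "independent.example": '<footer>Smith & Jones LLP - 123 Main St - (555) 010-0000</footer>',
}

fingerprints = {}
for site, footer_html in footers.items():
    digest = hashlib.md5(footer_html.encode("utf-8")).hexdigest()
    fingerprints.setdefault(digest, []).append(site)

for digest, sites in fingerprints.items():
    if len(sites) > 1:
        print(f"Identical footer fingerprint shared by: {sites}")
```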

Google Focusing on Social Media

And although social is only part of the matrix, one of my blogger buddies smokes every site on the FL blog network with his social profile, and he has over 140,000 backlinks from mostly related sites (unfortunately, primarily anchor text, though). Yet he is getting ruined by these Firm sites that rose to the top as soon as the new Panda/Penguin updates went live. He came to me and said he thinks Google is paying off sites like FL with this new algo update because of the revenue Google gets from selling ads to FL.

Could that be true? Well, I looked at the social profiles of many of the FL sites that rank, and I could not find any that even had a Google+ profile, so I will say: “seems fishy.”

In 2011, Google focused on social media and used it to verify people and businesses. It uses social signals to determine the influence of a company or person, and with Google+, flagging potential spam and validating valuable resources will only expand. But the new algorithm seems to want an anchor-text profile of only 1-3% exact match; otherwise, the profile does not look natural. So a new site with a “better” profile weight will now easily beat a legacy site, since everything Google once said you were supposed to do, like using anchor text, is suspected of being an SEO effort instead of a vote. The FL Penguin- and Panda-defeating backlinks make this happen rapidly.
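
To make that 1-3% figure concrete (the threshold is my observation of the new algo, not a published Google number), here is a minimal sketch that computes what share of a site’s inbound anchors are exact-match money phrases.

```python
# Hypothetical inbound anchor texts collected for one site.
inbound_anchors = (
    ["Los Angeles personal injury attorney"] * 60  # exact-match money anchors
    + ["www.example-firm.com", "Example Firm", "click here", "this article"] * 10
)

MONEY_PHRASES = {"los angeles personal injury attorney", "personal injury lawyer"}

exact = sum(1 for anchor in inbound_anchors if anchor.lower() in MONEY_PHRASES)
share = exact / len(inbound_anchors)

print(f"Exact-match anchor share: {share:.0%}")
if share > 0.03:  # above the 1-3% "natural" range discussed above
    print("Anchor profile looks engineered rather than natural.")
```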

Data Google Has at its Disposal

Google has a tremendous amount of browsing and behavior data on internet users and websites. That makes it easy for Google to identify a manipulated or low-quality content site through high search-result bounce rates and other flags generated by its link-graph analysis algorithm.

Who Your Competitors Are

Content farmers will also be found out by their competitors. The farmers are people trying to manipulate the system, and they want to be above you in the results. So there is a chance that even the most sophisticated setup of your scheme will be picked up by a competitor and reported to Google.

If you think Findlaw has created a sophisticated interlinking blog network, you can report it here.
Google might penalize your website when it does a quality review. The Google spam team keeps that information, and in a reconsideration request you must explain in detail what you did to fix the problem. But even after a reconsideration request, having a successful website takes work. There are only two choices: 1. do things the right way this time, or 2. find a new scheme and risk being penalized by Google again.

Wrapping Up

It gets harder every day to play against Google’s algorithm unless, apparently, you’re Findlaw, even though Google has made its intentions clear. Google plans to improve its evaluation of content through authorship signals. Let’s hope so, since the only authorship signal I see here is FL authoring content, sharing it across the sites of attorneys who probably don’t even know each other, and making a killing off parasitic sites. Google also plans a more intense assessment of the social graph and a better understanding of the semantic qualities of content on the internet. Let’s see if it nails FL or gives them a pass.

To have a successful link matrix, you will need to invest time, energy, and money in a white-hat SEO campaign. How? Make sure that you, or SEO people like me, can recognize fake sites, use sustainable practices, and keep expanding your content-development capacity. Follow those rules and you’re golden; after all, when SEO is done correctly, your links are safe. If you want to learn more, you can visit me on G+.
