Google’s John Mueller was asked in a Webmaster Hangout why Google ranked websites that used bad link building. Mueller explained how Google treats those bad links and then shared the ranking factors that caused those sites to rank number one.
Why Reporting Bad Links Does Not Always Work
A web publisher asked why sites with bad links ranked. The publisher added that they had reported the site, but it continued to rank well.
Here is the question:
“I see a disturbing amount of link networks and nefarious link building schemes being used… I reported these as suggested but is there anything else that we can do?
This is really frustrating.”
John Mueller responded that the spam report form works, but not always the way you hope it will:
“Reporting them in the… search console, the spam report form, the link spam report form, that’s kind of a good place to go.
That helps us to better understand that these are pages that we need to review from a manual web spam point of view.”
John Mueller then cautioned against high expectations when reporting spam:
“It’s not really a guarantee that we drop those pages completely.”
Mueller explained why the spam report form does not result in an automatic penalty:
“…when it comes to competitive areas, what we’ll often see is that some sites do some things really well and some sites do some things really bad.
We try to take the overall picture and use that for ranking.”
John is explaining that the bad links are ignored and that the real reason the site ranks is that it does some things really well.
Ranking Signals that Power Sites with Spammy Links
Now here is where John Mueller alludes to the ranking factors that cause the link spammers to rank:
“For example, it might be that one site uses keyword stuffing in a really terrible way but actually their business is fantastic and people really love going there, they love finding it in search and we have lots of really good signals for that site.
So we might still show them at number one, even though we recognize they’re doing keyword stuffing.”
As you can see, many of the signals that influence rankings have to do with user interactions with the SERPs and user expectations. What Mueller seems to be implying is that users themselves are the source of some of the ranking signals that power sites with spammy link building.
John Mueller then explained that ignoring bad links is something Google does:
“A lot of times what will happen is also that our algorithms will recognize these kind of bad states and try to ignore them.
So we do that specifically with regards to links… where if we can recognize that they’re doing something really weird with links… then we can kind of ignore that and just focus on the good parts where we have reasonable signals that we can use for ranking.”
“…we try to look at the bigger picture when it comes to search, to try to understand the relevance a little bit better.”
Bad Links and Competitor Research
The important takeaway here is that what you see in the backlinks is not necessarily the reason why a site is ranking. Some publishers feel they need to copy the competitor’s link building in order to compete. But that’s not necessarily the case, especially if the links are spammy.
That kind of false evidence is called a red herring. In mystery fiction, a red herring is a device authors use to trick readers into believing one character is guilty: they plant big, obvious clues pointing at that character when the culprit is really someone else entirely.
The same thing happens with competitive research. My opinion is that you shouldn’t stop your competitor research when you come across spammy backlinks. Dig deeper and you’ll likely find the real reason why a site is ranking.
Isn’t understanding why a competitor ranks the point of competitor research?