How to Combat & Recover from Negative SEO Attack: The Survival Guide

What Exactly is Negative SEO?

Negative SEO is when someone uses unethical, black-hat techniques to harm a website’s rankings in the search results. Essentially, they do everything Google says not to do in order to make it look like you are violating Google’s policies. Although negative SEO can take different forms, here are some common ways your site can be attacked:

  • Building bad links to your site using unrelated keywords
  • Hacking your website to add malicious code
  • Creating fake social profiles of your company

Why Can Negative SEO be Risky?

Google can’t always tell the difference between a site that has been hit with negative SEO and an actual spammer, which is why you should always keep an eye on your website’s links and traffic. Start-ups and new websites are generally not a target (though they can be); it is usually bigger websites that become the prey.

Google has started to address this issue and now provides solutions through Google Webmaster Tools (now Search Console) and the Disavow tool, which is another reason attackers often decide the risk isn’t worth taking.

Bad Links

Your competitor can build bad links to your website by paying off a few dollars to adult sites, gambling sites, or a banned website. Creating a link farm is the easiest way to get you hit with a Google penalty. This is done by creating a group of websites that hyperlink to each other to increase the number of incoming links.

If a competitor does this aggressively enough, Google may ban your site. Some aggressive negative SEOs will only target to take down a specific page on the website by building low-quality links to a subpage.

Identifying Bad Links: When you have a relatively large list of backlinks, it becomes time-consuming to review each one to identify the bad links. Use tools like URL Profiler, CognitiveSEO, and Backlinks Monitor to identify bad links and unnatural anchor text.
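As a rough illustration of the kind of filtering these tools automate, here is a minimal Python sketch that flags backlinks by anchor text. The CSV columns, spam terms, and URLs are all hypothetical; real exports from the tools above will have different formats.

```python
import csv
import io

# Hypothetical spam indicators -- adjust for your own niche and language.
SPAM_TERMS = {"casino", "viagra", "payday loan"}

def flag_suspicious(rows):
    """Return backlink rows whose anchor text contains a spam term."""
    flagged = []
    for row in rows:
        anchor = row["anchor_text"].lower()
        if any(term in anchor for term in SPAM_TERMS):
            flagged.append(row)
    return flagged

# Example backlink export (columns are an assumption; a real export
# from a backlink tool will differ).
sample = io.StringIO(
    "source_url,anchor_text\n"
    "https://spammy.example/page,best casino bonus\n"
    "https://partner.example/post,great seo guide\n"
)
rows = list(csv.DictReader(sample))
print([r["source_url"] for r in flag_suspicious(rows)])
# → ['https://spammy.example/page']
```

An automated pass like this is only a first filter; you would still review each flagged link by hand before disavowing anything.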

Disavow the Bad Links: After you have identified and compiled the list of links to be removed, submit the data through the Disavow Tool. The Disavow Tool tags a link so that Google’s algorithm does not count it toward your site either positively or negatively, but keep in mind there are mixed reports on the efficacy of the tool.

You can use the Disavow tool to disavow individual links or a whole domain. I recommend using the tool only as a last resort. Here’s a complete guide on Identifying Bad Links and Pruning Them Using Google’s Disavow Tool.
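For reference, a disavow file is a plain text file with one entry per line: a full URL to disavow a single link, or a `domain:` prefix to disavow an entire domain. Lines beginning with `#` are comments. The domains below are placeholders:

```text
# Bad links identified in the backlink audit (dates and domains illustrative)
# Disavow a single URL:
https://spammy-links.example/widget-page.html
# Disavow an entire domain:
domain:link-farm.example
```

You then upload this file for the affected property in the Disavow tool; it replaces any previously submitted file for that property.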

Contact the Webmasters: The key point to understand is that disavowing a link won’t get you fully in the clear. You also need to contact the webmasters of the linking domains and ask them to remove the links. Gathering the contact details for all of these bad links can take several days, which is where tools such as URL Profiler and Rmoov come in handy.

FAQ: All about the Google RankBrain algorithm

Google is using a machine-learning technology called RankBrain to help deliver its search results. Here’s what we know about it.

So RankBrain is part of Google’s Hummingbird search algorithm?

That’s our understanding. Hummingbird is the overall search algorithm, just like a car has an overall engine in it. The engine itself may be made up of various parts, such as an oil filter, a fuel pump, a radiator and so on. In the same way, Hummingbird encompasses various parts, with RankBrain being one of the newest.

In particular, we know RankBrain is part of the overall Hummingbird algorithm because the Bloomberg article makes clear that RankBrain doesn’t handle all searches, as only the overall algorithm would.

Hummingbird also contains other parts with names familiar to those in the SEO space, such as Panda, Penguin and Payday (designed to fight spam), Pigeon (designed to improve local results), Top Heavy (designed to demote ad-heavy pages), Mobile Friendly (designed to reward mobile-friendly pages) and Pirate (designed to fight copyright infringement).

I thought the Google algorithm was called “PageRank”

PageRank is part of the overall Hummingbird algorithm that covers a specific way of giving pages credit based on the links from other pages pointing at them.

PageRank is special because it’s the first name that Google ever gave to one of the parts of its ranking algorithm, way back at the time the search engine began, in 1998.

How many signals are there?

Google has fairly consistently spoken of having more than 200 major ranking signals, which, in turn, might have up to 10,000 variations or sub-signals. More typically, it just says “hundreds” of factors, as it did in yesterday’s Bloomberg article.

And RankBrain is the third-most important signal?

What are the first- and second-most important signals?

When this story was originally written, Google wouldn’t tell us. Our assumption was this:

My personal guess is that links remain the most important signal, given the way Google counts up those links in the form of votes. It’s also a terribly aging system, as I’ve covered in my earlier article, Links: The Broken “Ballot Box” Used By Google & Bing.

As for the second-most important signal, I’d guess that would be “words,” where words would encompass everything from the words on the page to how Google interprets the words people enter into the search box outside of RankBrain analysis.

Blog subdomain or subdirectory? Hint: one is 40% better

Historically, the most common reason companies host their blog on blog.example.com is technical. Putting your blog on a subdomain is the easiest route when setting up a third-party blogging platform like WordPress, Tumblr, or Squarespace. Serving it at www.example.com/blog is surprisingly difficult and, in the case of WordPress, forces your engineering team to self-host WordPress. Ask any engineer or DevOps engineer; they’ll tell you this has significant security and maintenance implications.
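One common workaround is to put a reverse proxy in front of the blog so that a separately hosted blog appears under /blog on the main domain. A minimal nginx sketch, assuming a self-hosted blog backend (all hostnames, ports, and paths here are placeholders):

```nginx
server {
    listen 80;
    server_name www.example.com;

    location /blog/ {
        # Forward /blog requests to the machine actually running the blog.
        proxy_pass http://blog-backend.internal:8080/;
        proxy_set_header Host www.example.com;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location / {
        root /var/www/example;
    }
}
```

This keeps the blog on the main domain for SEO purposes, but it also means your team now owns the proxy layer, which is exactly the maintenance burden described above.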

So instead of going through that hassle, the default choice is to go with blog.example.com because it’s easiest, not because it’s best for SEO. Butter solves this issue for you (more on that later), but first let’s look at the data.