11 Sneaky Negative SEO Attacks & How To Protect Against Them


Murrough Foley

Posted on July 28, 2020


Negative SEO is nothing new. As long as there have been ways to improve your position in the organic results, there have been malicious actors who target their competition with negative search engine optimisation techniques.

This article will cover the common types of SEO attack, one or two less common types and a couple of sneaky ways bad actors can disrupt your business.

It is important to note at the outset that the best way to protect your sites or your clients’ businesses is to be aware of these different strategies and to closely monitor what is going on.

If you think your site has been the victim of an attack, a good SEO specialist should be able to diagnose the problem quickly and, in most cases, remediate the situation. But as with all things, prevention is better than cure, and some of the attacks listed below require security measures to be put in place at the outset.

  • Spam Links / Link Farms / Manual Penalties
  • Anchor Text Ratio Unbalancing
  • Purchased Links / Cheap PBNs
  • Link Removal
  • Email Blacklisting Or Reducing Sender Score
  • Hot Linked Images
  • Scraped Content
  • DDOS
  • Slow Loris
  • Poisoned Canonical
  • Straight Up Hacking

If good search engine optimisation is knowing what and where the boundaries are, then negative SEO is knowing how to push the target into and over those boundaries.

Strictly speaking, the term negative SEO should only really be applied to attacks that aim to reduce your SERP visibility, but I’ve come to look at it more holistically as any digital attack meant to disrupt your online business operations. That can include brand sabotage, some variations of hacking or disruption of communications. This view may seem excessively broad, but as we go through the examples, you may start to agree with me.

Spam Links / Link Farms / Manual Penalties

Spam links are low quality links that can be built to negatively impact a site’s SERP visibility. Within this category of attack, there’s a lot of scope to do damage in multiple different ways. But to understand spam links, it’s best to understand a little bit of the history.

During the heyday of automated tools, when blackhat SEOs were able to rank poor quality sites for competitive keywords with ease, Google needed a way to improve the quality of organic results and filter these automatically generated links out of their link index.

It’s my view that Google’s web spam team were ready to throw a machine learning algorithm at the problem but needed a data source to separate the good links from the bad.

Cue the arrival of Google’s disavow tool.

Google introduced the disavow tool under the auspices of giving webmasters the ability to stop poor quality (purchased, automatically built) links from negatively affecting their site.

It was the proverbial confessional box for any nervous webmaster and promised a swift return to page 1 for those that had sinned.

Couple this technical “Mea Culpa” machine with a hearty dose of manual penalties applied to the most egregious offenders, and Google had the perfect climate of fear, uncertainty and doubt to fuel submissions to the disavow tool.

This fear drove countless webmasters to use the tool and feed Google’s Machine Learning system.

Using this constantly growing bank of submitted links, built by tools such as GSA SER, Xrumer and Senuke, Google was able to identify and devalue these links. And pretty quickly these automated tools fell out of widespread use to all but the most creative users.

How the Disavow tool works is open to debate. Is it a complicated machine learning algo that identifies common variables from the submitted pages? Or is it a far more simple, weighted list of bad domains with poor user registration controls and moderation?

Due to the sheer amount of computing power needed to analyse the common factors of a “bad site”, and then apply these identified signals to the entire web, my money is on a simpler type of solution.

So when John Mueller says:

“FWIW looking at the site, I don’t see any negative seo effects there. Like the others in the forums posted, most of those links have no effect at all — sites collect links from all kinds of weird & spammy places over time, we’re pretty good at ignoring them.”

He is not completely lying, but he is doing his usual thing of telling half-truths.

Whilst a massive number of low quality, article-type links showing up one day in your backlink analysis tool might look shocking, it’s probably not going to do too much damage. You ought to disavow them all the same.

Google can filter out many automated links, such as those that turn up for purchase on link lists. However, more creative attackers will have built their own engines to post in more niche areas, or in a way that blends in far better with user generated content. These links can’t be filtered.

Modern use of low quality links for negative SEO is more likely to be much slower, stretched out over several months so as not to arouse suspicion. The aim is to take down your site without alerting you to the process, so that you don’t take action.

A large part of it is manipulating the link anchors that are pointed to your site.

Anchor Text Ratio Unbalancing

The Penguin update targeted webmasters who had over-optimised anchor text, that is to say, webmasters who were gaming the system by using targeted key phrases in their anchor text. It also spawned the use of the full stop and comma as anchors, as SEOs tried to bring their ratios back within acceptable levels.

Anchor text unbalancing happens when a malicious actor builds ‘good links’ with over-optimised anchors to one (or all) of your pages. Initially this can look like a positive signal, as your page appears to be generating natural links, but if left unmonitored it can push you over the edge into a penalty.

Things get more complicated when you understand that anchor text ratios flow through do-follow links.

Situation 1 — New tier 1 exact match do-follow links are built to your site. A good SEO can easily monitor new inbound links on tier 1 and their effect on anchor text ratios, then react and disavow links when he/she feels things are getting critical.

Situation 2 — Exact match anchors are built on tier 2, pointing at do-follow tier 1 links. This is more difficult to identify and deal with, as most backlink tools are simply not set up to show you the anchors used on tier 2. So without spending the time to dig into your website’s tier 2 anchors, you may never know why your rankings have dropped.

How To Mitigate Or Prevent

The only way to prevent anchor text unbalancing is to regularly run anchor text audits that look at tier 2, and maybe even tier 3, anchors. This is a proactive step that most businesses won’t be able to afford.
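
As a rough starting point for such an audit, a short script can flag pages where a single anchor dominates. The sketch below is a minimal Python example, assuming a CSV export from your backlink tool with 'target_url' and 'anchor' columns; the column names and the 30% threshold are illustrative, not taken from any particular tool or Google guideline.

import csv
from collections import Counter, defaultdict

# Assumed CSV export from a backlink tool with 'target_url' and 'anchor' columns.
# Adjust the column names to match whatever your tool actually exports.
anchors_per_page = defaultdict(Counter)

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors_per_page[row["target_url"]][row["anchor"].strip().lower()] += 1

# Flag any page where one anchor accounts for more than 30% of its links.
THRESHOLD = 0.30
for page, counts in anchors_per_page.items():
    total = sum(counts.values())
    for anchor, n in counts.most_common(3):
        if n / total > THRESHOLD:
            print(f"{page}: '{anchor}' makes up {n / total:.0%} of {total} links")

Run the same report against a tier 2 link export, where your tools can provide one, and you get the second layer of visibility described above.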

Obviously Purchased Links

Buying links is against Google’s terms of service. Most search engine optimisation professionals will give a wide berth to any site which is clearly selling “guest posts” or indicates that a post was sponsored.

A bad actor can use this to his/her advantage, seeking out domains that have been obviously penalised and have dropped dramatically in rankings, then buying a link on them and pointing it at your site.

The other variation of this would be buying links on the cheapest PBNs available on Blackhat SEO marketplaces.

Private/Public Blog Networks

A PBN, unless built very carefully, will leave a footprint that Google is well versed in spotting. The simplest example would be all the sites sitting on the same IP address, but there are many more.

If a large number of your links are coming from a PBN with an easily discernible footprint, your site may be at risk of a penalty.

The problem you may face is that many PBN operators cloak their sites from backlink analysis tools such as Majestic and Ahrefs, so you may have to trawl through your Search Console data to spot this type of attack.

How To Mitigate Or Prevent

There is no way to completely protect yourself against this type of attack but there are a number of steps to mitigate the risk of a malicious actor being successful.

#1. Monitor your backlink profile and investigate any new links to see if they have natural anchors and check the quality of the domain.

#2. Use the disavow tool for anything that you are uncertain about.
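
If you do decide to disavow, the file itself is just plain text with one entry per line: whole domains are prefixed with domain: and comment lines start with #. The entries below are hypothetical examples, not real domains.

# Hypothetical example of a file uploaded via Google's disavow tool
# Disavow a single spammy URL
http://spammy-article-site.example/cheap-links-post/
# Disavow entire low quality domains
domain:low-quality-pbn.example
domain:another-spam-network.example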

Link Removal

As you may have guessed, links are a major part of how Google identifies and rewards good sites. And large chunks of a company’s marketing budget go into raising brand awareness and earning the natural links that accompany that spend.

Those links are hard earned and what makes this type of attack really sneaky is that you may never be aware it’s going on.

An attacker might set up a branded domain with a different TLD or ccTLD for the purpose of impersonating someone from your company and requesting that good links be removed.

All the pages linking to your site can be pulled from the various backlink databases, and the contact details of webmasters can be scraped or manually retrieved.

It’s at this point the negative outreach campaign begins, from an email that may look official:

Hey {first-name/}

It’s Jane Bloggs from YourCompany. I just have a quick request. We are facing some SEO issues that, to be honest, I don’t really understand. Anyway, I’ve been tasked with removing some of the links that we’ve accumulated over the years. Can I get you to remove the link on the page {target-for-link-removal}? I know it would only take a minute and it would really help me out. Thanks!

Regards

Jane Bloggs

The emails don’t have to come from a similarly branded domain, but obviously it’s the kind of thing that adds some authenticity to the email request.

These link removal requests can also be perpetrated from social media accounts that impersonate your brand.

How To Mitigate Or Prevent

There is no way to completely protect yourself against this type of attack but there are a number of steps to mitigate the risk of a malicious actor being successful.

#1. The first would be to own all the top level domains that match your main domain. So if your site is greatcompany.com, then buy the .net, .org and the main country TLD where you operate, whether it be .co.uk or .ru.

#2. The next step would be claiming all of your brand accounts on social media. Having a legitimate account with a little bit of activity on each of the major social media platforms should be enough to raise alarm bells if someone tries link removal outreach via a different account.

#3. Finally, adding some security to your email service such as SPF, DKIM and DMARC records will stop anyone from spoofing your company email addresses.

Which brings us on to email. Most people wouldn’t classify this as a negative SEO attack, but at one time, spam reports could have your server IP pulled.

Email Blacklisting Or Reducing Sender Score

Email is the life-blood of most businesses’ communications. It’s used for everyday back and forth, in the sales process, in the generation of leads, and it delivers all sorts of essential information between systems.

Disrupting a business’s communications could lead to a massive decrease in revenue and might not be identified until it’s too late.

One very important aspect that affects email deliverability is a domain’s ‘sender score’. It’s calculated by the ISP (or email service provider) based on a number of factors, including email open rates and spam reports. If an attacker can lower this score, your company’s emails will take a hit.

This can be done by sending spammy emails purporting to be from your domain.

Imagine hundreds of thousands of emails being sent from your spoofed domain with subject lines like:

  • Buy Cheap C*alis — International Delivery!
  • College E**ays | Best Prices | Quick TAT
  • Beautiful Slavic M*il O*der Br*des — From The Russian Steppes To Your Doorstep

A sustained attack of this nature could get your domain blacklisted and stop all of your email from reaching its intended target.

How To Mitigate Or Prevent

#1. The main way to prevent this type of attack is by installing SPF, DKIM and DMARC records for your email server. You might also consider moving your email to a subdomain of your main domain.

#2. Checking your domain’s sender score and whether your domain can be found on the email blacklists might be overkill unless you notice issues.

This is not a common type of attack, as far as I am aware, but it is one that could be devastating to many companies. It’s likely that the issue wouldn’t be diagnosed for weeks, and in the meantime, all those missed sales would be gone.

Setting up the needed records in your DNS and with your email provider shouldn’t be overlooked.
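
As a rough illustration, a basic SPF and DMARC setup boils down to two DNS TXT records like the ones below. The sketch assumes yourcompany.com sends its mail through Google Workspace and that you want failing mail quarantined; the exact values depend on your own email provider, and the DKIM record is generated by that provider rather than written by hand.

; SPF: only Google Workspace servers may send mail for this domain (illustrative)
yourcompany.com.        IN TXT "v=spf1 include:_spf.google.com ~all"

; DMARC: quarantine mail that fails SPF/DKIM alignment and send aggregate reports
_dmarc.yourcompany.com. IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@yourcompany.com"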

Hot Linked Images

Image hot linking is when another site embeds your image, served from your server, on their own pages. More often than not, they will not have permission, and the low quality site doing it will typically have thousands of pages covering various niches.

The category pages are tagged ‘index’, whilst the individual pages for each image are tagged ‘no-index’. Very often these sites are built on subdomains of sites with TLDs like .tk or .xyz, and also on Blogspot sites.

I’ve seen sites ranking locally and nationally in Ireland, the UK and the US powered mainly by these type of links.

The problem is that once your images are being hot-linked by a couple of these sites, you will get caught up in more and more over time. When your site tips over some unknown ratio, you’ll see rankings and traffic plummet.

Is this a directed negative SEO attack? In most cases, probably not. But it is an attack vector that can be prevented.

How To Mitigate Or Prevent

#1. Preventing hot linking of images is relatively simple. It just requires adding a little bit of code to your .htaccess file.

RewriteEngine on
# Allow requests with an empty referer (direct visits, some proxies and privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from your own site, with or without www, over http or https
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.com/ [NC]
# Return 403 Forbidden to everyone else requesting these file types
RewriteRule \.(gif|jpg|jpeg|bmp|zip|rar|mp3|flv|swf|xml|php|png|css|pdf)$ - [F]

You will of course need to change ‘yoursite.com’ to your own domain name.

#2. This blocks the hot linking of the images, but a more preventative method would be to block all unwanted bots in both your website’s .htaccess/nginx.conf file and in robots.txt.

Blocking malicious bots has a couple of benefits, including limiting the amount of information about your site that is freely available to competitors, and is always a good idea.
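
As a minimal illustration, the .htaccess rules below turn away a few crawlers by user agent. The bot names are only examples; a maintained list such as the ones mentioned in the next section will be far more thorough.

# Block selected crawlers by user agent (example names only; expand with a maintained bot list)
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|MJ12bot|SemrushBot) [NC]
RewriteRule .* - [F,L]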

But this brings us neatly along to the next topic.

Scraped Content

Imperva/Incapsula reported earlier in 2020 that up to 37% of all internet traffic is some form of automation. Personally, I’d like to see exactly how they arrived at that figure, but as proxy providers have entered the corporate world servicing big data, content scraping is becoming more pervasive.

Whilst content scrapers and other bots pollute your analytics, from a marketing perspective, it’s not the major problem that it once was.

Google, and to an extent the other search engines, have become much better at attributing ownership to the original content author.

The main problem arises if your site (or a page) goes offline for a time. This could be down to a server error, a misconfigured index tag, or unpublishing a page by mistake.

Given enough time, Google will reassign the content’s authorship to the next most authoritative source. In this case, a low quality scraper site would now own your content.

To check how many sites are scraping your content, grab a random sentence and Google it surrounded by double quotes.

How To Mitigate Or Prevent

A determined adversary will be able to circumvent any controls you try and put in place, but most content scraping bots are simple and not targeted.

#1. Block bots. Stopping bots that declare their user agent is the first and most obvious step. This just requires editing your .htaccess or nginx.conf files. CCarter has a regularly updated bot list on BuSo, with Mitchell Krog’s being another good option.

#2. Invest in a Web Application Firewall (WAF) — Whether you opt for a paid service such as Cloudflare or AWS, or go for an open source solution such as ModSecurity, a firewall will prevent a large number of bots from hitting your sites.

#3. Modify your RSS Feeds — RSS feeds have a range of different uses but can be exploited by scraper sites. Modify your feeds to serve only a summary that doesn’t include any hyperlinks.

#4. DMCA Request — If scraper sites are causing you issues, your last resort may be a DMCA takedown request.

#5. Google Copyright Infringement Tool — Google provides a tool for reporting copyright infringement. If your report is successful, the infringing party is notified via Google Search Console and the infringing page is delisted. Be aware that this tool is itself open to abuse by bad actors for negative SEO.

(Distributed) Denial Of Service Attack

Picture this: it’s Black Friday, the busiest day of the year for online sales, and your e-commerce site is under constant attack. A DOS attack will affect your site’s responsiveness, and a slow server means visitors and customers will drop off like flies.

A sustained attack could overwhelm your server under the load of connections, robbing you of sales for hours.

Depending on the traffic to your site, prevention is probably better than cure.

How To Mitigate Or Prevent

Like other automated attacks, the best way to mitigate a DDOS is by paying for enterprise level protection.

#1. Invest in a WAF  — A Web Application Firewall will protect your site against all but the most leveraged types of Denial of Service Attack.

The next type of attack is related to a DDOS but with a twist.

Slow Loris

A slow loris attack’s main aim is to slow down your server by limiting the number of connections available to real visitors and Googlebot. The attacker sends partial HTTP requests that are never completed and, one by one, ties up all of your server’s sockets.

If Googlebot is unable to fully crawl your site several times in a row, your rankings will drop site-wide.

Most modern Apache servers are configured to prevent this type of attack out of the box, and whilst I’ve heard of a modified version called Goloris used against Nginx servers, I’m not familiar with it.
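
For older or heavily customised Apache setups, the usual defence is the mod_reqtimeout module, which drops connections that send their headers or body too slowly. The snippet below is an illustrative configuration rather than a tuned recommendation; adjust the timeouts and minimum rates to suit your own traffic.

# Requires mod_reqtimeout (bundled with recent Apache releases)
<IfModule reqtimeout_module>
    # Give clients 20-40 seconds to send their headers, extending the window only
    # while they keep delivering at least 500 bytes per second; same idea for the body.
    RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
</IfModule>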

Poisoned Canonical

Anyone who has bought domains on the secondary market is aware of how diligent you have to be before buying a dropped domain. You have to check the existing and historical links and anchors for indications of spam.

You have to eyeball the domain’s archived history in the Wayback Machine to see if it was used to sell knockoff Nikes or Ugg boots at any stage, and then sleuth for a missing year, just in case the spammers were blocking the Internet Archive’s bot “ia_archiver”.

But it’s par for the course that after doing all the checks, you’ll pick up a domain, load it up on a server, register it in the Search Console only to see a penalty of some sort appear after a few days. Or worse still, it’ll just never perform.

The point is that penalised domains are not uncommon. They can be bought and used for negative SEO. That might be done by redirecting them at a target’s site, but more recently bad actors have been exploiting the canonical tag.

The canonical tag tells the search engines which version is the master copy of a page. In the eyes of Google, it combines similar pages into one, including the links associated with each canonicalised page.

Using the canonical tag, an attacker can tell Google that your domain is the master copy for the penalised domain and Google will share the penalty with your site.
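
In concrete terms, the attacker adds something like the tag below to the head of pages on the penalised domain, declaring your page as the canonical version. The URL is a placeholder; what you want to watch for is your own URLs appearing as canonicals on domains you don’t control.

<!-- Placed on pages of the penalised domain, pointing at the victim's page -->
<link rel="canonical" href="https://your-site.com/important-page/" />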

Straight Up Hacking

If someone is determined and patient, they will be able to breach your security at some stage, whether through a simple password or a 0day vulnerability. The key is not to leave the key in the lock: take basic precautions.

Once an attacker has control of your site, there is a range of things they can do, from defacing the site to adding links to gambling domains, or quietly changing your robots.txt file to disallow all crawling.
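
That last one is easy to miss, because the site still looks perfectly normal to human visitors. A sabotaged robots.txt that shuts out every crawler is only two lines, so the file is worth monitoring for changes:

# A robots.txt like this tells every crawler, including Googlebot, to stay away
User-agent: *
Disallow: /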

Conclusion

Whilst negative SEO attacks are unscrupulous, I don’t believe they are against the law, with the clear exception of hacking. Add to this the fact that they can be difficult to identify and, when done right, impossible to trace, and you have a recipe for disaster.

Most SEOs are not in the business of destroying the competition out of spite and want to compete on a level playing field. Negative SEO is far less common than you’d think.

If you are considering negative SEO as a strategy for beating the competition, understand that it’s not an exact science and you may end up strengthening the domain that you were trying to impact.

Also consider the moral issues: if you negatively affect someone’s business, you are not just hurting the owner, but also the employees and their families. Plot these effects on a timeline and you might be casting a very long shadow.
