When webmasters fear getting Google penalties, most of them think of the dreaded algorithm updates: Penguin, Panda, the Top Heavy algorithm, etc. Get caught in one of these algorithm sweeps, and you could lose some or even all of your organic search traffic.
But Google actually has a lot more in its arsenal than just algorithms to encourage you to follow their Webmaster Guidelines. Ever heard of a Manual Action Penalty?
What’s a Manual Action Penalty?
Google has teams of search engineers tasked with the job of reviewing individual websites and, if necessary, assigning a rank penalty.
When Google runs Penguin, sites across the web can take a rank hit. A Manual Action Penalty means your site alone has been hit, and it’s your problem to fix.
Manual Actions are the most common type of Google penalty you can get. If you have one, you’ll see a message about it in Search Console.
You can check for yourself right now: just go to Search Traffic > Manual Actions and see if you have a message.
If Google has penalized your site with a manual action, you’ll see a message here describing the penalty. You’ll have either a Site-wide Match (meaning your whole site is affected) or a Partial Match (meaning only certain pages of your site are penalized).
Google Search Console Help lists 12 common manual actions you can receive (and notes that the list isn’t exhaustive):
- Hacked site
- User-generated spam
- Spammy freehosts
- Spammy structured markup
- Unnatural links to your site
- Thin content with little or no added value
- Cloaking and/or sneaky redirects
- Cloaking: First Click Free violation
- Unnatural links from your site
- Pure spam
- Cloaked images
- Hidden text and/or keyword stuffing
Some of them, like the Pure spam or Spammy freehosts penalty, aren’t likely to happen to your average webmaster (unless you own a blatant spam site or host a lot of them).
Many webmasters, though, could be at risk for the other manual actions.
The good news is you can fix the problems Google’s engineers found on your site, and then request a review of the Manual Action in Search Console.
Google’s engineers will review it, and if they approve it, they’ll remove the penalty and allow your pages to start gaining rank again.
Here are five common types of penalties my clients have gotten, and a walk-through of how we’ve helped their sites recover from each.
1. Unnatural Links
There are two different kinds of “Unnatural Links” penalties Google has:
- Unnatural Links from Your Site: You are hosting unnatural, artificial, or deceptive outbound links on your site.
- Unnatural Links to Your Site: You have unnatural, artificial, or deceptive backlinks pointed at your site.
These manual actions are in line with Google’s Penguin update, and are meant to penalize sites that participate in link exchanges or buy or sell links to manipulate rank.
If you have an unnatural links penalty, here are some examples of the kind of links you need to fix:
- Paid links
- Links acquired from link exchanges
- Spammy guest post links
- Automatically generated links
- Spammy forum comment links
- Irrelevant backlinks
- Links from low-quality directory or bookmark sites
Clean up the Links From Your Site (Outbound)
Use a link analysis tool like Ahrefs or Majestic to get a list of your outbound links. SEOChat’s Website Crawler is another free option that will analyze 100 pages for you without registering.
Download a list of external links from the report.
Identify any links on your site that are against Webmaster Guidelines. Once you find them, you can either:
- Remove the links
- Route them through an intermediate page that’s blocked by robots.txt, so no PageRank passes
- Set the links to “nofollow”
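To illustrate the third option: a paid or exchanged link can stay visible to your readers while being marked so it no longer passes PageRank. A minimal sketch (the URL is a placeholder):

```html
<!-- Before: a followed outbound link that passes PageRank -->
<a href="https://example-sponsor.com/">Great hosting deals</a>

<!-- After: the same link marked nofollow, so it no longer passes PageRank -->
<a href="https://example-sponsor.com/" rel="nofollow">Great hosting deals</a>
```

This is usually the least disruptive fix, since the link keeps working for human visitors.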
Clean up the Links to Your Site (Inbound)
You can get a list of links pointed at your site using your backlink analyzer of choice, or you can use Search Console. Just click “Search Traffic,” then “Links to Your Site,” and you can download a list.
Find any backlinks that are against Webmaster Guidelines.
Next, you’ll need to send out take-down request emails to the webmasters hosting them. If they don’t respond to you, then as a last resort, use Google’s Disavow Tool to tag the links so they don’t pass PageRank.
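If you do end up disavowing, the Disavow Tool expects a plain-text file with one entry per line. A short sketch of the format (the domains here are placeholders):

```text
# Example disavow.txt -- lines starting with "#" are comments.

# Disavow every link from an entire domain:
domain:spammy-directory-example.com

# Or disavow a single page:
http://link-farm-example.net/some-page.html
```

Prefer the `domain:` form when a whole site is spammy, so new links from it are covered too.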
Once you’ve cleaned up your links, you can move on to submit a reconsideration request.
2. User-Generated Spam
If you’ve gotten a User-Generated Spam penalty, it doesn’t mean you’re a spammer – but your site users are. As far as Google’s concerned, it’s up to you to clean up spammy content people post to your:
- Forum pages
- Guest book pages
- Blog post comments
- Other site areas
Mozilla famously got penalized by this Manual Action a while back. Here are some user-generated spam examples from their site:
If you haven’t already, the first thing you’ll want to do is install some kind of anti-spam software. Akismet is a popular WordPress tool that will detect and filter out some of your spam comments.
Hopefully, this does most of the cleanup work for you, but don’t stop there. You need to manually go through and remove any spam that got through the filters.
Look out for things like:
- Posts that are blatant advertisements
- Posts with gibberish text
- Posts with off-topic links (Probably the most common type of comment spam I see)
- Commercial-sounding content (Think payday loans, discount insurance, libido enhancers, etc.)
- Auto-generated comments
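The signals above can be sketched as a simple triage heuristic. This is a minimal illustration, not a real spam filter; the phrase list and thresholds are assumptions you’d tune for your own site:

```python
import re

# Illustrative commercial phrases drawn from the signals above; tune for your site.
SPAM_PHRASES = {"payday loans", "discount insurance", "buy now", "click here"}

def looks_like_spam(comment: str, post_topic_words: set[str]) -> bool:
    """Flag a comment for manual review based on simple heuristics."""
    text = comment.lower()
    # Commercial-sounding phrases are a strong signal.
    if any(phrase in text for phrase in SPAM_PHRASES):
        return True
    # Several links crammed into one comment is another common pattern.
    links = re.findall(r"https?://\S+", text)
    if len(links) >= 2:
        return True
    # Off-topic link: contains a link but shares no words with the post topic.
    if links and not (set(re.findall(r"[a-z]+", text)) & post_topic_words):
        return True
    return False
```

Anything flagged still needs a human look; the point is to shrink the pile, not to delete automatically.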
You should also vet your user profiles, and delete any that might be spam accounts. These are usually auto-generated, have no profile photo, no description, and of course, post a lot of irrelevant comments.
A User-Generated Spam penalty is one you’re likely to get again and again, unless you take a hard line on spam from now on. To help my clients who have had this penalty prevent spam in the future, we:
- Use a CAPTCHA on their sites
- Change all forum links to “nofollow”
- Allow users to report spam
- Consider moderating all comments
If you have a User-Generated Spam penalty, chances are you have a lot of comments and user accounts to go through. Neil Patel’s Quicksprout got this penalty several times and was faced with nearly 350,000 forum users to sift through.
But it’s worth it to be as thorough as possible, because if Google sees there’s still spam on your site, they’ll reject your reconsideration request.
In my experience, you have three options:
- Take the time to go through all your user-generated content yourself.
- Hire someone else to do it for you.
- Delete all your user-generated content.
In the end, number three is what Neil Patel did. The decision is up to you – if user-generated content is central to your site, it might be worth it to clean everything up.
3. Hacked Site
Some of my clients followed Google’s Webmaster Guidelines to a T, but still ended up with a manual action penalty because their site was hacked.
Hacked sites pose a threat to you and your site users, so Google wants you to clean things up.
Keep in mind that if hackers are doing something malicious on your site, you might not even get a “Hacked Site” manual action. I’ve handled several cases where unknowing webmasters have ended up with a “Pure Spam” manual action instead.
Fixing a hack can be a big undertaking, but Google has some helpful resources for what to do.
Here are the basic steps:
Quarantine your site
You don’t know what’s happened to your site or if the problem is spreading.
So the first thing you want to do is take your site offline. You can do this by either stopping your web server or setting up a 503 response code.
You’ll also want to change all the passwords for your site, including system administrators, content management systems, FTP logins, and any other access points.
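As one way to return that 503, here’s a sketch of a temporary .htaccess rule for an Apache server. It assumes Apache 2.4+ with mod_rewrite and mod_headers enabled, and 203.0.113.10 is a placeholder for your own IP address so you can still reach the site while you work:

```apacheconf
# Temporarily return 503 to everyone except your own IP address
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
RewriteRule ^ - [R=503,L]

# A plain-text 503 message, plus a hint to crawlers to retry in 24 hours
ErrorDocument 503 "Site temporarily offline"
Header always set Retry-After "86400"
```

A 503 is better than a 404 or a blank page here because it tells Google the outage is temporary, so your pages aren’t dropped from the index while you clean up.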
Identify the type of hack
Next, you need to figure out what kind of hack you have.
Google may have included details in the message that came with your Manual Action Penalty. If not, check the “Security Issues” section of Search Console.
Here are some of the ways you can be hacked:
- Spam – Someone’s adding spammy pages, links, or text to your site
- Malware – Someone’s installed software designed to harm your visitors’ computers
- Phishing – Someone’s added fake pages designed to trick your users into handing over sensitive information
Eliminate the vulnerability
Before you fix any changes hackers made to your site, you need to figure out how they accessed your site in the first place. If you don’t close this hole, they can continue to damage your site.
The problem could be a lot of things, like:
- A virus-infected computer
- Weak passwords
- Out-of-date software
- Permissive coding practices (like open redirects or SQL injection)
If you aren’t comfortable investigating these possibilities yourself, bring a professional in to do it for you.
Clean up the hack
This is another job you’ll probably want a professional to do. They can remove the malware by hand and help you clean up your servers.
If the hacker got access to confidential user information on your site, you’ll have some legal responsibilities as well. Here’s a helpful resource on what to do in that case.
4. Cloaking and/or Sneaky Redirects
If you’ve gotten a manual action penalty for cloaking or sneaky redirects, one of these things happened:
- Cloaking: You’re showing Google different content (text or images) than what your site visitors see
- Sneaky redirects: Your pages indexed in Google redirect users to completely different content
Here’s what I do with clients to help them recover from both.
Check for cloaking
To figure out what the problem is, use Fetch as Google. This tool will show you how Google sees your site.
Take the URLs of the affected pages from your Manual Actions report, and plug them in.
Compare Google’s rendering of your page to how it appears in your browser. If there are any differences, fix them.
Most cloaking is deliberate, but if you aren’t sure why your pages look different, talk to your web developer, SEO agency, and anyone else who has access to your HTML to diagnose the problem.
Repeat the process, rendering different versions of your site pages, including mobile.
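You can also run a rough cloaking check yourself by fetching a page twice with different User-Agent headers and comparing the visible text. A standard-library sketch; note this only catches user-agent cloaking, since a cloaker keying on Google’s crawler IPs won’t respond to a spoofed User-Agent:

```python
import re
import urllib.request

def visible_text(html: str) -> str:
    """Strip tags, scripts, and styles to approximate what a visitor reads."""
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return " ".join(text.split()).lower()

def fetch_as(url: str, user_agent: str) -> str:
    """Download a page while pretending to be a particular client."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(url: str) -> bool:
    """Compare the page served to a browser UA vs. a Googlebot UA."""
    as_browser = fetch_as(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    as_google = fetch_as(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    return visible_text(as_browser) != visible_text(as_google)
```

Dynamic pages (rotating ads, timestamps) will produce harmless differences, so treat a mismatch as a prompt to investigate, not proof of cloaking.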
Check your redirects
Next, check your redirects using a tool like Screaming Frog. Their report has a “Redirect URI” column so you can analyze each URL destination.
Look for any URLs on your site that redirect somewhere that site visitors probably didn’t want to go. Change these redirects to more relevant pages, or remove the redirect entirely.
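To triage a big crawl export, a small script can pull out just the redirects that leave your site. This sketch assumes a CSV with “Address” and “Redirect URI” columns, as in Screaming Frog’s export; adjust the keys if your crawler labels them differently:

```python
import csv
from urllib.parse import urlparse

def offsite_redirects(csv_path: str, site_host: str) -> list[tuple[str, str]]:
    """Return (source, destination) pairs that redirect off your own host.

    Assumes a crawl export with "Address" and "Redirect URI" columns.
    """
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            destination = (row.get("Redirect URI") or "").strip()
            if not destination:
                continue  # row isn't a redirect
            if urlparse(destination).hostname != site_host:
                flagged.append((row["Address"], destination))
    return flagged
```

Off-site redirects aren’t automatically “sneaky,” but they’re the shortlist worth reviewing by hand.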
Check for deceptive buttons, ads, and plugins
If you use an anti-hotlinking plugin to protect your images and bandwidth, Google could see it as cloaking. You may need to remove the plugin or disable the anti-hotlinking feature.
Also look out for any advertisements on your site that could trick people into clicking on something they wouldn’t have otherwise. These often look like trusted entities but are actually ads.
Any of these things could be responsible for the manual action, so make sure you clean up as much as possible to make Google happy with your site.
5. Thin Content
Google wants to deliver a variety of quality options in search results. If your site is full of shallow, duplicate content, you could get a manual action to keep your pages low in rank.
Here’s what Google means by “thin content,” so you can evaluate your own pages:
Duplicate content from other sites
If you’ve taken content from another site and republished it on yours, this can be considered thin content.
Some webmasters scrape content from somewhere (like Wikipedia), make minor changes, and republish. If you sell products and copy a manufacturer’s product descriptions, that can also count.
Thin content with affiliate links
If you have an affiliate site and publish content with the sole purpose of hosting affiliate links, you could get a thin content penalty.
Google wants to see that your content offers more value than what the merchant already provides. If your site is full of product descriptions and reviews copied from the merchant, you need to seriously improve your content.
Duplicate content on your site
Hosting a lot of identical or very similar pages on your own site can also land you a thin content penalty. I’ve seen SEOs run into this problem when they use doorway pages targeting different regions.
Auto-generated content
Google sees auto-generated content as thin. This can include auto-translated text, automatically spun content, and text generated from scraping RSS feeds.
Once we’ve found all the potential thin content on my clients’ sites, here are the three options I give them for moving forward:
- Delete it. If you scraped someone else’s text or use auto-generated content, this is what you’ll need to do.
- Update it so it’s better quality. That includes rewriting product descriptions, adding substance to affiliate pages, merging doorway pages, and deleting unnecessary duplicate content.
- Remove it from search by adding noindex meta tags. If you have to keep some of your thin content pages, put this meta tag in the <head> section of those pages: <meta name="robots" content="noindex">. Or you can use a tool like SEO by Yoast to do this for you.
After you’ve done your best to clean up your site, you can submit a reconsideration request.
Submitting Your Reconsideration Request
Once we’ve done everything we can to clean up a client’s site and fix whatever caused Google to give them a Manual Action in the first place, it’s time to submit a reconsideration request.
To do this yourself, go back to Search Console and click “Request a Review.”
Then you’ll see a box where you can submit your request.
When my clients reach this step, we try to be as specific as possible, including all relevant information about the cleanup process. Sometimes it’s easier to detail it in a Google Doc or Sheets file, then add that to the review request.
Relevant information for your review request might include:
- A list of bad links you removed from your site
- A list of spam comments you deleted
- Details of your malware cleanup
Also be sure to explain how you plan to prevent the same problem in the future.
After you submit, you should get a confirmation from Google. Then, hopefully within a few weeks, you’ll hear back that the Manual Action has been removed.
If you didn’t do a good enough job and your site’s still violating Webmaster Guidelines, Google will tell you to go back and try again.
The Big Picture
If you get a Manual Action Penalty from Google, it’s not the end of the world. Google created this system to give webmasters an opportunity to clean up their sites and be in line with Webmaster Guidelines.
It could be a lot worse – I know a lot of sites that lost rank from algorithm changes and never fully recovered, no matter what they did to fix the problem.
Just follow the steps in this post to overcome your manual action, and pay close attention to Google’s Webmaster Guidelines overall to avoid another problem in the future.