If you’re a website owner, keeping your site visible on Google is likely a top priority. But did you know that your site could be penalized and even removed from Google’s search results for violating Google’s guidelines? In such cases, Google issues a Manual Action: a direct intervention by a human reviewer in response to non-compliance with its policies.
What is a Manual Action?
A Manual Action is when a human reviewer at Google determines that some part of your website violates Google’s guidelines. This can be due to spammy practices, deceptive content, or attempts to manipulate search rankings. Unlike algorithmic penalties (which happen automatically through Google’s search algorithms), manual actions are specifically reviewed by a real person at Google.
If your website is subject to a manual action, it will appear in your Google Search Console, and the affected pages may be de-indexed or ranked lower in search results. This can lead to significant traffic losses, so it’s important to monitor and address any issues promptly.
Why do manual actions exist?
Manual actions exist to maintain the integrity of search results and ensure that users find the most relevant information. Over time, individuals have attempted to manipulate search engines to rank their pages higher, often at the expense of more legitimate or relevant content. This negatively impacts both users, who struggle to find accurate information, and genuine websites, which become harder to discover.
To combat this, Google has worked since its inception to fight spam and help users find quality content. While Google’s algorithms are highly effective at detecting and automatically removing spam, manual actions are sometimes necessary to maintain the quality of search results. This extra step ensures that spammy or manipulative practices are addressed, protecting users and supporting legitimate websites in gaining appropriate visibility.
Why Does Google Issue Manual Actions?
Google’s goal is to provide users with the most relevant, high-quality search results. To do this, they need to ensure that websites follow their Search Essentials (formerly known as Webmaster Guidelines). If a website is found to be violating these guidelines—whether it’s due to manipulative behavior or unsafe content—Google issues a manual action to protect users and maintain the integrity of its search results.
Common reasons for manual actions include:
- Spammy content: Pages filled with misleading or deceptive content, such as fake news or clickbait.
- User-generated spam: If your site allows users to contribute content, like blog comments or forum posts, spammy links or content can result in a manual action.
- Unnatural backlinks: Links pointing to your site that were obtained through manipulative or paid link schemes.
- Cloaking and sneaky redirects: If a website shows different content to users than it does to search engines, this can lead to a manual action.
How to Check for Manual Actions
If you suspect that your site has been hit with a manual action or want to stay proactive, it’s essential to regularly check the Manual Actions Report in Google Search Console.
Steps to Access the Manual Actions Report:
- Log in to Google Search Console – If you don’t have an account, you can create one and verify your website’s ownership.
- Navigate to the “Manual Actions” section – This can be found under the “Security & Manual Actions” tab on the left-hand menu.
- Review any notices – If Google has taken manual action on your site, it will display a notice here, explaining the type of violation and the affected pages.
Common Types of Manual Actions and How to Fix Them
User-Generated Spam
Google has identified that a significant portion of your site is being abused with spam that violates its policies and adds little value to the web. This spammy content may exist in areas where visitors or third parties can post or interact, such as forums, guestbooks, social media platforms, file uploaders, or internal search pages.
The spam is likely generated by site visitors or external parties using your site to promote irrelevant content. Fortunately, Google hasn’t applied a manual action to your entire site, as it believes your site’s overall quality is high; instead, only the pages containing spam will be affected. If left unchecked, excessive spam can harm your site’s user experience, reputation, and search rankings.
To address this issue, review Google’s guidelines on user-generated spam, comment spam prevention, and hacked pages. Identify the areas on your site where users can post content and examine the URLs provided in your Google Search Console messages. Use Google’s “site:” search operator to find spammy or inappropriate content and keywords on your site.
Remove any irrelevant content and prevent future spam by blocking inappropriate terms or using anti-spam measures. After resolving the issues, submit a reconsideration request in Search Console and continue monitoring your site to prevent future spam incidents.
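As a concrete illustration, a simple server-side filter can flag submissions that contain blocked terms before they are published. This is only a minimal sketch; the blocklist and function name are hypothetical, and a production setup would pair it with established anti-spam tooling:

```python
import re

# Hypothetical blocklist; extend it with terms surfaced by your
# Search Console messages or "site:" searches.
BLOCKED_TERMS = ["viagra", "casino", "payday loan"]

# One case-insensitive pattern matching any blocked term.
_pattern = re.compile("|".join(re.escape(t) for t in BLOCKED_TERMS), re.IGNORECASE)

def is_probable_spam(text: str) -> bool:
    """Return True if the submitted text mentions any blocked term."""
    return bool(_pattern.search(text))

# Reject flagged submissions before they are stored or published.
print(is_probable_spam("Cheap viagra here!!!"))    # True
print(is_probable_spam("Great article, thanks."))  # False
```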
Google has detected spam submitted by visitors on pages of your site. This type of spam is commonly found in areas like forums, guestbooks, or user profiles.
Recommended actions:
To resolve this issue, follow these steps after reviewing Google’s policies on user-generated spam:
1. Identify where users can add content on your site, such as forums, blog comments, or user profiles.
2. Check for spammy content, like:
   - Profiles or posts that look like advertisements
   - Off-topic or out-of-context links
   - Commercial usernames, such as “Discount Insurance,” that seem fake or link to unrelated sites
   - Automatically generated posts or profiles
3. Search Google for unexpected spam on your site using the “site:” operator along with keywords like commercial or adult terms unrelated to your site’s content. For example, search for [site:example.com viagra] to find spam related to “viagra” on your site (the short script after this list shows how to generate such queries).
4. Remove any inappropriate content you find.
5. Consider implementing spam-prevention measures for future protection.
6. Once you’ve addressed the issue, select Request Review in the Manual Actions report in Search Console. After submitting, monitor your account for review status updates. If your site is found compliant, Google will revoke the manual action.
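If you have many terms to check, generating the search URLs programmatically saves time. A minimal sketch, assuming a hypothetical example.com domain and a made-up term list:

```python
from urllib.parse import quote_plus

SITE = "example.com"  # hypothetical; substitute your own domain
# Commercial or adult terms that should never appear on your site.
SPAM_TERMS = ["viagra", "casino", "payday loan"]

# Print one Google search URL per term, restricted to SITE via the
# "site:" operator, so each result set can be reviewed by hand.
for term in SPAM_TERMS:
    print("https://www.google.com/search?q=" + quote_plus(f"site:{SITE} {term}"))
```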
Spammy Free Host
When a significant portion of the websites hosted on a free web hosting service is identified as spammy, Google may take manual action on the entire service. While Google strives to target only the offending content, a widespread issue can lead to action against the whole hosting service.
Recommended actions:
- Learn about identifying and preventing abuse on your hosting service.
- Proactively remove any accounts or websites that are contributing to the spam problem.
- Inform the technical team at your web hosting service about the manual action and address the issues.
- Once you’re confident that the service complies with Google’s spam policies, go to the Manual Actions report and select Request Review.
- After submitting a reconsideration request, it’s important to be patient and monitor your Search Console account for updates. Google will notify you once the review is complete. If the issues have been resolved, Google will lift the manual action, and the service can regain its standing in search results.
Following these steps can help ensure your service complies with Google’s guidelines, restoring its visibility and integrity in search.
Structured Data Issue
Google has identified that some of the structured data markup on your web pages may not comply with its structured data guidelines. This could include techniques such as marking up content that is hidden from users, marking up content that is irrelevant or misleading, or engaging in other forms of manipulative behavior.
Structured data is intended to provide search engines with clear, relevant information that enhances the visibility and presentation of your content in search results. However, when structured data is used improperly, such as by marking up elements that users cannot see or including inaccurate information, it misleads both search engines and users, which can lead to search result penalties.
To maintain transparency and accuracy, it is essential that all structured data on your site accurately reflects the content users interact with. Any manipulation or misuse of structured data can harm your site’s reputation in search rankings and limit the effectiveness of rich results. We recommend reviewing and correcting your markup to ensure it complies with Google’s guidelines, thereby ensuring that your website provides users and search engines with accurate, trustworthy information. This will help maintain your site’s visibility in search results while enhancing user experience.
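For instance, product review markup should describe only a rating that visitors can actually see on the page. Here is an illustrative JSON-LD sketch with placeholder values; the point is that every value must match the visible content:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

If the page showed no reviews at all, this same markup would be exactly the kind of mismatch that triggers the manual action.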
Unnatural Links to Your Site
Recommended actions:
- Review Google’s Link Spam Policy: Understand the guidelines to ensure compliance.
- Identify and correct violations:
  - Download a list of your site’s backlinks from Search Console, sorted either by top linking sites or the latest links.
  - Examine this list for any links that violate the link spam policy, focusing on the most frequent or recent links.
  - Contact the site owners to request removal of the problematic links, or ask them to add a rel="nofollow" or similar attribute to prevent passing PageRank.
- Disavow the links you cannot get removed (a sample disavow file follows this list):
  - Make a genuine effort to remove bad links before disavowing them.
  - Use the “domain:” operator to disavow multiple links from the same domain.
  - Avoid disavowing organic, legitimate links.
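For reference, a disavow file is a plain-text list with one URL or “domain:” entry per line; lines starting with # are comments. A hypothetical example with placeholder domains:

```
# Spammy directory that ignored repeated removal requests
domain:spammy-links.example
# A single problematic page rather than the whole site
https://other-site.example/paid-links.html
```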
Thin Content with Little or No Added Value
Google has detected low-quality or shallow pages on your site. Here are a few common examples of pages that often have thin content with little or no added value:
- Thin affiliate pages
- Content from other sources, such as scraped content or low-quality guest blog posts
- Doorways: Doorway abuse is when sites or pages are created to rank for specific, similar search queries. They lead users to intermediate pages that are not as useful as the final destination. Examples of doorway abuse include:
  - Having multiple websites with slight variations to the URL and home page to maximize their reach for any specific query
  - Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
  - Generating pages to funnel visitors into the actual usable or relevant portion of a site
  - Creating substantially similar pages that are closer to search results than a clearly defined, browsable hierarchy
Cloaking and/or Sneaky Redirects
Your website may be displaying different content to users than what is shown to Google, or redirecting users to pages that differ from what Google initially viewed. Such practices, known as cloaking and sneaky redirects, are violations of Google’s spam policies and can result in penalties for your site.
Cloaking refers to the practice of presenting different content or URLs to search engines than what is shown to actual visitors. This tactic is often used to manipulate search rankings, leading Google to penalize or de-index the site. Similarly, sneaky redirects send users to a different page than the one Google indexed, which misleads both users and search engines.
For publishers offering subscription-based or paywalled content, it is essential to distinguish this legitimate practice from cloaking. To do so, publishers should use structured data to mark up paywalled content. This helps Google understand that the content behind the paywall is intended for users who have subscribed, and not an attempt to manipulate search engine rankings. By properly using structured data, you can ensure that Google can correctly index your content without violating their guidelines, allowing your site to maintain good standing with Google’s search policies.
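Google’s documentation describes paywalled-content markup along these lines; this is a trimmed sketch in which the headline and the paywall CSS class are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example subscriber-only article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywall"
  }
}
</script>
```

The cssSelector points at the element containing the gated content, telling Google which part of the page is behind the paywall rather than cloaked.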
Cloaked Images
Some images on your website may appear differently in Google search results compared to how they are displayed on your site. This practice, known as cloaking, involves showing different content to search engines than to human users and is a violation of Google’s spam policies. Image cloaking can negatively impact user experience by displaying incorrect or mismatched thumbnails in Google Image search results, leading users to expect different images than what they actually find on your site.
Examples of Cloaked Images
Common examples of cloaked images include:
- Serving obscured images to Google, such as images covered by blocks of text.
- Displaying different images to Google than those shown to visitors on your site.
To avoid these issues, ensure that the images served to Google and your site visitors are identical. If you need to block images from appearing in Google’s search results, there are specific methods you should follow.
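For example, if you want certain images excluded from Google Images entirely, the documented route is a robots.txt rule for Google’s image crawler rather than serving it different files. A sketch, assuming the images live under a hypothetical /images/private/ path:

```
# Keep Google's image crawler out of one directory
User-agent: Googlebot-Image
Disallow: /images/private/
```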
Recommended Actions
To prevent violations, confirm that your website displays the same images to both users and Google’s search results. Cloaking is only acceptable when opting out of inline linking, as described below. Once you’ve ensured consistency in image display, you can request a review through the Manual Actions report in Google Search Console. Be patient while waiting for the review, and Google will notify you once the manual action has been lifted if no violations are found.
Minimizing or Blocking Images in Search Results
If you want to block full-sized images from appearing in search results, you can opt out of inline linking. To do this:
- Examine the HTTP referrer header when an image request is made.
- If the request originates from a Google domain, respond with an HTTP 200 or 204 status and no content in the body.
This approach prevents the image from being linked directly while still displaying a thumbnail in search results. This is not considered image cloaking and will not result in manual action.
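A minimal sketch of this referrer check, written as a Flask handler; the route, image directory, and the naive suffix test are assumptions for illustration:

```python
from urllib.parse import urlparse

from flask import Flask, Response, request, send_from_directory

app = Flask(__name__)

@app.route("/images/<path:filename>")
def serve_image(filename):
    # Examine the HTTP referrer header of the image request.
    referrer_host = urlparse(request.headers.get("Referer", "")).netloc
    # Naive suffix check for illustration; production code should match
    # the registered domain exactly rather than a bare suffix.
    if referrer_host.endswith("google.com"):
        # 204 with an empty body opts the full-sized image out of
        # inline linking while Google still shows a thumbnail.
        return Response(status=204)
    # Everyone else, including crawls with no referrer, gets the
    # identical image, so this is not considered cloaking.
    return send_from_directory("static/images", filename)
```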
AMP Content Mismatch
There is a discrepancy between the content of the AMP version and its corresponding canonical web page. While the content of the AMP and canonical pages does not need to be word-for-word identical, they should cover the same topic and allow users to complete the same tasks. The goal is to ensure consistency in the user experience, whether visitors land on the AMP page or the canonical version.
When significant differences exist between the two versions, it may lead to issues with how your pages are displayed in Google Search. Specifically, if the AMP page differs too much from the canonical version, Google may apply a manual action. As a result, the AMP version of the page will no longer appear in search results, and only the canonical version will be shown.
To avoid these penalties, ensure that both the AMP and canonical versions provide similar information and functionality, so users have a seamless experience across different formats. By aligning the content, you maintain better visibility in Google Search and offer a consistent experience for all users, whether they access the AMP or canonical version of the page.
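As a quick refresher on how the two versions are tied together, each page declares the other with a link tag (a sketch using a hypothetical example.com article):

```html
<!-- On the canonical page, https://example.com/article -->
<link rel="amphtml" href="https://example.com/article.amp.html">

<!-- On the AMP page, https://example.com/article.amp.html -->
<link rel="canonical" href="https://example.com/article">
```

It is this declared pairing that Google compares when checking whether the AMP content matches the canonical page.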
Sneaky Mobile Redirects
Some pages on this site appear to be redirecting mobile users to content that is not visible to search engine crawlers. These types of sneaky redirects are violations of Google’s spam policies. To maintain the quality of search results for users, Google’s Search Quality team may take action against such sites, which can include the removal of URLs from Google’s index.
Overview
In some cases, displaying slightly different content for mobile users is acceptable. For instance, optimizing content for smaller smartphone screens may involve adjusting images or layout to enhance user experience. Similarly, mobile redirects can be beneficial when they improve the browsing experience, such as redirecting users from a desktop URL (example.com/url1) to a mobile-optimized URL (m.example.com/url1). However, sneaky redirects—where mobile users are sent to unrelated content—can significantly harm the user experience and are considered a violation of Google’s policies.
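A transparent user-agent redirect of the kind described above might look like this minimal Flask sketch; the user-agent hints and the m.example.com mirror are assumptions for illustration:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Crude user-agent hints; real sites often rely on framework or CDN
# device detection instead.
MOBILE_HINTS = ("iphone", "android", "mobile")

@app.route("/url1")
def page():
    ua = request.headers.get("User-Agent", "").lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        # Transparent redirect: m.example.com/url1 serves the same
        # content optimized for small screens. Sending mobile users to
        # unrelated content here is what makes a redirect "sneaky".
        return redirect("https://m.example.com/url1", code=302)
    return "Desktop version of the page"
```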
An example of this poor experience occurs when a URL appears in search results for both desktop and mobile. While desktop users land on URL A, mobile users clicking the same result may be redirected to an entirely different, unrelated URL B. These sneaky mobile redirects can be set up intentionally by a site owner, but they may also happen without the owner’s knowledge due to misconfigurations.
Some common causes of sneaky mobile redirects include:
- Code or scripts that specifically create redirection rules for mobile users.
- Ads or monetization scripts that inadvertently redirect mobile users.
- Hacked elements or scripts injected into the site that maliciously redirect mobile users to harmful sites.
It’s essential for website owners to monitor and prevent sneaky redirects to ensure they comply with Google’s guidelines and provide a consistent experience for all users across devices.