The major search engines and many SEO experts extol the virtues of developing websites that are easy to categorize and navigate, and that contain frequently updated content popular enough to generate organic backlinks and user distribution. By doing so, one can ensure maximum visibility in search results. This is known as white hat SEO – the practice of optimizing a website’s search results through the ethical and strict application of search engine best practices and guidelines.

By contrast, many SEO professionals use what are known as black hat SEO strategies. This involves the application of SEO strategies and tactics that run counter to search engine guidelines. More fundamentally, black hat SEO professionals try to fool search engines into thinking that their website is more relevant, popular, and/or useful than it actually is.

From a business perspective, black hat SEO should be avoided or at least approached with caution. Black hat SEO can indeed lead to short-term improvements in search engine ranking, site traffic, and even conversions. Nevertheless, the punitive measures a firm may face if caught using these techniques can be time-consuming and expensive to correct. Therefore, firms should avoid incorporating black hat strategies into long-term strategic marketing and/or digital plans.

A compromise between white hat and black hat SEO is known as gray hat SEO. This involves using black hat SEO techniques sparingly enough to avoid the punitive measures commonly employed by search engines. A combination of white hat and gray hat strategies can indeed effect long-term improvements but must be carefully deployed and continuously monitored for best results.

11 Black Hat SEO Strategies You Should Avoid


For firms that decide black hat SEO techniques are in their best interest, or that plan to employ gray hat strategies, this article provides an overview of the most common black hat techniques. These include 1) keyword stuffing, 2) meta tag/description stuffing, 3) cloaking, 4) hidden text, 5) article spinning, 6) doorway pages, 7) duplicate content, 8) unnatural links/link farming, 9) comment spamming, 10) parasite hosting, and 11) Google bombing, as well as 12) other black hat techniques.


SEO practitioners who saturate text-based content with target keywords engage in a black hat practice known as keyword stuffing.

Search engine crawlers index websites, in part, through analysis of the keywords and key phrases on a website’s pages. Some SEO practitioners incorporate large numbers of similar keywords and phrases into webpage content, whether or not they are relevant to the pages in question. Generally, a keyword density of between 2% and 4% is considered optimal; anything above that is considered black hat. In some cases, this results in grammatically incorrect or even unreadable pages. Nevertheless, these practitioners hope to improve their website’s chances of ranking high in searches for those keywords and phrases.
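The density figure above can be checked with a few lines of code. A minimal sketch, assuming a crude whitespace-and-regex tokenizer (the sample text is purely illustrative, and a production tool would also handle stemming and phrases):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

sample = "Buy widgets here. Our widgets are the best widgets for widget lovers."
print(keyword_density(sample, "widgets"))  # 25.0 -- far above the 2-4% range
```

At 25%, the sample would be an obvious candidate for a keyword-stuffing penalty.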

This is a widely known practice and penalties can be severe. In 2011, Google released Panda – a series of updates that discouraged various black hat abuses, including keyword stuffing. This resulted in significantly diminished search engine rankings for many websites.

In general, those who choose to employ this technique should use it sparingly and ensure that the keyword-stuffed content they produce is not unreadable or of low utility. Content such as instructions for a DIY task or an entertaining popular-culture piece may be popular enough for readers to overlook the repetition and generate the traffic, backlinks, and social shares that, along with the extra keywords, improve webpage visibility.


In addition to keyword stuffing, SEO practitioners may engage in meta tag/description stuffing. This is the practice of filling meta tags and meta descriptions with keywords and phrases, again, in an effort to convince search engine crawlers to index and rank webpages high for those keywords and phrases.

Because meta tags and meta descriptions are, by nature, self-reported descriptions of a website’s content, it is safe to say that web crawlers pay far more attention to the site content itself than to metadata. Meta tag/description stuffing may fool a crawler into ranking a webpage high for particular keywords and phrases, but without keyword stuffing or appropriate keyword usage in the content, any change in search engine ranking due to this technique alone will be extremely brief. Further, it may draw negative attention to the website.

For those practitioners who choose to use this technique, meta tag/description stuffing should be paired with either a strong content strategy or else keyword stuffing.
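Stuffed keyword meta tags are also trivially detectable. A naive regex-based check is sketched below; the tag format assumed here is simplified, and a real crawler would use a proper HTML parser rather than a regex:

```python
import re

def meta_keyword_count(html):
    """Count comma-separated entries in a <meta name="keywords"> tag (naive regex, for illustration)."""
    match = re.search(r'<meta\s+name="keywords"\s+content="([^"]*)"', html, re.IGNORECASE)
    if not match:
        return 0
    return len([entry for entry in match.group(1).split(",") if entry.strip()])

page = '<meta name="keywords" content="shoes, cheap shoes, buy shoes, shoes online, best shoes">'
print(meta_keyword_count(page))  # 5
```

A long tail of near-duplicate entries, as in the example, is exactly the pattern this black hat technique produces.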


Cloaking is a black hat strategy in which SEO practitioners code their webpages and sites in such a way that crawlers see one version, and users see another. This is antithetical to search engine best practices, which, essentially, call for websites to be designed for humans as if search engines did not exist. In fact, one of Google’s Webmaster Guidelines is “Don’t deceive your users or present different content to search engines than you display to users.”

Cloaking is achieved by server-side scripts, which identify the IP address or the User Agent HTTP header of the user accessing the page. Usually, those users identified as web crawlers are served keyword and meta tag/description stuffed content while all others are served human-friendly content.
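The User-Agent branch at the heart of cloaking can be sketched in a few lines. The bot signatures and page contents below are placeholders, and this is a sketch of the mechanism rather than anyone’s actual implementation:

```python
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")  # placeholder bot User-Agent substrings

def select_content(user_agent, crawler_page, human_page):
    """Serve the keyword-stuffed page to anything that looks like a crawler -- the cloaking trick."""
    agent = user_agent.lower()
    if any(signature in agent for signature in CRAWLER_SIGNATURES):
        return crawler_page
    return human_page

print(select_content("Mozilla/5.0 (compatible; Googlebot/2.1)", "stuffed page", "human page"))  # stuffed page
```

In practice, this branch would sit in a server-side request handler, possibly combined with reverse-DNS checks on the visitor’s IP address.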

Perhaps the most notable example of cloaking occurred in 2006, when automobile manufacturer BMW, having employed an SEO firm that used a cloaking strategy, was penalized severely by Google: its site was delisted from search results entirely. Luckily for the firm, the delisting was short-lived.

Cloaking has been used to highlight site content in instances when a site’s design is friendly to humans but not particularly friendly to search engine crawlers (i.e. the site contains considerable images and/or animation). This is the ideal use for this black hat SEO technique.


Similar to cloaking is hidden text. This black hat technique refers to text that is configured to be invisible or unreadable. SEO practitioners may hide links or text to increase keyword density on a page or to “improve” internal link structure. Crawlers usually consider hidden text spam, though just as with cloaking, hidden text can be used sparingly to highlight text coded in JavaScript or Flash that is unreadable to or ignored by most search engines.
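Crawlers flag hidden text by looking for telltale styling. A simplified detector covering a few common inline-CSS tricks is sketched below; the pattern list is illustrative and by no means exhaustive (real detectors also evaluate computed styles, off-screen positioning, and text colored to match the background):

```python
import re

HIDDEN_PATTERNS = (
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
)

def has_hidden_text(html):
    """Return True if the markup uses a common inline-CSS trick for hiding text."""
    return any(re.search(pattern, html, re.IGNORECASE) for pattern in HIDDEN_PATTERNS)

print(has_hidden_text('<div style="display:none">cheap shoes cheap shoes</div>'))  # True
print(has_hidden_text('<div>visible content</div>'))  # False
```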


Search engines do assess the frequency of content updates as a measure of compliance with their best practices. White hat SEO professionals either rely on their in-house marketing and/or copywriting teams to develop content or purchase content that is aligned with the firm’s strategic objectives, branding standards, and SEO objectives.

However, many SEO practitioners who lack such resources get around this through on-site article spinning – duplicating content by rewriting it, usually multiple times, and publishing each version as a distinct page at periodic intervals. This content is repetitive and often poorly written; it generally has low value for the website visitor. Generally, SEO practitioners pair article spinning with keyword stuffing.

More frequently, article spinning is used as an off-page black hat technique for backlinking purposes. Practitioners create multiple versions of text-based content, embed backlinks to their site, and submit each version to a different site for publication. Many practitioners use article-spinning software for this purpose (though such software often creates unintelligible text).
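Article-spinning software typically works by word-level synonym substitution, which is why its output so often reads awkwardly. A toy, deterministic version of the idea is sketched below; the synonym table is a made-up example, and real spinners use far larger thesauri:

```python
SYNONYMS = {
    "cheap": ["affordable", "inexpensive"],
    "buy": ["purchase", "order"],
    "fast": ["quick", "rapid"],
}

def spin(text, variant=0):
    """Produce variant N of `text` by cycling each known word through its synonym list."""
    result = []
    for word in text.split():
        substitutes = SYNONYMS.get(word.lower())
        result.append(substitutes[variant % len(substitutes)] if substitutes else word)
    return " ".join(result)

print(spin("buy cheap shoes fast", 0))  # purchase affordable shoes quick
print(spin("buy cheap shoes fast", 1))  # order inexpensive shoes rapid
```

Because the substitution ignores grammar and context, each "new" article is easy for both readers and duplicate-content algorithms to recognize.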

The practice of article spinning is one of which Google and the other major search engines are well aware; indeed, Google’s Panda and Penguin updates were designed, in part, to penalize sites that employ this practice. A better strategy is employing freelance or in-house writers to rewrite content from different perspectives, which may yield sustainable backlinks and content considered fresh for SEO purposes.


Doorway pages are groups of poorly written pages optimized for specific keywords. These pages usually have little content, in favor of calls to action and links that drive users through multiple webpages to a single landing page. Some practitioners use doorway pages to redirect user traffic to sites without their knowledge.

As an example, a firm may send out an email about a breaking news story. The recipient may click on the embedded link, see a page with a teaser image concerning the story, and be told to click on a link below. The recipient may then be directed through a second page, or several similar pages, all driving them eventually to a landing page. When these doorway pages mimic the landing page’s links, design, and navigation, they are known as content-rich doorways.

As with the aforementioned practices, doorway pages are on the radar of the major search engines, which have taken, and will continue to take, punitive action against sites that employ them. Doorway pages can be used legitimately to ease navigation by redirecting traffic internally or externally at the user’s direction. However, too many redirects can have a negative impact on SEO.


Duplicate content can refer to on-site article spinning. However, the phrase also denotes two other common black hat practices. The first refers to the establishment of affiliate programs for website promotion. Practitioners will offer a set of template webpages designed to drive traffic to their website to a group of affiliates. These template websites offer minimal original content, serve no purpose other than SEO, and can quickly be delisted.

The other instance of duplicate content is more commonly known as plagiarism – lifting content from another published site and publishing it as one’s own. Plagiarism can also refer to paraphrasing said content without attribution, in a manner such that it is still clearly someone else’s work. Because Google, Bing, and Yahoo are all invested in indexing the maximum amount of original content and offering it to search engine users, plagiarism is frowned upon. Further, it can lead both to legal consequences, as the original author can initiate civil proceedings for recompense and damages, and to reputational harm.

Anti-plagiarism applications, available widely online, can quickly detect duplicate content, preventing accidental publication. Moreover, anti-plagiarism algorithms in search engines can detect duplicate content and penalize the websites on which it appears.
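Duplicate-content detection commonly relies on comparing word n-grams ("shingles") between documents. A minimal version of the idea, assuming simple whitespace tokenization (real systems hash the shingles and compare at web scale):

```python
def jaccard_similarity(text_a, text_b, n=3):
    """Jaccard similarity over word n-grams -- 1.0 means the shingle sets are identical."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(max(0, len(words) - n + 1))}
    set_a, set_b = shingles(text_a), shingles(text_b)
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)

original = "the quick brown fox jumps over the lazy dog"
print(jaccard_similarity(original, original))  # 1.0
print(jaccard_similarity(original, "an entirely different piece of text goes here"))  # 0.0
```

A copied or lightly spun page scores close to 1.0 against its source, which is why verbatim duplication is so easy to catch.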


Rather than pursuing organic links through white hat techniques, some SEO professionals obtain links from link farms or buy them from other third parties. While this is not necessarily a bad strategy, irrelevant links (links from sources that are obviously unrelated to a particular website) have a negative effect on SEO. Practitioners looking to obtain links from a third party should carefully vet the provider site for quality, content, and consistency with the firm’s brand to avoid penalties for irrelevant links.


A practice more prevalent in the early days of SEO, comment spamming involves posting comments on blogs and websites that include backlinks to one’s own website. However, in 2005, Google introduced the nofollow attribute, which, when applied to a link, instructs crawlers to bypass that link when indexing sites. Yahoo’s and Bing’s crawlers follow nofollow links but do not include them in their ranking calculations.
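The nofollow mechanism is simply an attribute on the anchor tag, and blog platforms commonly add it to user-submitted links automatically. A naive regex sketch of that step is shown below; a real implementation would use an HTML parser rather than a regex, and the comment text is invented:

```python
import re

def add_nofollow(html):
    """Add rel="nofollow" to anchor tags that do not already carry a rel attribute."""
    return re.sub(r'<a\s+(?![^>]*\brel=)', '<a rel="nofollow" ', html, flags=re.IGNORECASE)

comment = 'Great post! <a href="https://example.com">check my site</a>'
print(add_nofollow(comment))
# Great post! <a rel="nofollow" href="https://example.com">check my site</a>
```

Once the attribute is in place, the spammed backlink no longer passes ranking value.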

The ability of webmasters to nullify such links has reduced, but not eliminated, the practice. Moreover, some webmasters do not employ the attribute, making comment spamming a viable strategy for some practitioners. Further, the links themselves can generate additional direct traffic.


Anyone who has ever clicked on a website in search results and been taken to another, completely different website may have seen parasite hosting in action. Parasite hosting involves hosting a website or pages on another party’s server without permission, for SEO purposes. This illegal practice first became prevalent in 2007 and has no gray or white hat counterpart. It is nonetheless still widely used by practitioners, who often employ a script that redirects traffic away from the website link returned in search results to a site of their own choosing. Because the hijacked website shows no other signs of being compromised, this can be difficult to detect without vigilant and consistent security measures.


Google bombing is a practice some SEO professionals use to reduce their competitors’ rankings. Once used mainly to pull pranks, Google bombing refers to a set of tactics for manipulating the search results of competitors. These can include creating negative pages about competitors, often with a viral hook; media campaigns to generate user-created webpages that redefine or recontextualize a particular keyword or key phrase; and media campaigns to boost the popularity of trending terms and phrases (usually negative ones).

Google bombing can be effective for short-term visibility – both on- and offline. However, the major search engines often quickly limit their effectiveness once discovered. Google bombs may, however, sustain a lasting public relations impact.


There are many other black hat techniques including:

  • Pingback spam: Some practitioners set up scripts to notify search engines of content repeatedly to create the illusion that it is new.
  • Cybersquatting: This involves registering a trademarked name as a domain, which is then used either to redirect traffic to a site of one’s own choosing or to post promotional content.
  • Typosquatting: This involves purchasing a domain name that is a misspelling of a competitor’s brand name and redirecting traffic from it to one’s own site.
  • Cookie stuffing: Practitioners employ this technique when they place cookies on a user’s computer without their knowledge.
  • Social networking spam: Practitioners may flood social networks with spam comments promoting their website and/or target specific users as recipients of these messages.
  • Page swapping: Some practitioners “swap” the content of a high-performing page with content from a low-performing one.
  • Scraper sites: Scraper sites refer to websites created by programs that “scrape” other sites for unique content and amalgamate it. These sites are often filled with ads and/or used to promote a practitioner’s main site.
  • Translated sites/machine translation: Practitioners may use programs to translate content into other languages automatically for SEO purposes. This often yields unintelligible pages.
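The squatting techniques above generate candidate domains mechanically. One common typosquatting pattern, single-character omission, is sketched below (example.com is used purely as an illustration; real typosquatters also generate adjacent-key, doubled-letter, and transposition variants):

```python
def omission_typos(domain):
    """Return single-character-omission misspellings of a domain's name part."""
    name, dot, tld = domain.partition(".")
    return sorted({name[:i] + name[i + 1:] + dot + tld for i in range(len(name))})

print(omission_typos("example.com"))
# ['eample.com', 'examle.com', 'exampe.com', 'exampl.com', 'exaple.com', 'exmple.com', 'xample.com']
```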

As with all black hat SEO techniques, these may generate short-term improvements in search engine results. However, they may eventually subject a website to punitive measures, as well as alienate website visitors.
