5 Silly Yet Harmful SEO Mistakes to Avoid


With the rise of comprehensive digital marketing, search engine optimization (SEO) has been pushed somewhat to the sidelines. Why spend your time on keywords, tags, and links (all that technical stuff) when you have paid ads, social media, and content? Yet, though often overlooked, SEO remains crucial to a site’s overall performance in search.

Using the power of on-page and off-page optimization, you can take your site to the next level (i.e. increase rankings, drive more traffic, bolster conversion rates). The bad news is that doing SEO the right way is easier said than done. The field evolves quickly, and Google continues to roll out search algorithm updates, one after another.

It can be hard to stay on top of things and keep your site optimized for the new realities of search. No wonder SEO professionals are constantly skimming the Web to fish out crucial bits of info about upcoming changes and updates. But the problem is, being deeply afraid of missing out on the new stuff, they often make basic-level mistakes.

In this article, we review five silly but harmful SEO mistakes that even professionals make. Avoid them at all costs, because these errors can ruin your entire digital marketing campaign. Let’s dig in!

5 Simple SEO Mistakes to Avoid

#1 Blocking Your Site from Indexing in .htaccess

If you do SEO for a living, you have most likely heard about .htaccess. Basically, it is a configuration file that stores directives blocking or granting access to a site’s document directories.

I cannot stress enough how important the .htaccess file is. If you know how to manage it, you can:

  • Set up redirects and custom error pages
  • Generate cleaner URLs
  • Adjust caching to improve load time
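For illustration, here are a few common .htaccess directives along those lines; the paths, file types, and time spans are hypothetical, so adapt them to your own site:

RewriteEngine On
# Cleaner URLs: map /blog/123 onto the real script
RewriteRule ^blog/([0-9]+)$ blog.php?id=$1 [L]

# Custom error page for broken links
ErrorDocument 404 /404.html

# Browser caching for images to improve load time
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>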

In short, .htaccess is a crucial tool to polish your site’s indexing process and, eventually, rank higher in the SERPs.

However, you need to be a true pro to set up an .htaccess file correctly. A single mistake can have dire consequences. For instance, rules like these block Google and Bing from a whole section of your site:

RewriteCond %{HTTP_USER_AGENT} ^Google.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bing.*
RewriteRule ^/dir/.*$ - [F]

If you see lines of code like these in your site’s .htaccess file, the matching search bots are denied access (served a 403 error), so they cannot crawl or index the affected pages. Ask a developer to delete the code or do it yourself.

Make sure that you check .htaccess every time you start a new project. Some SEOs promote and optimize sites for months, not realizing that all their efforts are in vain. You don’t want to be like them.
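To catch this early, you can automate the check. The sketch below (Python, with illustrative rather than exhaustive patterns) flags .htaccess lines that look like bot-blocking rules:

```python
import re

# Patterns that suggest a rule aimed at search-engine bots. These are
# illustrative; a real audit should still review the whole file by hand.
BOT_COND = re.compile(r"RewriteCond\s+%\{HTTP_USER_AGENT\}\s+\S*(Google|Bing)", re.I)
FORBID_RULE = re.compile(r"RewriteRule\s+\S+\s+-\s+\[F\]", re.I)

def find_blocking_lines(htaccess_text):
    """Return (line_number, line) pairs that look like bot-blocking rules."""
    hits = []
    for number, line in enumerate(htaccess_text.splitlines(), start=1):
        if BOT_COND.search(line) or FORBID_RULE.search(line):
            hits.append((number, line.strip()))
    return hits
```

Run it over the .htaccess of every new project you take on. An empty result does not prove the file is safe, but any hit deserves a closer look.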

#2 Discouraging Search Engines from Indexing in Your CMS

CMS platforms like WordPress, Joomla, and Drupal have built-in settings (and SEO plugins) that can jeopardize your optimization efforts by instructing search engines not to crawl the website. In WordPress, for instance, all it takes is one checkbox: Settings → Reading → “Discourage search engines from indexing this site”. Tick it, and you ask search bots to stay away from your site.

Make sure this box stays unticked, and verify it at least once a week. After all, anyone who has access to the CMS might accidentally tick it, which will undoubtedly have a negative effect on your campaign.

To note: search engines may keep indexing your site even if you tick the ‘discourage’ box, since it is only a request. So, if you really need to close the site from search, do it in the .htaccess or robots.txt files as well.
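For reference, if you genuinely need to keep a site out of search (a staging copy, for instance), the robots.txt equivalent is two lines. Keep in mind that robots.txt only stops compliant crawlers; pages can still be indexed from external links, so a noindex directive is the more reliable tool:

User-agent: *
Disallow: /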

#3 Leaving Your robots.txt File Entirely Open for Crawling

This one is the worst. Remember: never leave your robots.txt file entirely open for crawling, because it can result in serious privacy issues, with administrative and customers’ private pages ending up in search results.

If you are a beginner, make sure you take the time to learn as much as possible about setting up and managing robots.txt files. Act immediately if you check robots.txt and see something like this:

User-Agent: *
Allow: /

This means that search bots can access and crawl every web page on your site, including admin, login, cart, and dynamic pages (search and filters). Keep your customers’ personal pages closed and protected. You don’t want to be penalized for having dozens of spammy dynamic pages as well.

In short, disallow the pages that should be blocked and allow the pages that should be indexed. It sounds simple but takes time to learn.
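As a rough sketch, a more sensible robots.txt might look like this (the paths here are hypothetical, so adjust them to your site’s actual structure):

User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /cart/
Disallow: /*?s=
Sitemap: https://www.example.com/sitemap.xml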

#4 Forgetting to Add rel="nofollow" to Outbound Links

SEOs know that links are still an important ranking factor. Unfortunately, many focus on building backlinks and completely forget that their own sites pass link juice to other sites. What you should do is drive high-quality backlinks while keeping as much link power on your own site as possible.

So, your strategy is simple:

  • Scan your site using a site scanner tool (I use Xenu)
  • Sort links by address to locate outbound ones
  • Create an Excel file with all outbound links (or download a standard HTML report)
  • Review every link in the list and add rel="nofollow" where necessary

Don’t be obsessed with the “nofollow” attribute, though. By saving all the link juice for yourself, you provoke other SEO professionals to nofollow you as well. In short, don’t abuse it.
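The list-checking steps above can be partly automated. Here is a minimal Python sketch using the standard library’s html.parser; the domain and markup below are placeholders, not real data:

```python
from html.parser import HTMLParser

class OutboundLinkChecker(HTMLParser):
    """Collect outbound links that lack rel="nofollow".

    `own_domain` marks which links count as internal; everything else
    starting with http(s) is treated as outbound. A rough sketch, not
    a full crawler.
    """

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.missing_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").split()
        outbound = href.startswith("http") and self.own_domain not in href
        if outbound and "nofollow" not in rel:
            self.missing_nofollow.append(href)

# Placeholder domain and markup for illustration:
checker = OutboundLinkChecker("mysite.com")
checker.feed('<a href="https://othersite.com/page">link</a>'
             '<a href="https://mysite.com/about">internal</a>'
             '<a href="https://partner.com" rel="nofollow">partner</a>')
print(checker.missing_nofollow)  # only the othersite.com link remains
```

Feed it the HTML of each page (or of your exported report) and you get the outbound links that still need attention.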

#5 Failing to Check Your Code in a Validator

Your website consists of code, and the better this code is, the higher your site can potentially rank in the SERPs. This is because neat and clean code allows search crawlers to scan and index your site more efficiently, without leaving a single page behind.

So, every time a new project is assigned to you to promote and optimize, make sure you check the code. You don’t have to be a developer: just copy your site’s URL and paste it into the address field of the W3C Markup Validation Service, then ask a developer to fix any errors it reports. The image below demonstrates a typical validator report:

A typical W3C Markup Validation Service report

While Google doesn’t penalize websites for having invalid bits of HTML and CSS, you’re better off running the validator tool anyway. After all, it doesn’t take much time but improves your site’s performance for both users and crawlers.
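There is no substitute for the W3C service itself, but as a quick local sanity check you can sketch something like this in Python. It only catches grossly unclosed tags and does not implement the HTML5 parsing spec:

```python
from html.parser import HTMLParser

# Void elements never take a closing tag in HTML.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class UnclosedTagChecker(HTMLParser):
    """Track open tags on a stack; whatever is left was never closed."""

    def __init__(self):
        super().__init__()
        self.stack = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        # Only pop a clean match; mismatches stay on the stack as a flag.
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

def unclosed_tags(html):
    checker = UnclosedTagChecker()
    checker.feed(html)
    checker.close()
    return checker.stack
```

An empty list is no guarantee of valid markup, so always run the real validator before shipping.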


Search engine optimization is ever-changing, and you should work hard to keep up with all the tactics and algorithm updates. Keeping your finger on the pulse of SEO is great (a must, actually) but don’t forget about the basic stuff, too. After all, silly mistakes are the most harmful ones.

What do you think? What SEO mistakes do you usually come across? Share your thoughts in the comments section below.

