Why You Should Block Crawler Spiders from SEO Tools You Don’t Use


Let’s talk about blocking the crawler bots from SEO tools you don’t use, and why you should do it. Below are two screenshots: one showing what SEMRush believes my organic traffic is, and one showing what it actually is.

The first is SEMRush’s predicted traffic, based on the keywords and pages it has seen me ranking for. It makes it look a whole lot like my organic traffic is stagnant, or even declining, month over month. Fortunately for me, and unfortunately for my competition, that’s not the case at all.

The second is my actual organic traffic, from the site’s launch through the last day of September: constant growth month over month and no plateau in sight.

According to SEMRush, my organic traffic declines slightly every month. In reality, I had 6,300 organic visitors in September, about 1,300 more than in August. So if my competition wanted to do a little research on me to see how I was doing, they’d think I’m no threat. Little do they know that around the beginning of July, I went ahead and blocked the crawler bots from the major SEO tools I don’t use.

There is absolutely zero reason to let those tools keep tracking your rankings if you don’t plan on using them in the future. All you’re doing is giving your competition an easy way to see exactly what you’re ranking for.

Do you want to give them an open book on how to beat you?

I’m talking about SEMRush, Ahrefs, Moz, Serpstat, Majestic, and any of the other tools you don’t use. I mean, if you do use them to keep up with your rankings or whatever, keep them unblocked. Hell, even if you do use them, block the bots. Is there really that much information they’ll give you that you can’t get from Google Search Console or any other search engine’s webmaster tools?
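If you want to do the same, the usual place is your robots.txt file, since all of these crawlers claim to honor it. Here’s a minimal sketch; the user-agent tokens below are the ones these vendors publicly document (SemrushBot, AhrefsBot, rogerbot and dotbot for Moz, serpstatbot, and MJ12bot for Majestic), but they can change, so verify them against each tool’s own docs before relying on this:

```
# robots.txt -- block the SEO-tool crawlers I don't use.
# Tokens are the vendors' documented user agents; verify before use.

User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: rogerbot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: serpstatbot
Disallow: /

User-agent: MJ12bot
Disallow: /
```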

If you do plan on using a given tool in the future, you can block it now and unblock it later for an update. In the meantime, however, don’t give your competition your playbook.
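Unblocking is just as simple. For example, to let SEMRush refresh its data before you lock it out again, comment its rules back out for a crawl cycle or two, then restore them (a sketch, using the same assumed token as above):

```
# Temporarily re-admit SemrushBot for a data refresh, then uncomment:
# User-agent: SemrushBot
# Disallow: /
```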
