The internet is expanding every day, and thousands of new blogs are created on the web daily. Bloggers and internet marketers all over the world are invested in blogging and building websites for personal or commercial purposes, but in the early stages most of them run into a few common issues with website creation and indexing. One of the most common errors is “Submitted URL blocked by robots.txt”.
The internet is vast, and even Google takes time to crawl the whole web. But even the almighty Google cannot crawl websites that prohibit Googlebot from accessing the entire site or specific URLs. Google's hands are tied if you do not permit its bots to crawl your website, which is why you might receive a notification in Google Webmasters saying “New Coverage Issue Detected”.
What Is A Coverage Issue In Google Webmasters?
A coverage issue in Google Webmasters indicates that your website's coverage in Google results may be negatively affected: Googlebot may not crawl your site, and your recent posts, or even the whole website, may not appear in Google search results.
Google Webmasters is the place where you can manage your website's search appearance, see how frequently your site appears in Google results, and track how it performs there. It also sends you notifications whenever it finds something wrong with your website, and one of the most common is “Submitted URL blocked by robots.txt”.
What Is The Submitted URL Blocked By Robots.txt Problem?
Submitted URL Blocked By Robots.txt is a common problem faced by new bloggers who have just started out in the blogging industry and have little knowledge of indexing and robots.txt.
The problem is simple to fix and involves only a few steps, but first you need a basic understanding of the robots.txt file and of how Google indexing works. Google indexing is the process in which the search engine sends Googlebot to different websites to crawl their content; once the crawling is done, those pages can appear in Google search results.
Detailed information about the robots.txt file is available in a separate post, which you can reach through the link in this sentence, but we are including a short overview of robots.txt here so that you have the general idea.
What Is Robots.txt file?
Robots.txt is a file that webmasters create to instruct search engine bots how to crawl and index the pages of a website: which pages to index and which pages to ignore. The robots.txt file can decide the indexing fate of your website, so make sure you understand it before actually creating one for your site.
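To give you a concrete picture, here is what a simple robots.txt file might look like on a typical WordPress blog (example.com and the exact rules are only placeholders; your own file will depend on what you want crawled):

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml

Here “User-agent: *” means the rules apply to every bot, “Disallow” blocks a path from being crawled, “Allow” makes an exception inside a blocked path, and the “Sitemap” line tells crawlers where to find your sitemap.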
How To Fix Submitted URL Blocked By Robots.txt Problem?
The Submitted URL Blocked By Robots.txt problem can occur under a few conditions, and the most common is that you have disallowed Googlebot from crawling your website. Check your robots.txt file by going to mysite.com/robots.txt (replace mysite.com with your own domain name), and you will see the robots.txt file of your own website.
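For example, if your robots.txt contains something like the following, every bot, including Googlebot, is blocked from crawling the entire site, and that alone is enough to trigger the “Submitted URL blocked by robots.txt” message (this is just an illustrative snippet, not something copied from your site):

    User-agent: *
    Disallow: /

A single “Disallow: /” under “User-agent: *” blocks everything, while a more specific rule such as “Disallow: /private/” only blocks that path, so look closely at which paths your submitted URLs fall under.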
Check what you have allowed there and what you have disallowed, and you will get a general idea of why there is a coverage issue on your website. Fix any problems you see, and if you have trouble understanding the file, read our post on robots.txt to get a general idea of how it works.
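If you would rather test a specific URL than read the file by eye, you can check it with a few lines of Python using the standard urllib.robotparser module. This is only a quick sketch, with example.com standing in for your own domain and the post URL being a made-up placeholder:

    from urllib.robotparser import RobotFileParser

    # Load your site's robots.txt (replace example.com with your domain)
    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Ask whether Googlebot is allowed to crawl a given URL
    url = "https://example.com/my-latest-post/"
    print(parser.can_fetch("Googlebot", url))  # True means the URL is not blocked

If this prints False for a URL you submitted in your sitemap, that URL is the one your robots.txt rules are blocking.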
Sometimes fixing the robots.txt file is all it takes: you validate the fix, and the search engine starts indexing your site as usual. But sometimes, even after doing so, many webmasters get the same “new coverage issue detected” message in their webmaster console again.
The same thing happened to me even after fixing my robots.txt file, and after some brainstorming I finally found the solution. Sometimes the “Search Engine Visibility” option in your WordPress Reading settings gets checked by accident, which puts a noindex on all your posts and causes the coverage issue to appear in the webmasters console.
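You can confirm this by viewing the HTML source of any page on your site: when “Discourage search engines from indexing this site” is ticked, WordPress adds a robots meta tag to the head of every page. The exact markup varies a little between WordPress versions, but it looks roughly like this:

    <meta name='robots' content='noindex, nofollow' />

If you see a tag like that and you did not add it yourself through an SEO plugin, the Reading settings checkbox is almost certainly the culprit.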
The fix is easy: just untick that box, then re-index your site by submitting the recently updated sitemap of your website, and within a few days all of the pages on your website should start appearing in Google search results.
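The sitemap URL you submit depends on how your site generates it; on WordPress setups it is commonly one of the following (treat these as typical defaults rather than a guarantee):

    https://example.com/sitemap.xml          (many SEO plugins)
    https://example.com/sitemap_index.xml    (Yoast SEO)
    https://example.com/wp-sitemap.xml       (WordPress 5.5+ built-in)

Open the URL in your browser first to make sure it loads, then submit it under the Sitemaps section of the console.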
So this is how you can fix the coverage issues appearing on your website. If you get any further notifications regarding this issue, do share them with us and we will try our best to find the right solution for you.