Robots.txt Error: Blocked by robots.txt
Ever encountered the dreaded “Blocked by robots.txt” error in Google Search Console? It means your robots.txt file is preventing Google from crawling important pages on your website, and pages Google can’t crawl can’t be indexed or ranked properly!
Don’t worry, here’s how to fix it:
Diagnose the Problem:
▪ Use Google Search Console’s URL Inspection tool to pinpoint the affected page.
▪ Check your robots.txt file for any accidental blocks on the URL or its directory.
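For example, a single overly broad rule can take a whole section of your site out of crawling. The paths below are hypothetical, just to show what an accidental block looks like:

```
User-agent: *
# This was meant to hide draft pages, but it blocks EVERY URL under /blog/
Disallow: /blog/
```

If the flagged URL (say, /blog/my-post/) sits under a disallowed path like this, that rule is your culprit.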
Fix the Robots.txt:
▪ If a specific directive is blocking the page, remove or adjust it to allow crawling.
▪ Remember, Google no longer obeys noindex in robots.txt. Use a robots meta tag for that.
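Here’s a minimal sketch of a fix for the hypothetical rule above, narrowing the block to only the folder that really needs to stay hidden:

```
User-agent: *
# Block only the drafts folder instead of all of /blog/
Disallow: /blog/drafts/
```

And if your goal is to keep a page out of Google’s index (not just uncrawled), add `<meta name="robots" content="noindex">` to that page’s `<head>` instead, and make sure robots.txt does NOT block the page, because Google has to crawl it to see the tag.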
Test and Resubmit:
▪ Use a robots.txt validator to ensure your changes are correct.
▪ Resubmit the fixed URL in Google Search Console (URL Inspection → “Request Indexing”) so Google recrawls it.
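If you’d like a quick programmatic sanity check before resubmitting, Python’s built-in urllib.robotparser can tell you whether a URL is allowed (it follows the original robots.txt rules, so treat it as a rough check rather than an exact mirror of Googlebot). The site and page URLs below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URLs -- swap in your own domain and the page that was flagged.
ROBOTS_URL = "https://www.example.com/robots.txt"
PAGE_URL = "https://www.example.com/blog/my-post/"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

# Googlebot is the crawler that matters for Search Console reports.
if parser.can_fetch("Googlebot", PAGE_URL):
    print("Allowed: Googlebot can crawl this URL.")
else:
    print("Blocked: a robots.txt rule still disallows this URL.")
```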
Inbox me if you need help solving a robots.txt error and getting your new website indexed by Google!