Google: Can’t Crawl Your Robots.txt? Then We Stop Crawling Your Site

See on Scoop.it – Strategy SEO

Did you know that if Google cannot crawl your robots.txt file, it will stop crawling your whole site? This doesn’t mean you need to have a robots.txt file; you can simply not have one, and Google will crawl the site normally.
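For context (this example is mine, not from the article): a robots.txt file is just a plain-text file at the root of a site that tells crawlers which paths to stay out of. A minimal, illustrative example, with hypothetical paths:

```
# Illustrative robots.txt - the paths below are hypothetical
User-agent: *
Disallow: /admin/
Disallow: /private/
```

This is why Google is cautious: if it can’t fetch this file at all, it has no way of knowing which paths the site owner wanted blocked.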

Mohammed ALAMI‘s insight:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to avoid fetching pages that the robots.txt file may have blocked. If this isn’t happening frequently, it’s probably a one-off issue you don’t need to worry about. If it happens frequently, or if you’re worried, consider contacting your hosting or service provider to ask whether they had any issues on the date you saw the crawl error.
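To see which case applies to you, a quick check of the HTTP status your robots.txt returns can help. Below is a minimal sketch (my own illustration, not from the article) using only Python’s standard library; the domain is a placeholder you’d replace with your own:

```python
# Minimal sketch: check how a site's robots.txt responds.
# The domain below is a hypothetical placeholder.
import urllib.request
import urllib.error

ROBOTS_URL = "https://example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        # 200: Googlebot can read the rules and crawl accordingly.
        print(f"{resp.status}: robots.txt reachable, crawling proceeds")
except urllib.error.HTTPError as err:
    if err.code == 404:
        # 404: treated as "no robots.txt" - the site is crawled normally.
        print("404: no robots.txt, site crawled without restrictions")
    elif 500 <= err.code < 600:
        # 5xx: Google can't tell what is blocked, so it may pause crawling.
        print(f"{err.code}: server error, Google may stop crawling the site")
    else:
        print(f"{err.code}: unexpected status, check with your host")
except urllib.error.URLError as err:
    # Timeout or DNS failure also leaves Google unable to fetch the rules.
    print(f"Unreachable ({err.reason}): Google may stop crawling the site")
```

The key distinction is between a 404 (harmless: Google crawls as if no rules exist) and a 5xx error or timeout (Google can’t tell what’s blocked, so it errs on the side of not crawling).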

See on www.seroundtable.com


About mohammedalami

Expert SEO consultant (search engine optimization), internet marketing enthusiast, SEM (search engine marketing) and SMM (social media marketing) specialist, graduate of the University of Montreal with an HEC Master’s in e-commerce. I blog at Mozalami about Montreal SEO and on Google+.
This entry was posted in Uncategorized. Bookmark the permalink.