Do you find that your web pages aren't being indexed promptly by GoogleBot? The reason isn't always what you might think.
Looking at your site, I do see that we'd like to crawl more from the server, but we're holding back because we think the server might not be able to handle the load. This is the reason why the Fetch as Google requests aren't making it through. In particular, we're seeing a fairly high response time for URLs from the server, which often signals that the server is pretty busy even without us crawling.

On the one hand, this is something you could work to resolve by seeing if you could speed up your site's hosting (perhaps optimizing your server-side code or moving to a slightly faster server). On the other hand, your website currently isn't that large and isn't constantly changing, so it's not really necessary to crawl thousands of URLs from it every day. From that point of view, while we'd like to crawl more, I don't see it as a critical problem for your site in particular. If users are happy with the speed of your site, then we can definitely live with it too :).
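As a rough way to check the response times described above, you could time a plain GET request to your own pages. The sketch below is a minimal illustration, not Google's measurement method: the local test server is only a stand-in so the example runs on its own; in practice you would point `time_request` at your real URLs and run it several times to get a feel for typical latency.

```python
import time
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def time_request(url):
    """Return (status_code, elapsed_seconds) for a single GET request."""
    start = time.perf_counter()
    with urlopen(url) as resp:
        resp.read()                      # include body transfer in the timing
        status = resp.status
    return status, time.perf_counter() - start

# Stand-in server so the sketch is self-contained; replace with your site.
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html>ok</html>")

    def log_message(self, *args):        # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, elapsed = time_request(f"http://127.0.0.1:{server.server_port}/")
print(f"status={status} elapsed={elapsed * 1000:.1f} ms")
server.shutdown()
```

Consistently slow responses (hundreds of milliseconds of server time per page) are the kind of signal that can make GoogleBot throttle its crawling, which is exactly the situation described in the reply above.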
Source: www.arobasenet.com