Sunday, August 10, 2014

Google launches new tool to test robots.txt files

Offered within Google Webmaster Tools, this new tool lets you test changes to your robots.txt file and check it for errors.

>> You can check How to improve the quality of images in your posts on Blogger

Google has just launched a new robots.txt testing tool in Webmaster Tools (under the Crawl menu). It lets you view the contents of the file, but also modify it and verify that it contains no errors.
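For reference, the file in question is a plain-text list of per-crawler rules served at the root of the site. A minimal sketch (the paths and the choice of crawler here are purely illustrative) might look like:

```
# Keep every crawler out of /private/,
# and keep Googlebot-News out of the whole site
User-agent: *
Disallow: /private/

User-agent: Googlebot-News
Disallow: /
```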
URLs of the site can be tested to see whether they can be crawled by Google's various spiders (the "classic" Googlebot, but also its variants, such as Googlebot-News or Googlebot-Mobile, for example). When a page is not crawlable because of the robots.txt file, the tool highlights the directive responsible.
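The per-crawler check the tool performs can be approximated offline with Python's standard-library robots.txt parser; the rules and URLs below are made up for illustration:

```python
import urllib.robotparser

# Hypothetical rules: every crawler is kept out of /private/,
# and Googlebot-News is kept out of the whole site.
rules = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot-News
Disallow: /
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# The "classic" Googlebot has no group of its own here,
# so it falls back to the wildcard (*) group.
print(parser.can_fetch("Googlebot", "/index.html"))       # True
print(parser.can_fetch("Googlebot", "/private/page"))     # False

# Googlebot-News matches its own, stricter group.
print(parser.can_fetch("Googlebot-News", "/index.html"))  # False
```

Note that Google's tool checks your live file server-side; this sketch only reproduces the allow/disallow decision locally.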
File changes can be tested from within the interface, and any errors will be flagged (a "syntax not understood" warning, for example). Once a new version of the file is uploaded to the server, the previous versions are archived in Webmaster Tools.
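As an illustration of the kind of mistake such a syntax check catches (the misspelling below is deliberate):

```
User-agent: *
Dissalow: /private/   # misspelled "Disallow" -- a "syntax not understood" case
```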

>> If you need good SEO result then you can Optimize Images For Better SEO

In announcing the new feature, Google Webmaster Trends Analyst John Mueller invited webmasters and SEOs to pair it with another tool Google recently launched, which lets you see exactly what Google's spiders can and cannot crawl.