Google Webmaster Tools now provides a single page where you can configure a number of settings for your website. The changes you can make in the Settings tab include:
1. Geographic Target:- The core benefit of this feature is that you can control how your site appears in country-specific search results. This is useful when your site targets visitors from a particular country. However, you can use this feature only for neutral top-level domains (such as .com or .org), not for country-specific domains, since those are already associated with a country or region. This setting helps Google improve search results for geographic queries.
2. Preferred Domain:- With this feature, you can resolve canonical issues by telling Google whether it should index your site's URLs with or without the www prefix (for example, www.example.com versus example.com).
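Many webmasters complement this setting by making their own links and redirects converge on a single hostname. As a rough illustration of the idea (the hostnames `example.com` and `www.example.com` are placeholders, not anything from the tool itself), normalizing a URL to a preferred host looks like this:

```python
from urllib.parse import urlparse, urlunparse

PREFERRED_HOST = "www.example.com"          # hypothetical preferred domain
ALIASES = {"example.com", "www.example.com"}  # hostnames that mean the same site

def normalize(url):
    """Rewrite any alias hostname to the preferred one; leave other hosts untouched."""
    parts = urlparse(url)
    if parts.netloc in ALIASES:
        parts = parts._replace(netloc=PREFERRED_HOST)
    return urlunparse(parts)

print(normalize("http://example.com/about"))      # rewritten to the www form
print(normalize("http://other.example.org/page")) # left as-is
```

The same convergence is usually enforced server-side with a 301 redirect, so that visitors, links, and crawlers all end up on the one preferred form.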
3. Image Search:- If you opt in to enhanced Image Search, Google Image Labeler will use the labels associated with your images, which in turn improves the indexing and search quality of those images.
4. Crawl rate:- Crawl rate is the speed at which Googlebot makes requests to your site on each visit. For your pages to stay regularly indexed, your site must be thoroughly crawled, and Googlebot does this without creating a major impact on your server's bandwidth. For most webmasters, Google's default crawl setting is fine, but webmasters who experience traffic problems can change Googlebot's crawl rate. There are two options for setting it:
I. Default Crawl rate:- With this option, you let Google decide the crawl rate for your website, which is, of course, what Google recommends.
II. Customized Crawl rate:- If Googlebot's traffic is causing problems for your server, you can drag the pointer on the slider and watch the current requests-per-second and seconds-between-requests values update as you move it.
You should set this rate based on your web server's capabilities. The appropriate rate also varies from one site to another, and over time, depending on several factors. Think carefully before changing it: a slower rate means fewer pages are crawled and fresh pages are picked up less often, while a faster rate puts heavier Googlebot traffic on your server.
The new custom crawl rate remains valid for 90 days, after which it reverts to the default value. Changing the crawl rate here only affects the speed of Googlebot's requests during a crawl; it does not change how frequently Googlebot crawls your site. Google determines the recommended rate based on the number of pages on your site. This setting is available only for root-level sites; it cannot be applied to sites hosted on a large domain such as blogspot.com, as Google assigns special settings to those.