For more controlled access to Googlebot as far as your URLs are concerned, Google has introduced changes in its Webmaster Tools for handling URLs with parameters. You can now decide which pages Google should crawl and index and which it should not. This is also a good way to hide duplicate content, if your site has any. A word of caution, though: be careful when preventing URLs from being crawled, as those pages will be completely hidden from search engines.
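Alongside the parameter settings in Webmaster Tools, a robots.txt rule is a common way to keep parameterized or duplicate URLs from being crawled. The paths and parameter name below are hypothetical examples, not taken from the article:

```
# Hypothetical example: stop Googlebot from crawling URLs
# that carry a "sessionid" query parameter
User-agent: Googlebot
Disallow: /*?sessionid=

# Hypothetical example: block a duplicate printer-friendly
# copy of pages living under /print/
Disallow: /print/
```

As the caution above notes, anything disallowed this way is not crawled at all, so use such rules only for pages you genuinely never want search engines to see.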

The Google chief also criticized using multiple domains to sell the same product, as well as syndicating content and RSS feeds and then duplicating them on other domains, which results in users' ire. He clarified that anything that harms the user experience is something Google does not like either.