I see no reason to use robots.txt to hide this valuable documentation completely from the testing, documentation, developer, and user communities. Sometimes the best documentation on features of previous versions is in the updated documentation. Google is often easier to use for finding pages than other methods, and robots.txt would also prevent or complicate "internal" indexing with other tools. I think links to other versions, a site map, and labeling the pages with version numbers would be much better.
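For illustration, a robots.txt rule like the following (the path is hypothetical) is all it would take to hide an entire version's documentation tree from every well-behaved crawler, internal tools included:

    # Hypothetical rule: blocks all compliant crawlers from the 2.0 docs tree
    User-agent: *
    Disallow: /docs/2.0/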
The site map spec is at https://www.google.com/webmasters/tools/docs/en/protocol.html; see also http://en.wikipedia.org/wiki/Site_map
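As a sketch of the alternative, a sitemap entry for a version-labeled page might look like this (the URL and dates are made up, not taken from any real site):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Hypothetical versioned doc page; putting the version number in the
             URL keeps old and new pages distinguishable in search results -->
        <loc>http://example.com/docs/1.5/features.html</loc>
        <lastmod>2008-01-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>

This way the old pages stay indexed and findable, and the version label in the URL (or on the page itself) tells readers which release the documentation describes.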