During the book fair in Frankfurt, publishers announced a new standard which is supposed to automatically communicate permissions information. As I understand it, this means that search engines and other bots would automatically see whether the content on a given site may be indexed or whether the publisher prohibits indexing. While I do understand the publishers’ point of view, I nevertheless think that the ACAP protocol is a bunch of crap – clinging to a concept of yesteryear, publishing 1.0. The creators of the standard (ACAP stands for Automated Content Access Protocol) haven’t paid attention to the goings-on of recent times. Call it Web 2.0, call it user/consumer created content, call it GestureBank or what not, but information is growing up and becoming accessible to more and more people. What will prevent “live” content from being seen? Nothing. If you don’t want your information to be seen, password-protect it. And if it is “live” but not indexed, why publish it at all? A newspaper article clearly connected this protocol to the Google Book project, but of course it doesn’t say that on the web page…
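
For context, crawlers can already be told what they may and may not index through the long-standing robots.txt convention; as far as I can tell, ACAP is essentially pitched as a richer, publisher-controlled layer on top of that same idea. Here is a minimal Python sketch (standard library only, with example.com URLs as placeholders) of how a bot checks those permissions today:

    # Minimal sketch: how a crawler checks indexing permissions via robots.txt.
    # The example.com URLs are placeholders, not any particular publisher's site.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    # Ask whether a generic bot ("*") may crawl a given article URL
    if rp.can_fetch("*", "https://example.com/news/some-article.html"):
        print("allowed to crawl and index")
    else:
        print("publisher says: keep out")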