The tool is configured through a robots.txt file that tells the crawler which data may be used and which is off-limits. Blocking the crawler does not remove a site from search results, however: such sites will continue to be indexed by the Google search engine.
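In practice this works through a dedicated user-agent token, Google-Extended, that Google introduced for AI training crawls; a minimal robots.txt sketch that opts a whole site out while leaving normal search crawling untouched might look like this:

```text
# Disallow use of site content for training Google's AI models
User-agent: Google-Extended
Disallow: /

# Regular search indexing by Googlebot remains allowed
User-agent: Googlebot
Allow: /
```

Because Google-Extended is a separate token from Googlebot, blocking it affects only AI training data collection, not the site's visibility in Google Search.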
Site owners will be able to control whether their data is used to train Google's Bard and Vertex AI models. The company plans to expand its tools for controlling access to data.
I am a professional journalist and content creator with extensive experience writing for news websites. I currently work as an author at Gadget Onus, where I specialize in covering hot news topics. My written pieces have been published on some of the biggest media outlets around the world, including The Guardian and BBC News.