Cool. To be totally honest, I'd feel a lot better if we could just extend robots.txt, especially given that we finally have a standard for it with IETF RFC 9309: https://datatracker.ietf.org/doc/html/rfc9309
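For illustration only, here's a rough sketch of what such an extension could look like. The `Agent-*` directive names are entirely made up for this example and are not part of RFC 9309 or any spec; the point is that RFC 9309 requires parsers to ignore records they don't recognize, so unknown lines like these wouldn't break existing crawlers:

```
# Standard RFC 9309 records
User-agent: *
Disallow: /private/

# Hypothetical agent-capability records (illustrative only, not standardized)
User-agent: example-agent
Agent-intents: search, checkout
Agent-endpoint: /api/agent
```

Existing robots.txt parsers would simply skip the unrecognized `Agent-*` lines, which is what makes this kind of extension backward-compatible.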
Also might be worth looking at llms.txt (https://github.com/AnswerDotAI/llms-txt), though it's aimed more at being an agent-friendly alternative to sitemap.xml.
You note that the problem is that we "lack a standardized way for websites to declare their agent-interaction capabilities", but I don't see how that squares with this proposal's own lack of standards for intent types and domains.