For the website, I'd like to follow best practices for AI crawlers and large language models indexing the site. Specifically, I want to adopt a friendly posture, by which I mean actively encouraging LLMs to scrape and use the content.
I know this is a new and fast-moving area, but is there any equivalent of a robots.txt that would set a permissive set of rules for AI tools?
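For concreteness, I'm imagining something like a permissive robots.txt that names AI crawlers explicitly. This is just a sketch; the user-agent strings below are ones I believe the respective vendors have documented (GPTBot for OpenAI, ClaudeBot for Anthropic, Google-Extended for Google's AI training, CCBot for Common Crawl), but I haven't verified the current list:

```text
# Explicitly welcome AI crawlers (user-agent names as documented
# by each vendor; worth double-checking before relying on them)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /

# Everyone else is welcome too
User-agent: *
Allow: /
```

Is something along these lines the right approach, or is there a newer convention specific to LLMs?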