AI companies are violating a basic social contract of the web by ignoring robots.txt
(www.theverge.com)
robots.txt has been an informal standard for 30 years, and it's augmented by sitemap.xml to help index otherwise uncrawlable pages and by Schema.org markup to expose content for the Semantic Web. I'm not saying it should be a law, but pointing to changing norms as a justification is a pretty weak counterargument, man.
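For what it's worth, honoring robots.txt is trivial for a crawler to implement; Python's standard library even ships a parser. A minimal sketch (the rules and crawler names here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt that blocks one AI crawler entirely
# and restricts everyone else from a private section.
rp = RobotFileParser()
rp.parse([
    "User-agent: GPTBot",
    "Disallow: /",
    "",
    "User-agent: *",
    "Disallow: /private/",
    "Sitemap: https://example.com/sitemap.xml",
])

# The blocked crawler may not fetch anything; others may fetch public pages.
print(rp.can_fetch("GPTBot", "https://example.com/article"))       # False
print(rp.can_fetch("SomeOtherBot", "https://example.com/article")) # True

# Sitemap declarations are exposed too (Python 3.8+).
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

A couple of stdlib calls is all it takes, which is kind of the point: ignoring the file is a choice, not a technical limitation.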