IAB Workshop on AI-CONTROL (aicontrolws)

Stephen Downes


Robots.txt (a.k.a. the Robots Exclusion Protocol) is a small file web servers publish to tell web crawlers, such as Google's search indexer, which parts of a site they may visit and which they may not. The Internet Architecture Board (IAB), which works alongside the Internet Engineering Task Force (IETF) in developing the protocols for the internet, is considering the use of robots.txt to manage what crawlers operated by AI companies can do. This page is a set of submissions to that workshop, including contributions from OpenAI, Creative Commons, the BBC, Elsevier, and more. Most of the submissions are fairly short and all of them are interesting reading. Via Ed Summers.
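As a rough illustration (my sketch, not taken from the submissions), the Python snippet below uses the standard library's urllib.robotparser to show how the mechanism works: a site publishes per-user-agent rules, and a well-behaved crawler checks them before fetching. GPTBot is the user-agent OpenAI documents for its crawler; the rules and URLs here are made up.

from urllib import robotparser

# Rules a site might publish at /robots.txt to opt out of OpenAI's crawler.
# GPTBot is the user-agent OpenAI documents; the catch-all section still
# admits ordinary crawlers such as search engines.
rules = [
    "User-agent: GPTBot",
    "Disallow: /",
    "",
    "User-agent: *",
    "Allow: /",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks the rules before fetching any page.
print(parser.can_fetch("GPTBot", "https://example.com/article.html"))     # False
print(parser.can_fetch("Googlebot", "https://example.com/article.html"))  # True

Note that robots.txt is purely advisory: nothing in the protocol forces a crawler to honour it.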



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Nov 21, 2024 07:02 a.m.

Creative Commons License.
