Appropriate/ing Knowledge and Belief Tools?
Tony Hirst,
OUseful Info,
Jul 09, 2021
This post wanders through some compelling thoughts raised by the use of AI software like GPT-3 to write stories, relate facts, give instructions, program computers, and more. One question is: how reliable are these systems? Tony Hirst comments, "we might see Google as attempting to perform as a knowledge engine, returning facts that are true, and OpenAI as a belief engine, freewheeling a response based on what it's just heard and what it's heard others say before." But more: where does GPT-3 get its information? There's no attribution and no direct way to know whether output has been plagiarized. Hirst also asks, "is it possible to licence code in a way that forbids its inclusion in machine learning/AI training sets," and whether we should be prohibiting the use of such tools by students in coding exercises.