"A well-designed AI can be helpful in assisting humans to make complex decisions about exam grades," says Rose Luckin, "but it must be well-designed." The recently revised AI-predicted A-Level grades in Britain are a good example of where such an algorithm can go wrong. This is just one respect in which AI can be a double-edged tool in education. Another is the disadvantage it creates for students who do not have the computer power they need to run the algorithms. And with a lack of diversity in the AI workforce there is a continuing of bias and misrepresentative training data. Though as Jon Dron says, "having such issues is far preferable to not knowing how we are affected, and not being able to fix it." Luckin sees AI playing a role in content selection in 2030; "the intelligent backbone would enable the best learning resources to be made available to all learners in the most appropriate way for each learner." I think, though, we should focus more on supporting AI as a tool to be used by learners rather than teachers and institutions.