This is Audrey Watters speaking to a class of students about the ed tech industry, and especially about the use of algorithms to surveil students and predict outcomes. She covers some of the more recent events, including Ofqual's ill-conceived notion that student grades should be adjusted based on the history of the school they attend. But it makes me think: isn't this what we do anyway? The algorithms magnify and weaponize the bias, but the bias is there nonetheless. Even without AI, elite institutions somehow manage to admit more students from private upper-class schools, no matter what their grades. Even without AI, exam proctors are going to regard darker-skinned students with more suspicion. We need to look at AI in a way Watters unfortunately doesn't usually: as a way to expose and redress bias and prejudice, rather than merely magnify it. Because just going back to the way things were is not an option for so many people.