America Needs a Working-Class Media
Alissa Quart,
Columbia Journalism Review,
2025/02/21
The idea that we need a working-class media seems appealing until we consider that people don't really want to be working class. They would like the privileges enjoyed by someone who, like the author, was able to attend schools like Brown and Columbia before entering a life of authorship and academia. To someone from the actual working class, an article like this is best read as a call to deprofessionalize and defund journalists, all in the name of making them more 'authentic'. I see so much writing like this, including in the field of education, coming out of elite universities, praising the virtues of the working class, written by people in no danger of joining it. Truly authentic working-class writing isn't 'authentic'; it's aspirational, not celebrating what is, but searching for something better.
The Costs of AI in Education
Marc Watkins,
Rhetorica,
2025/02/21
So there's some smart thinking in this post and some thinking that is, well, less so. Here's the smart bit: "Universities aren't paying for AI - they're paying for the illusion of control. Institutions are buying into the idea that if they adopt AI at scale, they can manage how students use it, integrate it seamlessly into teaching and learning, and somehow future-proof education." That rings totally right for me. But on the flip side, here is a less successful line of reasoning: "It will take years of trial and error to integrate AI effectively in our disciplines. That's assuming the technology will pause for a time. It won't. Which leaves us in a constant state of trying to adapt. So, why are we investing millions in greater access to tools no one has the bandwidth or resources to learn or integrate?" The assumption is that we (the instructors, the institution) must fully master the tool before it is useful to learners. But of course, that's not true at all.
Suffering is Real. AI Consciousness is Not.
David McNeill, Emily Tucker,
Tech Policy Press,
2025/02/21
"Probabilistic generalizations based on internet content are not steps toward algorithmic moral personhood," write David McNeill and Emily Tucker. To which I respond: why not? Now let's be clear here: I wouldn't think that the generalizations themselves are instances of consciousness. But just is there is something it is 'like' to be a bat - or a worm, or an octopus, or a human - surely then there could be there is something it is 'like' to be a machine. Now we as humans can't really grasp that, though we can't really grasp any of the other examples either. Sure, "an algorithmic or computational process is a kind of abstract machine we use in our thinking, it is not a thinking machine." Sure. But the machine could be a thinking machine - it is as real as you or I.
Automated Contempt
Audrey Watters,
Second Breakfast,
2025/02/21
OK, this much from Audrey Watters is true: "We cannot outsource thinking and compassion to a hierarchy-generating machine and expect the world to be anything other than automated emptiness and exploitation." But that's a different question from the one asking whether we need human teachers to teach. Now I'm not endorsing the idea "that automation will make learning better, faster, cheaper, more scalable, more 'personalized.'" What I think technology does is to enable students to manage their own learning for themselves, so that they can be thinking and compassionate learners, and not mere automatons following a teacher's instructions. But it's interesting. Watters again: "It (Paul Ford's God Gets Involved) tackles the biiiig question: Can AI replace God?" If you think of God as something 'out there', emanating rules and laws and proclamations, then sure, AI could replace God. But if you think of God as something 'in here', in ourselves, then AI doesn't stand a chance.
Grappa-Ling With Mark Carney (1)
Stephen Downes,
Half an Hour,
2025/02/21
I've been reading Mark Carney's book Value(s). It seems a reasonable read given his new place in Canadian political affairs. It's a serious book by a serious thinker, which I must say is a refreshing change in a landscape dominated by demagogues. Will I support everything he says? No. But the thinking here is well worth engaging.
The OAD is looking for a co-editor
Open Access Directory,
2025/02/21
According to the website, "The Open Access Directory (OAD) is a crowdsourced collection of factual lists about open access (OA) to science and scholarship. Maintained by the global OA community, OAD serves as a central hub for discovering, referencing, and updating OA-related information." It is a mature project and has been around since 2008. This post gives me an excuse to link to them (as though I need an excuse). The directory is seeking a volunteer to join as co-editor. The OAD is a wiki environment and also has social media accounts.
Copyright 2025 Stephen Downes Contact: stephen@downes.ca
This work is licensed under a Creative Commons License.