Last Time, the Religious Right Told Us Not Only What We Can Teach but How to Teach It
Alfie Kohn,
2022/05/05
I have zero desire to engage in the culture wars, but I also take note of how often they spill over into my own world and my own work. This post from Alfie Kohn sets the stage not only for today's newsletter but for a lot of my own work in general. A form of educational activism has emerged that seeks to limit not only content but also pedagogy, in such a way as to have students learn based on authority rather than through their own skills and inquiry. Every time I comment on the need for diversity, for critical thinking, or for active learning - and contrary, therefore, to things like direct instruction or content-based pedagogy - I am being drawn into this conflict. I see the same things Kohn does - but in my own analysis I keep the words 'religious' and 'right' out of it. These are just facades. To me, it's a matter of money, power and control, and whether we will ever get to live in something like a genuine democracy, or be entrenched forever in an oligarchy.
Web: [Direct Link] [This Post]
Don’t call it crazy: How the media “wraparound” effect cements people’s beliefs
Dan Falk,
Nieman Lab,
2022/05/05
This is a really good interview with Whitney Phillips, who argues that the narratives from media networks - "especially about liberal bias, or about how you can't trust experts" - are the oxygen people have breathed their entire lives; "that's what you bring to Facebook; that's not what Facebook gives you. So once you go to Facebook, with those narratives fully internalized... the algorithm starts feeding you more of what you're already bringing to the table." So "we just can't simplistically say, it's the algorithm that's radicalizing our kids - you don't just take someone who's a run-of-the-mill, everyday person, a centrist or a moderate, and have them watch 10 YouTube videos, and suddenly they're a Nazi." This interview exists, of course, because Phillips has published a book, but you can read The Oxygen of Amplification (three-part PDF - part one, part two, part three) online for free thanks to Craig Newmark Philanthropies and News Integrity Initiative.
Web: [Direct Link] [This Post]
The Horrors of Change
Tim Stahmer,
Assorted Stuff,
2022/05/05
I grind my teeth every time The Next Chapter plays on CBC because it feels to me like an endorsement of, and advertising for, traditional print media over the wealth of other media out there. I feel the same about other media outlets that structure their coverage around what books and newspapers have to say. And this applies to learning, so I agree with the National Council of Teachers of English (NCTE) when they say "Students should examine how digital media and popular culture are completely intermingled with language, literature, and writing," in their recent position statement, Media Education in English Language Arts. "The time has come to decenter book reading and essay writing as the pinnacles of English language arts education." This post by Tim Stahmer is a response to Jay Mathews, who reacts to the NCTE position with horror, though as Stahmer says, the NCTE's core motivation "is absolutely correct, and way overdue."
Web: [Direct Link] [This Post]
OPT: Open Pre-trained Transformer Language Models
Susan Zhang, et al.,
arXiv,
2022/05/05
Meta (a.k.a. Facebook) has released "Open Pre-trained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters." OPT is similar in scale and capacity to GPT-3, which has been producing astonishing results for OpenAI. The model is available for non-commercial use, and Meta is also releasing its code and its training logbook. As the article (30 page PDF) reports, OPT was developed to perform natural language tasks, such as detecting hate speech, bias (across nine categories), and toxicity. "Our goal was to replicate the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data curation and training efficiency." See also: Meta AI, Hacker News.
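As a quick illustration (mine, not the paper's) of what a 'decoder-only pre-trained transformer' looks like in practice, here is a minimal sketch of loading the smallest model in the suite for text generation; it assumes the facebook/opt-125m checkpoint is mirrored on the Hugging Face model hub.

```python
# A minimal sketch, not from the paper: load the smallest OPT checkpoint
# with the Hugging Face transformers library and generate a continuation.
# Assumes the facebook/opt-125m weights are available on the model hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"   # smallest member of the 125M-175B suite
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Open pre-trained transformer language models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```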
Web: [Direct Link] [This Post]
“12 examples of artificial intelligence in everyday life” [Oldman] + 6 other AI-related items
Daniel S. Christian,
Learning Ecosystems,
2022/05/05
This is a grab-bag of an article, as the messy title suggests, but it's also useful in that it makes pretty clear that AI is with us now and is influencing our everyday lives. By 'AI' of course we don't mean intelligent robots walking the earth, but we do mean advanced algorithms that perform functions previously thought impossible for machines: being creative, solving problems, making predictions, prioritizing information. And with this, new skill sets are being developed, and new roles are evolving for designing AI systems and feeding them data.
Web: [Direct Link] [This Post]
In Test Tubes, RNA Molecules Evolve Into a Tiny Ecosystem
Yasemin Saplakoglu,
Quanta Magazine,
2022/05/05
This is as interesting as anything I've read this year. Here's the gist: researchers developed a type of RNA that can replicate itself, put it in a warm environment, and watched as different types of 'host' (which can replicate) and 'parasite' (which depends on another's replication mechanism) variants evolved. These variants competed with each other at first, but after a time evolved into a stable cooperative network. Scientists have focused on studies of competition in evolution for so long that the role of cooperation "has been a bit overlooked," Xavier said. "I think cooperation is also going to start having a major role, especially in origins, because there are so many things that have to come together in the right way."
Web: [Direct Link] [This Post]
Coursera Launches Clips to Accelerate Skills Development through Short Videos and Lessons
Coursera Blog,
2022/05/05
The whole story is basically in the headline. Coursera is introducing "Clips, which currently offer over 10,000 bite-sized videos and lessons from the world's leading companies and universities, and will scale to more than 200,000 videos by the end of the year." Maybe this will turn into something useful (the page says it will roll out to business customers in June). But right now it's not even remotely useful: there's no 'clips' page; to view the sample 'clips' you have to enroll in Coursera (and answer a long series of questions), and then you're simply taken to the course page, where the 'clip' is one of the lesson videos. There's a big market for free-standing short learning content (consider CSPS's Busrides or Clark Aldrich's Short Sims), but Clips as currently available fails utterly to tap into it.
Web: [Direct Link] [This Post]
Twitter’s decentralized, open-source offshoot just released its first code
Adi Robertson,
The Verge,
2022/05/05
Source code for Twitter's decentralized, open-source 'Bluesky' social network has been released and is available on GitHub. It's a "data protocol which we've termed ADX: the Authenticated Data eXperiment. The 'X' stands for 'experiment' while the project lives in an early exploratory state," and developers are warned: "please do not try to build anything with this!" The system is based on the decentralized identifier (DID), which the codebase says is "the canonical, unchanging identifier for a user." One of the recent changes removes a reference to "user IDs at a low cost to users," leaving open the question of the nature and makeup of "the (currently-unnamed) DID Consortium." To me, the real question is how DID will be integrated with ActivityPub, a concept explored here.
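For readers who haven't run into decentralized identifiers before, here is a minimal sketch of the generic W3C DID syntax (did:&lt;method&gt;:&lt;method-specific-id&gt;); the method and identifier in the example are hypothetical and not taken from the ADX codebase, whose own DID method and consortium details remain undecided.

```python
# Illustrative only: the generic W3C DID syntax is did:<method>:<method-specific-id>.
# The method name and identifier below are made up, not drawn from ADX.
def parse_did(did: str) -> dict:
    """Split a DID into its method and method-specific identifier."""
    scheme, method, identifier = did.split(":", 2)
    if scheme != "did":
        raise ValueError(f"not a DID: {did}")
    return {"method": method, "id": identifier}

print(parse_did("did:example:alice123"))  # {'method': 'example', 'id': 'alice123'}
```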
Web: [Direct Link] [This Post]
This newsletter is sent only at the request of subscribers. If you would like to unsubscribe, Click here.
Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.
Copyright 2022 Stephen Downes Contact: stephen@downes.ca
This work is licensed under a Creative Commons License.