
OLDaily

Welcome to Online Learning Daily, your best source for news and commentary about learning technology, new media, and related topics.
100% human-authored

Perceptual Learning
Kevin Connolly, Stanford Encyclopedia of Philosophy, 2024/09/20



Perceptual learning is defined as "any relatively permanent and consistent change in the perception of a stimulus array, following practice or experience with this array." For example, if you're a beginner, all coffee tastes like coffee, but if you try a large number of different coffees, you learn to spot the difference between African (light, fruity) and South American (woody) coffees. This is an updated encyclopedia article on the topic of perceptual learning, and well worth reading if you want to understand how experience and sensation are closely related. Image: Huang Changbing.

Web: [Direct Link] [This Post][Share]


This App May Indicate Something Is Deeply Wrong with Us
Justin Weinberg, Daily Nous, 2024/09/20



You can only sign up on an Apple iPhone, so I haven't tried it yet. But it's intriguing: it's like Twitter, except that as soon as you sign up you have a million followers - and they're all AI. Justin Weinberg is not impressed. "Really it's just sad. If this app becomes successful, what does that tell us? That we're not good at being there for other persons, such that many of them feel they have to turn to this? That we don't care if there are other persons there for us, since we can have substitutes like this? Both?" I get that we're social beings and that this is the essence of human existence and all that, but there's a lot about society that doesn't work for a lot of people, and if something like this fills the gap for them, it sounds good to me.

Web: [Direct Link] [This Post][Share]


How can we make the best possible use of large language models for a smarter and more inclusive society?
Max Planck Institute, 2024/09/20



Short article describing and referencing an article by 28 authors at major scientific institutions. It says, in part, "If LLMs are to support rather than undermine collective intelligence, the technical details of the models must be disclosed, and monitoring mechanisms must be implemented." The actual article is published behind a paywall in Nature, which is a classic case of not understanding the point you've just made in your paper.

Web: [Direct Link] [This Post][Share]


Ms Rachel - Toddler Learning Videos
YouTube, 2024/09/20



This YouTube channel was mentioned on CTV today as one of the most popular sites out there for parents to teach young children. "Ms Rachel uses techniques recommended by speech therapists and early childhood experts to help children learn important milestones and preschool skills!" What I notice is that the videos are long - an hour or even two hours long! They're a mixture of basic language learning and popular children's songs. I started playing them this morning and couldn't turn them off!

Web: [Direct Link] [This Post][Share]


Releasing Common Corpus: the largest public domain dataset for training LLMs
Pierre-Carl Langlais, Hugging Face, 2024/09/20



One thing I love about Mastodon is that I get to sit in on conversations like this one between Clint Lalonde and Alan Levine on open data sets used to train large language models. It's prompted by Lalonde asking whether there are other open data sets like Common Corpus (not to be confused with Common Crawl). This leads to an article about The Pile, an 885GB collection of documents aggregating 22 datasets including Wikipedia, arXiv, and more. There's Semantic Scholar, which appears to be based on scientific literature, but also includes a vague reference to 'web data'. There's also the Open Language Model (OLMo).
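
Out of curiosity, here's a minimal sketch (mine, not from the thread) of how one of these open corpora could be inspected with the Hugging Face datasets library. The dataset identifier "PleIAs/common_corpus" and the "text" field name are assumptions on my part, so check the hub listing before running it.

    from datasets import load_dataset

    # Stream the corpus instead of downloading it in full - these collections are huge.
    # "PleIAs/common_corpus" is an assumed repository name; verify it on the Hugging Face hub.
    corpus = load_dataset("PleIAs/common_corpus", split="train", streaming=True)

    # Peek at the first few records to see what fields a training document carries.
    for i, record in enumerate(corpus):
        print(record.get("text", "")[:200])  # the "text" field name is assumed
        if i >= 2:
            break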

Web: [Direct Link] [This Post][Share]


We publish six to eight or so short posts every weekday linking to the best, most interesting and most important pieces of content in the field. Read more about what we cover. We also list papers and articles by Stephen Downes and his presentations from around the world.

There are many ways to read OLDaily; pick whatever works best for you.

This newsletter is sent only at the request of subscribers. If you would like to unsubscribe, Click here.

Know a friend who might enjoy this newsletter? Feel free to forward OLDaily to your colleagues. If you received this issue from a friend and would like a free subscription of your own, you can join our mailing list. Click here to subscribe.

Copyright 2024 Stephen Downes Contact: stephen@downes.ca

This work is licensed under a Creative Commons License.