Kids are taught to look things up, right? That basic recommendation assumed that the available sources would contain facts. But now there are LLMs, which are "nonsense machines." So "look it up" is bad advice, or at least incomplete advice, given the widespread availability of a tool that lies to you. Otherwise kids will use the nonsense lying tool and believe they've learned something from it.
"They 'looked it up'! they got it from somewhere! ... it's something they think they LEARNED"
Well this is grim
— Pavel (@spavel.bsky.social) July 6, 2024 at 5:15 PM
Schools were unprepared for this:
"at least in the early days, a total crapshoot: Some states claimed that they had not thought about ChatGPT at all, while other state departments of education brought in consulting firms to give trainings to teachers and principals about how to use ChatGPT in the classroom. Some of the trainings were given by explicitly pro-AI organizations and authors, and organizations backed by tech companies. The documents, taken in their totality, show that American public schools were wildly unprepared for students’ widespread adoption of ChatGPT, which has since become one of the biggest struggles in American education."
— American Schools Were Deeply Unprepared for ChatGPT, Public Records Show, Jason Koebler, 404 Media, May 15, 2025
"Everyone participating in generative AI is polluting the data supply for everyone." AI is already eating itself: www.theregister.com/2025/06/15/a...
— Doc Sarah Lonsdale (@sarahjlonsdale.bsky.social) June 22, 2025 at 2:23 AM