little more than a reading list (AI)
Apr. 7th, 2026 09:21 am

Here, two papers and two articles, all about AI, all I think better than most:
Researchers at the Wharton School at the University of Pennsylvania are proposing an extended model of cognition as a way of measuring and studying “cognitive surrender,” the regular handoff of cognition to LLMs. It’s long, but if you’ve got the patience, it’s here. I didn’t see much in the way of surprises, but it does provide an interesting framework for analysis.
One takeaway they don’t emphasise is that, once again, the humans-will-intervene-to-catch-wrong-LLM-responses model is shit. It’s not emphasised because that’s not the point of their paper – they’re demonstrating their model as an explanatory/conceptual framework – but it’s still there.
Scientific American writes about a study showing that AI outputs tend to sway users’ beliefs, even when users are told about biases built into the model. As many – including me – have said many times before, this is absolutely part of the point of AI, particularly but not just for people like Elon Musk. But it’s good to see numbers on it.
Combine study two with study one and you see why the tech broligarchs are so eager to turn thinking into something they sell you. They don’t want to make your life easier, they want to make you pay to think like them. Or, as Karl Bode put it a few months ago, “The problem with AI isn’t going to be Skynet. It’s going to be amoral extraction class assholes applying half-cooked automation at scale onto deeply broken sectors in exploitative ways in a country too corrupt to have functioning regulators.”
Finally, give a look at the narrowly-focused (to coding) but still worthwhile essay, “I used AI. It worked. I hated it.” It strikes me that much of what he hated about it is what people who actually want to be managers like, which explains so very, very much, doesn’t it?
Posted via Solarbird{y|z|yz}, Collected.
no subject
Date: 2026-04-07 06:38 pm (UTC)

Well, it's good to have a few years' warning of an existential threat to my profession (elementary education).
(CW: rant impending, mention of abusive scholastic systems and work environments, contemplating doom)
I've already experienced being the only person in the room who thinks it's a problem if the people teaching others how to think get in the habit of offloading their own thinking. That's going to intensify.
I don't know what my new coworkers will know how to do for themselves.
The industry-wide occurrence of a combined brain drain and widespread lack of useful professional growth, occurring as a secondary effect of pressure that forces more and more people to rely upon mediocre automation, which produces typically worse-but-technically-acceptable and faster/cheaper outcomes along with off-the-books negative externalities...is a very familiar pattern, and not dependent on LLMs or even computers specifically.
With the whole genAI issue, it seems likely that the 'why not let the chatbot write it' problem is going to keep hitting, to some extent, every profession that has documentation / writing requirements.
We KNOW that sticking kids in front of screens indefinitely doesn't work, but some of that will happen. More relevantly to me, teachers will be pressured to use genAI rather than *plan lessons*, meaning...they will not be teaching information they know, but delivering a slurry of information and misinformation that bypassed the little bit of critical analysis they had time for, following precise scripts that just don't work for nine out of ten of their students, at least not without tweaks they won't know how to make.

It's egregious, yes, but we've already seen similar messes happen with three-cueing reading, the gutting of social studies, and what I like to call 'guess and check' math curricula. There's likely to be a squeeze to try to insist everyone use a specific tool or meet an output metric reliant on that tool, followed by a crunch as the absence of actual experts in the pipeline becomes apparent down the line, much like the shortages we've seen in healthcare when nurses and doctors aren't being trained.

The combination of forcing out experienced experts who won't or can't adapt, choking the pipeline for new people, and dulling rather than sharpening current workers means there will also be a lot of 'trained' people who just don't know how to carry out core job responsibilities. Such as observing, probing, understanding, and adjusting for what an individual student did or did not follow in an explanation of how to do something.
The thing is, I don't know if public education in the US will come back from that crisis as an institution. It's already borderline insupportable as a job for many. Meanwhile the outcomes are dismal even considered as one schooling option among many, and very VERY far from being a mainstay of actually providing, or even approaching, universal public education.

I predict that in at least some school systems there will be a split between edutainment software implementors called teachers and childcare workers called something else, neither of whom has the professional authority, or access to the time and materials to plan and prepare, to actually teach their students. Which is of course to the massive detriment of children. Ticky-box training mandates are already preventing people who want to learn from doing useful professional development, and high-stakes standardized testing with no useful feedback is already a major timesuck and source of stress for students. The system is pre-broken in ways that make it extra vulnerable to quick-fix snake oil salespeople.
Meanwhile, the core work of teaching kids--of teaching any student who doesn't yet have the mindset and independent study skills to independently *seek out* information and attend to it and retain something useful, e.g. getting something other than just sitting out of sitting through a lecture, video, or presentation--is going to continue to be something that is absolutely impossible to automate with anything close to current technology. And policymakers are going to ignore that inconvenient fact. Yes, educational software is a useful tool with adult supervision, it allows for gamification and individualization and quick information retrieval and instant feedback and all kinds of goodies ... but the edutainment junkware outnumbers and out-advertises it.
Precisely no amount of screentime, from 0% to 100%, is going to fix the issue that you can't connect young students into a society without them experiencing social interaction. Printing off slop worksheets won't help, either.
A teacher who doesn't know how to explain their subject cannot teach it. At *best* they can neutrally present material that their students might or might not be able to cobble into a semi-working model of the content. So a widespread appearance of new barriers to professional learning -- brand-new, proliferating moments of being forced to pick *one* of *two* needs: checking off a job task now *or* building expertise for ongoing use, like the third article describes happening for programmers -- could be lethal to the profession. Making teachers worse at teaching is bad.
Interacting with a group of peers and a community of supportive adults while learning how to learn is the rationale of having schools at all. It works out badly for a lot of people -- me included -- who don't fit the mainstream, but sheer isolation is typically worse. The outcomes are *very* correlated to both how much one-on-one attention from teachers qualified in a given subject students get AND whether their overall basic needs are being met. Most people don't learn well without some social mirroring and personal feedback. Infants and young children *cannot* develop healthy brains and bodies without individual caregivers and experiential learning opportunities. Without providing that at school, we might as well lock kids in closets with a phone to keep them company. < sarcasm > That's gonna work GREAT. < / sarcasm >
If people send little kids to daycare for socialization and then stick them on a phone or laptop to 'learn' for a couple of hours a day, trying to provide both content and interpersonal education but separately, the vast majority of kids will not learn to read and write, and neither will they develop effective social skills for resolving conflicts fairly or working towards shared goals.
So in reality, not on paper, it's going to be just as important as ever to have a class community to learn how to survive being in a group of humans. For learning other things -- crucial things, literacy and numeracy -- it will still be necessary to have teachers meet their students where they are in order to effectively introduce them to new skills. But for the people working in education, it'll get harder and harder to actually achieve either...and most teachers are already functionally working two jobs, six to eight hours of mass babysitting with some permissible teaching in there if they are lucky, plus roughly three to eight hours of prep and other job requirements per work day. Meaning the outcomes will get worse. And parents and kids who can will look for other options, and workers who can will look for different jobs, and it all spirals. I don't know where the bottom of that downward spiral even is.
I don't think the people actively driving these pressures consider lower educational attainment, worse health, fewer job prospects, etc. to be UNdesirable outcomes. I think they want workers that are ever more exploitable, at any cost. I think that setting 'higher standards' resulting in worse teaching and learning is a win-win for them: they can say they are insisting on improving test scores while getting the result of mass illiteracy.
So for me personally it's time for one of my periodic reassessments of whether I am still in a job that allows me to fulfill my vocation: teaching. If the answer is 'maybe not,' I'm jobhunting, though my efficiency there is...not good. When the answer becomes 'no,' I might need to retrain as, say, a Montessori assistant.
I believe in the *idea* of public education. But I can't accomplish much *for* that idea, if my job isn't sustainable for me as a day-to-day or as a career path. I used to think there would always be jobs in education, but my estimate of the willingness of people to fool themselves about policies that hurt kids has unfortunately skyrocketed.
I'm thinking of your song "Something's Coming." Every time I think I finally fully appreciate its meaning...another monster shows up.
I refuse to stop teaching. But if that means I'm working a different job and volunteering at the local library or community garden, so be it.
Of course it's a moot point if we end up nuked first, right?
no subject
Date: 2026-04-08 05:44 am (UTC)

yeah
It was all always about this.
Here's something I never say: as long as I can remember - which doesn't go back the way it does for normies, just saying... I've had this combination of observation and knowledge, this vision, almost, of the Baby Boom generation...
Okay let me take an aside to make the usual but sincere expression of how I am not talking about every Baby Boomer, there are heroes of the resistance in that group too, I'm talking about the cohort as a whole and their socio-political gestalt, okay?
We good? Good.
Anyway.
As long as I can remember, I've had this... conception. This vision, almost. Of the Baby Boomers eating the world, leaving nothing but crumbs, and then rolling up the carpet behind them. It was a vision that I watched happen, including the little bit where they unrolled it again for their kids through school, before rolling it right back up again.
How that was going to play out later became more and more clear to me, and I started writing about it on Usenet in 1990, to a fair amount of dismay. I had to rein it in a lot to be taken seriously at all - cushion it, dampen it, condition it - but even given that, I've had people from then find me now going, "You were writing about now. You were writing about NOW on USENET in 1990. HOW?!"
If the army I helped build hadn't fallen for all the propaganda and hadn't decided to sit '24 out, I was going to write...
...I had an essay more than half-written about how the Boomers were told over and over again as kids that the world was going to go up in nuclear fire, but they could save it - but only if they could just find the right leader.
(seriously, go back and look, it's everywhere, I mean, everywhere. It's in fucking Beach Blanket Bingo, ffs, verbatim)
...and how that turned out once they finally found their fucking guy, and how it wasn't entirely their fault, and how they were kinda set up for it by their WWII-scarred parents and elders (and also peak lead poisoning) but... it's too kind. I don't know that I can ever write it now.
Not that it matters.
Anyway, that's what "Something's Coming" was about.
Similarly, the whole "Crime and the Forces of Evil" shtick was about superheroes who refuse to abandon their ideas of good and evil when the world around them inverts, swapping those ideals. They stick to their guns - or rather, their principles - which means that now, they're villains. Or rather, supervillains.
Nobody got it. xD
On a side note, I'm surprised / charmed / pleased, and ... thankful, I suppose? that someone actually remembers my music. I never felt like I really reached many people, not in my own band anyway. So thanks for that.
no subject
Date: 2026-04-08 03:59 am (UTC)

Your take reminds me of another software engineer, who described* going to a conference accidentally dressed like a manager/executive, and getting the pitches intended for managers. He realized ALL the software sales talks were pitching "the reproducible delivery of work that was good enough to accomplish business objectives, without having to rely on all those pesky technicians...We've got a drag-and-drop interface, so that you can now hire replaceable cogs to do the dragging-and-dropping at a consistent rate."
And that process in “I used AI. It worked. I hated it.” of review for 5 minutes, click to approve, review the next chunk... is both how managers relate to code and what they want senior programmers to do once they assign all the junior programmers to prompting 'AI'.
*https://ludic.mataroa.blog/blog/the-failed-commodification-of-technical-work/