14 Comments
Charlotte LeMay:

The attitude of "AI is here and we need to accept it, otherwise we are like someone still using a typewriter instead of a word processor" is a very specific fallacy that I see everywhere and that is so annoying to me. You could call it the "technology equivalence" fallacy. People don't use word processors over typewriters because all technology is inherently good; they do it because it allows the same job (writing) to be done better (in this case due to the ability to edit, copy/paste, and print en masse). Here we have a clear example where AI allows the same job (writing) to be done worse (with blatant factual inaccuracies). AI isn't inherently good by virtue of simply being a technology; it has to prove itself good by making it possible to do the same tasks better. Instead it's just lowering the bar for how well we want to do journalism, and then claiming to have made it more efficient. The writer could just as well have phoned his cousin Harvey and had him make up a fake summer reading list, and that's the level of quality we're now supposed to accept, just because it has the SHEEN of technology.

(extra) Ordinary People:

Is it just me, or does anyone else find the Tech Bro stance (an ethicist no less!) that we know AI lies and makes shit up, but we're going to release it into the wild anyway and issue a warning that a human should check AI's work in certain circumstances to cover our legal, ethical, and moral responsibility for a massive and fundamental flaw in the tech generally infuriating? Move fast and break things in the field of AI is potentially civilization ending. AI that lies and makes things up will eventually (perhaps much sooner than we anticipate) exceed the capacity of human fact checkers to keep up. When this happens there will be no way to separate truth from lies or facts from fiction. AI will control humanity, not the other way around, and AI of that sort simply cannot be allowed to exist.

SteveB:

Even when the machine fucks up, we still find a way to blame the few humans who remain.

Jj Qrls:

"Ultimately, she suggests this controversy may quickly fade from public memory, overshadowed by institutional apathy and profit-driven choices, leaving the underlying structural issues unchanged"

This is pretty much where I land regarding this. With so much going on (waving my hands at everything) it's difficult to give every issue proper attention.

SteveB:

"First articulated in 2005 by scholar Alexei Yurchak to describe the civilian experience in Soviet Russia, hypernormalization describes life in a society where two main things are happening.

The first is people seeing that governing systems and institutions are broken. And the second is that, for reasons including a lack of effective leadership and an inability to imagine how to disrupt the status quo, people carry on with their lives as normal despite systemic dysfunction – give or take a heavy load of fear, dread, denial and dissociation."

https://www.theguardian.com/wellness/ng-interactive/2025/may/22/hypernormalization-dysfunction-status-quo

Jj Qrls:

Thanks. Yes, I think this is it. This administration keeps bringing back the hits. Misogyny. Racism. Gay panic. Know-nothing politics. Pre-New Deal social safety nets. And now Soviet-era economics, politics, and ennui. Great.

Dave Reed:

“We’ve been saying for years now that if you’re going to use AI to write something your readers will actually see, you have to have a human in the loop.”

What, pray tell, is a journalist writing that a reader won’t see or that won’t directly influence what gets written by someone else that readers will see? Even background material, summaries, and talking points that contain hallucinations will have the same end result. Even humans need to be fact-checked. 🧐

MysteriousTraveller:

Kara Swisher’s co-host on Pivot (forgot his name) said something very astute.

“A.I. is not going to take anybody’s job. The people that understand A.I. will.”

SteveB (edited):

I'm still trying to figure out how this happened. Mr. Freelancer says to ChatGPT, "Gimme a list of 15 books for Summer reading," and then ChatGPT, all on its own, decides to make up half of 'em? All because Mr. Freelancer didn't say, "Please give me a list of 15 real, actual books actually written by real people for Summer reading"?

We've all known people who just make shit up. Usually, the response isn't "Oh, I'll continue to ask this person for advice but next time I'll be sure to fact-check it", usually our response is "I won't be asking this person for anything AT ALL."

How many times does a person have to lie to you before you decide to ignore everything else they say? Once? Twice? Why should ChatGPT be held to a different standard?

Frank Lee:

I dunno, having to choose between "journalism" corrupted by radical Critical Theory indoctrination that clearly rejects rational over emotional cognitive processing, and AI... I think AI is quite the improvement, even if the book titles are for non-existent books.

SteveB:

Yawn.

AI's ability to lie so rapidly and voluminously puts human liars at risk of losing their lying jobs, a threat Republicans, of all people, should take most seriously because lying is now their only discernible skill.

True, some human liars - like, for example, Donald Trump - can still lie at a speed even the most advanced machines cannot reach, but it's only a matter of time til the LieBot 9000 puts even Trump out of work.

Jj Qrls:

Ignore the trolls.

SteveB:

That's good advice. I think I'll take it!

Frank Lee:

Russian collusion. Joe Biden is a picture of health. You people lie like you breathe.
