My Fantasy Football League Taught Me to Worry About AI-Generated Journalism's Potential
It might not be as far off as we think.
Hello, all! Parker here.
So far, whenever a company has gotten caught trying to replace its journalists with AI, it’s gone a little something like this: a media outlet types something like “write an article ranking the Star Wars films in chronological order” into a large language model, the model cranks out something based on its original training data, and then bam, you get… something kind of crappy.
But first, real quick: here’s the part of the newsletter where I ask you to consider signing up for the free version if you’re new here and ask existing free subscribers to consider upgrading to the paid version.
That’s exactly what happened when G/O Media tried that last year. It was awful, and I think it led a lot of people to think that maybe generative AI wasn’t quite ready for primetime. And, yeah, if that’s your idea of how to use it, definitely not.
In the months that followed, I took part in a fantasy football league with a few other media people, most of whom covered right-wing media, and all our team names were cheeky nods to that (one team, for instance, was named Vish Burrow, a combination of George Santos staffer Vish Burra’s name and Bengals quarterback Joe Burrow — you get the idea). Naturally, after my many years of having to watch Fox News for work, I named my team the Chicago Baiers, after Bret Baier (awful). I even made a fun little logo. Check it:
Anyway… so I noticed a few weeks in that there were these recap emails that were being sent out. And they were more than just lists of who won, who lost, and so on. No, they were full articles written as though we were an actual league. Here’s a small sample of one of the updates:
It reads like a pretty normal article, right? For the first couple of weeks, I just sort of assumed that it was one member of the league with way too much time on their hands. This, as it turns out, was not the case. These recaps were automatically generated, and it’s easy to see how. CBS Sports had all the information it needed. It knew which players each one of us played, which ones we sat, which ones we cut and traded, and how all of them performed. It had all the data, and from that data it was able to generate articles that put together some fairly compelling storylines. Was it the greatest writing on earth? No, but it did the job.
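The mechanics here don’t require anything exotic. Just to illustrate the idea (this is a toy sketch with made-up numbers, not CBS Sports’ actual system), here’s how structured league data can be turned into a recap-style storyline with nothing fancier than a template and a simple rule:

```python
# A minimal sketch of data-to-prose recap generation.
# The matchup structure and team data below are invented for illustration.

def recap(matchup):
    """Turn a matchup dict (team names, scores, top scorers) into one recap sentence."""
    winner, loser = sorted(matchup["teams"], key=lambda t: -t["score"])
    margin = winner["score"] - loser["score"]
    # A simple rule turns a raw margin into a storyline verb.
    verb = "cruised past" if margin > 20 else "edged out"
    return (
        f"{winner['name']} {verb} {loser['name']}, "
        f"{winner['score']:.1f} to {loser['score']:.1f}, "
        f"behind a {winner['top']['points']:.1f}-point outing "
        f"from {winner['top']['player']}."
    )

week = {
    "teams": [
        {"name": "Chicago Baiers", "score": 112.4,
         "top": {"player": "Justin Jefferson", "points": 31.2}},
        {"name": "Vish Burrow", "score": 87.9,
         "top": {"player": "Joe Burrow", "points": 19.8}},
    ],
}
print(recap(week))
```

Swap the hand-rolled template for an LLM prompt stuffed with the same stats and you get something less stilted, which is presumably roughly what was landing in my inbox each week.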
And maybe that’s what the future of journalism is going to look a bit more like: journalists collecting the relevant information (the who, what, where, when, why, and how), feeding it into a machine, letting the machine churn out an article, and then editing the final product to make sure it’s correct and complies with publication style guides.
My point here isn’t that this is what should happen, just that it may be what will happen. The writing portions of journalism may slowly fade away, but information gathering and reporting will remain crucial, even in a big shift to AI. Any executive making cuts willy-nilly right now who doesn’t see that getting rid of reporters is the last thing they should be doing… well, maybe they’re the ones who should be on the unemployment line, not the journalists.
Here’s where someone will usually go, “Well, technically, it’s not AI, because…” and while I appreciate that, I think we may have to accept that content generated from LLMs is going to be referred to as “AI” even if it’s not technically AI, at least colloquially. Sort of like when there were those things on wheels that people were calling hoverboards a few years back. They did not hover. They were not Marty McFly’s choice of transportation in 2015, etc.