When I used to give talks explaining the news industry to people in PR and business, part of it would describe the process of producing a daily paper. Imagine you work for Mars or Cadbury or KitKat, I’d say, except that each day when you turn up for work all the ingredients from yesterday are gone, and it’s not clear whether you can source them today. But everyone wants something that tastes and looks like what you made yesterday. Though not, crucially, exactly the same. OK, off you go.
That’s the Sisyphean task that confronts journalists, whether on daily papers or TV news stations or websites: people want something that’s similar to what you provided yesterday, but not the same. Everything from the biggest story on the front page (the “splash”) to the crossword to the weather must be different. But it should all feel the same.
This is trickier than it sounds, of course; getting your publication to have a distinctive voice is a delicate dance, established over the course of years during which you discover and strengthen your audience. Britons know that if you read about the same event (say, the introduction of a new bill) in the Daily Mail, the Daily Telegraph and The Guardian, then while the basic facts will be the same across them, the scaffolding around them will differ: the implications drawn out in the first couple of paragraphs, the people chosen to comment on the event, the rest of the context that shores up the piece.
Keeping this “voice” consistent is one of the trickiest things that journalists who are new to a publication have to learn. It becomes part of the institutional memory: when an inexperienced journalist joins and writes their first piece, a more experienced editor will review what they write, and may throw it back (in the old days, literally) and demand that it be rewritten to emphasise whatever the institution’s stance is: perhaps that things the Tories do are wonderful, or alternatively that they’re terrible, and so someone is required to give a suitable quote backing up that view. Once that has been done, the piece will (or should) go to a subeditor (a what? Read this), who will knock it about to a varying extent: probably less at The Guardian than at the Daily Mail, where the “house style” is far more rigorous, for example. From there, it will be published (the writer fervently hopes), though more important pieces may go through another round of editing and revision for length, emphasis or importance: in a newspaper, the more prominent articles will get worked over by many people.
I know, I know: that sounds like a lot of hands on the tiller. But it’s generally necessary in order to keep the ship steering in the same direction. (You might ask: where’s the fact-checker in all this? The answer, at least in the UK, is that the fact-checking job is spread among the writer, the editor and the subeditor. I’ve had plenty of calls from subeditors asking me if I’m sure of this fact or that.)
Dramatic changes to how things are done and what the stance of the publication is disturb the institutional memory; it’s like a blow to the head if an editor comes in and says “we used to hate [x], well, now we love [x].” Readers/listeners/viewers don’t like it either, which is part of why TV stations are incredibly careful about which presenter they hire to replace someone who’s going. Getting “the voice” right is difficult.
So that’s how institutional media work. And that’s just the process of getting the right tone on the content that you do have. Though of course as a journalist, your challenge is—as mentioned at the start—coming up with that content, again and again, day after day, hour after hour. Things have changed with the rise of the web: where it used to be about researching and writing one story a day (were we very lazy? Well, we didn’t have the internet—try finding a story without that), these days newsbloggers might be writing five, ten, maybe more posts per day. The Sisyphean task of rolling the rock of content up the hill, only to come back the next day and find it has rolled down again, has turned into one where the rock rolls back faster than ever.
Into this comes generative AI, which is already being used to create junk content in places where journalists used to create good content. The most recent example is Sports Illustrated, a once-illustrious title which has fallen on hard times and new owners, who decided that getting a chumbox company to produce some content was a good idea. It wasn’t. Maggie Harrison at Futurism (a real journalist! Doing journalism! At a publication!) discovered that a slew of nonexistent “writers” were creating “articles”.
SI’s owners did an “investigation”, and came back to say:
The articles in question were product reviews and were licensed content from an external, third-party company, AdVon Commerce. A number of AdVon's e-commerce articles ran on certain Arena websites. We continually monitor our partners and were in the midst of a review when these allegations were raised. AdVon has assured us that all of the articles in question were written and edited by humans.
Yeah, no. The articles included stuff by “Drew Ortiz”, who had an AI-generated headshot. Harrison had a murderous paragraph:
The AI authors' writing often sounds like it was written by an alien; one Ortiz article, for instance, warns that volleyball "can be a little tricky to get into, especially without an actual ball to practice with."
If SI is seriously trying to tell us that an actual human wrote and edited that, they must think we’re stupid. As Harrison notes, another source told her:
“The content is absolutely AI-generated, no matter how much they say that it's not.”
The Arena Group owns SI. So Harrison, being smart, began looking further into other media properties it owns:
Take TheStreet, a financial publication cofounded by Jim Cramer in 1996 that The Arena Group bought for $16.5 million in 2019. Like at Sports Illustrated, we found authors at TheStreet with highly specific biographies detailing seemingly flesh-and-blood humans with specific areas of expertise — but with profile photos traceable to that same AI face website. And like at Sports Illustrated, these fake writers are periodically wiped from existence and their articles reattributed to new names, with no disclosure about the use of AI.
CNET also has AI-generated content, much of it simply about the current mortgage rate: it exists entirely to get clicks, because the main source of traffic is Google (and, to a lesser extent, other search engines) and, guess what, it works.
This is only going to accelerate. Despite Harrison’s work, Arena isn’t going to stop doing this sort of thing, and more media-owning companies will join it, because it’s cheap and it’s quick and there are enough people (or bots) that are happy to click on it. The efforts right now are poor, but I can remember the early internet, and a lot of that experience was pretty poor too. It didn’t take that many years to improve a lot.
So for the Sisyphean journalists, things are all changing pretty fast. The mountain is blowing up in front of you: do you even need to roll that rock of content up the hill anymore? Is something going to do that for you? Will it do it better? Will it do it cheaper? Unfortunately it’s the last question that will catch the attention of companies like The Arena Group.
That in turn suggests to me that content creation by humans will increasingly be pushed into spaces where the human touch makes a difference. And what are we good at? It isn’t articles saying that volleyball is difficult without a ball. It’s going to be stuff that gets people wound up, and also makes you money.
Such as this:
There’s just one problem with content designed to wind people up: it’s vulnerable to debunking. Such as by the BBC Verify journalist Shayan Sardarizadeh, who responded to that one promptly:
But marvellous as Sardarizadeh’s work on social networks is (he’s out there fighting the good fight against disinformation and misinformation on Twitter and Instagram), you also have to think that he too is going to be overwhelmed by the irresistible rise of AI-generated content. There’s just too much, even now. Look at this thread (which Sardarizadeh retweeted) about a network of 24 AI-generated bots and accounts. They post AI-generated art, and amplify each other’s work, and respond to big influencer accounts—apparently with the highbrow aim of selling T-shirts. How bathetic. And that’s quite apart from all the misinformation and disinformation being thrown around in the Israel-Hamas conflict, whose effects are rippling around the world: I’ve never seen such a polarising set of events on social media.
The problem is that there are now perverse incentives to spread misinformation on social media and on YouTube. Because it’s all about attention, the only zero-sum game in town, in a world where everything else can be reproduced endlessly. If you’re looking at this person’s tweet, you’re not reading that person’s article on a news site. And everyone gets the same amount of time per day to dole out their attention. If you can get paid to generate misinformation (as that tweet by Charlie Kirk above would do; he’s a Verified user on X/Twitter, which means he would be in line for some sort of revenue share) then you’re going to do that, because it’s far more profitable to wind people up than to soothe them—a fact that British tabloids have taken to heart for decades. (Why do you think the Daily Mail is so widely read?)
The transformation in the past couple of decades from the slow Sisyphean effort, rolling one bit of content up the hill each day, to the breakneck attempt to do it multiple times a day now, is about to change. Nobody’s quite sure how AI-generated content on news sites is going to affect journalism. But I think it’s a safe bet that the space reserved for humans will remain in sparking outrage and opposition: opinions more than news. I’ll start to worry when the chatbots can write an op-ed that makes me properly riled.
This is the last instalment here this year. I’ll be back in 2024, perhaps with a different format: if you have any suggestions for how that should look, let me know in the comments.
• You can buy Social Warming in paperback, hardback or ebook via One World Publications, or order it through your friendly local bookstore. Or listen to me read it on Audible.
You could also sign up for The Overspill, a daily list of links with short extracts and brief commentary on things I find interesting in tech, science, medicine, politics and any other topic that takes my fancy.