The personalization of news is a permanent feature of our information ecosystem, one that comes with some benefits and many challenges. News consumers are often overwhelmed by the amount of news they’re exposed to on a daily basis, and personalization helps by filtering that content. News outlets have learned that catering to individual interests improves the overall experience for most users and can make them more loyal, regular news consumers.
Personalization can be broad and determined by location or demographic information, or it can be individualized to a degree never seen before, thanks to artificial intelligence and algorithms. The company Taboola just announced a new initiative called “Homepage for You,” which will allow news outlets to offer a customized homepage for each user.
The news company McClatchy, which publishes 29 newspapers in 14 states, has signed up to give its readers this ultra-personalized content. “As more people rely on our network of properties for news, we want to ensure all readers get the best experience relevant for them,” said Kristin Roberts, McClatchy’s senior vice president, in a story on Mediaweek.com. “Coupling the vast expertise of our newsroom with a data-driven approach to personalizing our homepages with Taboola’s advanced A.I. technology is going to be a game-changer for us.”
Here’s the problem: A strategy that emphasizes personal relevance over editorial judgment by definition eliminates exposure to large swaths of news. The algorithms that allow news outlets to curate information for each user can also lead to a filter bubble in which people only see information that conforms to their pre-existing beliefs and interests. Filter bubbles make people vulnerable to misinformation and conspiracy theories and can lead to polarization and extreme views.
This is already happening on social media networks like TikTok. A Wall Street Journal investigation that created over 100 automated fake accounts revealed that TikTok tracks every second users spend on its platform. In fact, the critical piece of information it monitors is how long users linger over a video. Once the algorithm has this data, it starts feeding users content it thinks will engage them, eventually “driving users deep into rabbit holes that are hard to escape,” according to the reporters on this story. Case in point: Wall Street Journal bots programmed to show a general interest in politics ended up seeing videos about election conspiracies and QAnon in no time at all.
In this video report, data scientist Guillaume Chaslot says TikTok’s algorithm is looking for content that will generate engagement. “The algorithm is pushing people to more and more extreme content, so it can push them to more and more watch time,” said Chaslot.
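The reinforcement loop Chaslot describes can be sketched in a few lines of heavily simplified Python: the recommender tallies watch time per topic and keeps serving whichever topic has accumulated the most, which is exactly the dynamic that drives users into rabbit holes. This is not TikTok’s actual code; the topic names and scoring rule are invented for illustration.

```python
from collections import defaultdict
import random

def record_watch(scores, topic, seconds):
    """Track the key signal: how long the user lingered on a video."""
    scores[topic] += seconds

def next_topic(scores, catalog):
    """Serve more of whatever topic has earned the most watch time;
    brand-new users get a random topic so the system can probe them."""
    if not scores:
        return random.choice(catalog)
    return max(scores, key=scores.get)

# Simulate a user who skips most videos but lingers on one topic.
scores = defaultdict(float)
for topic, seconds in [("cooking", 2), ("politics", 45), ("sports", 3)]:
    record_watch(scores, topic, seconds)

print(next_topic(scores, ["cooking", "politics", "sports"]))  # "politics"
```

Because the winning topic keeps getting served, it keeps accumulating watch time, so the loop narrows the feed further on every pass.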
Since the Wall Street Journal published this investigation, TikTok says it is testing ways to change its algorithm so users will not be shown too much content on the same topic.
The question is: do you want to cede control of your news diet to these algorithms?
If the answer is no, you can counter personalization by taking a more active role in choosing the news you consume.
First, be aware that your preferred news outlets are already curating some news for you. In many cases, they know what you will click on before you even get to their home page.
Here’s the thing: you can trick the algorithm into giving you a broader selection of news by visiting multiple news sites and engaging with a wider range of stories, especially those that do not conform to your pre-existing worldview. By feeding the algorithm these new data points, you will eventually be served a more balanced news diet.
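A toy model shows why this works: deliberately clicking on stories outside your usual lane adds counterweight signals that dilute the dominant topic’s share of your profile. The topics and numbers here are invented for illustration, not drawn from any real recommender.

```python
from collections import Counter

# Hypothetical engagement profile: one topic dominates the signals.
signals = Counter({"politics": 40, "sports": 2})
print(f"before: politics is {signals['politics'] / sum(signals.values()):.0%} of signals")  # 95%

# Deliberately engage with a wider range of stories.
for topic in ["science", "arts", "business", "science", "arts"]:
    signals[topic] += 10

print(f"after:  politics is {signals['politics'] / sum(signals.values()):.0%} of signals")  # 43%
```

The dominant topic’s raw score never shrinks; it is simply outweighed, which is why the advice is to add engagement elsewhere rather than to stop reading what you already like.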
You can also reduce personalization by adjusting privacy settings and turning off customization features. And there may soon be a way to opt out of personalization algorithms entirely. Members of Congress in both parties support a bill called the Filter Bubble Transparency Act, which would give users more control over the information they see and require internet platforms to offer people the option to consume content “outside the potential manipulation zone” of algorithms. Another bill proposed in the Senate, known as the Algorithmic Accountability Act of 2022, would give consumer protection agencies like the Federal Trade Commission oversight of companies that use automated algorithms.
The news industry should also be more transparent about how it uses this technology and remember that it has a civic responsibility to give people the news they need to be free and self-governing, not just news that generates engagement. Ultimately, that means news outlets cannot rely on algorithms and artificial intelligence alone to personalize the news. “News isn’t Netflix,” said Mike Dyer, the former president and publisher of The Daily Beast, in a story on Digiday. “Journalism companies have a public good mission to confront people with fact-based reporting regardless of how they feel about it, and too much personalization can harm that mission.”