CNTI’s Summary
Digital platforms subtly guide how we create and discover content. People around the world increasingly rely on digital intermediaries for news and information, and newsrooms must now optimize online content for clicks, shareability and engagement. In this environment, ensuring that algorithmic selection incentivizes high-quality information plays an important role in promoting an informed public, protecting an independent press and enhancing platform credibility. Alongside the need for legal and organizational policy to promote platform transparency, cross-industry collaboration is critical to ensuring that platform algorithms select and prioritize fact-based, independent news content.
The Issue
The display of news content on digital platforms deviates from the traditional news model by decoupling news production from news distribution. In addition to news organizations’ own digital distribution, technology companies’ algorithms distribute news by selecting, filtering, ranking and bundling news for consumers. News organizations increasingly depend on digital intermediaries to reach audiences through these means, and the public, in turn, relies on digital intermediaries to access news. Social media platforms, search engines, news aggregators and video-sharing services are becoming the dominant means for news consumption across the world. While these pathways increase news access and reach, they also introduce two challenges: the algorithmic selection process is usually opaque to news publishers and the public (who both rely on it), and the algorithmic selection results have the potential to expose the public to lower-quality news and information.
In addition to addressing transparency in how digital platform algorithms function (a topic CNTI addresses in a separate issue primer), there is a need to understand how to ensure fact-based, independent journalism rises to the top on digital platforms. Both publishers and digital platforms face challenges over how to determine the newsworthiness of editorial content. In particular, how can algorithms find, select and prioritize accurate, evidence-based content? This is important in enabling:
- An informed public. News reporting remains critical to supporting an informed public. With the digital environment open to both well- and ill-intended actors, algorithmic selections risk delivering lower-quality or erroneous content to the public. In addition, amid declining revenues, some publishers may be tempted to prioritize and elevate content selected and amplified by platform algorithms over coverage driven by the public interest.
- Journalism’s sustainability. News publishers increasingly rely on digital platforms to reach a broader audience, to drive more traffic to their own websites or apps and to increase subscriptions and donations. Less than a quarter (22%) of news audiences say they prefer to start their news journeys with a news website or app, down 10 percentage points since 2018. Instead, social media platforms play an increasingly critical role in news consumption. Amid efforts to seek more revenue from algorithmically driven news consumption (discussed in a separate CNTI issue primer), the opportunities digital platforms provide for exposure and reach to fact-based, independent content remain critical for many publishers. There is also the real possibility of platforms shifting away from news altogether which, at least in our current structures, would almost certainly harm publishers’ ability to reach audiences and inhibit the public’s ability to access legitimate news sources and information.
- Digital platforms’ relevance, trust and revenue. Although news content typically makes up only a small portion (according to Facebook, around 3-4%) of digital platform content, platforms that provide independent and fact-based news content reap the benefits of increased relevance, credibility and some revenue when they display ads next to links or snippets and collect user data for targeted advertising. News generally improves the quality and range of content on digital platforms, whose aim is to entice users to spend as much time as possible in their “walled gardens.”
Behind the challenge of ensuring that algorithms identify and promote fact-based, independent journalism is the influence digital platforms possess through subtly guiding how society creates and discovers content. Digital platforms have evolved beyond the role of distribution channels to exert direct and indirect editorial influence (though some have begun to shift away from news). For example, in the past decade, digital platforms have incentivized particular types of content (e.g., live video, short video) and set design standards (e.g., subscription policy, anti-cloaking). As a result, editorial decision-making in a competitive, data-driven news market is increasingly based on third-party choices, resulting in newsrooms optimizing online editorial content for clicks, shareability and engagement.
These dynamics signal the growing importance of ensuring that platform algorithms are structured in a way to identify and promote fact-based, independent journalism. There is a clear need for both legal and organizational policy when it comes to algorithmic transparency. There is also a critical need for cross-industry conversation and collaboration. There is evidence that collaboration leads to increased visibility of fact-based journalism by, for example, changing algorithms (and including human validation) to prioritize and elevate original reporting over standard ranking considerations. Is there a role for news publishers in shaping these algorithmic processes? And, what do publishers need from digital platforms to be able to serve the public and make effective and informed decisions about digital news content?
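To make the idea of elevating original reporting over standard ranking considerations more concrete, the sketch below shows one simplified way a ranking score could combine conventional relevance and engagement signals with a boost for human-validated original reporting. It is an illustrative assumption only: the weights, fields and signals (`is_original`, `human_validated`) are hypothetical and do not describe any actual platform algorithm.

```python
# Minimal, hypothetical sketch of a ranking score that elevates validated
# original reporting above standard relevance/engagement considerations.
# Weights and signals are illustrative assumptions, not a real platform formula.
from dataclasses import dataclass

@dataclass
class NewsItem:
    title: str
    relevance: float        # match to the user's query or interests, 0-1
    engagement: float       # predicted clicks/shares, 0-1
    is_original: bool       # marked as original reporting (e.g., via publisher metadata)
    human_validated: bool   # confirmed by a human reviewer

def rank_score(item: NewsItem) -> float:
    """Blend standard signals, then boost validated original reporting."""
    score = 0.6 * item.relevance + 0.4 * item.engagement
    if item.is_original and item.human_validated:
        score *= 1.5  # elevate the original story above repackaged versions
    return score

items = [
    NewsItem("Aggregated rewrite", relevance=0.8, engagement=0.9, is_original=False, human_validated=False),
    NewsItem("Original investigation", relevance=0.8, engagement=0.5, is_original=True, human_validated=True),
]
for item in sorted(items, key=rank_score, reverse=True):
    print(f"{rank_score(item):.2f}  {item.title}")
# The original investigation (1.02) outranks the higher-engagement rewrite (0.84).
```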
What Makes It Complex
Policy in this space will need to grapple with what quality means and how it gets assessed.
What content algorithms should promote, as well as who or what defines news “quality,” can be difficult to determine or to articulate in policy. And it often means different things to academics, to news publishers and to the public. Historically, the term “quality” was used by broadsheet newspapers and public broadcasters to distinguish their coverage from the more quantity-driven, sometimes sensationalist tabloid headlines as well as from commercial channels – though whether this distinction was ever true is questionable. It can be difficult, if not outright impossible, to label many news publishers entirely along a binary measure of “quality.” (Further, research suggests that such labels don’t measurably improve people’s news diets or reduce misperceptions.) In some contexts, the question of who determines the quality of news content can open the door to governmental overreach or abuse. Governments’ involvement in decisions around what qualifies as “quality” news content can lead to them serving as the arbiters of what does or does not appear online, potentially threatening an independent press and free expression.
Even if algorithms prioritize fact-based and independent journalism, people may still choose to consume content that is not fact-based or independent.
For instance, tabloid publishers have often argued that their popularity shows their content is what the public wants. The disconnect between what information publishers and the public deem relevant or valuable is a challenge for legal and organizational policy attempting to address algorithmic content recommendations. For example, should the public be compelled to see content they don’t want to see if governments or others determine it is good for them? Efforts must strike a balance between prioritizing fact-based content and protecting the public’s right to make their own decisions about how, or to what extent, they are informed. This disconnect is explored in more depth in a separate CNTI issue primer on news relevance.
Even if we can agree on the standards of quality journalism, the challenge of how to choose among sources remains.
For instance, many news stories repeat, repackage, aggregate or comment on previously published content. How can policy or other approaches disincentivize these practices in favor of original reporting? What role might platforms play in this? Further, in today’s attention economy, as algorithms sift through and prioritize content from an ever-growing number of news publishers and freelancers, smaller brands are often at a disadvantage amid the prioritization of publishers’ reach and quantity. Moreover, because of their assumed public and economic value, legacy and upmarket publishers often have more access to digital platform and search engine representatives and may receive more guidance on what types of content will perform better in algorithms. Meanwhile, smaller and often local news organizations may not be able to afford the same level of digital expertise needed to best navigate news algorithms. A large proportion of digital subscriptions already go to just a few big national brands, reinforcing a “winner takes most” dynamic in the online media industry.
Much like their audiences, news publishers generally have a limited understanding of how algorithms target news consumers and rank, boost, restrict and recommend news content.
To journalists, the metrics used in these algorithms range from opaque to obscure, despite some efforts to improve transparency. When platform operators make regular, strategic and unannounced changes to their algorithms, the ensuing drops in traffic often leave newsroom, SEO and social media teams feeling at the mercy of major technology companies. These knowledge gaps make it difficult for news publishers to participate in decisions about how platform algorithms should be built to promote an informed public by recognizing and encouraging fact-based, independent journalism.
Platform algorithms are entangled with broader human choices and user preferences, so they are not necessarily designed to prioritize news content.
Platform algorithms are generally built to respond to people’s broader interests, which may or may not include news and current affairs. Individuals see customized search results and feed recommendations, and it is unclear whether the public even desires more uniform results.
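As a simplified illustration of this point, the sketch below ranks items purely by overlap with a user’s inferred interests; news surfaces only when it matches those interests and otherwise loses out to other content. The categories and matching rule are hypothetical assumptions, not a description of how any specific platform personalizes feeds.

```python
# Hypothetical sketch of interest-based personalization: items are ranked by
# overlap with a user's inferred interests, so news competes with all other
# content and may never surface. Categories and scoring are illustrative only.
from typing import Dict, List, Set

def personalized_feed(items: List[Dict], interests: Set[str], k: int = 3) -> List[str]:
    """Return titles of the top-k items that overlap the user's interests."""
    scored = [(len(interests & set(item["topics"])), item["title"]) for item in items]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [title for score, title in scored[:k] if score > 0]

catalog = [
    {"title": "Election results explained",         "topics": ["news", "politics"]},
    {"title": "Highlights from last night's match",  "topics": ["sports"]},
    {"title": "10-minute pasta recipes",             "topics": ["food"]},
]

# A user whose inferred interests exclude news never sees the news item.
print(personalized_feed(catalog, {"sports", "food"}))
```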
As private business entities, platforms’ commercial incentives may not always be in sync with the aim to promote fact-based, independent journalism.
News content is the top priority for publishers, but it makes up only a small slice of what digital platforms host; this substantial (and widening) gap in how much importance each side attaches to news content results in different algorithmic objectives. Platforms also have commercial incentives to respond to and recommend content based on users’ digital behaviors, which may or may not include news content. These differences can lead, and have led, to conflicts between publishers and technology companies, at times culminating in news bans at publishers’ expense when technology companies feel unduly regulated. Like debates around algorithmic transparency, these conflicts demonstrate the need for cross-industry conversation and collaboration to ensure that commercial interests do not take precedence over identifying and promoting fact-based, independent news and information.
State of Research
The impact of algorithms on news media has been recognized as a critical area of research. The central academic debates focus on the risks and opportunities algorithmically ranked news presents to individuals and society, digital platform accountability and the shift from mass media distribution to personalized recommendation systems. Scholars across a range of fields have increasingly analyzed the critical relationship between those who manage algorithms and those who produce news.
Researchers have found evidence of publishers adjusting – even lowering – editorial standards to boost their position in ranking and recommendation systems. Researchers have also expressed concerns about accessing, understanding and evaluating the algorithms that govern a range of news-related processes. These concerns involve the opaque, “black box” nature of these algorithms as well as the increasing suppression of researchers’ access to digital trace data (which CNTI addresses in detail in a separate issue primer).
While it is clear that platform algorithms subtly guide trends and drive how people discover content, there is little evidence of exactly how they work or what they reward or amplify. Thus, it is important to differentiate between what the data firmly reveal about algorithms and news and what there is not yet evidence to support:
- In spite of public attention to fears of algorithmic “filter bubbles,” studies in the UK and several other countries consistently find that algorithmic selection by digital platforms generally leads to slightly more diverse news consumption (though self-selection may hinder this, particularly among political partisans).
- Algorithmic rankings can exert influence over how people engage with and evaluate news.
- Some work suggests that emotional language and out-group animosity are more likely to go “viral” or be shared on social media – trends that rely on user behavior alongside platform and algorithmic design.
- Historically, social media platforms like Facebook overwhelmingly weighted comments as more important than other types of interactions, leading posts that explicitly or implicitly encouraged commenting (e.g., divisive content) to gain more popularity (see the illustrative sketch after this list). But transparency around algorithms has limitations because the code is not static: platforms’ formulas for engagement are frequently changing.
- Research demonstrates that platforms are frequently changing algorithms and investing in both manual and algorithmic moderation processes to minimize the spread of disinformation. At the same time, problematic or extremist content is more likely to be amplified by certain digital platforms’ recommendation systems than by others.
- News consumers are generally skeptical of algorithmic selection of online news but are equally wary of news selection by editors and journalists. Similarly, people often perceive news produced by algorithms as equally or more credible than (or simply not discernible from) news selected by human journalists.
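To illustrate the comment-weighting dynamic described above, the sketch below scores posts as a weighted sum of interactions; when comments carry far more weight than likes or shares, posts that provoke replies outrank posts that are merely liked. The specific weights are hypothetical assumptions and do not reconstruct any platform’s actual engagement formula.

```python
# Hypothetical sketch of engagement scoring in which comments are weighted far
# above other interactions. The weights are illustrative assumptions only.
def engagement_score(likes: int, shares: int, comments: int,
                     w_like: float = 1.0, w_share: float = 5.0, w_comment: float = 15.0) -> float:
    """Weighted sum of interactions; a heavy comment weight rewards posts that provoke replies."""
    return w_like * likes + w_share * shares + w_comment * comments

broadly_liked_post = engagement_score(likes=500, shares=40, comments=20)   # little debate
divisive_post      = engagement_score(likes=150, shares=30, comments=120)  # fewer likes, many replies

print(broadly_liked_post, divisive_post)  # 1000.0 vs 2100.0: the comment-heavy post ranks higher
```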
There is opportunity for future work to serve as an intermediary between digital platforms and news organizations, pushing for more transparency to fully understand the complexity of algorithmic mechanisms and to find where the balance may lie between demystifying these processes to promote fact-based, independent news and information and protecting against political or commercial manipulation. On the policy side, the effects of recent digital platform legislation on newsroom practices and reach are unclear. Establishing best practices, both in digital platforms and editorial environments, will depend on the findings of research in this area.
Notable Studies
State of Legislation
Legislative approaches to address platform algorithms in regard to news content and distribution range from direct government intervention to self-regulation. While laws and proposals vary by country, they face a common challenge: the “pacing problem” – technology moving faster than the law.
We discuss two key groups of policy approaches to the impacts of algorithm-driven services in separate issue primers. The first group appears in countries where the emphasis is to protect citizens against safety risks and to increase algorithmic transparency and accountability. The second group appears in countries like Australia, Canada, Brazil and the U.S. where the aim is to require large platform companies to negotiate commercial deals with news publishers for their content. Within both sets of approaches, it is unclear how to assess what “quality” news is.
Legislation in these areas can greatly impact – and even supersede – what news publishers and technology companies can do to promote fact-based, independent content. In some cases, media bargaining structures, which often primarily benefit legacy and larger publishers, would determine these actions. Thus, legislation may end up being the arbiter of what content appears online. Empirical evidence is needed to better understand how regulation impacts the algorithmic information landscape in practice, with intended and unintended consequences for fact-based editorial content and news consumption.
Notable Legislation
Resources & Events
Notable Articles & Statements
Digital News Report 2024
Reuters Institute for the Study of Journalism (June 2024)
YouTube launches new watch page that only shows videos from “authoritative” news sources
Nieman Lab (October 2023)
X changes its public interest policy to redefine ‘newsworthiness’ of posts
TechCrunch (October 2023)
Pluralism of news and information in curation and indexing algorithms
Centre for Media Pluralism and Media Freedom (February 2023)
“This is transparency to me”: User insights into recommendation algorithm reporting
Center for Democracy & Technology (October 2022)
Promoting a favourable environment for quality journalism in the digital age
Council of Europe (March 2022)
How can news algorithms do a better job at ranking and recommending journalism?
MisinfoCon (February 2021)
How algorithms are changing what we read online
The Walrus (September 2020)
Can improving algorithms in fact improve news quality?
NewsQ (April 2020)
Why am I seeing this? How video and e-commerce platforms use recommendation systems to shape user experiences
Open Technology Institute (March 2020)
Audit suggests Google favors a small number of major outlets
Columbia Journalism Review (May 2019)
What do we mean by “quality news”?
Tow Center for Digital Journalism (November 2018)
Has the Google of South Korea found a way to save struggling news outlets?
The Atlantic (December 2017)
Key Institutions & Resources
Center for Media Engagement: Research center based at the University of Texas at Austin conducting research to influence media practices for the benefit of democracy.
Centre for Media Pluralism and Media Freedom (CMPF): European University Institute research and training center, co-financed by the European Union.
NewsQ: Hacks/Hackers and Tow-Knight Center for Entrepreneurial Journalism initiative seeking to elevate quality journalism when algorithms rank and recommend news online.
Reuters Institute for the Study of Journalism: University of Oxford institute exploring the future of journalism worldwide through debate, engagement and research.
Tow Center for Digital Journalism: Institute within Columbia University’s Graduate School of Journalism serving as a research and development center for the profession as a whole.
Notable Voices
Charlie Beckett, Director, JournalismAI Project
Emily Bell, Director, Tow Center for Digital Journalism, Columbia Journalism School
Axel Bruns, Professor in the Digital Media Research Centre, Queensland University of Technology
Jeff Jarvis, Director, Tow-Knight Center for Entrepreneurial Journalism, CUNY
Natalie Jomini Stroud, Director, Center for Media Engagement
Connie Moon Sehat, Director, News Quality Initiative
Rasmus Kleis Nielsen, Director, Reuters Institute for the Study of Journalism
Pier Luigi Parcu, Director, Centre for Media Pluralism and Media Freedom
Matthias Spielkamp, Executive Director, AlgorithmWatch
Recent & Upcoming Events
European media between decreasing revenues and a quest for pluralism
Centre for Media Pluralism and Media Freedom
April 28, 2023 – Florence, Italy (hybrid)
Algorithmic Competition
Organisation for Economic Co-operation and Development
June 14, 2023 – Paris, France
Information Integrity in the AI Era: (Central) European Perspective
The Aspen Institute Central Europe
September 26, 2023 – Prague, Czech Republic
4th European Data & Computational Journalism Conference 2023
ETH Zurich
June 22–24, 2023 – Zurich, Switzerland