About this study
Why we did this study
This report is part of CNTI’s broader 2024 “Defining News Initiative,” which examines questions surrounding this theme in policy, technological developments and the views of journalists, in addition to public perceptions. The survey data in this report measure the public’s perceptions of news, journalism and technology in four countries: Australia, Brazil, South Africa and the United States. These data were collected in parallel with a global survey of journalists.
CNTI was motivated by several overarching questions:
- How does the public navigate new ways of being informed?
- Where do they see journalism fitting in?
- How can journalism do a better job of communicating its unique value?
Answers to these questions are central for better understanding the evolving information ecosystem. As CNTI learned in a series of focus groups last year in the same four countries, people are putting a lot of work into getting themselves up to speed on news. At the same time, many are actively tuning news out, expressing a sense that it is overwhelming. This is the first in a series of CNTI reports that examines these ideas.
As with all CNTI research, this report was prepared by the research and professional staff of CNTI. This project was financially supported by CNTI’s funders.
How we did this
CNTI’s survey questionnaire was developed internally by our research team and advisors in consultation with Langer Research Associates. Focus groups were first run in each of the four countries and informed the development of the questionnaire. In addition to references made throughout this report, themes from these focus groups may be found in a series of essays available on CNTI’s website.
In partnership with Langer Research Associates, the data were collected through different vendors in each country:
- Australian data are from an Infield International RDD CATI/cell phone sample conducted from September 4 to October 1, 2024. The total sample size was 1,000 respondents. The design effect was 1.39, and the margin of error was 3.7 points. The survey was available only in English. All interviews were conducted by telephone, and the median interview length was 18 minutes and five seconds. Infield International used 41 interviewers, all of whom were trained. The sample collection age categories were: 18-24, 25-34, 35-44, 45-54, 55-64 and 65+.
- Brazilian data are from an Inteligência em Pesquisa e Consultoria (IPEC) RDD CATI/cell phone sample conducted from September 13 to October 3, 2024. The total sample size was 1,000 respondents. The design effect was 1.67, and the margin of error was 4.0 points. The survey was available only in Brazilian Portuguese. All interviews were conducted by telephone, and the median interview length was 16 minutes and 49 seconds. IPEC used 76 interviewers, all of whom were trained. The sample collection age categories were: 18-29, 30-44, 45-59 and 60+.
- South African data are from an Infinite Insight RDD CATI/cell phone sample conducted from September 23 to October 16, 2024. The total sample size was 1,012 respondents. The design effect was 1.48, and the margin of error was 3.7 points. The survey was available in English (n = 811), Zulu (n = 138), Sesotho (n = 24), Sepedi (n = 17), Setswana (n = 12) and Xhosa (n = 10). All interviews were conducted by telephone, and the median interview length was 19 minutes and 17 seconds. Infinite Insight used 47 interviewers, all of whom were trained. The sample collection age categories were: 18-24, 25-34, 35-49, 50-64 and 65+.
- United States data are from Ipsos’s probability-based online KnowledgePanel® and were collected from September 12-21, 2024. A total of 1,670 panelists were initially selected, and 1,053 completed the survey. A total of 28 respondents were removed during the quality control process, yielding a final sample size of 1,025 respondents. The design effect was 1.13, and the margin of error was 3.3 points. The survey was available in English (n = 983) and Spanish (n = 42). All surveys were self-administered online, and the median interview length was 10 minutes and 23 seconds. The sample collection age categories were: 18-29, 30-44, 45-59 and 60+.
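The reported margins of error are consistent with the standard 95% confidence formula for a proportion of 0.5, inflated by each sample’s design effect. As an illustration (not the vendors’ own computation), the figures above can be reproduced as:

```python
import math

def moe_95(n: int, deff: float, p: float = 0.5) -> float:
    """95% margin of error in percentage points, inflated by the design effect."""
    return 100 * 1.96 * math.sqrt(deff * p * (1 - p) / n)

# Sample sizes and design effects as reported for each country:
for country, n, deff in [("Australia", 1000, 1.39), ("Brazil", 1000, 1.67),
                         ("South Africa", 1012, 1.48), ("United States", 1025, 1.13)]:
    print(f"{country}: +/- {moe_95(n, deff):.1f} points")
```

Each computed value rounds to the margin of error reported in the corresponding bullet above.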
Technical reports from the survey vendors are available upon request from info@innovating.news.
How we weighted survey data
Each country’s data were weighted using demographic variables (i.e., age, sex, education and macroregion). While each country used a different age breakdown for sample collection, we recoded age into the following common categories: 18-29, 30-44, 45-54 and 55+. These categories were chosen to ensure there were at least 100 weighted respondents in each group; South Africa, for example, has a notably younger population than the United States, whose population skews older.
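The common age recode described above can be sketched as follows. This is an illustrative helper, not the actual weighting code (which was written in R); it assumes raw ages were available, whereas a vendor that supplied only banded categories would require a band-to-band mapping instead.

```python
def recode_age(age: int) -> str:
    """Collapse respondent age into the common weighting categories."""
    if 18 <= age <= 29:
        return "18-29"
    if 30 <= age <= 44:
        return "30-44"
    if 45 <= age <= 54:
        return "45-54"
    if age >= 55:
        return "55+"
    raise ValueError("survey respondents are adults aged 18+")
```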
For specific questions about the sample frames, weighting procedures and/or additional survey details, please send an email to the research team at info@innovating.news.
How we protected our data
Data collection was done by the country-specific vendors listed above. The survey included individual-level information such as age, gender, race, political ideology and macro-region. Survey data supplied to CNTI from the vendors did not include names or specific locations of respondents. Each respondent received a unique identifier.
It would be very difficult, if not impossible, to identify survey respondents because CNTI did not collect personal contact information or contact respondents directly. The survey data for this project are securely stored in an encrypted folder accessible only to the core research team at CNTI.
How we analyzed the data
Data were analyzed using the R statistical computing language. In addition to base R functions, several packages (libraries) were used to clean and analyze the data. These included: googledrive, pewmethods, pollster, survey and tidyverse. We recoded missing, refused and don’t know responses into a catch-all category to keep each country’s sample size consistent across every question without survey logic (i.e., questions asked of every respondent).
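The catch-all recode described above can be sketched as follows. The raw response labels here are hypothetical stand-ins (the actual codes in the vendor files may differ), and the actual cleaning was done in R rather than Python:

```python
CATCH_ALL = "DK/Refused/Missing"

# Hypothetical raw labels for non-substantive answers; actual vendor codes may differ.
NONSUBSTANTIVE = {None, "", "Don't know", "Refused"}

def recode_response(value):
    """Collapse missing, refused and don't-know answers into one catch-all category."""
    return CATCH_ALL if value in NONSUBSTANTIVE else value
```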
Data were analyzed using the demographic sample weights supplied by vendors and Langer Research Associates. These weights were applied to every analysis and topline in this report. Non-statistical analyses consisted of toplines and crosstabs. Statistical analyses found in this report consisted of Chi-square proportion tests (see the “How we tested for statistical significance” section below).
Exploratory analyses were run to learn about the data and responses within each country. These methods consisted of both linear (ordinary least squares) and non-linear regression (logistic regression, ordered logistic regression). Upcoming reports will implement these approaches to present more information about country-specific attitudes and behaviors.
Preliminary coding of open ends
Each country’s open-ended responses were analyzed separately. We were conservative in large part because of the multilingual nature of our data set. All responses from across the four countries were translated into English.
These translations may not fully reflect the nuance found in the original languages. Terms do not necessarily have exact equivalents across languages, so we used the translations to stand for larger themes or concepts and avoided analyzing nuances of meaning among similar terms.
Responses to both the PSDEFJSM and PSJSMPRO questions were analyzed by examining the most frequent words used within each country’s data to gain a general understanding of prevalent and shared ways that respondents might conceive of the (1) differences between news and journalism and (2) traits of those who produce journalism.
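A minimal sketch of this word-frequency approach is below. The tokenization and stop-word list are illustrative assumptions, and the actual analysis was conducted in R:

```python
from collections import Counter
import re

# Illustrative stop-word list; a real analysis would use a fuller, per-language list.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "is", "in", "it", "that"}

def top_words(responses, k=10):
    """Count the most frequent substantive words across open-ended responses."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(k)
```

For example, across the (made-up) responses "News is facts" and "Journalism adds facts and context", the word "facts" would surface as the most frequent substantive term.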
We will examine the full set of open-ended responses for both the (1) defining news versus journalism and (2) traits of journalism producers items in an upcoming report.
How we addressed data quality
Data quality in the open-ended responses received attention during the analysis phase of the project. CNTI worked with Langer Research Associates and country vendors to review what interviewers recorded from respondents’ open ends. Several anomalies and mistakes were found. Interviewer recordings of the open-ended responses were reviewed by the vendor and CNTI was supplied with updated data for both open-ended questions.
We also examine results by country, rather than as one total, because a mode effect may be present in the survey results. The U.S. data were collected online, whereas the other three countries’ surveys were conducted by telephone, which may yield higher levels of socially desirable responding in those locales. Research documents variability in survey responses across countries and cultures regarding social desirability and acquiescence, which may, in part, also be shaped by survey mode.
How we tested for statistical significance
We analyzed the results using chi-square proportion tests to assess differences in responses between pairs of countries. We used a standard threshold of p < 0.05 for assessing statistical significance; differences mentioned in the report text are statistically significant.
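For unweighted data, a chi-square test (1 degree of freedom) that two independent proportions differ is equivalent to the two-sided two-proportion z-test (chi-square = z²). A minimal sketch of that equivalence is below; note that the report’s actual analyses applied survey weights (via R’s survey package), which this simplified version does not account for.

```python
import math

def two_prop_chisq(x1, n1, x2, n2):
    """Chi-square (1 df) test that two independent proportions differ.

    Uses the pooled-proportion z statistic; returns (chi_square, p_value),
    where chi_square = z**2 and the p-value is two-sided.
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return z * z, p_value

# Hypothetical example: 60% vs. 45% agreement in two samples of 1,000.
chisq, p = two_prop_chisq(600, 1000, 450, 1000)
significant = p < 0.05  # True for this hypothetical difference
```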
One question was removed from this report’s analysis
PSSTART: Due to a question wording error in the United States, this question was removed from comparative analysis in this report. The United States received the question: “How do you most often follow news about issues and events?” rather than the question the other three countries received: “When you want to get informed about issues and events, what is your most common place to start?” While PSSTART was not examined in this report, it will be explored on a country-by-country basis in forthcoming reports, and data are presented in the topline.
Additional Notes
- Respondents in the three countries with telephone surveys — Australia, Brazil and South Africa — were not read a “Don’t know” option for either PSAIHELP1 or PSJHELP2, while this option was made explicit in the U.S. The “Don’t know” and “DK/Refused/Missing” categories were combined to decrease confusion when analyzing these two questions.
- We also note that responses to the questions PSAIHELP1 and PSJHELP2 in the online U.S. survey were updated after recontacting individuals who had responded “Don’t know.” Responses from recontacted individuals were added back into the survey as a new question and combined with the answers from those who did not reply “Don’t know.”
- For each of the four countries, the topline data for PSAIHELP1 and PSJHELP2 reflect all “Don’t know” responses being grouped into the “DK/Refused/Missing” catch-all category.
Acknowledgments
The Center for News, Technology & Innovation thanks Langer Research Associates and the affiliated country vendors for bringing this project to fruition. These vendors include Infield International (Australia), Inteligência em Pesquisa e Consultoria (IPEC) (Brazil), Infinite Insight (South Africa) and Ipsos (United States). Langer Research Associates provided valuable feedback on the focus group and survey designs, and we thank the country vendors for facilitating and collecting the survey data.
CNTI also greatly appreciates the feedback received from our team of country advisors who assisted with questionnaire development and understanding country-level nuance in the report findings:
- Jeremy Gilbert (United States) is the Knight Professor of Digital Media Strategy at Northwestern University.
- Sora Park (Australia) is a Professor of Communication at the Faculty of Arts & Design and Professorial Research Fellow at the News & Media Research Centre, University of Canberra (Australia).
- Nina Santos (Brazil) is a researcher at the National Institute of Science and Technology for Digital Democracy (INCT.DD) and at the Centre d’Analyse et de Recherche Interdisciplinaires sur les Médias (Université Panthéon-Assas).
- Scott Timcke (South Africa) is a Senior Research Associate at Research ICT Africa and a Research Associate at the University of Johannesburg’s Centre for Social Change.
- Subramaniam (Subbu) Vincent (United States) is Director of the Journalism and Media Ethics program at the Markkula Center for Applied Ethics at Santa Clara University.
Thank you for providing CNTI with your experience and expertise.
Stay tuned for more reports using the data collected through this project.
As with all CNTI research, this report was prepared by the research and professional staff of CNTI. This report was written by CNTI’s Research Team (Amy Mitchell, Celeste LeCompte, Connie Moon Sehat, Emily Wright, Jay Barchas-Lichtenstein, Nicholas Beed and Samuel Jens). The work could not have been completed without our colleagues Chelsey Barnes and Uduak Grace Thomas; our copy editor Greta Alquist; our graphic and web designers, Jonathon Berlin and Kurt Cunningham, as well as the team at MG Strategy + Design; and our communications team at Black Rock Group.
CNTI does not lobby for or propose specific legislation and instead is dedicated to supporting policy creation through further research and collaborative, multi-stakeholder discussions.
CNTI is generously supported by Craig Newmark Philanthropies, John D. and Catherine T. MacArthur Foundation, John S. and James L. Knight Foundation, The Lenfest Institute for Journalism and Google.
Topline survey data
Below is a link to our topline data.
What the Public Wants from Journalism in the Age of AI