Artificial Intelligence (AI) is rapidly changing how information is delivered to people and how they consume it. AI summaries, which Google calls “AI overviews,” for example, are becoming an increasingly popular and normalised way of finding information. However, AI-summarised search results often misrepresent facts.
Those AI overviews also deprive websites of search traffic, which can hurt journalism's bottom line. AI systems access content (sometimes even paywalled content) and repackage it in a convenient, digestible form for readers.
From a consumer’s point of view, there is not always a need (or time) to click through to the original content, and many will not seek out sources directly. The value of an individual news story becomes close to zero.
News organisations are now facing another revenue hurdle: How to create value in a zero-click world?
READ I Generative AI and journalism: Hidden risks reshaping information
Trust in news in the age of AI
Facing this new challenge of creating value, news organisations are responding by minimising costs and streamlining workflows with the same AI technology.
Ironically, this is fuelling scepticism and eroding trust.
In this year’s Digital News Report: Australia we asked respondents how they felt about news produced mostly by AI, compared to news produced entirely by a human journalist. Overall, nearly half (47 percent) say AI news is cheaper to make, and one third (33 percent) view it as more up to date. However, many also think it is less trustworthy (43 percent), less accurate (38 percent), and less transparent (37 percent) than news produced by humans.
We have also interviewed dozens of audience members to understand their experiences with and reactions to AI in greater depth. In addition to being concerned about the accuracy of AI information, news audiences are also concerned about issues of algorithmic bias.
One participant told us: “As humans, everybody has biases, and we do our best to keep them in check, but something like AI will heighten any biases we have because of the way the questions are being asked.”
Figure 1: Perceptions of AI News (%)

Source: Digital News Report: Australia 2025
AI is adding another layer of uncertainty for news audiences who already feel overwhelmed by the volume of misinformation and by their inability to discern quality news.
In the words of one of our participants: “I think that’s kind of where part of the problem lies. I don’t know where it’s being used. I wouldn’t be able to say if I’ve seen it used to generate an article because I wouldn’t be able to pick what’s being written by a person, what’s being written by AI.”
In an AI-mediated information ecosystem, unless news organisations radically change the way they do business, they will no longer be of value to audiences. A single article will have little value when AI combines it with hundreds of other legitimate stories to create customised content for the consumer.
READ I AI is transforming news, search, fact-checks and feeds
The role of the human in journalism
Maintaining the status of a trusted brand, a place people can go when they really need to know the truth, is something AI cannot provide. This is where the role of the human journalist comes into play. And we know from the research that people still want a human journalist to create content rather than AI: 43 percent say they are comfortable if news is produced by a journalist with some AI assistance. This drops to 21 percent if news is produced mostly by AI with some human oversight. Concern about the use of AI in journalism is partly related to people’s knowledge of and confidence in the technology, including worries about cultural context and the handling of sensitive topics.
Those who have received news literacy education are much more likely to accept AI-produced news: 40 percent of them say they are fine with news produced mainly by AI with some human oversight, compared to only 15 percent of those who have not received news literacy education. Those who are more media literate are more likely to have the confidence to navigate the complex information environment and to know how to discern trusted sources of news.
Raising awareness of how AI is used in the creation of news is essential to maintain the trust people have in journalism and news organisations. Some news organisations, such as the ABC or The Guardian, provide public notices about how their staff use generative AI.
Other news organisations haven’t publicly disclosed how their staff use AI, and many small and medium-sized news organisations may lack AI policies entirely. All of this is complicated by research showing that audiences say they cannot always tell whether, or how, AI is used in the journalism they consume.
Changing audience preferences
Regardless of people’s comfort with AI being used in the production of news, generative AI is becoming a more popular way to get news, especially among younger audiences. When asked how they feel about online news content tailored to individuals, about a third of respondents showed interest in receiving summarised versions of news, followed by story recommendations or news alerts based on their interests.
Overall, convenience and relevance appear to be the main drivers of interest in AI-generated news personalisation. Generally, younger people are more interested in using AI to personalise news. Younger people expressed a much stronger desire to use AI to make news easier to understand. Those under 35 are almost three times as likely to say they are interested in news articles where the language level can be modified (24 percent vs 9 percent).
Figure 3: Interest in news personalisation by age (%)

Source: Digital News Report: Australia 2025
The future of adoption of AI in journalism may depend on balancing the benefits and risks of technological advancements, and raising audience comfort, trust and news literacy.
Professor Sora Park is the Director of the News and Media Research Centre at the University of Canberra. Dr TJ Thomson is a senior lecturer and Australian Research Council DECRA Fellow at RMIT University, where he co-leads the News, Technology, and Society Network. Originally published under Creative Commons by 360info.
Note: This article is part of a series on AI, Journalism and Democracy. The other articles are:
https://360info.org/ai-in-journalism-and-democracy-can-we-rely-on-it/
https://360info.org/ai-and-the-news-how-it-helps-fails-and-why-that-matters/