How AI could hurt — and help — local journalism

As news deserts expand and resources shrink, AI threatens to further undermine local media outlets. Yet, carefully deployed, it may also help fill civic information gaps and support public-interest reporting.

Artificial Intelligence (AI) is increasingly being used by local news producers to create efficiencies and assist journalists. Much like the rise of digital platforms, it can be both a friend and foe to local news outlets and our social and political systems.

It is widely acknowledged that public interest journalism is an essential part of a thriving democracy. When done well, it holds public institutions and powerful individuals accountable, keeping the population informed and providing a platform for public debate.

With more than 550 municipalities operating across Australia and hundreds of judicial courts, the role of local news media is especially vital to keeping people informed of civic affairs in all parts of the country.

Local news struggles

Despite its importance, local news sectors around the world are confronting declining advertising revenue and audience fragmentation, driven by the massive disruption that followed the rise of Big Tech.

The Covid-19 pandemic led to more cuts, closures and consolidation. Amid this, concerns have grown about ‘news deserts’ — towns or cities with little to no access to locally relevant news.

These trends are concerning for the proper functioning of democracy as studies have found links between local news decline and increases in government inefficiency, corruption and political polarisation.

The rise of AI might appear to be another nail in the coffin for local news producers, as tools such as Google's Gemini-powered AI Overviews and ChatGPT allow people to bypass news websites (and social media) and receive information summaries instead. Although generative AI tools can be unreliable, they are steadily becoming more accurate and credible.

A copyright battle is underway, with media businesses suing AI companies over the use of news content. While some publishers are pushing for deals to monetise their content, local news producers face greater difficulty because of their smaller size, lack of collective bargaining power and limited legal resources.

Local AI opportunities: council, courts and collaboration

Paradoxically, AI might also help ease the very burdens that digital technologies, AI included, have in part created, by providing new efficiencies and tools to help fill information gaps and support journalism.

Research also shows that physical presence and news relevance are vital to the credibility of, and trust in, local journalism. AI cannot replace human presence or the close connections local journalists build through in-person relationships, but it can support efforts to synthesise information from local government, local courts, public institutions and other authorities, sharpening an outlet's local news focus.

Local providers have already started using AI to create news production efficiencies and automate the flow of information. Examples in Australian local newsrooms include transcribing interviews, summarising documents, extracting key facts, producing sports results, adding metadata, and generating social posts, advertising copy and alerts.

In 2023, News Corp Australia revealed it was producing 3,000 articles a week using generative AI, overseen by a unit named Data Local. Court lists are one example of the content generated, such as a story in the Cairns Post listing the district court sittings at Toowoomba.

This year, The Guardian reported that journalists at News Corp had voiced concerns over plans by the company to broaden its use of AI for news production, including one tool that would be used to produce and edit stories, effectively performing the role of a sub-editor.

National and international research suggests the use of AI is growing in newsrooms. A 2024 survey of 1,004 UK journalists found that 56 per cent reported using AI professionally at least once a week, mainly for speed and efficiency, though few used it for audio or video generation. There is, nonetheless, a gap in research on how local journalists are using AI. A 2025 study that applied an AI-generated text detection tool to 186,000 articles from 1,500 US newspapers found that nine per cent were partially or fully AI-generated, and noted that AI use was rarely disclosed.

One area where cautious development of AI support may be warranted is local court reporting. Our own research and other studies suggest court reporting is often uneven and sporadic due to a lack of resources. Covering court hearings can be one of the most time-consuming tasks for news outlets at the local level.

International research has identified a particular lack of public scrutiny of coroners' courts by local media in the UK, with inquests rarely reported in news deserts. Researchers at the University of Surrey have built an AI transcription tool for use in UK Supreme Court hearings that they claim makes justice more accessible and transparent. Another potential use case is training an AI tool to sort through court documents and flag search warrants worth further investigation, freeing up a journalist's time.

The Coroners Court of Victoria launched a pilot program in 2024 exploring the use of AI to review large documents and extract key pieces of information. All of these experiments stress the importance of AI supporting, rather than replacing, humans. Human oversight is needed not only to ensure accuracy: as has been argued, journalists play a crucial role when reporting on people at their "worst moments" and in challenging inappropriate court reporting restrictions.

The challenges that local news organisations face around AI innovation mirror their limited bargaining power with Big Tech: they often lack the money, resources, technical capacity and leverage to experiment on their own. This is why our research suggests that cross-sector, industry-wide collaborations make sense for AI development and experimentation in the local news sector.

International examples demonstrate that when done well, cooperation to tackle shared challenges can benefit all involved, including audiences.

AI risks for the local news sector

Cautionary tales are mounting about the use of AI for short-term gains. Such use poses risks to the long-term viability, relevance and legitimacy of news organisations, which must maintain audience trust and meet their public and legal responsibilities.

Earlier this year, staff in Australian Community Media (ACM) newsrooms spoke up about the rollout of a generative AI model that was creating legal headaches for journalists; some regional journalists told the ABC they refused to use the technology because of its inaccurate headlines and legal advice. An Australian radio station, meanwhile, faced backlash after admitting it had used an AI-generated host for months without telling its audience, risking trust and credibility.

Southern Cross Austereo (SCA) is reportedly investigating whether AI is to blame for wrongly naming a journalist as the alleged perpetrator of an attack on police officers. The long-term risk to an organisation's local relevance and viability is just as significant: producing generic, robotic content destroys the very aspect of local news that makes it a valuable commodity, its localness.

The use of AI could also amplify the problem of syndicated generic 'local' content, which can erode the value of a business by diluting the geographic focus of its news. The rise of generic syndicated content is already reducing content diversity in vulnerable regional news areas, so any efficiency that exacerbates the trend should be disincentivised by policy makers, particularly as the Australian Government has identified safeguarding content diversity as a key policy concern.

At the end of the day, AI-produced news content can never fill the quality gap if it is not supporting original public interest journalism created by humans. Studies in the US show that only 20 per cent of local news organisations have public AI usage policies, with many citing uncertainty over standards and audience perceptions, and a lack of time to develop a carefully considered policy approach.

With Australian local news outlets currently seeking government funding to support the sector, the government has a powerful role to play. The financial support it provides can help ensure AI is used to improve, rather than further erode, local news quality and diversity.

Kristy Hess is a Professor of Communication at Deakin University. Dr Angela Ross is Research Lead, ABC News and an honorary research fellow at Deakin University. Originally published under Creative Commons by 360info™.
