AI in Local News: Uses & Policies

Source: laconiadailysun.com

Published on October 3, 2025

AI in Newsrooms: Aiding, Not Replacing

Newsrooms in New Hampshire and elsewhere are exploring how artificial intelligence can support their work. They're also proceeding carefully, setting boundaries to ensure AI isn't used to generate articles or images.

Many journalists are using AI tools like Otter to transcribe interviews, which saves time. Some news organizations use AI to monitor public meetings, helping them find story ideas or sources.

Cautionary Tales

Journalism has already produced some well-known cautionary tales about AI. The Chicago Sun-Times and The Philadelphia Inquirer, for example, published a summer reading list that recommended nonexistent books. The freelancer responsible admitted to using generative AI without verifying the results.

Jonathan Van Fleet, editor of the Concord Monitor, stated that any use of AI requires human oversight. AI can make journalists more efficient, but it cannot replace their reporting, writing, editing, or fact-checking.

Local Newsrooms Weigh In

Julie Hirshan Hart, editor at The Laconia Daily Sun, says that while there are no official AI policies yet, the topic is under discussion. A firm rule is that The Daily Sun will not use generative AI to write articles. According to Hart, content should not be run through an AI generator, left unread, and then included in a story.

Journalists have used AI for brainstorming headlines and photo captions. Hart has also considered using it to automate routine tasks like formatting police logs. However, she stresses that AI will never replace journalists: "I see it as a tool that you can use in your brainstorming process or in your writing process, but it should not replace, as a journalist, your news judgment or your experience or your particular voice," she said.

Similarly, the Concord Monitor uses AI to enhance its journalists' work. Van Fleet mentioned that they've used AI to suggest URLs that perform better in search results. It can also quickly convert large PDF files of public records into searchable documents.

"We're using the tool to help us do what we do faster and more efficiently," he said. "But what we're not doing is saying, ‘Cover this meeting for us.’”

Trust and Transparency

Van Fleet noted that as AI content grows, readers will question what they can trust. News outlets must be transparent about their AI use. The Monitor has published its AI policy on its website, requiring staff to be clear about any AI use in reporting, writing, or editing. All AI-generated information must be checked by a reporter or editor before publication.

The policy states that AI tools help them work more efficiently by suggesting headlines, summarizing stories, and organizing public information, but they do not replace human judgment, reporting, or editing. Van Fleet emphasized the importance of this point: "We are not generating fake articles," he said. "We are not having a robot cover the news of your community. You are going to interact with a human being. You're going to speak with a reporter, and you're going to be quoted accurately, and if you have questions about that story, you can talk to a human being about it."

This story is part of Know Your News, a Granite State News Collaborative and New England Newspaper and Press Association Press Freedom Committee initiative on why the First Amendment, press freedom, and local news matter. Don't just read this. Share it with one person who doesn't usually follow local news: that's how we make an impact. More at laconiadailysun.com/knowyournews.