Showing posts with label Poynter. Show all posts

Thursday, July 11, 2024

The assignment: Build AI tools for journalists – and make ethics job one; Poynter, July 8, 2024


"Imagine you had virtually unlimited money, time and resources to develop an AI technology that would be useful to journalists.

What would you dream, pitch and design?

And how would you make sure your idea was journalistically ethical?

That was the scenario posed to about 50 AI thinkers and journalists at Poynter’s recent invitation-only Summit on AI, Ethics & Journalism.

The summit drew together news editors, futurists and product leaders June 11-12 in St. Petersburg, Florida. As part of the event, Poynter partnered with Hacks/Hackers to ask groups of attendees to brainstorm ethically considered AI tools that they would create for journalists if they had practically unlimited time and resources.

Event organizer Kelly McBride, senior vice president and chair of the Craig Newmark Center for Ethics and Leadership at Poynter, said the hackathon was born out of Poynter’s desire to help journalists flex their intellectual muscles as they consider AI’s ethical implications.

“We wanted to encourage journalists to start thinking of ways to deploy AI in their work that would both honor our ethical traditions and address the concerns of news consumers,” she said.

Alex Mahadevan, director of Poynter’s digital media literacy project MediaWise, covers the use of generative AI models in journalism and their potential to spread misinformation."

Wednesday, July 3, 2024

We asked people about using AI to make the news. They’re anxious and annoyed; Poynter, June 27, 2024


"Sometimes, when it comes to using artificial intelligence in journalism, people think of a calculator, an accepted tool that makes work faster and easier.

Sometimes, they think it’s flat-out cheating, passing off the work of a robot for a human journalist.

Sometimes, they don’t know what to think at all — and it makes them anxious.

All of those attitudes emerged from new focus group research from the University of Minnesota commissioned by the Poynter Institute about news consumers’ attitudes toward AI in journalism.

The research, conducted by Benjamin Toff, director of the Minnesota Journalism Center and associate professor at the University of Minnesota’s Hubbard School of Journalism & Mass Communication, was unveiled to participants at Poynter’s Summit on AI, Ethics and Journalism on June 11. The summit brought together dozens of journalists and technologists to discuss the ethical implications of journalists using AI tools in their work.

“I think it’s a good reminder of not getting too far ahead of the public,” Toff said, in terms of key takeaways for newsrooms. “However much there might be usefulness around using these tools … you need to be able to communicate about it in ways that are not going to be alienating to large segments of the public who are really concerned about what these developments will mean for society at large.”

The focus groups, conducted in late May, involved 26 average news consumers, some of whom knew a fair amount about AI’s use in journalism and some of whom knew little.

Toff discussed three key findings from the focus groups:"