AI trends in newsrooms report: revolutionizing the story creation process
Last updated on: 11 October, 2019 04:16 pm
AI systems have shown huge potential for investigative journalism by helping reporters analyse data.
US (Web Desk) – A new research report has identified a key trend for 2019: the rise of artificial intelligence (AI) in the news creation process, which is impacting and transforming many aspects of the news industry.
Go-to newsroom AI expert Francesco Marconi of the Wall Street Journal has offered guidance on introducing AI in the newsroom. He is just one of the strategists and newsroom managers WAN-IFRA talked to for its latest Trends in Newsrooms report.
According to the World Association of Newspapers and News Publishers (WAN-IFRA), AI has slowly been making its way into newsrooms over the past five years, and its impact continues to grow. From creating news stories automatically to optimising content delivery, newsrooms are using AI to automate and augment their reporting and other newsroom processes, making workflows more efficient, speeding up time-consuming tasks, and increasing the breadth of their coverage.
In Norway, business daily Dagens Næringsliv is combining human and artificial intelligence in a bid to improve content recommendations; Swiss publishing group Tamedia built an algorithm to take over the curation process for one of its news apps to test whether it can successfully replicate a journalist’s ‘gut feeling’; and UK-based news service RADAR is marrying traditional reporting and automation with the goal of scaling up the production of local news reports.
AI systems have also shown huge potential for investigative journalism by helping reporters analyse massive amounts of data and enabling them to quickly find relationships among different entities. For the collaborative teams probing the Panama and Paradise Papers, this proved to be an invaluable asset.
But AI is not without risks. Although it hasn’t brought about the job losses many feared (quite the opposite, in fact, according to some experts), conclusions drawn by machines are not always correct, meaning journalists need to continuously question outcomes, validate methodologies, and ensure explainability. Another emerging risk of AI-driven content generation has manifested itself in the form of so-called deepfakes. In addition to traditional fact-checking processes, journalists must now also be vigilant about the possibility that video or image evidence may have been falsified.
In addition to highlighting these more contentious aspects of AI, the latest chapter of the Trends in Newsrooms report showcases numerous examples of how newsrooms across the globe are making use of AI, offers practical tips on introducing the technology, and addresses some of the ethical dimensions to consider when doing so.