How artificial intelligence is transforming journalism

The Wall Street Journal newsroom, New York, March 2010. (AP/Mark Lennihan)

Artificial intelligence (AI) is shaking up present-day journalism. Automated news writing and distribution, without human supervision, is already a reality, often unbeknownst to the reader. This raises a number of basic questions. What will journalists of the future have to learn? Is this new reality likely to improve the working conditions in the industry? What do media businesses stand to gain and lose?

The fourth industrial revolution is stirring up a cocktail of changes in the world of work, and, in the case of the European media, comes at a time when jobs in the industry are precarious (ever-fewer payroll employees and ever-growing numbers of low-paid freelancers) and investment in innovation is severely lacking (such as new tools or staff training). Projects putting their money on innovation have nonetheless started to emerge.

In 2015, the Norwegian News Agency (NTB) started work on a project to generate automated football news coverage, which was launched in 2016. Together with experts in artificial intelligence, a group of journalists learned new skills whilst the robot was being “trained”, an involvement that proved crucial to the development of the algorithm.

“A large amount of editorial input is needed to help the robot make the right choices. This learning process in the newsroom has led to many new ideas about possible areas of automation: from simple news regarding the weather and commodity prices to an ambitious plan to offer fully automated election night services for the local election in Norway next year,” explains Helen Vogt, who recently retired after a 42-year-long career in the media.

The automated news reports are supervised by a team of journalists and have proved to be 99 per cent reliable. For events the algorithm is unable to predict, if an incident leads to a match being cancelled, for instance, the robot has no way of knowing the cause, so it is programmed to write nothing when a match is suspended, Vogt explains.
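The guard Vogt describes can be sketched as a simple rule: when the input data shows a suspended or cancelled match, the generator stays silent rather than speculating about causes. The sketch below is purely illustrative, assuming hypothetical field names; NTB's actual system and data format are not public.

```python
def generate_match_report(match):
    """Return an automated match report, or None when the robot should stay silent.

    `match` is a hypothetical dict of structured match data (home, away,
    home_goals, away_goals, status).
    """
    # The algorithm cannot know *why* a match was suspended or cancelled,
    # so it is safer to produce nothing than to guess at the cause.
    if match.get("status") in ("suspended", "cancelled"):
        return None

    home, away = match["home"], match["away"]
    hg, ag = match["home_goals"], match["away_goals"]
    if hg > ag:
        return f"{home} beat {away} {hg}-{ag}."
    if ag > hg:
        return f"{away} beat {home} {ag}-{hg}."
    return f"{home} and {away} drew {hg}-{ag}."
```

Because each report is just a function call over structured data, the same generator can produce scores of stories for every match in a league within seconds, which is the capacity gain Vogt points to below.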

Automated news production can be seen as a continuation of the automation that began in newsrooms in the late 1980s, and the continuation of data-driven journalism.

“If journalists cannot compete with those systems, which will always be faster, they can make use of them for investigative work or to support their daily routines,” says Laurence Dierickx, a freelance journalist/developer and PhD student at the Belgian university ULB-ReSIC, where she is doing research into automated news production and how journalists use it.

Vogt, the former head of innovation at NTB, has seen how her agency has benefitted from it. “Automation has helped NTB deliver a much broader news service, reporting on lower tier matches that we never had the capacity to cover before. It means all our media customers get the reports they are interested in immediately after the match is over, because the algorithm can work so much faster than humans, and is able to produce scores of stories simultaneously, within seconds.”

How will it affect job prospects?

Will automation put journalists out of work? “There are very few estimates on the matter,” says Dierickx, giving a few figures: “8.25 per cent in Belgium (ING, 2015), 32 per cent for the whole information and communication sector in Wallonia (IWEPS, 2017), 17 per cent for the whole creative sector in Germany (McKinsey 2017) [are expected to lose their jobs as a result of AI]. Other prospective studies say that more journalists are likely to be affected (International Data Corporation 2016 and Ericsson 2017) but, at the same time, all of these studies underline that the jobs involving human relations will be preserved. There are a lot of contradictions, and no one can predict the future,” she acknowledges.

Dierickx and many other experts agree on both points: the contradictions and the inability to predict the future. True, the more repetitive tasks can be automated, but it is impossible to create a technology that replaces the essentially human parts of the profession, such as the relationship with sources, opinion, in-depth analysis or determining newsworthiness.

Dierickx stresses the empowering potential of innovation: “Instead of seeing automation technologies as adversaries, why not take the best of them and make them allies? We now have enough examples to demonstrate that it works.”

One of the projects bringing artificial intelligence to newsrooms is INJECT, an AI-based tool making it easier to find original angles on a story.

“It is essential that journalists be part of the conversation about the future of journalism and that they press for tech applications that benefit the profession,” underlines Andrea Wagemans, the project’s coordinator, who is determined to bring technology and the debate surrounding it closer to those in the trade.

Aware that what is good for journalism and what makes business sense do not always coincide, she insists on the need for journalists to take on a more active role and to develop a closer relationship with technological innovation. “What do you want AI to do? Now, but more importantly, in the future. How do you think it could help you do your job better? And how do you think it could support what journalism is supposed to be?” she asks.

For Vogt, journalists need to work closely with developers. “Many old-school journalists seem unable to talk to tech people: they do not understand what developers do, so they often disrespect their work. Learning a bit of Python code would probably help. A course in simple programming for journalists is something I definitely recommend.”
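As an illustration of the kind of “simple programming” Vogt has in mind, a few lines of Python can already turn structured data into a readable sentence, which is the core mechanic behind template-based automated news. The team names and league here are placeholder values, not output from any real system.

```python
from string import Template

# A one-line "robot reporter": fill a fixed sentence pattern with data fields.
headline = Template("$winner beat $loser $score in the $division.")

print(headline.substitute(
    winner="Brann", loser="Viking", score="3-1", division="Eliteserien"
))
# prints "Brann beat Viking 3-1 in the Eliteserien."
```

Swapping the hard-coded values for fields read from a feed or spreadsheet is the natural next step such a course would cover.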

What are the ethical and legal implications?

“So far, we have no answers, just questions,” explains Matthias Spielkamp, founder and executive director of AlgorithmWatch, a not-for-profit organisation that investigates the consequences of algorithmic decision-making (ADM) for society.

Should texts produced automatically, such as football match reports or financial news, be marked as automated so that readers know? “There are different approaches to this. Some say readers need transparency but at the same time many readers seem to be unconcerned about reading automatically produced content, as we can see in examples like weather reports. It seems to me a good idea to provide this information to readers right now because many people are simply not aware that automated production of journalistic content exists,” Spielkamp points out.

It is a debate that also stretches to other areas, such as automated content distribution, and where or whether lines should be drawn. “Let’s say the New York Times deployed a massive distributed botnet to push their content to certain targeted audiences – I suppose many of us would be averse to this, no matter how credible the reporting.”

There is no specific legislation on artificial intelligence in the EU, though some regulation does address the use of algorithms, such as the Directive on Markets in Financial Instruments or the General Data Protection Regulation, which covers automated decisions using personal data. “Just imagine that an article produced automatically is libellous. It doesn’t matter that it was produced by a machine; the publisher will have to assume responsibility.”

This article has been translated from Spanish.