Hacks versus Bots: Can AI ever replace journalists?
Digital technology is transforming almost every industry on the planet, with companies looking to steal a march on their competitors by using it to enhance their products and services, and to deliver them more efficiently. So it’s no surprise that publishers and news organisations are looking to get in on the act.
Around the world, there are examples of media companies adopting technology powered by AI to perform certain functions that were once carried out by their human employees.
Global news agency Associated Press already uses software to generate articles about quarterly earnings reports from thousands of companies – stories it would never have the resources to cover otherwise. Meanwhile, Swedish publisher MittMedia produces a robot-written story on every local house sale, a constant supply of editorial intelligence for eager estate agents and nosy neighbours.
There are examples of this technology being used as a force for good, like the French news agency that uses AI to detect doctored photos or a Canadian bureau that has developed a system to speed up translations between English and French content. However, for every success story, there are examples of incidents that highlight its limitations and call the practice into question.
One recent example was the storm that blew up around Microsoft’s MSN website, which decided to replace dozens of human journalists with AI software that selects, edits and publishes content from other news sources for its users. The new system picked up an article from The Guardian in which Little Mix star Jade Thirlwall spoke about her experiences of racist bullying at school, then published it with a photo of the wrong band member, Leigh-Anne Pinnock.
Of course, news organisations publishing incorrect photos or captions isn’t unheard of, but the fact that it came so soon after the decision to axe staff, and at a time when racism is such a prominent issue, added fuel to the fire.
It is also a timely reminder to media owners that readers will hold you responsible for the content you produce, whatever its original source. When artificially generated content starts looking or feeling different to the rest of your output, or fails to reflect certain cultural or psychological characteristics that users would expect, they may start tuning out or going to competitors for their content.
Setting aside the quality aspect for a moment, what about the ethical implications of using AI in journalism? Most news organisations will say the goal of automation isn’t to replace humans with robots, but to let real journalists focus on reporting the issues that matter, rather than getting bogged down in necessary but mundane tasks. If that’s the case, then many will support the ‘modernisation’ of the industry – but others, including unions, will keep a watchful eye on the reality of the situation and call out those they feel are using AI to cut corners and decimate jobs.
Other concerns include unconscious bias in automatically generated articles, whether political or corporate, and the lack of adequate scrutiny of the information being used to generate a story or illustrate its impact. When people question the integrity of your content, it is unlikely that “the computer wrote it” is a defence that will pass muster, either in law or in the court of public opinion.
There’s also the question of whether your automation can be sabotaged or become a target for hackers funded by political forces or market manipulators. However tempting the commercial benefits of employing AI may be, the question of “who’s watching the machine?” is never going to be far away.
For journalists, this is all about having empathy. It’s about putting yourself into the reader’s shoes and asking the type of questions they would want you to ask. It is also about understanding the plight of a story’s subject in order to report on it in a way that not only informs, but engages your audience.
For businesses, whether you’re launching a new product or service, championing a cause or trying to influence positive behaviour change, it’s about developing a real appreciation of how different audiences are thinking and feeling, and understanding the little nuances that can be so important to effective communication.
These little details are often the very things that get missed when content generation becomes automated and people are removed from the process. This is also why journalism and its associated industries, including PR and communications, which have storytelling and crafting a narrative at their heart, will always be about people, not machines. Some things will always be best done by humans.
This article was written by our Chief Executive, Angharad Neagle, and appeared in the Western Mail newspaper on 17 June 2020.