Chatbots, AI-driven software programs, have become increasingly popular in journalism because they can write articles quickly and efficiently, process data in real time, and deliver accurate news stories. However, they lack the creativity, empathy, and human touch that human journalists bring to their work. The International Federation of Journalists (IFJ) has warned that chatbots can spread false information and threaten journalists' job security, and that they should be designed with regulations and limitations that protect public trust.
IFJ suggests that journalists’ trade unions should play a role in shaping the regulation and implementation of chatbots in the media industry.
“The world of journalism continues to evolve, and we are witnessing the rise of new technologies and innovations that are changing the way we communicate and report the news. One such technology generating lots of headlines is chatbots powered by GPT (Generative Pre-trained Transformer) technology. While these chatbots offer significant potential for the media industry, they also pose unique challenges for journalists,” IFJ said in a statement sent to Malaysia World News.
According to IFJ, chatbots can automate routine tasks and help news organizations engage with audiences, but they can also spread false information and threaten job security; they should therefore be designed with regulations and limitations that protect public trust.
“On the one hand, chatbots can enable journalists to automate routine tasks such as fact-checking, article summaries and translations. This could free up journalists’ time to focus on more complex tasks such as investigative reporting and analysis. Chatbots could also help news organizations engage with their audiences in more interactive and personalized ways, such as through chat interfaces that can answer reader questions and provide recommendations on relevant stories.
“On the other hand, chatbots run the risk of inadvertently promoting misinformation and fake news. If chatbots are not designed with appropriate regulations and limitations, they can spread false information and contribute to the erosion of public trust in journalists and news organizations. Furthermore, chatbots could potentially replace journalists altogether, taking away their job security and putting many out of work,” IFJ said.
IFJ suggests that journalists’ trade unions should play a role in shaping the regulation and implementation of chatbots in the media industry. Unions should work with news organizations and technology companies to ensure that chatbots are used ethically and responsibly, and should press for investment in journalists’ training and development so that the technology serves the public interest.
“Journalists’ trade unions should also ensure that chatbots are designed and operated in an ethical and responsible manner, with transparency and accuracy as the key guiding principles. Trade unions can also advocate for the need to invest in journalists’ training and development to help them adapt to the new technologies and ensure that they are not left behind,” IFJ said.
IFJ stresses that collaboration between the media industry, technology companies and journalists’ trade unions can produce a framework for the ethical deployment and regulation of chatbots, while also securing jobs.
“By working together we can ensure that chatbots serve the public interest and enhance, rather than undermine, the core values of journalism,” IFJ said.
Although AI technologies have the ability to enhance the work of journalists, they also carry the risk of spreading fake news and causing media professionals to lose their jobs.
In Pakistan and across South Asia, AI threatens to unleash a new wave of misinformation and disinformation, as improving generative engines are poised to challenge and mislead even the most media-literate users.
Recently, this was observed on a global scale when images of Pope Francis wearing a large puffy jacket and images of former US President Donald Trump being violently detained by the New York Police Department were widely shared on social media, fooling and misleading users. Similar incidents have been observed in India, where the use of AI-generated content to fuel existing tensions sets a dangerous precedent in countries struggling with misinformation-related violence.
Another significant risk is large-scale job losses if the transition within the media industry is handled poorly. In the Republic of Korea and China, the introduction of AI-generated news anchors on television has already cost human news readers their jobs. A report released in February by analytics and research firm Graphika found that AI-generated news reports are being deployed on social media.
The role of AI in the future is unclear, but a greater emphasis on fact-checking and tackling misinformation is needed. Regardless of AI’s role, if news outlets want to utilise AI technologies, the conditions and employment of journalists and media workers need to be secured, IFJ added in its statement. - Malaysia World News/NOW Magazine