COVID upended lives and accelerated the growth of digital economies, tech migrations and online opportunities. Unfortunately, it also created a few extra monsters on social media platforms.
While greater responsibility fell on journalists to highlight disparities along social, gender and demographic lines as healthcare, vaccination bookings and offices went online, overworked media personnel also found themselves battling additional abuse in cyberspace.
The pandemic has resulted in an online space fraught with misinformation, disinformation and inflammatory messages from trolls, often automated bots, that know no boundaries.
For journalists to combat this backlash, it is important to create a village, a community of like-minded people who will have your back, according to Sree Sreenivasan, journalist and CEO of digital consulting company Digimentors. Sreenivasan also runs a closed Facebook group of more than 8,000 members, mostly journalists from around the world.
“We all have two footprints: the physical and the digital,” said Sreenivasan at a webinar titled Bridging Divides: Digital Literacy and Tech Access, where speakers discussed the digital divide and the challenges and opportunities for journalists.
“The physical footprint is how we are known in our work and in our circle, professionally and personally. Our digital footprint is how we are seen, perceived and understood by other folks apart from that tight physical circle, and when these are not in balance, we’re losing opportunities and resources, especially for journalists.”
Sreenivasan, with other panellists, was speaking at the final episode of a four-part webinar series that focused on Media’s Role in a Sustainable Recovery in Asia and the Pacific hosted by the Asian Development Bank and the Thomson Reuters Foundation (TRF).
Journalists are deliberately targeted by online offenders seeking to derail them because of the issues and concerns they highlight in their stories. Very often, those who write vitriolic messages have not even read the article in question. With cheap connectivity and pay-per-tweet campaigns, the pandemic brought a surge of abusive messages and death threats from pseudonymous Twitter handles, their users hiding behind digital screens, aimed at journalists, especially women.
In her book, I Am A Troll, written before the pandemic, journalist Swati Chaturvedi’s investigation showed people were employed by a political party in India to undermine and embarrass journalists on social media. It was an eye-opener for some.
“Block and mute”
Almost two-thirds of women journalists surveyed by the International Federation of Journalists (IFJ) in 2018 said they had been subjected to online abuse.
“Another worrying result is that the majority of abused respondents said these attacks had had psychological effects such as anxiety or stress (63%), while 38% admitted to self-censorship and 8% lost their job,” according to the IFJ survey.
It helps to remember that in this AI-driven world of automation and algorithms, some of the malicious comments may be triggered by bots: computer programmes, not people.
“I think the best advice that I got was don’t engage with toxic people on social media. It is just not worth it,” said Rina Chandran, Digital Rights Correspondent at TRF. “Use the block and mute functions … It is simply not worth your effort, your time and your mental health to engage with people whose minds will not be changed and who are there to just get a rise out of you.”
If it’s not computerised algorithms that are attacking women online, it’s the “awful men who are out there trolling women,” said Sreenivasan.
“Us men have no idea of the kind of trouble that women face online, especially women journalists, and we’re seeing this around the world. It’s very, very sad and upsetting,” said Sreenivasan. “I think men have an important role to play in speaking out.”
Misogyny, often racist misogyny, is also at the heart of the more recent attacks on journalists, according to reporting on the online harassment of female journalists.
All the more reason to create an online peer-support community, be it through Twitter communities, LinkedIn Groups, Facebook Pages, WhatsApp or any other platform.
If that doesn’t work, learn to ignore, mute and block.
Better still, hire a ‘kind’ bot. Read on.
There’s (tech) help on the way
Imagine this: what if a malicious tweet targeted at you unexpectedly received a kind, complimentary reply? That would throw the troll off.
That’s what a company called Areto Labs is working on. After deploying its software to protect women candidates during elections in Canada, New Zealand and the U.S., as part of a drive for gender parity in government, Areto Labs found that its auto-messaging made candidates feel safer, validated and heard. So the company decided to apply the same approach to journalism and media, as well as women’s sports.
“With our software, organisations can track abuse directed at their journalists after getting alerts showing a spike in hate speech,” said Jacqueline Comer, founder and Chief Product Officer of Areto Labs. “Our system responds by sending automated mental health check-ins and transmitting automated positive messaging for the journalist under attack. This also activates a journalist’s network and bystanders to support the journalist.”
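Areto Labs' actual system is proprietary, but as a rough illustration of what "alerting on a spike in hate speech" can mean in practice, here is a minimal sketch. It assumes each incoming message has already been given a toxicity score by some classifier (for example, a model in the spirit of Jigsaw's Perspective API); the function name, window size, threshold and ratio are all illustrative choices, not details from the article.

```python
from collections import deque

def spike_alert(scores, window=50, threshold=0.6, ratio=3.0):
    """Flag positions where the share of toxic messages in the most
    recent `window` messages exceeds `ratio` times the long-run baseline.

    `scores` is a sequence of toxicity scores in [0, 1]; a message is
    counted as toxic when its score is at least `threshold`.
    """
    recent = deque(maxlen=window)  # rolling 0/1 flags for the last `window` messages
    toxic_total = 0                # toxic messages seen so far (for the baseline rate)
    alerts = []
    for i, score in enumerate(scores):
        is_toxic = 1 if score >= threshold else 0
        recent.append(is_toxic)
        toxic_total += is_toxic
        baseline_rate = toxic_total / (i + 1)
        recent_rate = sum(recent) / len(recent)
        # Only alert once the window is full and there is a non-zero baseline
        # to compare against, so early messages don't trigger false alarms.
        if len(recent) == window and baseline_rate > 0 and recent_rate >= ratio * baseline_rate:
            alerts.append(i)
    return alerts
```

In a real pipeline, each alert position would then trigger the kind of follow-up Comer describes: an automated mental-health check-in and positive messaging to the journalist under attack.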
Similarly, journalists will soon have access to Harassment Manager, a tool developed through a partnership between Jigsaw and the Thomson Reuters Foundation that will monitor and limit a reporter’s exposure to harmful content online and protect them from attacks.
Online safety for journalists is at the forefront of TRF’s initiatives on media freedom, and the Foundation has also partnered with UNESCO, International Women’s Media Foundation and International News Safety Institute to develop a range of practical and legal tools for journalists, media managers and newsrooms to strengthen responses to online and offline abuse.
“From a business perspective, online hate affects productivity and employee retention,” said Comer. “As humans, we can’t be everywhere all of the time, but we can use software to help do some of that labour-intensive ‘troll busting’ work.”
Online violence has the same impact as violence experienced offline, writes Elisa Lees Muñoz, the Executive Director of the International Women’s Media Foundation.