
Moderators protect us from the worst of the internet. This entails enormous personal costs


Unless you’re a moderator of a local community group discussing waste collections or dog park etiquette, you’re unlikely to fully understand the enormity and scope of the abuse directed at people online.

But when social media moderation and community management are an essential part of your daily work, the toll on people and their loved ones can be enormous. Journalists can face torrents of abuse, often early in their careers, and many are reluctant to report it.

If they come from a culturally or linguistically diverse background, this reluctance to report may be even greater than for other colleagues.

There is growing concern among employers about how moderating confrontational content can impact people’s well-being. Employers also have a duty to keep their staff safe at work, including online.

The ABC wanted to understand what this looked like in practice. Its internal survey data shows how serious the problem has become for moderators tasked with keeping audience members safe when they contribute to online discussions.

What did the ABC find?

In 2022, the ABC asked 111 employees who engaged in online moderation as part of their work to self-report how often they were exposed to potentially harmful experiences.

First, it was important to understand how long people spent online moderating content. Of those who had to moderate content every day, 63% did so for less than an hour and a half, and 88% did so for less than three hours.

The majority of staff surveyed saw potentially harmful content every week.

71% of moderators reported seeing defamation of their work on a weekly basis, while 25% saw it on a daily basis.



Half said they see misogynistic content every week, while more than half said they see racist content every week.

About a third reported seeing homophobic content every week.

In the case of offensive language, 20% said they encountered it every week.

It’s a confronting picture in itself, but many moderators see more than one type of this content at a time, which compounds the harm.

It’s important to note that the survey did not specifically define what was meant by racist, homophobic, or misogynistic content, so that was open to interpretation by the moderators.

A global issue

We’ve known about the mental health issues moderators face in other countries for a few years now.

Some people employed by Facebook to filter out the most toxic material have taken the company to court.

In one case in the United States, Facebook reached a settlement with more than 10,000 content moderators that included US$52 million (A$77.8 million) for mental health treatment.

In Kenya, 184 Facebook-contracted moderators are suing the company for poor working conditions, including a lack of mental health care. They are seeking US$1.6 billion (A$2.3 billion) in compensation.

The case is still ongoing, as are other separate cases against Meta in Kenya.

In Australia, during the height of the COVID pandemic, moderators reported how confronting it could be to deal with misinformation and threats from social media users.

A 2023 report from Australian Community Managers, the peak body for online moderators, found that 50% of people surveyed said maintaining good mental health was a key challenge of their work.

What is being done?

While not without its problems, the ABC is leading the way in protecting its moderators from harm.

The company has long worked to protect its workforce from exposure to trauma with a variety of programs, including a peer support program for journalists. The program was supported by the Dart Center for Journalism and Trauma Asia Pacific.

But as the level of abuse against staff increased in tone and intensity, the national broadcaster appointed a full-time Social Media Wellbeing Advisor. Nicolle White manages the workplace health and safety risks generated by social media. She is believed to be the first in the world in such a role.

As part of the survey, ABC moderators were asked how they could be better supported.

Unsurprisingly, turning off comments was rated as the most useful technique for promoting wellbeing, followed by management support, peer support and preparing responses to anticipated audience reactions.

However, disabling comments often leads to complaints from at least some people that their opinions are being censored. This is despite media publishers being legally liable for third-party comments on their content, following a 2021 High Court ruling.

Educating staff on why people comment on news content is an important part of harm reduction.

Other changes implemented following the survey included encouraging staff not to moderate comments relating to their own experiences or identities unless they felt empowered to do so.

The peer support program also matches employees with others with moderation experience.

Managers were urged to ensure staff completed self-care plans in preparation for high-risk moderation days (such as the Voice referendum). These plans include documenting positive coping mechanisms, setting boundaries at the end of a news shift, debriefing, and reflecting on the value of their work.

Research shows that one of the most protective factors for journalists is being reminded that their work is important.

But the most important advice for anyone working on moderation is to ensure they are given clear guidance on what to do if their wellbeing is affected, and that seeking support in the workplace is normal.

Lessons for others

While this data is specific to the public broadcaster, the ABC’s experiences are almost certainly mirrored across the news industry and in other forums where people are responsible for moderating communities.

It’s not just paid employees. Volunteer moderators at youth radio stations or administrators of Facebook groups are among the many people who face online hostility.

What is clear is that any business or voluntary organization building an audience on social media must consider the health and safety implications for those charged with maintaining these platforms, and ensure they have support strategies in place.

Australia’s eSafety Commissioner has developed a range of publicly available resources to help.


The author would like to acknowledge the work of Nicolle White in writing this article and the research reported therein.


