With new moderator tools, Twitch moves toward a layered safety approach

Published on 07/23/2022

Moderating an online community is difficult, often thankless work, and it's considerably tougher when done in isolation. Twitch communities already exchange information about users they want to keep out informally, through connected channels. With a new tool that lets channels share ban lists and invites communities to work together to weed out persistent harassers and other disruptive users before they can cause problems, the company is now formalising that ad hoc practice.

Alison Huffman, Twitch's VP of Product, told TechCrunch that the company's ultimate goal is to empower community moderators. According to Huffman, Twitch holds "extensive" conversations with moderators to learn what they need to feel more effective and to make their communities safer. In November, Twitch launched a tool to help moderators spot people attempting to evade channel bans. The new features align with Twitch's goal of providing "layered" safety on a platform where creators broadcast live, often to hundreds of thousands of viewers, and moderation decisions must be made in real time at every level.

The changes Twitch is introducing

The biggest on-the-spot call moderators typically face is deciding which users are acting in good faith and which are deliberately stirring up trouble. Through the creator dashboard, creators and channel moderators can now ask other channels to share their lists of banned users. The tool is reciprocal, so any channel that requests another streamer's list shares its own in exchange. A channel can accept all requests to exchange ban lists or limit them to requests from Twitch Partners, Affiliates, and mutually followed channels. Every channel can share ban lists with up to 30 other channels, letting communities assemble a sizable roster of people they'd like to keep out, and a channel can stop sharing its list at any time.

Accounts that a channel discovers through these shared lists can either be monitored or banned automatically; by default, they get the latter. Users flagged as "monitored" can still chat, but their first message appears in a red box showing where else they have been banned, and they are tagged so their behaviour can be watched closely. A channel can then decide whether to ban them outright or give them the all-clear and elevate them to "trusted" status.
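Twitch hasn't published how the feature works under the hood, but as a rough mental model, the flow resembles the sketch below. Every type and function name here is hypothetical, not part of any Twitch API; the 30-channel cap and the monitored/banned/trusted states come from the announcement described above.

// A rough, hypothetical model of reciprocal ban-list sharing. None of
// these types or functions come from a real Twitch API; they exist only
// to illustrate the behaviour described in this article.

type FlagState = "monitored" | "banned" | "trusted";

interface Channel {
  name: string;
  bannedUsers: Set<string>;       // this channel's own ban list
  sharingWith: Set<string>;       // channels it exchanges lists with
  flags: Map<string, FlagState>;  // verdicts on users seen via shared lists
}

const MAX_SHARING_PARTNERS = 30;  // the announced per-channel cap

// Sharing is reciprocal: accepting a request links both channels, so
// each side gains access to the other's ban list.
function acceptShareRequest(a: Channel, b: Channel): boolean {
  if (a.sharingWith.size >= MAX_SHARING_PARTNERS ||
      b.sharingWith.size >= MAX_SHARING_PARTNERS) {
    return false;  // either side has hit the 30-channel cap
  }
  a.sharingWith.add(b.name);
  b.sharingWith.add(a.name);
  return true;
}

// When a user appears in chat, check the partner channels' ban lists.
// A match flags the account with the channel's default action (per the
// article, an automatic ban unless the channel opts to monitor instead).
function onUserAppears(
  channel: Channel,
  partners: Channel[],
  user: string,
  defaultAction: "monitored" | "banned" = "banned",
): FlagState | null {
  const existing = channel.flags.get(user);
  if (existing !== undefined) return existing;   // already resolved
  const bannedElsewhere = partners.some(
    p => channel.sharingWith.has(p.name) && p.bannedUsers.has(user),
  );
  if (!bannedElsewhere) return null;  // not flagged; chats normally
  channel.flags.set(user, defaultAction);
  if (defaultAction === "banned") channel.bannedUsers.add(user);
  return defaultAction;
}

// A moderator later resolves a "monitored" user either way: ban them
// outright or promote them to "trusted".
function resolve(channel: Channel, user: string, verdict: "banned" | "trusted"): void {
  channel.flags.set(user, verdict);
  if (verdict === "banned") channel.bannedUsers.add(user);
}

In practice, a monitored user's first message would also get the red highlight described above; the sketch only tracks the state transitions.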

Twitch's rules against harassment

Twitch's newest moderation tools give channels an intriguing way to enforce their own standards against users who are disruptive but stop short of violating the company's broader rules against overtly bad behaviour. It's not difficult to imagine someone deliberately harassing a channel, particularly one run by a member of an underrepresented group, without directly violating Twitch's policies on hate and harassment. Twitch acknowledges that harassment can take "various forms," but the behaviour that earns a suspension from the platform is "stalking, personal assaults, encouragement of physical violence, hostile raids, and intentional false report brigading." The shared ban tool is a step in the right direction, but there is a grey area of conduct outside that description that is harder to capture. Twitch still encourages channels to report users who violate the platform's rules, not just a channel's local ones.

Content moderation on Twitch is a kind of crucible: harm plays out in real time, in front of a live audience. Other platforms focus on after-the-fact detection, where content is uploaded and then scanned by automated systems or reported, after which it either stays online, comes down, or is labelled with a user- or platform-facing warning.

The company is rethinking its safety strategy and paying attention to community feedback, including the concerns of underrepresented groups such as Black and LGBTQ broadcasters, who have long struggled to carve out a safe space or a visible presence on the platform. Through its #TwitchDoBetter campaign in March, Color of Change urged the company to do more to protect Black creators. Twitch has also drawn criticism from the trans community and the broader LGBTQ community for not doing more to stop hate raids, in which malicious users flood a streamer's channel with coordinated harassment. Late last year, Twitch filed a lawsuit against two users for organising automated hate raids. Ultimately, effective rules that are consistently enforced, together with a better toolbox for moderators, are likely to have a greater day-to-day impact than litigation, but extra lines of defence can't hurt.

Peter Daniels
Peter Daniels is the lead journalist for InsiderApps.com

