Twitch’s New Policy Targets "Misinformation Superspreaders"

Just days after Discord announced updates to tackle misinformation, Twitch has also revised its community guidelines to address content creators who spread harmful lies both on and off the platform.

Twitch's community guidelines now have a section devoted to "harmful misinformation actors," which outlines how the platform prohibits "harmful misinformation superspreaders who persistently share misinformation on or off of Twitch."

Three criteria must all be met before Twitch will take action against these misinformation superspreaders: the person must share misinformation persistently, the misinformation must be widely disproven, and it must be harmful. Twitch provided several examples, such as COVID-19 vaccine conspiracy theories or lies related to the November 2020 US presidential election and the insurrection that followed.

For civic misinformation claims, Twitch says it will rely on "independent misinformation experts" such as the Global Disinformation Index, "as well as information from election boards and congressional certification."

The streaming platform said that one-off statements or discussions aren't grounds for disciplinary action; everyone is allowed to be wrong or tricked by falsehoods once. But persistent misinformation sharers will trigger Twitch's penalty system, which can include strikes on accounts, suspensions, and the removal of content.

You can report misinformation directly to Twitch at [email protected]. Remember to include links, screenshots, or other evidence documenting the misinformation, as well as the name of the account in question.

Discord announced similar changes to its platform on March 28. Those new rules seemed aimed mostly at vaccine misinformation in the wake of Joe Rogan's continued spread of falsehoods about the COVID-19 vaccines on Spotify. Discord has likewise promised to ban accounts and channels used to spread misinformation.

Although these moves seem targeted mainly at ongoing issues surrounding elections and vaccines, they could also apply to the conflict between Russia and Ukraine. Russia has long been known to spread misinformation on social media channels on a variety of topics, and its illegal invasion of Ukraine is just the latest example. Platforms have already moved to block Russian misinformation about the war, which has claimed the lives of hundreds of civilians.
