Twitch, the livestreaming giant popular among video gamers, has been thrust into the national spotlight after the suspect in the Buffalo grocery store mass shooting tried to broadcast the attack on the platform.
Twitch removed the livestream less than two minutes after the violence began on Saturday, a spokesperson for the company told CNN. Despite the quick action by Twitch to delete the content, clips and copies of the disturbing video of the shooting, which police say was a racially motivated hate crime, quickly spread across other social media platforms over the weekend.
While Twitch may not be as much of a household name as some other large social platforms, the Amazon-owned service has gained immense popularity in recent years beyond gaming. Twitch has become a destination to watch others play video games, raise goats and other animals, and even just sleep.
But its core feature – livestreaming – poses a huge set of challenges. Live video and live audio have proven difficult for a number of social platforms to effectively moderate, given their real-time and ephemeral nature.
“The vast majority of the content that appears on Twitch is gone the moment it’s created and seen,” the company said in a 2021 transparency report. “That fact requires us to think about safety and community health in different ways than other services that are primarily based on pre-recorded and uploaded content.”
To address this issue, Twitch relies on a mix of machine detection systems, human moderators and user reporting to identify content that violates its guidelines, not unlike other social platforms. But as the Buffalo mass shooting shows, even taking action a couple of minutes after a broadcast begins may not be enough to stop the video's spread online.
Twitch has been in the news for videos featuring violent content before: A gunman who killed two people in a 2019 shooting near a synagogue in Germany livestreamed the attack on Twitch before it was eventually removed by the platform. Twitch said in a statement at the time that it was “shocked and saddened” by the incident in Germany and stressed the platform “has a zero-tolerance policy against hateful conduct.”
A 180-page document that has been attributed to the Buffalo shooting suspect allegedly references how the 2019 attack in Germany was streamed on Twitch and remained online before being removed. (The Twitch video of the Germany attack was 35 minutes long.)
Here is what you should know about Twitch, as it again finds itself under scrutiny following the deadly attack in Buffalo.
What is Twitch?
Launched in 2011 and acquired by Amazon in 2014 for nearly $1 billion, Twitch initially gained immense popularity among the online video game community. The platform says it has an average viewership at any given moment of more than 2.5 million users and has more than 31 million average daily visitors.
The platform has pivoted beyond just gaming content, attracting a wider range of viewers and digital creators who livestream on topics ranging from pop culture to music. Users do not need an account to tune into a live broadcast.
A recent report from livestreaming software company Streamlabs found that Twitch represented more than 90% of the market share for hours streamed compared to competitors YouTube Gaming and Facebook Gaming during the first three months of 2022.
Who uses it?
The same report said Twitch’s most-watched category since late 2020 has been the wide-ranging “Just Chatting” category. The Just Chatting page features everything from clips on the Johnny Depp defamation trial to cooking tutorials.
Nearly 75% of Twitch users are between the ages of 16 and 34, the company has said. Politicians including Rep. Alexandria Ocasio-Cortez, D-NY, and former President Donald Trump have used Twitch in an apparent attempt to draw in younger voters. A Trump campaign account, however, was eventually suspended from Twitch in the wake of the Jan. 6 mob attack at the US Capitol.
What issues has Twitch faced before?
Apart from the 2019 livestreamed attack in Germany, Twitch also had to address videos of a mass shooting that year at two mosques in New Zealand that left 51 people dead. The gunman first livestreamed the attack on Facebook, but it was subsequently shared on Twitch. Twitch took legal action in 2019 against users who posted the mass shooting video and other content on the platform, Bloomberg reported at the time.
Like other big platforms, Twitch has been forced to grapple with the spread of online harassment. In particular, it attracted headlines last year when a group of streamers campaigned to boycott Twitch for a day amid the rise of “hate raids.” Hate raids, which found outsized popularity on Twitch, typically involve users ambushing a Twitch streamer’s chat with an onslaught of hateful messages – often racist, transphobic or other comments targeting marginalized groups. Twitch has attempted to crack down on this behavior, including by updating the tools it uses to respond to such raids.
How does Twitch’s response compare to other platforms?
Twitch said the video from the Buffalo suspect was removed from the platform less than two minutes after the violence started. While this is markedly quicker than the 17 minutes it allegedly took Facebook to remove the New Zealand shooting video in 2019, critics including New York Gov. Kathy Hochul said platforms should have deleted the video “within seconds.”
In a statement sent to CNN confirming the shooting was streamed on its platform, Twitch said the user “has been indefinitely suspended from our service, and we are taking all appropriate action, including monitoring for any accounts rebroadcasting this content.” The company did not immediately respond to follow-up questions.
Spokespeople for Facebook, Twitter, YouTube and Reddit all told CNN that they had banned the sharing of the video on their sites and are working to identify and remove copies of it.