Prominent white supremacists are still on YouTube in wake of ban

New York (CNN Business) Six days after YouTube said it would ban supremacist content and remove videos that deny well-documented atrocities like the Holocaust, accounts belonging to some of the most prominent purveyors of hate in the US, such as white supremacist Richard Spencer and former KKK leader David Duke, are still on the platform.

YouTube has taken some action against Duke's account, which he uses to, among other things, rail against what he calls the "Zio" media — "Zio" is a code word he uses for "Jewish" — and post bizarre fitness videos with advice on how to avoid shrunken testicles. Features like comments and sharing have been removed from the channel, and YouTube has added a warning that his videos contain "inappropriate" or "offensive" content. But a YouTube spokesperson told CNN Business that those actions predated the company's announcement last week.
The majority of videos on the account for the National Policy Institute, a white supremacist group that Spencer runs, do not contain any content warnings and most of the videos can still be shared and commented on. Spencer, who helped found the alt-right movement, was one of the leaders of the Unite the Right rally in Charlottesville, Virginia, in August 2017. Violence at that rally led to dozens of injuries and the death of counterprotester Heather Heyer.
One video that has a content warning and other restrictions shows Spencer interviewing Maram Susli, a YouTube creator known as "Syrian Girl," who has contributed to conspiracy site InfoWars.
In its blog post on Wednesday, YouTube said it was prohibiting "videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." YouTube also said it would remove hundreds of thousands of videos that it had not previously considered to be in violation of its policies.
A YouTube spokesperson declined to comment on specific accounts, but said that enforcement of the updated policy will take time and that the company will expand its coverage of the new rules over the next several months. The spokesperson also said accounts are removed after they have repeatedly violated YouTube's "Community Guidelines" or if the channel is dedicated to violating YouTube's policies.
How effectively YouTube will enforce its new policy is an open question. CNN Business found on Thursday that one Nazi channel YouTube had twice before deleted was back up, making no attempt to hide itself or its connection to the two previously banned accounts.
The channel was first taken down in April 2018 in the wake of a CNN investigation that found ads from over 300 companies and organizations had run on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda. The channel was run by Brian Ruhe, who emphasized to CNN in 2018 that he did not want to be referred to as a "neo-Nazi" because he thinks of himself as a "real, genuine and sincere Nazi." The account deleted on Wednesday had over 3,300 subscribers when it was taken down. Earlier this year, Ruhe had posted to the channel a video of himself and friends celebrating Adolf Hitler's birthday, complete with a cake featuring a swastika made out of icing and "Heil Hitler!" salutes.
Even though that account was deleted, a new Brian Ruhe account was already up on the site and posting videos on Wednesday, only hours after YouTube's policy announcement.
After CNN Business asked YouTube about the new account, the company took it down. Ruhe confirmed to CNN Business that both accounts belonged to him. He said YouTube told him the accounts were taken down for "severe or frequent violations" of YouTube's policy prohibiting hate speech. But Ruhe claimed: "I deny that I have hate or that I use hate speech."
YouTube's policies and its enforcement of them can be vague and inconsistent. The company says its rules are based on content, not the person behind the content.
But when CNN Business sought answers about YouTube's actions and reasoning regarding several channels apparently owned by Ruhe, it received vague responses, and YouTube took new actions that contradicted its previous positions.
In addition to the new account Ruhe started Wednesday, CNN Business found two other accounts belonging to him. One focused on his brand of Buddhism; the other, which was dedicated to him livestreaming, contained only two lengthy videos, one of which included mentions of Adolf Hitler and Nazi ideology.
After CNN Business asked YouTube about the accounts, it removed the livestreaming account and the account that Ruhe had started after the new policy was announced last week, though not the account about Buddhism.
A cursory review of the account Ruhe started last week, though, did not reveal any content in obvious violation of YouTube's policies. When CNN Business asked YouTube why that account was removed, since neither it nor the Buddhism account immediately appeared to be in violation, YouTube responded by removing the Buddhism account. A YouTube spokesperson declined to provide any further explanation about these decisions.
Asked why YouTube hadn't caught an account that had been banned twice before and was making no effort to hide what it was, the YouTube spokesperson said the platform relies on a combination of machine learning and user flags to address banned users making new accounts. The spokesperson also said YouTube removes reuploads of videos when flagged by its systems or users. The spokesperson declined to provide more details about how YouTube will address this issue in the future or how Ruhe was able to create multiple accounts.
The company has long faced criticism for letting misinformation, conspiracy theories and extremist views spread on its platform.
The company takes action on videos that violate its policies in several ways. It says it has four "pillars" for protecting users from harmful content: deleting videos, restricting features on "borderline content," promoting authoritative voices, and rewarding trusted creators with the ability to make money from their channels while demonetizing those who violate its hate speech policies.
YouTube deletes videos for violating its guidelines, including uploading pornography, copyrighted material or content whose primary purpose is inciting hatred.
For videos that are what the company calls "borderline content," it can opt to restrict certain features, such as removing the sidebar that appears to the right of most videos that recommends other content, or restricting the questionable videos from appearing on the "recommended" tab on the YouTube homepage. It can also add a content warning or disable comments.
While YouTube's community guidelines forbid "racial, ethnic, religious, or other slurs where the primary purpose is to promote hatred," it has resisted removing Duke's page, which includes, among other things, a video in which he rails against the "Zionist Matrix of Power" that he falsely claims "controls Media, Politics and Banking." Instead, YouTube has chosen to strip away several features from his videos, such as disabling comments, removing the sidebar next to the video that recommends other videos and adding a content warning filter.
YouTube can also take action on a channel by cutting off its ability to make money, such as through ads running on its videos. It's unclear whether Duke and Spencer have the ability to monetize their channels, but neither channel appears to have ads running on it.
YouTube's new policies come after Facebook (FB) said in March it was banning white supremacist content from its platforms. Facebook's ban came after the suspect in the terror attack at two New Zealand mosques livestreamed part of the massacre on its platform.