Misinformation watch on the 2020 election | CNN Business

Misinformation Watch

Here's how to spot misinformation online
01:43 - Source: CNN Business

What we cover here

Misinformation Watch is your guide to false and misleading content online — how it spreads, who it impacts, and what the Big Tech platforms are doing (or not) about it.

Watch this space for regular updates leading up to the election, and through any turmoil after.


Analysis: What comes next?

We created Misinformation Watch to provide CNN readers a destination for misinformation-related U.S. election coverage. With the election process having come to a close, this is our last post on Misinformation Watch. It is far from the end of our coverage of misinformation, though. 

In the few short months of this project’s operation, a conspiracy theory took root within misinformation communities that had long thrived online: the false claim that former U.S. President Donald Trump did not lose the election, but that it was stolen from him.

The theory moved through well-trod channels of misinformation – QAnon groups, conspiratorial hashtags, fringe YouTube personalities, the former President’s social media accounts – growing more complex as it spread. Eventually, the theory was so varied and multi-pronged that it was all but impossible to disprove to those who subscribed to it. It spilled out into the physical world and, eventually, its adherents spilled blood.

We watched, in real time, both the birth and the ultimate destructive power of that false reality. It was the perfect storm of misinformation, and it was another warning.

Misinformation is not simply the result of conspiratorial internet posters or grifters sowing fear to make a buck. Though it thrives under tech giants’ uneven content moderation policies, it will not be stamped out solely by more robust self-policing by these platforms.

Misinformation’s impact has as much to do with the ways in which we are served online content as with the content itself. It is a consequence of how social media and internet companies are built and how they profit. A company that can collect an unfathomable amount of information about a person – where they live, who they love, whether they are happy, whether they have a job, what secret questions they ask, whether they might be interested in buying a bulletproof vest – is able to target content and advertisements at them precisely.

When that company’s ultimate goal is simply to keep that person scrolling, to keep them returning to the platform, then the alternative realities we have seen emerge are inevitable.

In the wake of the Capitol insurrection, panicked tech titans took broad action against purveyors of lies and conspiracy, ousting tens of thousands of accounts, including ones belonging to then-President Donald Trump. It was a moment of real change.

Misinformation and conspiracy communities that were pushed out of mainstream homes like Facebook and Twitter found safe haven on Parler, which was, in turn, pushed off its web hosting service.

“Stop the Steal” conspiracy theorists, QAnon believers, and other fringe communities scattered and splintered as they sought new homes online. The impact of this exodus for the internet and the rest of the US is yet to be known. 

While this project has come to an end, we’ll continue to cover both the origins and impact of misinformation. It is a topic of vital importance and we’re committed to covering it. Thanks for reading, and please keep coming back for more of our coverage of the issue — we’ve got a lot more in mind.  

Many believed conspiracy theories about Trump and the election. Now, they're losing faith

03:17 - Source: CNN

The moment of reckoning promised by the QAnon conspiracy theory never came. Now, many believers feel confused, duped, and uncertain of what comes next.

QAnon believers are in disarray after Biden is inaugurated

For years, believers of the QAnon conspiracy theory had been waiting for the moment when a grand plan would be put into action and secret members of a supposed Satanic pedophilia ring at the highest ranks of government and Hollywood would suddenly be exposed, rounded up and possibly even publicly executed. They were nearly always sure it was right around the corner, but “The Storm” never came — and the moment of Joe Biden’s inauguration was the last possible opportunity for President Donald Trump to put the plan in motion.

But as Biden raised his hand and swore an oath to defend the Constitution, becoming the nation’s 46th president — nothing happened.

The anti-climax sent QAnon adherents into a frenzy of confusion and disbelief, almost instantly shattering a collective delusion that had been nurtured and amplified by many on the far right. Now, in addition to being scattered to various smaller websites after Facebook (FB) and Twitter (TWTR) cracked down on QAnon-related content, believers risked having their own topsy-turvy world turned upside down, or perhaps right-side up.

Members of a QAnon-focused Telegram channel, and some users of the image board 4chan, vowed to keep the faith. Others proclaimed they were renouncing their beliefs. Still others devised new theories that purported to push the ultimate showdown further into the future. One of the ideology’s most visible icons, Ron Watkins — who goes by the online moniker CodeMonkeyZ — told supporters to “go back to our lives.”

Read more here

Facebook says it has removed tens of thousands of QAnon accounts since last summer

Facebook said Tuesday that since August it has removed about 18,300 Facebook profiles and 27,300 accounts on Facebook-owned Instagram for violating its policies against QAnon. The company has also removed 10,500 groups and 510 events for the same reason.

In a blog post, updated on the eve of the inauguration, the company said it had taken action on tens of thousands of “militarized social movements” and self-described militias since last summer.

“As of January 12, 2021, we have identified over 890 militarized social movements to date,” the company said.

Facebook has come under scrutiny for the role its platform played in the lead-up to the deadly insurrection at the Capitol earlier this month. Groups and individuals spreading lies about the 2020 election and calling to protest the outcome continued to hide in plain sight on Facebook even after the Capitol riots.

Facebook announced it would crack down on QAnon last summer. The baseless conspiracy theory has been circulating since 2017. People identifying as part of QAnon were part of the mob of Trump supporters who stormed the Capitol.

Last week, Twitter announced it had suspended more than 70,000 accounts for promoting QAnon.

Despite crackdowns, the conspiracy theory continues to spread on Twitter, Facebook and fringe social platforms, according to new research from nonpartisan nonprofit Advance Democracy.

Over the holiday weekend, more than 1,280 accounts related to QAnon posted on Twitter about 67,000 times, peddling conspiracy theories about the election and President-elect Joe Biden, according to the research.

For example, one QAnon account shared a 45-second video rife with false claims about election fraud. The video racked up about 360,000 views. After CNN Business flagged the video, Twitter took down the account for violating its rules against ban evasion.

Facebook shows ads for tactical gear despite announcing a temporary ban

Less than 24 hours before Joe Biden is set to become president, Facebook continues to show ads for tactical gear despite vowing to ban those promotions ahead of the inauguration.

A review by CNN and other internet users this week showed that ads for body armor, holsters and other equipment were being displayed on the platform as late as Tuesday afternoon. 

Often, the advertised products are pictured alongside guns, ammunition, or people clad in camouflage fatigues. 

The ads have frequently appeared in the timelines of military veterans and contribute to a false narrative of an imminent violent conflict in the United States, according to Kristofer Goldsmith, founder and president of High Ground Veterans Advocacy. 

“They’re selling the idea of pending violence, or inevitable violence, and that’s the kind of thing that becomes a self-fulfilling prophecy,” said Goldsmith. 

In one example still on Facebook Tuesday afternoon, a pair of noise-reducing earbuds was being advertised as a form of active hearing protection, shown inserted in the ears of a gunman aiming down his rifle sights. 

Another ad, for body armor, promises consumers that the product can shield them from bullets, knives, stun guns and other threats. 

A third series of ads, for hard-knuckled gloves, showed a man wearing desert camouflage and a tactical rig performing various tests on the gloves, including punching concrete walls, breaking a glass bottle by hand and rubbing broken glass on the gloves’ palms.

“They put people in combat gear in a civilian setting,” Goldsmith said of the ads. “They’re promoting this image of, ‘You need to get ready for combat.’”

Asked for comment, Facebook referred CNN to its earlier blog post announcing that it will ban “ads that promote weapon accessories and protective equipment” in the United States through at least Jan. 22. 

“We already prohibit ads for weapons, ammunition and weapon enhancements like silencers,” Facebook said in the blog post. “But we will now also prohibit ads for accessories such as gun safes, vests and gun holsters in the US.”

After Facebook introduced the ban on Saturday, BuzzFeed News reported the following day that some ads for tactical gear were still active. Some of the ads observed by CNN had been active for months; others had been launched within the past week.

Facebook appears to have removed some of the advertisements CNN found, including a series of ads for armored plates and plate carriers. The plates had, in some cases, been shown being held by heavily muscular individuals dressed in fatigues or being inserted into camouflage-patterned backpacks. Despite having seemingly removed some of the advertisers’ ads, Facebook has allowed other ads for the same products, by the same advertisers, to persist on the platform.

Another now-removed series of body armor ads included marketing copy that claimed specific levels of protection under the rubric established by the National Institute of Justice. 

Veterans are a popular target for misinformation and conspiracy theorists, Goldsmith said, because as a group they enjoy political and social authority. An endorsement by a veteran can reinforce a conspiracy theory’s apparent credibility.

“If you change the mind of a veteran, there’s a good chance you change the minds of those within that veteran’s immediate circle — friends, family, coworkers,” said Goldsmith. 

'Stop the steal' groups hide in plain sight on Facebook

Groups and individuals spreading lies about the 2020 election and calling to protest the outcome have continued to hide in plain sight on Facebook, even as COO Sheryl Sandberg this week tried to downplay the platform’s role in the Capitol riots. 

From altering the names of their online forums to abusing the core features of Facebook’s own services, conspiracy theorists have worked to evade content moderators despite the company’s vows of a crackdown, new research shows. 

These groups’ efforts to remain undetected highlight the sophisticated threat confronting Facebook, despite its insistence that the situation has been less of a problem on its platform than on others. Their persistence on mainstream social networks also raises new concerns that they could spark a new cycle of violence stretching well into Joe Biden’s presidency.

The latest examples surfaced on Thursday, as extremism experts at the activist group Avaaz identified 90 public and private Facebook groups that have continued to circulate baseless myths about the election, with 166,000 total members. 

Of those, a half-dozen groups appeared to have successfully evaded Facebook’s restrictions on “stop the steal” content, according to Avaaz. Though many initially had “stop the steal” in their names, the groups have since altered their profiles, according to page histories reviewed by CNN Business — allowing them to blend in with other Facebook activity. 

“So instead of ‘Stop the Steal,’ they became ‘Stop the Fraud’ or ‘Stop the Rigged Election’ or ‘Own the Vote,’” said Fadi Quran, campaign director at Avaaz.

Read more here.

YouTube hires a doctor to help combat Covid-19 misinformation

YouTube is working with top health organizations to create authoritative medical videos for the platform in an effort to crack down on Covid-19 misinformation.

The new health partnership team will be headed by Dr. Garth Graham, YouTube’s new director and global head of healthcare and public health partnerships. Graham was most recently the chief community health officer at CVS Health.

YouTube will work with organizations including the American Public Health Association, Cleveland Clinic and the Forum at the Harvard School of Public Health to make “high-quality health content” for its users, according to a blog post.

Like other tech platforms, YouTube has had to tackle the spread of misinformation about Covid-19.

In October, the Google-owned platform said it would take down videos that include misinformation about Covid-19 vaccines. It previously took action on other content containing falsehoods about the virus, such as videos disputing Covid-19 exists. At the time, the company said it had removed more than 200,000 videos containing dangerous or misleading information about Covid-19 since February 2020.

Messaging app Zello bans thousands of armed extremist channels after Capitol riots

The messaging app Zello said it has removed more than 2,000 channels on its platform related to armed extremism, and banned all “militia-related channels,” after it found evidence that some of its users participated in the Capitol riots. 

Zello, a voice messaging app that provides a walkie-talkie-like function, condemned the violence in a blog post on Wednesday.

“It is with deep sadness and anger that we have discovered evidence of Zello being misused by some individuals while storming the United States Capitol building last week,” the company said. “Looking ahead, we are concerned that Zello could be misused by groups who have threatened to organize additional potentially violent protests and disrupt the U.S. Presidential Inauguration Festivities on January 20th.”

Zello added that “a large proportion” of the channels it removed on Wednesday had been dormant for months and in some cases years.

The company is further analyzing the groups on its platform to determine whether any may violate its terms of service. But it added that because it does not store message content, the task is not as simple as running searches for keywords or hashtags and blocking them.

Telegram struggling to combat calls for violence amid surge in growth

The messaging app Telegram is battling an increase in violent extremism on its platform amid a surge in new users, the company acknowledged to CNN Wednesday. 

In the last 24 hours, the company has shut down “dozens” of public forums that it said in a statement had posted “calls to violence for thousands of subscribers.”

But the effort has turned into a game of cat and mouse, as many of the forums’ users set up copycats as soon as their old haunts were disabled. Screenshots and Telegram groups monitored by CNN show that a number of channels containing white supremacist, hateful and other extremist content have been shut down, but that at least some have been replaced by new channels. And at least one meta-channel has emerged that maintains lists of deactivated groups and redirects visitors to their replacements. One now-defunct group that CNN reviewed had more than 10,000 members.

“Our moderators are reviewing an increased number of reports related to public posts with calls to violence, which are expressly forbidden by our Terms of Service,” Telegram spokesperson Remi Vaughn told CNN. “In the past 24 hours we have blocked dozens of public channels that posted calls to violence for thousands of subscribers.” 

Vaughn added: “Telegram uses a consistent approach to protests and political debate across the globe, from Iran and Belarus to Thailand and Hong Kong. We welcome peaceful discussion and peaceful protests, but routinely remove publicly available content that contains direct calls to violence.” 

Telegram has surpassed half a billion active users worldwide. The company announced Tuesday that it had grown by 25 million users over the past several days – with about 3 percent of that growth, or 750,000 new signups, occurring in the United States alone, Telegram told CNN.

Apps such as Telegram, Signal and MeWe have experienced explosive growth in recent days after WhatsApp sent a notification to its users reminding them that it shares user data with its parent, Facebook – and following the suspension of President Donald Trump and the alternative social network Parler from many major tech platforms. 

One of the people who has been reporting violent channels to Telegram is Gwen Snyder, a Philadelphia-based activist who said she has been monitoring far-right extremists on the platform since 2019. Earlier this week, as Telegram was witnessing a surge in new users, Snyder enacted a plan to organize mass pressure against Telegram’s content moderators.

“We started two days ago calling for Apple and Google to deplatform Telegram if they refused to enforce their terms of service,” Snyder told CNN. “We had dozens if not hundreds of relatively large-follower Twitter accounts amplifying the campaign.”

It’s difficult to determine whether Telegram’s actions were a direct result of the activism; Snyder said she never heard from Telegram, Apple or Google.

But at least some of the Telegram channels affected by the crackdown appeared to believe that Snyder’s efforts were responsible — and soon began posting her personal information online and targeting her with death threats.

“That’s my home address,” Snyder said in a public tweet, attaching a redacted screenshot of an extremist Telegram channel that had shared her information. Addressing Telegram, she added: “You’re okay with this? ENFORCE YOUR OWN TERMS OF SERVICE.”

Facebook sees online signals indicating more potential violence 

Facebook has seen online signals, on its platform and elsewhere, indicating the potential for more violence following last week’s insurrection, a company spokesperson told CNN Wednesday. 

The company is working with organizations that track terrorists and dangerous groups to monitor conversation on other platforms, like 8Kun (formerly 8chan) and 4chan, in an effort to prevent talk of violence from those platforms becoming popular on Facebook, the spokesperson said.

One example of work Facebook is doing on this, according to the spokesperson, is collecting and indexing promotional fliers being distributed on other sites for more demonstrations this weekend and on Inauguration Day. Indexing promotional material like this can help make it easier for Facebook to identify and remove that material from its platforms or prevent it from being posted in the first place.

The spokesperson said Facebook is monitoring and removing praise of or support for last week’s storming of the US Capitol from its platform. 

Facebook has passed on information to the FBI and is cooperating with the agency’s efforts to identify members of last week’s insurrection, the spokesperson said. 

Google pauses all political ads until after the inauguration

Google will temporarily ban all political advertising on multiple platforms after formally designating the Capitol riots, the impeachment process and the inauguration as “sensitive events” under its policies, the company said Wednesday.

The pause will last from Jan. 14 until at least the day after the inauguration next week, the company said in a letter to marketers, which was obtained by CNN Business.

Google said in the letter that it will restrict advertising “referencing candidates, the election, its outcome, the upcoming presidential inauguration, the ongoing presidential impeachment process, violence at the US Capitol, or future planned protests on these topics.”

“There will not be any carveouts in this policy for news or merchandise advertisers,” Google continued in the letter.

Ads will be banned from Google as well as YouTube, according to the letter.

In a statement to CNN Business, a Google spokesperson said the ban is driven by last week’s Capitol violence.

“Given the events of the past week, we will expand our Sensitive Event policy enforcement to temporarily pause all political ads in addition to any ads referencing impeachment, the inauguration, or protests at the US Capitol,” Google said in the statement. “We regularly pause ads over unpredictable, ‘sensitive’ events when ads can be used to exploit the event or amplify misleading information. Beyond this, we have long-standing policies blocking content that incites violence or promotes hate and we will be extremely vigilant about enforcing on any ads that cross this line.”

Google imposed a similar “sensitive events” ad blackout surrounding Election Day and for several weeks after. The company lifted its election-related moratorium on political advertising on Dec. 10, indicating in a letter to advertisers obtained by CNN Business that “we no longer consider the post-election period to be a sensitive event.”

But the events of the past several weeks, culminating in last week’s riots, suggest that determination may have been premature.

Fact check: Man in viral airport tantrum video was kicked off plane for rejecting mask policy, not because of Capitol insurrection

A video that shows an agitated man in an airport terminal, complaining that he had been kicked off a plane and insulted, has now been viewed more than 20 million times on Twitter.

Why has the 18-second video gone so viral? In part because someone on Twitter – not the person who actually recorded the video – added a caption that suggested that the man had been put on a no-fly list for being part of the insurrection at the US Capitol.

“People who broke into the Capitol Wednesday are now learning they are on No-Fly lists pending the full investigation. They are not happy about this,” the tweeter, who goes by the handle @RayRedacted, said in the caption. 

Facts First: The Twitter caption was inaccurate. The airport incident was not about the Capitol insurrection. Rather, the man in the video had been asked to get off a Charlotte-to-Denver flight for refusing to comply with American Airlines’ mandatory mask policy, airline spokesman Curtis Blessing told CNN.

Read more here.

YouTube is suspending President Donald Trump's channel

YouTube is suspending President Donald Trump’s channel for at least one week, and potentially longer, after his channel earned a strike under the platform’s policies, the company said Tuesday evening.

A recent video on Trump’s channel had incited violence, YouTube told CNN Business. That video has now been removed.

YouTube declined to share details of the video that earned Trump the strike, but said that after the one-week timeout, it will revisit the decision. YouTube also removed content from the White House’s channel for violating policy, the company told CNN Business, but the channel itself has not been suspended or been given a strike – just a warning.

Until now, YouTube had been the only remaining major social media platform not to have suspended Trump in some fashion. Facebook has suspended Trump’s account “indefinitely,” while Twitter has banned Trump completely.

Read more here.

How a 'nobody' in Texas with 200 Twitter followers created a viral post with false impeachment claims

A viral tweet claims that impeaching President Donald Trump for a second time would mean he would lose the ability to run for president in 2024.

That’s not true. Nor are other claims in the tweet.

The tweet was posted on Friday, two days after a Capitol insurrection by a mob of Trump supporters sparked a new impeachment push from House Democrats. As of early Monday, it had more than 181,000 retweets and 725,000 likes.

When we called Ben Costiloe, the person behind the tweet, to tell him that we were planning a fact check and that much of the tweet was inaccurate, he said good-naturedly: “Tear it a new one. Go for it, baby.” He said he is “nobody,” a man in Texas who lives with diabetes, and that he sent the tweet because he had seen the information pop up somewhere on his Facebook feed and “it made me feel good.” 

He said he was never sure the content was correct and was amazed the tweet went so viral. He said he had only 200 Twitter followers at the time he posted it. 

“I don’t want to mess up the world. I just wanted to make me feel good,” he said. “It turns out it made a lot of people feel good.”

Read more here.

Twitter says it has banned 70,000 accounts since Friday that promoted QAnon

Since Friday, Twitter has suspended more than 70,000 accounts from its platform for promoting the baseless QAnon conspiracy theory, the company said in a blog post Monday evening. 

The social media platform has been on an enforcement spree in recent days as it has removed major QAnon adherents including Michael Flynn and Sidney Powell. Many of the banned account-holders operated multiple accounts, Twitter said.

The moves have contributed to major fluctuations in some users’ Twitter accounts, the company acknowledged. 

“In some cases, these actions may have resulted in follower count changes in the thousands,” Twitter said. 

The company’s blog post also goes over other steps the company has taken in recent days to limit the spread of violent rhetoric on its platform, including making it impossible for any tweets “labeled for violations of our civic integrity policy to be replied to, Liked or Retweeted.” 

Parler has been removed from the Google Play store

Parler, the alternative social media platform popular with conservatives, has been banned from the Google Play Store, Google told CNN Business Friday evening. 

Google said its app store has long required that apps displaying user generated content have moderation policies in place to prevent the spread of violent rhetoric. 

“We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the US,” a Google spokesperson said. “We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content. In light of this ongoing and urgent public safety threat, we are suspending the app’s listings from the Play Store until it addresses these issues.” 

The decision marks a major blow to President Donald Trump’s supporters, many of whom have found a home on the Parler platform. But it does not completely deny them access to the app. Because Android allows for third-party app stores, Parler can still be hosted on app stores not operated by Google. 

Google’s decision follows a report by BuzzFeed News that Apple has threatened to remove Parler from the iOS App Store. (Apple declined to comment on the report.)

Parler is among a group of relatively new platforms that have billed themselves as free speech alternatives in hopes of courting conservatives who believe larger platforms are censoring their views. 

Read more here.

YouTube says it has banned Steve Bannon's "War Room" channel

YouTube has banned Steve Bannon’s “War Room” podcast channel after it earned three strikes on the platform in the last 90 days, the company told CNN.

Bannon’s first strike came in November after he called for putting Dr. Anthony Fauci’s head “on a pike.” 

Two other videos were removed on Friday afternoon for violating YouTube’s policies against questioning the 2020 election outcome, YouTube said, resulting in two additional strikes. Under YouTube’s policy, a channel may be permanently banned after three strikes.

“In accordance with our strikes system, we have terminated Steve Bannon’s channel ‘War room’ and one associated channel for repeatedly violating our Community Guidelines,” a YouTube spokesperson said.

“As we said yesterday, any channel posting new videos with misleading content that alleges widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election in violation of our policies will receive a strike, a penalty which temporarily restricts uploading or live-streaming.”

Twitter bans President Trump permanently

Twitter has suspended President Donald Trump from its platform, the company said Friday evening.

“After close review of recent Tweets from the @realDonaldTrump account and the context around them we have permanently suspended the account due to the risk of further incitement of violence,” Twitter said.

“In the context of horrific events this week, we made it clear on Wednesday that additional violations of the Twitter Rules would potentially result in this very course of action.”

Twitter’s decision followed two tweets by Trump Friday afternoon that would end up being his last. The tweets violated the company’s policy against glorification of violence, Twitter said, and “these two Tweets must be read in the context of broader events in the country and the ways in which the President’s statements can be mobilized by different audiences, including to incite violence, as well as in the context of the pattern of behavior from this account in recent weeks.”

The first tweet was about Trump’s supporters.

“The 75,000,000 great American Patriots who voted for me, AMERICA FIRST, and MAKE AMERICA GREAT AGAIN, will have a GIANT VOICE long into the future. They will not be disrespected or treated unfairly in any way, shape or form!!!”

The second indicated Trump did not plan to attend Joe Biden’s inauguration.

Read more here

Twitter bans high-profile QAnon promoters including Michael Flynn and Sidney Powell

Twitter said Friday it has banned Michael Flynn, Sidney Powell and the administrator of 8kun, the online forum that has incubated QAnon claims, as part of a crackdown of accounts that have spread the baseless QAnon conspiracy theory.

Twitter said it launched the crackdown, which affects thousands of accounts, under its policy against coordinated harmful activity, amid concerns that QAnon supporters could seek to incite violence.

“We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm,” Twitter said in a statement, “and given the renewed potential for violence surrounding this type of behavior in the coming days, we will permanently suspend accounts that are solely dedicated to sharing QAnon content.”

NBC News was first to report the sweep.

The move caps off a tumultuous week in which Twitter, in response to Wednesday’s Capitol riots, imposed a temporary lock on President Donald Trump’s accounts. Trump is currently able to tweet again, but Twitter warned that future violations of its rules “will result” in Trump being booted off the platform.

The crackdown on Friday affects some of Trump’s most loyal allies, whom Twitter said it was removing for sharing QAnon claims.

Flynn served as Trump’s national security adviser before he was convicted of lying to the FBI and then pardoned by the President. Powell is a lawyer who helped the Trump campaign spread baseless theories of election fraud.

Twitter and Trump are now locked in a game of chicken

President Donald Trump’s tweeting privileges have now been restored following Twitter’s temporary lock on his account this week for inciting what became a violent insurrection at the US Capitol. In his first tweet since being let out of the penalty box, Trump shared a video conceding that he will be a one-term president. 

But for Twitter, the challenge of what to do about Trump’s account may only be getting more difficult, not less. As Trump reemerges, the company now faces a test of its commitment that any further violations of its policies by the President will result in a permanent ban. Even one more transgression could land Trump in Twitter jail — forever. 

It’s a game of chicken that Trump, whose entire presidency has been devoted to breaking rules and testing boundaries, is sure to play. 

For the last four years, Twitter has been central to Trump’s presidency, a fact that has also benefited the company in the form of countless hours of user engagement. Twitter took a light-touch approach to moderating his account, often arguing that as a public official, Trump must be given wide latitude to speak.

But as Trump nears the end of his term — and as public pressure has grown against the platform — the balance may be shifting. Last spring, the company began applying warning labels to Trump’s tweets in an attempt to correct his misleading claims ahead of the election; it arguably had the opposite effect, prompting Trump to retaliate with an executive order and ever more baseless claims of election fraud. 

With those claims having reached their zenith by spurring a full-blown riot, Wednesday saw the most aggressive moves yet by Twitter and other companies to rein Trump in. For the first time in four years, it seems, Trump will need to appease Twitter more than Twitter needs to appease him.

Read more here.