James Holmes is accused of killing 12 people during a screening of "The Dark Knight Rises" in Colorado.

Story highlights

Facebook pages backing the suspect in Colorado theater killings have popped up

James Holmes is accused of killing 12 moviegoers early Friday morning

A Facebook fan page for Holmes had more than 800 likes on Wednesday

Professor of pop culture warns against reading too much into a tiny piece of the Internet

CNN  — 

The stock photo, posted on a Facebook fan page for the accused Colorado shooter, shows two young men in a movie theater turning around to tell the people behind them to be quiet.

“If you don’t shut up,” it says, “we’ll James Holmes your a–.”

It’s not new for Facebook pages to pop up in support of accused killers and other distasteful figures, but a few dozen Holmes fan pages – including one with more than 800 followers that appeared the day Holmes is accused of opening fire on a theater in Aurora, Colorado, killing 12 people and wounding dozens – are raising new questions about what constitutes free and appropriate speech in the digital age, especially on Facebook.

The network, which has 900 million monthly active users, has long been regarded as a sort of brightly lit town square in an Internet that has some dark and seedy corners. People must sign up for Facebook under a real identity, so the trolling habits that anonymity seems to encourage tend to be less tolerated on Facebook than on sites such as 4chan or even YouTube.

Still, Facebook can be fairly lenient about what it will let people say on its network. The company has decided not to take down the James Holmes fan pages, although employees are monitoring them closely for types of speech that would violate the social network’s community guidelines, spokesman Fred Wolens said.

The most popular page, “while incredibly distasteful, doesn’t violate our terms,” he said.

For that fan site to be taken down, it would have to post “credible threats” against specific people or post something that was intended to incite violence, Wolens said.

Facebook declined to comment on whether specific messages or comments on that page had been removed for meeting either of those criteria. The company sometimes removes comments, posts or photos without taking down an entire page.

Wolens added that the Holmes fan pages are not representative of how Facebook users are responding. He pointed to several pages where users are rallying around the victims of the Colorado shooting by posting memorials and messages of support and by trying to raise money for a victim who was shot and is now in intensive care.

“We are heartened that the vast majority of activity on Facebook surrounding this tragedy has been focused on helping the community cope and beginning the healing process in the wake of these events,” the company said in an e-mailed statement.

In the past, Facebook has been criticized both for leaving up certain Facebook pages and images and for taking down others.

Last year, the site sparked outrage when it removed a support page for nursing mothers because it featured breastfeeding photos, then reinstated the page two days later. Facebook’s guidelines now explicitly allow such images.

The social network in March 2011 took down a page calling for a Palestinian intifada after the Israeli government complained.

It left up a Holocaust-denial page in 2009, saying that “being offensive or objectionable” does not mean a site can be removed. At the time, Dallas, Texas, attorney Brian Cuban urged the network to set tighter controls, saying: “There is no First Amendment right to free speech in the private realm. This isn’t a freedom-of-speech issue. Facebook is free to set the standard that they wish.”

Facebook’s “Community Standards” document, which is posted online, addresses violent and threatening speech in this way: “Safety is Facebook’s top priority. You may not credibly threaten to harm others, or organize acts of real-world violence. We remove content and may escalate to law enforcement when we perceive a genuine risk of physical harm, or a direct threat to public safety. We also prohibit promoting, planning or celebrating any of your actions if they have, or could, result in financial harm to others, including theft and vandalism.”

According to that online policy, the site also prohibits content that promotes self-harm, along with hate speech, bullying, “graphic content,” nudity and pornography.

Other technology companies have become embroiled in similar debates about what is and isn’t appropriate communication on online platforms.

Apple is regarded as running one of the tightest ships, since it pre-approves every app sold in its App Store. The company has been criticized, however, for rejecting apps that would compete with its own offerings or that it finds distasteful for one reason or another. In 2010, for example, Apple initially rejected an app featuring the work of a Pulitzer Prize-winning political cartoonist, then approved it after a wave of bad press.

Google+, the technology giant’s social network, came under fire for initially requiring users to sign in with their real names. Some groups protested the policy, saying that political dissidents in authoritarian regimes, for example, couldn’t use the service without fear of violence. The company softened the policy in January, allowing nicknames.

Twitter bans users from impersonating others, infringing on trademarks, using the service unlawfully and threatening violence, according to its standards document, called “The Twitter Rules.” It bans using pornographic images “in either your profile picture or user background,” but does not seem to ban users from posting links to such material.

Facebook is letting the Holmes fan page saga play out.

Many users have commented that they find the page “disgusting” and “sick minded.” “You mock the death of these poor people? Seriously?” one wrote.

Others are calling for Facebook users to ignore the page in order to avoid giving it more power. “He’s just a troll with nothing better to do than to fish for negative attention on the internet. I actually feel sorry for him,” one commenter wrote.

The page’s administrator, who has not revealed his or her identity on that fan page, doesn’t seem bothered.

“Whatever you have to say to me, I don’t care. Whenever you report me. This page isn’t affected. (I’ve been reported over like a billion times and nothing has happened),” the page’s administrator writes, adding: “Also, I don’t believe in karma, and I don’t believe in hell. Please keep this in mind when you post. Unless its something smart or funny, Please know; I’m just going to laugh at you and all you’re doing is wasting your time.”

While the page is whipping some people into a fury, Robert Thompson, a professor of pop culture at Syracuse University, cautioned against reading too much into a tiny piece of the Internet that has little value to the larger public discourse.

“Probably the amount of attention that we give to this stuff is totally disproportionate to how most people feel,” he said by phone. “But because anybody with the Internet has got an international distribution system at their fingertips, if you start something that is a pro-mass-murderer fan page, it’s going to get the attention of people.”

It’s understandable that people would be outraged by the Facebook page, he said, especially since it mocks the victims of a mass murder that is not even a week old. But the page might be written in an ironic tone, he said, and it certainly isn’t representative of mainstream views in America – or of anyone’s view other than its creator’s.

“The mistake is made when people say, what does this page mean about America?” he said. “This page doesn’t say a thing about America, besides maybe that there are too many Facebook pages.”