Now that he’s back on Twitter, neo-Nazi Andrew Anglin wants someone to explain the rules to him.
The account of Anglin, the founder of an infamous neo-Nazi website, was reinstated on Thursday. Anglin was one of many previously banned users who benefited from an amnesty granted by Twitter’s new owner, Elon Musk. The next day, Musk suspended Ye, the rapper formerly known as Kanye West, after he posted an image of a swastika fused with the Star of David.
“That was good,” Anglin tweeted on Friday. “Whatever the rules are, people must follow them. We just have to know what the rules are.”
That’s something to ask Musk. Since the world’s richest man paid $44 billion for Twitter, the platform has struggled to define its standards on misinformation and hate speech, issued conflicting statements and failed to fully address what researchers say is a worrying increase in hate speech.
As Twitter’s new boss is learning, running a social network with nearly 240 million daily active users requires more than just good algorithms. It often demands imperfect solutions to sticky situations: tough decisions that ultimately must be made by a human being and that will surely upset someone.
Musk, a self-described free-speech absolutist, has stated that he wants to make Twitter a global, digital public square, but has also said that he will not make major decisions about content or about restoring banned accounts before he creates a “content moderation council” that integrates diverse points of view.
He soon changed course after polling Twitter users, offering to restore the accounts of a long list of previously banned users, including former President Donald Trump, Ye, the satire site The Babylon Bee, comedian Kathy Griffin and Anglin.
While Musk’s own tweets indicate that he will allow all legal content on the platform, Ye’s suspension showed that this is not entirely the case. The swastika image posted by the rapper falls into the “lawful but awful” category that often plagues content moderators, according to Eric Goldman, a tech law expert and professor at Santa Clara University School of Law.
While Europe has imposed rules that force social media platforms to create policies against misinformation and hate speech, Goldman stressed that, at least in the United States, the lax rules allow Musk to run Twitter as he sees fit, despite his inconsistent approach.
“What Musk is doing with Twitter is fully permitted under United States law,” Goldman stressed.
Pressure from the EU may force Musk to expose his policies to ensure he complies with new European law, which will take effect next year. Last month, a senior EU official warned Musk that Twitter would need to step up its efforts to combat hate speech and misinformation: Failure to comply could lead to hefty fines.
In another confusing move, Twitter announced in late November that it would end its policy prohibiting misinformation about COVID-19. Days later, however, it posted an update saying: “None of our policies have changed.”
On Friday, Musk revealed what he said was the behind-the-scenes story of Twitter’s 2020 decision to limit the spread of a questionable New York Post article based on information allegedly obtained from the laptop of Hunter Biden, a lawyer and American lobbyist who is the second son of US President Joe Biden. Facebook also took steps to limit the spread of the article.
Twitter initially blocked links to the article on its platform, citing concerns that it contained hacked material, but later reversed the decision after then-Twitter CEO Jack Dorsey criticized the decision.
The information revealed by Musk included Twitter’s decision to remove a handful of tweets after receiving a request to that effect from the Joe Biden campaign. The tweets included nude photos of Hunter Biden that had been shared without his consent, in violation of Twitter’s rules against revenge pornography.
More than revealing nefarious conduct or collusion with Democrats, Musk’s disclosure highlighted the kind of difficult decisions about content moderation that he will now face.
“Difficult, confusing and thorny decisions” are inevitable, said Yoel Roth, Twitter’s former head of trust and security who resigned just weeks after Musk took ownership of the platform.
While the old Twitter was far from perfect, it strove to be transparent with users and consistent in enforcing its rules, Roth said. That changed with Musk, he added while speaking recently at a Knight Foundation forum.
“When push comes to shove, when you buy a $44 billion thing, you have the final say on how to run that $44 billion thing,” Roth said.
While much of the attention has focused on Twitter’s decisions in the United States, the layoffs of many people who worked in content moderation are affecting other parts of the world as well, according to activists for a campaign called #StopToxicTwitter.
“We’re not talking about people not being resilient enough to listen to hurtful feelings,” said Thenmozhi Soundararajan, executive director of Equality Labs, which works in South Asia to combat caste-based discrimination. “We are talking about preventing dangerous genocidal hate speech that can lead to mass atrocities.”
Soundararajan’s organization is part of Twitter’s Trust and Safety Council, which has not met since Musk took over. She said that “millions of Indians are terrified,” not knowing which accounts will be reinstated, and that Twitter has stopped responding to concerns raised by the group.
“So what happens if there is another call for violence? Do I have to tag Elon Musk and hope he deals with the pogrom?” Soundararajan wondered.
Cases of hate speech and racial epithets spiked on Twitter after Musk bought the company, as some users tried to test the limits of the new owner. Since then, the number of tweets containing hateful terms has continued to rise, according to a report released Friday by the Center for Countering Digital Hate, a group that tracks hate and extremism online.
Musk says Twitter has reduced the spread of tweets containing hate speech by making them harder to find unless a user searches for them, but that didn’t satisfy the Center’s executive director, Imran Ahmed, who called the rise in hate speech a “blatant failure to live up to its own self-proclaimed standards.”
Since Musk’s takeover and the firing of much of Twitter’s staff, researchers who had previously flagged hateful tweets or misinformation on the platform say no one responds to their reports.
Jesse Littlewood, vice president of campaigns for the organization Common Cause, said his group contacted Twitter last week about a tweet from Republican Rep. Marjorie Taylor Greene alleging voter fraud in Arizona. Musk reinstated Greene’s personal account after she was banned from Twitter for spreading misinformation about COVID-19.
This time, Twitter acted quickly, telling Common Cause that the tweet did not violate any rules and would remain on the platform, even though Twitter’s policies require content spreading false or misleading claims about election results to be labeled or removed.
Twitter did not give Littlewood any explanation as to why it was not following its own rules.
“I find that quite confusing,” Littlewood said.
Twitter did not respond to messages seeking comment for this story. Musk has defended the platform’s sometimes erratic moves since he took over, saying mistakes will happen as it evolves. “We will do a lot of dumb things,” he tweeted.
For Musk’s many online fans, the clutter is a feature, not a bug, of the site under its new ownership, and a reflection of the free-speech mecca they hope Twitter will be.
“Love Elon Twitter so far,” tweeted a user calling himself Some Dude. “Chaos is glorious!”