Smith Brain Trust / August 25, 2021

Social Media’s Taliban Conundrum

The U.S. military’s abrupt Afghanistan withdrawal has left social media companies facing a complex new set of policy decisions.

Maryland Smith’s Jui Ramaprasad said the subsequent Taliban takeover is especially problematic for the likes of Facebook and YouTube, which categorize the Taliban as a “dangerous organization” because the group is sanctioned as a terrorist organization under U.S. law. Facebook, for example, applies that policy when it shuts down a Taliban-affiliated account.

But even as the Taliban positions itself as a legitimate governing body, it has been social media savvy for years, with its members and supporters operating largely within the rules of platforms like Twitter, Instagram and Facebook. These factors suggest a potential easing of restrictions on the group.

At the same time, Facebook stands out for continuing to silence former President Donald Trump – a conflict it would need to reconcile, as Ramaprasad, an associate professor of information systems, recently told Forbes. “It is still unclear what role platforms like Twitter and Facebook want to take in managing the information that they facilitate the spread of,” she said. “Though Facebook has instituted the Oversight Board – ‘The Facebook Supreme Court’ – and Twitter has become more active in suspending and even banning accounts, e.g., Donald Trump, a clear policy on what constitutes the criteria for suspension or account locking is lacking.”

Ramaprasad added: “If Donald Trump was banned for potential for further inciting violence, one would think that the Taliban would be as well. But this is the result of shying away from clear, uniform and consistent policies and implementation of these policies.”

More broadly, social media companies may now have to be more explicit about what is allowed on their platforms and what is subject to removal, Ramaprasad said. “Indeed, concisely outlining what constitutes a violation of platform policy – e.g., the spreading of ‘conspiracy theories, disinformation, and hate speech’ – is not always straightforward in implementation. However, it is not sustainable to continue to evaluate these on a case-by-case basis.”

Media Contact

Greg Muraski
Media Relations Manager
301-405-5283  
301-892-0973 Mobile
gmuraski@umd.edu 
