Social Media Platforms Need to Pick a Side

Social media sites can no longer fence-sit in the fight between established, institutional power and calls for social justice

In recent weeks, the power of social media platforms has come into sharper focus than ever before.

Platforms including Instagram, Snapchat, Facebook, and Twitter have been used by Black Lives Matter activists from around the world to shed light on the causes and effects of systemic racism. Meanwhile, those same platforms have been used by politicians and public figures, notably President Trump, to issue threats of retaliation and violence.

Social media platforms are more impactful than ever when it comes to spreading messages of both positive change and hate speech. But, sooner or later, they will all need to pick a side – do they toe the line with established powerbrokers, or do they opt to support less powerful individuals in the fight for equality and justice? The days of having it both ways may be numbered, as the decision-makers behind each platform scramble to keep on the right side of their users, their own staff, political leaders, and the wider public mood.

Facebook

“I know many people are upset that we’ve left the President’s posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies” – Facebook’s Mark Zuckerberg.

Clearly, President Trump’s post, which read “Liberal Governors and Mayors must get MUCH tougher or the Federal Government will step in and do what has to be done, and that includes the unlimited power of our Military and many arrests,” was not clear enough for Zuckerberg to remove.

Nor was Trump’s post which read, “Looting leads to shooting, and that’s why a man was shot and killed in Minneapolis on Wednesday night.”

Facebook’s own employees, however, had a different take. On Workplace, Facebook’s internal chat tool, employees voiced their dismay with Zuckerberg’s response.

“I have to say I am finding the contortions we have to go through incredibly hard to stomach,” wrote one employee. “All this points to a very high risk of a violent escalation and civil unrest in November and if we fail the test case here, history will not judge us kindly.”

“It’s been said previously that inciting violence would cause a post to be removed. I too would like to know why the goals shifted, and where they are now,” wrote another.

Employees have even held a virtual walkout to protest Zuckerberg’s decision. Despite making a $10m donation from Facebook to racial equality causes, Zuckerberg appears to be sticking to his guns in the face of mounting pressure to abandon them.

Update: 06/08/2020

Mark Zuckerberg has subsequently announced that Facebook would be reviewing its policies in relation to posts that concern:

“threats of state use of force,” “voter suppression to make sure we’re taking into account the realities of voting in the midst of a pandemic,” and “violating or partially violating content aside from the binary leave-it-up or take-it-down decisions.”

Zuckerberg also went on to say that the company would establish “a clearer and more transparent decision-making process,” as well as reviewing whether it needs to make structural and organizational changes, and creating a workstream to build products that advance racial justice.

However, Zuckerberg’s updates haven’t played well with everyone.

A group of Facebook moderators have penned an open letter to their colleagues in support of the virtual walkouts at the company. “We would walk out with you — if Facebook would allow it,” they wrote. The moderators all remained anonymous in order to protect their jobs, as they are subcontracted through a third party.

Plus, more than 140 scientists funded by Zuckerberg sent the Facebook CEO a letter on Saturday calling on him to stop Trump from spreading “both misinformation and incendiary statements” on the platform.

The scientists have said that their mission “is antithetical to some of the stances that Facebook has been taking, so we’re encouraging them to be more on the side of truth and on the right side of history.”

Twitter

In contrast, Twitter has taken a more proactive approach. It placed a notice in front of one of Trump’s tweets, saying that while the tweet “violated the Twitter rules about glorifying violence,” the company decided that “it may be in the public’s interest for the tweet to remain accessible.”

The same tweet was then posted on the White House’s official Twitter account, and Twitter applied the same notice to it once again.

Trump, meanwhile, decided that it was an affront to his power and signed an executive order aiming to limit the company’s legal immunity for users’ posts.

Presently, Section 230 of the Communications Decency Act of 1996 protects social media companies from lawsuits and other liability that could result from users’ posts and actions on their services. Trump’s executive order aims to limit the scope of Section 230 by claiming that if a platform restricts content — with exceptions for violent, obscene, or harassing posts — then “it is engaged in editorial conduct” and “should properly lose the limited liability shield.”

Trump claimed that social media companies had “unchecked power” and that he was “fed up with it,” and that “it’s unfair, and it’s been very unfair.”

Twitter’s chief exec Jack Dorsey responded, saying, “We’ll continue to point out incorrect or disputed information about elections globally.”


There was also an official response from Twitter which claimed the executive order was “a reactionary and politicized approach to a landmark law,” and that “attempts to unilaterally erode it threaten the future of online speech and internet freedoms.”

Since then, the executive order has been challenged in a lawsuit filed by the Center for Democracy and Technology, which claims it violates the First Amendment. Joe Biden, the Democratic candidate in November’s presidential election, also wants to revoke Section 230, but isn’t a fan of Trump’s executive order.

Reddit

While much of the internet’s discussion has focused around Twitter and Facebook, Reddit has been forced to reckon with its own past.

The site’s CEO Steve Huffman posted an open letter to employees saying “My heart is extremely heavy right now,” before going on to say:

“We work for this platform because we care deeply about community and belonging. But community and belonging are not possible without safety from violence, and now is the time to stand in solidarity with the Black members of our communities (locally, at Reddit Inc., on Reddit, and beyond). As Snoos, we do not tolerate hate, racism, and violence, and while we have work to do to fight these on our platform, our values are clear.”

Reddit also decided to temporarily change its logo from orange to black in memory of George Floyd.

However, Ellen Pao, who served as Reddit’s interim CEO back in 2014, pushed back on Twitter, pointing to the company’s own record on racism.

Back in 2018, during an AMA on Reddit, Huffman was asked by a user, “Is obvious open racism, including slurs, against reddits [sic] rules or not?”

Huffman replied saying:

“While the words and expressions you refer to aren’t explicitly forbidden, the behaviors they often lead to are. To be perfectly clear, while racism itself isn’t against the rules, it’s not welcome here. I try to stay neutral on most political topics, but this isn’t one of them.”

Reddit, clearly, has a long way to go — especially if its CEO thinks that one can sit on the fence as to whether racial slurs are right or wrong.

LinkedIn

Earlier this week, LinkedIn held a virtual town hall meeting to address the social unrest following George Floyd’s murder at the hands of Minneapolis police. According to the Daily Beast, the meeting was billed as a chance for employees to discuss racism by “reflecting on our own biases, practicing allyship, and intentionally driving equitable actions.”

“We’ll spend most of our time together in open discussion, so please consider bringing questions or experiences you’d like to share,” read the invitation email to staff.

During the video meeting, a sidebar section let employees leave comments, and several (naturally anonymous) employees took the chance to rail against the Black Lives Matter protests and the meaning behind them:

“As a non-minority, all this talk makes me feel like I am supposed to feel guilty of my skin color. I feel like I should let someone less qualified fill my position. Is that ok? It appears that I am a prisoner of my birth,” wrote one user.

“I believe giving any racial group privilege over others in a zero sum game would not get any support by others. Any thoughts on hurting others while giving privileges with the rosy name called diversity?” wrote another anonymous user.

“Blacks kill blacks at 50 times the rate that whites kill blacks. Usually it is the result of gang violence in the inner city. Where is the outcry?” said another user.

According to the company’s 2019 diversity report, just 3.5% of the company’s staff identified as Black, while 5.9% identified as Latino. Meanwhile, 40.3% and 47.5% of the company’s staff identified as Asian and white, respectively.

In response to the disastrous town hall meeting, LinkedIn’s CEO Ryan Roslansky had this to say:

“We offered the ability to ask questions anonymously with the intention of creating a safe space for all. Unfortunately, that made it possible to add offensive comments without accountability. We require members on our platform to have real identities and we will not allow anonymous questions in all hands meetings in the future.”

Snapchat

Snapchat has decided to stop promoting President Trump’s account in its Discover tab following his incendiary tweets.

Users will still be able to access content from Trump’s account, but it won’t organically pop up in their Discover tab.

“We will not amplify voices who incite racial violence and injustice by giving them free promotion on Discover,” a Snapchat spokesperson said in a statement.

Trump’s campaign manager claimed that “Snapchat is trying to rig the 2020 election, illegally using their corporate funding to promote Joe Biden and suppress President Trump. Radical Snapchat CEO Evan Spiegel would rather promote extreme left riot videos and encourage their users to destroy America than share the positive words of unity, justice, and law and order from our President.”

Nextdoor

Hyper-local social media platform Nextdoor has been accused of actively censoring posts about Black Lives Matter.

Users have reported having posts supportive of the movement deleted for breaching various community guidelines. According to Nextdoor’s guidelines, posts that concern national or state politics are not allowed, while posts concerning local politics are. Fundraising messages are also prohibited on the platform. Interestingly, Nextdoor’s rules also prohibit using the platform as a “soapbox.”

How Nextdoor draws the line between national politics and the same social problems being discussed and protested at a local level is, at the moment, unclear. Nextdoor did, however, issue a statement on Twitter last week insisting that people should feel safe and welcome in their neighborhoods.

But while the company maintains that people should feel safe in their neighborhoods, its own app has had to contend with a long history of charges of racial profiling and race-based scaremongering. A Buzzfeed News report claimed that some Nextdoor users had been trying to turn the app into a proactive neighborhood watch program, posting messages about “African Americans youths,” and making sure they know “they are being watched.” Other unsavory examples of such behavior on the site include posts such as:

“i don’t really know what made him suspicious.but he looked like a Cholo.maybe it was how he was wearing his pants.”

“Concerned me to see Dreads biking by my house this morning with 3 other less desirables. Sure don’t need him with a posse of like minded individuals roaming our neighborhood.”

Nextdoor has, apparently, implemented an algorithm that can root out racial profiling. In a statement to Buzzfeed News, a Nextdoor spokesperson said:

“We encourage all members who believe their posts or comments align with our Community Guidelines but were removed to report the matter to us so we may investigate and restore, as appropriate.”

Clearly, in Nextdoor’s mind, the issue is with the users, not its own ruleset or its implicit encouragement of curtain-twitching.

Which, of course, returns us to the overarching theme of what responsibility the various social media platforms will accept in the age of the Black Lives Matter movement. Sharing a black box for “Blackout Tuesday” will hardly suffice for platforms used by millions, when there are clearly bigger policy decisions to be made about the activity on the platforms themselves. For some of the most powerful brands in tech, there is some urgent self-reflection to be done, and the time for fence-sitting is drawing to a close.
