Following Derek Chauvin’s conviction for the murder of George Floyd in Minneapolis, four-time NBA champion LeBron James tweeted a picture of the Columbus police officer involved in the death of 16-year-old Ma’Khia Bryant, captioned “YOU’RE NEXT #ACCOUNTABILITY.”

Many quickly accused the basketball superstar of using Twitter to incite violence. The tweet, which has since been deleted, came just days after Rep. Maxine Waters (D-Calif.) urged protesters to “stay on the streets” and even become “more confrontational” should Chauvin be acquitted.

This set off a firestorm on social media, one rife with misunderstandings.

The First Amendment and social media

Many have tried to frame this as a “First Amendment” issue, arguing that James, Waters, and others have the right to voice their opinions. That, however, misreads the Constitution. The First Amendment is really about government action — whether the government is censoring speech.

“The First Amendment only restricts government regulation or punishment of expression, although it does not apply only to speech addressed to the government. The First Amendment generally protects speech unless that speech falls within one of a very narrow set of categories of unprotected speech,” said Chicago attorney Ari Cohn, who specializes in First Amendment issues.

“Private businesses, including social media platforms, are not restrained by the First Amendment,” added Cohn. “On the contrary, it protects them. They have their own First Amendment right to determine what speech they want to allow on their private property, much like you or I have the right to kick someone out of our house for saying something we don’t like.”

It is also important to recognize from the start that social media companies are private actors: they are not bound by the First Amendment, and they can set and enforce whatever policies they deem appropriate.

“A good example of this is of course Twitter’s decision in January 2021 to ban President Donald Trump,” said Bob Jarvis, attorney and law professor at Nova Southeastern University.

“At the same time, however, social media companies (SMCs) are protected by Section 230 of the Communications Decency Act,” added Jarvis. “This 1996 federal law shields websites from liability for the content that their users create, post, and comment on. Section 230 assumes that SMCs merely provide platforms on which users express their thoughts, and that SMCs do not review or endorse such content or comments. So while the First Amendment does not prevent SMCs from acting, Section 230 gives them an incentive not to act.”

The old media rules still apply

Another consideration is that many of the traditional or “old media” rules still apply to social media, even though the content is generated differently.

“Social media companies, like newspapers, are private companies, but they don’t provide their own content,” said James R. Bailey, professor of leadership at the George Washington University School of Business.

“Other people provide that content,” Bailey explained. “Newspapers can decide whether or not to publish something. But this is where social media companies get hung up: because they don’t create their own content, they are in the unfortunate position of having to police other people’s content, then step in and judge whether it is appropriate or not.”

Some users have cried “censorship” when a post or photo was removed, or when more extreme measures were taken, such as being banned from the service. But that still isn’t a First Amendment issue, because these are rules the companies themselves impose.

“We’ve already seen what they allow and what they don’t. Facebook and Twitter have pounced on the former president and banned a few other people, and that essentially controls the content,” added Bailey.

Furthermore, it could be argued that social media companies, like any other business, have an ethical obligation to conduct their operations responsibly. That could include, for example, not allowing an individual’s personal information to be published. It would then be up to the social media company to respond accordingly.

“Part of that ethical duty is to avoid foreseeable harm to others,” suggested Robert Foehl, executive-in-residence for business law and ethics in Ohio University’s online Master of Business Administration program.

“If a person or group uses a social media platform to promote acts of violence against another person or someone else’s property, the social media company has an ethical obligation to take corrective action and remove the published content so that the resulting harm is minimized,” Foehl added. “This reactive response is not enough, however. The social media company is also ethically obligated to take proactive measures to prevent such content from being published at all, and thus to avoid foreseeable harm.”

Incitement to violence

While it has been debated at length whether Rep. Waters or Mr. James actually intended to incite violence, there have been cases where people have unquestionably used social platforms to do so. In such cases, social media companies should certainly take responsibility, as they have in the past.

“Twitter’s recent ban of Donald Trump following the January 6 riot showed that these platforms recognize the responsibility they have to moderate and mitigate potentially harmful misinformation,” said Jui Ramaprasad, professor of decision, operations and information technologies at the University of Maryland’s Robert H. Smith School of Business.

“While banning users is an option, platforms are also starting to think — and should think — about how information and misinformation are disseminated and what content is privileged,” added Ramaprasad. “When algorithms drive the process of determining what we see in our feeds — no matter what platform we are on — it is not clear that ‘true’ information is privileged over the false.”

Social media does differ from other forms of mass media, however, in how quickly communication can spread. A simple tweet can go viral in minutes. Even if deleted by the original poster, it can take on a life of its own — as many celebrities have discovered after being too quick with their thumbs.

“It is highly unlikely, for example, that a newspaper article would constitute incitement,” said Cohn. “It’s hard to imagine an article leading people to immediately put the paper down and engage in illegal activity. Social media adds an interesting wrinkle given the real-time communication involved. Could a tweet constitute incitement? It is much easier to imagine circumstances in which that would theoretically be possible than it is for a newspaper article.”

So it’s easy to see why social media platforms have had to silence some voices, and why the companies have had to grapple with content that could even suggest incitement to violence.

“Social media platforms that remove content that incites violence are absolutely within their rights, and there is certainly a strong moral argument that they should,” added Cohn. “I think you would be hard-pressed to find many people who believe that platforms should leave serious threats of violence (as opposed to rhetorical hyperbole) unmoderated.”

How social media companies handle these issues may be the real challenge, especially given how new and rapidly evolving the technology is.

“Social media is similar to mass media, but it’s still a different beast,” said Matthew J. Schmidt, PhD, associate professor of national security and political science at the University of New Haven.

“There’s still a lot to be figured out about how social media should handle posts that harm others, and we’re still feeling our way,” added Schmidt. “We’re not exactly digital natives. It may be the children now being born into the world of social media who figure it out best. We built it, but we’re still the outsiders. They will live in it, and they will work out how to properly moderate such content.”