Liability of Social Media Companies for Injuries & Losses Suffered as a Result of Posts Hosted on Their Platforms


By Chijioke Obute

An Examination of the Live Case of Gonzalez v Google at the United States Supreme Court


  1. Overview/Definition of Concepts
  2. Review of Nigerian laws on the liability of Social Media Companies
  3. Foreign Jurisdiction laws on liability of Social Media companies
  4. Writer’s opinion on the appropriateness or otherwise of Social Media liability
  5. Examination of the Ongoing case of Gonzalez v Google in the US.
  6. Conclusion.


  • The question of whether Social Media Companies should be liable for injuries, including but not limited to death, and other losses suffered by users and non-users alike as a result of posts hosted on their platforms is gradually becoming a burning issue. In the past, it was not envisaged that a time would come when social media platforms would be employed by faceless, non-state actors to unleash terror on unsuspecting victims. The issue feels more remote in developing countries like Nigeria, where it is not yet so relatable. Yet the Social Media Companies have grown exponentially and spread so far that non-state actors, especially terrorist organisations, are increasingly employing their platforms to advance their nefarious activities.
  • For better perspective, some of these Social Media companies include: Google (Gmail, YouTube), Meta (Instagram, Facebook, WhatsApp), Microsoft, Twitter, TikTok, Snapchat, etc. These companies and their respective platforms have permeated the remotest parts of the globe, so that their impact and effect, and likewise their adverse effects, are global. The norm has been to protect these social media companies from liability for injuries and losses occasioned by the use of their platforms; extant laws in most advanced democracies favour such protection in order to encourage their growth. Over time, however, these platforms have increasingly become arenas in which many non-state players and organisations perpetrate their heinous crimes.

Definition of Concepts

  • Social Media
  • Social Media Companies
  • Liability
  • Foreign Jurisdiction.

Social Media:

Social media are interactive technologies that facilitate the creation and sharing of information, ideas, interests, and other forms of expression through virtual communities and networks. The term social in regard to media suggests that platforms are user-centric and enable communal activity. As such, social media can be viewed as online facilitators or enhancers of human networks—webs of individuals who enhance social connectivity. Users usually access social media services through web-based apps on desktops or download services that offer social media functionality to their mobile devices (e.g., smartphones and tablets). Some of the most popular social media websites, with more than 100 million registered users, include Facebook (and its associated Facebook Messenger), TikTok, WeChat, ShareChat, Instagram, QZone, Weibo, Twitter, Tumblr, Baidu Tieba, and LinkedIn. For the purposes of this discourse, other popular platforms that fall under social media services include YouTube, QQ, Quora, Telegram, WhatsApp, Signal, LINE, Snapchat, Pinterest, Viber, Reddit, Discord, VK, and Microsoft Teams.[1]

Social Media Companies:

These are the companies that set up, own and run the social media platforms and/or apps. They are the entities recognised by law and clothed with legal personality, and they are responsible for maintaining the social media platforms. For the purpose of this discourse, these companies include:

  1. Meta Platforms Inc., which owns Facebook and Instagram and acquired WhatsApp in 2014.
  2. Alphabet Inc., popularly known as Google, which owns YouTube.
  3. Twitter Inc., which owns and operates the Twitter app.
  4. Microsoft Corp., which owns LinkedIn.
  5. Snap Inc., formerly Snapchat Inc., which owns and operates the Snapchat app.
  6. ByteDance, a Chinese company, which owns and operates the TikTok app.
  7. Tencent, another Chinese company, which owns the WeChat and QQ apps, etc.


Liability:

Liability in the simplest form means the state of being legally responsible for something. Going further, civil liability refers to the right of an injured party to hold someone responsible for his injuries or damages, which resulted from the other party’s wrongful actions. In order to hold a person or entity civilly liable, the wronged party must have suffered some type of quantifiable loss or damage. This may be in the form of personal injury, property damage, loss of income, loss of contract, and a host of other losses. In a civil liability lawsuit, the injured party’s losses must have occurred due to the defendant’s violation of a law, breach of contract, or other wrongful act, referred to as a “tort.” Examples of civil liability cases include injuries and property damages sustained in automobile accidents, and defamation of character claims. To be successful in a civil liability lawsuit, the plaintiff must prove to the court, or to a jury, that it is more likely than not that the defendant’s actions caused his injuries or loss. This level of proof is referred to as a “preponderance of the evidence.”[2]

Foreign Jurisdiction:

Foreign jurisdiction, in layman’s terms, means any jurisdiction other than Nigeria, that is, every jurisdiction outside the borders of Nigeria. For the purposes of this discourse, foreign jurisdiction refers to the laws of other countries, particularly the laws of the United States, on the liability or otherwise of Social Media companies for injuries occasioned by posts hosted on their apps.

Review of Nigerian laws on the liability of Social Media Companies.

  • Regrettably, Nigeria has no statute that shields social media companies from civil liability for third-party content. In the wake of its suspension of Twitter in June 2021, the Nigerian Government, realising that it had no law enacted for the regulation of social media, directed the National Broadcasting Commission (“NBC”) to come up with regulations for social media. Although this is not directly related to this discourse, which focuses on the liability of Social Media companies for injuries suffered as a result of the use of their apps, it is important to observe that this absence of legislation exposes the social media companies to uncertainty as well.
  • The law as it stands in Nigeria does not provide these Social Media companies any protection of the kind available in other jurisdictions. Indeed, the law as it is appears to be anti-social media, as provisions of the NBC Broadcasting Code appear to conflict with some of the policies of the social media companies. Of particular note are the provisions of the Code that prohibit the broadcast of sexual content such as adultery, incest, bestiality, same-sex acts, etc. Of greater interest is Section 3.8.1(b) of the Code, which prohibits broadcasters in Nigeria from broadcasting “any language or scene likely to encourage crime, lead to disorder or any content which amounts to subversion of constituted authority or compromises the unity or corporate existence of Nigeria as a sovereign state”. This provision could suffice to hold social media companies liable for posts from their users hosted on their platforms, since such posts could be interpreted by the Nigerian judiciary as a publication/broadcast. The Same Sex Marriage (Prohibition) Act, an extant law in Nigeria, is another statute that could operate to make social media companies liable for posts on their platforms that offend its provisions. It is remarkable that no suit seeking to hold any social media company liable for breach of these laws has yet been filed.
  • In the face of all these laws, however, there is no statute on the liability or otherwise of social media companies for injuries and losses suffered by users and non-users alike, whether from posts by ordinary users or from the employment of social media apps by non-state actors in the advancement of their nefarious activities.

Foreign Jurisdiction laws on liability of Social Media companies 

  • Social media platforms and apps are historically indigenous to the United States, where most of them were developed and nurtured. In recent times, China has, expectedly, made tremendous inroads in the building and development of social media platforms. This discourse will principally examine United States law on the liability of social media companies.
  • There is no principal legislation protecting social media companies per se. However, the most outstanding law in the US that protects social media companies from liability arising from posts by users on their apps is the Communications Decency Act 1996 (“CDA”), enacted by the United States Congress in its first notable attempt to regulate pornographic material on the internet. Section 230(c) of the CDA is the provision relevant to our discourse and provides thus:

(c) Protection for ”Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability:

No provider or user of an interactive computer service shall be held liable on account of –

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

  • This section was introduced to promote the development of the Internet in its infancy by protecting web services from being drowned in lawsuits, and to address concerns about children’s access to inappropriate content. It has been interpreted to mean that websites and social media platforms are not liable for third-party content produced and shared by users through those mediums. If a user makes a post defaming another person, it is the user, rather than the platform, that is responsible. Section 230 of the CDA also permits “good faith” action taken to restrict egregious content posted by users. This allows the platforms to engage in selective removal/deletion of posts without losing their immunity from legal responsibility for content or incurring liability for such removals. It therefore provides two-fold protection to the Social Media companies, and has prevented any successful liability suit against social media companies in the US.
  • Section 230 of the CDA was applied by the US judiciary in the recent case of Force v Facebook.[3] Taylor Force was a business student and an American veteran of the wars in Iraq and Afghanistan. In 2016, during the “knife intifada”,[4] he was stabbed and killed in Tel Aviv by a Hamas terrorist, and his relatives sought justice by bringing an action in the US. Social media (Facebook) was seen as a vital tool for terrorists, and legal action was instituted at a federal court in New York on behalf of five families. It was argued that Facebook gave material assistance to a known terrorist organization, Hamas, by providing it with a communications platform and recommending Hamas content to users.

The court did not side with the terrorism victims and dismissed the case; the Court of Appeals affirmed, and the Supreme Court declined to intervene. Section 230 gave Facebook an impenetrable wall that placed it beyond reach.

Robert Tolchin, who was Counsel for the petitioners and has worked on anti-terrorism litigation for almost 20 years, said that “the problem that we have come up against in all these cases was Section 230 of the CDA.”

  • The UK, which has no substantive law on the subject, is currently in the process of making laws to limit the protection that social media companies and platforms enjoy. The proposed law, which is at an advanced stage, would hold social media platforms and their companies responsible and liable for harmful, violent and dangerous posts on their platforms. The move is reported to be aimed at making the UK the safest place to be a social media user.

Writer’s opinion on the appropriateness or otherwise of Social Media liability. 

  • Social media has come to stay, and its benefits, in the opinion of the author, far outweigh its demerits. The intention of the US lawmakers in enacting Section 230 of the CDA was, and remains, valid: it was made principally to protect internet companies and developers from crippling lawsuits and attendant liabilities during the internet’s early years. Of interest, however, is the fact that present-day social media platforms were not immediately in contemplation when the section was enacted. It is nonetheless praiseworthy that Section 230 of the CDA has worked greatly to protect the social media companies.
  • Over time, the social media platforms have grown geometrically and prospered greatly, courtesy of Section 230 of the CDA. That progress, however, has seen the platforms become increasingly ready and appealing tools for the advancement and propagation of terrorism and harmful content. The platforms have also become so uncensored that disinformation and misinformation have taken steady and obvious abode on them. These posts are largely, if not completely, the handiwork of users, and as such the Social Media companies have escaped, and continue to escape, liability for harms and injuries occasioned by this content, hiding under the canopy of Section 230 of the CDA. Of even graver concern is the fact that some known terrorist organisations have active accounts on social media and at times recruit via these platforms. In all of this, the social media companies have evaded liability because of the statutory umbrella of Section 230 of the CDA.
  • In the prevailing circumstances, the protection provided to the social media companies is due for review. In jurisdictions that have no proper legislation on this subject, it is high time laws were enacted to cover the gap, and it rests on their legislators to do so. Such legislation would put the social media companies on their toes, acting promptly and vigilantly to checkmate harmful content put on their platforms. It would also serve as a wake-up call to the companies to disable all accounts with verifiable links to terrorism. Nigeria is not left out among the jurisdictions that urgently need to legislate for the liability of the social media companies in this regard. Bedevilled by terrorism and other societal ills whose perpetrators often employ the social media platforms, Nigeria has an even more compelling case for legislating to hold the social media companies liable where their negligence and indolence result in injury or loss.
  • The need to sanitise the system, without necessarily interfering with the free use of the platforms, is ripe for legislative action. The companies should not be protected from liabilities which they reasonably ought to have averted with a little care and vigilance, and there ought to be an objective standard for determining such liability. The argument by the social media companies that it is not practicable to review all posts on their platforms should no longer suffice to shield them from civil liability. The companies ought to devise means to track harmful posts and take them down, or face proportionate civil liability for the outcome. This can be achieved by deploying tracking tools to flag harmful posts and by monitoring and deleting accounts/posts affiliated to any terrorist group. The liability, however, need not take the form of strict liability: every case should be heard and decided on its own merits, applying the letter of the law strictly. The second leg of the protection in Section 230 of the CDA, which empowers the Social Media Companies to take down posts that they consider harmful, may be retained with the proviso that this is done in “good faith”.

 Examination of the Ongoing case of Gonzalez v Google in the US. 

Nohemi Gonzalez
  • Nohemi Gonzalez was in Paris, France for her studies when she was murdered by Islamic State (“ISIS”) terrorists on 13 November 2015. It is over seven years since her death, but the pain and the memory persist. Shortly after her death, the decision was made to sue Google (Alphabet Inc.), which owns YouTube. The argument for YouTube’s liability for the rise of ISIS, and the subsequent death of Nohemi, is based on the platform’s recommendation systems, which algorithmically suggest content similar to that liked or regularly watched by users. In its brief, the Counter Extremism Project detailed that these algorithms are built on the idea that “edgy” content is more attention-grabbing, which leads to inundation and the radicalisation of users. The petitioners contend that this process was monetised by Google through ad programs, while Google failed to take the necessary action to remove the wave of jihadist content it was suggesting. Recommending content should make YouTube more than a publisher of third-party content, argued Tolchin. “This is more than a billboard; you are guiding people down the rabbit hole with your algorithms. This isn’t someone going to the library and selecting a book; this is a librarian following you around and suggesting books.”
  • A companion case was filed on behalf of the relatives of Nawras Alassaf, who was murdered by an ISIS terrorist in the 2017 Reina nightclub shooting in Istanbul, Turkey. In that case, the complaints against Twitter, Facebook and YouTube hold that the companies knew that Islamic State terrorists were using their platforms and did not take action, thereby allowing the use of their platforms for communications and recruitment, which constitutes material support for terrorists. “US Congress made it clear that if one helps a terrorist organization and that group commits a terrorist act, such a person can be held responsible,” said Keith Altman, a member of the legal team. “The Alassaf case seeks to hold social media companies liable for their knowing assistance to ISIS.” While these cases were rejected by the US lower courts on account of Section 230 of the CDA, their appeals were accepted by the US Supreme Court.
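The recommendation mechanism at the centre of the petitioners' argument can be pictured with a deliberately simplified sketch. The code below is purely illustrative: it is not Google's actual system, and all video identifiers, tags and scoring are hypothetical. It only shows how a content-based recommender that scores unwatched items by their overlap with a user's watch history naturally feeds users more of whatever they already engage with.

```python
# Illustrative toy recommender. All catalogue data and scoring here are
# hypothetical; real platform systems are vastly more complex.
from collections import Counter

# Hypothetical catalogue: video id -> set of topic tags.
CATALOGUE = {
    "v1": {"cooking", "travel"},
    "v2": {"travel", "vlog"},
    "v3": {"extreme", "vlog"},
    "v4": {"extreme", "politics"},
}

def recommend(watch_history, catalogue=CATALOGUE, top_n=2):
    """Rank unwatched videos by tag overlap with the user's history."""
    # Profile: how often each tag appears across the watched videos.
    profile = Counter(tag for vid in watch_history for tag in catalogue[vid])
    # Score each unwatched video by how many of its tags match the profile.
    scores = {
        vid: sum(profile[tag] for tag in tags)
        for vid, tags in catalogue.items()
        if vid not in watch_history
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

On this toy data, a user whose only watched video carries the "extreme" tag is recommended further "extreme" content, which is the reinforcing loop, the "rabbit hole", that the petitioners describe.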
US Supreme court
  • The cases were heard by the US Supreme Court on 21 and 22 February 2023. The petitioners argued that the social media companies should be held liable for their knowing assistance to the terrorist groups. The social media giants argued that terrorist content is already forbidden by their terms of service, and that while they lack the capacity to review all content, they have employed automatic flagging and human reviewers to remove as much as possible; their routine services were simply being abused. Their platforms, they argued, were not linked directly to the attacks, not having knowingly aided or encouraged them. In the Gonzalez case, Google argued that YouTube’s videos could not be proven to have radicalised the attackers; instead, there was only a general complaint about its role in ISIS’s rise to prominence. In Alassaf, the companies argued that they could not be seen as abetting a criminal act of which they had no knowledge and which they did not actively assist. As to the algorithms, Google argued that they were neutral automated tools and that it remained protected from liability under Section 230 of the CDA, noting that every lower court had reiterated these protections. Prominent internet companies and interest groups also filed briefs in support of Google, warning of the impact that would follow if Section 230 of the CDA were to be abolished or restricted.


The wait for the Judgment of the US Supreme Court on this issue has begun. While it is unethical to comment on matters that are sub judice, and without preempting the Court, the likelihood of the US Supreme Court striking down Section 230 of the CDA is remote, though not impossible, especially in the light of the escalating employment of social media and other internet platforms to achieve unwholesome ends. The Court has been afforded a golden opportunity to re-interpret Section 230 of the CDA in its Judgment. If that happens, the next step would be for the US Congress to develop a new law. The real outcome of a successful case would be that “some of the ugly things on the Internet would likely be limited, and that is critical.”

Chijioke is an Associate at PUC.


[1] Wikipedia, the free encyclopedia

[2] Legal Dictionary

[3] Force v. Facebook, Inc., No. 18-397 (2d Cir. 2019)

[4] Era of rampant knife stabbings in Jerusalem
