
TikTok, the ‘Blackout Challenge’ and Section 230 in US law

The TikTok “Blackout Challenge” is one of several dangerous viral “challenges” that have swept social media. Participants are encouraged to choke themselves until they pass out from oxygen deprivation.

As early as 2008, the Centers for Disease Control and Prevention (CDC) issued a warning about people strangling themselves “in order to obtain a brief euphoric state or high”, citing more than 80 deaths. Although the challenge only became popular on TikTok in 2021, it therefore clearly predates the platform.

According to a report published in November 2022 by Bloomberg Businessweek, the challenge was responsible for at least 15 deaths of children under the age of 12 and five more of children aged 13 to 14.

As the number of fatalities and injuries has risen, a slew of lawsuits has been filed against TikTok, the social media platform where the challenge found renewed traction.

The lawsuits allege that TikTok’s algorithm promotes harmful content, that the platform admits underage users, and that it fails to warn users or their guardians about its addictive nature. They are typically brought by the parents of children who died attempting the “Blackout Challenge.”

The Social Media Victims Law Center (SMVLC) and the families of 8-year-old Lalani of Temple, Texas, and 9-year-old Arriani of Milwaukee, Wisconsin, filed a lawsuit on June 30 in a California court, seeking to hold social media companies liable for the harm they cause to vulnerable users. Tawainna Anderson, the mother of 10-year-old Nylah, who died attempting the challenge, sued in May. Her lawsuit specifically cites TikTok’s “For You” page, which serves curated content based on a user’s interests and past behaviour.

According to the lawsuit, the algorithm determined that the lethal Blackout Challenge was well tailored to 10-year-old Nylah Anderson’s interests and likely to engage her, and she died as a result. An October ruling in the Anderson case, however, called into question the viability of the other lawsuits, which make similar allegations.
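
TikTok’s actual recommendation system is proprietary, so the following Python sketch is purely illustrative: every name (`Video`, `User`, `score`, `for_you_feed`) and every weighting is invented here. Under those assumptions, it shows how a ranker that optimises only for predicted interest can surface a viral “challenge” video to a child whose watch history matches it.

```python
# Hypothetical sketch of engagement-driven feed ranking.
# NOT TikTok's algorithm (which is proprietary); names and weights are invented.
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    tags: set[str]
    engagement_rate: float  # fraction of viewers who liked, shared, or rewatched

@dataclass
class User:
    age: int  # known to the platform, but tellingly unused by the ranker below
    # tags of videos the user previously watched to completion, with counts
    watched_tags: dict[str, int] = field(default_factory=dict)

def score(video: Video, user: User) -> float:
    """Predicted interest: overlap with past behaviour times raw engagement.
    Nothing here asks whether the content is appropriate or safe."""
    affinity = sum(user.watched_tags.get(t, 0) for t in video.tags)
    return (1 + affinity) * video.engagement_rate

def for_you_feed(videos: list[Video], user: User, k: int = 3) -> list[Video]:
    """Return the k videos with the highest predicted interest for this user."""
    return sorted(videos, key=lambda v: score(v, user), reverse=True)[:k]

if __name__ == "__main__":
    child = User(age=10, watched_tags={"challenge": 12, "dance": 5})
    videos = [
        Video("dance tutorial", {"dance"}, engagement_rate=0.20),
        Video("viral challenge", {"challenge"}, engagement_rate=0.35),
        Video("cooking clip", {"cooking"}, engagement_rate=0.25),
    ]
    # The highly engaging "challenge" video ranks first for this child,
    # purely because it matches past behaviour and drives engagement.
    for v in for_you_feed(videos, child):
        print(v.title)
```

The sketch makes the legal dispute concrete: a ranker like this does not create any video, it only orders third-party content for each user, and that ordering is the “publishing” activity the ruling discussed below held Section 230 protects.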

TikTok’s Defense

On October 25, US District Judge Paul Diamond of the US District Court for the Eastern District of Pennsylvania ruled that a federal law absolved the video-sharing platform of responsibility for Nylah Anderson’s death, even though the company’s app had suggested the video to her.

In an eight-page decision, the judge concluded that “the defendants did not originate the Challenge; rather, they made it easily accessible on their website.” The defendants’ algorithm merely brought the challenge to the attention of users likely to be interested in it; in doing so, they published the work of others, which is precisely the activity that Section 230 shields from liability.

The court added that for TikTok to be held responsible for deaths caused by the “Blackout Challenge,” the US Congress would have to enact laws to that effect. Congress, not the courts, the judge stated, should decide whether such immunity is appropriate.

Section 230 of the Communications Decency Act (CDA)

As the internet boomed in the United States in the early 1990s, new regulation was needed to address the challenges of the emerging online landscape. To prevent minors from accessing sexually explicit content online, Congress passed the Communications Decency Act (CDA) as Title V of the Telecommunications Act of 1996.


While many activists criticised the CDA as “anti-free speech,” and the US Supreme Court invalidated many of its more ambiguous provisions, Section 230 has proven to be one of the most effective tools for defending free expression and innovation on the internet.

Section 230 states that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

According to the Electronic Frontier Foundation (EFF), online intermediaries that host or republish speech are thus shielded from a range of laws that could otherwise hold them legally liable for what others say and do.

This means that a company such as TikTok, which operates a platform where many users post a wide range of content, is not responsible for what those users publish. Although the “Blackout Challenge” was amplified on TikTok in this case, the platform is immune from liability under Section 230 for content it did not itself create.

Although CDA 230 “creates broad protection that has allowed online innovation and free speech to flourish,” the EFF notes that there are “significant exceptions for certain criminal and intellectual property-based claims” (such as the publication of child pornography or the promotion of terrorism).

Parents, legislators, and TikTok all face challenges

Although TikTok has thus far avoided legal or criminal liability for the potentially harmful nature of some of its content, concern is growing both inside and outside the company as more people, particularly children, put themselves in danger. TikTok Inc. has a number of content moderation policies in place, but the sheer volume of posts makes it difficult to stop harmful trends from spreading.

TikTok also has a significant problem with young users. Although the minimum age to sign up is 13, Bloomberg reported that up to one-third of its users were underage.

TikTok, like many other online services, has struggled to stop underage users who sign up simply by lying about their age. Facial age estimation software has been proposed as a potential solution, but it carries its own set of risks, not least the storage of sensitive biometric data with a company accused of spying for China.


Critics and families who have lost loved ones to dangerous TikTok challenges and games, however, argue that the company does not go far enough and actively prioritises its own commercial interests over the welfare of children.

“Platforms don’t seriously enforce age restrictions because it’s not in their best interests,” Michael Rich, a paediatrician at Boston Children’s Hospital and director of the Digital Wellness Lab, told Bloomberg.

Children stuck at home during the pandemic spent more time on social media and joined the ranks of frequent users. Furthermore, TikTok amplifies a plethora of otherwise innocuous dancing and singing trends, and the presence of minors raises the product’s value. These companies, as Dr. Rich points out, “do not see their users as customers to be served; they see them as a product to be sold.”

Legislators have stepped in to address growing parental concern about children’s online safety. In September 2022, California Governor Gavin Newsom signed an online privacy law, inspired by the United Kingdom’s age-appropriate design code, that requires tech companies to prioritise children’s needs over profits. The law, which takes effect in 2024, obliges businesses to determine the age of child users “with a reasonable level of certainty.”

TikTok found not at fault for the child’s death in the “Blackout Challenge”

A judge has thus ruled that TikTok is not liable for the death of a 10-year-old girl who died after watching a “Blackout Challenge” video encouraging viewers to choke themselves.

Even though the video-sharing platform’s app recommended the video to Nylah Anderson, US District Judge Paul Diamond in Philadelphia found that the company was not legally responsible for her death.

The Blackout Challenge urges viewers to record themselves passing out from self-choking. Several children have died attempting variations of the challenge posted on various platforms. TikTok faces additional wrongful-death lawsuits in federal courts in Oakland and Los Angeles in connection with the controversy.

According to court documents, Anderson was discovered hanging from a purse strap in a closet in her Pennsylvania home in December 2021. The girl’s mother sued TikTok for recommending the video on her daughter’s “For You” page.


Diamond ruled in the eight-page decision, issued on October 25, that TikTok could not be sued even though the app had recommended the video to the girl. Recommending a video to a user, he wrote, is protected under Section 230 of the Communications Decency Act. “Congress, not the courts,” he went on, “should properly address the wisdom of such immunity.”

Section 230 was added to the 1996 law to protect online content providers from being buried under mountains of litigation over content that users post on their platforms.

The Andersons’ attorney stated that the family disagreed with the judge’s interpretation of Section 230.

Jeffrey Goodman said the federal Communications Decency Act was never intended to allow social media companies to send harmful content to children, and that the Andersons would continue to fight for children’s safety against a sector that preys on children for financial gain.

Anderson v. TikTok Inc., 22-cv-01849, US District Court for the Eastern District of Pennsylvania
