
TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

all 36 comments
[-] t3rmit3@beehaw.org 55 points 2 weeks ago

I am generally very skeptical of lawsuits making social media and other Internet companies liable for their users' content, because that's usually a route to censor whatever the government deems "harmful", but I think this case actually makes perfect sense by attacking the algorithmic "curation" that they do. Imo social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.
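The distinction this comment draws, user-curated chronological ordering versus platform-driven ranking, can be sketched in a few lines. This is purely illustrative: the `engagement_score` field is a hypothetical stand-in for whatever signals a real recommender system actually uses.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # seconds since epoch
    engagement_score: float  # hypothetical platform-predicted engagement

posts = [
    Post("alice", timestamp=100, engagement_score=0.9),
    Post("bob",   timestamp=200, engagement_score=0.2),
    Post("carol", timestamp=150, engagement_score=0.5),
]

# Chronological feed: newest first, driven only by who the user follows.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Algorithmic feed: the platform's own ranking model decides what surfaces,
# which is the curation at issue in the Third Circuit's ruling.
algorithmic = sorted(posts, key=lambda p: p.engagement_score, reverse=True)
```

The two orderings diverge as soon as the platform's score disagrees with recency, which is exactly where the question of whose "speech" the feed is becomes interesting.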

[-] chahk@beehaw.org 33 points 2 weeks ago

social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

But then how would they make money if they can't keep users doomscrolling forever to keep serving them ads? Won't someone think of the shareholders?!

[-] technocrit@lemmy.dbzer0.com 4 points 2 weeks ago

Unfortunately nobody can stop me from doomscrolling.

[-] Kolanaki@yiffit.net 9 points 2 weeks ago

As if that would stop these dumbass challenges from being posted and copied at all? People have been hurting themselves copying things they saw someone else do since before the invention of the camera.

[-] t3rmit3@beehaw.org 14 points 2 weeks ago* (last edited 2 weeks ago)

Yes, but that's not the entirety, or even the majority, of the problem with algorithmic feed curation by corporations. Reducing the visibility of those dumb challenges is just one of many benefits.

[-] schnurrito@discuss.tchncs.de 5 points 2 weeks ago

No, it wouldn't, but people would only see them if they were part of a preexisting community where such things are posted, or if they specifically looked for them.

On the Internet, censorship happens through abundance: there's far too much information for our limited time and attention spans, so going after recommendation algorithms will work.

[-] some_guy@lemmy.sdf.org 31 points 2 weeks ago

I'm gonna take the side that TikTok is potentially liable on the algorithm argument, but these parents also failed their children. Teaching your kids not to replicate unsafe internet content should be as fundamental as teaching them to look both ways before crossing the road.

[-] winkerjadams@lemmy.dbzer0.com 9 points 2 weeks ago

"If your friend told you to jump off a bridge, would you?"

-Any decent parent

[-] Trafficone@slrpnk.net 7 points 2 weeks ago

"Bridge jumping challenge"

-TikTok shitposter

[-] Kissaki@beehaw.org 8 points 2 weeks ago

As a society, we're responsible for all our children. The point of child protection laws, and population protection in general, is to support and protect them, because oftentimes parents are incapable of doing so, or the social dynamics involved are ones most parents can't really understand, follow, or teach about.

Yes, parents should teach and protect their children. But we should also create an environment where that is possible, and where children of less fortunate and of less able parents are not victims of their environment.

I don't think demanding and requiring big social platforms to moderate and regulate at least to the degree where children are not regularly exposed to life-threatening trends is a bad idea.

That stuff can still be elsewhere if you want it. But social platforms have a social dynamic, more so than an informative one.

[-] stardust@lemmy.ca 26 points 2 weeks ago* (last edited 2 weeks ago)

I remember reading that China's version of TikTok promotes things like science to kids. Everyone else gets degeneracy like stealing Kias, licking grocery store items, and now blackout challenges.

It would be interesting to see how the algorithm is tuned for China versus the rest of the world. Makes me wonder if it's intentional: trying to make society a worse place by pushing certain trends on the international versions of TikTok instead of filtering them out.

Stuff like Facebook and Twitter is insane too, so it's all self-sabotage at this point, but TikTok seems to have become the trendsetter.

[-] LukeZaz@beehaw.org 14 points 2 weeks ago

Makes me wonder if it’s intentional to try to make society a worse place with inventive uses of pushing certain trends on international versions of tiktok instead of filtering them out.

Good lord, this is a massive reach. A much simpler explanation is that algorithmic garbage is profitable, and China's government doesn't care about negative ramifications that occur outside China itself, so it doesn't regulate it.

China's run by a terrible government, not an MCU villain.

[-] stardust@lemmy.ca 4 points 2 weeks ago

Uhhh... I don't think you got my point. That's why I also included Facebook and Twitter at the end, as examples of domestic companies willingly allowing harmful societal trends.

Money being the reason doesn't absolve companies or hand them a convenient out to do whatever they want without consequence or criticism. I put them all in the camp of willingly selling out society for profit, and whether a country sees that as a win for itself or not doesn't change that.

[-] Yoruio@lemmy.ca 7 points 2 weeks ago* (last edited 2 weeks ago)

This is just how capitalism works: you have to appeal to your audience more than your competition does, and guess which kind of content teenagers want to watch more. Hell, even adults want fun content as opposed to educational content.

they're not willingly selling a worse society for profit, that's just the only way to stay competitive.

any platform that pushes educational content in North America would just not get any customers and go bankrupt.

edit: there are plenty of educational video platforms out there, like Khan Academy. Try to get your kids to scroll through that in their free time instead; I bet they won't.

[-] stardust@lemmy.ca 5 points 2 weeks ago

I know how capitalism works... I was just sharing my thoughts on a company knowingly tuning its algorithm in a positive direction for one demographic and a negative one for another, which shows a clear awareness of impact. Not sure why you're so worked up about TikTok getting criticized too. Whatever.

[-] Yoruio@lemmy.ca 2 points 2 weeks ago

In the US, publicly traded companies have a legal obligation to make as much money for their shareholders as legally possible (see Ford getting sued by shareholders after giving workers raises). It would be borderline illegal for a company to adjust its algorithm in a way that makes it less competitive.

This needs to be regulated by the government, not the companies themselves. That way, all the companies would be forced to change their algorithms at the same time, without impacting their competitiveness.

So the government going after TikTok is a good first step, IF it does the same thing to Facebook / Instagram / YouTube / Snapchat. But I'm betting it won't, because those companies spend an absurd amount of money on lobbying.

[-] t3rmit3@beehaw.org 3 points 2 weeks ago* (last edited 2 weeks ago)

This is a false narrative that stock traders push. The fiduciary duty is just one of several duties that executives have, and it does not outweigh the duty to the company's health or to employees. Obviously shareholders will argue otherwise or even sue to get their way, because they only care about their own interests, but they won't prevail in most cases if there was a legitimate business interest and justification for the actions.

[-] viking@infosec.pub 14 points 2 weeks ago

Yeah, Douyin is pushing educational content and is very fast to censor harmful stuff. Still full of garbage and racism though, just the sanctioned kind, against people the government doesn't like.

[-] DeltaTangoLima@reddrefuge.com 16 points 2 weeks ago

Shit like this is why I intend to keep my (currently) 9yo as far away from social media as I can, for as long as I can. This fucking terrifies me, as it should any parent.

[-] DdCno1@beehaw.org 17 points 2 weeks ago

Educating your kid about the many possible pitfalls of social media is even more important. They will eventually experience it, and are likely already exposed to it to some degree through their friends' devices. Don't make the mistake of turning social media into some kind of forbidden fruit; instead, provide them with the tools to deal with it responsibly.

That said, I would still not allow this Chinese psy-ops tool on any device in my household. Other social media is already terrible enough, but TikTok seems to be engineered to cause nothing but damage.

[-] BCsven@lemmy.ca 7 points 2 weeks ago

I know some amazing parents who have super open communication and excellent teaching moments with their kids. Their kids still fell into the social media morass, because friends (and the teenage brain) are a heavy influence, even with a safe, supportive home.

[-] LukeZaz@beehaw.org 2 points 2 weeks ago

This is why I think monitored access is a better idea than total withholding. Kids are going to end up on social media anyway, either as they grow up and eventually become adults, or through peers providing access and pressure. Best to let them on, but ensure they are safe, know how to be safe, and know why to be safe.

[-] DdCno1@beehaw.org 1 points 2 weeks ago

That's a universal truth about parenting, though, not one limited to just social media.

[-] BCsven@lemmy.ca 1 points 2 weeks ago

Right, but I was commenting on educating your kids about the pitfalls of social media, like you said. My adult children are teachers, and they see social media destroying kids even with education about it. Their brains can't stop even when they know the consequences, especially because it's psychologically tailored to engage them more and more.

[-] DeltaTangoLima@reddrefuge.com 6 points 2 weeks ago

My own belief is that all social media is a cancer, and to be avoided entirely. I'm able to do that for myself, but I'm also realistic about the chances of keeping my kids away from it. So, I focus my energy on trying to equip them with the mental skills to neutralise the toxic aspects of social media.

For my 9yo, that means teaching her to employ natural skepticism and critical thinking. I'm also trying to drum into her the understanding that social media is inherently untrustworthy and unreliable, and exists solely for the benefit of the corporations that run it.

That said, I've blocked TikTok on my home network, much to the older kids' chagrin. They have to use mobile data if they want to access that shit on their phones.

[-] null@slrpnk.net 7 points 2 weeks ago

My own belief is that all social media is a cancer, and to be avoided entirely. I'm able to do that for myself

You just posted this to a social media site...

[-] Scary_le_Poo@beehaw.org 3 points 2 weeks ago

Come on dude, you know exactly what he meant. Social media is a broad category, but when someone mentions it in this context, it's very clear what they mean.

[-] null@slrpnk.net 4 points 2 weeks ago

I disagree. I don't think it's clear at all what he considers dangerous about social media if he's excluding things like Lemmy, Reddit, and other message boards.

[-] DeltaTangoLima@reddrefuge.com 1 points 2 weeks ago

Later in the same comment I mention how I think social media only benefits the corporations that run it.

It’s pretty clear what I meant.

[-] null@slrpnk.net 2 points 2 weeks ago

So a family group-chat, that's a no-no, right?

[-] theangriestbird@beehaw.org 10 points 2 weeks ago

Ah shit, me and my friends used to do this, pre-social media. I remember one time at middle school recess, going out to the farthest corner of the playground with my friends, and we all took turns holding our breath while someone else squeezed our chest. I remember blacking out, hearing the Pokémon theme in pitch darkness, and then waking up on the ground.

I don't think we did it more than once (at least I didn't). But of course, the crucial difference was that I was with my dumbass friends, so at least there was someone to run for help if someone didn't wake up.

[-] tilefan@lemm.ee 4 points 2 weeks ago

tiktok was somehow the only platform carrying the trend?

[-] technocrit@lemmy.dbzer0.com 4 points 2 weeks ago* (last edited 2 weeks ago)

No, but platforms based in the USA work hard to conceal and obfuscate the genocide in Palestine.

[-] cupcakezealot@lemmy.blahaj.zone 4 points 2 weeks ago

Terrible ruling, which only serves to further gut Section 230, which has been the goal of conservatives for years.

[-] thingsiplay@beehaw.org 3 points 2 weeks ago

TikTok should be 18+.

this post was submitted on 29 Aug 2024
91 points (100.0% liked)

Technology
