TikTok Requires Users To 'Forever Waive' Rights To Sue Over Past Harms (arstechnica.com)
An anonymous reader quotes a report from Ars Technica: Some TikTok users may have skipped reviewing an update to TikTok's terms of service this summer that shakes up the process for filing a legal dispute against the app. According to The New York Times, changes that TikTok "quietly" made to its terms suggest that the popular app has spent the back half of 2023 preparing for a wave of legal battles. In July, TikTok overhauled its rules for dispute resolution, pivoting from requiring private arbitration to insisting that legal complaints be filed in either the US District Court for the Central District of California or the Superior Court of the State of California, County of Los Angeles. Legal experts told the Times this could be a way for TikTok to dodge arbitration claims filed en masse that can cost companies millions more in fees than they expected to pay through individual arbitration.
Perhaps most significantly, TikTok also added a section to its terms that mandates that all legal complaints be filed within one year of any alleged harm caused by using the app. The terms now say that TikTok users "forever waive" rights to pursue any older claims. And unlike a prior version of TikTok's terms of service archived in May 2023, users do not seem to have any options to opt out of waiving their rights. Lawyers told the Times that these changes could make it more challenging for TikTok users to pursue legal action at a time when federal agencies are heavily scrutinizing the app and complaints about certain TikTok features allegedly harming kids are mounting.
Waive rights might go against law (Score:5, Interesting)
Waiving rights might go against the law in some areas. Many agreements and terms can be considered invalid if they break the law.
But maybe it's time for many jurisdictions to strengthen their laws when it comes to what can be put in terms of service.
Re: (Score:3)
Maybe in the EU, which seems to give a little bit of an actual darn about consumers and regular citizens. Here in the good ol' US of A, these changes are totally enforceable.
Re: Waive rights might go against law (Score:2)
What's consideration? When you sign a contract, consideration is the thing/good/service/idea/right that you are trading in exchange for whatever the other party is promising (also consideration).
In this case, TikTok is saying they promise to let you use the app as long as you give up specific rights. Rights to created content is one thing, but giving up the legal right to sue for damages is significant, and continued access to the app isn't sufficient consideration for that.
Re: (Score:2)
If you were harmed and didn't bother to do anything about it legally within a year, well, that's on you. Obviously it was not harmful enough.
Depends on whether the harm was momentary or ongoing. Harm spread out over a long period of time can be cumulative, up to the point where people snap and cause harm to themselves or others. Would the liability be limited to the last year of harm in that situation? Because that's not how these things typically work, nor should it be.
Re: (Score:2)
At some point people have to take some responsibility for their own life choices. But that is a hard thing to do when you are entitled to sue for damages because someone made you cry and you kept going back so it could happen again.
We're talking about an app that is mostly used by kids. From the perspective of the law, they're not old enough to be required to take responsibility for their life choices yet. And there are limits to the extent to which parents can monitor their kids' cell phone activity. So no, what you are advocating isn't really all that reasonable from a legal liability perspective.
Re: (Score:2)
meh whatever (Score:4, Insightful)
Doesn't work (Score:3)
There are certain rights you simply cannot waive legally around here. One of them is the right to sue for damages. You can write into your contracts "Oh, and by the way, if you get harmed by our product, sucks to be you" all you want and I'll gladly sign it. With a hint of luck, that voids the whole contract if I (as the non-contract-drafting party) so please.
So please go ahead.
Re: (Score:2)
Binding arbitration clauses have been largely upheld in court. So, yes, you can waive your right to sue.
With that said, waiving the statute of limitations has not been tested in court, as far as I'm aware. Wouldn't surprise me to see some laws passed targeting TikTok and their shenanigans, though.
Kind of funny, though, that binding arb has come full circle where companies have realized that is actually a greater liability.
Re: (Score:2)
I believe they can only be upheld when there's an actual expectation that the party knows what it is they are waiving. I.e., you can force arbitration clauses in a contract about performance metrics, such as uptime, because when signing the contract you know uptime is something that may be argued. But "harm," especially "past harm," is legally dubious.
Kind of funny, though, that binding arb has come full circle where companies have realized that is actually a greater liability.
The problem is people are slack. If every person who would sign up for a class action payout would instead take the company to arbitration, the company would be royally screwed.
Re: (Score:2)
Speak for your own jurisdiction. I don't know a single case in our legal system where this shit didn't get thrown out as soon as a judge had the time to stop laughing long enough to say "the fuck?"
General protection or something specific? (Score:3)
This is the clause here [tiktok.com]:
YOU AND TIKTOK AGREE THAT YOU MUST INITIATE ANY PROCEEDING OR ACTION WITHIN ONE (1) YEAR OF THE DATE OF THE OCCURRENCE OF THE EVENT OR FACTS GIVING RISE TO A DISPUTE THAT IS ARISING OUT OF OR RELATED TO THESE TERMS. OTHERWISE, YOU FOREVER WAIVE THE RIGHT TO PURSUE ANY CLAIM OR CAUSE OF ACTION, OF ANY KIND OR CHARACTER, BASED ON SUCH EVENTS OR FACTS, AND SUCH CLAIM(S) OR CAUSE(S) OF ACTION ARE PERMANENTLY BARRED.
That got me wondering: if, for example, they got hacked five years ago and some folks had their identity stolen, but it only gets discovered this year, does that mean the claim is expired, or does the clock start with the public disclosure?
It just got me wondering if there was some specific past incident that they were trying to protect themselves from (seems like a far-fetched legal strategy).
Re: General protection or something specific? (Score:1)
meh (Score:2)
What's my age again? (Score:4, Interesting)
Is the required age to sign up for and use these services still only 13? I don't think a minor has the ability to enter into a legal agreement with ByteDance regarding waiving rights, and their legal guardians certainly never agreed to the terms.
And also handover ... (Score:1)
... your personal life to us, drone.
Maybe, just don't use it (Score:2)