Facebook Design Flaw Let Thousands of Kids Join Chats With Unauthorized Users (theverge.com)
A design flaw in Facebook's Messenger Kids app allowed children to enter group chats with unapproved strangers. "For the past week, Facebook has been quietly closing down those group chats and alerting users, but has not made any public statements disclosing the issue," reports The Verge.
The alert reads as follows: "Hi [PARENT],
We found a technical error that allowed [CHILD]'s friend [FRIEND] to create a group chat with [CHILD] and one or more of [FRIEND]'s parent-approved friends. We want you to know that we've turned off this group chat and are making sure that group chats like this won't be allowed in the future. If you have questions about Messenger Kids and online safety, please visit our Help Center and Messenger Kids parental controls. We'd also appreciate your feedback." From the report: The bug arose from the way Messenger Kids' unique permissions were applied in group chats. In a standard one-on-one chat, children can only initiate conversations with users who have been approved by the child's parents. But those permissions became more complex when applied to a group chat because of the multiple users involved. Whoever launched the group could invite any user who was authorized to chat with them, even if that user wasn't authorized to chat with the other children in the group. As a result, thousands of children were left in chats with unauthorized users, a violation of the core promise of Messenger Kids. It's unclear how long the bug was present in the app, which launched with group features in December 2017.
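The permissions flaw described in the report can be sketched in a few lines. This is a hypothetical reconstruction with made-up names, not Facebook's actual code: the buggy check validates invitees only against the group's creator, while the correct check requires every pair of members to be mutually parent-approved.

```python
# Hypothetical sketch of the authorization flaw; names and data are invented.
# Each child has a parent-approved contact list.
approved = {
    "child_a": {"friend_b", "friend_c"},
    "friend_b": {"child_a", "friend_d"},
    "friend_d": {"friend_b"},
}

def can_chat(a, b):
    """One-on-one rule: both sides must be parent-approved for each other."""
    return b in approved.get(a, set()) and a in approved.get(b, set())

def flawed_group_check(creator, invitees):
    """The buggy logic: only validates invitees against the *creator*."""
    return all(can_chat(creator, i) for i in invitees)

def correct_group_check(creator, invitees):
    """The fix: every pair of group members must be mutually approved."""
    members = [creator] + list(invitees)
    return all(
        can_chat(x, y)
        for i, x in enumerate(members)
        for y in members[i + 1:]
    )

# friend_b creates a group with child_a and friend_d: the flawed check
# passes even though child_a and friend_d were never approved for each other.
print(flawed_group_check("friend_b", ["child_a", "friend_d"]))   # True
print(correct_group_check("friend_b", ["child_a", "friend_d"]))  # False
```

The point is that pairwise approval doesn't compose: checking only the creator's edges leaves every other pair in the group unverified.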
Unapproved Strangers? (Score:2)
As opposed to Approved Strangers?
Talk about corporate BS...
Re: (Score:2)
Or, as opposed to supposedly trustworthy family members, who seem to be present by default, despite the statistical likelihood that you'll be murdered or molested by an insider.
Re: (Score:2)
Fuck Zuck!
Silly boy . . .
In Putinist Russia . . . Zuck fucks you!
"We've turned off this group chat..." (Score:3)
The Donald will not be pleased.
Re: (Score:3)
Re: (Score:3)
I'd actually say this is a fundamental failure in QA. This is an obvious corner case that should have been addressed by the product team, first of all, and ultimately QA. The developers should probably also have noticed, but sometimes it's hard to see the forest for the trees when you're in the weeds.
No, this is a symptom of the fact that Facebook does not give a single fuck about the privacy of your children, and refuses to dedicate resources to protecting their privacy at a corporate level. As long as they can show a profit, nothing else matters.
Re: (Score:2)
Re: (Score:2)
I'm going to go out on a limb and speculate that either you don't understand what this flaw entails, or you don't have the foggiest clue how software QA is generally expected to work.
Above poster has it exactly right. While it's entirely regrettable that QA did not catch this kind of issue before it went live, given what is actually occurring here, it's not terribly surprising that they didn't, even though it now seems entirely obvious in retrospect. It serves as a reminder that people are imperfect.
Re: Testing (Score:2)
I'm going to go out on a limb and say you've probably never worked with a competent QA team.
Re: (Score:2)
You say that this sort of problem seems obvious to you. That's fine. It remains debatable whether or not that comment is only being made in hindsight, and nobody else is equipped to evaluate whether that is the case but you. Regardless, what might be obvious to you might not be obvious to someone else, and that isn't necessarily a reflection of how competent they are.
I recall that when I was working once in a software QA department, I found what seemed to me like a perfectly obvious security exploit
Re: (Score:2)
Re: Testing (Score:2)
If your product is a messaging app for kids, privacy should be pretty much the one and only concern. There is no way a competent QA team with the proper direction wouldn't find this. It should literally be the first thing they are thinking about with every feature and addressing these issues should trump all other features and release dates. It's not hard. It's not tricky. It just takes the will and money to do so. We know they have the money, so it's obvious they don't give a single fuck.
Re: (Score:2)
I'd say prove it, but of course you can't... as there's no way to either reverse time or the direction of information flow.
You possess the clarity of hindsight, and how obvious something appears in retrospect cannot be reasonably assumed to be a reflection of how obvious it actually was. It's one thing to say that it *ought* to have been obvious (because you will get no argument from me that it should have been), but it's quite another to claim that it actually was obvious at the time.
Re: Testing (Score:2)
I could ask my QA team to review the feature as anecdotal evidence, but I doubt that's what you're looking for.
Re: (Score:2)
If you can somehow do that without telling them what was actually wrong, sure... Remember though, in the design flaw, the person creating the group chat would still have to be someone that the parent would have had to approve of chats with. The actual bug is that the people *other* than the person who created the group might be in the group chat and not approved. If your QA team can figure out that this sort of thing needed to have been tested without having it spelled out or being pointed in that direction, then I'll concede the point.
No Real Danger (Score:2)
If your kid's friends are conspiring to introduce them to molesters or abductors, you've got a much bigger problem that Facebook's child protections aren't going to help with.
Putting up barriers that require your child to actively circumvent them to encounter content and people that might be a bad influence makes sense, but this didn't let random strangers initiate chats with them; only their friends could.
If you're desperately trying to keep your child from chatting with a friend of a friend who you think is a bad influence, you probably have bigger problems than an app can solve.
Re: (Score:2)
You never know which normal-seeming child has an Uncle Toucher who babysits on weekends.
Re: (Score:2)
Seems like a useful feature to me... A group chat with an authorized friend of an authorized friend seems acceptable.
At the very least it should be an On/Off option, so the parent can choose to allow this, or specify a list of friends with whom their child can also interact in group chat. If the option is off and someone not on the list is in a group chat, then replace their messages on display with boilerplate text like "[Message from stranger suppressed]" and, vice versa, "[Cannot show messages from this stranger]".
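The suppression idea above can be sketched quickly. This is a hypothetical illustration of the commenter's suggestion, not a real Messenger Kids API: each message is shown only if the sender is parent-approved for the viewer, the friend-of-friend option is switched on, or the sender is on a parent-supplied extra allow-list.

```python
# Hypothetical sketch of the commenter's suggestion; names are invented.
SUPPRESSED = "[Message from stranger suppressed]"

def render_messages(viewer, messages, approved, allow_fof, extra_allowed):
    """Render group-chat messages for `viewer`, suppressing any sender
    who is neither approved, whitelisted, nor covered by the
    friend-of-friend (allow_fof) parental option."""
    out = []
    for sender, text in messages:
        ok = (
            sender == viewer
            or sender in approved.get(viewer, set())
            or allow_fof
            or sender in extra_allowed.get(viewer, set())
        )
        out.append(text if ok else SUPPRESSED)
    return out

approved = {"child_a": {"friend_b"}}          # parent-approved contacts
extra_allowed = {"child_a": set()}            # parent's extra allow-list
msgs = [("friend_b", "hi"), ("friend_d", "hello")]

# With the option off, the unapproved friend-of-friend is suppressed.
print(render_messages("child_a", msgs, approved, False, extra_allowed))
# -> ['hi', '[Message from stranger suppressed]']
```

Flipping `allow_fof` to `True` (or adding `friend_d` to the extra allow-list) would show both messages, which is exactly the per-parent choice the comment is arguing for.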
Article title could just have said: (Score:2)
and it would summarize pretty much any article about Facebook. There could even be a special Slashdot graphic to accompany all the other articles about FB design flaws. Eventually these would be posted so often that they could just prefix the post titles with ``FDF: ''.