Cellphones Government Security

South Korea's "Smart Sheriff" Nanny App Puts Children At Risk

Starting in April, the South Korean government required that cellphones sold to anyone below the age of 19 be equipped with approved monitoring software that would allow the user's parents to monitor their phone use, report their location, and more. Now, however, researchers have discovered that one of the most popular of the approved apps, called Smart Sheriff, may not actually be very smart to have on one's phone. Researchers from Citizen Lab and Cure53, at the request of the Open Technology Fund, have analyzed the code of Smart Sheriff, and found that it actually endangers, rather than protects, the users. Reports the Associated Press, in a story carried by the Houston Chronicle: Children's phone numbers, birth dates, web browsing history and other personal data were being sent across the Internet unencrypted, making them easy to intercept. Authentication weaknesses meant Smart Sheriff could easily be hijacked, turned off or tricked into sending bogus alerts to parents. Even worse, they found that many weaknesses could be exploited at scale, meaning that thousands or even all of the app's 380,000 users could be compromised at once.
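The transport and authentication flaws described above come down to two well-understood mistakes: sending sensitive fields as plaintext, and not verifying who you are talking to. The sketch below is purely illustrative (the payload field names are invented, not Smart Sheriff's actual API), but it shows with Python's standard library why plain-HTTP form data is trivially readable on the wire, and what a properly verified TLS context looks like.

```python
import ssl
import urllib.parse

# Hypothetical child-profile payload of the sort the audit found in transit;
# the field names here are invented for illustration.
payload = {"phone": "010-0000-0000", "birth": "2004-05-01"}

# Insecure pattern: the body of a plain-HTTP POST is just this byte string,
# readable by any on-path observer (ISP, open Wi-Fi, etc.).
wire_bytes = urllib.parse.urlencode(payload).encode("ascii")
print(wire_bytes)  # b'phone=010-0000-0000&birth=2004-05-01'

# Safer pattern: a default TLS context encrypts the channel AND verifies the
# server's certificate and hostname, which also blocks the kind of
# man-in-the-middle hijacking the researchers demonstrated.
context = ssl.create_default_context()
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True
```

An `http.client.HTTPSConnection` created with such a context refuses to talk to a server whose certificate does not check out, which is the behavior an app handling children's data should treat as non-negotiable.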
Comments Filter:
  • by Anonymous Coward on Monday September 21, 2015 @02:27AM (#50565013)

    There will always be shoddy code that makes it into apps, though this is pretty awful and unacceptable. I'm also really troubled by the government mandate that such a program be installed on children's phones. Shouldn't it be up to the parents if they want this level of monitoring or not? Also, can't this be implemented by wireless carriers in a secure fashion by monitoring traffic from the device instead of apps on the phone? Surely such a thing would be more secure and probably a lot harder to circumvent. Why is the government of South Korea turning into a nanny state and requiring something that should be solely the decision of the parents?

    • by sociocapitalist ( 2471722 ) on Monday September 21, 2015 @04:58AM (#50565285)

      There will always be shoddy code that makes it into apps, though this is pretty awful and unacceptable. I'm also really troubled by the government mandate that such a program be installed on children's phones. Shouldn't it be up to the parents if they want this level of monitoring or not? Also, can't this be implemented by wireless carriers in a secure fashion by monitoring traffic from the device instead of apps on the phone? Surely such a thing would be more secure and probably a lot harder to circumvent. Why is the government of South Korea turning into a nanny state and requiring something that should be solely the decision of the parents?

      The question as always is, who profits?

      Follow the money spent on this crapp and you'll know the 'why' of it.

    • by Ecuador ( 740021 ) on Monday September 21, 2015 @06:36AM (#50565505) Homepage

      It is just the first step of a two step process to protect the South Korean youth from one of the five most common causes of injury or death [wikipedia.org]. The second step is to install gps devices on all electric fans. When the system shows a youth in the same location as a working fan for more than 30 minutes, the authorities will be alerted.

    • This is a country in which it is widely believed that you can die from sleeping in a room with a fan because the fan will suffocate you. Common sense doesn't seem widely available there, and given their other, existing nanny-state crime/drug/gun laws, this is just par for the course for them.
  • by PolygamousRanchKid ( 1290638 ) on Monday September 21, 2015 @02:40AM (#50565031)

    . . . and then they won't worry about being spied on by the government later in their lives.

    I find this Korean law very creepy. I think that "trust" is one of the most important aspects of the parent-child relationship. If parents need to spy on their children . . . there is a lack of trust.

    • They have a point, though: the Surveillance Age is upon us and it's not going away, ever. Instead of wringing our hands we should learn to cope with the new reality, and part of it is teaching our children about the new normality. They will have to exist within it, after all, and the sooner they get used to being watched over the better. It will teach them useful skills like separating what they are from what they must appear to be, and actively work towards becoming what they must appear to be, renouncing

    • If parents need to spy on their children . . . there is a lack of trust.

      Sometimes there are children who just cannot be trusted to make good decisions - who have proven that they are going to make bad decisions over and over again.

      And sometimes children just make mistakes and in this unkind world such mistakes can be very dangerous indeed.

      Case in point: http://www.bbc.com/news/magazi... [bbc.com]
      Snippings from that article for your convenience:
      "When we got into her Facebook account, we realised that she had a profile that we didn't know about..."
      "Sixteen days after Karen disappeared, she

    • Start them young and they will despise surveillance all their life.

      Trust me. My dad did his best to keep me under the magnifying glass. All it did was turn me into the "privacy from my cold, dead hands" person I am today.

    • by gweihir ( 88907 )

      Without trust, society eventually collapses. Yes, it is that bad. I agree though on this being an obvious step in "conditioning" the children to find the surveillance normal.

  • Daaaamn, that is a train wreck of an app. There's nothing at all that excuses such a complete disaster security-wise. Those issues are the kind that should have been caught by even a completely cursory security review of the app, though anybody doing their job here damn well should have insisted on a lot more than a cursory review.

    So... what was the approval process for these apps like? Who approved this app? How nice is their new yacht?

    • by Anonymous Coward on Monday September 21, 2015 @02:59AM (#50565057)

      But not as bad as GMail...

        * Storing most people's mail at one single company
        * That company making its money from sifting through the contents of the email
        * That company being based in the US
        * The US doing its best to be a dick about privacy

      Now THAT is a proper train-wreck waiting to happen.

      • by Anonymous Coward

        The only one that decided to use GMail was you.

        • by _merlin ( 160982 )

          Not always. There are people who use gmail for work, or who host their business e-mail on gmail. So you could be sending e-mail to an apparently innocuous address that's actually Google-hosted. Then even if what you receive isn't all being slurped up, a significant proportion of what you send may be.

          • Anyone who acts as if an email isn't an electronic postcard is fooling themselves. No unencrypted email is even remotely private, whether I use Hillary's server or Gmail's.

      • by Nyder ( 754090 )

        But not as bad as GMail...

        * Storing most people's mail at one single company

        * That company making its money from sifting through the contents of the email

        * That company being based in the US

        * The US doing its best to be a dick about privacy

        Now THAT is a proper train-wreck waiting to happen.

        How much do you get paid to do posts like this? I'm broke for this upcoming holiday season and I think I can spread FUD about whatever company like the best of ya.

  • It isn't exactly news, even if it's impolite to say so in as many words, that there are a lot of architectural similarities between the remote management tools used for IT admin work, the 'child safety' remote monitoring stuff, and good old fashioned spyware. The main difference is in who authorizes the deployment (and the license fees).

    Now, we have a market where use of this software is mandated, which means that there is going to be a race to the bottom to put the cheapest-possible product that ticks th
  • by drolli ( 522659 ) on Monday September 21, 2015 @06:51AM (#50565543) Journal

    I lived in Japan for some years. In 2008, mobiles with such functions started to appear in Japan. My boss (Japanese) told me his daughters (around 12/15 back then) got phones with such functions. I asked him what the function exactly does, whether it can be triggered by the children, whether it can be triggered by the parents, whether it logs the position all the time, and how the connection is secured.

    He was not interested, but just said that his wife (a housewife) decided on the phone and that he did not get into the "details". The funny part is: my boss had a PhD in physics and we worked in a field related to cryptography.

    So I wonder: people are so fucking uninterested in what their kids are doing that they don't even go "into the details" when they actually could; this brings me to the conclusion that the money they spend on these apps is "just to do something about something and feel better since it costs money" instead of talking to their kids and making real, respectful decisions.

    Give your child a panic button - ok. Give your child something which is triggered by specific circumstances - ok. Put your child on an electronic leash - and you will wonder why your child easily cuts the leash at some point, without you noticing.

  • Sue the government. Hahahahaha!
  • Duh! (Score:5, Insightful)

    by gstoddart ( 321705 ) on Monday September 21, 2015 @09:02AM (#50565965) Homepage

    When will people start to realize that all of the shit they do because they think it will solve one technology problem usually creates another one?

    If you start putting in an app to track your children and monitor what they do ... any exploit in that is going to have really bad results. And your band-aid solution slapped together is always going to have exploits. If you poke holes in encryption for law enforcement, law enforcement will never be the only ones who can exploit those holes.

    As long as corporations aren't under any legal standard for encryption and security and bear no penalty for doing a bad job, this will always happen. Because they write the stuff which looks cool in a demo, and they may or may not ever get around to realizing they've been totally inept at security. And if they do realize they've been inept at security, they're likely to do nothing.

    Almost without fail, these schemes of "won't someone think of the children" or "yarg, teh terrorists" end up with stupid solutions being implemented by people without a clue. And almost without fail someone loudly says "this has huge holes and issues in it and won't work".

    And almost without fail, this proves to be true.

    So, this is unfortunate. But, it's also something which was pretty much 100% predictable as something doomed to fail ... because the people demanding it, and the people implementing it are seldom aware of, or qualified to deal with, the security holes created by shit like this.

    This was kind of inevitable from the start.

    If you institute something to track your children under the guise of protecting your children ... you better be damned sure you're doing it to the highest possible standard. Otherwise, all you're doing is creating the situation where you're going to make this information available to someone else.

    • When will people start to realize that all of the shit they do because they think it will solve one technology problem usually creates another one?

      I'm hoping, but I'm not certain they will. Safety Culture has run amok, and in those weird twists that cultures are capable of, they'll just claim "See? SEE? We'v gotta duz more to keep R chidlren save!!! Look at how this system can be violated! ERMAGHERD We gotta DO sumpin!"

      Pretty good gig when Safety culture can cause a problem, then have no responsibility, and then demand another layer of protection.

      I'm foreseeing the day, and soon - that there will be gps enabled shock collars for children that can
