Many More Android Apps Leaking User Data 299

Posted by CmdrTaco
from the because-they-can dept.
eldavojohn writes "After developing and using TaintDroid, several universities found that of 30 popular free Android apps, half were sharing GPS data and phone numbers with advertisers and remote servers. A few months ago, one app was sending phone numbers to a remote server in China, but today the situation looks a lot more pervasive. In their paper (PDF), the researchers blasted Google, saying 'Android's coarse grained access control provides insufficient protection against third-party applications seeking to collect sensitive data.' Google's response: 'Android has taken steps to inform users of this trust relationship and to limit the amount of trust a user must grant to any given application developer. We also provide developers with best practices about how to handle user data. We consistently advise users to only install apps they trust.'"
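TaintDroid detects these leaks by dynamic taint analysis: sensitive values (GPS fix, phone number, IMEI) are tagged at their source, the tag follows the data through every operation that derives from it, and an alert fires when tagged data reaches a network sink. The following is a minimal Python sketch of that idea; the class and function names are illustrative, not TaintDroid's actual Dalvik-interpreter implementation.

```python
# Sketch of dynamic taint tracking (the technique behind TaintDroid).
# All names here are hypothetical; real taint tracking happens inside
# the VM, invisibly to the app.

class Tainted:
    """A value carrying a set of taint labels, e.g. 'GPS' or 'PHONE_NUMBER'."""
    def __init__(self, value, labels):
        self.value = value
        self.labels = set(labels)

def combine(a, b):
    """Taint propagates: anything derived from tainted data is tainted."""
    labels = set()
    for x in (a, b):
        if isinstance(x, Tainted):
            labels |= x.labels
    va = a.value if isinstance(a, Tainted) else a
    vb = b.value if isinstance(b, Tainted) else b
    result = str(va) + str(vb)
    return Tainted(result, labels) if labels else result

def network_send(host, payload):
    """Sink: report any taint labels that reach the network."""
    if isinstance(payload, Tainted):
        return sorted(payload.labels)  # leak detected, report labels
    return []

location = Tainted("51.50,-0.12", {"GPS"})   # source: tagged at the GPS API
msg = combine("loc=", location)              # taint survives string building
leaks = network_send("ads.example.com", msg)
print(leaks)  # → ['GPS']
```

The point of the technique is that the app never has to cooperate: the label travels with the data no matter how many assignments and transformations happen between the source API and the socket.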
This discussion has been archived. No new comments can be posted.

  • But how? (Score:5, Insightful)

    by Drakkenmensch (1255800) on Thursday September 30, 2010 @12:37PM (#33749636)

"We also provide developers with best practices about how to handle user data. We consistently advise users to only install apps they trust."

    How exactly is one supposed to do this? What is the process for building trust vis-a-vis apps when the only protection you receive from your service provider is "don't walk into dark alleys you don't trust"?

  • by Nadaka (224565) on Thursday September 30, 2010 @12:38PM (#33749658)

    Not only the ability to display what permissions an app requests, but the ability to deny the use of those features on a per feature basis for each app.

    For instance, if an app requests internet access (cellular radio or wifi), the user should be able to limit that to just wifi, or even turn off connectivity for that app altogether.
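The per-feature control the comment asks for can be modeled as a grant set the user edits after install. This is a hypothetical sketch of the idea (2010-era Android offered no such revocation; permission names here are made up):

```python
# Hypothetical per-feature permission model: the user keeps the app
# but revokes individual permissions it requested at install time.

class AppPermissions:
    def __init__(self, requested):
        # Everything requested starts out granted, as on stock Android.
        self.granted = set(requested)

    def revoke(self, permission):
        """User withdraws a single permission without uninstalling."""
        self.granted.discard(permission)

    def allows(self, permission):
        return permission in self.granted

perms = AppPermissions({"INTERNET_WIFI", "INTERNET_CELLULAR", "FINE_LOCATION"})
perms.revoke("INTERNET_CELLULAR")   # keep the app, but make it wifi-only

print(perms.allows("INTERNET_WIFI"))      # → True
print(perms.allows("INTERNET_CELLULAR"))  # → False
```

Every privileged API call would consult `allows()` before proceeding, so revocation takes effect without the app's cooperation.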

  • by sotweed (118223) on Thursday September 30, 2010 @12:38PM (#33749666)

    It is hard enough to know if I should trust my child, and I raised him. He doesn't tell me much. App developers tell me less, and some of them are devious. This is not a good security model. And Google knows better.

  • by inviolet (797804) <slashdot@@@ideasmatter...org> on Thursday September 30, 2010 @12:39PM (#33749678) Journal

    "Android has taken steps to inform users of this trust relationship and to limit the amount of trust a user must grant to any given application developer. We also provide developers with best practices about how to handle user data. We consistently advise users to only install apps they trust." -- Google

    What a bunch of fluff. The relevant developers don't care about "best practices" or any other voluntary standard. And how the f*** are users supposed to establish trust in certain apps? The platform does not significantly monitor an application's ongoing behavior, nor is anyone performing serious code-reviews or blackbox testing. Google COULD HAVE set up profiling tests similar to those run in TFA, but didn't.

    For ONCE would a company please admit that they reduced privacy in order to provide the dumbed-down usability needed to capture market share and attract developers?

  • Re:15 of the 30... (Score:5, Insightful)

    by wgaryhas (872268) on Thursday September 30, 2010 @12:42PM (#33749740)
    Being able to know where you are and when isn't personal information?
  • Re:Bye Bye Droid (Score:2, Insightful)

    by Nocuous (1567933) on Thursday September 30, 2010 @01:03PM (#33750118)

    Can I buy your phone? Serious question. Must accept SIM cards and be 3G.

    He doesn't have a phone for you to buy. He's a "magical! revolutionary!" fanboi troll.

  • by Specter (11099) on Thursday September 30, 2010 @01:18PM (#33750360) Journal

    ^ this.

    This is the value of the App Store that geeks/developers consistently underrate. Apple's walled garden provides a barrier to entry that helps to reduce the risk of ending up with a fart app that's also downloading your private banking information to China.

    Google's free-for-all Marketplace is a real risk to Android's long-term success because it sets up Android phones to become the must-see destination for viruses, malware, and other shady operations. How long do you think it's going to be before having an Android anti-virus application is a practical requirement? What the uber-geek sees as the positive benefits of the Android eco-system (freedom and unlimited choices) are in fact NEGATIVE attributes to most of the rest of the mobile phone consuming populace. It's sorta like Android is the Linux of mobile phones...oh wait.

    I enjoyed the EVO vs. iPhone YouTube video as much as anyone but more than a funny rip on Apple, it's also a perfect demonstration of how a lot of the technical community doesn't get it. Android's popular because the iPhone is hard to get and it's a pretty respectable facsimile of an iPhone, not because it has more WIFIs and GBs than Apple. When rogue apps start to make Android painful to use and own expect consumers to start looking for The Next Big Thing (tm).

  • by Terazilla (1545215) on Thursday September 30, 2010 @01:24PM (#33750460)
    I don't get it: why is this being positioned as an Android problem? Last I checked, iPhone apps aren't even required to tell you what data they use in the first place -- is there an iPhone equivalent to the "uses internet access", "uses coarse location services" page that the Android Market displays to you? There's a ton of iPhone, Blackberry, Palm, etc. apps using advertising support, which is what the vast majority of this article is finger-pointing at.

    Nobody, at any marketplace service, is going to have time to do a code review of everything that gets submitted. Even console games -- which have a months-long and intensely painful approval process the likes of which you've never seen -- don't do code review. The very concept is ridiculous, there's way too much code and way too many people involved. You're going to have to trust your developers folks, and make use of the user-ratings tools if you don't.

    Android's model of showing you what special access the software uses is about as good as I think you can get in the real world without learning to use a packet sniffer. RIM's ability to disable individual types of access is cool as well, but if the software needs it to function (or says it does), I'm not sure how the user is supposed to be in a position to use it intelligently. To avoid this sort of data-harvesting problem, they'd have to somehow psychically know that the contact manager they're trying out uses that internet access for more than the occasional ad serve, and how would they know that?
  • by RocketScientist (15198) * on Thursday September 30, 2010 @01:28PM (#33750522)

    "We consistently advise users to only install apps they trust."

    How the hell am I supposed to know that? Compile and review every line of source myself? Sorry, I have a day job.

    Maybe I'll just find some application marketplace where they (1) certify apps are safe and perform well, and (2) don't violate my privacy by sending data around without my permission. That'd be an awesome idea. Some kind of marketplace that would actually verify that the application works on my device, does what it says it does, and behaves itself. That's a service I'd really pay for.

    Oh wait, I do pay for that.

    Welcome to iPhone.

  • by d_engberg (226359) on Thursday September 30, 2010 @01:29PM (#33750552)

    The headline doesn't really match the contents of the paper as far as I can tell.
    For example, "Evernote" is listed in the paper for:
    1) Taking pictures with the camera
    2) Recording audio with the microphone
    3) Determining your location
    And for transmitting this data to its servers.

    These functions are, however, exactly what the application is designed for. You take notes (including snapshot notes and voice notes) and upload them to your account. When you launch the app, there are big buttons for "take a snapshot note", "take an audio note", etc. Geo-tagging via the location APIs can be disabled from the Settings page, but this is another core advertised feature of the product.

    So this is a bit like making it into Slashdot by discovering that a mail client transmits text that you type (and your email address!) to a mysterious "SMTP" server.
    Headline: "Researchers discover nefarious 'e-mail' application leaking your data ... on the INTERNET!"

  • by Anonymous Coward on Thursday September 30, 2010 @02:02PM (#33750992)

    Sorry to piss on the fanbois' flames spouting "the iPhone's walled garden is much safer" and other such uninformed crap:
    the iPhone App Store's dirty secret is that it's worse, much worse.

    http://www.slashgear.com/iphone-spyware-debated-as-app-library-phones-home-1752491/ [slashgear.com]

    http://gadgets.boingboing.net/2009/04/13/pinch-media-statisti.html [boingboing.net]

  • by 99BottlesOfBeerInMyF (813746) on Thursday September 30, 2010 @02:10PM (#33751106)

    ...is there an iPhone equivalent to the "uses internet access", "uses coarse location services" page that the Android Market displays to you?

    Yes. Both systems use similar schemes for jailing apps, with user permissions for access to various services.

    There's a ton of iPhone, Blackberry, Palm, etc. apps using advertising support, which is what the vast majority of this article is finger-pointing at.

    True, but most are transitioning to iAd, which divorces the advertiser and location services from one another such that it is not so much of a privacy concern... at least if you trust Apple to do what they say (as opposed to every app developer).

    Nobody, at any marketplace service, is going to have time to do a code review of everything that gets submitted.

    Well, they could if they put the resources into it. It might even be important enough to end users if malware becomes a real issue on mobile platforms. That said, while they can't review all the code for every app, they certainly can review the ACLs for every app, which spell out what an app is and is not allowed to access, to see if the app makes sense. You don't need to read the code for a "flashlight" app if you look at the ACL and see it wants to access location and internet and the phone number list. After that you can look at the code a little and test it to see what it actually tries to do, much of which can be automated. We have enough experience automagically detecting the existence of malware these days that we can weed out a good percentage that way.
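The flashlight example above is mechanizable: compare an app's requested permissions against what its category plausibly needs and flag the excess. A hypothetical sketch (the category table and permission names are made-up illustrations, not any store's real policy):

```python
# Sketch of an ACL-review heuristic: flag permissions that fall outside
# what an app of a given category should plausibly need. The EXPECTED
# table and permission names are illustrative assumptions.

EXPECTED = {
    "flashlight": {"CAMERA_FLASH"},
    "weather":    {"INTERNET", "COARSE_LOCATION"},
}

def suspicious_permissions(category, requested):
    """Return requested permissions beyond the category's expected set."""
    return sorted(set(requested) - EXPECTED.get(category, set()))

flagged = suspicious_permissions(
    "flashlight",
    ["CAMERA_FLASH", "INTERNET", "FINE_LOCATION", "READ_CONTACTS"],
)
print(flagged)  # → ['FINE_LOCATION', 'INTERNET', 'READ_CONTACTS']
```

A human reviewer would still make the final call (a "weather" app legitimately needs location, as the Evernote comment below notes for its own APIs), but a cheap pass like this prunes the review queue to the apps whose ACLs don't add up.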

    You're going to have to trust your developers folks, and make use of the user-ratings tools if you don't.

    I don't want to have to trust developers. That's what access control is all about: letting me safely run software from people I don't trust, and trusting as little as possible to get what I want.

    Android's model of showing you what special access the software uses is about as good as I think you can get in the real world without learning to use a packet sniffer.

    Sadly, that's still pretty useless to the average user. What users really want are vetted apps tied to real developers so that they know someone looked to see if it is malware and they have two someones to sue if it is discovered to be malware.

    Ideally, the system could be more open than Apple's model, where they weakly vet apps and, if their efforts are poor, the user has no recourse. Better yet would be a system where various organizations (Google, phone makers, security companies, security organizations, government agencies, etc.) all vet apps based upon the ACLs included with those apps, and the result is weighted based upon the security feeds and how the end user has weighted them. Some could even be pay services, like anti-malware software is now.

    RIM's ability to disable individual types of access is cool as well, but if the software needs it to function (or says it does) I'm not sure how the user is supposed to be in a position to use it intelligently.

    I might note, if software requires you to tell it your location to function at all, there's no reason the OS can't hand it dummy data when the user says "No" to the permission dialogue. It's harder for internet access, since the app can test that easily.
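The "dummy data" idea in the comment above can be sketched in a few lines: when the user denies the location permission, the OS returns plausible fake coordinates instead of an error, so an app that insists on location still runs but learns nothing real. The function names and coordinates below are illustrative, not any platform's API:

```python
# Sketch of feeding dummy data to an app on permission denial.
# All names are hypothetical stand-ins for platform internals.

def read_gps_hardware():
    """Stand-in for a real GPS read."""
    return (40.7128, -74.0060)

def get_location(user_granted):
    """OS-side gate: real fix if granted, dummy fix if denied.

    Returning a value either way means the app has no error path
    from which to infer that it was denied.
    """
    if user_granted:
        return read_gps_hardware()
    return (0.0, 0.0)  # fixed dummy coordinates

print(get_location(False))  # → (0.0, 0.0)
print(get_location(True))
```

As the comment notes, the same trick is harder for internet access, because an app can trivially probe whether its traffic actually went anywhere.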

  • by Specter (11099) on Thursday September 30, 2010 @02:31PM (#33751420) Journal

    I don't doubt that you're right or that Android will continue to be popular with the technically savvy. The risk for Android is that it puts Linux's chaos and complexity front and center in the mobile phone market and ends up burning out customers because people are overwhelmed with choices and malware. (Is it the year of the Linux desktop yet?)

    Let's face it: Apple doesn't police the App Store out of some Machiavellian power trip or pure altruism, they do it to protect their brand identity (and therefore their ability to demand a premium for their products). That it also happens to be a nice benefit for their customers is just a happy side-effect.

    Google's abdicated this role in the Marketplace and I think that's dangerous for the long term viability of Android as a mobile platform. Google isn't acting like it believes it has to care, but it should.

    If Nokia weren't so culturally opposed to anything they didn't invent themselves, this would be a grand opportunity for them: adopt Android and build a walled garden for Android in the Apple style. A variety of cutting-edge phones, with high-end features, global support from multiple carrier partners AND a protected/policed app store? It would be a game changer for both Android and Nokia, but they'll never do it. (Look up NIH syndrome and you'll get a redirect to Nokia's home page.)

  • by BasilBrush (643681) on Thursday September 30, 2010 @02:35PM (#33751478)

    All this article shows is that Android security sucks. The whole system of popping up a dialog to ask the user for technical permissions is fatally flawed, because most users don't understand it and will just hit yes to proceed.

    The iPhone doesn't have the same degree of problem, because this kind of stuff will mean an app won't get into the App Store. Or, if it manages to get through, it will be pulled rapidly once the security problem is discovered. That's one of the benefits of a single app store.

  • by MightyMartian (840721) on Thursday September 30, 2010 @02:46PM (#33751642) Journal

    Indeed. It just offloads the problem onto someone else. I have no more reason to trust that the guys at the App Store are going to be able to find sophisticated security holes. It's just another form of a false sense of security, with the added bonus that those bizarre Apple worshipers get to fit more snugly into Jobs' uterus, believing themselves safe because their God and Protector wouldn't dare let anything nasty get through.

  • by d_engberg (226359) on Thursday September 30, 2010 @03:51PM (#33752632)

    Right, the paper lists some common applications used by millions of people (BBC, Evernote, Weather Channel) that appear to be using the requested APIs for exactly what you'd expect. It lumps those in with a few obscure and sketchy ones doing nefarious things with those APIs. It makes no attempt to determine which apps are actually doing anything unexpected/evil, and which are behaving in exactly the way that a user would expect.

    The unfiltered list gets posted on Slashdot, showered with the obligatory snark and tinfoil.

    A first pass sanity check on the apps would have been more responsible.
    E.g. "The Weather Channel app sends my location to their servers ... could this have a legitimate purpose for telling me the weather?"
    This would have probably pruned the list of applications down to a handful of garbage ones that no one had ever heard of.

  • by MrHanky (141717) on Thursday September 30, 2010 @03:59PM (#33752744) Homepage Journal

    The problem with the article is that they label apps as "suspicious" when they work as intended. Bump, for instance, is an information sharing app. It's designed to share your contact info (if you choose so) with other phones. I can't imagine it isn't one of the two apps that transmit the phone number, IMSI, etc., to the app's server, as that's how it's supposed to work.

    Of course, Bump is also available for the iPhone through Apple's app store.

  • by ceoyoyo (59147) on Thursday September 30, 2010 @04:40PM (#33753342)

    The majority of the general cell phone using population is always going to be ignorant of security, and is always going to want someone else to deal with it.

    iOS is also quite secure by design. It is based on a real UNIX that also has very few wild viruses. iOS has had a couple of bad remote exploits in its existence, both of which were fixed pretty fast. Android (just like Linux and any other OS) has some too. Fixing them in Android might actually be problematic, as many carriers seem to take the view that OS upgrades are optional. Both systems are inherently as vulnerable to trojans as anything else. The difference is, Apple does a pretty thorough job of prescreening, and doesn't let you install pretty.scr that your friend emailed you. Google doesn't. And tossing your users out to look after their own security doesn't work. Otherwise Windows would be the safest OS.

    Google is going to have to step up before something bad and widespread happens. If they don't, someone else, probably the carriers, will do it for them. And if you think Apple is repressive, you've clearly forgotten what (popular) cell phones were like before the iPhone.

  • by bonch (38532) on Thursday September 30, 2010 @05:20PM (#33753728)

    You don't know how good Apple's security screening is, so you just choose to trust them for no reason whatsoever.

    You're trusting them because if they fuck up, it's on their hands, and they potentially lose you as a customer.

  • by scot4875 (542869) on Thursday September 30, 2010 @05:35PM (#33753860) Homepage

    The only reason it got pulled was because it was doing something *Apple* didn't want it to do, not because it was doing something the *users* didn't want it to do.

    Do you have a list of applications that have been pulled from the Apple website because they were data mining their users? If not, you have no evidence that Apple cares about this at all.

    --Jeremy
