'They're Basically Lying' - Mental Health Apps Caught Secretly Sharing Data (theverge.com)
"Free apps marketed to people with depression or who want to quit smoking are hemorrhaging user data to third parties like Facebook and Google -- but often don't admit it in their privacy policies, a new study reports..." writes The Verge.
"You don't have to be a user of Facebook's or Google's services for them to have enough breadcrumbs to ID you," warns Slashdot schwit1. From the article: By intercepting the data transmissions, they discovered that 92 percent of the 36 apps shared the data with at least one third party -- mostly Facebook- and Google-run services that help with marketing, advertising, or data analytics. (Facebook and Google did not immediately respond to requests for comment.) But about half of those apps didn't disclose that third-party data sharing, for a few different reasons: nine apps didn't have a privacy policy at all; five apps did but didn't say the data would be shared this way; and three apps actively said that this kind of data sharing wouldn't happen. Those last three are the ones that stood out to Steven Chan, a physician at Veterans Affairs Palo Alto Health Care System, who has collaborated with Torous in the past but wasn't involved in the new study. "They're basically lying," he says of the apps.
Part of the problem is the business model for free apps, the study authors write: since insurance might not pay for an app that helps users quit smoking, for example, the only way for a free app developer to stay afloat is to either sell subscriptions or sell data. And if that app is branded as a wellness tool, the developers can skirt laws intended to keep medical information private.
A few apps even shared what The Verge calls "very sensitive information," like self-reports about substance use and user names.
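For readers curious how researchers can observe this sharing at all: the usual approach is to route the phone's traffic through an intercepting proxy and watch which hosts each app contacts. Below is a minimal, hypothetical sketch using mitmproxy; the script name and the domain list are my own illustration, not the study's actual tooling. Note that HTTPS interception also requires installing the proxy's CA certificate on the device, and apps that pin certificates won't be visible this way.

    # third_party_audit.py -- hypothetical sketch: log app requests that go to
    # well-known analytics/advertising hosts. Run with:
    #   mitmdump -s third_party_audit.py
    from mitmproxy import http

    # Illustrative list only; a real audit would use a maintained tracker blocklist.
    THIRD_PARTY_DOMAINS = (
        "graph.facebook.com",
        "app-measurement.com",   # Firebase / Google Analytics
        "crashlytics.com",
    )

    seen = set()

    def request(flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        for domain in THIRD_PARTY_DOMAINS:
            if host == domain or host.endswith("." + domain):
                # Dedupe on host + path so repeated pings aren't logged twice.
                key = (host, flow.request.path.split("?")[0])
                if key not in seen:
                    seen.add(key)
                    print(f"[third-party] {flow.request.method} {host}{key[1]}")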
"You don't have to be a user of Facebook's or Google's services for them to have enough breadcrumbs to ID you," warns Slashdot schwit1. From the article: By intercepting the data transmissions, they discovered that 92 percent of the 36 apps shared the data with at least one third party -- mostly Facebook- and Google-run services that help with marketing, advertising, or data analytics. (Facebook and Google did not immediately respond to requests for comment.) But about half of those apps didn't disclose that third-party data sharing, for a few different reasons: nine apps didn't have a privacy policy at all; five apps did but didn't say the data would be shared this way; and three apps actively said that this kind of data sharing wouldn't happen. Those last three are the ones that stood out to Steven Chan, a physician at Veterans Affairs Palo Alto Health Care System, who has collaborated with Torous in the past but wasn't involved in the new study. "They're basically lying," he says of the apps.
Part of the problem is the business model for free apps, the study authors write: since insurance might not pay for an app that helps users quit smoking, for example, the only ways for free app developer to stay afloat is to either sell subscriptions or sell data. And if that app is branded as a wellness tool, the developers can skirt laws intended to keep medical information private.
A few apps even shared what The Verge calls "very sensitive information" like self reports about substance use and user names.
There are no "free" apps (Score:5, Informative)
People, start understanding that there are no free apps. If they are free, you are the product, and they will try to monetize your data any way they can and damn the consequences for you. The only apps that are actually free are small stuff where somebody did a demo and shows it to the world for bragging rights. The nature of these will always be obvious. Anything else needs to monetize in some way, and that will almost universally be user data.
Now, doing this to people with mental health issues is an extreme moral low. I do seriously hope these people will not get any good reincarnation options for a long, long time.
Re:There are no "free" apps (TANSTAAFL) (Score:2)
You forgot to say TANSTAAFL there. The exponent of beer in my sig should be higher, but "free beer" as in "free lunch" is one of them. You're still focusing on the obvious.
My favorite solution approach is in a longer comment below, but I'd be glad to read of a better one. Even better to see any good solution implemented in the real world.
Re: (Score:2)
You are wrong, and fundamentally so. There is the occasional free lunch, and, if you look at FOSS as an example, there can be a massive amount of it. There are also situations where, due to various effects, it is not you that pays. You just need to be careful to find out whether something is actually free or not. And, if unsure, you should expect it to be non-free.
So, no, I am not "still" focusing on the obvious (condescend much?), I actually have a better understanding of the situation than your simplistic
Re: (Score:1)
No, that is NOT what I wrote. Not remotely.
Apparently you were unable to understand that I was mostly agreeing with you. Perhaps you are that confused by your own writing? Or are you simply that confused about what "free" means?
Feel "free" [one of the alternative senses of "free beer" referenced in my sig] to attempt to convince me that this "discussion" has not been terminated.
Re: (Score:2)
I'm less familiar with the US regulations, but certainly in the GDPR era in Europe, health data would be considered sensitive personal data, which requires additional protections. In particular, some of the more generic/ambiguous options for a lawful basis for processing are insufficient on their own, and you need a stronger basis such as explicit, opt-in consent. If these apps didn't have one, the penalties if the data protection regulators decide to make a point could pose an existential threat to a typic
Re: (Score:3)
Re: (Score:2)
health "insurance" companies write all of our health care laws
Nope. Personal Injury lawyers write them.
Re: (Score:3)
health "insurance" companies write all of our health care laws
Nope. Personal Injury lawyers write them.
Children, children, they take turns depending on who wins the election; everybody knows that. And tie your shoes before you trip and hurt yourself.
Re: (Score:2)
This is probably resolved simply because it is not health data: it is just things the users said themselves. Proper "health data" is created by a medical professional.
Re: (Score:3)
People, start understanding that there are no free apps. If they are free, you are the product and they will try to monetize your data any way they can and damn the consequences for you.
Damn, I'll need to ditch Emacs, then.
Re: (Score:2)
Emacs is not an "app". The "app" ecosystem does not have a lot of actual FOSS.
What about all the FOSS apps? (Score:2)
Also, F-Droid is brimming with Free apps.
Re: (Score:3)
People, start understanding that there are no free apps.
That's only true in your branded AppyApp Store(TM)(R)(KFC).
Over at F-Droid a lot of the apps are Free as in Freedom, and they don't phone home at all, they don't display ads at all, and they don't have any mechanism to profit off of users at all.
They're called "App Stores" for the same reasons a grocery store is called a store. But you're still allowed to eat the fruit that grows in the park. For free. But you won't find that at the store.
Some people think they need to pay to grow a copy of some data. Other peop
Re: (Score:2)
You misunderstand. Of course there are free apps. But the default assumption needs to be that they are not free, unless there is sound evidence to the contrary. For the average person, this simplifies down to "there are no free apps". Yes, it is hyperbole to say that, but anybody who gets hurt by this is already not too smart, and you have to use simple, absolute statements to get through to them.
So tell us the truth about the business models (Score:3)
I get kind of tired of old problems with old solutions. Especially obvious solutions. "Caveat emptor" is just a cheap dodge.
If the google and Apple actually wanted to stop these kinds of abusive applications, then they should help the potential suckers know when to avoid them. The obvious solution approach is to address the apps' financial models as directly as possible.
As it could be implemented in Google Play (since I'm on the Android side, but if Apple was doing it on the iPhone side I'm sure I'd have heard of it by now), each app should have a section (or a tab) about the financial model. In most cases the developer would be able to pick from a small number of frequently used business models. That part of the financial-model section would be controlled by the developer, but there would be a second section where the google offered their response. For most of the obvious options, the google would be in a position to say "Yes, that's what seems to be going on here", "Maybe, but we can't tell", or "We know nothing" (much as it pains the google to admit ignorance).

Of course the developer should be able to ignore the standard options and say something else or add comments about the standard models in use, but that would increase the likelihood of a "Buyer beware" or "Be extra wary" response from the google. (However, the google could work with developers to allow for such complicated evaluations as "The developer has privately shared financial information with us [the google] that supports the above description of the funding of this application. We concur that additional information would jeopardize the application's success or be excessively helpful to competitors.")
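To make that concrete, here is a rough, hypothetical sketch of how such a disclosure could be modeled; every name below is my own invention, and no such field exists in Google Play or the App Store today.

    # Hypothetical data model for a store-listing "financial model" disclosure.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class BusinessModel(Enum):
        PAID_UPFRONT = "paid upfront"
        SUBSCRIPTION = "subscription"
        ADS = "advertising"
        IN_APP_PURCHASES = "in-app purchases"
        DATA_MONETIZATION = "sells or shares user data"
        HOBBY_OR_DEMO = "hobby/demo, no revenue expected"
        OTHER = "other (see developer notes)"

    class StoreVerdict(Enum):
        CONFIRMED = "Yes, that's what seems to be going on here"
        UNVERIFIED = "Maybe, but we can't tell"
        UNKNOWN = "We know nothing"

    @dataclass
    class FinancialDisclosure:
        declared_models: list[BusinessModel]                # filled in by the developer
        developer_notes: Optional[str] = None
        store_verdict: StoreVerdict = StoreVerdict.UNKNOWN  # filled in by the store
        store_notes: Optional[str] = None

    # Example: a "free" wellness app whose declared model looks incomplete.
    listing = FinancialDisclosure(
        declared_models=[BusinessModel.SUBSCRIPTION],
        store_verdict=StoreVerdict.UNVERIFIED,
        store_notes="Advertising/analytics SDKs detected; declaration may be incomplete.",
    )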
Per my sig, meaningful freedom is about informed choice, but without knowledge of the financial model when we download apps... Well, this story is just focusing on one flavor of the wholesale abuse that is resulting.
you'd have to be mad (Score:3)
IOW, you'd have to be mad to believe them?
Re: Center For Self Leadership is the absolute wor (Score:1)