Academics Probe Apple's Privacy Settings and Get Lost and Confused (theregister.com)
Matthew Connatser reports via The Register: A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options. The research was conducted by Amel Bourdoucen and Janne Lindqvist of Aalto University in Finland. The pair noted that while many studies had examined privacy issues with third-party apps for Apple devices, very little literature investigates the issue in first-party apps -- like Safari and Siri. The aims of the study [PDF] were to investigate how much data Apple's own apps collect and where it's sent, and to see if users could figure out how to navigate the landscape of Apple's privacy settings.
The lengths to which Apple goes to secure its ecosystem -- as described in its Platform Security Guide [PDF] -- have earned it kudos from the information security world. Cupertino uses its hard-earned reputation as a selling point and as a bludgeon against Google. Bourdoucen and Lindqvist don't dispute Apple's technical prowess, but argue that it is undermined by confusing user interfaces. "Our work shows that users may disable default apps, only to discover later that the settings do not match their initial preference," the paper states. "Our results demonstrate users are not correctly able to configure the desired privacy settings of default apps. In addition, we discovered that some default app configurations can even reduce trust in family relationships."
The researchers criticize data collection by Apple apps like Safari and Siri, where that data is sent, how users can (and can't) disable that data tracking, and how Apple presents privacy options to users. The paper illustrates these issues in a discussion of Apple's Siri voice assistant. While users can ostensibly choose not to enable Siri in the initial setup on macOS-powered devices, it still collects data from other apps to provide suggestions. To fully disable Siri, Apple users must find privacy-related options across five different submenus in the Settings app. Apple's own documentation for how its privacy settings work isn't good either. It doesn't mention every privacy option, explain what is done with user data, or highlight whether settings are enabled or disabled. Also, it's written in legalese, which almost guarantees no normal user will ever read it. "We discovered that the features are not clearly documented," the paper concludes. "Specifically, we discovered that steps required to disable features of default apps are largely undocumented and the data handling practices are not completely disclosed."
Re: (Score:2)
iOS is no better than Android. And vice versa.
Re: (Score:2)
KaiOS on a flip phone?
TFA didn't mention that Apple keeps turning Bluetooth on at any excuse.
Re: (Score:2)
Options are a lot more limited than they used to be, but there are still a few.
Re: Not good (Score:2)
Only option is to make it illegal to collect user data beyond what's necessary for billing the customer.
Re: (Score:2)
Only option is to make it illegal to collect user data beyond what's necessary for billing the customer.
Yes, but the problem is that no one who has any ability to make this a thing is ever going to do it.
There's just too much money in play for this to be made a legal stricture, let alone enforced.
I'd love for it to be the case, but I don't see it happening.
Re: (Score:2)
You can still have a non-smart "feature" phone and learn texting with T9 again...
Re: (Score:3, Insightful)
But infinitely better than using an OS directly from a fucking advertising company!
No sympathy for anyone who complains a single word about privacy, but then uses any Google (or Facebook or Microsoft) related device.
Nice thing about android is that it is open source. I only buy phones I am able to load third party OS images from sources I trust. Can't imagine ever owning a mobile device without a full suite of application and network access controls.
Re: (Score:2)
But infinitely better than using an OS directly from a fucking advertising company!
No sympathy for anyone who complains a single word about privacy, but then uses any Google (or Facebook or Microsoft) related device.
Nice thing about android is that it is open source. I only buy phones I am able to load third party OS images from sources I trust. Can't imagine ever owning a mobile device without a full suite of application and network access controls.
Android is Open Source because of AOSP?!? That's a laugh!
That's like saying iOS and macOS are Open Source because of Darwin!
Because lawyers (Score:1)
It's pretty obvious. All lawyers know how to do is make things more complicated for everyone else... so that you'll have to pay them lots of money to sort it out. What a racket.
Re: (Score:2)
Ohhhh.. LAWYERS design the user interface! That explains Mac OS! Sorry for dupe post, first one was on wrong comment by mistake.
Slashdot 'could' allow editing.
And they could allow Rich Text and Unicode.
But they enjoy the Computer Priesthood too much.
Pinned certificates should be outlawed (Score:4, Insightful)
It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, allowing them to see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different from the ones corporations use with proxies. Any device owner should be allowed to proxy their traffic so they can see everything going to and from that device. This does not allow people on the internet to see the traffic, only the owner of both the device and the proxy being used.
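For context on what's being debated here: certificate pinning means an app validates the server against a key or certificate baked into the app itself, instead of (or in addition to) the OS trust store -- so even a root CA the device owner installs for their own debugging proxy won't let them inspect the traffic. Below is a minimal Swift sketch of public-key pinning with URLSession; the hash value is a placeholder, and hashing the raw key bytes is a simplification of real SPKI pinning.

```swift
import Foundation
import Security
import CryptoKit

// Minimal sketch: pin the server's public key and refuse connections whose
// key doesn't match -- even if a user-installed root CA vouches for the chain.
final class PinningDelegate: NSObject, URLSessionDelegate {
    // Base64 SHA-256 of the expected public key bytes (placeholder value).
    private let pinnedKeyHash = "REPLACE_WITH_REAL_KEY_HASH_BASE64"

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard let trust = challenge.protectionSpace.serverTrust,
              SecTrustEvaluateWithError(trust, nil),                 // ordinary chain validation first
              let chain = SecTrustCopyCertificateChain(trust) as? [SecCertificate],
              let leaf = chain.first,
              let key = SecCertificateCopyKey(leaf),
              let keyData = SecKeyCopyExternalRepresentation(key, nil) as Data?
        else {
            completionHandler(.cancelAuthenticationChallenge, nil)   // fail closed
            return
        }
        let hash = Data(SHA256.hash(data: keyData)).base64EncodedString()
        if hash == pinnedKeyHash {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // Pin mismatch: reject the connection even though the chain
            // itself validated (e.g. via a user-installed proxy root CA).
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

// Usage: URLSession(configuration: .default, delegate: PinningDelegate(), delegateQueue: nil)
```

The fail-closed branch is exactly the behavior this thread is arguing about: when the presented key doesn't match the pin, the connection is refused even when the owner's own proxy certificate chain validates.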
Re: (Score:2)
And the fact that it's a MITM attack is exactly why certificates get pinned: people were using them in attack scenarios.
The common case is a normal user who does no proxying of any kind and no network packet inspection. That's the 99% case, what 99% of people are doing. The 1% will be the researchers doing the analysis.
In this case, a user going to Google or Apple or whatever should have their session shut down if the client detects the certificate has been changed, because that's most likely what's happening.
Re: (Score:3)
This is pure FUD. Not allowing pinned certificates does NOT enable any old person to carry out MITM attacks; the person would still have to have access to the device. Loading your own certificates into a web browser on Windows, Linux, or Mac can already be done with absolutely no loss of security, as long as you control the endpoint devices. Pinned certificates do absolutely nothing except allow companies to weaponize devices and applications against the owner.
Re: (Score:2)
If you can't install your own certificates, it's not your device.
If it's not your device, why would you trust it?
If you don't run a corporation, why would you trust it?
If Apple is as good at security as they claim, they should be able to give you this functionality.
If cell phones can have an entire secure system on board (the baseband processor), they should be able to have a secure subsystem for this on board as well.
Re: (Score:2)
It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, allowing them to see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different from the ones corporations use with proxies. Any device owner should be allowed to proxy their traffic so they can see everything going to and from that device. This does not allow people on the internet to see the traffic, only the owner of both the device and the proxy being used.
Ever hear of Wireshark?
Privacy policies have only one purpose (Score:5, Informative)
They use lots of responsible-sounding words to say, essentially, "We'll do with your data whatever we want to do with it, whenever we want to do it." Using confusing language is the actual point. If it were clear and precise, they would then be limited in what they could do with your data, and that would go against the real aims of the company.
For example, when they say they will share your data "only" with their "affiliates," what isn't obvious in the text is that any company they do business with is an "affiliate." That includes advertisers on their platform. So basically, anyone who wants to buy the data can do so.
One more minute (Score:5, Interesting)
Steve Jobs: (Score:2)
"...You're reading it wrong!"
It's by design (Score:3)
A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options.
No shit, Sherlock, this is by design. This isn't accidental, it's not sloppy UI coding, this is 100% deliberate.
Anything that gets in the way of slurping up that sweet, sweet user data has to be discouraged in any way possible, including making it nearly impossible to configure 'privacy' settings properly.
macos (Score:2)