Academics Probe Apple's Privacy Settings and Get Lost and Confused (theregister.com) 24

Matthew Connatser reports via The Register: A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options. The research was conducted by Amel Bourdoucen and Janne Lindqvist of Aalto University in Finland. The pair noted that while many studies had examined privacy issues with third-party apps for Apple devices, very little literature investigates the issue in first-party apps -- like Safari and Siri. The aims of the study [PDF] were to investigate how much data Apple's own apps collect and where it's sent, and to see if users could figure out how to navigate the landscape of Apple's privacy settings.

The lengths to which Apple goes to secure its ecosystem -- as described in its Platform Security Guide [PDF] -- have earned it kudos from the information security world. Cupertino uses its hard-earned reputation as a selling point and as a bludgeon against Google. Bourdoucen and Lindqvist don't dispute Apple's technical prowess, but argue that it is undermined by confusing user interfaces. "Our work shows that users may disable default apps, only to discover later that the settings do not match their initial preference," the paper states. "Our results demonstrate users are not correctly able to configure the desired privacy settings of default apps. In addition, we discovered that some default app configurations can even reduce trust in family relationships."

The researchers criticize data collection by Apple apps like Safari and Siri, where that data is sent, how users can (and can't) disable that data tracking, and how Apple presents privacy options to users. The paper illustrates these issues in a discussion of Apple's Siri voice assistant. While users can ostensibly choose not to enable Siri in the initial setup on macOS-powered devices, it still collects data from other apps to provide suggestions. To fully disable Siri, Apple users must find privacy-related options across five different submenus in the Settings app. Apple's own documentation for how its privacy settings work isn't good either. It doesn't mention every privacy option, explain what is done with user data, or highlight whether settings are enabled or disabled. Also, it's written in legalese, which almost guarantees no normal user will ever read it. "We discovered that the features are not clearly documented," the paper concludes. "Specifically, we discovered that steps required to disable features of default apps are largely undocumented and the data handling practices are not completely disclosed."

Comments Filter:
  • It's pretty obvious. All lawyers know how to do is make things more complicated for everyone else... so that you'll have to pay them lots of money to sort it out. What a racket.

    • Ohhhh.. LAWYERS design the user interface! That explains Mac OS! Sorry for dupe post, first one was on wrong comment by mistake. Slashdot 'could' allow editing.
      • Ohhhh.. LAWYERS design the user interface! That explains Mac OS! Sorry for dupe post, first one was on wrong comment by mistake.

        Slashdot 'could' allow editing.

        And they could allow Rich Text and Unicode.

        But they enjoy the Computer Priesthood too much.

  • by rtkluttz ( 244325 ) on Friday April 05, 2024 @05:15PM (#64373596) Homepage

    It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, so that they can see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different from one used by corporations with proxies. Any device owner should be allowed to proxy their traffic so that they can see all traffic to and from that device. This does not allow people on the internet to see the traffic, only the owner of both the device and the proxy being used.

    • by tlhIngan ( 30335 )

      And the fact that it's a MITM attack is exactly why certificates get pinned: people were using the technique in attack scenarios.

      The common case is a normal user who is not doing any proxying or network packet inspection of any kind. That's the 99% case, covering what 99% of people are doing. The 1% will be the researchers doing the analysis.

      In this case, a user going to Google or Apple or wherever should have their session shut down if the client detects the certificate has been changed, because that's most likely what's happening.

      • This is pure FUD. Not allowing pinned certificates does NOT enable just anyone to carry out MITM attacks; the attacker would still have to have access to the device. Loading your own certificates into a web browser on Windows, Linux, or Mac can already be done with absolutely no loss of security, as long as you control the endpoint devices. Pinned certificates do absolutely nothing except allow companies to weaponize devices and applications against the owner.

        • by tlhIngan ( 30335 )

          This is pure FUD. Not allowing pinned certificates does NOT enable just anyone to carry out MITM attacks; the attacker would still have to have access to the device. Loading your own certificates into a web browser on Windows, Linux, or Mac can already be done with absolutely no loss of security, as long as you control the endpoint devices. Pinned certificates do absolutely nothing except allow companies to weaponize devices and applications against the owner.

          • If you can't install your own certificates, it's not your device.

            If it's not your device, why would you trust it?

            If you don't run a corporation, why would you trust it?

            If Apple is as good at security as they claim, they should be able to give you this functionality.

            If cell phones can have an entire secure system on board (the baseband processor) they should be able to have a secure security system onboard.

    • It should be illegal for software and hardware manufacturers to use pinned certificates. It should ALWAYS be possible for the owner of a device to load their own certificate into any device they own, so that they can see any and all traffic to and from their devices. This does NOT compromise security from the owner's perspective. Yes, it is a man-in-the-middle attack, but no different from one used by corporations with proxies. Any device owner should be allowed to proxy their traffic so that they can see all traffic to and from that device. This does not allow people on the internet to see the traffic, only the owner of both the device and the proxy being used.

      Ever hear of Wireshark?

  • by Tony Isaac ( 1301187 ) on Friday April 05, 2024 @05:44PM (#64373646) Homepage

    They use lots of responsible-sounding words to say, essentially, "We'll do with your data whatever we want to do with it, whenever we want to do it." Using confusing language is the actual point. If it were clear and precise, they would be limited in what they could do with your data, and that would go against the real aims of the company.

    For example, when they say they will share your data "only" with their "affiliates," what isn't obvious in the text is that any company they do business with is an "affiliate." That includes advertisers on their platform. So basically, anyone who wants to buy the data can do so.

  • One more minute (Score:5, Interesting)

    by felixrising ( 1135205 ) on Friday April 05, 2024 @05:53PM (#64373672)
    Whatever you do, don't look at Screen Time settings and parental controls. That is a HOT MESS on Apple, and is buggy to the point that kids regularly end up with apps that just keep playing indefinitely after "one more minute". It's Clayton's screen time controls.
  • "...You're reading it wrong!"

  • by JustAnotherOldGuy ( 4145623 ) on Friday April 05, 2024 @09:40PM (#64373926) Journal

    A study has concluded that Apple's privacy practices aren't particularly effective, because default apps on the iPhone and Mac have limited privacy settings and confusing configuration options.

    No shit, Sherlock, this is by design. This isn't accidental, it's not sloppy UI coding, this is 100% deliberate.

    Anything that gets in the way of slurping up that sweet, sweet user data has to be discouraged in any way possible, including making it nearly impossible to configure 'privacy' settings properly.

  • I always thought the confusing interfaces in MacOS were because Apple hires really bad UI specialists or lets the developers make the UI. Seriously, it's like something from 1995.
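The certificate-pinning dispute in the thread above can be sketched in Python. This is a hypothetical illustration, not Apple's implementation: for simplicity it hashes the whole DER certificate, whereas real pinning schemes (HPKP, OkHttp's CertificatePinner) hash only the SubjectPublicKeyInfo, and the pinned value and host here are placeholders.

```python
# Sketch of certificate pinning: the client ships a known hash of the
# server's certificate and refuses the connection if the presented
# certificate hashes to anything else -- which is exactly what happens
# when an owner-installed proxy CA re-signs the traffic.
import hashlib
import socket
import ssl


def cert_pin(der_cert: bytes) -> str:
    """SHA-256 hex digest of a DER-encoded certificate.

    Simplification: real pinning usually hashes only the
    SubjectPublicKeyInfo, so the pin survives re-issuance of the
    certificate with the same key.
    """
    return hashlib.sha256(der_cert).hexdigest()


def connect_with_pin(host: str, pinned: str, port: int = 443) -> bool:
    """Return True only if the server presents the pinned certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
            # A MITM proxy (corporate or owner-installed) presents its
            # own certificate, so this comparison fails and the caller
            # should drop the session.
            return cert_pin(der) == pinned
```

A proxy like Wireshark-plus-a-trusted-CA defeats ordinary TLS inspection protections but not this check, which is the crux of the disagreement: pinning blocks the device owner's proxy just as effectively as it blocks a hostile one.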
