
Apple Flexes Its Privacy Muscles


Apple events follow a consistent pattern that rarely changes beyond the details of their particular announcements. This consistency becomes its own language. Attend enough Apple events and you start to pick up the deliberate undertones that the company wants to communicate but not directly express. These are the postures and facial expressions accompanying the words of the slides, demos, and videos.

Five years ago I walked out of the WWDC keynote with a feeling that those undertones were screaming a momentous shift in Apple’s direction: that privacy was emerging as a foundational principle for the company. I laid out my interpretation of Apple’s privacy principles in this piece in Macworld. Privacy had been growing in importance at Apple for years before that, but that WWDC keynote was the first time the company clearly articulated that privacy not only mattered but was being built into its foundational technologies.

This year I sat in the WWDC keynote, hearing the undertones, and realized that Apple is upping its privacy game to levels never before seen from a major technology company. That is, beyond improving privacy in its own products, the company is starting to use its market strength to extend privacy through the tendrils that touch the Apple ecosystem.

Regardless of Apple’s motivation—altruism, the personal principles of Apple executives, or a shrewd business strategy—Apple’s stance on privacy is unique and historic in the annals of consumer technology. The real question now isn’t if Apple can succeed at a technical level, but if its privacy push can withstand the upcoming onslaught from governments, regulators, the courts, and its competitors.

Apple executives say that they believe that privacy is a human right. History, however, is strewn with the remains of well-intentioned champions of such rights.

Sign in with Apple

When discussing shifts in strategy, whether at Apple or any other technology firm, we should keep in mind that such changes typically start years earlier and are more gradual than we realize. In Apple’s case, the company’s privacy extension efforts go back at least to WWDC 2014, when Apple first started requiring privacy protections from developers who wanted to participate in HomeKit and HealthKit.

The most obvious privacy push to come out of WWDC 2019 is “Sign in with Apple,” which offers benefits to both consumers and developers. Additional WWDC sessions made it clear that Apple is using a carrot-and-stick approach with developers: developers are required to use the service when they include competing offerings from Google and Facebook, but, in exchange, they also gain built-in fraud prevention. Every Apple ID is already vetted by Apple and secured with two-factor authentication, and Apple provides developers with the digital equivalent of a thumbs-up or thumbs-down if Apple’s monitoring code thinks the connection is from a real human being. Since Apple uses similar mechanisms for iCloud activity, iTunes, and App Store purchases, the odds are that this is a reliable indicator.

Apple also emphasized that Sign in with Apple extends this privacy to the developers themselves, saying that it isn’t Apple’s business to know how developers engage with their users in their apps. Apple serves merely as an authentication provider and collects no telemetry on user activity. This isn’t to imply that Google and Facebook necessarily abuse their authentication services. Google denies these accusations and also offers features to detect suspicious activity. Facebook, on the other hand, has famously abused phone numbers supplied for two-factor authentication.

The difference between Sign in with Apple and previous privacy requirements within Apple’s ecosystems is that the feature extends Apple’s insistence on privacy beyond the company’s walled garden. Previous requirements—from HomeKit’s data use strictures to App Store rules about how apps can collect and use data—applied mostly to apps running on Apple devices. While this is technically true for Sign in with Apple, practically speaking the implications extend much further.

That’s because, when developers add Sign in with Apple to an iOS app, they will likely also need to add it to their apps on other platforms if they expect their customers to ever use anything other than an Apple device. If they don’t, they will create a confusing user experience (which, I hate to say, we are likely to see a lot of anyway). Once users create their accounts for an app with their Apple IDs, there are technical complexities in supporting those same accounts with alternative login credentials. Thus developers will likely support Sign in with Apple across all their platforms, extending the feature’s inherent privacy beyond Apple’s usual reach.

Intelligent Ad Tracking Prevention

Two other technologies stand out as additional examples of how Apple is extending its privacy regime. The first is an important update to intelligent tracking prevention for advertising. Privacy-preserving ad-click attribution provides (at least some) privacy in the ugly ad-tracking market. The second technology is HomeKit Secure Video, which offers a new privacy-respecting foundation to video security firms that want to be feature-competitive without dealing with the mess of building their own back-end cloud services.

Let’s look first at intelligent tracking prevention. This Safari feature reduces the ability of services to track users across different Web sites. The idea behind it is that users can and should be able to enable cookies for a trusted site without having additional trackers continue to monitor them through the rest of their browsing on other sites. Cross-site tracking is epidemic, with many sites hosting sometimes dozens of trackers. Such tracking is meant to support advertising and to answer one key marketing question: did an ad lead the user to visit the target site and buy something?

Effective tracking prevention is an existential risk to online advertisers and the sites that rely on it for income, but increased scrutiny from Apple (and other browser makers) is almost completely the result of overly intrusive tracking by advertisers. While intelligent tracking prevention (combined with other browser privacy and security features) is the stick, privacy-preserving ad-click attribution is Apple’s carrot. Its method of monitoring clicks allows advertisers to track conversion rates without invading user privacy.

This privacy-preserving ad-click attribution is an upcoming feature of Safari (and a proposed Web standard) that enables the browser to remember ad clicks for 7 days. If a purchase is made within that time period, it is marked as a potential ad conversion. After a semi-random time delay to limit user identification, that conversion is then reported as a delayed ephemeral post to the search or advertising provider using a limited set of IDs that can’t be linked back to the actual user.
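To make the flow above concrete, here is a minimal sketch of the attribution logic in Python. The class, its names, and the exact limits are illustrative assumptions based on the mechanism as described (and WebKit’s public proposal, which uses small campaign IDs and a 24–48 hour reporting delay), not Apple’s actual implementation, which lives inside Safari/WebKit:

```python
import random
from datetime import datetime, timedelta

ATTRIBUTION_WINDOW = timedelta(days=7)
MAX_CAMPAIGN_ID = 63  # a deliberately small ID space, so reports can't identify a user

class AdClickStore:
    """Hypothetical sketch of browser-side, privacy-preserving ad-click attribution."""

    def __init__(self):
        # Clicks are keyed by (source site, destination site), never by user.
        self._clicks = {}

    def record_click(self, source_site, destination_site, campaign_id, when):
        # The browser remembers the ad click locally for up to 7 days.
        if 0 <= campaign_id <= MAX_CAMPAIGN_ID:
            self._clicks[(source_site, destination_site)] = (campaign_id, when)

    def record_conversion(self, source_site, destination_site, when):
        # On a purchase, check for a stored click within the attribution window.
        entry = self._clicks.pop((source_site, destination_site), None)
        if entry is None:
            return None
        campaign_id, click_time = entry
        if when - click_time > ATTRIBUTION_WINDOW:
            return None
        # Schedule the ephemeral report after a semi-random delay so its
        # timing can't be correlated back to an individual user.
        delay = timedelta(hours=random.uniform(24, 48))
        return {"source": source_site, "campaign_id": campaign_id,
                "report_after": when + delay}
```

The key privacy properties are all visible in the sketch: the click record never leaves the browser until a conversion occurs, the report carries only a tiny campaign ID rather than a user identifier, and the randomized delay decouples the report from the purchase event.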

By building a privacy-preserving advertising technology into the second-most popular Web browser on the planet (Safari’s market share is about 15%, behind Google Chrome with 63%) and by making it an open standard, all while making Herculean efforts to block invasive forms of tracking, Apple is again leveraging its market position to improve privacy beyond its walls. What’s most interesting about the technology is that, unlike Sign in with Apple, it improves user privacy without completely disrupting the business model of Apple’s advertising-driven competitors like Google and Facebook. Those companies can use Apple’s technology and still track ad conversions, and Apple still supports user-manageable ad identifiers for targeted advertisements.

HomeKit Secure Video

As I said above, HomeKit Secure Video is another technology with which Apple is extending its privacy push. Coming in macOS 10.15 Catalina, iOS 13, and iPadOS, it provides HomeKit security cameras with a privacy-preserving update. I’m a heavy user of such cameras myself, even though they are only marginally useful at preventing crime. Nearly all home security camera systems, including my Arlo cameras, record their video directly to cloud-based storage (see “The HomeKit-Compatible Arlo Baby Security Cam Is Not Just for Parents,” 3 September 2018). Cloud storage is a feature you generally want in order to avoid the risk of having bad guys steal your security footage, as happens so often on popular crime shows. Security camera companies also use cloud processing to identify people, animals, and vehicles, and to offer other useful features. Like many customers, I’m not thrilled that these companies also have access to my videos, which is one reason none of their cameras run inside my home when anyone in my family is present.

HomeKit Secure Video will send encrypted video from supported cameras to iCloud, where it’s stored, for free, for 10 days without impacting your iCloud storage limits. If you have an Apple TV or iPad on your network, it will use that device for machine learning analysis and image recognition instead of performing any analysis in the cloud. This is an interesting area for Apple to step into: it certainly doesn’t seem like the sort of thing that would drive profits since Apple doesn’t sell its own cameras, and security camera support isn’t a motivator when customers decide to purchase a phone or tablet. It’s almost as though some Apple executives and engineers were personally creeped out by the lack of privacy protection for existing security camera systems and said, “Let’s fix this.”

HomeKit Secure Video opens the security video market to a wider range of competitors while protecting consumer privacy. It is a platform, not a product, and it eliminates the need for manufacturers to build their own back-end cloud service and machine learning capabilities. Companies using the platform will experience less friction when they bring a product to market, and it simultaneously allows them to provide better user privacy.

Apple Created a Culture of Privacy, but Will It Survive?

These are just a few highlights that demonstrate Apple’s extension of privacy beyond its direct ecosystem, but WWDC featured even more privacy-related announcements.

Apple continues to expand existing privacy features across all its platforms, including the new offline Find My device-tracking tool (see “How Apple’s New Find My Service Locates Missing Hardware That’s Offline,” 21 June 2019). Having seen how some apps abuse Wi-Fi and Bluetooth data for ad hoc location tracking, Apple now blocks apps in iOS from accessing such data unless it’s needed for a core feature. Users can now also track the trackers and see when even approved apps accessed their location.

Then there’s the upcoming Apple credit card, which is the closest thing we can get to a privacy-respecting payment option. Even speech recognition is getting a privacy polish: developers will soon be able to require that speech recognition in their apps runs on-device, without audio ever being exposed to the cloud. In fact, Apple dedicated an entire WWDC session to examples of how developers can adopt Apple’s thinking to improve privacy within their own apps.

During John Gruber’s The Talk Show Live, Craig Federighi said that Apple’s focus on privacy started back in its earliest days, when the company was founded on creating “personal” computers. Maybe it did, maybe it didn’t, but Apple certainly didn’t build a real culture of privacy (or any technical protections) until the start of the iPhone era. When Microsoft launched its highly successful Trustworthy Computing Initiative in 2002 and reversed the company’s poor security record, one of its founding principles was “Secure by Design.” During Apple’s developer-focused Platform State of the Union session, privacy took center stage as Apple talked about “Privacy by Design.”

Apple and other tech firms have already run into resistance when building secure and private devices and services. Some countries, including Australia, are passing laws to break end-to-end encryption and require device backdoors. US law enforcement officials have been laying the groundwork for years to push for laws that permit similar access, even while knowing it would then be impossible to guarantee device security (see “Apple and Google Spark Civil Rights Debate,” 10 October 2014). China requires Apple and other non-Chinese cloud providers to hand over their data centers to Chinese companies who can then feed information to the government. Apple’s competitors aren’t sitting by idly, with Google’s Sundar Pichai muddying the waters in a New York Times opinion piece that equates Google security with privacy, and positioning Apple’s version of privacy as a luxury good. While Google’s security is the best in the industry, equating that security with the kind of privacy that Apple offers is disingenuous at best.

The global forces arrayed against personal privacy are legion. Advertising companies and marketing firms want to track your browsing and buying. Governments want to solve crimes and prevent terrorism whatever the cost. Telecommunication providers monitor all our Internet traffic and locations, just because they can. The financial services industry is sure our data is worth something. And even grocery stores can’t resist offering minor discounts if you just let them correlate all your purchases to your phone number. While, theoretically, we have a little control over some of this tracking, practically speaking we have essentially no control over most of it, and even less insight into how it is used. It’s a safe bet that many of these organizations will push back hard against Apple’s privacy promotion efforts, and, by extension, against any of us who care about and want to control our own privacy.

Calling privacy a fundamental human right is as strong a position as any company or individual can take. It was one thing for Apple to build privacy into its own ecosystem, but as it extends this privacy outside its ecosystem, we have to decide for ourselves whether we consider these protections meaningful and worthy of support. I know where I stand, but I also recognize that privacy is a highly personal concept, and I shouldn’t assume that a majority of the world feels the same as I do, or that Apple’s efforts will survive the challenges of the coming decades.

It’s in our hands now.
