For years, Apple has carefully curated a reputation as a privacy stalwart among data-hungry and growth-seeking tech companies.
In multi-platform ad campaigns, the company told consumers that “what happens on your iPhone, stays on your iPhone,” and equated its products with security through slogans like “Privacy. That’s iPhone.”
But experts say that while Apple sets the bar when it comes to hardware and in some cases software security, the company could do more to protect user data from landing in the hands of police and other authorities.
In recent years, US law enforcement agencies have made use of data collected and stored by tech companies in investigations and prosecutions. Experts and civil liberties advocates have raised concerns about authorities’ extensive access to consumers’ digital information, warning it can violate fourth amendment protections against unreasonable searches. Those fears have only grown as once protected activities, such as access to abortion, have become criminalized in many states.
“The more that a company like Apple can do to set itself up to either not get law enforcement requests or to be able to say that they can’t comply with them by using tools like end-to-end encryption, the better it’s going to be for the company,” said Caitlin Seeley George, the campaigns and managing director at the digital advocacy group Fight for the Future.
Apple gave data to law enforcement 90% of the time
Apple receives thousands of law enforcement requests for user data a year, and overwhelmingly cooperates with them, according to its own transparency reports.
In the first half of 2021, Apple received 7,122 law enforcement requests in the US for the account data of 22,427 people. According to the company’s most recent transparency report, Apple handed over some level of data in response to 90% of the requests. Of those 7,122 requests, the iPhone maker challenged or rejected just 261.
The company’s positive response rate is largely in line with, and at times slightly higher than, that of counterparts like Facebook and Google. However, both of those companies have documented far more requests from authorities than the iPhone maker.
In the second half of 2021, Facebook received nearly 60,000 law enforcement requests from US authorities and produced data in 88% of cases, according to that company’s most recent transparency report. In that same period, Google received 46,828 law enforcement requests affecting more than 100,000 accounts and handed over some level of data in response to more than 80% of the requests, according to the search giant’s transparency report. That’s more than six times the number of law enforcement requests Apple received in a comparable time frame.
That’s because the amount of data Apple collects on its users pales in comparison with other players in the space, said Jennifer Golbeck, a computer science professor at the University of Maryland. She noted that Apple’s business model relies less on marketing, advertising and user data – operations based on data collection. “They just naturally don’t have a use for doing analytics on people’s data in the same way that Google and a lot of other places do,” she said.
Apple has drafted detailed guidelines outlining exactly what data authorities can obtain and how they can get it – a level of detail, the company says, that is in keeping with best practices.
Despite ‘secure’ hardware, iCloud and other services pose risks
But major gaps remain, privacy advocates say.
While iMessages sent between Apple devices are end-to-end encrypted, preventing anyone but the sender and recipient from accessing them, not all information backed up to iCloud, Apple’s cloud storage service, has the same level of encryption.
“iCloud content, as it exists in the customer’s account” can be handed over to law enforcement in response to a search warrant, Apple’s law enforcement guidelines read. That includes everything from detailed logs of the time, date and recipient of emails sent in the previous 25 days, to “stored photos, documents, contacts, calendars, bookmarks, Safari browsing history, maps search history, messages and iOS device backups.” The device backup on its own may include “photos and videos in the camera roll, device settings, app data, iMessage, business chat, SMS, and MMS [multimedia messaging service] messages and voicemail”, according to Apple.
Golbeck is an iPhone user but opts out of using iCloud because she worries about the system’s vulnerability to hacks and law enforcement requests. “I am one of those people who, if somebody asks if they should get an Android or an iPhone, I’m like, well, the iPhone is gonna be more protective than the Android is, but the bar is just very low,” she said.
“[Apple’s] hardware is the most secure on the market,” echoed Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project, a rights organization. But the company’s policies around iCloud data also have him concerned: “I have to spend so much time opting out of things they’re trying to automatically push me towards using that are supposed to make my life better, but actually just put me at risk.
“As long as Apple continues to limit privacy to a question of hardware design rather than looking at the full life cycle of data and looking at the full spectrum of threats from government surveillance, Apple will be falling short,” he argued.
It’s a double standard that was already apparent in Apple’s stance in its most high-profile privacy case, the 2015 mass shooting in San Bernardino, California, Cahn said.
At the time, Apple refused to comply with an FBI request to create a backdoor to access the shooter’s locked iPhone. The company argued that a security bypass could be exploited by hackers as well as law enforcement officials in future cases.
But the company said in court filings that if the FBI hadn’t changed the phone’s iCloud password, it wouldn’t have needed to create a backdoor because all of the data would have been backed up and therefore available via subpoena.
In fact, the company said that up until that point it had already “provided all data that it possessed relating to the attackers’ accounts”.
“They were quite clear that they weren’t willing to break into their own iPhones, but they were eager to actually break into the iCloud backup,” said Cahn.
Apple said in a statement that it believed privacy was a fundamental human right, and argued that users were always given the ability to opt out when the company collects their data.
“Our products include innovative privacy technologies and techniques designed to minimize how much of your data we – or anyone else – can access,” said an Apple spokesperson, Trevor Kincaid, adding that the company is proud of new privacy features such as app tracking transparency and mail privacy protection, which give users more control over what information is shared with third parties.
“Whenever possible, data is processed on device, and in many cases we use end-to-end encryption. In instances when Apple does collect personal information, we’re clear and transparent about it, telling users how their data is being used and how to opt out anytime.”
Apple reviews all legal requests and is obligated to comply when they are valid, Kincaid added, but he emphasized that the personal data Apple collects is limited to begin with. For instance, the company encrypts all health data and does not collect device location data.
People are ‘vastly unaware of what’s going on with their data’
Meanwhile, privacy advocacy organizations like the Electronic Frontier Foundation (EFF) are urging Apple to implement end-to-end encryption for iCloud backups.
“When we say they’re better than everyone else, it’s more an indictment of what everyone else is doing, not necessarily Apple being particularly good,” EFF staff technologist Erica Portnoy said.
Portnoy gives Apple credit for its default protection of some services like iMessage. “In some ways, some of the defaults can be a bit better [than other companies], which isn’t nothing,” she said. But, she pointed out, messages are only secure if they’re being sent between iPhones.
“We know that unless messages are end-to-end encrypted, many people could have access to these communications,” said George, whose organization Fight for the Future launched a campaign to push Apple and other companies to better secure their messaging systems.
It’s a problem the company can fix by, for one, adopting a Google-backed messaging system called rich communication services (RCS), George argued. The system isn’t in and of itself end-to-end encrypted but supports encryption, unlike SMS and MMS, and would allow Apple to secure messages between iPhones and Androids, she said.
At the Code 2022 tech conference, Apple’s CEO, Tim Cook, indicated the company didn’t plan to support RCS, arguing that users haven’t said this is a priority. But they “don’t know what RCS is”, George said. “If Apple really doesn’t want to use RCS because it comes from Google, they could come to the table with other solutions to show a good faith effort at protecting people’s messages.”
Kincaid said consumers were not asking for another messaging service because there are many existing encrypted offerings, such as Signal. He also said Apple is concerned that RCS is not a modern standard and is not encrypted by default.
Golbeck, who has a TikTok channel about privacy, says people are “vastly unaware of what’s going on with their data” and “think they’ve got some privacy that they don’t”.
“We really don’t want our own devices being turned into surveillance tools for the state,” Golbeck said.