The world was recently introduced to Apple’s “enterprise developer certificate” system after Google and Facebook were busted for abusing their certificates in January. These Apple-issued certificates allow businesses to develop and distribute iPhone apps outside the app store. Because these apps are not reviewed by Apple before they are released, and do not need to comply with standard privacy rules, Apple requires businesses to use this certificate only to make apps for internal corporate use or for testing commercial apps.
Although Facebook and Google’s mischief dominated this news story, equally, if not more, concerning is that Apple has designed a certificate system that gives millions of employers the unfettered ability to view private, personal data on employees’ phones. Seeing as Facebook and Google abused the very power that Apple has granted them and millions of other companies, Apple has to answer for its prioritization of corporate interests over employee privacy.
For Employers, Apparently the Rules Don’t Apply
Generally, Apple requires iPhone users to download apps through the app store. Apple has self-interested reasons for doing this, such as the 30% cut it takes on all apps and content sold, but also security ones: every app must pass an intensive screening process before it is published.
Apple recently enhanced its app store privacy rules. Under its new guidelines, Apple will reject any app for content or behavior that is “over the line.” Apple warns: “What line, you ask? Well, as a Supreme Court Justice once said, ‘I’ll know it when I see it.’” Also, Apple now only lets apps access and use data that is “relevant to [their] core functionality.”
Apple has created an app store workaround for businesses by issuing what is called an “enterprise developer certificate.” With this certificate, employees can “sideload” apps, meaning they can download apps directly from the company’s website without going through the app store.
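Mechanically, sideloading an enterprise app typically works through an `itms-services://` link on the company’s website that points to a distribution manifest, a small plist file describing where the signed app package (.ipa) lives. The company name, bundle identifier, and URLs below are hypothetical; this is only a sketch of what such a manifest generally looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>items</key>
  <array>
    <dict>
      <!-- Where the signed app binary is hosted (hypothetical URL) -->
      <key>assets</key>
      <array>
        <dict>
          <key>kind</key>
          <string>software-package</string>
          <key>url</key>
          <string>https://apps.example.com/InternalApp.ipa</string>
        </dict>
      </array>
      <!-- App identity shown to the employee at install time -->
      <key>metadata</key>
      <dict>
        <key>bundle-identifier</key>
        <string>com.example.internal-app</string>
        <key>bundle-version</key>
        <string>1.0</string>
        <key>kind</key>
        <string>software</string>
        <key>title</key>
        <string>Internal App</string>
      </dict>
    </dict>
  </array>
</dict>
</plist>
```

The employee taps a link of the form `itms-services://?action=download-manifest&url=https://apps.example.com/manifest.plist`, and iOS fetches the manifest and installs the app, so long as the enterprise certificate that signed it is trusted on the device. The app store is never involved.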
This certificate gives businesses more control over the development and testing of their internal apps. With this certificate, Facebook can, for instance, push an updated version of its cafeteria or shuttle-service app onto employees’ devices. Facebook can also “beta-test” (i.e. test prior to commercial release) apps without needing to submit every update to Apple. (Of course, once ready for public release, the app has to go through Apple’s screening process.)
When Apple grants this type of certificate, it relinquishes all control over the app-development process, including privacy and security review. Businesses have exclusive control over their internal corporate apps and can even grant themselves, if they so choose, “root access” to their employees’ phones.
With root access, an app can see virtually everything on a person’s phone: iMessages, emails, private messages in social media apps, encrypted data, browsing history, continuous location data, photos, microphone, contacts, calendar, wifi and data usage, app usage, and even when your screen is turned on and off.
To keep businesses from taking advantage of this level of access, Apple contractually requires that apps developed under this certificate be distributed only internally. And Apple defines internal distribution narrowly: to employees, or to independent contractors only when they are engaged in “development work.”
A Certificate System Ripe for Abuse
Apple’s enterprise developer certificate program has created a host of privacy issues. First, as the Facebook and Google story showed, it is relatively easy to misuse these certificates, given that Apple cannot easily monitor precisely how businesses are using them. In fact, it was TechCrunch, not Apple, that discovered Facebook and Google’s conduct.
Facebook and Google are not the only players in Silicon Valley to misuse Apple’s certificate. Many gig-economy apps—including Amazon Flex, DoorDash, Instacart, Postmates, TaskRabbit, Handy, Dolly, and HopSkipDrive—are also violating their certificates by having independent contractors use sideloaded versions of apps. These workers, primarily drivers, grocery shoppers, and handymen, do not qualify as independent contractors engaged in development work. Notably, Uber, Lyft, and Grubhub offer their worker-side apps through the app store.
Equally problematic is that employers can distribute apps in a way that complies with Apple’s policy but vastly exceeds appropriate levels of access. An employer could theoretically read your personal email account and your private messages on WhatsApp and Facebook. An employer could look through your calendar and see that you’re attending a political rally for an unpopular cause, that you’ve scheduled a psychiatric appointment or a job interview, or that you’ve bought pregnancy tests on Amazon. It falls entirely on the employer to exercise good judgment.
How much are employers snooping in practice? That’s precisely the problem: we have no clue. Apple removes itself from the review process, and all employees need to do is press a single “trust” button when they download a sideloaded app. Employers may elect to provide their employees with a user agreement. But even then, all employees can really do is trust that their employers will honor that agreement.
One idea is that if employers want this level of access, they should provide employees with work-only devices. That’s fairly typical in the corporate world of law firms and investment firms. However, in the era of BYOD (bring your own device) to work, it is increasingly common for employers to expect employees to use their personal phones for work. This is precisely how the gig economy operates. I would be shocked if Instacart or Amazon Flex ever considered providing their contractors with work-only phones.
Why would Apple create a certificate process with so much potential for abuse? Two tech experts who weighed in, Professor Hany Farid of UC Berkeley and David Cowan, a partner at Bessemer Venture Partners, doubt Apple has a compelling reason. Farid believes Apple designed this system to incentivize large companies like Facebook and Google to use and develop apps on iPhones. Cowan explained that while it makes sense to grant root access to a company that is beta-testing its latest app (so that it has complete control over the fleet of company-owned iPhones testing the app), there is no clear reason why employers would need root access to employee phones.

Both Farid and Cowan agree that by designing only one type of certificate to meet a wide range of business needs, Apple has given many employers far more power than is necessary or appropriate. Cowan suggested that to combat this potential for abuse, Apple should either change its certificate system or make it abundantly clear to employees, through status icons and repeated notifications, that their employer, as the owner of the enterprise certificate, has full access to all the data on their iPhones.
There is a fundamental problem with giving employers the ability to snoop on what employees do in their private time on their personal devices. As the supposed champion of consumer privacy, Apple either needs to defend its position or do more.