Apple, who was caught listening to our data earlier this month, just apologized for their privacy violations. In a post to their site, they claim that privacy is “a fundamental human right.” It didn’t seem to be when they were essentially wiretapping our Siri voice recordings.
According to The Guardian, an Apple whistleblower said,
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
In a statement on its site, Apple claims it is “vigilantly protecting” its customers’ privacy. That’s a funny form of protection, Apple. Maybe they could give our privacy the ultimate protection and hack into our bank accounts and share our financial data with the world.
The infringement centers on Siri’s “grading” program. Contractors were allowed to listen in on us to improve Siri’s performance. On their “Siri Privacy and Grading” webpage, Apple says,
“By using grading across a small sample of Siri requests over time, Apple can make big improvements that help ensure that our customers around the world have the best Siri experience possible.”
That sounds great for us, but of course, it’s even better for Apple. The better Siri becomes, the more products they can sell. In the end, they’re just using us to make more money.
The main infringement, according to The Guardian, is that Apple never explicitly disclosed this practice. On top of that, they never said humans would be listening to our recordings.
To their credit, Apple’s apology comes with an apparent plan of action. They say they will no longer retain audio recordings of Siri interactions by default. Users will now be able to opt in if they want to “help Siri improve by learning from the audio samples of their requests.” Finally, they say that only Apple employees will be allowed to listen to the audio samples.
Apple isn’t the first tech behemoth to use our voice recordings. In July, Google admitted that over 1,000 voice recordings of customer conversations were leaked to a Belgian news site. They also hid microphones in their home alarm systems. According to Bloomberg, Amazon has teams of people listening to their customers’ Alexa conversations. And then, of course, there’s Facebook, which paid a $5 billion fine for its well-documented privacy breaches.
The first, and least appealing, option for many is to stop using these products. Since most of us have developed a dependency on these companies, the next best thing is to learn how to protect yourself. Don’t help Siri improve if you want to maintain your privacy. Review Alexa’s privacy settings. Learn how to protect yourself from Facebook’s appetite for your data. Most of all, don’t forget that these companies are not your pals. While they do offer compelling services, those services always come at a price.
Last modified: January 10, 2020 2:54 PM UTC