Last year, a former Apple contractor made waves by raising concerns over how the tech giant handled Siri voice assistant recordings, ultimately leading the company to change its practices. Now the whistleblower is back, having sent a letter to European regulators asking them to investigate and potentially punish the company.
“I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the E.U. has one of the strongest data protection laws in the world,” former Apple contractor Thomas Le Bonniec wrote in a Wednesday letter to EU regulators. “Passing a law is not good enough: it needs to be enforced upon privacy offenders.”
The letter is just the latest example of the fine line that tech companies must walk, balancing the need to use those recordings to improve the effectiveness and smarts of voice assistants with the need to protect the privacy of their customers. Similar criticisms and concerns have been leveled at Google and Amazon over how they handle their voice assistants too.
Apple didn’t immediately respond to a request for comment about the letter.
Le Bonniec was a contractor with Apple until he quit in 2019, speaking with the UK's Guardian newspaper about how the company was behaving. Le Bonniec said Apple collected and transcribed some voice recordings captured by Siri in an effort to improve the service's quality. But, he said, the recordings invaded people's privacy without their knowledge, including recordings of medical diagnoses, sexual encounters and intimate moments.
“I listened to hundreds of recordings every day, from various Apple devices (e.g. iPhones, Apple Watches, or iPads),” Le Bonniec said in his letter. “The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.”
Apple last year said the recordings were reviewed in secure facilities and didn't have any additional information attached to them, such as whose account they came from. “All reviewers are under the obligation to adhere to Apple's strict confidentiality requirements,” Apple said at the time. The company also promised to change the way it handled Siri, asking users for permission to share their recordings with the company's teams and giving them an option to opt out.