A former Apple contractor turned whistleblower has spilled the beans about the tech giant's internal program that listened in on users' Siri recordings. The whistleblower first revealed the practice last summer and has now gone public with his identity.
Thomas le Bonniec previously worked on a Siri "grading project" tasked with reviewing audio snippets in order to help improve the assistant's accuracy.
About a year ago, le Bonniec revealed to The Guardian that while working for Apple, he was able to hear private and sometimes intimate recordings, including criminal activity, medical discussions, intimate moments, and even confidential corporate conversations.
The whistleblower initially remained anonymous but has since decided to reveal his identity as a form of protest against the slow or nonexistent action taken against Apple for what he describes as major violations of "fundamental human rights."
The open letter
Le Bonniec sent an open letter to European privacy regulators outlining his concerns. He also told The Guardian that there is still no definitive list of those involved, nor of how much data was actually breached.
Le Bonniec added that identifying the people being listened to was not difficult, since incidental details such as names and addresses popped up every once in a while.
The letter accuses Apple of repeatedly ignoring and violating fundamental rights while continuing its massive data collection. It also expresses grave concern that big tech companies are effectively wiretapping entire populations, even as European citizens are assured that the EU is protected by some of the strongest data protection laws in the world.
Is Apple still reliable?
According to the whistleblower, all of this was done without users' consent. The data collection reportedly extended beyond the users themselves to include relatives, children, friends, and even colleagues.
Apple had already admitted to these unreported practices after being exposed in July 2019. The company said that only a small portion of Siri requests is analyzed in order to improve Siri and Dictation, and that these requests are not associated with the user's Apple ID.
Apple also stated that Siri responses are analyzed in private, secure facilities, and that all reviewers are bound by strict confidentiality requirements when handling sensitive data.
Apple has acknowledged the concerns and suspended human grading of Siri requests while it conducts a thorough review of its practices and policies. The company also said it plans to make changes to Siri in order to better guarantee the safety and privacy of its users.