APPLE APOLOGIZES FOR SIRI EAVESDROPPING; SUSPENDS HUMAN REVIEWS, WILL DELETE RECORDINGS

August 30, 2019 in News by RBN Staff

Published: August 28, 2019

SOURCE: ZEROHEDGE

Apple has formally apologized after it was caught using human contractors to listen in on customers’ interactions with the Siri digital assistant – including during sexual encounters.

“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” reads a statement from the company. Several new changes to the privacy policy were also announced.

“First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

“Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

“Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”

The Silicon Valley behemoth was one of several tech companies eavesdropping on customers, including Google, Amazon, Facebook, and Microsoft, according to The Verge.

According to a report by The Guardian, Apple contractors were each listening to up to 1,000 recordings per day – many of which were triggered accidentally.

Following the report, Apple told The Verge that it would suspend the ‘grading program’ that governed the manual reviews. Previously, the company’s policy had been to retain random audio clips from Siri for six months, after which they were stripped of identifying information and kept for another two years.

Wednesday’s announcement, however, goes further: it ends the default retention of recordings and suspends the entire grading program. The company will no longer keep Siri recordings unless a user opts in, and even for customers who do opt in, only Apple employees, not third-party contractors, will have access to their audio.
