Apple apologizes for letting contractors eavesdrop on Siri recordings of your sex and confidential conversations


Sorry we’ve been listening to your sex conversations and all your confidential talks – Apple.

Apple apologized to its users on Wednesday for allowing third-party contractors to listen to Siri recordings, and pledged that it will no longer retain audio of your Siri conversations by default.

The program, in which Apple had third-party contractors listen to recordings to grade Siri’s performance, drew criticism after a whistleblower told The Guardian that the voice assistant routinely recorded people having sex, discussing confidential medical information, and even conducting drug deals.

“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said in a statement on Wednesday.


Apple suspended the recordings after the whistleblower, a former Apple contractor, exposed Apple’s eavesdropping practice to The Guardian newspaper in June 2019.

“The sound of a zip, Siri often hears as a trigger,” the contractor said.

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch,” the Guardian reported.

Apple said it will now develop new guidelines for the Siri grading program that will allow users to opt in. It will no longer keep Siri audio recordings for grading without users’ permission, it said.


The iPhone maker also said it will allow only its own employees, not third-party contractors, to review the audio recordings. Apple promised to delete “any recording which is determined to be an inadvertent trigger of Siri.”


“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” the company said. “Those who choose to participate will be able to opt out at any time.”