In late July, we reported on a serious privacy issue related to Apple's Siri: people who check Siri voice data for inaccuracies were also listening to users' private conversations. Apple responded that it reviews only 1% of the recordings. Earlier this month, Apple changed course and announced that it was suspending the Siri quality control program that allowed contractors to listen to Siri recordings. Last week, a class-action lawsuit was filed against Apple over the Siri recordings. Today, Apple announced several changes to improve Siri's privacy protections:
We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
Apple will resume the Siri quality evaluation process later this fall, once software updates implementing the above changes are released to users.