Privacy concerns heard: Apple changes the way it listens to Siri.

Today Apple announced major changes to its “Siri Audio Grading Program” after being criticized for using human reviewers to listen to audio recordings collected from Siri without users’ knowledge or consent.

Apple did anonymize the data sent to its contractors; however, these recordings included private discussions between doctors and patients, business deals, some “criminal dealings,” sex, and so on. And even though the data was anonymized, people’s names were sometimes spoken in the recordings, which could lead to possible embarrassment.

In a statement titled “Improving Siri’s privacy protections,” Apple said it intends to resume the program in the fall, but only after making significant changes:

  • First, Apple will no longer retain audio recordings of Siri interactions by default. Instead, the company will continue to use computer-generated transcripts to help Siri improve.
  • Second, Apple will allow users to opt-in to having their audio recordings listened to by human reviewers to help improve Siri’s responses. Users who choose to participate can opt-out at any time.
  • Third, if you opt into the grading program, only Apple employees will be allowed to listen to audio samples of your Siri interactions, rather than third-party contractors. The company also aims to delete Siri recordings when it determines that users triggered Siri accidentally.

Apple also assured users that Siri data has never been shared with outside companies, saying, “When we store Siri data on our servers, we don’t use it to build a marketing profile, and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”