Not only Amazon and Google: Apple also appears to have real people evaluate voice recordings of its users. The only way to stop it is to not use Siri at all.
In recent weeks, Amazon and Google faced criticism that employees listen to and evaluate anonymized user conversations with Alexa and the Google Assistant. Now Apple faces the same accusation: as the British Guardian reports, conversations that users have with the voice assistant Siri are sometimes shared with third-party companies for evaluation.
All voice-assistant providers evaluate user data
The evaluation serves to improve Siri's speech recognition and responses. In a statement to the newspaper, Apple said that only about one percent of recordings are passed on for evaluation, and that the files are anonymized so they can no longer be linked to a specific user.
However, one of the employees working on the evaluation of Siri data at a third-party company accuses Apple of sloppy anonymization: the transmitted data sets reportedly also include GPS data and contact information.
In addition, the third-party companies reportedly also receive audio recordings that were apparently made unintentionally. Siri is said to be regularly activated by accident when it mishears a command. Unintentional recordings are said to be especially common on newer Apple Watch models, because the voice assistant can be activated by a movement of the wrist.
Customers' only option is to give up Siri entirely
In particular, the employee criticized that Siri users are not sufficiently informed that their data may be evaluated by humans – an allegation Amazon and Google also face. Nor is there any way for users to prevent the storage and evaluation of their data altogether. The only option is to not use the voice assistant at all.