Apple’s revelations about keeping/scanning Siri recordings demand a response
Excellent article out this morning from Jonny Evans in Computerworld.
You may have heard the Guardian's assertion from Friday:
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant
For a company that touts its privacy superiority, that’s clearly way over the line. Even I was shocked – and I’ve been jaded by years of Microsoft’s snooping.
This morning, Jonny Evans published a clear plan for righting those wrongs:
- Apple should introduce much clearer and easier to understand privacy warnings around use of Siri on its devices.
- When setting up Siri on a new device you as a user should be given the chance to explicitly reject use of your voice for any purpose other than the original request.
- Apple should bring this [contracted human snooping] work in-house, become completely accountable for what its voice workers and management do with these recordings, and ensure customers have some way in which to punish any infraction of their data privacy.
- In the event Siri is invoked but no specific request is made, the system should be smart enough to ignore the interaction and delete any recording made as a result of that interaction.
- Only in those instances in which different voice recognition systems can’t find a way to agree on what is said should human ears be necessary.
It’s an excellent article. Windows users, take note.