
Apple contractors 'regularly hear confidential details' on Siri recordings

7/28/2019

 
Author: Alex Hern
Workers hear drug deals, medical details and people having sex, says whistleblower.

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of their job providing quality control, or “grading”, for the company’s Siri voice assistant, the Guardian has learned.
Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that this work is undertaken by humans who listen to the pseudonymised recordings.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Read More: Here

