Apple Apologizes for Falling Short on Siri Privacy, Outlines Changes Coming This Fall

Posted August 28, 2019 at 3:23pm by iClarified
Apple issued a press release today apologizing for falling short on Siri privacy protections and outlining changes that will be put in place this fall.

Earlier this month, the company announced it was suspending Siri grading globally following a report that contractors working on Siri regularly hear confidential information including medical details, drug deals, and sexual encounters.

Helping Apple improve Siri via audio samples will now be opt-in, only Apple employees will be allowed to listen to these samples, and recordings determined to be inadvertent triggers will be deleted.


More details in the full announcement shared below...

-----
At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.

We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

How Siri Protects Your Privacy
Siri has been engineered to protect user privacy from the beginning. We focus on doing as much on device as possible, minimizing the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.


Siri uses as little data as possible to deliver an accurate result. When you ask a question about a sporting event, for example, Siri uses your general location to provide suitable results. But if you ask for the nearest grocery store, more specific location data is used.

If you ask Siri to read your unread messages, Siri simply instructs your device to read aloud your unread messages. The contents of your messages aren’t transmitted to Siri’s servers, because that isn’t necessary to fulfill your request.

Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.

In iOS, we offer details on the data Siri accesses, and how we protect your information in the process, in Settings > Siri & Search > About Ask Siri & Privacy.

How Your Data Makes Siri Better
In order for Siri to more accurately complete personalized tasks, it collects and stores certain information from your device. For instance, when Siri encounters an uncommon name, it may use names from your Contacts to make sure it recognizes the name correctly.

Siri also relies on data from your interactions with it. This includes the audio of your request and a computer-generated transcription of it. Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that “trains” Siri to improve.

Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests — less than 0.2 percent — and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability. For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request?

Changes We’re Making
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

● First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
● Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
● Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
-----
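The random identifier Apple describes is essentially a pseudonym for the device: request data is keyed to a random value rather than to an Apple ID or phone number, and that link is broken after six months. Purely as an illustration, here is a minimal hypothetical sketch in Swift of what device-keyed logging with a disassociation step could look like. The type and function names are invented for this example and do not reflect Apple's actual implementation.

import Foundation

// Hypothetical sketch: request data keyed to a random, device-scoped
// identifier instead of an account identity, with the association
// dropped after roughly six months. Illustrative only.
struct RequestRecord {
    let deviceIdentifier: UUID   // random identifier, not an Apple ID or phone number
    let transcript: String
    let timestamp: Date
}

final class GradingStore {
    private var records: [RequestRecord] = []
    private let retentionInterval: TimeInterval = 60 * 60 * 24 * 30 * 6  // ~six months

    // Log a request keyed only to the device's random identifier.
    func log(transcript: String, for deviceIdentifier: UUID, at date: Date = Date()) {
        records.append(RequestRecord(deviceIdentifier: deviceIdentifier,
                                     transcript: transcript,
                                     timestamp: date))
    }

    // Disassociate data older than the retention window from its identifier.
    func disassociateExpired(asOf now: Date = Date()) {
        records = records.map { record in
            guard now.timeIntervalSince(record.timestamp) > retentionInterval else { return record }
            // Replace the identifier with a throwaway value so the data can
            // no longer be linked back to the originating device.
            return RequestRecord(deviceIdentifier: UUID(),
                                 transcript: record.transcript,
                                 timestamp: record.timestamp)
        }
    }
}

// Example usage
let store = GradingStore()
let deviceID = UUID()  // random identifier generated on the device
store.log(transcript: "what's the score of the game", for: deviceID)
store.disassociateExpired()

The point of such a scheme is that analysts grading requests see only a random token, and even that token stops pointing at a particular device once the retention window passes.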

Let us know what you think about Apple's response in the comments and please follow iClarified on Twitter, Facebook, or RSS for updates.

