
Leaked docs reveal Apple rewrote Siri answers for ‘sensitive topics’

Siri has new answers about feminism and #MeToo.


Mikael Thalen


Leaked documents reveal that Apple rewrote responses for its Siri voice assistant on topics deemed sensitive, including the #MeToo movement, The Guardian reports.


While Siri once replied that it would “blush if I could” when called a “slut,” a recent rewrite means the voice assistant will now simply say “I won’t respond to that.”

Siri has also been reprogrammed not to use the word “feminism” when asked about the topic but to instead use the term “equality.”

The documents further state that certain questions “can be deflected… however, care must be taken here to be neutral.”


“Siri should be guarded when dealing with potentially controversial content,” Apple’s internal guidelines say.

If Siri is asked directly whether it is a feminist or a supporter of gender equality, its responses center on treating everyone equally.

“I believe that all voices are created equal and worth equal respect,” one answer says.

In previous years, Siri would instead respond with statements such as “I just don’t get this whole gender thing.”


In a statement to The Guardian, Apple said the changes are designed to keep Siri’s responses factual rather than opinion-based.

“Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers,” the company said. “Our approach is to be factual with inclusive responses rather than offer opinions.”

The documents further stress that Siri is a “non-human” and “genderless” entity that does not have its own “point of view.”

“An artificial being should not impose its own principles, values, or opinions on a human,” the documents add.


The information about the changes to the voice assistant was provided by a former Siri grader tasked with reviewing the accuracy of Siri’s responses.

Apple has since shut down the grading program, after it was revealed last month that human contractors were listening in on Siri users.



H/T The Guardian

The Daily Dot