Apple’s voice-controlled mobile assistant, Siri, has frequently come under fire from critics who see bias in how it “serves” you with its soft, indulgent, and possibly sexist female voice. Forbes argues that, at its core, Siri is unapologetically sexist and fails to consider women. Here’s an excerpt:

[quote]The results when I asked for an abortion were even worse; even though that same Planned Parenthood performs abortions, Siri claimed it had no knowledge of any abortion clinics in the area. Other women running similar trials have had the same problem, if not worse. In some cases, Siri suggests crisis pregnancy centers when you mention the word “abortion”, even though CPCs not only don’t provide abortions, but are established solely to lure unsuspecting women in and bully them out of the choice to abort.[/quote]
This isn’t what I’d call an open-minded critique, nor is it particularly well-researched. The author hinges much of her argument on Siri’s voice being female without doing the due diligence to explain why Siri in the UK is male (or does she simply postulate that the entirety of the UK is sexually liberated?). And some of her arguments range from “forced” to “downright weird” (such as “But no matter how many ways I arranged mouth-based words — such as ‘lick’ or ‘eat’ — with an alternate name for a cat, Siri was confused and kept coming up with a name of a friend in contacts”). Larger (and more adult) arguments, such as Siri’s inability to locate birth control resources, have been debated to death, and wild assumptions about Apple’s overarching pro-life/pro-choice stance abound. We (and many others) have addressed this controversy several times in previous articles and won’t rehash those stories here. Let’s just say that Apple has been clear that the problem was unintentional and that the software is still in beta — and the beta period is for questions and feedback, not accusations.
On the other hand, if the stumblings of this beta assistant were evenly distributed across political and philosophical perspectives, it would be easier to believe that Siri’s blind spots are purely a product of its “work in progress” status. If, however, you can type “I need a blow job” and Siri replies with a list of nearby escort services, it becomes clear that Apple has prioritized certain kinds of assistance over others. The author may well discredit her own allegations of sexism with bias and poor logic, but the Forbes article still underlines the old saying that even a broken clock is right twice a day. Put simply: it’s possible to write a bad article about a real problem.