Apple iPhone X couldn’t distinguish between Chinese faces

Last week, in Nanjing, a major city in the east of China, a woman by the name of Yan was twice offered a refund by Apple for her faulty iPhone X, which her colleague was able to unlock using its facial recognition feature. Both women are ethnically Chinese.

Yan told local news that the first time this happened, she called the iPhone hotline, but the staff there did not believe her. It was not until she and her colleague went to a local Apple store and demonstrated the problem that staff offered her a refund. She bought a new phone, thinking that perhaps a faulty camera was to blame.

Customers line up to enter the second Apple store in Nanjing, Jiangsu Province of China. (Photo by VCG/VCG via Getty Images)

But the second phone had the same problem, suggesting that the fault lay not with the camera, as the store's staff had assumed, but with the facial recognition software itself.

This is not the first case in which facial recognition software, and the AI behind it, has had trouble recognizing non-white faces.

In 2015, Google Photos mistakenly tagged a photo of two African-Americans as gorillas, while in 2009, HP computers had trouble recognizing and tracking black faces, but no problem with white faces. That same year, Nikon’s camera software was caught mislabeling an Asian face as blinking.

“This is fundamentally a data problem,” wrote Kate Crawford, a principal researcher at Microsoft and co-chair of the Obama White House’s Symposium on Society and A.I. “Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images. If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces.”
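Crawford’s explanation can be illustrated with a small, purely hypothetical sketch in Python, using synthetic data and scikit-learn rather than anything resembling Apple’s actual Face ID system. When one group supplies 95 percent of the training examples, a simple classifier learns the cues that work for that group and performs markedly worse on the under-represented one:

    # Hypothetical sketch with synthetic data (not Apple's system): a classifier
    # trained on a dataset dominated by one group learns cues that work for that
    # group and generalises poorly to the under-represented group.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n, informative_feature):
        # Synthetic two-feature "images": the label is carried by a different
        # feature in each group, standing in for group-specific facial cues.
        y = rng.integers(0, 2, n)
        X = rng.normal(0.0, 1.0, size=(n, 2))
        X[:, informative_feature] += 2.0 * y   # shift the informative feature by class
        return X, y

    # Training data: 95% from group A, only 5% from group B
    Xa, ya = make_group(1900, informative_feature=0)
    Xb, yb = make_group(100, informative_feature=1)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

    # Balanced held-out sets: accuracy is much higher for the well-represented group
    Xa_test, ya_test = make_group(1000, informative_feature=0)
    Xb_test, yb_test = make_group(1000, informative_feature=1)
    print("group A (well represented)  accuracy:", round(model.score(Xa_test, ya_test), 3))
    print("group B (under-represented) accuracy:", round(model.score(Xb_test, yb_test), 3))

In this toy setup the two groups’ labels depend on different features; because group B contributes so few training examples, the learned model leans almost entirely on group A’s feature, and its accuracy on group B drops sharply.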

Jacky Alcine, the Brooklyn-based programmer whose photo was mislabeled by Google, agreed. Of his experience he said, “This could have been avoided with accurate and more complete classifying of black people.”

But the racism that is coded into AI, even if it is unintentional, has implications beyond just facial recognition.

A 2016 ProPublica investigation found that black defendants were twice as likely as white defendants to be mistakenly flagged as likely to reoffend, while the growing trend of “predictive policing” uses algorithms to forecast crime and direct police resources accordingly.

Yet minority communities have historically been over-policed, raising the possibility that “this software risks perpetuating an already vicious cycle,” says Crawford.

Back in Nanjing, Yan received a refund on her second iPhone X as well. From local news reports, it is unclear whether she then purchased a third.

What was at stake this time may have been just a single consumer’s phone. But Yan’s case is an example of the continued need for the technology industry to design with diversity and inclusivity in mind.

Source: yahoo.com
