Tuesday, December 19, 2017

iPhone X cannot identify minority faces?

Apple's iPhone X has "Face ID," a face recognition system that secures the lock screen: the user can unlock the phone with a glance at the front camera. It replaces the fingerprint authentication built into previous iPhones. Since the iPhone X has no Home button, Apple introduced this solution instead.

However, it occasionally fails. As the media report, misidentification occurs in several situations.

In one case, a Chinese woman found that her colleague could unlock her iPhone X with the colleague's own face. A blogger framed this case as "racial bias," and it ignited controversy.

International Business Times: Is the iPhone facial recognition feature racist?

In my view, Apple has no intention of discriminating against Asians or anyone else. Objectively, some Asian people, including Chinese and Japanese, have a lower nose and relatively flat facial features, and this may be associated with a higher misrecognition rate in Face ID. Indeed, some foreigners say they can hardly tell a particular Japanese person apart from others.
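
Apple has not published how Face ID works internally, but a typical face-verification system maps each face image to an embedding vector and unlocks only when the similarity to the enrolled user's embedding exceeds a threshold. The Python sketch below is a conceptual illustration under that assumption; the embedding values and the threshold are invented for the example, not Apple's. It shows how two people whose features land close together in the embedding space can produce a false accept like the one reported.

```python
import numpy as np

# Conceptual sketch only: Face ID internals are not public.
# A generic face-verification pipeline compares an embedding of the
# probe face against the enrolled user's embedding.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def unlocks(enrolled: np.ndarray, probe: np.ndarray, threshold: float = 0.8) -> bool:
    """Return True if the probe face is judged to be the enrolled user."""
    return cosine_similarity(enrolled, probe) >= threshold

# Hypothetical embeddings: the owner, and a look-alike colleague whose
# embedding happens to sit very close to the owner's in feature space.
owner = np.array([0.9, 0.1, 0.3, 0.7])
colleague = np.array([0.85, 0.15, 0.35, 0.65])

print(unlocks(owner, colleague))  # True -> a false accept, as in the reported case
```

Tightening the threshold would reduce false accepts like this one, but at the cost of rejecting the real owner more often, which is the basic trade-off any face-unlock system has to balance.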

Therefore, I think the criticism that Apple engaged in racial discrimination is not valid.

But there is another issue around this case. I remember that Google's image recognition AI misidentified a black person as a gorilla. That problem cannot be fixed simply by improving the quality of facial recognition: labeling a human as a gorilla is unacceptable regardless of their appearance, and the AI lacks that kind of common sense.

My past entry: Microsoft made a Hitler-lover AI, mistakenly?

The current issue is less ethically loaded than the gorilla misidentification. Nonetheless, we will face similar situations. For example, suppose a brother of the user tries to trick the iPhone X into unlocking. Will the AI let the brother open the lock, or will it sense "something bad" about the cheater? Try it.
