Yesterday I came across several articles about facial recognition software failing on non-Caucasian faces. The articles are several years old at this point, but they got me thinking. The first is about a Nikon camera that “helpfully” asked “Did someone blink?” when photographing smiling East Asian people. The second is about an HP webcam failing to track a dark-skinned face. The third is about the Microsoft Kinect having trouble recognizing some dark-skinned gamers.

I am not familiar with the specifics of the technology behind these devices. However, I doubt anyone set out to create substandard software. Certainly, in my years as a software engineer, I have never seen anyone diabolically plotting to exclude whole classes of users. What I have seen is cluelessness and lack of empathy. Engineers often have trouble imagining using a system as someone other than themselves, whether that someone is less technical, from a different culture, or simply looks different.

In each of these cases, it seems very likely to me that the system uses some kind of basic AI, perhaps a neural network, and was trained on inadequate sample data. The testing was no doubt done with similarly limited data sets. It is quite possible that during the entire development process, it did not occur to a single person to question whether the software would work adequately for darker skin or different facial features.
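
To make that concrete, here is a tiny, entirely hypothetical sketch (toy data, with scikit-learn standing in for whatever model these devices actually use) of how a skewed training sample produces exactly this failure: the model scores well on the population it was trained and tested on, and falls apart on the one it barely saw.

```python
# Hypothetical sketch: a classifier trained almost entirely on one
# population looks accurate, until the underrepresented group is
# evaluated separately. Toy data; scikit-learn stands in for whatever
# model these devices really use.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_population(n, offset):
    """Toy stand-in for face features: the signal is the same kind of
    threshold in both groups, but its location differs (think pixel
    statistics shifting with skin tone or eye shape)."""
    X = rng.normal(loc=offset, scale=1.0, size=(n, 2))
    y = (X[:, 0] > offset).astype(int)  # e.g. "eyes open" vs "blink"
    return X, y

X_a, y_a = make_population(1000, offset=0.0)  # well-represented group
X_b, y_b = make_population(1000, offset=4.0)  # barely-represented group

# Training set: 950 samples from group A, only 10 from group B.
X_train = np.vstack([X_a[:950], X_b[:10]])
y_train = np.concatenate([y_a[:950], y_b[:10]])
model = LogisticRegression().fit(X_train, y_train)

# Testing on "similar data" (group A) makes everything look fine...
print("accuracy, group A:", model.score(X_a[950:], y_a[950:]))  # high
# ...the problem only appears when group B is checked on its own.
print("accuracy, group B:", model.score(X_b[10:], y_b[10:]))    # ~0.5
```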

People often ask me why I care about the lack of diversity in software development. I do care, deeply, and for me much of it comes from a sense of justice. It seems terribly unfair that people are discouraged from pursuing a field they are interested in and care about. Even if the inherent injustice does not bother you, though, here is a reason to care:

We are making worse software because there isn’t anyone to question the basic assumptions of predominantly young, white, male software developers.


I think that we computer people, as a whole, tend to be invested in rationality and logic to the point that it becomes difficult to recognize our own humanity. We are affected by social conditioning and cultural values just like anyone else. If we are going to progress toward the meritocracy we say we value, we need to become aware of our blind spots. Ultimately, I hope we will see a broader range of people in software development. In the meantime, we can try to be more thoughtful and create excellent software that works for ALL our users.

Image processing and darkness

Date: 2012-06-16 12:26 am (UTC)
From: [identity profile] https://www.google.com/accounts/o8/id?id=AItOawnaCCt0upjULlPjqkRIxBhb0CrkSu7aIQM
All your links are broken. Might want to fix that.

Consumer Reports tested the Kinect (http://news.consumerreports.org/electronics/2010/11/consumer-reports-tests-kinect-facial-recognition-problems-video.html) and didn't find what GameSpot reported.

Kinect has facial recognition issues in dark environments because (a) it's just a fancy camera and (b) image processing is hard, so recognition of anything is pretty finicky to start with. In sub-par lighting conditions all bets are off. A person with more contrast might have an ever so slightly better chance of getting recognized in a dark room? But that doesn't even matter, because it's out of the supported operational parameters of the device.
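
To put rough numbers on that intuition, here is a toy numpy sketch (a made-up sensor model and values, nothing Kinect-specific): dimming the scene shrinks the face-versus-background signal while sensor noise stays fixed, so everyone's recognition gets worse in the dark, and a lower-contrast subject hits the noise floor first.

```python
# Toy sketch (not the Kinect's actual pipeline): dim light shrinks the
# difference between "face" pixels and "background" pixels, while the
# sensor's read noise stays constant, so the usable signal collapses.
import numpy as np

rng = np.random.default_rng(0)

NOISE_STD = 2.0  # assumed fixed sensor read noise, in 8-bit counts

def face_vs_background_snr(face_albedo, bg_albedo, illumination):
    """Separation between 'face' and 'background' pixels, in noise units."""
    face = np.clip(face_albedo * illumination
                   + rng.normal(0, NOISE_STD, 10_000), 0, 255).round()
    bg = np.clip(bg_albedo * illumination
                 + rng.normal(0, NOISE_STD, 10_000), 0, 255).round()
    return abs(face.mean() - bg.mean()) / NOISE_STD

for light in (1.0, 0.3, 0.1):  # bright room .. dark room
    high = face_vs_background_snr(200, 80, light)  # high-contrast subject
    low = face_vs_background_snr(110, 80, light)   # low-contrast subject
    print(f"illumination {light:.1f}: SNR high-contrast={high:5.1f}, "
          f"low-contrast={low:5.1f}")
```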

Re: usability design

Date: 2012-06-19 06:50 pm (UTC)
From: (Anonymous)
"Engineers often have trouble imagining using a system as someone other than themselves..."

Definitely. The creator of something always understands it best, while it may not be intuitive to others. This is why usability testing by non-developers (better yet, by people from the target demographic) is important; I get my family to test a lot of my web UI stuff.
