HP’s social media strategist, Tony “Frosty” Welch, aka @frostola on Twitter, has been getting mostly high marks for his handling of a YouTube video that dramatizes how HP’s facial recognition software tracks white faces a whole lot better than black faces. We’re not so sure, however, how well Frosty is really doing.
The vid that sparked the controversy is all over the web by now. (It’s also here, of course, below.) It was made by two people—an African-American man named Desi and a white woman named Wanda—who appear to be co-workers in a computer store. It’s hilarious and avoids any mean-spirited charges about HP’s intentions, but Desi concludes the show by asserting that “HP computers are racist.” He has a right to complain. He bought one for Christmas and then discovered it wouldn’t recognize its owner’s face.
The video was first posted on December 9th and basically languished for a couple of weeks before becoming a modest viral hit sometime around December 20th. Two days after that, it had racked up 370,000+ views and a couple thousand comments on YouTube.
HP responded on the 21st, and that quick turnaround was absolutely the right move in the real-time world of social media. The forum for their response was a section of the corporate blog, The Next Bench. Also a good choice. But the content of their response was worse than lame to our (admittedly content-sensitive) sensibilities. Instead of addressing the implications of the video directly, Frosty led with a weirdly side-stepping headline: “Customer Feedback is Important to Us.” Not exactly transparent, open and honest; not exactly facing the music.
The tone of the post beneath the lame headline was technocratic and defensive. In the very first paragraph, Frosty enthused about what a great company he works for. “On any given day,” he gushed, “I might collaborate with HP employees in regions ranging from Japan to India and Latin America to Europe.” Wonder why they left Africa out of Frosty’s tour of the continents? You could feel HP biting its corporate tongue to keep from telling us that some of their best friends (and employees) are black.
Following this bout of racial tin-earism, Frosty went on to speculate that the software’s failure to track dark faces was due to bad lighting or faulty algorithms or both… or something else. He promised to work on it.
That same day, however, before HP could get to work on it, the news editor of Laptop Magazine, K. T. Bradford (@ktbradford), posted a piece charitably titled “HP Face Tracking Software Not Racist, Just Contrast Challenged.” Using multiple videos and solid logic, Bradford demonstrated that the problem lies largely in the Backlight Correction settings of HP’s software. (Frosty deemed this “interesting” in a later Twitter exchange with Bradford.) More to the point, Bradford pinpointed the ways in which HP’s software problems actually do involve a degree of institutional racism. As Bradford put it:
“Though it’s obvious that the bias isn’t racial in nature but instead based on lighting conditions and the camera’s ability to distinguish between shadow and human face, this does bring up an interesting question: when testing this software, how many of the involved project members were dark-skinned? How many different lighting conditions were tested? The software was likely developed by a third party, not HP, so this is probably an issue on a number of webcams across several manufacturers. Perhaps this incident is an indication that the software makers should involve a greater pool of testers when designing these features.”
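To make Bradford’s “contrast challenged” diagnosis a little more concrete, here is a minimal sketch, in Python with OpenCV, of how a contrast-correction pass can decide whether a detector finds a face at all. This is an illustration of the general technique, not HP’s actual software; the filename “frame.jpg” and the stock Haar cascade are stand-ins we are assuming purely for the example.

```python
# A minimal sketch (not HP's code) of why a low-contrast, backlit face can
# defeat a face detector, and why a contrast-correction pass can rescue it.
# Assumes OpenCV is installed and a webcam frame has been saved as "frame.jpg".
import cv2

# OpenCV ships a stock frontal-face Haar cascade we can borrow for the demo.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detection on the raw frame: if the face is underexposed or backlit, the
# cascade's contrast-based features may never fire.
faces_raw = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Histogram equalization is a crude stand-in for a "backlight correction"
# setting: it stretches the tonal range, often restoring enough local
# contrast for the detector to lock onto the same face.
corrected = cv2.equalizeHist(gray)
faces_corrected = cascade.detectMultiScale(corrected, scaleFactor=1.1, minNeighbors=5)

print(f"faces found without correction: {len(faces_raw)}")
print(f"faces found with correction:    {len(faces_corrected)}")
```

The point this toy example shares with Bradford’s tests: a dark or backlit face often goes untracked not because of the face itself but because the raw frame hands the detector too little contrast to work with, and a correction setting can restore it.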
Peeling back Bradford’s charity toward HP, I think we’d all agree that this problem would never have existed if there were more blacks in HP senior management or middle management or on the development or QA teams that worked on the software. So the fact that HP’s software has special problems with black faces is, in fact, a reflection of a racial imbalance. If HP were a minority-owned company, I do not believe this problem would exist. (If you disagree, you’re probably white like me. So imagine for a moment a piece of facial recognition software that has extreme difficulty recognizing white faces. Hard to imagine, isn’t it?)
We understand that HP isn’t alone in having this kind of built-in bias problem. But life will be better for HP if it learns to acknowledge problems head-on and deal with them. The correct social-media-thing-to-do is (A) admit the problem straightforwardly, and then (B) do something about it.
HP initially failed at Part A. We’re hoping it is, nevertheless, applying itself diligently to Part B.
Meanwhile, we’d advise Frosty to be more direct if he finds himself in similar predicaments in the future. (We’re sure he understands that some of the Twitter followers who were praising his approach are, like him, HP employees.)