SAN FRANCISCO — Google has apologized after its new Photos
application identified black people as "gorillas."
On Sunday, Brooklyn programmer Jacky Alciné tweeted a
screenshot of photos he had uploaded in which the app had labeled him and a
friend, both African American, as "gorillas."
Image recognition software is still a nascent technology, but
its use is spreading quickly. Google launched its Photos app at Google I/O in
May, touting machine-learning smarts that let it recognize people, places and
events on its own.
Yonatan Zunger, an engineer and the company's chief architect
of Google+, responded swiftly to Alciné on Twitter: "This is 100% Not
OK." He promised that Google's Photos team was working on a fix.
"And it's only photos I have with her it's doing this with (results truncated b/c personal): pic.twitter.com/h7MTXd3wgo" — Crémas CHAMPION (@jackyalcine) June 29, 2015
The first fix was not effective, so Google ultimately decided
not to give any photos a "gorilla" tag. Zunger said that Google
is working on "longer-term fixes," including "better recognition
of dark skinned faces."
"@jackyalcine Thank you for telling us so quickly! Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::" — Yonatan Zunger (@yonatanzunger) June 29, 2015
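The stopgap Google chose, suppressing the tag outright rather than retraining the model, can be sketched in a few lines of Python. This is a minimal illustration only; the blocklist, function name, and classifier output format are hypothetical stand-ins, not Google's actual code.

# Hypothetical sketch of a label-suppression stopgap (not Google's code).
# Labels the pipeline will never emit, regardless of classifier confidence.
BLOCKED_LABELS = {"gorilla"}

def filter_labels(predictions):
    """Drop blocked labels from a classifier's (label, confidence) pairs."""
    return [(label, conf) for label, conf in predictions
            if label.lower() not in BLOCKED_LABELS]

# Example with made-up classifier output:
raw = [("person", 0.62), ("gorilla", 0.21), ("outdoors", 0.17)]
print(filter_labels(raw))  # [('person', 0.62), ('outdoors', 0.17)]

The tradeoff is visible in the sketch: the offensive result can never appear, but neither can a correct tag on an actual gorilla photo, which is why Zunger framed it as distinct from the "longer-term fixes."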
In a statement, Google spokeswoman Katie Watson said:
"We're appalled and genuinely sorry that this happened. We are taking
immediate action to prevent this type of result from appearing. There is still
clearly a lot of work to do with automatic image labeling, and we're looking at
how we can prevent these types of mistakes from happening in the future."
Alciné responded on Twitter: "I understand HOW this
happens; the problem is moreso on the WHY."
The gaffes point to the chronic lack of diversity in Silicon
Valley technology companies, writes Charles Pulliam-Moore, a reporter for the
media outlet Fusion.
"It's hardly the first time that we've seen software
show an implicit bias against people of color," he wrote.
Last month, Flickr also rolled out new technology to help tag
photos. On two occasions it identified a black man and a white woman as apes.
"The mistakes are made because algorithms, smart as
they are, are terrible at making actual sense of pictures they analyze. Instead
of "seeing" a face, algorithms identify shapes, colors, and patterns
to make educated guesses as to what the picture might actually be. This works
wonderfully for inanimate objects or iconic things like landmarks, but it's
proven to be a sticking point for people of color time and time again."
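To make Pulliam-Moore's point about "educated guesses" concrete: classifiers of this kind typically score every candidate label for an image and normalize those scores into probabilities, then report the most likely ones. The sketch below uses made-up labels and scores; nothing here comes from a real model.

import numpy as np

def softmax(scores):
    """Convert raw classifier scores into probabilities that sum to 1."""
    exp = np.exp(scores - np.max(scores))  # subtract max for numerical stability
    return exp / exp.sum()

# Hypothetical raw scores ("logits") for one photo.
labels = ["person", "gorilla", "graduation", "skyscraper"]
logits = np.array([2.1, 1.8, 0.4, -1.0])

probs = softmax(logits)
for label, p in sorted(zip(labels, probs), key=lambda x: -x[1]):
    print(f"{label}: {p:.2f}")
# The top two guesses sit close together (person 0.51, gorilla 0.38),
# which is how a pattern-matching model can land on a badly wrong label.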
At Google, seven out of 10 employees are men. Most employees
are white (60%) or Asian (31%). Latinos make up just 3% of the workforce and
African Americans just 2% — a far cry from fulfilling the mission of Google
founders Larry Page and Sergey Brin to have their company reflect the racial
and ethnic diversity of its users in the USA and around the world.
"Perhaps if the titans of Silicon Valley hired more
engineers of color, things like this wouldn't happen so often,"
Pulliam-Moore wrote "Or, you know, ever."
Joelle Emerson, founder and CEO of Paradigm, a strategy firm
that consults with tech companies on diversity and inclusion, says the incident
should be a wake-up call for Silicon Valley.
"How much more evidence do we need that the lack of
diversity in tech companies has a real, and sometimes very serious, impact on
how products are designed and developed?" Emerson said. "Every single
tech leader should read this and worry. And after that, they should go have a
meeting to figure out what they're going to do to make sure nothing like this
ever happens again."