
In recent months, pressure has been mounting on major tech firms to develop strong policies regarding facial recognition. Microsoft has helped lead the way on that front, promising to put in place stricter policies, calling for greater regulation and asking fellow companies to follow suit.
Hidden toward the end of a blog post about using artificial intelligence to benefit health clinics in Asia, Google SVP Kent Walker affirmed the company’s commitment not to sell facial recognition APIs. The executive cites concerns over how the technology could be abused.
“[F]acial recognition technology has benefits in areas like new assistive technologies and tools to help find missing persons, with more promising applications on the horizon,” Walker writes. “However, like many technologies with multiple uses, facial recognition merits careful consideration to ensure its use is aligned with our principles and values, and avoids abuse and harmful outcomes. We continue to work with many organizations to identify and address these challenges, and unlike some other companies, Google Cloud has chosen not to offer general-purpose facial recognition APIs before working through important technology and policy questions.”
In an interview this week, CEO Sundar Pichai addressed similar growing concerns around AI ethics. “I think tech has to realize it just can’t build it and then fix it,” he told The Washington Post. “I think that doesn’t work,” he said, adding that artificial intelligence could ultimately prove “far more dangerous than nukes.”
The ACLU, which has offered sharp criticism over privacy and racial profiling concerns, lauded the statement. In the next paragraph, however, the organization promised to continue applying pressure on these large companies.
“We will continue to put Google’s feet to the fire to make sure it doesn’t build or sell a face surveillance product that violates civil and human rights,” ACLU tech director Nicole Ozer said in a statement. “We also renew our call on Amazon and Microsoft to not provide dangerous face surveillance to the government. Companies have a responsibility to make sure their products can’t be used to attack communities and harm civil rights and liberties — it’s past time all companies own up to that responsibility.”
The organization has offered particularly sharp criticism of Amazon for its Rekognition software. This week, it also called out the company’s patent application for a smart doorbell that uses facial recognition to identify “suspicious” visitors.