Welcome to part three of our series on facial recognition technology (FRT). Last week’s post discussed a few controversial uses of FRT as part of China’s Sharp Eyes surveillance network, where the technology is being used to more effectively repress ethnic and religious minorities. This week we’ll return to the United States and the debate around its use by law enforcement agencies.

Facial recognition is already widely used by law enforcement agencies throughout the country. Federal agencies like the FBI rely on it: the Bureau's Next Generation Identification system uses FRT to compare images and video footage against the federal mugshot database. Immigration and Customs Enforcement (ICE) uses FRT for surveillance, as does the Drug Enforcement Administration (DEA). The US Secret Service recently began incorporating it into the CCTV system surrounding the White House.

Even smaller state and local law enforcement agencies have experimented with FRT. The first court conviction involving facial recognition happened way back in 2014, based on a positive identification by the Chicago Police Department. The Orlando Police Department just announced a second pilot of the technology for this year.

So what’s the big deal?

As Microsoft President Brad Smith put it early last month:

For the first time, the world is on the threshold of technology that would give a government the ability to follow anyone anywhere, and everyone everywhere. It could know exactly where you are going, where you have been, and where you were yesterday as well. And this has profound potential ramifications for even just the fundamental civil liberties on which democratic societies rely.

Now, this assumes a much more advanced and accurate form of the technology than currently exists, something even more advanced than what China has in place. Smith assumes we have at least another five years before it becomes feasible.

But even the imperfect FRT we currently have poses its own problems. As we have discussed in previous posts, FRT often exhibits a significant racial bias. It can be used to glean sensitive personal information, like medical information or sexual preference. It is also significantly less accurate than other forms of biometric identification.

Flashpoint: Amazon’s Rekognition

Amazon’s facial recognition solution, Rekognition, has been getting a ton of press since it was revealed that the company has been marketing it specifically to US law enforcement agencies. The company has seen pushback from the public, shareholders, and US government officials. In May, a spokesperson for Amazon likened FRT to a computer: while the possibility exists for its misuse, that does not mean the technology is inherently bad. “Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology.”

In the opinion of this writer, this argument is both a fair point worth discussing and an attempt to sidestep the underlying critique of the technology. The argument that a tool is just a tool, the “guns don’t kill people” argument, doesn’t hold up to deeper scrutiny. As a society we regulate all sorts of tools. Some have restrictions on their purchase and use, like guns. Some are forbidden entirely, like eugenics. As a society, we’re OK with police having guns, but we restrict their use (sometimes not so effectively). It might raise some eyebrows, however, if we gave the police landmines. It’s not anti-police or anti-law enforcement to question their methods. You could be, as Amazon has claimed it is, “unwaveringly in support of our law enforcement, defense, and intelligence community” and still ask, “but why do the police need landmines?”

So are Rekognition and similar FRT surveillance solutions more like computers or landmines or some other tool?

Let’s look at what they do, very basically. They gather data on who is where, and when. FRT could be thought of as more of an upgrade to existing surveillance technology than anything else, since the US already employs extensive surveillance technology: wiretaps and bugs, cell phone and laptop tracking, spyware, security cameras, and so on. The major functional difference is that with FRT, the police could program a computer to track an individual, or theoretically all individuals, anywhere with camera surveillance. It would save a lot of legwork. It also potentially feeds existing biases, primarily against those who have had prior contact with the police.
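
To make the “who is where, when” point concrete, here is a minimal sketch of what that kind of lookup looks like, using Amazon’s own Rekognition API since it is the subject of this post. The watchlist collection, camera ID, and frame file below are hypothetical placeholders, and a real deployment would be far more involved; this is only meant to illustrate the basic capability.

```python
# A minimal sketch, not a real deployment: check a single camera frame against
# a pre-built watchlist of faces using Amazon Rekognition's one-to-many search.
# The collection name, camera ID, and frame file are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition")

def match_frame_against_watchlist(frame_path, camera_id,
                                  collection_id="hypothetical-watchlist"):
    """Return any watchlist faces resembling the largest face in the frame."""
    with open(frame_path, "rb") as f:
        frame_bytes = f.read()

    # search_faces_by_image finds the most prominent face in the supplied image
    # and searches the named collection for faces that look similar to it.
    response = rekognition.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": frame_bytes},
        FaceMatchThreshold=80,  # similarity cutoff, in percent
        MaxFaces=5,
    )

    for match in response["FaceMatches"]:
        face = match["Face"]
        print(f"Camera {camera_id}: possible match {face.get('ExternalImageId')} "
              f"(similarity {match['Similarity']:.1f}%)")
    return response["FaceMatches"]

# Hypothetical usage: one frame from one camera feed.
match_frame_against_watchlist("frame_001.jpg", camera_id="lobby-cam-3")
```

Run something like this over every frame from every networked camera, and you have, in effect, an automated log of who was seen where and when.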

This bias-reinforcement happens with fingerprint analysis already. Let’s say police find a fingerprint at or near a crime scene. If that print is partial, which is common, it won’t match any one person but potentially thousands. When police run it through their database, it will match first and foremost the people who are already in that database. Again, that print could match thousands of people, but the search will only ever surface those who have already had run-ins with the police. Convenient? Yes. Biased? Yes.
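
A toy illustration of that dynamic, with entirely made-up numbers, shows why the candidate pool itself is the problem: a search of a prior-contact database can only ever return people who are already in it, no matter how many other people the partial print is actually consistent with.

```python
# Toy illustration with made-up numbers: a partial print consistent with
# thousands of people can only ever "hit" on people already in the database.
import random

random.seed(0)

population = range(1_000_000)                                # everyone in the city
prior_contact_db = set(random.sample(population, 50_000))    # people with prior police contact

# Suppose the partial print is genuinely consistent with 5,000 people,
# drawn from the population at large.
consistent_with_print = set(random.sample(population, 5_000))

# The database search can only surface candidates who are already on file.
candidates_returned = consistent_with_print & prior_contact_db

print(f"People consistent with the print: {len(consistent_with_print):,}")
print(f"Candidates the search can return: {len(candidates_returned):,}")
# Every candidate handed to investigators already has a record, even though
# roughly 95% of the people the print could belong to do not.
```

Swap the fingerprint for a face and the fingerprint database for a mugshot collection, and the same dynamic applies to FRT.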

The extent to which law enforcement agencies use Rekognition is currently unknown. It is public knowledge that Amazon has sought out the business of the Department of Homeland Security, including ICE. It is also public knowledge that ICE recently contracted, for USD 28,000, to have FRT surveillance cameras concealed in streetlights, which the agency can do without any oversight from the public or the courts. It is not known where those cameras are being deployed or for what purpose.

Christie Crawford, owner of Cowboy Streetlight Concealments and the recipient of the ICE contract, told Quartz in November that “things are always being watched. It doesn’t matter if you’re driving down the street or visiting a friend; if government or law enforcement has a reason to set up surveillance, there’s a great technology out there to do it.”

In response to the controversy surrounding Rekognition, eight US Members of Congress sent a public letter to Amazon questioning and implicitly criticizing the company’s approach. Signatories included famous civil rights advocate John Lewis (D-GA) and Jan Schakowsky (D-IL), whose district includes Keyo’s Chicago offices.

The letter expresses concerns about potential racial bias and accuracy, and warns that Rekognition’s use would “stifle Americans’ willingness to exercise their First Amendment Rights in public”. According to the letter, Amazon has not responded to previous Congressional inquiries about third-party verification, the lack of training Amazon provides in the software’s use, which government bodies currently use the software, or the frequency of internal tests and audits. It also reprimands Amazon for soliciting business directly from ICE, the DEA, and other agencies, sidestepping Congressional oversight.

Assuming Amazon is correct and FRT is a tool like any other, as a society we need to consider what the proper uses of such a tool should be. Will restrictions on FRT’s use come through legislation, norms, or industry standards? What exceptions, if any, should be made for law enforcement agencies? We should do this now, because the technology is advancing faster than legislation, or even public opinion, can keep up, and because, as Brad Smith wrote earlier this year, “Facial Recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression.”

Next week’s post will discuss the current limits on the technology, from limits inherent in the technology itself, to US state and federal legislation, to external legislation like the GDPR. It will also cover the potential legal problems FRT faces across multiple use cases, not just surveillance. Much of the debate over the legality of biometrics in general, and of FRT in particular, rests on questions of consent. This series has only briefly touched on consent so far; we’ve been saving our favorite subject for last.
