
Amazon’s video doorbell: 1984 at the front door?

By Matt Field | January 3, 2019

Amazon's Seattle properties include The Spheres. Credit: Biodin CC BY-SA 4.0 via Wikimedia Commons.

Civil libertarians are raising the alarm over a recent Amazon patent application describing a system that can compare images from video-capable doorbells against databases of people who’ve allegedly committed crimes. They fear the system will drive racial profiling and automatically alert the police when it registers suspicious people at the door.

Activists have criticized Amazon for pitching its facial recognition tool, known as Rekognition, to law enforcement agencies and to US Immigration and Customs Enforcement (ICE). Now Amazon may be weighing whether to combine that tool, or a similar one, with video doorbells, devices that pair a doorbell with a security camera so people inside a home or apartment can see, via their smartphones, who’s at the door. Amazon recently purchased Ring, a firm that makes video doorbells. The American Civil Liberties Union (ACLU) claims the facial recognition system described in the patent application could flag people to law enforcement for engaging in everyday activities like knocking on a door.

“Simply walking up to a friend’s house could result in your face, your fingerprint, or your voice being flagged as ‘suspicious’ and delivered to a government database without your knowledge or consent,” ACLU lawyer Jacob Snow writes.

The patent application doesn’t mention Rekognition by name, but if that’s the tool the tech giant pairs with its doorbells, activists could be justified in their concerns. The ACLU ran a test on Rekognition last year that compared pictures of members of the US Congress against a database of arrest photos. The test produced 28 false matches, and a disproportionate share of them involved minority members. Rep. John Lewis, the famed civil rights leader, for instance, was falsely matched to an arrest photo. Amazon says the ACLU didn’t use the settings it recommends for law enforcement searches.

More broadly, researchers and members of Congress have raised concerns about inherent biases in facial recognition software. Georgetown Law researchers, for instance, found that black Americans are more likely to be flagged by law enforcement facial recognition systems because they are already disproportionately represented in law enforcement databases. A researcher at MIT found that gender-classification systems produced by Microsoft, IBM, and Megvii are far more accurate on white faces than on those of minorities, especially darker-skinned women.

Snow, the ACLU lawyer, writes, “The history of discriminatory government surveillance makes clear that face surveillance will disproportionately harm people already targeted by the government and subjected to racial profiling and abuse—immigrants, people of color, and the formerly incarcerated.”

Amazon’s patent application describes a potential system that could create a composite facial image based on images from multiple cameras. Those composite images could be compared against a database containing images of alleged criminals. Based on the results, users could receive an alert and contact the police. Alternatively, the system could automatically notify law enforcement when a door-knocker is “matched” to someone in the criminal database.

The application describes a number of ways the potential system could operate. One scenario seems to describe a network of cameras in neighborhoods spying on people in the area and, if they register as “suspicious,” reporting them to the police:

“Once it is determined that the person is a suspicious person, an alert may be automatically sent to law enforcement. For example, the partial facial images may be captured at several houses within a neighborhood in response to motion events at each of the houses,” the application states.

A composite image could then be “compared against databases of suspicious persons.” Law enforcement officials could then “make their own determination of whether action is required, such as dispatching police to the neighborhood.”
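In rough terms, the workflow the application describes is: gather partial face captures from multiple cameras, merge them into a composite, compare that composite against a watchlist, and fire an alert on a match. The snippet below is a minimal, purely illustrative sketch of that logic in Python; every name in it (build_composite, check_visitor, the 0.9 threshold, the toy feature vectors) is an assumption for illustration, not anything taken from the patent, from Rekognition, or from any Amazon product.

from statistics import mean

MATCH_THRESHOLD = 0.9  # assumed similarity cutoff; not a value from the patent

def build_composite(partial_embeddings):
    # Average the feature vectors captured by several doorbell cameras
    # into a single composite representation of the visitor's face.
    return [mean(component) for component in zip(*partial_embeddings)]

def similarity(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def check_visitor(partial_embeddings, watchlist):
    # Compare the composite against each watchlist entry; a score above
    # the threshold is treated as a "match" that could trigger an alert.
    composite = build_composite(partial_embeddings)
    for person_id, reference in watchlist.items():
        if similarity(composite, reference) >= MATCH_THRESHOLD:
            return person_id
    return None

# Toy example: two cameras each capture a partial reading of the same visitor.
watchlist = {"person-123": [0.1, 0.8, 0.3]}
captures = [[0.11, 0.79, 0.31], [0.09, 0.82, 0.29]]
print(check_visitor(captures, watchlist))  # prints: person-123

The point of the sketch is simply to show where the contested decisions live: in who controls the contents of the watchlist, and in the threshold that turns a visitor into a "suspicious person."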

Snow paints a grim picture of the expansive surveillance web he believes Amazon is developing.

“With Amazon selling the devices, operating the servers, and pushing the technology on law enforcement, the company is building all the pieces of a surveillance network, reaching from the government all the way to our front doors,” he writes.
