Report: NYPD Used Celebrity Images, Including Woody Harrelson, In Facial Recognition Dragnets
May 17, 2019, 5:40 p.m.
A report revealed that NYPD detectives used an image of Woody Harrelson to help catch a serial beer thief.

Actor Woody Harrelson, whose photo was used by the NYPD in facial recognition software.
A report released yesterday by Georgetown Law's Center on Privacy & Technology revealed unorthodox facial recognition techniques that the NYPD and other law enforcement agencies are using: specifically, running celebrity photos through facial recognition software to try to identify potential suspects.
In the report, author Clare Garvie highlights an incident from April 2017, when New York City detectives were looking for a serial beer thief who had allegedly been stealing brews from CVS. The pharmacy's surveillance cameras caught the thefts on video, but the image of the thief was too pixelated for the NYPD's Facial Identification Section (FIS) to get a match. The detectives were stumped, until one of them noticed that the suspect resembled actor Woody Harrelson. So they pulled photos of Harrelson from the web, ran one through the system, and once they got a "match," sent it to investigators, who used it to identify a suspect and arrest him for petit larceny.
Police departments around the United States, the NYPD included, have increasingly used facial recognition in recent years to try to identify crime suspects. The department typically starts with a photo of a suspect, which it runs through its system. But, as Garvie told WNYC, the researchers found that because many of these photos are low-quality, agencies are turning to celebrity photos, heavily edited photos, or photos of someone they believe resembles the suspect.
Detractors say the use of celebrity photos carries unintended risks. "It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes," Garvie states in her report, noting that on another occasion, the FIS used the photograph of a Knicks player to plumb its facial recognition database for someone wanted for a Brooklyn assault. "It's quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match."
NYPD spokesperson Sophia Mason said in a statement that "facial recognition is merely a lead; it is not a positive identification and it is not probable cause to arrest," adding, "No one has ever been arrested on the basis of a facial recognition match alone. As with any lead, further investigation is always needed to develop probable cause to arrest."
The department says it doesn't "engage in mass or random collection of facial records from NYPD camera systems, the internet, or social media," instead starting with a single image that is compared against others to generate leads that can help identify culprits (as in the recent arrest of a man for allegedly throwing urine at MTA conductors).
These facial recognition methods produce only possible identifications, not positive ones, and there is no standardized process by which departments find evidence to corroborate potential matches, which widens the margin of error.
In New York, legislators are currently attempting to curb landlords' use of facial recognition technology at private residences, a measure that stemmed from rent-stabilized tenants protesting their landlord's use of it at their complex. As facial recognition technology advances at an astonishing clip, and its potential for surveillance and tracking becomes more worrisome, lawmakers in some cities are facing pressure to limit its use. Earlier this week, San Francisco became the first city to ban local agencies from using facial recognition software.