Albany Lawmakers Introduce Bill Banning Landlords From Using Facial Recognition Technology

May 15, 2019, 11:01 a.m.

The move to regulate facial recognition arises out of mounting privacy and civil liberty concerns, and the technology's emergence in New York City residential buildings.

Brooklyn tenants are fighting a plan by their landlord, Nelson Management, to install facial recognition technology at the entrances of their rent-regulated complex in Brownsville.

A pair of New York state legislators want to prohibit residential landlords from using facial recognition, a technology that is largely unregulated in the United States and which has come under scrutiny in recent months in New York City.

The bill, which was introduced Tuesday by State Senator Brad Hoylman and state Assemblywoman Latrice Walker, would authorize the Attorney General to seek injunctions and penalties of up to $10,000 against landlords who use facial recognition systems on their premises. It would also allow tenants to pursue civil lawsuits against such landlords.

The latest move to regulate facial recognition in New York arose specifically from efforts by a group of rent-stabilized tenants in Brownsville protesting their landlord for trying to install facial recognition technology at the entrances of their complex known as Atlantic Towers. Walker's district includes parts of Brownsville.

"It was the tipping point for me because facial recognition presents a major privacy concern," said Hoylman, who represents large swaths of Manhattan, including the Upper West Side, Midtown, and the downtown. "It gives the landlords the ability to track every entry and exit of tenants and their guests. Fundamentally, tenants should have the right to freely go to and from their homes."

Earlier this month, more than 130 of the tenants in Brownsville filed a complaint with the state against their landlord, Nelson Management. The residents, most of whom are black, expressed concerns about how the data would be stored and used as well as how reliable it would be. Two studies have shown that the tool is inaccurate when used on people of color. And civil liberties groups have charged that facial recognition systems have been disproportionately used against minorities and immigrants.

In China, facial recognition technology is in widespread use, with increasingly Orwellian intent. Most recently, the country has been criticized by human rights advocates for using advanced facial recognition technology to track and control Uighurs, a largely Muslim ethnic minority whose members have been detained by Chinese authorities.

Because Atlantic Towers is rent-stabilized, the landlord was required to seek approval from the state's Homes and Community Renewal agency before installing the facial recognition system, since it constituted a change in tenant services. The state has yet to make a decision.

Few states have laws governing the collection of biometric data and none do when it comes to residential spaces, according to Samar Katnani, the attorney at Brooklyn Legal Services who is representing the Atlantic Towers tenants.

On Tuesday, San Francisco became the first city in the U.S. to ban the use of facial recognition technology by local agencies. Other cities and states may soon follow San Francisco's lead. Oakland is considering a similar measure. Massachusetts is considering a state bill to place a moratorium on facial recognition and other remote biometric surveillance systems. And a federal bill introduced last month in Congress would prohibit commercial users of face recognition technology from collecting and sharing data for identifying or tracking consumers without their consent.

From a regulatory perspective, the issue has yet to gain traction in New York City, where the technology is used by the New York City Police Department, which has not said how the collected data is used. Last year, City Councilmember Ritchie Torres introduced a bill that would require businesses to disclose to the public whether they use facial recognition technology, as well as details on what information is collected and how it is shared and stored.

In March 2018, the New York Times reported that Madison Square Garden had quietly introduced facial recognition at its arena, but a spokesperson for the company refused to answer questions about its use.

Asked about whether New York should regulate the use of facial recognition technology by city and state police, Hoylman said that such a measure would require "a more nuanced examination," but that an assessment needed to occur at every level, including the private sector.

He added: "I think the idea is we need to look before we leap. But I think San Francisco's bold step is probably something New York City should be considering very soon."

In the residential sphere, New York City has already embraced facial recognition technology in affordable housing projects. As part of a city Department of Housing Preservation and Development partnership with a private developer, Omni New York, in February the city opened a 154-unit affordable housing complex in the Bronx touted as having a "state of the art facial recognition system." The apartments are designated for low-income tenants and homeless veterans.

The bill will first be discussed by the housing committees in the state Senate and Assembly. It is too soon to tell how much support the bill will garner among legislators and civil liberties and tenant advocates.

Reached for comment about the bill, Daniel Schwarz, privacy and technology strategist at the New York Civil Liberties Union, issued the following statement:

“While the NYCLU hasn’t taken a position on this new legislation, we share lawmakers’ concerns that the imposition of facial recognition software by landlords onto tenants infringes on both tenants’ privacy and their civil rights. This invasive technology creates an unprecedented level of surveillance with real risks for freedom of association and access to housing in addition to privacy rights, especially for women, children, and people of color who are more likely to be misidentified by biased facial recognition software.”