Rite Aid facial recognition misidentified Black, Latino and Asian people as ‘likely’ shoplifters
Surveillance systems incorrectly and without customer consent marked shoppers as ‘persons of interest’, an FTC settlement says
Rite Aid used facial recognition systems, without customer consent, to identify shoppers it had previously deemed “likely to engage” in shoplifting, and misidentified people – particularly women and Black, Latino or Asian people – on “numerous” occasions, according to a new settlement with the Federal Trade Commission. As part of the settlement, Rite Aid has been forbidden from deploying facial recognition technology in its stores for five years.
The FTC said in a federal court complaint that Rite Aid used facial recognition technology in hundreds of stores from October 2012 to July 2020 to identify shoppers “it had previously deemed likely to engage in shoplifting or other criminal behavior”. The technology sent alerts to Rite Aid employees by email or phone when it identified people on its watchlist entering a store.
The FTC said in its complaint that store employees would then put those people under increased surveillance, ban them from making purchases or accuse them in front of friends, family and other customers of previously committing crimes. The facial recognition system was largely used in New York City; Los Angeles; San Francisco; Philadelphia; Baltimore; Detroit; Atlantic City; Seattle; Portland, Oregon; Wilmington, Delaware; and Sacramento, California, according to the settlement.
The settlement addresses charges that the struggling drugstore chain did not do enough to prevent harm to its customers and implement “reasonable procedures”, the government agency said. Rite Aid said late on Tuesday that it disagreed with the allegations, but that it was glad it had reached an agreement to resolve the issue.
As part of its contracts with two private, unnamed vendors, Rite Aid created, or directed the companies to create, a database of “persons of interest” that included images of the people and other personally identifying information. Those images were often low quality and were captured through Rite Aid’s CCTV cameras, the facial recognition cameras or on the mobile phones of employees, according to the settlement.
Security workers were trained to “push for as many enrollments as possible” and the company “enrolled at least tens of thousands of individuals in its database”, according to FTC documents.
The federal complaint also said there were “numerous instances” in which the technology incorrectly identified someone who entered the store, and that Rite Aid failed to test its accuracy before using it. For instance, Rite Aid did not ask one of the two private vendors it worked with whether its technology had been tested for accuracy, according to the settlement. In fact, the vendor explicitly stated in its contract that it “makes no representations or warranties as to the accuracy and reliability” of its facial recognition system.
The FTC also said the company “failed to take reasonable steps to train and oversee the employees charged with operating the technology in Rite Aid stores”.
The civil liberties and digital rights group the Electronic Privacy Information Center (Epic) said that facial recognition can be harmful in any context but that Rite Aid failed to take even the most basic precautions. “The result was sadly predictable: thousands of misidentifications that disproportionately affected Black, Asian, and Latino customers, some of which led to humiliating searches and store ejections,” said John Davisson, Epic’s director of litigation.
Rite Aid says the allegations center on a pilot program it used in a limited number of stores, and that it stopped using the technology more than three years ago.
“We respect the FTC’s inquiry and are aligned with the agency’s mission to protect consumer privacy,” the company said in a statement posted on its website. “However, we fundamentally disagree with the facial recognition allegations in the agency’s complaint.”
Studies have shown that facial recognition systems routinely misidentify Black and brown people. In the last few years in the US, there have been six known cases of Black people being falsely arrested due to facial recognition.
“This is a groundbreaking case, a major stride for privacy and civil rights, and hopefully just the beginning of a trend,” Davisson said. “But it’s important to note that Rite Aid isn’t alone. Businesses routinely use unproven algorithms and snake oil surveillance tools to screen consumers, often in secret. The FTC is right to crack down on these practices, and businesses would be wise to take note. Algorithmic lawlessness is not an option any more.”
Rite Aid also noted in a prepared statement that any agreement would have to be approved in US bankruptcy court. The company announced last fall that it was closing more than 150 stores as it makes its way through a voluntary chapter 11 bankruptcy process. The company has struggled financially for years and, like its bigger rivals CVS and Walgreens, faces financial risk from lawsuits over opioid prescriptions.
This article was amended on 21 December 2023 to correct a misspelling of John Davisson’s name.