Rite Aid, the pharmacy chain, used facial recognition technology to falsely and disproportionately identify people of color and women as likely shoplifters, the Federal Trade Commission said on Tuesday, describing a system that embarrassed customers and raised new concerns about the biases baked into such technologies.

Under the terms of a settlement, Rite Aid will be barred from using facial recognition technology in its stores for surveillance purposes for five years, the F.T.C. said. The agency, which enforces federal consumer protection laws, appeared to be signaling just how seriously it would respond to concerns about facial recognition technology.

The F.T.C.’s 54-page complaint also shed light on how a once-theoretical worry — that human bias would bleed into artificial intelligence algorithms and amplify discrimination — has become a cause for concern in the real world.

Samuel Levine, the director of the F.T.C.’s Bureau of Consumer Protection, said in a statement that “Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms.”

From October 2012 to July 2020, the complaint said, Rite Aid employees acting on false alerts from the systems followed customers around stores, searched them, ordered some to leave and, if they refused, called the police to confront or remove them, at times in front of friends and family.

Rite Aid’s actions disproportionately affected people of color, especially Black people, Asians and Latinos, all in the name of keeping “persons of interest” out of hundreds of Rite Aid stores in cities including New York, Philadelphia and Sacramento, the F.T.C. said.

Rite Aid said in a statement that while it disagreed with the F.T.C.’s accusations, it was “pleased to reach an agreement.”

“The allegations relate to a facial recognition technology pilot program the company deployed in a limited number of stores,” the company said. “Rite Aid stopped using the technology in this small group of stores more than three years ago, before the F.T.C.’s investigation regarding the company’s use of the technology began.”

The settlement with Rite Aid comes about two months after the company filed for bankruptcy protection and announced plans to close 154 stores in more than 10 states.

Rite Aid was using the facial recognition technology as retail chains were sounding alarms over shoplifting, especially “organized retail crime,” in which multiple people steal products from several stores to later sell on the black market.

Those concerns prompted several retailers, including Rite Aid, to safeguard merchandise by locking much of it in plastic cases.

But those worries appear to have been overblown. This month, the National Retail Federation retracted its incorrect estimate that organized retail crime was responsible for nearly half the $94.5 billion in store merchandise that disappeared in 2021. Experts say the number is probably closer to 5 percent.

Rite Aid did not tell customers that it was using the technology in its stores, and employees were “discouraged from revealing such information,” the F.T.C. said.

It is not clear how many other retailers are using facial recognition technology for surveillance. Macy’s told Business Insider that it uses the technology in some stores. Home Depot says on its website that it collects “biometric information, including facial recognition.”

Alvaro M. Bedoya, the commissioner of the F.T.C., said in a statement that “the blunt fact that surveillance can hurt people” should not get lost in conversations about how surveillance violates rights and invades privacy.

“It has been clear for years that facial recognition systems can perform less effectively for people with darker skin and women,” Mr. Bedoya said.

Woodrow Hartzog, a professor of law at Boston University who has researched facial recognition technologies and the F.T.C., said that the agency’s complaint against Rite Aid showed that it views A.I. surveillance technology as a serious threat.

The target of the agency’s complaint was significant, Professor Hartzog said. Even though Rite Aid hired two unnamed companies to help create a database of people it considered likely to shoplift, the F.T.C. only went after Rite Aid.

The F.T.C., he said, is basically saying that “the culpable behavior that we’re targeting is the failure to do due diligence when working with other vendors.”

The complaint notes that Rite Aid used the surveillance systems in urban areas and along public transportation routes, leading to a disproportionate effect on people of color, officials said.

About 80 percent of Rite Aid stores are in areas where white people are the largest racial or ethnic group. But 60 percent of Rite Aid stores that used the facial recognition technology were in areas where white people were not the largest racial or ethnic group, according to the F.T.C.

Rite Aid trained security workers in its stores to feed images into an “enrollment database” of people it considered “persons of interest,” and employees were told to “push for as many enrollments as possible.” The databases were filled with low-quality images, many of which were obtained from closed-circuit television, mobile-phone cameras and media reports, the F.T.C. said.

That faulty system, officials said, caused thousands of “false-positive matches,” or alerts that incorrectly indicated that a customer was a “match” for a person in Rite Aid’s database. Worse, Rite Aid had no way of tracking false positives, the complaint said.

“Rite Aid’s failure to appropriately train or oversee employees who operated facial recognition technology further increased the likelihood of harm to consumers,” the F.T.C. said.

In one instance, Rite Aid employees stopped and searched an 11-year-old girl whom the system had falsely flagged as a person likely to commit shoplifting.

In another example cited in the complaint, a Black man wrote to Rite Aid after he was the victim of a false-positive facial recognition match.

“When I walk into a store now it’s weird,” he said, adding: “Every Black man is not a thief nor should they be made to feel like one.”
