Audit dings states over faulty facial recognition used for approving unemployment benefits

After pushing unemployment agencies to crack down on fraud, the Labor Department’s inspector general now says the method states turned to – facial recognition – ends up discriminating against minorities.

In an urgent new alert, the inspector general said Tuesday that some states also don’t have sufficient controls on the facial-recognition data they are collecting and giving to outside contractors, leaving it open to misuse.

“These risks must be addressed and mitigated by appropriate oversight and guidance,” Carolyn R. Hantz, the assistant inspector general for audit, said in an urgent alert to officials in the Labor Department’s Employment and Training Administration.

The issue, Ms. Hantz said, isn’t whether facial recognition can sniff out fraud — state unemployment benefit agencies say it does a fine job of that.

It’s in how the technology is applied and who can actually use it.

Ms. Hantz said jobless people who lack technological skills or have “outdated” phones can’t submit the high-quality photos the facial recognition system needs.

Those who wash out of the electronic verification have to go through a long process that could include phone calls or an in-person visit to an office.

And even when someone does have the skills and a good enough phone, “certain races and genders” get flagged as non-matching more often than others.

That creates an “equity” problem, Ms. Hantz said.

The unemployment insurance program is run by states, with the federal government acting as a monitoring agency. During the pandemic, Uncle Sam also provided unprecedented funding — which fueled unprecedented fraud.

One estimate says perhaps $400 billion in unemployment payments went to fraudsters, while other estimates put the figure at roughly $200 billion. The inspector general does not have an exact estimate yet.

Facing those issues, states rushed to adopt new ways to weed out bogus applications.

Two dozen state workforce agencies — or SWAs, in government-speak — now use some form of facial recognition, the inspector general said.

It cited a 2019 report by the National Institute of Standards and Technology that found facial recognition rejection problems depend on the specific algorithms used, but that error rates are usually higher for women and younger people.

With high-quality photos, false rejection rates are higher for Asians and American Indians. With lower-quality images, false negatives are higher for people born in Africa and the Caribbean, the inspector general said.

The audit said one state, Oregon, halted its use of facial recognition in unemployment claims to study the issue. The state had found disparities, but couldn’t identify a specific cause.

The inspector general also flagged states’ handling of the data they are collecting.

Third-party contractors usually perform the facial recognition matching, and the audit said not all of the states have written their contracts to ensure applicants’ “biometric” data isn’t being repurposed.

Of the 24 states using facial recognition, 15 don’t have clear guidance in their contracts about how the biometric data is stored, and 13 don’t have rules on disposing of it.

“Therefore, it is up to ETA and SWAs that authorize the use of facial recognition technology to provide consistent guidance and adequate oversight to ensure data related to UI claims is adequately protected,” Ms. Hantz said.

In an official response to the alert, Brent Parton, acting assistant secretary for the Employment and Training Administration, said the department is already testing some pilot projects to explore solutions.

“As with any integrity and fraud prevention efforts, ETA must promote solutions that do not have unintended negative impacts on equitable access to UI benefits,” Mr. Parton said.

He agreed with all three recommendations for fixes, such as strengthening contracts and testing for bias.

Mr. Parton said his agency will issue guidance to states by the end of September.