TECHNOLOGY

The pandemic is testing the limits of facial recognition


It is being used more and more in what is presented as the public health interest. Australia recently expanded a program that uses facial recognition to enforce COVID-19 safety precautions. People in isolation are subject to random check-ins, where they are asked to submit a selfie to confirm they are following the rules; location data is also collected, according to Reuters.

When it comes to necessities like emergency benefits to pay for housing and food, the first priority should be making sure everyone has access to help, says Greer. She adds that fraud prevention is ostensibly a reasonable goal, but that the most immediate goal should be getting people the benefits they need.

“Systems must be built with human rights and the needs of vulnerable people in mind from the start. These cannot be afterthoughts,” says Greer. “They cannot be bug fixes after something has already gone wrong.”

ID.me’s Hall says his company’s services are better than existing methods of identity verification and have helped states reduce “massive” unemployment fraud since the face verifications were implemented. He says 91% of unemployment claimants verify their identity successfully, either on their own or through a video call with an ID.me representative.

“[That] was our goal going in,” he says. “If we can automate 91% of this, the states that have been overwhelmed in terms of resources can use those resources to provide white-glove concierge service to the other 9%.”

When users can’t pass the face recognition process, ID.me sends them a follow-up email, according to Hall.

“Everything about this company is about helping people get access to the things they are qualified for,” he says.

Technology in the real world

The months that JB lived without an income were tough. The financial insecurity was stressful enough on its own, and other problems, like a broken computer, made the anxiety worse. Even JB’s former employer couldn’t, or wouldn’t, help them get through the process.

“It’s very alienating to be like, ‘Nobody can help me in this situation,’” JB says.

On the government side, experts say it makes sense that the pandemic has brought new technology to the fore, but cases like JB’s show that technology in and of itself is not the complete answer. Anne L. Washington, assistant professor of data policy at New York University, says it’s tempting to call a new government technology a success when it works most of the time in testing, even if it fails 5% of the time in the real world. She compared the situation to a game of musical chairs: in a room of 100 people, five will always be left without a seat.

“The problem is that governments get some kind of technology and it works 95% of the time, so they think it’s solved,” she says. Instead, human intervention matters more than ever. “They need a system to deal regularly with the five people left standing,” Washington says.

There is an additional layer of risk when a private company is involved. Washington says the biggest question with introducing a new type of technology is where the data is stored. Without a trusted entity that has a legal duty to protect people’s information, sensitive data may end up in the wrong hands. How would we have felt, for example, if the federal government had entrusted our Social Security numbers to a private company when they were first created?

“The problem is that governments get some kind of technology and it works 95% of the time, so they think it’s solved”

Anne L. Washington, New York University

The widespread and unsupervised use of facial recognition tools can also affect already marginalized groups more than others. Transgender people, for example, have detailed frequent problems with tools like Google Photos, which can ask whether before-and-after photos show the same person. That means reckoning with the software over and over again.

“[There are] inaccuracies in technology’s ability to reflect the breadth of actual diversity and evolving states found in the real world,” says Daly Barnett, a technologist at the Electronic Frontier Foundation. “We can’t rely on it to accurately classify, account for, and reflect those beautiful edge cases.”

Worse than failure

Conversations about facial recognition usually focus on how the technology can fail or discriminate. But Barnett encourages people to think beyond whether biometric tools work or whether bias shows up in the technology; she pushes back on the idea that we need them at all. In fact, activists such as Greer warn that the tools can be far more dangerous when they work perfectly. Facial recognition has already been used to identify, punish, or silence protesters, although people are resisting. In Hong Kong, protesters wore masks and goggles to hide their faces from police surveillance. In the United States, federal prosecutors dropped charges against a protester who had been identified using facial recognition and accused of assaulting police officers.


