Why Microsoft and Amazon are calling on Congress to regulate facial recognition tech

Some of the biggest companies in the world are pulling their facial recognition technologies from law enforcement agencies across the country. Amazon (AMZN), IBM (IBM), and Microsoft (MSFT) have said they will either place a moratorium on police use of their technology or exit the field entirely, citing human rights concerns.

The technology, which can be used to identify suspects in things like surveillance footage, has faced widespread criticism after studies found it can be biased against women and people of color. And according to at least one expert, there needs to be some form of regulation put in place if these technologies are going to be used by law enforcement agencies.

“If these technologies were to be deployed, I think you cannot do it in the absence of legislation,” Siddharth Garg, assistant professor of computer science and engineering at NYU Tandon School of Engineering, told Yahoo Finance.

“I think some of this will be challenged in court, and we’ll finally find a solution potentially that balances the risk and the benefits, to whatever extent there are benefits.”

Washington County Sheriff's Office Deputy Jeff Talbot demonstrates how his agency used facial recognition software to help solve a crime, at their headquarters in Hillsboro, Ore. (AP Photo/Gillian Flaccus, File)

Amazon, IBM, and Microsoft out

The first company to make a move away from offering the technology to law enforcement was IBM. On June 8, CEO Arvind Krishna sent a letter to several lawmakers, including Senators Cory Booker (D-NJ) and Kamala Harris (D-CA), indicating that the company would no longer be developing facial recognition software. The move came in the wake of the death of George Floyd and subsequent protests against police brutality and over-policing.


“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” Krishna wrote.

“Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”

Amazon followed suit with a June 10 announcement that it was placing a one-year moratorium on the use of its Rekognition technology by law enforcement agencies.

Demonstrators hold images of Amazon CEO Jeff Bezos near their faces during a Halloween-themed protest at Amazon headquarters over the company's facial recognition system, "Rekognition," in Seattle. Protesters said they were there in support of hundreds of Amazon employees who had signed a letter asking the company to stop marketing its facial recognition software to ICE and law enforcement agencies, and to drop its contract with software company Palantir. (AP Photo/Elaine Thompson)

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the company said in a statement.

“We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

The following day at a Washington Post Live event, Microsoft President Brad Smith said the tech giant would also refuse to offer its facial recognition technology to police departments until comprehensive legislation is put in place that regulates the tech.

Amazon, IBM, and Microsoft aren’t the only vendors of facial recognition technology. For example, Clearview AI, which came under heavy scrutiny for the way it collected images for its algorithm via Facebook, told Yahoo Finance it will continue to offer its tech to police departments.

Facial recognition bias

One of the main criticisms of facial recognition technology’s use by law enforcement is that it can show bias against women and people of color. In a study, MIT’s Joy Buolamwini and the University of Toronto’s Inioluwa Deborah Raji found that facial analysis software from Amazon, Microsoft, IBM, and Megvii was significantly less accurate when identifying women and people of color, with women of color particularly affected. Amazon, however, pushed back, saying its own studies and customer reactions ran counter to the study’s results.

Another study of 189 algorithms from some 99 companies by the National Institute of Standards and Technology found that the majority of the software tested had some form of bias.

Massachusetts Institute of Technology facial recognition researcher Joy Buolamwini takes questions from reporters at the school, in Cambridge, Mass. (AP Photo/Steven Senne)

According to Garg, bias in facial recognition algorithms crops up because of the data fed to them. Algorithms “learn” by being provided with massive amounts of information; in the case of a facial recognition algorithm, that may include large batches of photos.

If the algorithms are fed more photos of one group over another, it could result in discrepancies in performance.

“The algorithms aren’t biased, but there is bias in the algorithms to be precise,” Garg explained. “There is well demonstrated evidence of bias in face recognition technology.”
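To make the mechanism concrete, the sketch below (in Python with NumPy and scikit-learn) trains a toy classifier on data in which one group is heavily over-represented, then scores each group separately, the way the audits described above are run. The features, groups, and model are hypothetical stand-ins, not any vendor's facial recognition system; the point is only that a representation skew in the training data can show up as a measurable accuracy gap between groups.

```python
# A toy sketch (hypothetical data and model, not any vendor's system) of how
# training-data imbalance can translate into a per-group accuracy gap.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Fake "face embedding" features for one demographic group; the group's
    # feature distribution is centred at `shift`.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 8))
    # Binary "match / no match" label driven by the first feature plus noise.
    y = (X[:, 0] - shift + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

# Imbalanced training set: group A is heavily over-represented.
Xa, ya = make_group(5000, shift=0.0)   # group A: well represented
Xb, yb = make_group(200, shift=1.5)    # group B: under-represented
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Audit the model the way the studies above do: score each group separately.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    accuracy = (model.predict(X_test) == y_test).mean()
    print(f"{name}: accuracy = {accuracy:.3f}")
```

Run on these toy inputs, the under-represented group scores noticeably lower, which is the kind of disparity the MIT and NIST audits report at far larger scale.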

A call for legislation

The tech industry is now calling on the federal government to craft legislation regulating facial recognition technologies. But this isn’t the first time the issue has been raised. Senator Booker previously proposed legislation alongside Senator Jeff Merkley (D-OR) calling for a moratorium on the use of facial recognition technology by the federal government until Congress established a legal framework for its use.

A recent proposal by Representative Karen Bass (D-CA) that focused on larger police reforms touches on the use of facial recognition technology, but isn’t specifically about the use of such software.

Facial recognition technology has already worked its way into different parts of our lives, whether it’s the use of the tech at airports, at customs checkpoints, or even in our own smartphones. Whether a full legislative push comes to fruition, however, is very much up in the air.

Got a tip? Email Daniel Howley at dhowley@yahoofinance.com or via encrypted mail at danielphowley@protonmail.com, and follow him on Twitter at @DanielHowley.

Follow Yahoo Finance on Twitter, Facebook, Instagram, Flipboard, SmartNews, LinkedIn, YouTube, and reddit