U.S. Sen. Sherrod Brown (D-OH) – ranking member of the U.S. Senate Committee on Banking, Housing, and Urban Affairs – today sent a letter to DataWorks Plus asking the company to provide information to Congress regarding the use of the company’s FACE Plus and FACE Watch Plus facial recognition technology in law enforcement across the nation. Brown cited a New York Times report that revealed that the technology led directly to the wrongful arrest of Robert Julian-Borchak Williams of Michigan, causing unnecessary trauma after he was falsely accused of shoplifting and imprisoned. Brown made clear in his letter that DataWorks Plus’s technology poses a serious threat, is unreliable, and is currently used to exacerbate systemic racism in technology and law enforcement. He urged the company to address these concerns and change course immediately.
“For years, scientific studies have repeatedly shown that facial recognition algorithms are significantly less accurate for people with non-white skin tones. In addition, despite touting that your FACE Plus and FACE Watch Plus technologies are ‘accurate and reliable’ in marketing materials, your company reportedly does no scientific or formal testing for accuracy or bias,” said Brown. “Inaccuracies aside, today, facial recognition technology is too often used to sustain systemic racism.”
Brown recently introduced a new privacy proposal that would outlaw the use of facial recognition technology.
A copy of the letter appears here and below:
Mr. Brad Bylenga
DataWorks Plus, LLC
Dear Mr. Bylenga:
I write to express concern that DataWorks Plus is assisting in violating the civil liberties of citizens across the nation where your FACE Plus or FACE Watch Plus facial recognition technology has been deployed—including in Michigan, Pennsylvania, California, South Carolina, and Illinois.
Your facial recognition technology played a critical role in forever altering a Michigan man’s life this past January. Robert Julian-Borchak Williams was wrongfully accused of shoplifting due to a lead your software created. Mr. Williams was humiliated when he was arrested in front of his family, “patted down probably seven times,” fingerprinted, and “spent the night on the floor of a filthy, overcrowded cell next to an overflowing trash can” during his unlawful 30-hour detention. This trauma is a wholly unacceptable byproduct of your facial recognition technology.
Mr. Williams’s case illustrates the harm and abuse resulting from the use of your company’s facial recognition technology. The result is also entirely predictable. For years, scientific studies have repeatedly shown that facial recognition algorithms are significantly less accurate for people with non-white skin tones. In addition, despite touting that your FACE Plus and FACE Watch Plus technologies are “accurate and reliable” in marketing materials, your company reportedly does no scientific or formal testing for accuracy or bias.
Inaccuracies aside, today, facial recognition technology is too often used to sustain systemic racism. In Detroit, your technology has been used “almost exclusively against Black people so far in 2020, according to the Detroit Police Department’s own statistics.” Racially targeted surveillance is just one aspect of systemic racism that America as a society is finally confronting following George Floyd’s killing on May 25. Furthermore, providing real-time identification has a chilling effect on First Amendment rights to assembly and speech.
To address these concerns, please respond to the following questions no later than July 10, 2020.
Thank you for your prompt attention to this important issue.
Sources:
https://www.npr.org/2020/06/24/882683463/the-computer-got-it-wrong-how-facial-recognition-led-to-a-false-arrest-in-michig
http://www.dataworksplus.com/newsarchives.html
http://www.dataworksplus.com/news.html
https://www.wired.com/story/amazon-facial-recognition-congress-bias-law-enforcement/
https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software