This new facial recognition case is a matter of civil rights

Williams’ wrongful arrest, first reported by the New York Times in August 2020, stemmed from a faulty match by the Detroit Police Department’s facial recognition system. Two more instances of false arrest based on the technology have since come to light. All three victims are Black men, and all were wrongly identified.

Now Williams is following in their footsteps and going further — not only by suing the department over his wrongful arrest, but by trying to get the technology banned.

On Tuesday, the ACLU and the University of Michigan Law School’s Civil Rights Litigation Initiative filed a lawsuit on behalf of Williams, alleging that the arrest violated his Fourth Amendment rights and ran afoul of Michigan’s civil rights law.

The suit requests compensation, greater transparency about the use of facial recognition, and an end to the Detroit Police Department’s use of facial recognition technology, whether direct or indirect.

What the lawsuit says

Court documents filed on Tuesday lay out the case. In March 2019, the DPD ran a grainy photo of a Black man in a red cap, taken from a Shinola store’s surveillance video, through its facial recognition system, which is built by a company called DataWorks Plus. The system returned a match with an old driver’s license photo of Williams. Investigators then included Williams’ license photo in a photo lineup, and a Shinola security guard (who had not actually been present during the robbery) identified Williams as the thief. The officers obtained an arrest warrant, which required sign-off from multiple people in the department’s chain of command, and Williams was arrested.

The complaint alleges that Williams’ wrongful arrest was a direct result of the facial recognition match, and that the faulty search and flawed identification procedures demonstrate the grave harm caused by the misuse of, and reliance on, facial recognition technology.

The suit consists of four counts, three of which focus on the lack of probable cause for the arrest, while one focuses on the racial disparities of facial recognition. “By employing technology that is empirically proven to misidentify Black people at rates far higher than other groups of people,” it states, “the DPD denied Mr. Williams the full and equal enjoyment of the Detroit Police Department’s services, privileges, and advantages because of his race or color.”

Facial recognition’s failures to accurately identify Black people are well documented. After George Floyd was murdered in Minneapolis in 2020, a number of cities and states announced bans or restrictions on police use of facial recognition. But many others, including Detroit, continued to use it despite growing concerns.

“Relying on grainy images”

In an interview with MIT Technology Review last year, Williams’ ACLU lawyer, Phil Mayor, emphasized that the problems of racist policing in the United States long predate facial recognition — the technology has simply made them more visible.

“This isn’t one bad actor,” Mayor said. “This is a situation in which we have a criminal legal system that is extremely quick to charge and extremely slow to protect people’s rights, especially when Black people are involved.”

Eric Williams, an attorney with the Economic Equity Practice in Detroit, says cameras have many technological limitations, not least that they literally encode color ranges for skin tones and often fail to properly expose darker skin.

“I think every Black person in this country has had the experience of being in a photo and the picture comes out either way too bright or way too dark,” says Williams, who sits on an ACLU of Michigan committee but is not working on the Robert Williams case. “Lighting is one of the primary factors in image quality. So the fact that law enforcement is relying, to some extent,…

There have already been cases challenging biased algorithms and technology in other arenas. Facebook, for example, underwent a sweeping civil rights audit after its advertising platform was found to allow ads to be targeted on the basis of race, gender, and religion. YouTube was sued in a class action by Black creators who alleged that its AI systems profile and restrict content on the basis of race. YouTube was also sued by LGBTQ+ creators who said its systems flagged videos containing the words “gay” and “lesbian.”

Some experts say it was only a matter of time before the use of facial recognition by a large police force faced a legal challenge like this one.
