Facebook must face $35B facial-recognition lawsuit following court ruling


Facebook’s most recent attempt to extricate itself from a potentially landmark lawsuit has come to a dead end, as a federal court declined to hear another appeal to stop the $35 billion class action.

In San Francisco last week, the US Court of Appeals for the 9th Circuit denied Facebook’s petition for an en banc hearing in the case. Usually, appeals cases are heard by a panel of three judges out of all the judges who work in a given circuit. An en banc hearing is a kind of appeal in which a much larger group of judges hears a case. In the 9th Circuit, 11 of the 29 judges sit on en banc cases.

Facebook had requested an en banc hearing to appeal the 9th Circuit's August ruling, in which the court determined that the plaintiffs had standing to sue even though Facebook's alleged actions did not cause them any quantifiable financial harm. The class-action suit can now move forward.

Three Illinois residents filed suit against Facebook back in 2015. The suits, which were eventually consolidated into a single class-action complaint, argue that Facebook's collection of users' faces for photo-tagging purposes violates the Illinois Biometric Information Privacy Act, a law that requires businesses to obtain consent from state residents before collecting or using their biometric data.

The penalty Facebook would face for violating the Illinois law is up to $5,000 for each knowing violation. There are about 7 million Facebook users in Illinois, meaning Facebook could face a maximum fine of around $35 billion if the case goes to trial and the company loses.
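The headline figure follows directly from those two numbers. A minimal back-of-the-envelope sketch, using only the figures cited above:

```python
# Rough upper bound on Facebook's exposure under the Illinois law,
# using the article's figures: up to $5,000 per knowing violation
# and roughly 7 million Facebook users in Illinois.
penalty_per_violation = 5_000   # dollars, statutory maximum per knowing violation
illinois_users = 7_000_000      # approximate class size

max_exposure = penalty_per_violation * illinois_users
print(f"${max_exposure:,}")  # → $35,000,000,000
```

This is a ceiling, not a prediction: it assumes one knowing violation per user and the maximum statutory penalty for each.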

Facial recognition: Far from flawless

While one tech giant faces a lawsuit over facial recognition that works too well, another is facing criticism over facial recognition that doesn’t work well enough.

The Massachusetts branch of the American Civil Liberties Union this week released the results of a test of Amazon's Rekognition software, in which the software mistakenly matched dozens of New England professional athletes to mugshots. The ACLU compared images of 188 athletes from the Boston Bruins, Boston Celtics, Boston Red Sox, and New England Patriots against a database of about 20,000 public arrest photos. Rekognition falsely matched 27 of the athletes, more than 14%, to mugshots in the database.
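The reported rate checks out against the raw counts. A quick recomputation from the article's figures:

```python
# False-positive rate from the ACLU's Rekognition test,
# as reported in the article: 27 of 188 athletes falsely matched.
athletes_tested = 188
false_matches = 27

rate = false_matches / athletes_tested
print(f"{rate:.1%}")  # → 14.4%
```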

Rekognition’s false-positive problem is neither new nor improving. The Northern California branch of the ACLU ran a similar test in 2018, in which Rekognition incorrectly matched 28 members of Congress to a database of mugshots.

The false positives continue to draw concern because Amazon has been actively courting partnerships with police departments to expand law enforcement use of Rekognition and has pitched the software to federal border agencies. Over the past two years, the company has also expanded its work with police departments to promote its Ring doorbell cameras and home surveillance products.

The ACLU in 2018 called attention to a patent Amazon filed that described adding Rekognition capabilities to Ring cameras; earlier this year, Sen. Ed Markey (D-Mass.) pressed the company to explain those plans in detail.

“The average sports fan would probably be more accurate at identifying these athletes than Amazon’s Rekognition technology,” Markey said in a tweet. “Law enforcement shouldn’t rely on this potentially discriminatory tech.”

Source: https://arstechnica.com/?p=1589519