Judge slams Tesla for claiming Musk quotes captured on video may be deepfakes

Tesla CEO Elon Musk at the unveiling of the new Tesla Model Y in Hawthorne, California on March 14, 2019.
Frederic J. Brown/AFP via Getty Images

The judge overseeing a wrongful death lawsuit involving Tesla’s Autopilot system rejected Tesla’s claim that videos of CEO Elon Musk’s public statements might be deepfakes.

Tesla’s deepfake claim “is deeply troubling to the Court,” Santa Clara County Superior Court Judge Evette Pennypacker wrote in a tentative ruling this week. “Their position is that because Mr. Musk is famous and might be more of a target for deep fakes, his public statements are immune. In other words, Mr. Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do. The Court is unwilling to set such a precedent by condoning Tesla’s approach here.”

Plaintiffs want Tesla to admit the authenticity of various statements Musk made about the self-driving capabilities of Tesla cars. Pennypacker's tentative ruling ordered a limited deposition of Musk at which plaintiffs can ask whether he made the statements.

“Mr. Musk was either at these places or he was not; he either said these things or he did not. Ironically, Tesla’s refusal to answer these questions only makes a clearer record that Mr. Musk is the only person that has this information to respond to this discovery, one of the pre-requisites to permitting an Apex deposition,” Pennypacker wrote.

Tesla admits deepfake argument is “unusual”

Tesla previously told the court it could not admit or deny “the authenticity of a number of statements allegedly made by Elon Musk in various speeches and interviews over a period of nearly ten years.”

“While at first glance it might seem unusual that Tesla could not admit or deny the authenticity of video and audio recordings purportedly containing statements by Mr. Musk, the reality is he, like many public figures, is the subject of many ‘deepfake’ videos and audio recordings that purport to show him saying and doing things he never actually said or did,” Tesla wrote last week.

Pennypacker wasn’t swayed by Tesla’s objections, writing that “Tesla’s argument that it cannot commit one way or another to the statements, or in some cases even admit that it is Mr. Musk in the videos, because of the ease with which deep fakes can be made is unconvincing.”

Among other challenged statements, Tesla refused to admit that in June 2014, Musk said, “I’m confident that—in less than a year—you’ll be able to go from onramp to highway exit without touching any controls.” This Musk statement, which has been quoted in many news articles, came during a Q&A at Tesla’s 2014 shareholder meeting that can be viewed on YouTube.

A hearing on the tentative ruling is scheduled for today. But as Reuters notes, tentative rulings “are almost always finalized with few major changes after such a hearing.”

The wrongful death and negligence lawsuit was filed in 2019 by the wife and children of Walter Huang, a 38-year-old Apple engineer who was killed in March 2018 while his Tesla Model X was in Autopilot mode. “As Walter Huang approached the paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, the autopilot feature of the Tesla turned the vehicle left, out of the designated travel lane, and drove it straight into a concrete highway median,” the lawsuit said.

Huang was playing a Three Kingdoms video game on his phone when his car crashed, a fact that Tesla cites in its defense. The National Transportation Safety Board found that the crash’s probable causes were “the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver’s lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system. Contributing to the crash was the Tesla vehicle’s ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness.”

https://arstechnica.com/?p=1935023