Tesla fires employee who posted YouTube videos of Full Self-Driving accident

Image: A Tesla with Full Self-Driving enabled, a moment before hitting a bollard that appears to separate a car lane from a bike lane in San Jose.

Ex-Tesla employee John Bernal says he was fired for posting YouTube videos about Tesla’s Full Self-Driving (FSD) beta. He had been creating the videos for about a year. Bernal says that Tesla also cut off his access to the FSD beta in the 2021 Tesla Model 3 that he owns.

The firing and beta cutoff occurred shortly after Bernal posted a video on February 4 of a minor accident in which his Tesla car hit a bollard that appears to separate a car lane from a bike lane in San Jose. In a subsequent video on February 7 providing frame-by-frame analysis of the collision, Bernal said that “no matter how minor this accident was, it’s the first FSD beta collision caught on camera that is irrefutable.”

“I was fired from Tesla in February with my YouTube being cited as the reason why—even though my uploads are for my personal vehicle, off company time or property, with software I paid for,” Bernal said in the latest video, which was posted yesterday on his AI Addict channel. Bernal showed a notice he received that said his Full Self-Driving beta access was disabled “based on your recent driving data.” But that explanation didn’t seem to make sense because “the morning of being fired, I had zero improper use strikes on my vehicle,” he said.

Bernal said his job at Tesla involved helping to develop FSD and test-operating the software. His new video asked Tesla to re-enable the beta on his personal car but explained that he is continuing to test the Full Self-Driving beta in a replacement vehicle. “This channel is meant to educate the public… I care about finding important safety bugs, and I still want to help. Luckily, this is Silicon Valley, where there is plenty of beta to go around, so today I’m in a new Tesla,” he said.

[Video: Self Driving Collision (Analysis)]

Firing notice “did not include the reason”

We contacted Tesla about Bernal’s firing and will update this article if we get a response. As reported by CNBC, Bernal started working for Tesla “as a data annotation specialist in August 2020” and “was dismissed in the second week of February this year, after having moved into the role of advanced driver assistance systems test operator, according to records he shared with CNBC.”

Bernal’s “written separation notice did not include the reason for his firing,” but he said that “before he was dismissed, managers verbally told him he ‘broke Tesla policy’ and that his YouTube channel was a ‘conflict of interest,'” CNBC wrote. “Bernal said he was always transparent about his YouTube channel, both with his managers at Tesla and with the public… Bernal said he had never seen a policy barring him from creating car tech reviews on his own time using his own property.”

CNBC said it obtained a copy of Tesla’s internal social media policy and that it “makes no direct reference to criticizing the company’s products in public. The policy states, ‘Tesla relies on the common sense and good judgment of its employees to engage in responsible social media activity.'”

Bernal maintained that he “never disclosed anything in his videos that Tesla had not released to the public,” saying that “the FSD beta releases I was demonstrating were end-user consumer products,” according to CNBC.

First accident in a year of testing

In the video analyzing the accident, Bernal said he had tested the software for over a year and that this was his “first incident.” He also noted that he spent a year “human-labeling this software for Tesla” as an employee.

When the accident happened, Bernal was letting the Full Self-Driving features control the car but said he “activated the brakes as hard as I could” once it became clear it was going off course. The Tesla bent the bollard to the ground. “Luckily, it was only plastic,” he said.

“Some may say I should have reacted sooner, which I should have. However, in my year of testing, FSD is usually really good at detecting objects last-minute and slowing to avoid,” he said. While it wasn’t a major accident, Bernal said there are “multiple other instances in the same video of the FSD system attempting to go straight for other bollards after already hitting one. It was not a one-off occurrence.”

“Close Calls, Pedestrians, Bicycles!”

Bernal’s AI Addict channel has nearly 8,800 subscribers, and the two videos of the accident racked up about 230,000 views combined. Most of his videos over the past year focus on Tesla Full Self-Driving. In addition to last month’s accident videos, Bernal posted one in March 2021 titled “Close Calls, Pedestrians, Bicycles!”

Bernal told CNBC that after the March 2021 video first ran, “a manager from my Autopilot team tried to dissuade me from posting any negative or critical content in the future that involved FSD Beta. They held a video conference with me but never put anything in writing.”

https://arstechnica.com/?p=1841482