
Cybertruck Owner Crashes on FSD, Still Praises Tesla—Here’s Why

Photo Credit: Courtesy of Tesla, Inc.

When people think about self-driving cars, they imagine a future where technology handles all the driving while they sit back and relax. But what happens when that technology fails?

A Cybertruck owner from Florida, Jonathan Challinger, recently experienced a major crash while using Tesla’s latest Full Self-Driving (FSD) software.

The accident was severe: the truck hit a curb, then slammed straight into a thick concrete light pole. The front end of the truck was completely destroyed, the airbags deployed, and the front wheel even came off.

Despite all this, Jonathan did something unexpected—he praised Tesla.

The Crash: What Happened?

Jonathan was using FSD v13.2.4 when his Cybertruck attempted to merge out of a lane that was ending.

However, instead of smoothly merging, the vehicle failed to recognize the situation and made no attempt to slow down or turn. It drove up the curb and crashed directly into a light pole.

And here’s the shocking part—the vehicle never even attempted to stop.

Most people would be furious if their “self-driving” car drove them straight into a pole. But not Jonathan. Instead, he took to X (formerly Twitter) and shared the accident, not to criticize Tesla, but to warn other drivers to stay alert.

“It was a big fail on my part, obviously. Don’t make the same mistake I did. Pay attention. It can happen.”

Jonathan admits that FSD failed in this situation but still insists that the real mistake was his own.

Why Is Jonathan Thanking Tesla?

Despite the crash, Jonathan expressed gratitude to Tesla, specifically for the passive safety features that protected him.

“Thank you, Tesla, for engineering the best passive safety in the world. I walked away without a scratch.”

Essentially, while the software failed, Jonathan believes Tesla’s strong vehicle structure and safety features saved his life.

But his response has left many divided.

Backlash and Reactions from Tesla Fans and Critics

Understandably, not everyone agreed with Jonathan’s praise. Many people questioned why he would thank Tesla after FSD literally drove his truck into a pole.

One X user, Snow Ball, asked:

“So FSD failed, but you still managed to find a way to praise Tesla? You failed too for not taking over in time. But your concern isn’t for the lives of third parties that you and FSD endangered. No, you are worried about Tesla getting bad publicity.”

Jonathan, however, firmly defended Tesla, responding:

“I am rightly praising Tesla’s automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. Get lost, sir.”

While some Tesla owners respected his position, others thought his praise went too far.

A Tesla fan, notyourbuddy, wrote:

“Bro, I use FSD exclusively and love Tesla, but this is some out-of-control simping.”

Another user, Jones King, shared a Google Street View image of the crash location and pointed out that the Cybertruck wasn’t even in a proper lane before the accident:

“The lane did not end abruptly. Shame FSD couldn’t avoid the fixed object being the pole.”

This suggests that the software misread the road completely, which raises serious concerns about how well Tesla’s FSD can handle complex driving scenarios.

Tesla Fans Rally Behind Jonathan

While some criticized Jonathan for being too forgiving, other Tesla fans saw his post as a selfless act.

A Tesla supporter, David Cohen, responded with encouragement:

“Your willingness to share this lesson is invaluable for the safety and betterment of all Tesla users. Thank you for putting the long-term good of the community ahead of personal embarrassment.”

Jonathan agreed, saying:

“Yes, it is against every instinct that I post this.”

Will Tesla Learn from This Incident?

Jonathan also reached out to Tesla, asking how he could ensure the company received data from the crash. But according to him, Tesla’s service center has been unresponsive.

This raises an important question: How does Tesla handle FSD failures? If an incident like this happens, should Tesla proactively investigate and improve the software?

Jonathan has dashcam footage of the accident and wants to share it for educational purposes, but he fears it will be used to attack Tesla unfairly.

“I want to get it out there as a PSA that it can happen, even on v13, but I don’t want to give the bears/haters any material.”

This hesitation highlights the divisive nature of Tesla’s self-driving technology. Some users are so loyal to the brand that they’re reluctant to share any negative experiences—even when it could improve safety for everyone.

The Bigger Picture: Is FSD Really Safe?

This crash highlights a fundamental truth about Tesla’s Full Self-Driving:

  • It’s not perfect.
  • It still makes critical mistakes.
  • Drivers must remain fully alert.

Jonathan’s accident is a wake-up call for those who have become too comfortable relying on FSD. He himself admitted:

“It is easy to get complacent now—don’t.”

If Tesla truly aims for full autonomy, it must acknowledge and address these failures. Otherwise, incidents like this could become more common.

What Do You Think?

Do you believe Jonathan is right to praise Tesla despite the crash? Should Tesla be held more accountable for FSD failures?

Share your thoughts in the comments below!
