

Little known fact: the Model S (P) actually stands for Polyphemus Edition, not Plaid Edition.
The driver being drunk doesn’t mean the self-driving feature should not detect motorcycles. The human is a fallback to the tech. The tech had to fail for this fatal crash to occur.
If the system is advertised as overriding the human speed inputs (Traffic-Aware Cruise Control is supposed to brake when it detects traffic, regardless of pedal inputs), then it should function as advertised.
Incidentally, I agree, I broadly trust automated cars to act more predictably than human drivers. In the case of specifically Teslas and specifically motorcycles, it looks like something is going wrong. That’s what the data says, anyhow. If the government were functioning how it should, the tech would be disabled during the investigation, which is ongoing.
I know what’s in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic-Aware Cruise Control. If I have a self-driving technology like TACC active, and the car’s sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input: the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not run at that maximum when there’s traffic ahead.
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
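To make the expectation concrete, here’s a toy sketch of the precedence I’d expect (this is mine, not Tesla’s; every name and number in it is made up): detected traffic caps the commanded speed no matter what the pedal or the set speed is asking for.

```python
# Rough sketch of the precedence I'd expect from a traffic-aware cruise
# controller. Entirely hypothetical pseudologic -- not Tesla's actual code.

def follow_speed_mph(gap_m: float, safe_gap_m: float = 40.0,
                     set_speed_mph: float = 45.0) -> float:
    """Scale the commanded speed down as the gap to the lead vehicle shrinks."""
    return max(0.0, set_speed_mph * gap_m / safe_gap_m)

def commanded_speed_mph(set_speed_mph: float, pedal_speed_mph: float,
                        lead_gap_m: float | None,
                        safe_gap_m: float = 40.0) -> float:
    """Speed the controller should actually command."""
    # The accelerator pedal can push the commanded speed above the set speed...
    commanded = max(set_speed_mph, pedal_speed_mph)

    # ...but detected traffic should always win: if there's a vehicle inside
    # the safe gap, slow toward it regardless of what the pedal asks for.
    if lead_gap_m is not None and lead_gap_m < safe_gap_m:
        commanded = min(commanded,
                        follow_speed_mph(lead_gap_m, safe_gap_m, set_speed_mph))
    return commanded

# Example: set speed 45, driver flooring it to 100, motorcycle 15 m ahead.
print(commanded_speed_mph(45.0, 100.0, lead_gap_m=15.0))  # ~16.9, not 100
```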
Here’s the manual, if you’re curious. It doesn’t work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that “The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control,” so it’s all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
I am absolutely biased. It’s me, I’m the source :)
I’m a motorcyclist, and I don’t want to die. Also just generally, motorcyclists deserve to get where they are going safely.
I agree with you. Self-driving cars will overall greatly improve highway safety.
I disagree with you when you suggest that pointing out flaws in the technology is evidence of bias, or “cherry picking to make self driving look bad.” I think we can improve on the technology by pointing out its systemic defects. If it hits motorcyclists, take it off the road, fix it, and then save lives by putting it back on the road.
That’s the intention of the coverage, at least: I am hoping to apply pressure to improve rather than remove. Read my Waymo coverage, I’m actually a big automation enthusiast, because fewer crashes is a good thing.
Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.
Tesla alleges they’ll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We’ll see.
In Boca Raton, I’ve seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.
Insanely, you can slam on the gas in Tesla’s self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle’s “traffic aware” automation effectively applying a brake.
That’s not sensationalist. That really is just insanely designed.
Point taken: Feel free to amend my comment from “No lights at all” to “No lights visible at all.”
Surprisingly, there is a data bucket for accidents with bicyclists, but hardly any bicycle crashes are reported.
That either means they are not occurring (woohoo!), or they are being lumped into one of the multiple pedestrian buckets (not woohoo!), or they are sitting in the absolutely fucking vast collection of “severity: unknown” accidents where we have no details and Tesla requested redaction to make finding the details very difficult.
NHTSA collects data if self-driving tech was active within 30 seconds of the impact.
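Mechanically, that reporting rule is just a time-window check, something like the sketch below (field names are invented for illustration; this is not NHTSA’s actual schema).

```python
# Hypothetical sketch of the reporting criterion above: a crash is reportable
# if the automation was engaged within 30 seconds of impact. Field names are
# invented for illustration; this is not NHTSA's actual schema.

from dataclasses import dataclass

@dataclass
class CrashRecord:
    impact_time_s: float             # timestamp of the impact
    last_automation_active_s: float  # last time the automation was engaged

def is_reportable(record: CrashRecord, window_s: float = 30.0) -> bool:
    gap = record.impact_time_s - record.last_automation_active_s
    return 0.0 <= gap <= window_s

# A system that disengages two seconds before impact still gets reported.
print(is_reportable(CrashRecord(impact_time_s=100.0,
                                last_automation_active_s=98.0)))  # True
```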
The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than your average human driver, and that’s what they say on their stock earnings calls. Of course, that isn’t borne out by any data I’ve seen, and they haven’t published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).
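For what it’s worth, an “Nx safer” factor is just a ratio of crash rates, so the claim is only as strong as the two rates behind it. The numbers below are made up purely to show the arithmetic, not real data.

```python
# Entirely made-up rates, just to show what an "8x safer than human" claim
# means arithmetically. Neither number is real data.
human_crashes_per_million_miles = 1.50
fsd_crashes_per_million_miles = 0.1875

safety_factor = human_crashes_per_million_miles / fsd_crashes_per_million_miles
print(f"{safety_factor:.1f}x")  # 8.0x -- only as trustworthy as the two inputs
```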
… Also accurate.
God, it really is a nut punch. The system detects the crash is imminent.
Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.
If it were good, we’d be seeing regular updates on Twitter, I imagine.
I also saw that theory! That’s in the first link in the article.
The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.
I didn’t include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!
The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways, every type is represented (sadly) in the fatalities.
I think you’re onto something with the faulty depth sensors. Sensing distance is difficult with optical sensors. That’s why Tesla would be alone in the motorcycle fatality bracket, and that’s why it would always be rear-end crashes by the Tesla.
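To put rough numbers on “sensing distance is difficult with optical sensors”: with any disparity-based camera setup, range error grows with roughly the square of distance, so a one-pixel mistake on a narrow target like a motorcycle taillight turns into a big range error at highway distances. The camera numbers below are invented, and whatever Tesla actually runs is proprietary; this is just the generic geometry.

```python
# Back-of-envelope camera ranging: Z = f * B / d, so a one-pixel disparity
# error gives roughly dZ ~= Z**2 / (f * B). Focal length and baseline are
# invented numbers; this is generic geometry, not Tesla's pipeline.
f_px = 1000.0   # focal length in pixels (made up)
B_m = 0.2       # baseline between cameras in meters (made up)

for Z_m in (20.0, 50.0, 100.0):       # true distance to the motorcycle
    d_px = f_px * B_m / Z_m           # disparity in pixels
    err_m = Z_m ** 2 / (f_px * B_m)   # range error per pixel of disparity error
    print(f"{Z_m:5.0f} m -> {d_px:4.1f} px disparity, ~{err_m:4.1f} m error/px")
```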
Accurate.
Each fatality I found where a Tesla killed a motorcyclist is a cascade of three failures.
Taking out the driver will make this already-unacceptably-lethal system even more lethal.
I’m American too. I trust they’ll be fair (in a free and open society, it is dangerous for an arbiter not to be fair; it’s hard to get reelected if you seem like a fraud to your electorate). I do not trust they’ll be friendly. They’re going to audit those records as hard as they’ve audited anything in a generation.
I’m sure the transport minister will be a friendly and impartial arbiter the day after 25% tariffs were announced on Canada’s vital-and-thoroughly-intertwined-with-the-USA automotive industry.
Bahaha, who am I kidding. But then, if there’s nothing to find, the Canadians won’t fabricate anything, they have a functional 4th Estate and the journalists have a way of finding things like that. Stay tuned.
Merely announcing a follow-up self-driving system to their first, really pretty bad one has put their stock price on an escape trajectory.
I’m starting to think Tesla set all the wrong examples for the auto industry.
Someone else had an interesting take elsewhere on the thread, and that got me looking.
Here is that mural you’re looking for, it’s in South Carolina, took me like 60 seconds of searching to find one so I am sure there are others: https://www.atlasobscura.com/places/tunnelvision
Perfectly fair. Sorry, I jumped the gun! Good on you for being incredulous and inspecting the piece for manipulation, that’s smart.