‘Boycott Tesla’ ads to air during Super Bowl — “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual, that says Autopilot is only safe on fr…
I have a Tesla, they make it extremely clear that you are never supposed to touch autopilot under any circumstances, and you have to go way out of your way to prevent it from disengaging while you’re not paying attention.
When I was looking to buy a new car back in early 2019, I walked into a showroom for a final test drive before I threw some money down for a Model 3.
It started to rain pretty hard on the return drive back. When executing an auto lane change, the sensors freaked out because of the water interference and violently yanked the car back into the origin lane halfway through the lane change. It hydroplaned a hair and scared the shit out of my wife and me. The Tesla employee assured us, “It’s ok, this is normal.” Hearing that this was normal was not comforting.
Upon returning to the showroom, a different Model 3 in the parking lot started backing toward a small child. My wife saw what was happening, threw herself in front of the car, and that caused it to halt.
I’m sure the software has progressed in the past 5 years, but suffice it to say, we changed our minds on the car at that time. Those two incidents within 15 minutes really made us question how that shit was legal.
If the car was backing out, that was a human driver in control, not Autopilot. Autopilot can only be enabled while driving on a well-marked roadway. The first part is plausible, however. Likely the software at the time could not handle rain appropriately, and you are absolutely right to question this if they told you it was normal.
The car was being summoned from a parking space. Summon / Smart Summon will absolutely back out of a space fully autonomously.
These instances of errors are obviously alarming, but all the evidence we have is that they’re still safer than human drivers. They will make mistakes - and sometimes those mistakes will cost lives - but they will make fewer mistakes than humans. Given this, as visceral as it feels when we hear these stories, I think our ire is misplaced. Automated driving will never be perfect. If that’s the bar we’re aiming for, we should just give up and go home. The goal is better than humans, and in many conditions, it’s already there.
I’m sorry, never supposed to touch autopilot? Under any circumstances? Other than that, yeah, 100%: if it detects via the wheel nag or the eye-tracking camera that you are not paying attention, it will alarm and disengage, potentially banning the user from using it for a short time. They make that consequence very clear.
This is the opposite of reality. They make it extremely clear that it’s beta and you need to always keep your eyes on the road in case you need to take over, and that you should take over if you feel you need to.
Of course they do, but everyone here is so filled with hate for Elon that they cannot consider that the car might be reasonable.
Their loss, heh.
Barely having to put in work to drive is incredible. More time to keep eyes on everything around me in all directions. Road-trips aren’t exhausting, and bumper-to-bumper traffic is a breeze.
For a technology forum, it is incredibly disappointing to see how closed-minded people are to the tech.