A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

  • atrielienz@lemmy.world
    1 day ago

    There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

    I can’t say for sure whether they are responsible in this case, because I don’t know what the person driving assumed at the time. But if they assumed that the “safety features” (in particular Autopilot) would mitigate their recklessness, and Tesla can’t prove they knew about the override of such features, then I’m not sure the court is wrong in this case. The fact that they haven’t changed their wording or branding of Autopilot (particularly calling it that) is kind of damning here.

    Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn’t know or understand that. Tesla has been trading on the pop culture understanding of what autopilot is, and that’s a lot of the problem. Other cars have warnings about what their “assisted driving” systems do, and those warnings pop up every time you engage them, before you can set any settings. But those other car manufacturers also don’t claim the car can drive itself.

    • Modern_medicine_isnt@lemmy.world
      17 hours ago

      You mention other cars overriding your input. The most common is auto braking, which kicks in when the car sees you are going to hit something. But my understanding is that it engages when it is already too late to avoid the crash. So it isn’t involved in decision-making about driving; it is just a safety feature relevant only in the case of a crash. Just as you don’t ram another car because you have a seatbelt, your driving choices aren’t affected by this feature’s presence. The other common one will try to remind you to stay in your lane. But it isn’t trying to override you. It rumbles the wheel and turns it a bit in the direction you should go. If you resist at all, it stops. It is only meant for when you have let go of the wheel or are asleep. So I don’t know of anything that overrides driver input completely outside of being too late to avoid a crash.

      • atrielienz@lemmy.world
        16 hours ago

        Some cars brake for you as soon as they think you’re going to crash (if you have your foot on the accelerator, or even on the brake if the car doesn’t believe you’ll be able to stop in time). Fords especially will do this, usually in relation to adaptive cruise control and reverse brake assist. You can turn that setting off, I believe, but it is meant to prevent a crash or collision. In fact, Ford’s BlueCruise assisted driving feature was phantom braking to the point that there was a recall about it, because it was braking with nothing obstructing the road. I believe they also just updated it so that an accelerator press will override BlueCruise without disengaging it, in something like the 1.5 update that happened this year.

        But I was thinking you were correcting me about autopilot for planes and I was confused.

        https://www.youtube.com/watch?v=IQJL3htsDyQ

    • Pyr@lemmy.ca
      23 hours ago

      To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

      I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

      What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

      • atrielienz@lemmy.world
        21 hours ago

        I agree. I hate auto braking features. I’m not a fan of cruise control. I very much dislike adaptive cruise control, lane keeping assist, reverse braking, driving assist, and one-pedal mode. I drive a stick shift car from the early 2000s for this reason. Just enough tech to be useful. Not enough tech to get in the way of me being in control of the car.

        But there are definitely some cruise control systems out there, even from before all the stuff with sensors and such hit the market, that don’t work the way lots of people in this thread seem to think. Braking absolutely will cancel the set cruise control, but it doesn’t turn the system off. Accelerating in some cars also doesn’t cancel the cruise control; it lets you override it to accelerate, but the car will go back to the set cruise speed when you take your foot off the accelerator.
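        The cancel-versus-override behavior described above can be sketched as simple state logic. This is a hypothetical simplification for illustration only, not any manufacturer’s actual implementation; the class and method names are invented:

```python
# Hypothetical sketch of the conventional cruise control behavior described
# above: braking cancels speed-holding but leaves the system on (the set
# speed is remembered for "resume"), while pressing the accelerator
# temporarily overrides, after which the car falls back to the set speed.

class CruiseControl:
    def __init__(self):
        self.enabled = False    # system powered on
        self.set_speed = None   # remembered target speed
        self.engaged = False    # actively holding speed

    def set(self, speed):
        self.enabled = True
        self.set_speed = speed
        self.engaged = True

    def brake(self):
        # Braking cancels speed-holding but does NOT turn the system off.
        self.engaged = False

    def resume(self):
        # Resume returns to the previously remembered set speed.
        if self.enabled and self.set_speed is not None:
            self.engaged = True

    def target_speed(self, accelerator_speed=None):
        # Accelerator input overrides without disengaging: the car goes
        # faster while the pedal is down, then returns to the set speed.
        if not self.engaged:
            return None
        if accelerator_speed is not None and accelerator_speed > self.set_speed:
            return accelerator_speed
        return self.set_speed
```

        For example, set cruise at 65, press the pedal to pass at 70, lift off, and the target drops back to 65; brake and the target clears until you hit resume.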

        I absolutely recognize that not being able to override the controls has a significant potential to be deadly. All I’m saying is there’s lots of drivers who probably shouldn’t be on the road who these tools are designed for and they don’t understand even the basics of how they work. They think the stuff is a cool gimmick. It makes them overconfident. And when you couple that with the outright lies that Musk has spewed continuously about these products and features, you should be able to see just why Tesla should be held accountable when the public trusts the company’s claims and people die or get seriously injured as a result.

        I’ve driven a lot of vehicles with features I absolutely hated. Ones that took agency away from the driver that I felt was extremely dangerous. On the other hand, I have had people just merge into me like I wasn’t there. On several occasions. Happens to me at least every month or so. I’ve had people almost hit me from behind because they were driving distracted. I’ve literally watched people back into their own fences. Watched people wreck because they lost control of their vehicle or weren’t paying attention. Supposedly these “features” are meant to prevent or mitigate the risks of that. And people believe they are more capable of mitigating that risk than they are, due to marketing and outright ridiculous claims from tech enthusiasts who promote these brands.

        If I know anything I know that you can’t necessarily make people read the warning label. And it becomes harder to override what they believe if you lied to them first and then try to tell them the truth later.

    • MysteriousSophon21@lemmy.world
      21 hours ago

      Just a small correction: traditional cruise control in cars only maintains speed, whereas autopilot in planes does maintain speed, altitude, and heading, which is exactly why Tesla calling their system “Autopilot” is such dangerous marketing that creates unrealistic expectations for drivers.

      • atrielienz@lemmy.world
        17 hours ago

        I’m not sure what you’re correcting. The Autopilot feature combines adaptive cruise control, lane keeping assist, and auto steering.

        Adaptive cruise control will brake to maintain a distance from the vehicle in front of it but otherwise maintain the set speed; lane keeping assist will keep the vehicle in its lane and prevent it from drifting out of it; and combined with auto steering it will keep the car centered in the lane.
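        The distance-keeping part of that can be sketched roughly as picking a target speed. This is a hypothetical illustration only; the function name, the 40 m gap threshold, and the 0.9 back-off factor are invented, not any real system’s logic:

```python
# Hypothetical sketch of the adaptive-cruise-control behavior described
# above: hold the driver's set speed on a clear road, but slow down behind
# a slower lead vehicle to keep a following gap. All numbers illustrative.

def acc_target_speed(set_speed, lead_speed=None, gap_m=None, min_gap_m=40.0):
    """Return the speed the car should aim for.

    set_speed  -- driver's chosen cruise speed
    lead_speed -- speed of the vehicle ahead (None if the lane is clear)
    gap_m      -- current distance to that vehicle, in meters
    """
    if lead_speed is None:
        return set_speed  # clear road: hold the set speed
    if gap_m is not None and gap_m < min_gap_m:
        # Too close: target below the lead vehicle's speed to reopen the gap.
        return min(set_speed, lead_speed * 0.9)
    # Gap is fine: never exceed the set speed, never outrun the lead car.
    return min(set_speed, lead_speed)
```

        So with cruise set at 70 behind a car doing 60 at a safe gap, the system targets 60; if the gap closes too far, it brakes to under 60 until the gap reopens.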

        I specifically explained that a plane’s autopilot does those things (maintain speed, altitude, and heading), and that people don’t know that this is all it does. It doesn’t by itself avoid obstacles or account for weather. It’d fly right into another plane if one were occupying that airspace. It won’t react to weather events like wind shear (which can cause a plane to lose altitude extremely quickly) or a hurricane. If there’s an engine problem and an engine loses power? It won’t attempt a restart. It doesn’t brake. It can’t land the plane.

        But Musk made claims that Tesla’s Autopilot would drive the vehicle for you without human interference. And people assume that autopilot (in the pop culture sense) does a lot more than it actually does. This is what I’m trying to point out.