In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing into a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • arc@lemm.ee · 18 hours ago

    I saw the video and I have two points:

    1. Yes, it plays like an infomercial for lidar, so take that portion with some skepticism. I can also think of issues exclusive to lidar, like two or more lidar-equipped cars blinding each other, which still needs to be solved; e.g., some kind of light pattern encoding to mask out unwanted signals.
    2. It absolutely 100% demonstrates the issue with camera-only technology in Tesla vehicles.
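    The interference fix mentioned in point 1 is essentially a correlation trick. As a rough sketch (my own toy model, not any vendor's actual implementation): each unit transmits its own pseudorandom pulse code and cross-correlates the return against that code, so echoes from another unit's uncorrelated code mostly wash out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch: each lidar unit emits a unique pseudorandom +/-1 code.
own_code = rng.choice([-1.0, 1.0], size=64)    # our unit's code
other_code = rng.choice([-1.0, 1.0], size=64)  # an interfering unit's code

# Simulated received signal: our echo at delay 100, interference at delay 300.
received = np.zeros(512)
received[100:164] += own_code
received[300:364] += other_code
received += rng.normal(0.0, 0.2, size=512)     # sensor noise

# Cross-correlate against our own code; only our echo aligns coherently.
corr = np.correlate(received, own_code, mode="valid")
peak = int(np.argmax(corr))
print(peak)  # recovers the true delay of our echo (100), not the interferer's
```

    The matched echo sums coherently to roughly the code length (64), while the interferer's mismatched code averages toward zero, which is why the peak lands at our delay.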

    Teslas used to have cameras + radar, but they cheaped out and removed the radar. I think it would have passed all the tests if they still had the front-facing radar, but they don't. The problem with cameras alone is obvious: they can't see what they can't see, and they probably don't have an innate sense to slow down in rain, fog, ice, or whatever else would cause a human to.
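    The camera + radar argument can be boiled down to a simple fusion rule. This is a toy sketch of my own (not Tesla's actual logic, and the 40 m stopping distance is an arbitrary placeholder): brake if *either* sensor reports an obstacle within stopping range. A painted wall fools the camera into reporting "clear", but a flat surface still returns a radar echo.

```python
from typing import Optional

# Toy OR-fusion rule (an assumption for illustration, not a real autopilot stack):
# brake whenever either the camera or the radar reports an obstacle in range.
def should_brake(camera_clear: bool,
                 radar_range_m: Optional[float],
                 stopping_distance_m: float = 40.0) -> bool:
    radar_obstacle = radar_range_m is not None and radar_range_m < stopping_distance_m
    return (not camera_clear) or radar_obstacle

# Painted-wall scenario: camera sees "road ahead", radar sees a surface at 30 m.
print(should_brake(camera_clear=True, radar_range_m=30.0))  # True: radar saves it
# Camera-only car in the same scenario has no radar input at all:
print(should_brake(camera_clear=True, radar_range_m=None))  # False: drives through
```

    The point of the OR-fusion design is that a false "clear" from one sensor can't override a positive detection from the other, at the cost of more false-positive braking.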

    • Ricaz@lemmy.dbzer0.com · 16 hours ago

      Are there no standards for minimum required sensors on a car to get a “self-driving” badge?

      Every other field, especially the automotive industry, has such strict standards… Radar should be the bare minimum.