In the piece, titled “Can You Fool a Self Driving Car?”, Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning, with slow-motion clips showing the car crashing not only through the styrofoam wall but also into a mannequin of a child. The Tesla was likewise fooled by simulated rain and fog.

  • Soleos@lemmy.world · 1 day ago

    The bar set for self-driving cars: Can it recognize and respond correctly to a deliberate optical illusion?

    The bar set for humans: https://youtu.be/ks11nuGGupI

    For the record, I do want the bar for self-driving safety to be high. I also want human drivers to be better, because even not-entirely-safe self-driving cars may still be safer than humans at a certain point.

    Also, fuck Tesla.

    • legion02@lemmy.world · 1 day ago

      I mean, it also plowed through a kid because it was foggy, then rainy. The wall was just one of the tests the Tesla failed.

      • Fermion@feddit.nl · 1 day ago

        Right, those were the failures that really matter; Rober included the Looney Tunes wall to get people sharing and talking about it. A scene painted on a wall is a contrived edge case, but pedestrians and obstacles in rain or fog are common.

        • Vlyn@lemmy.zip · 1 day ago

          It’s no longer an edge case if faulty self-driving becomes the norm.

          Want to kill someone in a Tesla? Find a convenient spot and paint a wall there.

          It doesn’t even have to be an artificial wall; for example, take a bend on a mountain road and paint the rock face.

          • merc@sh.itjust.works · 3 hours ago

            A better trick would be to paint the road as continuing straight where there’s a cliff. Much easier to hide the evidence that way.

          • Korhaka@sopuli.xyz · 22 hours ago

            The next test I’d love to see: what’s the minimum amount of fake road needed to fool it?

            • Fermion@feddit.nl · 17 hours ago

              Have you ever seen examples of how the features an AI picks out to identify objects aren’t really the same ones we pick out? You can generate images that look unrecognizable to people but contain clearly identifiable features to an AI. It would be interesting to see someone play with that concept to find ways to fool Tesla’s AI. Could you make a banner that looks like a barricade to people but that the car reads as open road? (Rough sketch of the usual technique below.)

              This isn’t a great example for this concept, but it is a great video. https://youtu.be/FMRi6pNAoag?t=5m58s
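
              For anyone curious, the standard demo of this idea is a gradient-based attack. Below is a minimal sketch using the Fast Gradient Sign Method with an off-the-shelf PyTorch classifier; the model, epsilon, and omitted preprocessing are illustrative assumptions and have nothing to do with Tesla’s actual stack.

                # Minimal FGSM sketch: nudge each pixel in the direction that
                # increases the classifier's loss, so the prediction flips while
                # the image still looks the same to a human.
                import torch
                import torchvision.models as models

                model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
                model.eval()

                def fgsm_attack(image, true_label, epsilon=0.03):
                    # image: 1x3x224x224 tensor in [0, 1]; preprocessing omitted
                    image = image.clone().detach().requires_grad_(True)
                    loss = torch.nn.functional.cross_entropy(model(image), true_label)
                    loss.backward()
                    # One signed gradient step per pixel, clamped to a valid image.
                    return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

                # Usage: x_adv = fgsm_attack(x, y) is often misclassified even
                # though x_adv and x are nearly indistinguishable to the eye.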

              • Korhaka@sopuli.xyz · 16 hours ago

                I was thinking of something where the AI thinks the road turns left while humans see it turning right.

        • Possibly linux@lemmy.zip · 1 day ago

          I think it does highlight the issue with the real world: there will always be edge cases and situations that produce odd visuals.