It’s not hard to find videos of self-driving Teslas wilding in bus lanes. Check the videos out, then consider:

"There was an interesting side-note in Tesla’s last earnings call, where they explained the main challenge of releasing Full-Self Driving (supervised!) in China was a quirk of Chinese roads: the bus-only lanes.

Well, jeez, we have bus-only lanes here in Chicago, too. Like many other American metropolises… including Austin TX, where Tesla plans to roll out unsupervised autonomous vehicles in a matter of weeks…"

It’s one of those regional differences in driving that make a generalizable self-driving platform an exceedingly tough technical nut to crack… unless you’re willing to just plain ignore the local rules.

  • FreedomAdvocate · 3 days ago

    No, it didn’t switch it off because of that.

    FSD and Autopilot do different things using the same data. It’s a fact that they behave differently.

    • anomnom@sh.itjust.works · edited · 2 days ago

      From NHTSA in 2022:

      “The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact,” the report reads.

      https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF

      How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?
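
      To make that concrete, here’s a purely hypothetical Python sketch of the kind of disengagement rule that would produce exactly the pattern NHTSA describes (warning fires, AEB intervenes, Autopilot hands control back under a second before impact). None of the names or thresholds below come from Tesla’s actual code; they’re invented for illustration.

      ```python
      # Hypothetical illustration only -- not Tesla's code. All names and
      # thresholds are made up to show the pattern described in the NHTSA quote.

      TIME_TO_COLLISION_CUTOFF_S = 1.0  # assumed disengagement threshold


      def autopilot_step(time_to_collision_s: float, autopilot_engaged: bool) -> dict:
          """Return the hypothetical actions taken in one control cycle."""
          actions = {
              "forward_collision_warning": False,
              "automatic_emergency_braking": False,
              "autopilot_engaged": autopilot_engaged,
          }
          if time_to_collision_s < 2.5:   # warn the driver first
              actions["forward_collision_warning"] = True
          if time_to_collision_s < 1.5:   # then brake automatically
              actions["automatic_emergency_braking"] = True
          if time_to_collision_s < TIME_TO_COLLISION_CUTOFF_S:
              # The pattern the quote describes: control is handed back to the
              # driver moments before impact, even though the system already
              # "knew" a collision was imminent.
              actions["autopilot_engaged"] = False
          return actions


      print(autopilot_step(time_to_collision_s=0.8, autopilot_engaged=True))
      ```

      Running it with a time-to-collision of 0.8 s prints warning and braking active but Autopilot disengaged, which is the sequence described in the quote above.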