Not that vehicles shipped after 2023 will be able to either.
Waymo, with lidar and all the extra tech, still uses remote human operators to deal with harder situations.
I don’t think we’re going to see FSD anytime soon. Turns out the last few percent can’t just be ignored, 95% isn’t good enough, and 100% is really fucking hard.
The funny part is that every CS grad student studying AI understood that perfectly well a decade ago.
Exactly. I have at least a dozen projects that prove the fucking point.
One can hope there will be an avalanche of class action suits that crush this nazi and his swasticar.
It would be funny if damages got astronomical because he promised everyone that they would be able to rent out their car as a taxi service for extra income.
Techbros are learning this right now with AI too. Who could have guessed getting mostly there isn’t the same as being there.
This is a more extreme case of the famous 90-90 rule in programming:
“The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.”
You’re wrong. We had Tesla FSD for years now. It just doesn’t work 🙃
Humans aren’t better at driving in bad situations, they’re just better at ignoring most of the input and focusing on one thing, and more importantly, at taking risks, which a computer isn’t going to be programmed to do. If a human navigates through a bad rainstorm, barely able to see anything, and makes it out fine, they claim they’re better than a self-driving car, which would have shut down. Or more simply: a route that is very tight and risky, where a human will just YOLO it and make it through. They’re not better, they’re just lucky a lot.
Seriously?
Yes, if you read them. If you consider doing a Hail Mary in bad weather and managing to not hit anything better, then I guess they are better… at taking risks.
Humans are better in bad situations because humans drive like humans, and they expect all the other cars on the road to also drive like a human.
The worst thing on the road is to be unpredictable, and an AI encountering a situation not in its training set is unpredictably unpredictable.
You’re correct on AI. But I laughed at you saying humans are predictable. Seen any dashcam footage?
I just imagine the dumbest thing someone can do in a situation and model for that.
Icy roads? Well, the logical thing to do is make sudden moves at the last possible second without leaving a buffer. Stand on the throttle at every intersection, and start braking when you normally would in the middle of summer, of course.
Honestly, you learn to predict unpredictability. Slight movements will tell you when someone is going to change lanes without a shoulder check, or cross three lanes of traffic to make an exit when they could just as easily have gone on to the next interchange without endangering themselves and others. Hell, I watch people’s eyes in their side mirrors look at me as they incorrectly judge how much space they have to insert themselves in front of me, when there’s a kilometer of space behind me they could use instead.