• 0 Posts
  • 6 Comments
Joined 2 years ago
Cake day: June 12th, 2023

  • I didn't not read it because "reading bores me." I didn't read it because I was busy. I have people round digging up my driveway, I have a 7 week old baby and a 5 year old son destroying the house :p I had prep for work, and I just did a bit of browsing and saw the post. Felt compelled to comment for a brief break.

    I'm not sure what you mean by "silly opinion." Everyone who has been arguing with me has been stating that everyone knows Teslas don't use LiDAR, and that's why this test failed. If everyone knows this, then why did it need proving? It was a pointless test. Did you know: fire is hot and water is wet? Did you know we need to breathe air to live?

    No?

    Better make an elaborate test, film it, edit the video, make it last long enough to monetise, post it to YouTube, and let people write articles about it to post to other websites. All to prove what everyone already knows about a dangerous self-driving car that's been around for 11 years…

    I'm sorry, I just don't get it. I felt like I was pointing out the obvious in saying that a test that's tailored to give a specific result, which we already know the result of, is a farcical test. It's pointless.


  • Excuse me.

    1. Did you write the article? I genuinely wasn't aiming my comment at you. It was merely commentary on the context implied by the title. I just watched a clip of the car hitting the board. I didn't read the article, so I specified that I was referring to the article title. Not the author, not the article itself. Because it's the title that I was commenting on.

    2. That wasn't an 18-wheeler, it was a ground-level board with a photorealistic picture that matched the background it was set up against. It wasn't a mural on a wall, or some other illusion with completely different properties. So no, I think this extremely specific setup for this test is unrealistic and is not comparable to actual scientific research, which I don't dispute. I don't dispute that the lack of LiDAR is why Teslas have this issue, and that an autonomous driving system with only one type of sensor is a bad one. Again: I said I hate Elon and Tesla. Always have.

    All I was saying is that this test, which is designed in a very specific way and produces a very specific result, is pointless. It's like me getting a bucket with a hole in it and hypothesising that if I pour in water, it will leak out of the hole, then proving that and saying: look! A bucket with a hole in it leaks water…


  • That’s fair.

    I didn't intend to give Tesla a pass. I hoped that qualifying what I said with a "fuck Tesla and fuck Elon" would show that.

    But I didn't think about it that way.

    In my defense, my point was more about asking "what did you expect the car to do?" in a test designed to show that a system which is not built to perform a specific function can't perform that function.

    We know that self-driving is bullshit, especially the Tesla brand of it. So what is Mark's test and video really doing?

    But on reflection, I guess there are still a lot of people out there who don't know this stuff, so at the very least, a popular channel like his will go a long way to raising awareness of this sort of flaw.



  • As much as I want to hate on Tesla, seeing this, it hardly seems like a fair test.

    From the perspective of the car, it's almost perfectly lined up with the background. It's a very realistic painting, and any AI trained on image data would obviously struggle with this. AI doesn't have the human ability to infer information from context. We can see the borders and know that they don't fit. They shouldn't be there, so even if the painting is perfectly lined up and looks photorealistic, we know something is up because it's got edges and a frame holding it up.

    This test, in the context of the title of this article, relies on the fairly dumb premise that:

    1. Computers think like humans.
    2. This is a realistic situation that a human driver would find themselves in (or that realistic paintings of very specific roads exist in nature).
    3. There is no chance this could be trained out of them (if it mattered enough to do so).

    This doesn't just affect Teslas. This affects any car that uses AI assistance for driving.

    Having said all that… fuck Elon Musk and fuck his stupid cars.