• 3 Posts
  • 1.7K Comments
Joined 3 years ago
Cake day: June 10th, 2023


  • Transportation Minister Devin Dreeshen says after a period of population growth, Alberta’s cities need more roads, not bike lanes. It’s a position that Edmonton’s mayor and cycling advocates say they have concerns with.

    “When you’re taking that lane away, that has a whole bunch of cascading effects to people that live in that area,” Dreeshen said at an unrelated news conference on Monday.

    This is literally the opposite of what the science has been saying for decades now.

    Why is it that I cannot get any job without knowing as much as possible about said job, but when it comes to management and making decisions (especially in politics), suddenly the less you know, the better? Why is it that almost no politician actually knows anything (let alone has expertise and experience) about the subjects they need to make decisions about?


  • The problem is not the tech. LLMs (AI does not exist, not yet anyway) have their uses and are impressive technology.

    The problem is the tech bros and all the mouth breathers who follow the tech bros without question while they insert lies and “AI” everywhere it’s not supposed to go, while the places where it would actually be useful have so far been mostly neglected.

    I see, for example, a use in having AI check MRI results for cancer. A doctor already checked and found nothing, and an AI does a second check and might find something the doctor overlooked. A real doctor then needs to check the results again to confirm the flag. Please note, I’m not a doctor, I might be talking nonsense right now, but the point I’m making is that AI may be useful as a second pair of eyes.

    AI can be, and has been, used to find novel mathematics. Mind you, AI is not creative; it just tries really weird and unexpected pathways to get to a solution, which is sometimes useful.

    But look at the way AI is used now: making porn of your little niece, chatbots, and hey, how about an AI pilot, eh? And AI can of course take over the work of thousands of developers and DevOps employees, so let’s fire them all, and then figure out that AI can’t do any of this shit, not nearly at the level required, and that it fucks up about 30% of the time…

    People are losing their jobs over this

    I am losing my job over this

    I can’t find a new job either, because all the recruitment and job hunting is now AI slop. Where five years ago I got a job with 20–30 applications, I have now sent out 200 applications and gotten a single intro interview, and that’s it.

    AI promised to take away the mundane, boring, and dangerous jobs so we could focus on art and fun.

    AI took the art and fun and guess who’s left to do the mundane and dangerous?

    Yeah.

    Don’t even get me started on the shit we’ll face once we make real, actual AI. For the ethics, just watch “ST TNG: The Measure of a Man” to get started. It will be a shit show.


  • Though this was idiotic, I think we need to be careful with just blaming the surgeon and leaving it at that.

    Errors like this usually happen because of a chain of circumstances and other little mistakes, as with airplane crashes.

    I think it would be much better if we treated these sorts of incidents like airplane crashes: investigate everything that went wrong, all causes, without focusing on guilt during the investigation. Guilt can be determined from the results of that, but primarily I want us to get data on how this happened in the first place and on what we can do to keep it from happening again. This strategy was highly successful in aviation, and I’d like to see it applied here too, because too much still goes wrong in healthcare.