• Basic Glitch@sh.itjust.works (OP)

    Exactly. AI could be a very helpful and powerful tool with the potential to do some really good things, but it’s still a tool, not a replacement for an education and common sense. We’re all being intentionally dumbed down by dumb people who have been handed way too much power because of nepotism. They’ve accomplished so much destruction (success?) just by acting overly confident and pretending they knew what they were doing. It’s only recently that people have started saying out loud what I think many of us were silently screaming in our heads while waiting for somebody else to say it first: “For the love of God, just put it down and back away.”

    Calculators are useful tools, but imagine how fucking dumb the world would be if we all just collectively agreed that nobody needs to learn math anymore because we have calculators to do it for us.

    GPS navigation is a useful tool, but it doesn’t mean you should blindly follow the instructions it gives when it tells you to turn into a brick wall or a body of water.

    I’m not sure if it’s a recent liability thing in medicine, or maybe because my insurance requires me to go to a giant monopoly to remain in network, but over the last year it seems like most doctors have had it hammered into their heads that AI should be the first and only step they take when diagnosing a patient. Not a tool to augment human capabilities, not something you turn to for help when you can’t figure out what’s going on, not even something you consult to double-check your own conclusions.

    The old saying is “treat the patient, not the symptoms,” but it feels like it has suddenly become policy to treat every patient like a checklist of symptoms in a diagnosis box.

    That literally only makes sense for a computer making a diagnosis based on the number of check marks in each box reaching a predetermined cutoff value. That could be a useful way to prescreen people or streamline treatment for simple things like diagnosing a cold or the flu, but when a patient has persistent symptoms that only put one or two check marks in a lot of random boxes, never tallying up to meet any cutoff for a single diagnosis, wtf does the computer do at that point? What is the next step for the patient who is begging for help, only to be told, sorry, that doesn’t check off enough symptoms in any one box for me to be able to diagnose you?
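    To make that failure mode concrete, here’s a minimal sketch of the checklist-style scoring logic described above. The diagnoses, symptom lists, and cutoffs are all made up for illustration, not taken from any real screening tool; the point is only that a patient whose symptoms scatter across boxes never trips a cutoff and falls straight through the screen.

    ```python
    # Hypothetical checklist screener: each diagnosis "box" fires only when
    # enough of its listed symptoms are checked. All names and cutoff values
    # below are invented for illustration.
    CHECKLISTS = {
        # diagnosis: (cutoff, symptoms in that box)
        "flu":      (3, {"fever", "cough", "body aches", "fatigue"}),
        "migraine": (3, {"headache", "nausea", "light sensitivity", "aura"}),
        "anemia":   (3, {"fatigue", "pallor", "dizziness", "shortness of breath"}),
    }

    def screen(patient_symptoms: set[str]) -> list[str]:
        """Return every diagnosis whose cutoff is met; an empty list otherwise."""
        hits = []
        for diagnosis, (cutoff, box) in CHECKLISTS.items():
            if len(box & patient_symptoms) >= cutoff:
                hits.append(diagnosis)
        return hits

    # A patient with persistent but scattered symptoms checks one or two marks
    # in several boxes, meets no cutoff, and the screener returns nothing.
    print(screen({"fatigue", "headache", "dizziness"}))  # -> []
    ```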

    🤷‍♀️ is also literally the answer when the doctor is unable (or maybe not allowed, due to policy) to think critically and evaluate the available information in the context of the patient, rather than against a predetermined diagnostic screener somebody else created.

    Anytime I hear “we can just fill it in with AI” I feel like we’re getting closer and closer to Idiocracy becoming a reality, and it feels especially close when it comes to medicine. It’s like somebody saw this scene and didn’t get that it was supposed to be a joke, not something to aspire to: