Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”
“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”


Lacking for either sex. Even though they’re wrong anyway, did you know the supplement RDAs are all for women?
And… I’m not sure how much it’s really catching up, and how much it’s just doling out just enough placation to let the racket continue.
“For-Profit Medicine” is an oxymoron that survives on its motto: “A patient cured is a customer lost.” … And a dead patient is just a cost of doing business. … No wonder “Medicine” is the biggest killer. Especially when you consider how much heart disease and cancer (and most other disease) comes from bad medical advice too, thus making all three of the top killers (and others further down the list) iatrogenic¹.
We may be getting there so slowly that it will take longer than the life of the universe, given how much is still headed in the wrong direction, away from mending the system, since seemingly all of the incentives (certainly the moneyed ones) push the other way: towards maximising wealth extraction rather than maximising health. We’ve let the asset managers, the vulture capitalists, get their fangs into the already long-corrupted health care systems (some places more than others), and from here we’ll see it worsen faster, perhaps towards a complete-collapse asymptote, as the rotters eat out all sustenance from within it.
¹ “Induced unintentionally in a patient by a physician. Used especially of an infection or other complication of treatment.”