this post was submitted on 19 Jan 2024
190 points (98.0% liked)

Asklemmy


Context: I'm a second-year medical student currently residing in the deepest pit of the valley on the Dunning-Kruger graph, but I'm still constantly frustrated and infuriated by the push to introduce AI for quasi-self-diagnosis and to loosen restrictions on inadequately educated providers like NPs from the for-profit "schools".

So, anyone else in a similar spot where you think you're kinda dumb, but you know you're still smarter than robots and people at the peak of the Dunning-Kruger graph in your field?

[–] SnotFlickerman@lemmy.blahaj.zone 7 points 10 months ago* (last edited 10 months ago) (3 children)

> infuriated by the push to introduce AI for quasi-self-diagnosis and to loosen restrictions on inadequately educated providers like NPs from the for-profit "schools"

That's because these decisions aren't being made on the basis of data, patient safety, or anything of the sort.

It has everything to do with bloated hospital administrations that eat up all the money and spend pennies on actually helping patients. It's usually not fucking doctors who are like, "You know what would be cool? If I could replace half my nurses with an AI on my phone. I would save so much money!"

It's just one more pot to piss money away into while saying it improves something.

CEOs and other business leaders regularly ignore data and evidence they don't like. Look at the Return to Office fight: they don't care about the data, they don't care about employee satisfaction or retention, or the savings on real estate. They are miserable and they want you to be miserable, too. They have more than enough money to weather all the bad decisions they make, because the worst parts of those decisions fall on the weakest and poorest in society, as usual. So they couldn't give a damn what anyone thinks of their shitty ideas: their shitty ideas are going to happen, because they've got the purse strings.

These ideas don't come from regular people. They come from an entire class of people who are completely disconnected from what any of these decisions actually do. They are only making decisions for a number on a spreadsheet, and sometimes they don't actually give a shit if the number goes down, as long as they get to keep feeling in control of other people. They literally don't care that their decisions are dumb and will hurt people; they're going to do them anyway.

IMHO, this has fuck-nothing to do with Dunning-Kruger and everything to do with decisions made by rich, out-of-touch twats.

[–] z00s@lemmy.world 4 points 10 months ago

If inaction were deemed socially acceptable as a management strategy, the world would be a vastly better place.

So many of these bullshit ideas come from managers who think they always have to be seen to be doing something, whereas sufficiently complex systems are often self-balancing and require little to no direct intervention.

But to act on this they'd have to admit that managerial jobs are largely bullshit and unnecessary.

[–] Tremble@sh.itjust.works 3 points 10 months ago

Sir, this is a Wendy’s. Sorry your fries took so long.

[–] medgremlin@lemmy.sdf.org 2 points 10 months ago* (last edited 10 months ago) (1 children)

At least I can rest assured that AI will be next to useless in my intended field. Emergency medicine is an environment where you get a random constellation of symptoms and complaints with very little direction on which ones are related to the current illness and which aren't currently relevant. Also, in the time it would take to feed all the info into an AI during a trauma/cardiac/code situation, the patient might be dead or rapidly heading in that direction.

[–] Tremble@sh.itjust.works 1 points 10 months ago (2 children)

Can't AI aggregate the data on triage outcomes to prioritize larger-scale emergencies? It has to be useful somehow, I would think.

[–] montar@lemmy.ml 2 points 10 months ago (1 children)

It wouldn't need to be AI, just some statistics and "Cardiac arrest? Priority 9. Broken arm? Priority 1" decision-making.

I'm a tech wizard, not a healer; there are probably factors that make one cardiac arrest even more critical than another, I just don't know them.
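For what it's worth, a minimal sketch of that kind of hard-coded decision-making might look like this. The complaint categories and priority numbers below are invented for illustration, not a real triage scale:

```python
# A minimal sketch of rule-based triage scoring -- the categories
# and numbers here are made up, not clinical guidance.

TRIAGE_PRIORITY = {
    "cardiac arrest": 9,
    "major trauma": 8,
    "chest pain": 6,
    "broken arm": 1,
}

def priority(complaint: str, default: int = 3) -> int:
    """Look up a priority for a chief complaint; unknown complaints
    get a mid-range default so a human still takes a look."""
    return TRIAGE_PRIORITY.get(complaint.lower(), default)

# Most urgent patients first.
waiting_room = ["broken arm", "cardiac arrest", "chest pain"]
for complaint in sorted(waiting_room, key=priority, reverse=True):
    print(priority(complaint), complaint)
# 9 cardiac arrest
# 6 chest pain
# 1 broken arm
```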

[–] medgremlin@lemmy.sdf.org 2 points 10 months ago

There are some things you look for that are difficult to describe to someone who hasn't seen them before. That's part of why experience is so valuable in Emergency Medicine, and it's not uncommon to put your best nurses out in triage. People will do this kind of twitchy/wilting/loss-of-focus/change-in-pallor/change-in-posture thing right before they go down. I don't have a good way to describe it; it might honestly be easier to draw, because it really is a body-language thing, and the general appearance of the patient informs your decision making.

[–] medgremlin@lemmy.sdf.org 1 points 10 months ago

I have thought about trying to plan out a learning algorithm that could spit out suggestions for triage level and preliminary tests based on input data like vital signs, symptoms, and complaints... but I would never implement something like that as anything more than a tool for the nurses at triage to use. There would always have to be an option to override the algorithm, because there are some aspects of patient presentation that aren't easily quantifiable. I'd never be able to explain them in a way that could be input into a computer, but even with my limited experience, I can tell which patients are going to crump on me.
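A rough sketch of that "suggestion plus mandatory override" design, purely hypothetical: the vital-sign thresholds and the five-level scale below are placeholders I made up, not clinical guidance.

```python
# Hypothetical "suggest, never decide" triage tool: the model
# proposes a level, and the triage nurse's call always wins.
# Thresholds and the 1-5 scale are invented placeholders.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Vitals:
    heart_rate: int   # beats per minute
    systolic_bp: int  # mmHg
    resp_rate: int    # breaths per minute
    spo2: int         # oxygen saturation, percent

def suggest_triage_level(v: Vitals) -> int:
    """Suggest a level from 1 (most urgent) to 5 (least urgent)."""
    red_flags = 0
    if v.heart_rate > 120 or v.heart_rate < 45:
        red_flags += 2
    if v.systolic_bp < 90:
        red_flags += 2
    if v.resp_rate > 28:
        red_flags += 1
    if v.spo2 < 90:
        red_flags += 2
    return max(1, 5 - red_flags)  # more red flags -> more urgent

def triage(v: Vitals, nurse_override: Optional[int] = None) -> int:
    """The override always wins -- "looks like they're about to
    crump" isn't a field you can type into a computer."""
    if nurse_override is not None:
        return nurse_override
    return suggest_triage_level(v)

shocky = Vitals(heart_rate=132, systolic_bp=84, resp_rate=30, spo2=88)
print(triage(shocky))  # algorithm suggests level 1

stable = Vitals(heart_rate=80, systolic_bp=120, resp_rate=16, spo2=99)
print(triage(stable, nurse_override=2))  # nurse bumps a worrying patient up
```

The point of the structure is that the override isn't an afterthought bolted on at the end; it's the primary code path.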