TL;DR: this doctor started using an AI tool to transcribe meetings with patients instead of writing up his own notes about each patient’s symptoms, progression, concerns, etc. And it worked very well for that specific purpose.

Unfortunately, this doctor realized, the mental work of writing up his own notes was vital to actually remembering his patients and understanding their needs.

I sat down to review a patient I had seen six weeks previously. I read the note. It was accurate. It was comprehensive. It contained no factual errors that I could identify. And I did not recognise it.

The voice in the note was not my voice. The emphasis was not my emphasis. The clinical narrative — the selective, interpretive story that a GP constructs to capture not just what was said but what mattered, what worried them, what they decided to watch and why — was absent. In its place was a faithful transcription of everything spoken, organised by structure rather than by clinical significance.

I had not written that note. An algorithm had written it, in response to sounds it had heard, and I had approved it, hurriedly, at the end of a session. Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out.

This is not a small thing. The clinical note in general practice is not merely a medicolegal record. It is, as research in the Journal of General Internal Medicine has articulated, a form of narrative medicine — a clinician-authored story that reflects how the physician understood the patient’s situation at that moment in time. The act of writing it is itself a cognitive process: it forces synthesis, prioritisation, and reflection. It is, in a real sense, how we think.

When we outsource that act to an AI, we are not merely saving time. We are externalising a cognitive function that was doing clinical work we didn’t realise it was doing.

As SpaceNoodle reminded me in the comments, people in the tech industry have been warning for years that outsourcing basic coding tasks to LLM tools deprives new coders of the experience they need to take on more complex tasks.

I wonder how many other professions will find that the “grunt work” — the stuff simple enough to be outsourced to an AI, like interview transcription — plays a role they hadn’t recognized until they outsourced it?

  • SpaceNoodle@lemmy.world · 2 days ago

    The amount I learn every time I code up some mundane tool may at times be small, but the cumulative effect is significant. It’s surely how I’ve reached my current point in my career, and that’s how it shall remain when I leave it.

    Surrendering the last of our cognition to a tool has dire consequences.

    • Saprophyte@lemmy.world · 2 days ago

      My sister is a teacher. She explains that her students don’t need to know the information in their papers; they need to know how to research, cite, and write those papers. She says she feels sorry for the students who just spit out a paper with an LLM and don’t care about the skills they are letting atrophy and will eventually lose. They are losing the ability to think critically, do research, and then take that information and summarize it to convey thoughts and ideas to others.

      What happens when a generation of people lose that capability?

    • homes@piefed.world · 2 days ago

      Practice makes perfect. But if you don’t keep practicing, how are you supposed to get any better?

  • leadore@lemmy.world · 1 day ago

    I read the note. It was accurate. It was comprehensive. It contained no factual errors that I could identify. And I did not recognise it. Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out.

    If you couldn’t recall enough to reconstruct what had been left out, then how do you know it was accurate and comprehensive? hmm? HMMMM?

    What a nightmare.

  • merc@sh.itjust.works · 2 days ago

    This is an interesting story because:

    • The AI transcription was perfectly accurate, even with medical jargon
    • Not having to take his own notes allowed him to spend more time with his patients, and to listen to them more closely
    • He felt less stressed and less burned out as a result
    • It wasn’t de-skilling him, at least not in the way we traditionally think of it

    It’s basically a best-case scenario for LLMs, and it still made things worse. Taking notes felt like a tedious chore that kept him from doing his job. But he discovered that taking notes was part of his job, and that if he didn’t do it he couldn’t properly care for his patients.

    Maybe once he realizes why it was failing him, he’ll be able to adjust his process so that he can take advantage of the machine learning system. It might be as simple as looking over the results immediately after the consultation and scribbling things in the margins so he doesn’t forget the key takeaways. Or maybe the old note-taking process is simply the best one, and the LLM can’t offer anything that actually helps.

  • DevDave@piefed.social · 1 day ago

    When I was younger, I worked exclusively on a VT220 (best of the worst) Unix terminal, using vi to write programs in C/C++. Yeah, I got a little fancy near the end of that with three separate terminals, but the point is I had also memorized an absolutely ridiculous amount of information, starting with the syntax rules for four somewhat related languages (bash, C, C++, and whatever the fuck makefiles are), plus hundreds if not thousands of C and C++ functions, without needing to consult documentation or rely on autocomplete to fill in the blanks.

    30+ years later I have a command palette tool I absolutely depend on along with degraded typing skills that show whenever I have to type something out completely like a peasant.

  • tiredofsametab@fedia.io · 2 days ago

    Medical transcription has been a thing for years, so I find it interesting that I don’t recall hearing about this until now, with AI. I guess it’s not as popular a topic.

    What he says, however, is a reason I don’t like using AI for writing code or design work; I want to learn, know, and remember every part of the systems I have ownership of. Having AI do everything removes that.