NOTE: This post first appeared on KevinMD.com.
Speech recognition software is an important part of my clinic workflow. I use Nuance’s Dragon NaturallySpeaking, the industry-leading application, and I estimate that it has saved me at least 1,000 hours of documentation time over the last decade. My typing is much slower than my speaking, and since my goal is to leave the office at a reasonable hour every day, using Dragon is the obvious choice for me.
But there is always a trade-off when using fancy new technologies. Like poor Dr. Bean in the comic strip above, I frequently find transcription errors in my chart reviews. Here is one that I found this week:
“He is continue to take gabapentin 903 times a day …”
Someone who actually took gabapentin 903 times a day would receive 90,300 mg, assuming 100 mg tabs, which is surely enough to render anyone unconscious long before the 903rd dose. But the software doesn’t know that. All it knows is that the speaker said, “nine hundred three times a day.” A lot of Dragonisms are like this. The text is absurd if you read it literally, but if you know medicine then you can probably figure out what the clinician meant from the context.
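For the pedantic, the arithmetic behind the absurdity can be sketched in a few lines. This is just an illustration, and the “intended” reading is my assumption that the clinician dictated 900 mg three times a day:

```python
# What the transcription literally says: 903 doses per day.
doses_per_day = 903
mg_per_tab = 100                      # assuming 100 mg tablets, as in the post

literal_daily_mg = doses_per_day * mg_per_tab
print(literal_daily_mg)               # 90300 mg per day, far past any sane dose

# What the clinician almost certainly meant (my assumption):
# "gabapentin 900 [mg] 3 times a day"
intended_daily_mg = 900 * 3
print(intended_daily_mg)              # 2700 mg per day
```

The two readings differ by more than a factor of thirty, yet they come from the same spoken words.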
A few years ago I started collecting these transcription errors for my own amusement. This one might be my favorite:
“Thank you for allowing us to produce pain in the care of this patient.”
This was from a neurosurgery consult note after the resident who dictated it had played hot potato with my service about who would admit the patient. He actually said, “Thank you for allowing us to participate in the care of this patient,” but Dragon stepped in as a truth serum translator.
Here are several more from my file:
“Dr. X was insulted and he requested to see the patient as an outpatient.”
“I spent several months with the patient discussing this plan.”
“It is my opinion that he currently lacks the cognitive skills necessary to participate injury duty, and I recommend that he be excused from this responsibility.”
“Patient does have some essential tremor trick-or-treating with primidone.”
“We talked for a long time about her rug gnosis”
“Her left patella just has been monitoring the vagal nerve stimulator …”
“He also complains of some ‘bumps’ on his head near the crown and at the Indian.”
“To my knowledge patient has had no thanks for your thoughts”
“He and his heart and her were agreeable with this plan and I did manifest answer their questions.”
“She has never lived in the cerebral country.”
“We talked about ways to redirect his a beer when he is agitated or confused.”
“It feels like his years need to pop.”
“She is having some behavioral side effects from this medication, so we will try her on pirate docs seen to see if these reduce.”
“She has had some episodes of leaving the water on a sink, and her husband’s cottages in time to up a flight from happening.”
“Everyone’s mother can get her to walk outside a little bit.”
“We discussed her options for management, including repeating diagnostic tests and adjusting vacation doses.”
(Note: Can I increase my vacation doses?)
My kids find nonsense like this hilarious. Most of these errors are pretty benign, but occasionally I find something truly embarrassing or offensive, like this:
“This is a X-year old illogical female who identifies as male . . .”
I actually said “biological,” not “illogical,” and I had no idea this error was in my note until I saw the patient back for a follow-up. It is not my practice to make disparaging comments about transgender people, in my clinic notes or anywhere else. Thankfully the patient accepted my apology when I explained what had happened, but can you imagine trying to explain this to a Twitter mob?
To think that such statements can be found in notes written by some of the most brilliant, competent, hard-working, and well-educated people in our society is kind of sobering when it isn’t funny. Healthcare providers are literate, often incredibly articulate, with legendary attention to detail. So how do we manage to say such stupid things in our chart notes, usually without even noticing that we have done it?
A review by Poder et al (2018) highlighted some of the tradeoffs between human transcription services and speech recognition software. The major benefit of speech recognition is the dramatically shorter turnaround time for note writing, but this comes at the cost of a major-error rate up to three times higher than with human transcription, and clinicians must spend more time proofreading their transcripts in real time. I think that last point could largely explain the error rate. When you place a greater burden on an already burned-out workforce, you can expect that burden to be borne poorly.
I once read an ED note that contained 9 errors within 10 run-on sentences, and then closed with this disclaimer:
“This note was created using voice recognition software and may contain some technical errors, but every effort has been made to ensure its accuracy.”
I’m a little skeptical of the “every effort” claim, but I can totally sympathize.
My purpose here is not to say we shouldn’t use speech recognition software, or to disparage Nuance’s very impressive and incredibly useful application. I plan to continue using Dragon every day, and my family will thank me for getting home at a sane hour every evening. What I hope is that we will all think more critically about our medical documentation tools, which are just like any other clinical tools. Do the medicines we prescribe and the procedures we perform carry risk? Absolutely! But we know those risks and weigh them carefully when making clinical decisions. The same is true of my documentation tools. I understand the tradeoffs involved, and I accept them.
Speech recognition is error-prone and a bit labor-intensive, but it gets the job done very quickly. And it makes me laugh more often than the other options, which itself can be rather valuable.