AI Transcription Is Pretty Bad… Don’t Use It… Yet

It’s tempting to use an AI transcriber to take notes for you. In fact, it’s already all over the healthcare world, helping physicians write their notes. We’re sure it’s tempting to look at this technology as a student and ask, “Can my friend use it to transcribe a lecture for me if I’m going to be out of class?” or, if you’re a writer, “Can I use it to transcribe an interview?”

Why are we telling you not to use it? Because it’s really, really bad. News reports show that it hallucinates, inserting racist statements during patients’ pauses, and otherwise produces false transcriptions that could corrupt EMR notes.

This is important because patients have a legal right to their EMR notes. If you’ve ever tried to access yours, they probably appear in your EMR app. Can you imagine reading that you made racist slurs or said something you never said? The worst part is that you can’t prove your innocence, because Whisper, the tool most healthcare organizations and providers use for AI transcription, erases the underlying data.

What Can You Do to Avoid Corrupted AI-Transcribed Files?

First, if you’re not going to attend a class, ask your friend to make an original recording of the lecture rather than use an AI transcriber. Taking notes from an AI transcription might be easier, but it’s also less accurate, so watch or listen to the phone or video recording instead. If you’re conducting an interview, skip AI transcription entirely. Use a traditional recorder and do the work of quoting your subject the old-fashioned way. Sometimes taking longer really does mean more accuracy.

Which AI Transcription Technology Should You Avoid Right Now?

First, notice we said, “Which AI transcription technology should you avoid RIGHT NOW,” not FOREVER. We are for artificial intelligence because we believe it has tremendous potential to improve the world. We also understand why health organizations, students, and writers want to use AI transcription.

Have you ever been in a room with a provider who doesn’t even look at you because they’re busy typing patient notes? That’s why AI transcribers exist. Unfortunately, they’re currently inaccurate, so we advise using old-fashioned recordings until the gaps are fixed. This fits the position we’ve blogged about frequently: AI is a great innovation we should utilize. Still, you need an AI detector and an excellent AI rewriter, because vulnerable student populations are at higher risk of being falsely accused.

For now, avoid Whisper. Again, you won’t need to avoid it forever. Fixes will be made, and judging by the number of stories published about the issue, those fixes will likely happen soon; OpenAI is very responsive to making changes. Until then, the published news stories suggest avoiding any AI transcribing technologies or apps powered by the Whisper platform.

Another tool to treat with care is Google’s AI Overviews, which raises similar accuracy concerns. We’ve already blogged about AI Overviews: our previous post warned that Google frequently provided a weaker answer. As an example, that post noted that the AI Overview tool once recommended “non-toxic glue to keep cheese from falling off pizza.”

Our takeaway for students and writers: AI tools are coming out all the time, and we research so you can succeed. Treat AI transcription technologies carefully and go the old-school route for now. We will update this blog when things change. We think they will, though!
