Last week, Dr. David Albert showed us the AliveCor Mobile Heart Monitor for iPhone 4 and 4S and how it can take an EKG in seconds and send it to your provider. Today, Dr. Nick van Terheyden talks about the Nuance voice recognition system and how it improves physician/patient relationships and physician efficiency, saving time and money for healthcare professionals and providers. In the second video below, Dr. van Terheyden demos the Nuance voice recognition system.
Now watch the video:
And watch the demo:
To see other videos in this series, please go to this page. And if you have a story to tell that can reduce healthcare costs and raise quality of care, please comment below or email me at joan@socialmediatoday.com. Thanks!
Transcript of videos (by TranscriptionStar)
Dr. van Terheyden video:
Joan: Hello. I’m Joan Justice from HealthWorks Collective, and I have with me today Dr. Nick van Terheyden from Nuance. Dr. van Terheyden has a background in medicine, information technology, and speech recognition, and is Chief Medical Information Officer at Nuance Healthcare Solutions.
Nuance develops medical voice recognition software that helps increase the quality of documentation and improve efficiency for healthcare professionals and providers. Dr. van Terheyden, tell us a little bit about Nuance and what you do to raise the quality of care.
Dr. Nick: Well, hi Joan, and thanks for the opportunity to join you. It’s always exciting to be in these virtual exchanges, and thank you for those kind words. As you described, Nuance is a large organization, I think most well known for its speech recognition technology; Dragon NaturallySpeaking is probably the product most people would associate with the company. As a multi-billion-dollar organization with many different sectors, healthcare is about half of what we do. The balance is all the other things you interact with: when you call up an airline to get flight information, that’s our technology typically working in the background.
When you’re in your own car and you’re interacting with the car systems, VSync and some of those other applications, those are our systems as well, so we have a range of different products. What’s exciting about that from the healthcare standpoint is that all the research and development, and we spend somewhere on the order of $160 million per year on research and development, doesn’t just benefit those areas. It benefits healthcare as well.
One example that’s kind of interesting: when you’re talking to your car, that’s a very noisy environment. All of the research that went into making that work effectively has already been transferred into healthcare.
Joan: I see.
Dr. Nick: And you know, the ED specifically is a very similar environment to talking in a car.
Joan: Sure.
Dr. Nick: You know, we worked out how to filter out that noise and make voice recognition more effective. In the healthcare space, I think we’re very well known for both our backend transcription solutions, which support and automate transcription to improve efficiency, and obviously Dragon Medical NaturallySpeaking; somewhere on the order of 400,000 clinicians are using our technology in some form or another. Many are using the backend, where they dictate into a phone or some device, somebody else, the transcriptionist, is involved in reviewing it [Indiscernible] [0:03:03], and then it gets returned and they sign off on it.
And then there’s the interaction we have with systems that we voice-enable using Dragon, allowing people to dictate. More recently, we’ve introduced a number of other tools that give you much more flexibility in that exchange. We’ve started to introduce mobility solutions, so you can use mobile devices to dictate. Everybody likes these devices, but the challenge has been how to get an easy interaction to capture information; the keyboards on these devices can be a challenge, even on the bigger ones.
So voice sometimes proves to be a very useful tool there. The other thing that I think has been really important in terms of elevating the standard of care, and introducing tools that facilitate clinicians and help patients get more value, has been the intelligent voice systems that essentially allow you to interact with the technology and smooth out the complexities. Many clinicians complain about the difficulty of working with an electronic medical record. It’s a very complex environment; finding the lab results might require you to go to a specific menu, select the patient, and so forth.
And what we find is that for some people, and it’s not universal, but certainly for those who struggle with that, simply saying the command “show me the lab results” can be easier. With intelligent voice systems, we’re turning that on its head and saying that rather than you having to remember a specific sequence, you can say it any way you want: “Show me the lab results.” “Show me the latest patients.” “What are my patients today?” Any one of those variations ought to produce the same effect; the synonyms are all accounted for.
And what we do in the backend of the technology is understand what was said, interpret it, and action it, so that users don’t have to learn the technology. It’s not always appropriate, but from the perspective of mobility and being out and about, it offers more flexibility than working with a keyboard, which can be quite challenging on these small devices. We can look at the information, but then how do we interact with it, and how do we actually capture it? If I want to enter data or interact with the system, say to document medical decision-making in a note, that can be difficult with the keyboard on a small device, but I can voice it, the system understands it, and then that can be used for additional functions downstream.
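To make the idea concrete, here is a minimal, hypothetical sketch of how several spoken phrasings can be normalized to a single intent and dispatched to one EMR action. This is not Nuance's API; the intent names, patterns, and the interpret function are illustrative assumptions only.

```python
import re

# Hypothetical intent table: each intent maps several spoken phrasings
# (synonyms) to one EMR action the system can execute.
INTENT_PATTERNS = {
    "show_lab_results": [
        r"show me the lab results",
        r"what are the (latest )?labs",
        r"pull up the lab results",
    ],
    "list_todays_patients": [
        r"show me the latest patients",
        r"what are my patients today",
        r"who am i seeing today",
    ],
}

def interpret(utterance: str) -> str | None:
    """Return the intent name for a recognized utterance, or None if no match."""
    text = utterance.lower().strip(" ?.")
    for intent, patterns in INTENT_PATTERNS.items():
        if any(re.fullmatch(p, text) for p in patterns):
            return intent
    return None

# All of these variations resolve to the same action:
for phrase in ["Show me the lab results.",
               "What are my patients today?",
               "Show me the latest patients"]:
    print(phrase, "->", interpret(phrase))
```

In a real system the pattern matching would be replaced by statistical language understanding, but the principle is the same: the clinician speaks naturally, and the backend resolves the variation to one action.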
Joan: Okay, what are some of the patient benefits of this? I can see where it might be very beneficial to patients; talk a little bit about that.
Dr. Nick: So patients, you know, typically have struggled with technology because it shifts the focus away from the interaction. One of the things we hear repeatedly is that when they go into a physician’s office, they’re frustrated by the technology detracting from the interaction, the physician dealing with the keyboard and the screen when they want the physician focused on them. I think one of the key facets of this is that intelligent voice systems can start listening to that exchange, so as we interact as physician and patient, you’re explaining your conditions.
You’re talking about your challenges, “I’m having difficulty walking upstairs.” We listen in on that. We listen to the physician, and we’re starting to use clinical language understanding technology to extract the information, and then we can present it back to the physician and the patient for review. That does two things. One, it allows me to focus on the patient; from a patient satisfaction standpoint, that’s going to increase the level of satisfaction, and the patient interaction will improve.
And number two, it makes that information much more readily available, not just for the clinician but also for the patient. We’re seeing more patient engagement, and delivering that information quickly is a key part of this. There’s no waiting for hours, days, sometimes weeks to receive it. I want it now.
Joan: That’s excellent. And can it eliminate transcription, do you think, or will it eventually?
Dr. Nick: Well, I would say transcription is dead; long live transcription. We’ve been predicting the demise of transcription for a long time, and what transcription has done over the course of the last several years is morph into a different role, much more of a supportive role. I think the key to success with any voice system is delivering choice to clinicians and to patients. Sometimes choice means I want to pick up, dictate, hang up, and be done with it, because I’m so busy; perhaps my colleague is off sick with the flu, I can’t cope with the patient load because it suddenly doubled, and it’s much quicker for me to offload that to somebody else and review it later. That’s really a backend process.
We still apply the technology. We can still do all the intelligent things, but with a little bit of delay. There might be another time when I’m not as busy, or it’s a patient I’m interacting with on the weekend and I want that report ready so that when they’re admitted to the hospital as an urgent case the information is transferred immediately, so I’m willing to interact with the voice system directly.
So I don’t think we’ll ever see transcription disappear entirely. What we’ll see is it morphing into this supportive role, and we’ve already seen that with some of these scribe solutions that have editors working in conjunction with clinicians. I think transcription will continue to persist in some form, but maybe we should change the name; we’ve already changed it from transcriptionists to editors. They still contribute tremendously. Their value proposition changes, and technology will just make them more efficient, as it did with the backend. It might morph into a different role as part of that interaction.
Joan: All right, well, thank you so much, Dr. van Terheyden, for explaining more about voice recognition software and what it can do in the way of raising the quality of care. Thank you so much.
Dr. Nick: Joan, thanks for the opportunity. It’s always a great pleasure to chat with you. Thanks.
Demo:
Nick: So what I’d like to do is show you our demonstration environment using the Dragon Medical Speech Anywhere Services. I’m going to log in using my user ID and password, or indeed the system will offer me the opportunity to authenticate using my voice; we’re using voice biometrics, another tool set that tries to smooth out the interaction for people working with technology.
[Welcome on BT how can I help you?]
Nick: So it speaks out information. Now, I can interrupt that or turn it off if I want. “Show me today’s appointments.” “Select patient Adam Branson.” “Show me the vitals.”
[Adam Branson. He’s 72 years old. His height is 6 feet 0 inches and he weighs about 175 pounds. His temperature is normal. His body mass index is 23.71 and his blood pressure is 160 over 90 millimeters of mercury.]
Nick: So there you have an interaction that can read out information. I can select patients, and how I use that is completely dependent on people’s individual use cases and what they want, but there’s flexibility in that interaction.
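As an aside, the vitals readout in the demo is essentially a text-to-speech rendering of structured chart data. Here is a small, hypothetical sketch of how that spoken summary could be assembled; the Vitals class, field names, and speak_vitals function are illustrative assumptions, not Nuance code.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    """Hypothetical structured chart data behind a 'Show me the vitals' command."""
    name: str
    age: int
    height_in: int    # total height in inches
    weight_lb: float
    temp_normal: bool
    bp_systolic: int
    bp_diastolic: int

    @property
    def bmi(self) -> float:
        # Standard BMI formula for imperial units: 703 * lb / in^2
        return round(703 * self.weight_lb / self.height_in ** 2, 2)

def speak_vitals(v: Vitals) -> str:
    """Render the chart data as the kind of spoken summary heard in the demo."""
    feet, inches = divmod(v.height_in, 12)
    temp = "normal" if v.temp_normal else "abnormal"
    return (f"{v.name}. He's {v.age} years old. His height is {feet} feet {inches} inches "
            f"and he weighs about {v.weight_lb:.0f} pounds. His temperature is {temp}. "
            f"His body mass index is {v.bmi} and his blood pressure is "
            f"{v.bp_systolic} over {v.bp_diastolic} millimeters of mercury.")

patient = Vitals("Adam Branson", 72, 72, 175, True, 160, 90)
print(speak_vitals(patient))  # BMI works out to roughly 23.7 for 6'0" and 175 lb
```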