Natural Language Processing (NLP) and Veteran Suicide Prevention

Guest: Dr. Elizabeth (Liz) Marshall, MD, MBA

Presenter: Neal Howard

Guest Bio: Elizabeth (Liz) Marshall, MD, MBA is the Director of Clinical Analytics at Linguamatics. She served on active duty in computer operations in the United States Air Force as part of the logistics team in Operation Enduring Freedom, the military response to the September 11 attacks. After her military career, she became a research physician who dedicated her time to informatics solutions for improving mental health treatment for military Veterans. Her work as a Clinical Research Health Scientist at the Ralph H. Johnson Veterans Administration Medical Center in Charleston, SC led to substantial gains in the areas of suicidality and PTSD, and earned her the Research Training Institute Scholar Award from the ICRC-S (Injury Control Research Center for Suicide Prevention) in 2013. As project manager, she was responsible for the development of a highly effective e-learning training program for clinicians, “recognized by the government’s education system, the Talent Management System (TMS).” The Department of Defense has since requested to adopt this training for federal-wide use. She has reviewed and abstracted relevant study information from the electronic patient records of over 1,000 patients and approximately 300 clinicians across overall VA hospital care, including the physical and mental outcomes of those patients. After these projects, and prior to her fellowship training in informatics at the Medical University of South Carolina, she was an Assistant Professor and clinical manager for a nutrition study aimed at decreasing suicidality among Veterans.

Segment Overview: Elizabeth Marshall, MD, MBA, Director of Clinical Analytics at Linguamatics, discusses how NLP (Natural Language Processing) technology can improve veteran care and maybe even prevent suicides.

Transcript

Neal Howard: Hello and welcome to the program. I’m your host Neal Howard here on Health Professional Radio. Our guest is Dr. Liz Marshall. She’s a Director of Clinical Analytics at Linguamatics and she’s here to discuss how NLP or Natural Language Processing can improve veteran care and possibly even prevent veteran suicides. Welcome to the program Liz. How are you?

Dr. Elizabeth Marshall: Good. How are you?

N: Doing really well and glad that you could take the time. What is your specialty as a physician?

M: I was actually in mental health research and I have a large background at the Veterans Administration. Not only do I have a background as a research scientist there but I’m also a veteran.

N: Thank you for your service. As a veteran myself, I do understand the high rate of suicide among veterans, many of whom are suffering from often-untreated post-traumatic stress disorder, or PTSD. In your experience, what percentage of veterans would you say are suffering from PTSD?

M: I’d have to say a lot more than that are actually diagnosed.

N: From my own experience, and from talking with some friends of mine, I'm thinking somewhere around 30 to 40% of veterans are suffering from PTSD, and that number could be much higher if you break down the population to include the men and women who aren't diagnosed.

M: Correct.

N: Would you say that PTSD is the leading cause of veteran suicides?

M: I can honestly say that I don’t know for sure but I would have to say it’s pretty high on the list.

N: As a research physician, do you practice?

M: I did not practice. I did clinical research, where the focus is to help other physicians practice better.

N: Natural Language Processing or NLP, what is natural language processing?

M: So I guess the best way to explain natural language processing is to look at structured data versus unstructured data. When you're referring to an electronic health record, structured data comes from areas where you have click buttons or drop-down menus and you can pick something. Unstructured data comes from the areas where you're actually typing information into a free-text field, which of course is what most doctors' notes are made of.

N: So these notes are, I guess, a language all unto themselves because they're not readily structured?

M: Yes, and you need to utilize natural language processing to get information out of the unstructured data.
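[Editor's note: to make the structured/unstructured distinction above concrete, here is a minimal sketch. All field names, values, and note text are invented for illustration and do not come from any real EHR or from Linguamatics' platform.]

```python
# Illustrative only: the same patient fact stored two ways in an EHR.
# Field names, values, and note text are invented for this example.
structured_record = {
    "patient_id": 1001,
    "diagnosis_code": "F43.10",    # coded drop-down field (PTSD, unspecified)
    "medication_adherent": False,  # checkbox field
}

# Free-text note: the "why" lives only here, invisible to a coded report.
unstructured_note = (
    "Veteran reports nightmares have worsened. Stopped sertraline last "
    "month because of side effects; feeling hopeless about treatment."
)

# A report can filter on structured fields directly...
print(structured_record["medication_adherent"])  # False
# ...but the reason behind it requires processing the note text.
print("side effects" in unstructured_note)       # True
```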

N: Now you mentioned typing into fields, some of that information being unstructured even though it's being electronically added into fields on a computer. Are we still dealing with paper notes as well?

M: Well, most health systems have electronic health records, but there are certain documents, like PDFs, that are usually scanned into an electronic health record. Once you scan them in, you have to use optical character recognition (OCR) software, and then you can get the information out of it, so you can use NLP for that as well.

N: Now my question is, what is it about this unstructured data that keeps it from being included with the structured information? How does a file not get scanned, or how does something in a field not get included in the normal EHR?

M: So it is actually included in the EHR, but it's kind of trapped within a database, because you can't get that information out readily. You can't just run a report and get that information out of the electronic health record.

N: It doesn't lend itself to MySQL?

M: Yes, it's very difficult to get that information out. If you think about an EHR, it was put together for administrators. It wasn't actually made for clinicians. So there are important elements that are structured, like what appointments people have, whether they're going to the laboratory, when their next appointment is, and the different codes for their diagnoses, things like that. It kind of holds the information that says what is going on with the patient, but it doesn't say anything about why. So if you think about it, a patient might be non-compliant with their medication, but a clinician is going to write information within that patient record explaining why they're not adherent. Can they not make it to the pharmacy, do they simply not feel like taking their medication, or maybe they forget?

N: So the physician annotates that and then somewhere down the line, an analyst like yourself or someone else is going to need that information to make a much more informed decision about how to encourage this patient to be more compliant or compliant at all?

M: Correct. You can't actually change the behaviors and make patients better unless you know why they are doing what they do.
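[Editor's note: the medication-adherence example Dr. Marshall describes can be sketched with a toy rule-based pass over a note. A real NLP platform is far more sophisticated; the note text and reason patterns below are invented for illustration.]

```python
import re

# Illustrative only: a toy rule-based pass over a free-text clinical note,
# standing in for the kind of extraction a full NLP platform performs.
NOTE = (
    "Patient admits he has not been taking lisinopril. "
    "States he cannot make it to the pharmacy and often forgets doses."
)

# Hypothetical patterns mapping note phrases to non-adherence reasons.
REASON_PATTERNS = {
    "access": r"cannot make it to the pharmacy|no transportation",
    "forgetfulness": r"forgets?\s+(his |her )?doses|forgot to take",
    "refusal": r"refuses medication|does not feel like taking",
}

def extract_nonadherence_reasons(note: str) -> list[str]:
    """Return the non-adherence reasons whose patterns match the note."""
    note_lower = note.lower()
    return [reason for reason, pattern in REASON_PATTERNS.items()
            if re.search(pattern, note_lower)]

print(extract_nonadherence_reasons(NOTE))  # ['access', 'forgetfulness']
```

A structured report would only show "non-adherent: yes"; the pass above surfaces the *why* that the clinician buried in free text.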

N: Is any of this information withheld deliberately for any reason at all or is it just simply lying there having fallen through the cracks and no one knows that it’s there until they’re actually looking for it and have to go on this quest?

M: The second. I mean, we practice medicine to make patients better; it's not a malicious thing. The problem is, when you only have maybe 20 minutes with a patient and you're sitting in front of a workstation, you want to type and get all that information in, you want to have some eye contact, and you might actually have to do a physical of some sort. Trying to dig through certain pages within that patient's record to find the correct button to click, you're just not going to take that much time to do that.

N: Now I understand that you’re a mental health researcher and I also understand that NLP wasn’t developed or invented for the healthcare industry. In your experience, when you’re talking about some of this life-saving information that can be gleaned through NLP, is it more crucial would you say for the mental health aspect of the veteran or for the injury aspect or do they go hand-in-hand?

M: All of the above. If you think about the VA, we have the VistA system. So you're looking in the patient's record and they're marked as having had a suicide attempt. You know from a structured report that they have that suicide attempt in their past, but there isn't anything within the structured information saying whether they're actually at risk again or perhaps well controlled. It's the clinician's note that will report whether they're feeling hopeful, or perhaps they've just lost a loved one, or are close to making another attempt.

N: Do you think that the VA is going to spearhead much more widespread implementation of this technology?

M: Yes. I mean, I'm not working there anymore, but I do believe that. I've talked to quite a few people and there seem to be some efforts toward making it part of the workflow.

N: When NLP is doing its thing, can it determine what is pertinent based on who's making the inquiry or what department or entity, or does it just say, “You asked for it, here it is,” regardless of the nature or sensitivity of the information?

M: You're subject to the same HIPAA guidelines when utilizing NLP as with any other kind of reporting. It's very sensitive information.

N: So it's HIPAA compliant, as most of these things are?

M: Yes.

N: Where can we learn more about NLP online?

M: Well, you can go to my company's website at linguamatics.com, and of course we have plenty of information there, and of course there's always Google. But NLP is one of those areas that's so specialized that it's not something where you can usually take an easy course.

N: So it’s something that has to be seriously considered and researched I guess to get a handle on it. It sounds very complex.

M: It is. It depends on what kind of software you utilize. With most of the open-source software, you almost have to be a PhD-level data analyst, because it requires coding. And then there's the platform that I actually utilized during my fellowship training, where it was more important to understand the data itself than to know how to actually do the coding and things like that.

N: Well Dr. Marshall, I thank you for coming in and talking with us today.

M: Thank you so much.

N: You’ve been listening to Health Professional Radio. I’m your host Neal Howard in studio with Dr. Elizabeth Marshall, Director of Clinical Analytics at Linguamatics. Learn more at www.linguamatics.com. Transcripts and audio of this program are available at healthprofessionalradio.com.au and also at hpr.fm. You can subscribe to this podcast on iTunes, be sure and follow us on Twitter and Facebook and visit our affiliates page when you visit us at hpr.fm and healthprofessionalradio.com.au.
