I want to introduce myself as a Data Scientist who has worked with healthcare data for decades. I'm also writing to you as a frequent healthcare customer (let's call it like it is) who is discovering the misinformation that now exists as a result of the digitization of health records and processes. I'm writing with great alarm over the use of the Electronic Health Record (EHR) as the source of information for healthcare and treatment decisions. Healthcare customers need to be made aware of how relying on an electronic record affects their care, specifically in conjunction with Artificial Intelligence.
The following link describes how AI is currently being used in treatment decisions, including in Colorado: https://kffhealthnews.org/news/article/artificial-intelligence-pain-medication-narx-score/. A proprietary algorithm that produces a "NarxScore" purports to estimate the likelihood of prescription drug misuse and produces recommendations around (not) prescribing. The known factors considered are prescribing numbers, including the number of opioid prescriptions, the number of prescribing doctors, sedative prescriptions, and dosages. The other factors in the algorithm aren't revealed, which makes it impossible to speak to the accuracy of the information.
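To make the concern concrete: since the real algorithm is proprietary, here is a purely hypothetical sketch of how a risk score of this kind might combine the known factor types. Every weight and function name below is invented for illustration; none of it reflects the actual NarxScore formula.

```python
# Purely hypothetical illustration -- the real NarxScore algorithm is
# proprietary and its weights are not public. This only shows how the
# known factor types (opioid prescription count, prescriber count,
# sedative prescriptions, dosage) could feed a weighted risk score.

def hypothetical_risk_score(opioid_rx_count, prescriber_count,
                            sedative_rx_count, total_mme_per_day):
    """Weighted sum of prescribing factors; all weights are made up."""
    score = (10 * opioid_rx_count
             + 15 * prescriber_count
             + 5 * sedative_rx_count
             + 0.5 * total_mme_per_day)
    return min(round(score), 999)

# The same patient, once with an accurate record and once with stale
# data (a 3-week course erroneously recorded as months of refills):
print(hypothetical_risk_score(2, 1, 0, 30))   # accurate record -> 50
print(hypothetical_risk_score(12, 1, 0, 30))  # erroneous record -> 150
```

The point of the sketch is that any score built this way inherits the errors of its inputs: inflate the prescription count in the chart and the score rises with it, regardless of what actually happened.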
I don't know if my provider currently uses this software, but if not, this just exemplifies my future. Where does the NarxScore get its data? It could be Electronic Health Records; I never consented to having my information used this way, and I do own the information in my medical record. It could be pharmacy prescribing records, which I never consented to give a third party access to for these purposes. It could be the state drug monitoring database; again, I never consented to my information being used this way. However, data privacy is just one of the issues.
The larger problem is that there is no assurance that the data upon which the AI operates is complete, accurate, quality information.
AI is an incredible tool. Electronic Medical Records are a great concept, but the execution has failed. They're not accurate, in part (mostly?) because they pull data from other health systems without ensuring it is accurately incorporated into the health record.
I've been a regular customer of healthcare through the years, mostly due to being a weekend athlete (ortho surgeries). My most recent surgery was in February 2023 at UCHealth. I received three weeks of pain medications. At EVERY follow-up for the surgery (5-6 visits, the last one in August), the opioid prescriptions were back in the electronic chart, despite the surgeon's office, which had prescribed them, removing them. Repeatedly. Remember this part.
In August 2023, pickleball hit hard, with a broken foot. As a result of that injury and an altered gait, I'm currently trying to get care from my primary care provider (New West Physicians) for a hip that is almost constantly spasming. Extremely painful. Not responsive to OTC meds. However, I can't get any pain medication because the New West chart incorrectly has me on opioids for six months instead of three weeks. What is the data source? Obviously it is somehow connected to the UCHealth chart. How can I change it? There's nothing I can do about it, even if I knew the path by which the information got there, because without a change to the EHR data processing systems it will keep happening.
Likewise, reviewing the list of my conditions in the EHR shows a number of inaccuracies, from old diagnoses that no longer apply but are still listed as active to blatantly incorrect entries.
AI is already being used for healthcare prescribing decisions, as described above. A 2021 article from the National Library of Medicine states that "it is effective to develop AI that can predict the occurrence of specific diseases or provide individualized customized treatments by classifying the individualized characteristics of patients." (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8473961/)
Next come diagnostic decisions. Screening decisions. Treatment decisions. AI will be used to diagnose from transcriptions of clinic visits. AI, based on co-occurring diagnoses and history, will recommend whether screening or follow-up should occur (read: whether the insurer will pay for it). Determining the best treatment approach depends on complete, accurate information.
But AI doesn’t know what it doesn’t have. And it believes everything it’s told.
So what is the solution? Everyone's first response: "Go in and fix your own medical record." My personal philosophy is to clean up my own messes, but I didn't make this one; the moneymakers did. Beyond that, I happened to find this error; where else do I need to look? Do I have access to the source data and processes? And what information is being used by the AI? Remember, that's proprietary for the NarxScore.
The first solution is to put a halt to the use of AI in any healthcare processes, including visit transcriptions and beyond, until there is assurance around the integrity of the information captured in the Electronic Medical Record.
Further, cleaning up the existing EHR information and digital processes must include a full audit of the data architecture and ETL processes of the systems that produce the EHR. Then, the companies developing these products have an obligation to ensure the integrity of the output, i.e., to confirm that all of the healthcare data sources are accurate and that combining these multiple data streams produces an accurate record for healthcare.
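One concrete example of the kind of integrity check such an audit could include: reconciling medications shown as "active" in the merged record against the prescriber's source system, exactly the failure I describe above. This is an illustrative sketch with an invented, simplified data model, not any vendor's actual schema or API.

```python
# Illustrative sketch (hypothetical data model): one basic integrity
# check an EHR ETL audit could include -- flagging medications the
# merged chart shows as active that the prescribing source system
# no longer lists as active (stale or phantom entries).

def reconcile_active_meds(source_records, merged_ehr):
    """Return entries active in the merged EHR whose prescription ID
    is not active in the source system of record."""
    source_active = {r["rx_id"] for r in source_records if r["active"]}
    return [m for m in merged_ehr
            if m["active"] and m["rx_id"] not in source_active]

# Example: the surgeon's office discontinued the opioid prescription,
# but the merged chart still carries it as active.
source = [{"rx_id": "RX100", "active": False}]  # discontinued at source
merged = [{"rx_id": "RX100", "active": True},   # still active in merged EHR
          {"rx_id": "RX200", "active": False}]
print(reconcile_active_meds(source, merged))
# -> [{'rx_id': 'RX100', 'active': True}]
```

A check like this is trivial to run once the data flows are documented; the fact that discontinued prescriptions can reappear visit after visit suggests no such reconciliation exists between systems today.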
This situation severely limits the healthcare provider's ability to provide care, and it is unfair and negligent toward the healthcare customer. If nothing is done to fix the problem of inaccurate Electronic Health Records, and AI-powered healthcare processes are allowed to develop unfettered on the basis of this inaccurate information, the quality of US healthcare at the individual level is guaranteed to disintegrate even more quickly.
I am available to discuss this at your convenience. Thank you for your time.
Submitted October 12, 2023 at 08:24PM by hippiedawg