Two new studies caught our attention last week:
- A May 2015 poster from the OARSI (Osteoarthritis Research Society International) annual meeting in Seattle reported that the time between diagnosis of knee osteoarthritis (OA) and total knee replacement (TKR) is, on average, 114 days IF there are no intervening treatments like HA (hyaluronic acid) injections. With a single course of HA treatment, the time to surgery extends by 238%, to 386 days. Two courses of HA treatment stretch the time to TKR to 648 days. And so forth. One interpretation of this data is that HA injections work. Another, equally valid interpretation is that ANY intervention of even minimal clinical efficacy would also extend the time to TKR.
Why aren’t the payers all over this study?
- And then this March 2015 study from the International Journal of Spine Surgery compared surgical to non-surgical treatment for chronic sacroiliac (SI) joint pain. The Level 1 study randomized 148 patients with SI joint dysfunction and very high pain scores to either SI fusion with an implant or to non-surgical treatments (steroid shots and the like). Turns out, says the study, surgery to stabilize the SI joint reduced VAS pain scores from 82.3 to 29.8 at six months (SF-36 and EQ-5D outcomes were also tracked), versus from 82.2 to 70.4 over the same period for the non-surgical approach.
Did the payers notice?
Payers, it is probably fair to say, do not regard clinical studies the same way as physicians.
Accepting the flaws inherent in clinical studies generally, sponsored or not, protocol-driven, Institutional Review Board (IRB) controlled testing of products or procedures on human patients remains the most reliable source of information that physicians, nurses and other healthcare providers use to inform their daily practice and improve quality of care.
Right?
Well. Payers may disagree. In fact, payers seem to think that physicians (and providers in general) don’t understand clinical data as well as they do.
Really.
Payers and Providers: Two Ships Passing in the Night?
We noted with interest a recent article published on the Healthcare Information and Management Systems Society (HIMSS) website, which made the point that payers were more adept at analyzing “clinical data” than hospitals or clinics.
What the author (and this is endemic in all payer discussion of “clinical data”) meant was claims data mining. That, in their lexicon, is “clinical data.” And that data does, indeed, provide a lot of information about outcomes, cost of care, post-operative complications, pain medicine usage and so forth.
In other words, in the payer’s world, “clinical data” refers to data that arises from a clinical interaction with patients. “Clinical data” in the payer’s world is not like clinical data in the surgeon’s world.
Payer clinical data is about analyzing clinical costs and differences between surgeons, hospitals or clinics.
According to a 2011 HIMSS Analytics (Chicago) whitepaper study of payers’ and providers’ use of clinical data (sponsored by the San Diego, California-based Anvita Health), the main purpose of clinical data analytics was to frame clinical claims data within the context of meaningful use.
And here, said Marc Holland, vice president of market research at HIMSS Analytics, payers are more sophisticated in how they use clinical data than providers are.
As quoted on the HIMSS website, “[Payers] have a vested interest in insuring that the care is delivered with a minimum of cost and a maximum of quality. I think providers [physicians and hospitals] are not necessarily ignoring that, but haven’t had the data to do that effectively as the payers have. The payers have a leg up and a lead on the providers, but that gap is closing.”
Correlation vs. Causation
Data mining is about uncovering correlations between what appear to be independent variables. Conducting a clinical study is about uncovering the causes of an outcome by controlling variables.
Payer data mining is correlative. Protocol-driven, IRB controlled studies try to debunk correlations in the service of medical knowledge. Ideally, anyway.
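To make the distinction concrete, here is a minimal, purely hypothetical sketch (simulated numbers, not data from any of the studies above) of how an unmeasured confounder can make mined claims data look like a treatment effect. In the simulation, milder knee OA makes a patient both more likely to receive HA and more likely to delay TKR, so the correlation appears even though HA does nothing here.

```python
# Hypothetical simulation: HA has no causal effect on time to TKR in this model,
# yet the mined "claims" show the HA group waiting longer for surgery, because
# disease severity (unobserved in the claims) drives both HA use and surgical timing.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

severity = rng.uniform(0, 1, n)                    # unobserved confounder: OA severity
got_ha = rng.random(n) < (1 - severity) * 0.8      # milder disease -> more likely to get HA
days_to_tkr = 100 + 600 * (1 - severity) + rng.normal(0, 50, n)   # milder disease -> later TKR

print("Mean days to TKR, HA group:   ", round(days_to_tkr[got_ha].mean()))
print("Mean days to TKR, no-HA group:", round(days_to_tkr[~got_ha].mean()))
# A data miner sees "HA delays surgery." Only a controlled, protocol-driven design
# (randomize who gets HA, or adjust for severity) can separate cause from correlation.
```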
And therein lies the greatest source of frustration with payer reimbursement decisions.
Correlation versus causation.
PearlDiver Technologies, Inc.’s “Zombie Study” illustrates this problem well.
The data guys at PearlDiver have a study which they perform for clients called the “Zombie Study.”
When you see it, it is impressive. Even a little disturbing.
PearlDiver has about 4 billion patient records from the payers—Medicare, Humana, UnitedHealthcare and so forth.
PearlDiver’s data and the software program that drives it are the kind of data mining Marc Holland is referring to. Companies use PearlDiver data to do cost of care studies and market studies right down to the hospital level. Since PearlDiver’s data can go back up to 10 years, it also lends itself to terrific longitudinal studies. Like the Zombie Study.
A couple years ago the data guys at PearlDiver looked up the code for deaths in a hospital. Yes, there is a code for that. They then asked a simple question: did any of these “dead” patients ever return for a new procedure? Like three months after they “died”?
And the answer is, of course, “yes.” Turns out literally hundreds of thousands of patients are declared “dead” per the Medicare rolls yet somehow return for more treatment.
We’re talking hip and knee replacements, dialysis treatments, chronic wound care and so forth.
Hence the name “Zombie Study.”
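The underlying query is conceptually simple. Here is a minimal sketch of the idea in pandas, with hypothetical file and column names (this is not PearlDiver’s actual schema or software): flag any patient whose claims history shows a billed procedure dated well after a recorded in-hospital death.

```python
# Minimal sketch of a "Zombie Study"-style query over a claims extract.
# File name and column names are hypothetical placeholders.
import pandas as pd

claims = pd.read_csv("claims.csv", parse_dates=["service_date"])   # hypothetical extract

deaths = (claims[claims["discharge_status"] == "expired"]           # the "died in hospital" code
          .groupby("patient_id")["service_date"].min()
          .rename("death_date"))

merged = claims.merge(deaths, on="patient_id", how="inner")
zombies = merged[merged["service_date"] > merged["death_date"] + pd.Timedelta(days=90)]

print(zombies["patient_id"].nunique(), "patients billed for care 90+ days after 'dying'")
```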
There are many reasons to explain such an illogical outcome—coding error, fraud and, of course, the preferred explanation—Zombies!
But this illustrates the point that claims data can only describe correlations—not causes. So there is a danger of false positives. Only protocol-driven, IRB controlled, honest-to-God in-clinic studies can debunk correlations.
If, as part of a protocol-driven, IRB controlled study, a “dead” patient returns for a hip replacement, then we can honestly say that Zombies exist.
For more information about the Zombie study and other uses for PearlDiver’s massive data, email Scott at scott@pearldiverinc.com.
The Rooster Crows for Data
The classic illustration of the dangers of relying on correlational data is the one about the rooster crowing. The fact that the rooster crows at dawn does not mean that the rooster causes the sun to rise. The two actions simply correlate.
What if there were a way to connect data mining techniques with protocol-driven clinical studies?
Several large integrated health systems, like Oakland, California-based Kaiser Permanente or Salt Lake City, Utah-based Intermountain Healthcare, are thinking creatively about this very subject. These systems are, in effect, payers who own hospitals.
Integrated systems like these can’t shift patient risk between the payer and the provider since they are both. So they should be in an excellent position to combine clinical and claims data for population health analytics.
But so far it’s more promise than actuality.
Says Keith Figlioli, senior vice president of healthcare informatics for the Charlotte, North Carolina-based Premier Healthcare Alliance on the HIMSS website, “The data the payers have is a mile wide but an inch deep.”
The Fitbit Link
But maybe there is a way to link protocol-driven patient data with data mining techniques.
According to data from the HIMSS Analytics database, only 30% of U.S. hospitals presently use a clinical data warehouse or data mining techniques.
There are several reasons for this:
- Professionally and scientifically, physicians are attuned to protocol-driven clinical studies, not data mining and the correlational data it produces.
- Putting data, which can be anything from clinical notes to manually entered values, into the system is cumbersome and error prone.
- Mapping the data is hard to do.
- Errors or incomplete data.
- Different databases.
- And then there is the job of translating the data into information useful both at the clinic level and at the payer level.
Into this world is coming the Internet of things.
Like Fitbit, the Apple Watch, and the recent Spine Technology Award winner, Smart Strap.
The Internet of things refers to the ability of wearable sensors to collect real time patient data and send that over the Internet to other smart devices or “things.”
Here’s an illustration.
Instead of requiring patients to return at 30 days, 60 days, 90 days or whenever as part of a protocol-driven study, attach a wearable sensor that collects the data constantly, in real time, and sends it over the Internet.
That would pull traditional clinical studies into a data mining future. It would create the opportunity for near universal post-market studies.
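As a rough sketch of what that could look like in practice (the endpoint, field names and reading below are placeholders, not any vendor’s actual API), a wearable or its companion app simply posts each time-stamped reading to the study’s server as it is collected:

```python
# Minimal sketch of continuous, protocol-driven data capture from a wearable.
# The URL and field names are hypothetical placeholders.
import json
import time
import urllib.request

STUDY_ENDPOINT = "https://example.org/api/v1/readings"   # placeholder study server

def post_reading(patient_id: str, sensor: str, value: float) -> None:
    """Send one time-stamped sensor reading to the study database."""
    payload = json.dumps({
        "patient_id": patient_id,
        "sensor": sensor,
        "value": value,
        "timestamp": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(STUDY_ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)   # in practice: authentication, batching, retries

# e.g., a post-op knee patient's activity tracker reporting an hourly step count
post_reading("patient-001", "step_count", 412.0)
```

The interesting part is not the plumbing but what it replaces: the 30-, 60- and 90-day office visit as the study’s only data points.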
Another example is the award-winning Smart Strap from 109 Design. Smart Strap is a wearable strap that replaces the existing straps of a scoliosis back brace. The straps contain sensors and wireless communication capability.
As most parents and physicians of adolescents with scoliosis know, compliance, defined as brace wear time, is a big issue. Researchers who’ve studied this report that increased compliance leads to better outcomes, in other words lower curve progression.
The Smart Strap is like a Fitbit for scoliosis patients. It communicates with a smartphone app and sends data to the app via the Internet. It compares actual brace wear time and strap tightness to the doctor’s prescription.
Parents, patients and doctors can see the data in the app. Doctors, of course, can use the information to fine-tune the treatment prescriptions in real time.
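In code terms, the comparison such an app makes is straightforward. Here is a minimal, hypothetical sketch (the data structures and numbers are illustrative, not 109 Design’s actual software) of checking a day’s recorded wear against the prescription:

```python
# Hypothetical sketch of a brace-compliance check: recorded wear vs. prescription.
from dataclasses import dataclass

@dataclass
class Prescription:
    hours_per_day: float      # prescribed daily brace wear time
    min_tension: float        # prescribed minimum strap tension (arbitrary units)

@dataclass
class DailyLog:
    hours_worn: float         # wear time recorded by the strap sensors
    mean_tension: float       # average strap tension recorded by the sensors

def compliance_report(rx: Prescription, log: DailyLog) -> str:
    wear_pct = 100 * log.hours_worn / rx.hours_per_day
    tension_ok = log.mean_tension >= rx.min_tension
    return (f"Wore brace {log.hours_worn:.1f} of {rx.hours_per_day:.0f} prescribed hours "
            f"({wear_pct:.0f}%); tension {'OK' if tension_ok else 'below prescription'}")

# Illustrative numbers only
print(compliance_report(Prescription(hours_per_day=18, min_tension=0.8),
                        DailyLog(hours_worn=14.5, mean_tension=0.75)))
```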
Wearable sensors now track all kinds of activities, blood pressure, heart rate, body temperature as well as data related to diet and psychology. And inventors are now linking this data stream to such diseases as spine deformities, joint osteoarthritis, heart diseases, non-small cell lung cancer, diabetes, obesity and various mental health disorders.
Data: a New Four Letter Word
The data disconnect between payers and providers is a big deal. And a payer’s overreliance on correlation-based decision models clearly has the potential to hurt the quality of patient care. Data, in this context, is rapidly becoming a pejorative.
The Internet of things phenomenon, however, could transform clinical data collection in ways that would bridge the gaps between payers and providers.
The key will be how quickly protocol-driven studies incorporate these innovations in data collection and allow for both data mining and data sharing.

