Headlines like Kaiser Health News’ late-to-the-party piling on:
Uh-oh. And no less than the New England Journal of Medicine “says” the hotspotter model “failed”. And even Jeff Brenner buys into it: “It’s my life’s work. So, of course, you’re upset and sad,” said Brenner.
Except that NEJM’s report really doesn’t say that. It says: “Patients receiving extra support were just as likely to return to the hospital within 180 days as those not receiving that help.”
It feels like we’re forgetting that the cadre of extra-helped people were, and are, not like those “not receiving that help” to begin with. So they behaved like the people they were: people identified as needing extra help, not normal patients. But let’s go deeper on the study & its findings. A couple of points:
1) “Half received the usual care patients get when leaving the hospital. The other half got about 90 days of intensive social and medical assistance from the coalition.
- And the result: The 400 patients who received the intensive help were just as likely to return to the hospital as the patients who didn’t. In both groups, nearly two-thirds of people were readmitted within 180 days.”
- 2/3 of 800 ≈ 533. Only 267 of this highly stratified group did NOT return for hospital care. To borrow a frequently misused phrase, “these are sick people”.
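The arithmetic above can be sanity-checked with a quick sketch. The 800-patient total and the “nearly two-thirds” readmission rate come from the quoted summary; treating that rate as exactly 2/3 is my simplifying assumption, since the study reports rates, not raw counts:

```python
# Rough sanity check of the readmission numbers quoted above.
# Assumes 800 total patients across both arms and a "nearly
# two-thirds" readmission rate, per the quoted summary.
total_patients = 800
readmission_rate = 2 / 3  # assumed exact; the study says "nearly two-thirds"

readmitted = round(total_patients * readmission_rate)
not_readmitted = total_patients - readmitted

print(readmitted, not_readmitted)  # prints "533 267"
```

Under that assumption, roughly 533 of the 800 were readmitted within 180 days, leaving only about 267 who stayed out of the hospital.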
So, what was the intensity of care the recidivists needed? Was that about the same across groups? We don’t know; apparently that was beyond the scope of the study.
2) Maybe more importantly, should we really expect the behavior-change aspects of the hotspotters’ mostly-not-clinical-care interventions to move the needle on care within just 6 months’ time, on people whose multiple behavioral & health issues were, by every observer’s account, complicated to untangle and rewire?
What we DO know is that at least part of the reason for the NEJM’s finding that the project “failed” had to do with the project’s capacity to, y’know, actually DO what they KNEW was necessary for success. Quoting again:
“Coalition staff and their patients usually knew what was needed — evidence-based addiction treatment, housing, mental health services — but resources were often in short supply.”
“Resources were in short supply”? Translation: instances in which study participants did not receive these (often, but not exclusively, “non-health-treatment”) services apparently were discounted or not considered in evaluating how readmissions should be scored (though why foregone or never-initiated addiction treatment or “mental health services” would be discounted in the scoring completely escapes me).
The NEJM study contains observations like this:
Engagement with the program was high (95% of patients had at least three encounters with program staff), and patients received an intensive intervention (averaging 7.6 home visits), but two program goals related to the timing of services — a home visit within 5 days after hospital discharge and a visit to a provider’s office within 7 days after discharge — were achieved less than 30% of the time. Challenges in reaching these goals included patients’ lack of stable housing or a telephone and their behavioral health complexities and providers’ few available appointments. The difficulties that this pioneering, data-driven organization had in achieving rapid assistance for patients may portend difficulties in achieving it at scale.
So, the evaluators of a program designed to fortify the most fragile of the fragile (people with complex health conditions who also lacked basic social supports like a telephone) nonetheless blithely discount the lack of even the most basic supports in determining whether the study group was “really” equivalent to the control group.