Fortnightly Healthtech Update #2

Does the accountable care organization model work? Yes!  The ACO model is leading healthcare providers to focus on cost savings through preventative health. Lower healthcare costs, fewer people getting sick, I love that.  Oh, it definitely works – at least it does for Blue Cross Blue Shield of Massachusetts. Or, not so much…. “Why would I ever take away the volume from those two facilities to a lower-cost setting?” The lack of aligned incentives means that some healthcare organizations are going to (understandably) drag their collective feet in the transition to value-based care. Are they playing a smart game, or are they going to be on the losing end ultimately…? Only time will tell. But, I’ll tell you one thing for sure. Americans simply cannot keep paying more and more for healthcare. Basic economics will dictate that; the money’s just not there.  Something will change.

A patient who died at Ben Taub was unmonitored and found passed out in a bathroom. If only there were a way to continuously monitor patients’ vitals in the hospital and know where they are. Oh wait, there mostly is…the Philips Wearable Biosensor (full disclosure: I used to be the product manager) gets some recognition for the ongoing pilot at Singapore General Hospital.

While we’re riffing on biosensors, there’s Aseptika diverging from COPD with the, ahh, curiously named BuddyWOTCH, promising continuous blood oxygen, heart rate, respiratory rate and temperature for 7 days.

Not a wearable, but competing hard with them every step of the way, Early Sense notches up another win in post-acute rehab.

Skipping back to payment reform, CMS is proposing a new bundled payment model for radiation oncology, with site neutrality being a key element.  Opponents argue this could interfere with established care models – and here’s me thinking, “But, that’s the whole point….”  Obviously not to jeopardize patient care, but to at least figure out how to provide the same quality care in a lower cost setting.  See note above, basic economics etc, something has to change…

Pampers and Verily introduce the diaper that sends an alert to your phone when a change is required.  Personally, I always found the loud and incessant auditory stimulation from my little ones provided all the alerting necessary.  But, I guess for the ear bud generation, maybe an app is the way to go.  Especially perhaps for a post ear bud generation with hearing loss.

Meanwhile, North of the border, paramedics and remote monitoring are being used to reduce unnecessary ER visits.

I think emerging nations can often signpost the way to lower cost healthcare solutions. Here’s a test for anaemia, jaundice, and oxygen saturation from India that costs less than one US cent.

Potentially great news for diabetics, with coaching linked to real-time blood glucose measurements. Surely so much more valuable to have actionable guidance triggered by real-time feedback to drive long-term behavior change.

Data is everything.  All other things being equal, the people with the most data – and a decent strategy to leverage it – will win in the long term.  That’s why Amazon’s PillPack wants it, and SureScripts would rather not share it.

And while on the subject of Amazon, Amazon and Cerner were apparently able to detect the onset of heart failure 15 months out.  Couple of immediate questions for me.  First, I’m not a clinician, but is 15 months’ notice really that significant for heart failure?  Isn’t the condition often brought on by decades of lifestyle choices?  If so, can an individual do anything meaningful in 15 months to avert the inevitable?  Assuming they can, this is great news – as long as you don’t live in the United States.  In countries that truly see the economic value of preventative healthcare, this could be great.  In the US, we’re slowly changing reimbursement models (read ACO, or single payer…) so we can actually get to the point where providers are paid for preventing disease instead of merely treating it.  Until then, it’s really just a bit of a machine learning novelty.

Ohhh, I like this – 3D printed heart valves.  Faster, better, cheaper – what’s not to love.

Curavi Health is eyeing up adjacent markets, methinks.  Originally focused on bringing telehealth to SNFs, it’s now also piloted a solution for use in the home.  Makes total sense in the context of an article in my previous update, where ACOs are bypassing SNFs and sending patients straight home when they can.  And remote patient monitoring has a big fan in New Jersey, with Valley Health System driving readmission rates down to 2%.

Can’t quite decide which is the more impressive part of this announcement, but I think there is something in here somewhere….PhysIQ has ambulatory respiration rate.  But maybe more importantly, it has hardware-independent respiration rate.  Hmmm.

iRhythm, focused on cardiac arrhythmia detection, reported Q2 revenue 50% higher than last year.  But, the most significant thing in the announcement is CEO Kevin King saying “…our CPT code change application was accepted by the AMA for review…”.  That’s a bigger story than I have time for now, but I’ll try and squeeze out another post soon that gets into the nitty gritty about why that’s so important.

Very intrigued by this really stretchy wearable coming out of a collaboration among several US and Korean universities.  The full scientific paper is here for the geeks among you…

Fortnightly Healthtech Update #1

Caretaker Medical is the only (deep breath…) FDA-approved, Bluetooth-enabled, non-invasive blood pressure monitor for hospital use that I’m aware of.  Now adding end-tidal CO2 via a partner, this should help to bring broader appeal.  Note the use of both Wi-Fi and cellular connectivity, which opens up the possibility for monitoring both in the hospital and in the home.  Since Medicare added reimbursement codes for remote monitoring at the start of the year, many more device manufacturers have started to focus on the home.  While accountable care and bundled payments could ultimately drive that change, there’s nothing quite like a good ol’ fashioned reimbursement to get the market moving…

On that note, CMS reports almost 100% growth in the number of physicians taking part in alternative payment models.  And here’s another change driven by the shift to value-based care: bundled payments often cut costs by discharging patients home rather than to skilled nursing.  That’s perfectly OK – as long as patients are monitored and rehabbed in their homes.

Intermountain Health has been pushing down the value-based care road for some time.  It’s become so adept that it’s spun off Castell as a platform to help other providers down that path.

While financial incentives need to align for change to happen, communications technology also has to be up to snuff for monitoring in the home to become truly widespread.  So, good to see that the UK is pushing ahead with 5G pilots to fully evaluate the potential for rural communities.  Another (vendor’s) viewpoint on 5G here.

Talking of financial incentives, the FCC is lining up $100m for 3 years of pilots in telehealth and remote patient monitoring.

Also in the UK, using Amazon’s Alexa to dispense medical advice.  The aim is to take the pressure off overworked primary care docs (general practitioners).  Much easier to try that somewhere with a single payer from cradle to grave – the financial incentives simply align.  Much harder in the fragmented world of US healthcare, but I can see accountable care organizations going down this path too.  Every dollar saved is a dollar in the ACO’s pocket….

The FDA gets more interested in monitoring medication adherence.  Measuring adherence is one thing; improving it is something else entirely, though…

A new study looks at the use of wearables and machine learning for helping people stay on the path to overcome opioid addiction.  Since we know opioid use depresses respiration rate, it might be good to have a sensor to monitor that too. On a similar thread, Jefferson Health is using analytics to spot opportunities to rein in the prescription of opiates, and that’s no bad thing.

Not the first company to try measuring heart rate and respiratory rate using a camera, but Brainworks is new to me.  Philips has been down this path, while Smart Beat has a direct-to-consumer device for babies.  And you can find an app for heart rate in your phone’s app store already.  But, in a clinical setting, respiration rate is often a strong early indicator of impending doom.  For me, the biggest potential application for this type of approach is to help keep people with chronic conditions healthy in their homes.

If you build it, they won’t come…

Especially in healthcare.  Although truthfully, if-you-build-it-they-will-come has rarely worked well for any company, Apple being one of a handful of exceptions.

But, according to John Brownstein from Boston Children’s and Adam Landman from Brigham and Women’s, it definitely doesn’t work in healthcare.  These two fine gentlemen set the record straight on that in a recent Rock Health podcast.

John and Adam are each responsible for innovation centers at their respective hospitals, so they’ve seen their fair share of pitches.  So, what really counts…?  Well, anyone who’s been around healthcare for very long will have heard of the quadruple aim.  Building on the IHI’s triple aim, the quadruple aim looks to:

  1. Improve population health
  2. Increase patient satisfaction
  3. Reduce per-capita healthcare spend
  4. Improve caregiver satisfaction

And the quadruple aim is a must-have for start-ups looking to impress both men.  As John states, “We want to understand how that product meets the IHI’s triple aim, and I would extend that to the quadruple aim.”

So what does that mean in practice for would-be healthcare entrepreneurs?

  • Your product should address a sizable population, not just a handful of patients.
  • It’s got to fit seamlessly into a patient’s lifestyle.
  • It has to be more cost effective than existing approaches.
  • Finally, at the very least it can’t increase the workload on clinicians.  And trust me, in my experience, if you can reduce the burden on caregivers, you’re golden.

Healthcare might be getting better – depends how you measure it…

I used Tableau Public to compare the performance of acute care hospitals using the JCAHO core measures.  The Centers for Medicare and Medicaid Services publishes performance data periodically.  Essentially, the core measures show how frequently hospitals perform certain basic care functions – such as heart attack patients being given aspirin on arrival at the hospital, or surgery patients being given the right type of antibiotics at the right time.

My visualization shows 2012 performance compared to 2010 (you can switch between the two data sets using the tabs at the top left).  The size of each box shows the number of procedures; the bluer the box, the better providers are at complying with the guidelines.  The data is there to go down to the individual provider level, but I have not taken it that far yet.  So, on the face of it, hospitals are improving care quality because the 2012 treemap is bluer than the 2010 treemap.
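For anyone who’d rather script this than use Tableau, here’s a minimal sketch of the same idea in Python.  It assumes a hypothetical, pre-cleaned CSV with illustrative column names – the real Hospital Compare files need a fair bit of wrangling first.

```python
# Minimal sketch of the treemap: box size = number of procedures,
# box colour = guideline compliance (bluer = better).
# Assumes a hypothetical, pre-cleaned CSV; column names are illustrative only.
import pandas as pd
import plotly.express as px

df = pd.read_csv("core_measures_2012.csv")

fig = px.treemap(
    df,
    path=["measure"],                # one box per core measure
    values="procedure_count",        # box size = number of procedures
    color="compliance_pct",          # box colour = % compliance with the guideline
    color_continuous_scale="Blues",  # bluer = better
)
fig.show()
```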

That’s great – but not the whole story.  Arguably, the biggest challenge for care quality right now is managing the transitions between care providers and settings.  For example, the IHI estimates that 50% of medication errors occur because of poor communication during transitions.  Technology can help with that – but for now, we’ll take the improvement in core measures….

Machine learning for IIoT: 4 tips to get you started

The Industrial Internet of Things (IIoT) will bring large volumes of fast-moving data.  This brings both challenges and opportunities.  At the risk of stating the obvious, one challenge is making sense of large complex data sets.  Machine learning approaches can help here, so I’ve got four tips for getting you started with machine learning:

1.  Forget some of what you know about analytics

If you plan to deal with IIoT data, you may need to refresh your thinking about analytics.  Historically, analytics have been a relatively simple and sedate affair.  For example, analysis was often performed on historical data at some point in time after it was generated.  In addition – for better or worse – analytics often mirrored the siloed nature of data.  That is, the integration of data was minimal.  Industrial IoT will bring more data, faster, from a greater variety of sources.  Managing this data complexity to be able to respond to events in a timely way will require a much more automated and frictionless approach to the analytics value chain.  Machine learning is one way to achieve that.  It can be especially powerful with complex data, where patterns are not obvious and it’s difficult – nay, impossible – for humans to formulate and code rules.  Unfortunately, the lack of transparent logic in machine learning can be an obstacle that some people must overcome.  Many engineers just aren’t comfortable with black-box solutions.  Tough, get over it.
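To make that concrete, here’s a minimal sketch – on simulated sensor data, not a real historian feed – of the kind of pattern detection that’s awkward to hand-code as rules but straightforward for a learning algorithm.

```python
# Minimal sketch: learn what "normal" multivariate sensor behaviour looks like,
# then flag readings that don't fit -- without anyone writing explicit rules.
# Simulated data only; a real application would use historian / IIoT readings.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: temperature, vibration, pressure (jointly "normal" operation).
normal = rng.normal(loc=[50.0, 1.2, 300.0], scale=[2.0, 0.05, 10.0], size=(5000, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_batch = rng.normal(loc=[50.0, 1.2, 300.0], scale=[2.0, 0.05, 10.0], size=(10, 3))
new_batch[0] = [55.0, 1.5, 260.0]   # an off-pattern combination of readings
print(model.predict(new_batch))     # -1 = anomalous, 1 = normal
```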

2.  Explore machine learning as a technology

The cloud changes everything.  In this particular case, it demolishes the barriers to entry for machine learning.  A new generation of machine learning tools (from BigML, Microsoft, Amazon, and IBM, for example) are cloud-based products.  Most offer a free trial, some for an indefinite time period.  They also offer a much more guided, tutorial-style development experience than the previous generations of software.  So what’s the cost to learn more and experiment with it…?  It’s your time.  At this point, extensive investigation of machine learning tools prior to selection isn’t strictly necessary.

Here’s how the evaluation process can work:

  • Pick a cloud-based machine learning tool; any one, it doesn’t really matter.
  • Spend a day or two playing with it.
  • If you like it, play some more.
  • If you don’t like it, pick another tool and start over using the experience you’ve already gained.

3.  Don’t be fooled – successful machine learning isn’t all data science

True enough, at a technical level, machine learning can appear enigmatic.  Seemingly without rules or logic, it can be daunting to try and understand the details.  But, that’s what IT professionals, analysts, and data scientists are for.  Like all successful IT projects, successful machine learning projects do not start and end with IT.  Business and domain expertise are crucial to success.  Consider the application of machine learning to maintenance.  Domain expertise is necessary to identify potential source data to feed the algorithms.  Further, domain expertise is required to interpret and provide context to the output of machine learning.  Like all successful IT projects, machine learning applications require a collaborative cross-functional team.

4.  Consider prescriptive maintenance applications

Many enterprises will be breaking new ground with IIoT applications.  It’s critical that the first wave of IIoT applications deliver a tangible and measurable return on investment.  Re-inventing the approach to asset maintenance provides a clear path to measurable benefits.  Research by ARC’s Ralph Rio shows that the most common approach to maintenance is still simple preventative maintenance.  And yet, as the same ARC research also shows, that is not the optimal approach for the majority of assets.  Maintenance applications that incorporate machine learning are a promising approach for capitalizing on Industrial IoT data.  The potential return on investment (ROI) in predictive maintenance is real, tangible, and relatively immediate – all good things you need in a beachhead project.
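And the modelling itself doesn’t have to be exotic.  Here’s a minimal sketch, assuming a hypothetical table of historical sensor snapshots labelled with whether the asset failed within the following 30 days – all file and column names are illustrative placeholders, not a reference implementation.

```python
# Minimal predictive-maintenance sketch: estimate the probability that an asset
# fails within 30 days from historical sensor snapshots.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("asset_history.csv")
features = ["vibration_rms", "bearing_temp", "motor_current", "run_hours"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["failed_within_30d"], test_size=0.2, random_state=0
)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]   # probability of failure within 30 days
print("AUC:", roc_auc_score(y_test, risk))

# Assets above a chosen risk threshold get pulled onto the maintenance schedule,
# rather than being serviced on a fixed preventative interval.
```

Per tip 3, the hard part is the domain knowledge behind those input features, not the model itself.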

So those are my four tips – consider it my Christmas gift to you.  And no, you can’t take them back to the store for a refund if you don’t like them…

Shameless plug alert:  This and more in my exceedingly good value research report on machine learning for Industrial IoT.

(Originally published on industrial-iot.com, a blog by ARC Advisory Group analysts)

Industrial IoT: Evolution or Revolution…?

My colleague Peter Reynolds was a bit of a hooligan last week.  His blog post Internet of Things: Have we been doing this for 25 years? certainly created a bit of a stir in our little corner of the internet.

Ken Hart said in his comment on the post, “it’s evolution, not revolution”.  Honestly Ken, I think the real answer is “it depends” – and that’s not just me hedging my bets.  It depends on your perspective:

  • From an engineering perspective, sure it’s evolutionary.  ARC analysts will gladly tell anyone who’s careless enough to wander into our workshops that all the technology needed to implement an IIoT application already exists.  We’re not dependent on any blinding breakthrough to make it real.  So engineers, fill yer boots as Peter might say.
  • From a business perspective, it can – and should – be revolutionary.  In fact, I think it really only depends on how big your vision is.  As I noted in my comment on Peter’s post, the more “islands of information” we connect together, the greater the insight into the value chain.  And the greater the insight into the (extended) value chain, the greater the potential for revolution.

So, if you’re elbow-deep in bits and bytes, it’s evolutionary.  On the other hand, the higher up the management stack you get, the more revolutionary IIoT is.  Because Industrial IoT is about more than plant-level integration.

And, if you think about it, that’s the way most things are.  Take the World Wide Web.  A technical evolution that Sir Tim Berners-Lee worked on as a side project.  And yet, it utterly – utterly – changed the way we share information.  Your kids can’t function without it.  I can barely remember how we used to get work done without it.  Oh, that’s right, we weren’t all screwing around on Facebook…

(Originally published on industrial-iot.com, a blog by ARC Advisory Group analysts)

Let’s play Clue: Who really killed EMC?

I used to love the board game Clue as a kid (or Cluedo as it’s called back home).  Often when you won, you knew with 100% certainty the who, what, and where for the murder before you made your bold pronouncement.  But sometimes, if you thought someone else was close to solving the murder, you had to take an early best guess with a little less certainty.  And that’s a bit like where I am with EMC.  Do I know for sure who killed EMC..?  No.  But I’m willing to go out on a bit of a limb – I think I can guess who killed EMC, where, and with what weapon.

Since the acquisition of EMC by Dell was announced, there’s been a bit of a kerfuffle in the Bay State.  There’s much hand-wringing that another Boston tech giant is, well, no longer a Boston tech giant.  (EMC is relocating its HQ to Texas.)  People have long memories, and the ghost of DEC is apparently still haunting my neighbors as we approach Halloween.  Truthfully, I’m a bit shocked that Dell is being cast in a bad light – a bit of a party crasher, a vulture, a bit of an Ebenezer Scrooge.  So let me put that straight – who really killed EMC?

It was Amazon, in the cloud, with a commodity disk drive. Here’s how:

  • The amount of data is growing by about 40% a year – or doubling every two years.  In an ironic twist, I’ll cite numbers from IDC in research bought and paid for by EMC.  To counter this somewhat, the cost per byte of raw disk storage seems to be halving roughly every three years at the moment.  Bottom line, money is still being spent on storage.
  • The storage hardware segment of EMC’s business (Information Storage) has struggled for growth.  From EMC’s public financials, from 2012 to 2013, revenues grew 4%.  But, from 2013 to 2014, the growth rate for this business slowed to only 2%.  And if this data from IDC is accurate (and I have no reason to think that it’s not), EMC lost market share and saw revenues decline early this year – particularly in the lucrative storage systems business.
  • Amazon is building out a colossal computing infrastructure using commodity hardware.  James Hamilton notes this in his excellent presentation from re:Invent 2014:  Amazon saw 132% year-over-year growth in data transferred in its S3 storage solution, and has over one million customers active on AWS.  Every day Amazon adds enough capacity to AWS to support a $7bn ecommerce operation – effectively all of Amazon’s business back in 2004 when it was a $7bn company.  How much capacity is that?  I’m not sure to be honest, but if Amazon’s average sale in 2004 was $30, that’s over 233m sales transactions that need to be recorded, processed and supported (quick arithmetic below).  Sounds like a lot of storage to me… And I very much doubt Amazon uses EMC’s premium products for that.  As James notes, Amazon typically designs its own servers and storage racks.
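If you want to check my back-of-envelope numbers – and remember, the $30 average sale is purely my guess – the arithmetic looks like this:

```python
# Back-of-envelope arithmetic for the bullets above.
annual_growth = 0.40
print((1 + annual_growth) ** 2)   # ~1.96 -> 40% a year is roughly a doubling every two years

daily_capacity_added_usd = 7_000_000_000   # "a $7bn ecommerce operation" added per day
assumed_avg_sale_2004_usd = 30             # my guess, as noted above
print(daily_capacity_added_usd / assumed_avg_sale_2004_usd)   # ~233 million transactions
```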

So, I rest my case.  What used to be stored on EMC systems in corporate data centers is now being stored on cheap disks in Amazon’s cloud.  Amazon did it, Amazon killed EMC.

(Originally published on industrial-iot.com, a blog by ARC Advisory Group analysts)

Re-inventing Healthcare: We need the college scorecard for healthcare

I have kids rapidly approaching college age.  It’s a scary time – not least because I was fortunate enough to get my undergraduate degree in the UK at a time when the government paid for it!  Oh happy days…

In the US – and even the UK now – people pay for college out of their own pocket.  But, that doesn’t always mean you get what you pay for.  As I’ve researched colleges with my eldest, it’s been very hard to make a meaningful like-for-like comparison – even using so-called college comparison websites.  For example, common measures like the six-year graduation rate are close to worthless.  So I was excited to see the federal government step in and reveal its own comparison site recently.  I’m sure it will attract criticism, especially from those who are heavily invested in the status quo.

But, now we need the same for healthcare.  Without transparency into healthcare there will be no change, and without change, the US healthcare system is unsustainable.  And that should scare healthcare providers as much as citizens.  Here’s a scenario – imagine I need a total knee replacement.  (I don’t, but those knees have seen a lot of soccer…)  Here’s the problem:

  • How do I choose a knee specialist to perform the surgery?  Where’s the public data – yes, actual data – to help me as a consumer sort the best, from the good, from the mediocre?  It doesn’t exist.
  • Where is the public data to help me compare costs – the cost of the surgeon, and the cost of the hospital or facility for a start?  It doesn’t exist.

Caleb Stowell, MD and Christina Akerman, MD are of course right when they say that better value will come from improving outcomes.  But, as a consumer, I need visibility into both outcomes and costs to make wise decisions about my healthcare.  Sadly, the government’s Hospital Compare website doesn’t even come close to providing what we need.  Without such visibility, there is no real consumer choice and no competition among providers.  Without competition, healthcare costs will continue to spiral out of control.  That’s bad for us, but it’s worse for our children.

Two reasons machine learning is warming up for industrial companies

Machine learning isn’t new.  Expert systems were a strong research topic in the 1970s and 1980s and often embodied machine learning approaches.  Machine learning is a subset of predictive analytics – a subset that is highly automated, embedded, and self-modifying.  Currently, enthusiasm for machine learning is seeing a strong resurgence, with two factors driving that renewed interest:

Plentiful data.  It’s a popular adage with machine learning experts:  In the long run, a weaker algorithm with lots of training data will outperform a stronger algorithm with less training data.  That’s because machine learning algorithms naturally adapt to produce better results based on the data they are fed, and the feedback they receive.  And clearly, industry is entering an era of plentiful data.  Data generated by the Industrial Internet of Things (IIoT) will ensure that.  On the personal / consumer side of things, however, that era has already arrived.  For example, in 2012 Google trained a machine learning algorithm to recognize cats by feeding it ten million images of cats.  Today it’s relatively easy to find vast numbers of images, but in the 1980s, who had access to such an image library…?  Beyond perhaps a few shady government organizations, nobody.  Eighteen months ago, Facebook reported that users were uploading 350 million images every day.  (Yes, you read that correctly, over a third of a billion images every day.)  Consequently, the ability to find enough relevant training data for many applications is no longer a concern.  In fact, the concern may rapidly switch to how you find the right, or best, training data – but that’s another story…

Lower Barriers to Entry.  The landscape of commercial software and solutions has been changed permanently by two major factors in the last decade or so: open source and the cloud.  Red Hat – twenty-two years old and counting – was the first company to provide enterprise software using an open source business model.  Other companies have followed Red Hat’s lead, although none have been as commercially successful.  Typically, the enterprise commercial open source business model revolves around a no-fee version of a core software product – the Linux operating system in the case of Red Hat.  This is fully functional software, not a time-limited trial.  However, although the core product is free, revenue is generated from a number of optional services and potential product enhancements.  The key point of the open source model is this:  It makes evaluation and experimentation so much easier.  Literally anyone with an internet connection can download the product and start to use it.  This makes it easy to evaluate, distribute, and propagate the software throughout the organization as desired.

Use of the cloud also significantly lowers the barriers to entry for anyone looking to explore machine learning.  In a similar way to the open source model, cloud-based solutions are very easy for potential customers to explore.  Typically, this just involves registering for a free account on the provider’s website, and then starting to develop and evaluate applications.  Usually, online training and educational materials are provided too.  The exact amount of “free” resources available varies by vendor.  Some may limit free evaluation to a certain period, such as thirty days.  Others may limit the number of machine learning models built, or how many times they can be executed, for free.  At the extreme, though, some providers offer a limited form of machine learning capacity, free of charge, forever.

Like open source solutions, cloud-based solutions also make it easier – and less risky – for organizations to get started with machine learning applications.  Just show up at the vendor’s website, register, and get started.  Compare both the cloud and open source to the traditionally licensed, on-premise software product.  In this case, a purchase needs to be made, a license obtained, software downloaded and installed – a process that could, in many corporations, take weeks.  A process that may need to be repeated every time the machine learning application is deployed in a production environment…

My upcoming strategy report on machine learning will review a number of the horizontal machine learning tools and platforms available.  If you can’t wait for that, simply type “machine learning” into your search engine of choice and you’re just five minutes away from getting started.

(Originally published on industrial-iot.com, a blog by ARC Advisory Group analysts)

Re-inventing Healthcare: Cutting Re-admission rates with predictive analytics

Managing unplanned re-admissions is a persistent problem for healthcare providers.  Analysis of Medicare claims from over a decade ago showed that over 19% of beneficiaries were re-admitted within 30 days.  Attention on this measure increased when the Affordable Care Act introduced penalties for excessive re-admits.  However, many hospitals – including those in South Florida and Texas – are losing millions in revenue because of their inability to meet performance targets.

Carolinas HealthCare System has applied predictive analytics to the problem, using Predixion Software and Premier Inc.  Essentially, by using patient and population data, Carolinas is able to calculate a more timely, more accurate assessment of the re-admit risk.  The hospital can then put in place a post-acute care plan to minimize the risk of re-admission.  You can find a brief ten-minute webinar presented by the hospital here.  But, from an analytics, information management, and decision-making perspective, here are the key points:

  • The risk assessment for readmission is now done before the patient examination, not after it. Making that assessment early means there is more time to plan for the most appropriate care after discharge.
  • The risk assessment is now more precise, accurate, and consistent.  In the past, the hospital just categorized patients into two buckets – high risk and low risk.  There are now four bands of risk, so the care team can make a more nuanced assessment and plan accordingly.  Further, the use of Predixion’s predictive analytics software means that far more variables can be considered in the determination of risk.  We puny humans can only realistically work with a few variables when making a decision.  Predictive analytics allowed more than 40 data points from the EMR, ED, etc. to be used to make a more accurate assessment of risk (there’s a rough sketch of the idea after this list).  Finally, calculating the risk using software meant that Carolinas could avoid the variability introduced by case managers with different experience and skills.
  • The risk assessment is constantly updated.  In practice, the re-admission risk for any individual patient is going to change throughout the care process in the hospital.  So, a patient’s re-admission risk is now recalculated and updated hourly – not just once at the time of admission, as was the case in the past.
  • The overall accuracy of risk assessment gets better over time.  A software-centered approach means that suggested intervention plans can be built in – again reducing variability in the quality of care.  But, the data-centric approach means that the efficacy of treatment plans can also be easily measured and adjusted over the long term.
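Here’s a rough sketch of what that banded, data-driven risk score might look like in code.  To be clear, this is not Predixion’s product or Carolinas’ actual model – just an illustration, with hypothetical file and column names, of scoring an encounter on many variables and mapping the probability into four bands.

```python
# Minimal sketch of a banded readmission-risk score (illustrative only --
# not the Predixion / Carolinas implementation). File and column names are
# hypothetical; assume the ~40 EMR/ED features are already numeric.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("encounters.csv")
y = df["readmitted_within_30d"]
X = df.drop(columns=["readmitted_within_30d"])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Map probabilities into four bands so the care team plans against a nuanced
# category instead of a simple high/low flag. Band cut-offs are illustrative.
risk = pd.Series(model.predict_proba(X_test)[:, 1], index=X_test.index)
bands = pd.cut(risk, bins=[0.0, 0.10, 0.25, 0.50, 1.0],
               labels=["low", "moderate", "high", "very high"])
print(bands.value_counts())

# In production the score would be recalculated hourly as new vitals, labs and
# orders arrive -- not just once at admission.
```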

Overall, this data-driven approach to care is a win-win.  It results in higher care quality and better outcomes for the patient.  And Carolinas HealthCare System improves its financial performance too.  This is all possible because more of the risk assessment is now based on hard data, not intuition.