
The Kabuki Dance of ‘Blocking Data’ in Healthcare

April 13, 2015 By Dan Munro


The lack of data interoperability in healthcare continues to plague the entire industry. Much of the challenge falls squarely in the realm of Electronic Health Record (EHR) software, but EHR software is by no means the only category where this challenge directly affects patient lives. I ‒ along with countless others ‒ have written extensively about this topic. Last year I published a five‒part series on interoperability at Forbes. One piece highlighted the direct effect of that missing interoperability from the patient’s perspective: The View of Digital Health From an ‘Engaged Patient’

I wish I could say things have improved since ‒ they really haven’t ‒ nor are they likely to improve soon. We remain stuck among competing commercial interests, political agendas and very real technical challenges in securely sharing even the most basic health information. While the debate and agendas continue, this year alone has seen the health data of 91 million Americans breached through just two high‒profile cases ‒ Anthem and Premera ‒ and that was just the first quarter. Whatever the privacy argument against sharing data, it’s all but irrelevant when more than 90 million Americans have had their health data breached. That’s effectively 28% of the U.S. population.

The result of all this is sadly theatrical ‒ a kind of Kabuki dance ‒ and reflects the broader inability of many to grasp the true meaning of “patient‒centered” thinking. It’s excruciatingly painful to watch ‒ let alone live through as patients.

The topic of ‘interoperability’ kept percolating throughout last year and finally reached the hallowed halls of Congress. Outraged that the Government had spent almost $30 billion on “infrastructure” software that locked data in proprietary silos, Congress demanded some accountability from the Office of the National Coordinator (ONC). ONC delivered that report to Congress on Friday under an odd, uniquely Government title: Report on Health Information Blocking

Pardon my French, but what the hell is ‘information blocking?’ As a software engineer, I’ve been around software for most of my professional career, and I had never heard that phrase before ‒ let alone applied to software interoperability. Sadder still, it highlights a real naiveté by ONC and Congress about the actual business of software manufacturing. And if it’s not naiveté, it’s certainly very theatrical.

The report itself is about 39 pages (including 6 pages of appendices). It’s a painful attempt at laying blame for the lack of interoperability on the big, bad Independent Software Vendors (ISVs). Even the first hurdle ‒ defining “information blocking” ‒ is an enormous challenge, because it attempts to fabricate a definition from thin air. The report acknowledges the scale and scope of that difficulty outright.

The term ‘information blocking’ presents significant definitional challenges. Many actions that prevent information from being exchanged may be inadvertent, resulting primarily from economic, technological, and practical challenges that have long prevented widespread and effective information sharing. Further, even conscious decisions that prevent information exchange may be motivated by and advance important interests, such as protecting patient safety, that further the potential to improve health and health care. Finally, it is important to acknowledge that certain constraints on the exchange of electronic health information are appropriate and necessary to comply with state and federal privacy laws; this is not considered information blocking.

Undaunted by these challenges, ONC went ahead and fabricated a definition anyway.

Information blocking occurs when persons or entities knowingly and unreasonably interfere with the exchange or use of electronic health information.

With this broadest of definitions, I asked two healthcare legal minds I know for clarification on that gaping hole of a legal phrase called ‘unreasonable.’

Can U define “unreasonable” in @ONC_HealthIT “info blocking” report 2 Congress? http://t.co/ij20SPcjnX cc: @VinceKuraitis @healthblawg — Dan Munro (@danmunro) April 10, 2015

I fully expected the reply.

.@danmunro Agree, ONC language re: “knowingly & unreasonably” is unnecessarily high burden of proof. @ONC_HealthIT @healthblawg — Vince Kuraitis (@VinceKuraitis) April 10, 2015

It’s not just unnecessarily high. Given all the challenges ONC itself lists, it’s virtually impossible to prove ‘knowingly and unreasonably.’ Assuming you could even level the charge ‒ who would adjudicate the case ‒ and the resulting appeal ‒ at what cost ‒ and to whom? ONC knows this ‒ and acknowledged as much, indirectly, later in the report.

ONC-ACB surveillance activities and other feedback from the field show that although certified health IT is often conformant with the criteria to which it was certified, there is still a substantial amount of permissible variability in the underlying required standards, unique clinical workflow implementations, and numerous types of interfaces to connect multiple systems. This variability has contributed to information sharing challenges and also creates opportunities for developers or health IT implementers to erect unnecessary technical barriers to interoperability and electronic health information exchange.

Commercial software specific to the healthcare industry has taken years to mature, and there are distinctly separate categories of EHR software ‒ two of the largest being Ambulatory and Inpatient. One could argue that the first and most successful vendor in the EHR category is Epic Systems ‒ founded by Judy Faulkner in 1979 and still privately held. Here’s a relatively recent breakdown of market share ‒ by vendor ‒ for both categories.

With a firm hold on 20% of both markets, Epic is the Apple (or Microsoft) of EHR software ‒ except that it remains privately held. Whether a software manufacturer is privately held or traded on the public market makes little difference to the way the business operates. The reality is that most of these EHR systems started (and largely remain) as enterprise billing engines ‒ which now include a fair amount of clinical data ‒ stored in proprietary formats. Anyone ‒ including ONC ‒ who argues that Independent Software Vendors (ISVs) are obligated to build software connections (or APIs) into competing software solutions ‒ for free ‒ clearly doesn’t understand the way revenue is captured and reported to the IRS. They also don’t understand that ISVs are beholden to shareholders as much as (if not more than) to their customers ‒ and their customers are not patients. Their alignment with shareholders and customers is purely economic. We’ve supported this model to the tune of over 400 ISVs that now offer “certified” EHR software ‒ almost entirely in proprietary formats based on each vendor’s commercial interests.
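To make the “proprietary formats” problem concrete, here is a minimal Python sketch. Every vendor label, field name and value below is invented purely for illustration; the point is that two certified systems can each export the same patient correctly, by their own rules, and still need a custom adapter before either can read the other’s data.

# Purely hypothetical sketch: the same patient as exported by two imaginary
# "certified" EHR systems. Each export is internally consistent; neither is
# directly usable by the other system without a custom adapter.

vendor_a_record = {
    "pt_name": "DOE, JANE",
    "dob": "03/07/1961",                 # MM/DD/YYYY
    "probs": ["E11.9"],                  # bare ICD-10 code
}

vendor_b_record = {
    "patient": {"family": "Doe", "given": "Jane"},
    "birthDate": "1961-03-07",           # ISO 8601
    "conditions": [{"code": "E11.9", "display": "Type 2 diabetes"}],
}

def normalize_vendor_a(rec: dict) -> dict:
    # One of the per-vendor adapters every cross-system interface ends up needing.
    last, given = [part.strip().title() for part in rec["pt_name"].split(",")]
    month, day, year = rec["dob"].split("/")
    return {
        "name": f"{given} {last}",
        "birth_date": f"{year}-{month}-{day}",
        "condition_codes": list(rec["probs"]),
    }

def normalize_vendor_b(rec: dict) -> dict:
    p = rec["patient"]
    return {
        "name": f'{p["given"]} {p["family"]}',
        "birth_date": rec["birthDate"],
        "condition_codes": [c["code"] for c in rec["conditions"]],
    }

# Without a shared national standard, every vendor pair needs another adapter
# like these, and no one is paid (or obligated) to write them.
assert normalize_vendor_a(vendor_a_record) == normalize_vendor_b(vendor_b_record)

Multiply that adapter‒writing burden across 400‒plus certified vendors and the economics of “information blocking” start to look less like deliberate malice and more like the default outcome.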

All of which raises the question: absent the kind of National Standards other industries use to effectively neutralize competing interests ‒ who is truly to blame? Congress, for actively “blocking” the development of National Standards ‒ or ONC, for lacking the technical ability to police an entire industry for “information blocking?”

But that’s also a serious charge. How is Congress actually ‘blocking’ the development of National Standards in healthcare? For insight into that exact dilemma ‒ it’s important to link the release of ONC’s report with another headline from last week.

Congress Continues to Block Nationwide Unique Patient Identifier

There’s that word again ‒ ‘block’ ‒ only this time aimed squarely at Congress ‒ not the EHR ISV community. According to the article, the resulting damage from this one ‘block’ has serious and direct patient consequences.

According to a survey of healthcare CIOs conducted by the College of Healthcare Information Management Executives, error rates due to patient mismatching averaged eight percent and ranged up to 20 percent.  In addition, 19 percent of the respondents indicated that their hospitals had experienced an adverse event during the course of the year due to a patient information mismatch.

The history of this one Congressional ‘block’ goes back to the passage of HIPAA, and I wrote about it as part of that five‒part series on interoperability just over a year ago.

Who Stole U.S. Healthcare Interop?

Long forgotten is a similar dilemma in a vastly different industry ‒ motor vehicle manufacturing. For decades, car companies issued their own vehicle identification numbers. The resulting mess made it virtually impossible for consumers (let alone law enforcement) to tell if a car they were about to buy had been stolen, was a lemon, was part of a manufacturer recall, or had serious damage (manmade or otherwise) somewhere in its history.

A vehicle identification number, commonly abbreviated to VIN, is a unique code including a serial number, used by the automotive industry to identify individual motor vehicles, towed vehicles, motorcycles, scooters and mopeds as defined in ISO 3833. VINs were first used in 1954. From 1954 to 1981, there was no accepted standard for these numbers, so different manufacturers used different formats. In 1981, the National Highway Traffic Safety Administration of the United States standardized the format. It required all over-the-road vehicles sold to contain a 17-character VIN. VIN [Wikipedia]

Today, that VIN is stamped on every major part of every vehicle manufactured, in a way that lets consumers and law enforcement check the status of any car (or car part) quickly and easily ‒ nationally.
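For a sense of what a single national format buys you, here is a small Python sketch of the check‒digit validation the 17‒character VIN standard makes possible. The transliteration table and position weights follow the publicly documented U.S. check‒digit rule (position 9 is the check digit); the sample VIN is a commonly cited illustrative value, not a claim about any real vehicle.

# Sketch: validating a VIN check digit. Because every post-1981 U.S. VIN uses
# the same 17-character format, anyone (buyer, dealer, law enforcement) can run
# this same check. Letters I, O and Q are never used in a VIN.

TRANSLIT = {
    **{str(d): d for d in range(10)},
    "A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": 7, "H": 8,
    "J": 1, "K": 2, "L": 3, "M": 4, "N": 5, "P": 7, "R": 9,
    "S": 2, "T": 3, "U": 4, "V": 5, "W": 6, "X": 7, "Y": 8, "Z": 9,
}
WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]

def vin_check_digit_ok(vin: str) -> bool:
    # True if the ninth character matches the check digit computed from the rest.
    vin = vin.upper()
    if len(vin) != 17 or any(ch not in TRANSLIT for ch in vin):
        return False
    remainder = sum(TRANSLIT[ch] * w for ch, w in zip(vin, WEIGHTS)) % 11
    expected = "X" if remainder == 10 else str(remainder)
    return vin[8] == expected

print(vin_check_digit_ok("1M8GDM9AXKP042788"))  # True for this widely cited example VIN

There is no equivalent check anyone can run against a patient record, because there is no equivalent national identifier to check.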

So, there’s really no mystery to this Kabuki dance between ONC, ISVs and Congress. ONC is firmly stuck between enormous commercial interests and a Congress clearly recalcitrant to authorize the development of any National Standard for healthcare IT (either a national patient identifier or, more broadly, data interoperability). ISVs are, of course, competing aggressively for revenue and market share in any way they can. Patients are the collateral damage of these competing interests.

It’s painful to watch, and it’s painful to realize the complete and utter disregard by all for basic patient safety ‒ but that is the healthcare system we have built. No, it’s not broken ‒ it’s just been optimized for revenue and profits, not safety and quality. Congress simply refuses to rein in the commercial interests of the ISVs, and ONC is simply an effective buffer that hides Congress’s clear ‒ financially motivated ‒ reluctance.

Everyone’s well meaning, of course. Everyone’s doing “the best they can,” but it’s very much like a quote from the recent Ken Burns documentary on cancer. With a little substitution, we can easily apply the title of the second installment of that epic series directly to the healthcare interoperability dilemma.

The important thing is that the viral National Patient ID theory was not wrong. The environmental Meaningful Use theory was not wrong. The hereditary Vendor API theory was not wrong – they were just insufficient. It was like the blind man and the elephant. They were catching parts of the whole and then all of sudden – if you stepped back – you saw the whole elephant. Siddhartha Mukherjee, MD ‒ Top 10 Quotes From the Ken Burns Documentary: ‘Emperor of All Maladies’

Unlike the war on cancer, we haven’t seen ‒ nor do we appear to be looking for ‒ the whole elephant.


This article first appeared on HIT Consultant

Filed Under: Interoperability

Why Uber Won’t Be Coming To Healthcare

March 22, 2015 By Dan Munro

Headlines abound with the idea that Uber will shortly appear to massively disrupt the healthcare industry in the same way it has disrupted the taxi industry. Some boldly proclaim it’s not only here, but here to stay.

The practical challenges are daunting, of course ‒ not least the assumption that there’s an enormous, untapped pool of resources (drivers in Uber’s case, doctors in healthcare’s) eagerly looking (and available) to turn big blocks of idle hours into cold hard cash.

But there’s another reason and it’s tied very directly to the risk-reward model of early stage venture investing. The challenges from that perspective were described recently by one of Uber’s earliest investors ‒ Bill Gurley.

Bill was in Austin last week and appeared on stage with Malcolm Gladwell at the annual tech festival (and rite of spring) known simply as South by Southwest (SXSW). Malcolm’s credentials as a science/healthcare journalist (and author) are, of course, just as impressive as Bill’s are in venture investing.

Given this exact pairing, it came as no surprise when the topic of healthcare appeared so quickly in their 60-minute chat.

Malcolm: Last time I saw you we talked a lot about healthcare and I thought it would be really fun to start with the question of whether your world ‒ technology world ‒ can help us fix our healthcare problem.

Fun is a relative term for the carnival atmosphere of SXSW, but Bill was gracious and dove right in.

Bill: Yup, I’ve been involved in a number of companies that have used the resources that Silicon Valley [is] helping create ‒ the smartphone infrastructure, the internet, the cloud ‒ to try and make industries more efficient. Things like OpenTable and Uber that have been successful ‒ and you look at the technology and what’s possible and your immediate reaction is ‒ but of course, there must be hundreds and hundreds of opportunities for us to solve the healthcare problem. And so about a year‒and‒a‒half ago I tweeted, you know, I’d love to roll up my sleeves and see if I can find one of these opportunities and help an entrepreneur in this field and I met with ‒ I don’t know ‒ a hundred companies ‒ and I became more and more skeptical as I went through the process.

This tweet was one of several that signaled Bill’s original interest in healthcare more broadly.


That interest ‒ lasting about 14 months ‒ proved short-lived.

Bill: And the real problem is ‒ and I don’t think entrepreneurs realize this ‒ but there’s an assumption of market forces when you do a startup. Like you expect customers to pay for value and to not pay for bad things ‒ and to want to be more efficient ‒ and the physics are just completely mucked up in the healthcare system. Like those drawings where you can’t tell which way is up ‒ and everywhere you turn it’s like that.

He then went on to describe the HITECH Act (enacted under Title XIII of ARRA in 2009), which was designed to pay doctors to implement electronic health record (EHR) software. To date, the government has paid out about $29 billion to all types of healthcare providers under the Act, and more “incentive” payments are in the Government pipeline.

Bill: Putting [the software] in was Meaningful Use 1 ‒ that’s $44,000 ‒ [but] because they’re not sure you’ll use it ‒ Meaningful Use 2 is proving that you’re using the software you put in place in Meaningful Use 1 and that’s another $17,000 for the doctor.  And it’s insane. It’s asinine.

He’s definitely not alone in that assessment. One of the Top Ten Healthcare Quotes I selected for 2014 was this one from Google co‒founder Sergey Brin:

Generally, health is just so heavily regulated. It’s just a painful business to be in. It’s just not necessarily how I want to spend my time. Sergey Brin during Fireside chat with Vinod Khosla 

Veteran VC blogger Fred Wilson wrote this on his popular blog ‒ AVC.

When we look at healthcare, what’s wrong with it, and what needs to happen to fix it, we can’t see as clearly how the web, technology, and large networks of engaged users can impact healthcare in a positive way. But that is starting to change. We know that consumers need to take more control of their healthcare choices, their healthcare costs, and their health. And we know the web and large networks of engaged users can help all of that happen. It is likely that we’ll be doing more looking and studying and less investing in healthcare for a while (as we did in education). Fred Wilson ‒ Healthcare ‒ November, 2011

All of which maps directly to the seminal piece written by Silicon Valley legend Steve Blank in 2012. The headline was simply (and aptly) Why Facebook is Killing Silicon Valley.

If investors have a choice of investing in a blockbuster cancer drug that will pay them nothing for fifteen years or a social media application that can go big in a few years, which do you think they’re going to pick? If you’re a VC firm, you’re phasing out your life science division. As investors funding clean tech watch the Chinese dump cheap solar cells in the U.S. and put U.S. startups out of business, do you think they’re going to continue to fund solar?  And as Clean Tech VC’s have painfully learned, trying to scale Clean Tech past demonstration plants to industrial scale takes capital and time past the resources of venture capital.  A new car company? It takes at least a decade and needs at least a billion dollars. Compared to IOS/Android apps, all that other stuff is hard and the returns take forever. Why Facebook is Killing Silicon Valley ‒ May, 2012

The reality of early-stage venture investing in healthcare is mirrored across multiple vectors. Big, successful venture capitalists just aren’t interested in healthcare until it can demonstrate the unicorn-sized returns they’re now accustomed to. That scale (and rapid timeline) is integral to their “investment thesis.” Healthcare doesn’t have the same unicorn-like trajectory as other industries, so the signaling is crystal clear. The one venture bet that everyone is comfortable making is that an Uber-like disruption won’t be happening in healthcare.


This article first appeared in Forbes (March 2015)


Filed Under: Systemic Flaw, Tech

Why I’m Not Volunteering For Google’s Medical Study

July 28, 2014 By Dan Munro

Yesterday, Re/code science reporter James Temple wrote an essay saying that he would be happy to volunteer his personal data for Google’s “Baseline Study.” The argument in favor of “citizen science” is definitely popular, but it also masks some larger, more systemic issues with health care — specifically here in the U.S.

Color me skeptical, but I’m inclined to pass on Google’s offer to help with any of its long-term, lofty ambitions around citizen science, for three reasons:

  • Legal risk
  • Health-care data trust in Google
  • Big vanity projects

Relative to legal risk, Temple did identify one big element directly:

But one point did initially give me pause: The information handed over will include full genome sequences of individual participants.

He even went as far as to get the correct legal assessment from Stanford law professor Hank Greely:

 “I’m not saying people shouldn’t sign up for this,” [Greely] said in an interview. “But they need to know going into it that nobody can honestly promise you anonymity or confidentiality.”

That’s a pretty big personal risk, of course, but it’s only one element of the legal assessment. The larger issue that I see is the potential effect of donating genetic data relative to GINA, the Genetic Information Nondiscrimination Act of 2008. GINA is the federal legislation designed to protect my genetic data (against abuse), but there’s a rather lengthy list of things it won’t do:

  • GINA’s health-coverage nondiscrimination protections do not extend to life insurance, disability insurance and long-term care insurance. Wow — some pretty big loopholes there.
  • GINA’s employment provisions generally do not apply to employers with fewer than 15 employees. Okay — and how many startups/employers are there with fewer than 15 employees?
  • For health coverage provided by a health insurer to individuals, GINA does not prohibit the health insurer from determining eligibility or premium rates for an individual based on the manifestation of a disease or disorder in that individual. Lots of big legal phrasing here that sounds like I’m the one at risk.
  • For employment-based coverage provided by group health plans, GINA permits the overall premium rate for an employer to be increased because of the manifestation of a disease or disorder of an individual enrolled in the plan, but the manifested disease or disorder of one individual cannot be used as genetic information about other group members to further increase the premium. Gosh — I’m not an attorney, but it sounds like I need to be.
  • GINA does not prohibit health insurers or health plan administrators from obtaining and using genetic-test results in making health-insurance payment determinations. Okay, so just tell me — is there any legal protection under GINA at all?

Perhaps the single most important element to all of this is simply the timeline. GINA was signed into law on May 21, 2008. Just over six years ago. In legal terms, that’s an unproven law (in both directions), and the greatest risk remains almost entirely mine (to prove abuse). Once my genetic data is digitized, sold, used and reused, I have little faith in this legal soup to protect my individual legal rights without incurring huge legal expenses personally. Especially as it relates to a $400 billion (market cap) company like Google.

Which brings me to my second point: Health-care data trust in Google.

Some would say my lack of faith in Google is overly cynical, but a little history here is warranted. Remember Google Health? The data repository that consumers could use (for free) to store their medical information?

Google Health has been permanently discontinued. All data remaining in Google Health user accounts as of January 2, 2013 has been systematically destroyed, and Google is no longer able to recover any Google Health data for any user.

Sorry, but that didn’t engender a lot of trust for me to let Google handle my health-care data. In itself, not all that bad — except that this wasn’t the only Google health-care “misstep.”

Few will remember, but next month is the three‒year anniversary of Google’s $500 million settlement with the Department of Justice.

As the briefest of backgrounders, from 2003 to 2009 the GOOG basically allowed unlicensed Canadian pharmacies to illegally advertise prescription drugs in the U.S. using Google’s AdWords program. According to the allegations, Google actively provided support and advice to the pharmacies to help maximize those campaigns which were lucrative to both the pharmacies and — of course — Google.

So, what was the harm? Unlicensed pharmacies have no obligation to actually sell real drugs. For all we know, absent any regulatory consumer protection, those pharmacies were selling sugar pills in place of life‒saving medications. We’ll save the debate about U.S. pharmaceutical pricing for another day, but working outside of the regulations for something as critical as prescription drugs didn’t engender a lot of Google health-care trust for me. They paid up, of course, but that was really just Larry Page’s “Get Out of Jail Free” card.

The third argument against Google is really in its own words. The company has made crystal-clear its interest (or lack thereof) in health care.

Generally, health is just so heavily regulated. It’s just a painful business to be in. It’s just not necessarily how I want to spend my time. Even though we do have some health projects, and we’ll be doing that to a certain extent. But I think the regulatory burden in the U.S. is so high that I think it would dissuade a lot of entrepreneurs. — Sergey Brin at Khosla Ventures CEO Summit

This last one highlights Google’s real interest in health care: vanity. You’re kidding yourself if you think that providing your genetic data (for free or paid) will make a meaningful contribution outside of the bio/life sciences community. Why do I need Google for this? I don’t — and I’d prefer to work with organizations that have an expressed commitment (even if it’s economic) to the scientific rigors of a clinical trial process.

We have some very serious, big, ugly, hairy problems in health care, but this darling duo of Silicon Valley can’t (or won’t) engage them directly because they’re “too regulated”? Here’s a short list of problems with our current system:

  • The U.S. health-care system is running at almost $4 trillion per year — an economic unit larger than Germany — consumes 18 percent of our entire GDP and is growing at about five percent per year (for as far as the eye can see — Obamacare or not).
  • PPO coverage for an American family of four is now more than $23,000 annually.
  • In 2012, there were 84 million Americans who either had no health insurance or were underinsured during the year. That’s effectively one‒third of the (under-65) population.
  • In a recent study by the Commonwealth Fund, the U.S. health-care system ranked dead last compared to 10 other countries.
  • Deaths from preventable medical errors in the U.S. are estimated at somewhere between 200,000 and 400,000 per year.

Let’s take that last one. It’s a really big number — and kind of hard to grasp — but even the low end works out to roughly 550 deaths per day, which is effectively the same as if the world’s largest passenger airliner — the Airbus A‒380 — crashed every day, with no survivors. The worldwide uproar would have the entire fleet of A‒380s grounded by day three.

I’m absolutely sure that Google wants my genetic data — preferably for free, of course — and will say anything in order to get it. Does that mean it has earned my trust to use that data as part of an ambiguous, long-term experiment? Not with my genetic data. At least not yet. Thanks for the offer, but no thanks.


This article first appeared in Re/Code (July 2014)

Filed Under: Genetics


Dan Munro is an author and Forbes Contributor who lives outside of Phoenix, Arizona. He has written for a variety of national publications at the intersection of healthcare policy and technology.
