“We really need to drop interoperability as a competitive differentiator in this industry. Once everyone comes to the table and recognizes that we have a moral obligation to provide patients with their health records, I think we’re going to be much better off.”
Zane Burke, President, Cerner Corporation
The ONC (Office of the
National Coordinator for Health Information Technology) issued its Draft
2017 Interoperability Standards Advisory last week. It has issued an interoperability
advisory yearly since 2015, the year after it issued its 10-year vision for
healthcare information technology (HIT) along with its interoperability
roadmap. I reviewed these efforts at the time[1]
& more recently I wrote: “In
June of 2014, the Office of the National Coordinator for Healthcare
Information Technology (ONC) published its 10-year vision for health
information technology (HIT) in the U.S. Interoperability among HIT
components was a primary & enabling aspect of this
vision. Subsequently, they also published a roadmap for such
interoperability that covered in some detail the technical, administrative
& regulatory tasks & issues associated with achieving interoperability
in the healthcare space (…). I was generally skeptical of what I saw as an
effort that emphasized standards development & adherence as well as a
certification process similar to the Meaningful Use process that I have not
seen produce results other than software that is compliant with the test
criteria, but not especially usable or useful for healthcare organizations
(at least in my experience & opinion).”[2]
In that post, I defined
interoperability (I14Y, a geeky contraction) as “the design criteria that allow independent software systems to share
information, function & workflow with each other”. In healthcare, we
are just trying to provide information sharing, even though in other industries
I’ve worked in (financial systems, discrete manufacturing: auto, aero) we have
achieved functional & workflow integration as well. The ONC started
specifying how important information integration is to healthcare long before
it published its 10-year vision. We have been talking about & attempting to
provide minimal levels of integration for at least 10-12 years. Integration is
both the actual work done to achieve I14Y & the state of systems on which
that work has been performed: we say they are integrated. Given the focus that the
ONC, HHS, CMS, DoD etc. have had on interoperability, what is our state of
integration in HIT & what does the 2017 Advisory tell us about it?
The Advisory is in four sections with
three appendices as follows:
- Section I: Best Available Vocabulary/Code Set/Terminology Standards and Implementation Specifications
- Section II: Content/Structure Standards and Implementation Specifications
- Section III: Best Available Standards and Implementation Specifications for Services
- Section IV: Questions and Requests for Stakeholder Feedback
- Appendix I – Sources of Security Standards and Security Patterns
- Appendix II - Revision History
- Appendix III – Responses to Comments Requiring Additional Consideration
The first three sections specify the standards that are
relevant for each of the subtopics covered. Section I: Vocabulary has 22
subsections, Section II: Content/Structure has 14 subsections & Section III:
Services has 8 subsections. Each subsection covers a specific HIT area that is
relevant to clinical use & regulatory reporting. These subsections range
from such topics as care plan to ePrescribing to public health reporting. 35
separate standards are cited as being required for interoperability to be
provided, i.e. for integration among systems to occur. Many of these standards,
by ONC’s own admission, are in early stages of adoption – some only in the
proposal stage. Standards range from established practice such as ICD-10,
LOINC, SNOMED or HL7 CCD & a variety of IHE messaging standards (XDR, XDS,
CHD etc.) to HL7 V3 CDS & FHIR. In addition, Appendix I lists nine IHE security
standards & 16 other security standards (NIST, FIPP, OpenID, OAuth, …) for a
total of 25 security standards that are recommended. This is a total of 60 (!) standards either required or
recommended by the ONC to meet interoperability (& integration) requirements.
I have been designing & developing software for 25+
years &, at least in my experience, it is not possible to provide adherence
to 60 standards in any one software system. The only way this is possible is to
partition the system by function (or by user) & attempt to provide
standards conformance within each partition. I have done this in several
instances, most notably with respect to the OMG interoperability
standard (of which I was co-author). The result was several incompatible systems
from major vendors (including Digital Equipment, IBM, Sun Microsystems &
HP). As I have written elsewhere, we accepted the Sun Microsystems
representative’s view of interoperability, which was that if my system sent his
system a well-formed request for data, his system could send my system an error
message. The problem, not just with complex ecosystems of standards but even
with single standards that apply to important function, is that vendors
have already developed architectures, infrastructure & application-level
software that may or may not have anything to do with adherence to the
“standard”. An example from healthcare that I was involved with would be the New
York eHealth Collaborative (NYeC). While a (way too large) committee was working
on the overall architecture for New York statewide health information exchange,
individual RHIOs & HIEs were adopting vendor solutions that did not conform
with, & were unlikely to conform with, the “standard”. Once adopted,
deployed (& paid for…), it was very difficult to get the individual
healthcare organizations or vendors involved to move toward a standard
architecture that did not represent the implementations already in place. I
have seen this same situation repeated several times in healthcare.
So, again, in my experience, just specifying a set of
standards is not a solution for interoperability. This is especially true if
many of the standards have been developed independently & not in
conjunction with other standards in the proposed set, as is the case in the ONC
proposal. It is also not realistic to expect vendors to conform to standards
that may be less effective & performant than their current products &
architectures. Providing ways of testing software against such standards often
just gives testing organizations the opportunity to report on the lack of
conformance of products that may already be in use. The situation with the ONC
interoperability set is more complex.
An example would be the Consolidated Clinical Document Architecture
(C-CDA). C-CDA is an extension of the HL7 Clinical Document Architecture using
XML templates to represent specific clinical (patient) data. It references HL7
CDA 2.0 & HL7 V3 Reference Implementation Model (RIM). C-CDA has nine (9)
sections for different data sets including: 1) continuity of care document, 2)
consultation note, 3) diagnostic imaging report, 4) discharge summary, 5) history
& physical note, 6) operative note, 7) procedure note, 8) progress note
& 9) unstructured document. For transmission, C-CDA documents are typically
exchanged via the IHE profiles XDR (cross-enterprise document reliable
interchange, a point-to-point push) & XDS (cross-enterprise document
sharing, a registry/repository model). (This IHE XDR should not be confused
with the Sun Microsystems XDR serialization format, now an IETF standard.)
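To make the document structure concrete, here is a minimal sketch of reading section codes & titles out of a C-CDA-style XML body with Python’s standard library. The fragment is heavily trimmed & illustrative only; a conformant C-CDA instance carries many more required templateIds, codes & header elements.

```python
import xml.etree.ElementTree as ET

# Heavily trimmed, illustrative C-CDA-style fragment; a conformant document
# carries many more required templateIds, codes & header elements.
SAMPLE = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody>
    <component><section>
      <code code="10160-0" codeSystem="2.16.840.1.113883.6.1"/>
      <title>Medications</title>
    </section></component>
    <component><section>
      <code code="48765-2" codeSystem="2.16.840.1.113883.6.1"/>
      <title>Allergies</title>
    </section></component>
  </structuredBody></component>
</ClinicalDocument>"""

NS = {"hl7": "urn:hl7-org:v3"}

def section_titles(xml_text):
    """Return (LOINC code, title) pairs for each section in the body."""
    root = ET.fromstring(xml_text)
    pairs = []
    for section in root.iterfind(".//hl7:section", NS):
        code = section.find("hl7:code", NS)
        title = section.find("hl7:title", NS)
        pairs.append((code.get("code") if code is not None else None,
                      title.text if title is not None else None))
    return pairs

print(section_titles(SAMPLE))
# [('10160-0', 'Medications'), ('48765-2', 'Allergies')]
```

Even this toy example hints at where the trouble starts: everything interesting lives in coded attributes & nested templates, so two vendors can both emit “valid” XML that structures the same clinical content quite differently.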
I’ve had a good deal of experience with C-CDA, most deeply
as a technology consultant to a start-up doing medication reconciliation. As
part of their certification to be able to use SureScripts data, they had to be
able to pass SureScripts C-CDA testing. We (myself, the CTO
& two programmers) worked for about 4 months to get to the point where this
was possible. The majority of the issues we had were with the substantial
ambiguity in the C-CDA standard, the huge verbosity & redundancy in the
C-CDA format & issues with how different EHRs used XDR/XDS. I knew at the
time that our experience was not unique, but it was with mixed feelings that I
read the SMART C-CDA Collaborative study[3]
that sampled information exchange from 107 healthcare organizations using 21
different EHRs. Using two different testing regimes, they found 615 errors in
their sample across six broad error categories. They concluded: “Although progress has been made since Stage 1 of
MU, any expectation that C-CDA documents could provide complete and
consistently structured patient data is premature. Based on the scope of errors
and heterogeneity observed, C-CDA documents produced from technologies in Stage
2 of MU will omit key clinical information and often require manual data reconciliation
during exchange.” This is just one example of the ONC proposed standards
being ineffective &/or premature.
OK – interoperability is not totally easy (although it’s not
as hard as we have made it in healthcare). How have we solved this issue in
other industries? I know, we can’t compare healthcare to other industries…
well, actually, most of the people who have asserted that to me haven’t worked in
other industries, & I believe that we have a good deal to learn from how
other industries have addressed technical issues.
I have been involved with I14Y efforts in several other
industries, most notably in aerospace (Boeing Commercial Airplane Company, BCAC),
auto manufacturing (General Motors) & financial services (Goldman Sachs, as
a consultant to Ernst & Young Global Financial Services). The two projects
in discrete manufacturing were quite similar. Boeing was attempting to develop
a completely digital design process for the 777 series. Up until this time,
their design process had used a combination of paper designs, digital (mostly
CAD-based) designs, massive bill-of-material spreadsheets used in paper form
& several very large bill-of-materials database systems (~80,000 tables). I
was the architect for Digital Equipment’s relational database, Rdb/VMS at V1
& V2 that Boeing was using, so they approached me to assist with this
project. There were on the order of 100 different digital & paper-based
systems that needed to be consolidated into formats & applications so they
could be stored & operated on digitally. A diverse team that included
design engineers, manufacturing engineers, engineering-manufacturing experts,
front-end data specialists, database specialists & even pilots &
managers was assembled. Work proceeded on three paths: 1) standard definitions
for resources across design, engineering & manufacturing silos, 2) standard
formats for all resources, 3) programming interfaces (APIs), transport
protocols & workflows for information storage & sharing. The effort was
led by a Boeing Engineering Fellow & an Executive VP. Definitions &
formats were developed & reviewed in about nine months, APIs &
workflows were available in about a year & a testbed was in place &
functioning after about 14 months (from the start of the project). This effort
produced an integrated system that aligned the design, engineering &
manufacturing processes for the 777 commercial airplane. It included thousands
of definitions & data formats & a small number of APIs, & allowed
information to flow across organizational & functional silos.
The effort at General Motors, called C4, was similar in
structure. GM created the C4 “car company” – nothing got done at GM at that
time unless it was done by a “car company”. C4 was responsible for developing a
paperless design system that shared information with engineering &
manufacturing. The difference with the BCAC effort was that the GM
organizations were not well aligned & many, such as Powertrain, were not
committed to the C4 goals or project. C4 never got very far despite a very large
commitment of resources (financial & organizational) from GM.
The Goldman Sachs effort was smaller. Its purpose was to be
able to integrate the results from several trading systems to be able to
produce a synthesized view of capital flows, gains & losses in near
real-time. The project was a must-have & was initiated by Goldman’s then
Chairman & CEO John L. Weinberg, who made it clear that this was to get
done. Ernst & Young was brought in to provide project expertise &
project management & I served as a consultant to E&Y (I was at Digital
Equipment at the time). The project was similar to the ones in discrete
manufacturing as it emphasized standardized vocabulary & formats as well as
an information bus to share data & function. The whole project took 9 months
& resulted in a system that was in use for about 10 years before it was
rewritten to be web-centric.
What are the similarities among these projects that we can
learn from, both positively & negatively?
- Executive understanding & sponsorship is essential. No project as complicated as multisystem integration will succeed without it. Buy-in must also occur at operational levels so that priorities & resources are properly set & utilized.
- The Boeing project had both executive sponsorship & operational buy-in. It was solving a problem that everyone agreed was in the company’s interest to solve.
- The Goldman Sachs project also had this & even though it was a smaller project, it was both technically complex & organizationally & culturally challenging, so the CEO’s imperative was necessary & effective.
- The GM project had executive buy-in, but not much focus. It did not have operational buy-in in the car companies (Chevrolet, Cadillac, GMC, etc.) or in the functional units (Design, Powertrain, Components & Subassembly, GM Research etc.). The primary operational manager was a long-time GMer from Pontiac racing & Chevrolet, but not even he could get the various groups to cooperate.
- My experience on this aspect of the issue in healthcare is that it is more complicated. None of the private-sector projects had regulatory requirements to meet (beyond Federal & international safety standards) or mandates that specified which standards were appropriate. There is no one “CEO” who can decide to “make it happen”. Instead, thousands of CEOs must be convinced of the necessity of integration, often only through the leverage provided by regulation. Vendors have much more leverage in healthcare – there are many more of them & they are already established in many segments of the industry. If you are a CAD vendor, losing Boeing’s or GM’s business could be catastrophic. If you are Epic or Cerner, losing a single healthcare organization’s business is not as big a deal (unless it’s the VA or DoD or maybe Partners or Kaiser, although Kaiser seems pretty set).
- Normalization matters! – No integration project can succeed without the effort to develop & agree on a common vocabulary & common formats for static storage & use of data. The same is true for in-transit formats & processes.
- The Boeing project spent more time on this than any other aspect including coding & deploying the solution, & it was the primary reason (IMNSHO) that the project was as successful as it was.
- A recent study that I led[4] looked at data quality in EHRs & readiness for analytic capability at Federally Qualified Health Centers (FQHCs). One of the primary issues with data quality found in the study was the use of non-standard definitions for core concepts (patient, encounter, etc.), even though the Bureau of Primary Health Care (Health Resources and Services Administration, HHS) publishes & requires standard definitions for the reporting that FQHCs do. The only health centers in the study that generally did not have issues with normalization were the ones that had done substantial work on this in order to populate a data warehouse.
- Broad participation improves design & practice. – Including stakeholders & end-users as well as technical specialists ensures that the function developed is both usable (that is, easy & convenient to use, perhaps even transparent) & useful (that is, it solves the users’ & stakeholders’ problems, not the problems of interest to the technical experts).
- Again, the Boeing project was the leader here, as the project committees all had very broad representation from both the user base & technology groups. The model really was that all perspectives were welcome. Did this work perfectly in practice? No, of course not, but it worked well enough that the solution, when deployed, was used as intended.
- The GM project had an interesting aspect that I was involved with. The EVP who was the head of the “C4 car company” contracted with the urban & industrial anthropology group at Wayne State University, & Marietta Baba (then Professor of Anthropology at WSU, later of Michigan State University) ran a project to study GM groups as separate cultures[5]. The results of this study allowed me (& other technologists) to design technology adoption & transfer processes that were much more effective than if we had known nothing about the organizations.
- None of these successful projects were “standards-based” except to the extent that standards already existed & were in general use. The Boeing project used some definitional & functional standards where they provided better ways of solving a problem, but the Goldman Sachs project was proprietary.
- Neither the Boeing nor the Goldman Sachs project required the development of products or applications supporting standards other than those already in use by the organizations. No new standards were required. Among the many reasons that the GM C4 project failed was its requirement of many new standards, not all of which were already in use, as well as the adoption of a new operating system (Berkeley Unix).
- While it could be argued that the majority of the ~60 standards referenced in the HHS I14Y roadmap are already in use, it is actually the case that many of the transport & security standards are not currently in use in healthcare, & that even many of the “healthcare” standards are not in general use. Several of these are also controversial, such as C-CDA, even though they are required. This standards ecosystem is simply not in use as a coherent whole in healthcare at this time.
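The normalization lesson above can be sketched as a simple mapping exercise. The canonical “encounter” fields & the vendor field names below are hypothetical, not real EHR schemas, but the pattern is the one that made the Boeing & Goldman Sachs projects work: agree on one vocabulary first, then map every source system into it.

```python
# Hypothetical canonical definition agreed on by all parties; the field names
# & the vendor-specific records below are illustrative, not real EHR schemas.
CANONICAL_ENCOUNTER_FIELDS = {"patient_id", "encounter_date", "encounter_type"}

# Per-source mappings from each vendor's local vocabulary to the canonical one.
VENDOR_MAPS = {
    "vendor_a": {"pt_num": "patient_id", "visit_dt": "encounter_date",
                 "visit_kind": "encounter_type"},
    "vendor_b": {"mrn": "patient_id", "svc_date": "encounter_date",
                 "enc_class": "encounter_type"},
}

def normalize(record, source):
    """Rename a source record's fields into the canonical vocabulary."""
    mapping = VENDOR_MAPS[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = CANONICAL_ENCOUNTER_FIELDS - out.keys()
    if missing:
        raise ValueError(f"{source} record missing: {sorted(missing)}")
    return out

print(normalize({"pt_num": "P1", "visit_dt": "2016-11-01",
                 "visit_kind": "office"}, "vendor_a"))
```

The code is trivial; the hard part, as every project above shows, is the organizational work of getting all parties to agree on the canonical definitions in the first place. Without that agreement, every pair of systems needs its own one-off mapping.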
So what do these lessons tell me about interoperability in
healthcare? Here are some thoughts:
- We need more than the ONC or CMS to mandate interoperability. Regulations cannot ensure that we’ll meet this goal in 2017, 2018 or at any time… We need a bottom-up movement toward it, driven by thought & operational leaders who have decided to make it a priority. This is not just true of I14Y, but of any goal we expect to actually achieve.
- We need a roadmap that is not a listing of standards, but rather a compilation of the tasks that need to be done, first to get ready for interoperability & then to achieve it. Foremost among these is the development of consensus around definitions & processes. It may appear that this is not possible, but I believe that if we have the will to develop such a consensus, we can & will do it. Once this is in place, I14Y becomes much simpler; if it is not, then any interoperable information exchange has to be built on a one-to-one basis, essentially as a one-off effort that will need to be changed whenever a definition is added or modified.
- We need to include a broader range of people in the effort to understand, plan & achieve I14Y. This ensures that the capabilities that are developed & deployed meet the actual needs of the people who will use the information provided by the integration.
- In light of this, we need to have better understanding of the operational & business models that require integration. HIEs, ACOs etc. have not had compelling operational models & have presented less than successful financial & business models. I believe there are compelling reasons to develop this capability – we just need to agree on them.
- Finally, we need pragmatic approaches to a solution. Trying to provide general solutions for all possible situations is not possible. We need to develop some specialized solutions (some that may be able to be generalized), deploy them & get experience using them in order to understand what is needed, how the solution will be used & how to provide it.
- Direct is an excellent example of this. The use of Direct for provider-to-provider exchange of clinical (& other) information has been successful because it provides a (relatively) simple solution to a specific problem.
- FHIR may be another example. The recent announcement by the Regenstrief Institute that it will develop & test a “point-to-point” HIE using the HL7 FHIR technology is an example of the type of experimentation & innovation that will provide solutions to the integration conundrum. Providing practical ways of addressing specific critical problems will go a much longer way toward creating an integrated healthcare ecosystem than requiring the use of multiple, overlapping standards & counting on vendors to implement products that are compliant (in some vaguely defined way).
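As a rough illustration of why FHIR-style exchange lowers the barrier: a FHIR resource is a small, self-describing JSON object rather than a document bundle. The sketch below (using the DSTU2-era JSON shape, where `family` & `given` are lists) just serializes & parses a minimal Patient resource locally; in a real exchange this JSON would be the body of an HTTP GET of {base}/Patient/{id} against whatever FHIR endpoint the HIE exposes.

```python
import json

# A minimal FHIR (DSTU2-era) Patient resource; only a handful of fields,
# all of them standard FHIR elements.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": ["Doe"], "given": ["Jane"]}],
    "birthDate": "1970-01-01",
}

# On the wire this would be the JSON response to GET {base}/Patient/example
# from a FHIR server; serialize & parse locally to simulate the round trip.
wire = json.dumps(patient)
received = json.loads(wire)
print(received["resourceType"], received["name"][0]["given"][0])
# Patient Jane
```

Compare this to the C-CDA example earlier: the consuming system asks for exactly the resource it needs & gets back a compact, regularly structured answer, instead of parsing a large templated document & hoping the sender populated it consistently.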
We need to agree on why we are focusing on I14Y, not just
that we have to. Once we are agreed on motivation & the practical uses
& advantages of interoperability, it will be much easier to develop &
deploy the technologies that will enable these advantages.
[1] http://posttechnical.blogspot.com/2014/07/the-onc-interoperability-vision-opinion.html, The ONC Interoperability
Vision: An Opinion, & http://posttechnical.blogspot.com/2014/07/the-learning-healthcare-system.html, The Learning Healthcare
System. Both in July 2014.
[2] http://posttechnical.blogspot.com/2016/02/healthcare-information-technology-next_17.html, Healthcare Information
Technology: The Next 10 Years. February 2016.
[3] D’Amore, J.D. et al. 2014. Are Meaningful Use Stage 2 certified EHRs ready
for interoperability? Findings from the SMART C-CDA Collaborative. 1060-1068.
DOI: http://dx.doi.org/10.1136/amiajnl-2014-002883. First published online: 1 November 2014.
[4] cf. Path2Analytics
Project: Process & Results Review. Association of Clinicians for the
Underserved Annual Conference. Washington, DC. June 1-3, 2015.
[5] cf.
https://msu.edu/~mbaba/documents/MajorApplicationProjects.pdf.
Also: https://msu.edu/~mbaba/