
Your Interoperability Journey: Technology and Cost Considerations

On July 22nd, Tegria companies Bluetree and Cumberland hosted a virtual panel on the technological and financial considerations of interoperability. We sat down with panelists Deepak Sadagopan, SVP of Value Based Care & Population Health Informatics at Providence; Patrick McGill, MD, Chief Analytics Officer at Community Health Network; and Bhaskar Chowdhury, MD, AVP of IT at Geisinger Health Plan, to address some key questions.

Achieving interoperability and meeting regulations like the 21st Century Cures Act are becoming more challenging for healthcare organizations, including payers, providers, community-based organizations, and public agencies, particularly as the industry shifts its focus to social determinants of health.

How can healthcare work toward an interoperability solution that works for payers, providers, and other organizations with varying levels of technological and financial resources?

Mr. Sadagopan: I would actually argue, from a provider perspective, that meeting interoperability regulations is not as technically challenging as it used to be, thanks to the many years of work that providers and delivery systems have invested in CEHRT—whatever the system may be. We need to recognize that interoperability is less about how information moves from point A to point B and more about how information is captured, processed, and represented at those endpoints. The new regulations, while more challenging for payers, have not done much to change the state of the art. Sure, we can use FHIR and APIs, but it is helpful to remember that we have done this before as an industry—think ePrescribing, for example. ADTs have been around for a long time. Patient- and member-focused data availability has been part of the online availability of member information for three versions of Meaningful Use. Any vendor with competent technology would have implemented this using an SOA or microservices architecture, where the APIs can be adapted to current needs quite easily.

The challenging aspects of interoperability are yet to come—when we shift from exchanging information about one patient at a time to creating the capability to exchange information on populations. The standards don't exist, and neither do the capabilities in systems at both ends to process such information. There is a serious disconnect in healthcare: our business models are evolving toward compensation and payment at the population level, while our information infrastructure is still stuck in a fee-for-service (FFS) world. That change, when it comes, will be hard to navigate.
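To make the contrast concrete, here is a minimal sketch in Python of a single-patient FHIR read next to a population-level pull modeled on the (still-maturing) FHIR Bulk Data $export pattern. The server URL and group ID are placeholder assumptions, not a reference to any particular system.

    # Minimal sketch (assumed endpoint and IDs): per-patient FHIR exchange
    # versus an asynchronous, population-level bulk export kickoff.
    import requests

    FHIR_BASE = "https://fhir.example.org/r4"  # placeholder server

    # Today's pattern: one patient's record per request.
    patient = requests.get(
        f"{FHIR_BASE}/Patient/123",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    ).json()
    print(patient.get("resourceType"))  # "Patient"

    # Population pattern: kick off an asynchronous export for a whole
    # group (e.g., an attributed panel). A conformant server replies
    # 202 Accepted with a Content-Location URL to poll for the files.
    kickoff = requests.get(
        f"{FHIR_BASE}/Group/attributed-panel/$export",
        headers={"Accept": "application/fhir+json",
                 "Prefer": "respond-async"},
        timeout=10,
    )
    print(kickoff.status_code, kickoff.headers.get("Content-Location"))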

Dr. McGill: I believe the solution to the interoperability problem will arise organically, from an organization outside of healthcare trying to connect patients with their data and services. It will be difficult for health systems, especially smaller or more rural ones, to keep up with interoperability advancements. Yet this is where interoperability is needed most, because patients often require specialty or tertiary care at other locations. The shift to virtual care during COVID has opened up more options for care, which compounds the problem further. So I do believe a technology company from outside of healthcare could potentially solve the problem, but it would require a shift away from legacy healthcare thinking.

Dr. Chowdhury: The focus and spirit of the 21st Century Cures Act is to improve data sharing and make patients the owners of their data. The regulation is straightforward but involves a serious technical undertaking. Some of the requirements are complex, and health plans with limited resources and capabilities will run into three main issues:

  • Data Management: Understanding the data, the sources, and what data to maintain—along with access to the data, control over the data, and the authority to share. This could become a huge undertaking, depending on the complexity and organizational make-up of the health plan.
  • Technical Capabilities: The regulation introduces several technologies that are not common to most health plans.
  • Vendor Selection: Health plans must choose their vendors wisely—ensuring they can truly partner with them and that the vendors will adhere to compliance regulations.

Interoperability can be categorized into two main areas—first, receiving data into systems from various sources and then putting this data into usable form. What considerations can help organizations ensure that data is usable to inform strategy and care delivery?

Mr. Sadagopan: Actually, interoperability has three main areas. It starts with how information is captured into systems, and then come the two areas you mention. Most important of all is who is using the information: understanding the persona and their business process workflow is critical to implementing interoperability successfully. If you have a good understanding of how information is captured, the persona, and the business process, then the interoperability standard and communication modes become secondary to the overall solution framework. If system A captures smoking status as objective text under a vitals section, while system B records it as a problem on the problem list, then expecting a system C to rely on an ADT feed that draws its DG1 (diagnosis) segments primarily from the problem list will inevitably result in inadequate communication of information. The problem is amplified if the person consuming the information is actually accountable for having a smoking cessation conversation: they will have just missed their quality target for that encounter.
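As a small illustration of the mismatch Mr. Sadagopan describes, the sketch below uses simplified FHIR-shaped records (the LOINC and SNOMED codes are standard; the record shapes and consumer logic are illustrative assumptions) to show how a consumer that scans only problem-list diagnoses never sees a smoking status that another system captured as an observation.

    # Simplified sketch: the same clinical fact lives in different places
    # depending on the capturing system, so a consumer that checks only
    # one location silently misses it.

    # System A: smoking status captured as an Observation (social history).
    observation = {
        "resourceType": "Observation",
        "code": {"coding": [{"system": "http://loinc.org", "code": "72166-2",
                             "display": "Tobacco smoking status"}]},
        "valueCodeableConcept": {"text": "Current every day smoker"},
    }

    # System B: the same fact recorded as a problem-list Condition.
    condition = {
        "resourceType": "Condition",
        "code": {"coding": [{"system": "http://snomed.info/sct",
                             "code": "77176002", "display": "Smoker"}]},
    }

    def smoker_per_problem_list(conditions):
        """Mimic a consumer fed only problem-list diagnoses (a DG1-style
        view): System A's Observation never reaches this code path."""
        return any(
            coding.get("code") == "77176002"
            for cond in conditions
            for coding in cond.get("code", {}).get("coding", [])
        )

    print(smoker_per_problem_list([condition]))  # True: System B's fact arrives
    print(smoker_per_problem_list([]))           # False: System A's fact is lost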

Dr. McGill: One of the biggest concerns I have as a family physician and chief analytics officer is “what is the source of truth?” We have seen this with EHR interoperability regarding different sources of data, without a single source of truth. If the patient becomes the center of the data exchange, are they capable of becoming the source of truth for their clinical problems?

Dr. Chowdhury: Our approach was to start small:

  • Identify what data we have in-house, both from the health plan and from the clinical side that the plan has control over.
  • Ensure a clinical subject matter expert is available to help map data.
  • Identify data that is outside of the organization—carve out vendor data and verify the data is usable.
  • Implement master data management to enhance data quality and data integrity.
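
As one concrete example of that last step, here is a minimal sketch of deterministic patient matching, a common master data management building block. The field names and matching rule are assumptions for illustration, not Geisinger's actual implementation.

    # Hypothetical sketch: group records that agree on a normalized match
    # key; each group is one candidate "golden" identity for a data
    # steward (or a probabilistic pass) to confirm.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PatientRecord:
        source: str      # e.g., "health_plan" or "ehr"
        first_name: str
        last_name: str
        birth_date: str  # ISO 8601, e.g., "1980-04-12"

    def match_key(rec):
        """Normalize the fields the deterministic match hinges on."""
        return (rec.first_name.strip().lower(),
                rec.last_name.strip().lower(),
                rec.birth_date)

    def consolidate(records):
        """Group records sharing a match key into candidate identities."""
        groups = {}
        for rec in records:
            groups.setdefault(match_key(rec), []).append(rec)
        return groups

    records = [
        PatientRecord("health_plan", "Ana", "Silva", "1980-04-12"),
        PatientRecord("ehr", "ANA ", "silva", "1980-04-12"),  # same person
    ]
    for key, group in consolidate(records).items():
        print(key, "->", [r.source for r in group])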

What are some of the lessons learned that you would like to share as you continue on your interoperability journey?

Mr. Sadagopan: Interoperability is not about data systems, management, standards, or even communication protocols. Interoperability is a business problem. It's about who is capturing what information and how, and understanding who needs to consume that information, at what point in a process, and for what purpose. It is also about the economics: what it costs to get that information to that person at that point, and what the return on investment is.

Dr. McGill: Interoperability is complex and difficult, and it really takes a long-view approach. Addressing interoperability in its current form, with current tools, is not sustainable for many organizations, which causes them to give up. We need a better, more affordable solution that is not reliant on the EHRs.

Dr. Chowdhury: We faced several challenges. Some are unique to us, some are because we are an integrated delivery network, and some are expected due to the nature of the beast:

  • Data Consolidation: Determining sources and endpoints to build FHIR resources; what data the plan has control over; physical and logical layers; and terminology normalization.
  • FHIR Enablement: FHIR resource orchestration; terminology mapping; skill sets.
  • Authentication and Authorization: Authenticating unknown users; OpenID Connect and OAuth2 (see the token sketch after this list); security policy enablement.
  • Consent Management: Establishing standard processes; authorizing personal representatives.
  • Audit and monitoring.
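
On the authentication and authorization point, the token sketch below shows the shape of an OAuth2 client-credentials exchange a FHIR client might perform before calling an API. The endpoint URLs, scope, and credentials are placeholder assumptions; production deployments (for example, SMART Backend Services) layer additional requirements on top.

    # Hypothetical sketch of an OAuth2 client-credentials handshake.
    # TOKEN_URL, FHIR_BASE, and the scope are placeholders.
    import requests

    TOKEN_URL = "https://auth.example-plan.com/oauth2/token"  # placeholder
    FHIR_BASE = "https://fhir.example-plan.com/r4"            # placeholder

    def get_access_token(client_id, client_secret):
        """Exchange client credentials for a short-lived bearer token."""
        resp = requests.post(
            TOKEN_URL,
            data={"grant_type": "client_credentials",
                  "scope": "system/Patient.read"},
            auth=(client_id, client_secret),  # HTTP Basic client auth
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

    def read_patient(token, patient_id):
        """Call the FHIR API with the bearer token attached."""
        resp = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={"Authorization": f"Bearer {token}",
                     "Accept": "application/fhir+json"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()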

For organizations working with vendors to address interoperability issues, I recommend the following:

  • Ask the vendor to provide a copy of their penetration test results, and review them to confirm that no open vulnerabilities remain.
  • Ensure that the vendor used a third party to conduct security testing, including penetration testing.
  • Ensure that proper SLAs and performance guarantees are documented in the vendor contract.
  • Establish proper liability in case of a data breach.

To continue this conversation, our upcoming webinar, "Technological & Financial Considerations of Interoperability," will expand on the importance of interoperability and the role technology plays, with our expert panelists:

  • Deepak Sadagopan, SVP, Value Based Care & Population Health Informatics, Providence
  • Patrick McGill, MD, Chief Analytics Officer, Community Health Network
  • Bhaskar Chowdhury, MD, AVP of IT, Geisinger Insurance Operations, Geisinger Health Plan

Capabilities mentioned here are now carried forward with Tegria.