What We Are Getting Wrong About Interoperability
Interoperability has become one of the most widely used and widely misunderstood concepts in healthcare and clinical research.
It is often positioned as the solution to the industry’s most persistent challenges – fragmented systems, inefficient trials, disconnected data. The assumption seems simple: if we can get systems to talk to each other, everything else will follow.
But the conversations at the Vulcan Futures Roundtable at PHUSE in Austin, Texas suggested something more uncomfortable.
We may be solving the wrong problem.
Progress Without Impact
There is no doubt that meaningful progress has been made. Data flows more freely than ever before. Standards such as FHIR have enabled new forms of connectivity, allowing information to move between systems in ways that would have been unthinkable even a few years ago.
Darren Weston of J&J described how lab results now appear on a patient’s phone before the attending physician has even entered the room, streamed directly via FHIR from lab to Apple Health, pulling data from multiple vendors seamlessly and invisibly. No one needed to know the underlying standard was involved. It just worked.
And yet, despite this progress, the real-world impact remains limited.
“The interoperability is not the issue. The challenging part is the content.”
This distinction is critical. The industry has improved its ability to move data. It has not yet solved how to make that data consistently meaningful.
Information arrives in different formats, with different levels of completeness, and often without the context needed to interpret it properly. The same data point can mean different things depending on where it originated. A height recorded in one system may be in centimetres; in another, inches. An adverse event in a clinical trial setting must be captured with rigour, but in routine care it may only appear as a note in the patient record, if it appears at all. In clinical research, where precision is essential, this lack of consistency becomes a fundamental barrier.
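The height example above is easy to state but illustrates the content problem concretely. As a minimal sketch, the snippet below normalises two hypothetical height records to centimetres; the field names follow the shape of a FHIR `Observation.valueQuantity`, and the unit codes are UCUM (`cm` and `[in_i]` for inches), but the records themselves and the `height_in_cm` helper are illustrative assumptions, not part of any real system.

```python
# Two systems report the same height observation with different units.
# The dict shape mirrors FHIR's Observation.valueQuantity; the records
# and helper function are hypothetical.

# UCUM unit codes mapped to a conversion factor into centimetres.
CM_PER_UNIT = {"cm": 1.0, "[in_i]": 2.54}

def height_in_cm(observation: dict) -> float:
    """Normalise a height valueQuantity to centimetres."""
    qty = observation["valueQuantity"]
    try:
        factor = CM_PER_UNIT[qty["code"]]
    except KeyError:
        # Without an agreed unit vocabulary, the value is uninterpretable.
        raise ValueError(f"Unrecognised unit code: {qty['code']!r}")
    return qty["value"] * factor

# The same patient, recorded by two different systems.
ehr_record = {"valueQuantity": {"value": 175.0, "unit": "cm", "code": "cm"}}
trial_record = {"valueQuantity": {"value": 68.9, "unit": "in", "code": "[in_i]"}}

print(height_in_cm(ehr_record))
print(height_in_cm(trial_record))
```

The point is not the arithmetic, which is trivial, but that the conversion is only possible because both records carry a machine-readable unit code. When that context is missing – as it often is in routine-care data – no amount of connectivity can recover it.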
A Paradox of Abundance
What emerges is a paradox. The industry is generating and sharing more data than ever before, yet still struggling to extract reliable value from it.
Part of the issue lies in how the problem has been framed. Interoperability has largely been treated as a technical challenge, one that can be resolved through standards, protocols, and system integration. But the discussion in Austin made clear that the real challenge is broader and more systemic.
Different stakeholders approach interoperability from entirely different vantage points. For a data scientist, it means the ability to combine datasets for cross-institutional analysis. For a clinician, it means accessing the right patient record at the point of care. For a technology provider, it means system compatibility. Each perspective is valid. But without alignment, they do not converge.
“Different people are talking about interoperability, but meaning different things.”
This lack of shared understanding creates a situation where solutions are developed in isolation, each addressing a specific use case without considering the wider ecosystem. Standards themselves can become siloed, adding complexity rather than reducing it. As one participant noted: each standard considers individual use cases, not the bigger picture.
From Movement to Meaning
If there was a consistent thread throughout the discussion, it was the need to shift the conversation.
Instead of focusing solely on how to connect systems, the industry needs to focus on how to align around outcomes. That means thinking beyond data movement and toward data usability, ensuring that information is not only accessible but interpretable, consistent, and actionable.
It also requires acknowledging that interoperability is not an end in itself. It is an enabler. The real goal is better decision-making, faster research, and improved patient outcomes.
Lilliam Rosario of TransCelerate offered a vision of what success might look like – a future where interoperability standards and connectivity operate seamlessly in the background, allowing scientists and clinicians to focus entirely on the problem at hand, rather than on how to access the data they need to address it.
This is where Vulcan’s role becomes significant, not as another technical solution, but as the collaborative environment where these broader challenges can be worked through. By bringing together sponsors, technology providers, and healthcare organisations, Vulcan creates the conditions for alignment: turning fragmented conversations into shared direction.
Because ultimately, interoperability is not just about connecting systems. It is about connecting understanding.
