Jeremy Pierotti is co-founder and CEO of Datica of Minneapolis, MN.
Tell me about yourself and the company.
I’ve been working in healthcare IT for about 20 years. I started off working at Allina Health in Minneapolis and ended up doing consulting. Then I co-founded Sansoro Health. Sansoro and Datica merged in June 2019, with the go-forward company name being Datica. We help healthcare move to the cloud by addressing compliance and data integration challenges.
What can the merged companies do more effectively than they could have done as separate organizations?
We knew that healthcare is moving to the cloud at an accelerating pace. As Travis Good and I started talking, we recognized that we had a complementary set of products, technologies, and team talent, and that if we put the companies together, we could help digital health engineering teams address the two challenges that they have to solve. Those are cloud compliance — which increasingly means meeting the HITRUST CSF requirements — and data integration, being able to exchange data bi-directionally between lots of different digital health applications: electronic health records, but also all the different supporting systems that every health system runs.
How far along is healthcare in its seemingly inevitable move to the cloud?
It is toward the beginning of its journey, and it’s going to move fairly quickly. We’ve read lots of reports that show anywhere from 10% to 20% CAGR over the next five to seven years, and we’re experiencing that ourselves.
Like every other industry, healthcare is recognizing that what the cloud brings is not just running your software on somebody else’s computer in a data center that they manage, but providing access to a whole new set of tools for data analytics, supporting mobility, and integrating lots of types of data from lots of sources. You just can’t develop software with those features using an on-premises architecture. You are increasingly seeing large companies develop their new applications on a public cloud framework because it gives them the flexibility and the power of the toolset to leverage the capabilities of engineers and development teams all across the world.
How does the work of Cerner, Epic, and Meditech fit into a strategy of making their data available for use by cloud-based services?
They are moving deliberately and cautiously, understanding that they can’t make dramatic changes overnight. Their customers are big, complicated provider organizations for whom stability is enormously important. They are all looking for the right balance of making new capabilities available and taking advantage of cloud functionality that will give customers the features that they want, while at the same time, keeping their core systems stable. That means something a little bit different to each of those vendors, but they are all trying to find and strike that balance.
Would a move to the cloud change the exclusive relationship between a health system and their primary EHR vendor?
In the short term, I don’t think it changes anything significantly. I certainly don’t think it makes it more exclusive. In the long term, I think it makes it less exclusive.
I was listening to a podcast from Andreessen Horowitz, where Mark Andreessen was talking about how in Silicon Valley, you have this rich ecosystem of API-driven data exchange and whole companies that have been developed just to facilitate the development and management of APIs within industries. What we see in other industries will come to healthcare, too. When you have increasing adoption of cloud-based application development, you end up stitching those pieces together with API-driven data exchange. We’re seeing that same thing in healthcare as you look at the emergence of FHIR and other API toolsets for patient data exchange.
A move to the cloud by Cerner, for example, is not going to tie the hands of Cerner’s clients and make them any more dependent on Cerner. It is just part of the slow, steady move toward health systems being able to choose from a variety of tools and integrate the tools that work for them best.
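To make the API-driven exchange described above concrete, here is a minimal sketch of a FHIR R4 RESTful interaction: building a standard Patient search URL and reading fields from a Patient resource. The server base URL is a hypothetical placeholder, not a real endpoint, and no network call is made.

```python
import json
from urllib.parse import urlencode

# Hypothetical FHIR server base URL -- a placeholder, not a real endpoint.
FHIR_BASE = "https://fhir.example-health.org/r4"

def patient_search_url(family_name, birth_date):
    """Build a standard FHIR R4 Patient search URL (RESTful API style)."""
    params = urlencode({"family": family_name, "birthdate": birth_date})
    return f"{FHIR_BASE}/Patient?{params}"

# A minimal FHIR Patient resource, shaped the way a server might return it.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Smith", "given": ["Jan"]}],
  "birthDate": "1980-04-02"
}
"""

patient = json.loads(patient_json)
print(patient_search_url("Smith", "1980-04-02"))
print(patient["name"][0]["family"])
```

The point of the standard is exactly what the sketch shows: any conformant application can construct the same search and parse the same resource, regardless of which vendor’s system sits behind the endpoint.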
What is creating the demand for cloud-based services?
My colleague and our chief medical officer, Dave Levin — who used to be CMIO at the Cleveland Clinic — says he spends all day working in healthcare and then he goes home to the 21st century. The reality is that it’s consumers. It’s our everyday experience with smartphones, tablets, and advanced software that we run on whatever device we choose and that allows us to move from one device to another almost seamlessly.
Those experiences that we who work in healthcare have every day in every other part of our lives make us realize that we need that same type of functionality when we’re delivering healthcare services to patients. When we’re managing populations of patients or health plan members, we need those same capabilities, those same toolsets.
To take the simplest of examples, if I’ve been to the same doctor’s office six times in the last year, there’s no reason I should have to fill out the same piece of paper on the same clipboard the seventh time. When I walk into all sorts of other businesses, they know who I am. They have read my license plate or I’ve agreed to have my smartphone notify them when I walk in, so they know from the beacon at the front door that I’ve arrived and they’re ready for me. It’s those kinds of experiences — the scalability, the mobility — that are driving healthcare organizations to create software with those same capabilities.
Will it be hard for healthcare IT vendors to move their systems to the cloud?
Vendors are looking to do that module by module. I don’t have deep insight into the Cerner-AWS announcement that came last month, but the way I understand it, Cerner is not saying that all of a sudden they’re going to move all of their clients who use Cerner Millennium onto AWS servers or AWS services. But they will be increasingly developing new software capabilities on the public cloud — on AWS specifically, in Cerner’s case.
Going back to what I said earlier about the need for stability and reliability by providers and payers, but especially providers, our expectation is that you’ll see vendors developing their new software, their new modules, in the public cloud, taking advantage of those capabilities. They will work deliberately over time to figure out what makes sense in terms of potentially migrating their legacy products to a cloud infrastructure. I’m not sure I have any unique or special insight into that, but that’s the trend I’m seeing, health IT companies developing their new stuff in the cloud and migrating their customers who want those new features to that new platform.
Health IT vendors seem obligated to name-drop AI and analytics in their cloud announcements. What kind of learning curve will they and their cloud services vendor encounter as they modernize healthcare applications?
I wish I knew the answer to that. There clearly are some tremendously exciting applications of artificial intelligence and machine learning in healthcare. I’ve also read many pieces recently about the need to approach it carefully. Any time you’re going to train a machine to learn something, you need to make sure that you’re training it in the right way, otherwise you can create more problems than you’re solving.
But the cloud is a big part of that, because there are so many AI and ML services that are available through a public cloud infrastructure. AWS announced Comprehend, their natural language processing service, a couple of years ago. It allows users to train it and it comes at a competitive price point. That’s an example of how cloud service providers and application developers in AI and NLP are looking to leverage the cloud — making those services available, allowing lots and lots and lots of engineers and creators to experiment with those services, test them, and determine what can have a real, positive impact on patient outcomes.
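As a hedged illustration of how a cloud NLP service like the one mentioned above might be invoked, here is a minimal sketch using Amazon Comprehend’s pre-trained entity detection through boto3. It assumes boto3 is installed and AWS credentials are configured; the confidence threshold is an arbitrary illustrative choice, and the function is a sketch, not production code.

```python
def detect_entities(text, language="en"):
    """Sketch: run Amazon Comprehend entity detection on a piece of text.

    Assumes boto3 is installed and AWS credentials are configured.
    Returns the text of entities the service is reasonably confident about.
    """
    import boto3  # imported lazily so the sketch loads without AWS set up

    client = boto3.client("comprehend")
    response = client.detect_entities(Text=text, LanguageCode=language)
    # The 0.8 confidence cutoff is an illustrative choice, not a recommendation.
    return [e["Text"] for e in response["Entities"] if e["Score"] > 0.8]
```

The appeal described in the interview is visible even in a sketch this small: the heavy lifting — trained models, scaling, hosting — lives behind a single managed API call rather than in infrastructure the developer has to stand up.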
Big provider organizations are announcing their own cloud partnerships, such as Mayo Clinic and Google Cloud. How will those organizations work directly with cloud providers?
It speaks to the amount of data that providers are accumulating. They need to find ways to support the efficient storage and analysis of that data so that they can learn from it as quickly as possible and apply that to better operations and better patient care. It’s not surprising to me at all and I think we will see more of it. It’s understandable, because in other industries, you have big players, big companies that are on a daily basis using cloud platforms and the analytics capabilities of cloud platforms to improve their products, their customer service, and their deployment of personnel.
Healthcare has the same needs and the same demands of end users to capture those capabilities without having to invest in standing up a new data center full of physical hosts and a huge team of DevOps engineers, DBAs, and others to manage all of that traditional infrastructure. You’ve got all of that data and you need somewhere to quickly and efficiently store it and analyze it.
What impact do you expect to see from the federal government’s implementation of the interoperability and information blocking provisions of the 21st Century Cures Act?
We’re waiting just like everybody. Our sense is that when the final rule is released, it will raise the floor, but it won’t necessarily raise the ceiling. We are looking to continue to push the ceiling with innovative solutions for integration.
We recognize that even when ONC and CMS release those new rules, it’s likely to be several years before they’re enforced. It’s going to take the vendors time to develop the technology and capabilities that those rules may require. We’re not waiting. We are working every day with health systems and innovative health IT companies to figure out how they can make the most of the data exchange capabilities that exist today.
The bottom line is that we’re eager to see what comes out. Industry discussion of those rules has been robust and the public itself is highly interested in it. Every person has a personal investment in being able to get access to and make portable their health information. So it’s fascinating, but we recognize that it will be years before anything is actually required and implemented. Our goal is to help our customers, providers, and payers take advantage of what they’re capable of right now.
Is the federal government at risk of oversimplifying the interoperability challenge in declaring mission accomplished just because the use of APIs and FHIR has widened?
As I listened to the debate over the last several months, and certainly after the draft rule was released, I was struck by how thoughtful and mature the discussion was across the board on these rules. There is a broad recognition within health IT that if this were easy, we would have solved it.
I’m not saying that it’s challenging mostly from a technology standpoint. It’s challenging mostly because there are lots of competing interests that have to be resolved, and they’re not necessarily easy to resolve. There are ways to do it, and our company and I personally have our own views on how to address some of those challenges. But it’s been a robust, mature discussion about how we balance the interests of different players, always keeping in mind that the goal here is the delivery of better patient care at lower cost and having better outcomes.
Do you have any final thoughts?
My colleagues and I are excited about the pace of innovation in health IT. If we weren’t, we would go find something else to do, since goodness knows the world has plenty of other problems to solve. I look forward to going to work every day because of the opportunity to partner with people who feel emotionally compelled to bring positive change to something that impacts every single person — the delivery of quality healthcare.