Kevin Coppins, MBA is president and CEO of Spirion of St. Petersburg, FL.
Tell me about yourself and the company.
Spirion is headquartered in sunny St. Petersburg, Florida. We serve the data privacy and data security markets. I joined Spirion just over a year ago. Before that, I spent the previous couple of decades working across a variety of tech companies, both in the cybersecurity space as well as in the networking space. I started way back when at Novell.
With every role that I’ve held, I’ve had the opportunity to work with healthcare organizations across the US and around the world. Every company you talk to says they are different, their industry is different, or something is different. Healthcare is the only one that gets to carry that badge and actually mean it, because everybody else is much the same. Healthcare is definitely different.
How much information does the average health system have that they don’t know about or don’t realize is unprotected?
I typically start with a question. How much data do you have? Somebody tries to answer the question, and then they stop and say, we have no idea, because you don’t. Think about how fast data flows in and flows out, how it moves. It gets stored in the cloud, and then it gets replicated in the cloud, and you just don’t know. It’s a fair answer now, and people have gotten more comfortable saying that they don’t know. A few years ago, it was a little bit nerve-racking to acknowledge that you don’t know that.
The next question that you ask is, of the data that you don’t know how much you have, how much of that would be considered sensitive, and how do you define it? That depends on the industry, but healthcare will definitely go to HIPAA. Other industries will go to GLBA or PCI. It depends on where they are regulated, because that’s where their brain thinks. You have to take a step back and say, that might be how regulation defines “sensitive,” but how would your patients define sensitive? How would your clients define sensitive? How would your board define sensitive? People take a step back and say, that’s interesting, we didn’t look at it that way.
First you have to define it. Then the challenge comes to, where does it live? Not just how much do I have, but where could it possibly be? That usually leads down another interesting conversation topic as well.
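The two steps above — define what “sensitive” means, then find where it lives — can be sketched as a simple scan. This is a toy illustration, not Spirion’s product; the regex patterns for US Social Security numbers and email addresses are assumptions standing in for an organization’s own definition of sensitive:

```python
import re
from pathlib import Path

# Hypothetical patterns standing in for an organization's own definition
# of "sensitive" -- HIPAA, GLBA, or PCI scopes would each define this differently.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_for_sensitive(root: str) -> dict:
    """Walk a directory tree and count pattern matches per file --
    a toy-scale answer to 'how much do I have?' and 'where does it live?'"""
    findings: dict = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for label, pattern in SENSITIVE_PATTERNS.items():
            hits = len(pattern.findall(text))
            if hits:
                findings.setdefault(str(path), {})[label] = hits
    return findings
```

A real discovery tool would also look inside databases, cloud stores, and binary formats, but the shape of the problem is the same: patterns first, locations second.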
Is healthcare the worst of two worlds, where you have the legally defined protected health information, but you also have the business data of a health system that could be a multi-billion dollar enterprise?
Privacy is an overused term these days, but when you think about privacy, it’s fluid. How privacy is defined for you might be different than how it’s defined for me. It might be different how it’s defined for a provider versus an insurer. How that data is used or misused can also then help define what privacy means.
While regulations have tried to go ahead and put a fork in it, healthcare data back in February is different than it is today. I didn’t really care if the world knew what my body temperature was in February, but now, you could have a bias against me for having a temperature that’s not within the range that you’d expect. Or if you were to find out that somebody living in my townhouse complex was diagnosed with COVID, maybe then I’m not allowed to go to work the next day. A lot of information that’s associated — a combination of that PHI, but also proximity and demographics, et cetera — can be leveraged to help during a pandemic, but can also be leveraged after that to start doing some things that people might not be as comfortable with.
What is the biggest driver that might take a health system from going beyond being minimally compliant with HIPAA to having some enthusiasm about implementing tools and systems to protect data beyond what is legally mandated?
Every board across the US is waking up saying, how can I spend more money on something that doesn’t add direct value to what I do? [laughs] That’s the challenge of privacy and security. CISOs deal with that challenge all the time. Vendors like us walk around and say, “If you don’t do it, you’ll be fined, flogged, and frozen and all these bad things will happen to you.”
Until organizations start making it personal, it doesn’t usually get traction. By personal, I mean recognizing that the data that you’re protecting isn’t some amorphous blob of sensitive data sitting in an Azure cloud store. It’s information about your neighbors. It’s information about your community. A lot of healthcare providers are community centric. Something happens to that data and it impacts the entire community, which includes your kids’ teachers and your own relatives.
A good example is that once your child who is under 10 years old has had their Social Security number compromised and used to get a credit card, they begin their financial life in the hole. Then it starts becoming a little bit more real. Privacy can be breached in so many more ways than the identity theft most people think of, and the impact can be major when you start being impersonated by people online, et cetera, et cetera. Or you start getting discriminated against.
One example that I heard that is relevant today is that we’re supposedly getting closer to this vaccine. Let’s say the vaccine is rationed, and you have to meet a certain set of criteria in order to be at the front of the line for the vaccine. It would be pretty easy to figure out what those criteria are, mine for those criteria, and then sell identities that meet those criteria so people can go buy it and be first in line. Then when you go to get your vaccine, somebody says, “Nope, you’ve already gotten it.” Wait a minute, no I haven’t. That’s when it starts hitting home.
It’s really making it personal and shifting that gear to say, this isn’t just a nice thing to do. It isn’t just a regulatory thing to do. It’s a critical thing to do. That’s when organizations start to shift.
Are hospitals thinking about security differently after the recent surge of ransomware attacks?
Yes, for sure. One of the first things they are asking themselves is, do I have a secure copy of my data, so that if I am ransomed and they want to shut me down, I can rebuild? The second piece is, how much data do I really need? How much of that is critical to my operations, and how much is non-critical? They are starting to think about data in a different way, because ransomware is either about shutting it down and saying, I’m not going to turn you back on until you give me something, or they will actually sell off your data. I’ve got all your sensitive data, and I’m going to release it if you don’t do something. The idea that data can actually hold you hostage is a new concept for boards to think about. That has started putting a different value on that data.
The unfortunate impact of that is people are paying a lot more attention for the wrong reasons, versus waking up and saying, we should do this because it’s the right thing. People who start solving for the privacy problem because it’s the right thing to do typically don’t have the ransomware and breach issues. They have solved it organically and culturally within the company versus as a by-product of something they think they are supposed to do because their regulator said so.
How does a health system reduce the risk that is associated with the data they discover?
The first thing is to reduce the number of copies of that data to the absolute minimum you need, and then make sure that it can’t replicate itself. With cloud stores today, if you are looking at your laptop right now, it’s probably syncing to a OneDrive, Google Drive, or Dropbox. When you save something, it will save in three other spots. Getting a handle on what sensitive data is, how that data can move, and how that data can be stored will be a big step in the right direction to solving the problem. We talk about reducing the threat surface of sensitive data, and you do that by understanding where it is and how much you have. You can only do that once you define what it means to your organization.
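The replication problem described above — the same sensitive file landing in several sync folders and backups — can be sketched with a content-hash pass. This is a minimal illustration of the idea, not any vendor’s method: identical bytes hash to the same digest, so files appearing in more than one place surface as candidates for pruning.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicate_copies(root: str) -> dict:
    """Group files under `root` by SHA-256 content hash so replicated
    copies of the same file (sync folders, backups, ad hoc exports)
    can be spotted and reduced to the minimum needed."""
    by_digest = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_digest[digest].append(str(path))
    # Keep only content that lives in more than one location.
    return {d: paths for d, paths in by_digest.items() if len(paths) > 1}
```

Fewer distinct locations for the same sensitive content is exactly the threat-surface reduction the answer describes.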
Healthcare is fairly new to the cloud and we’ve seen some inadvertent exposures because of incorrect cloud configuration. Is that situation commonly or easily detected?
A cartoon shows the son saying to his father, what are clouds made of, Daddy? And he says, mostly Linux servers, son. [laughs] It’s an abstracted version of storage, of a place to store stuff. The challenge is that people don’t recognize that where they are storing it is completely unsecured and it’s completely open.
Being able to say, wait a minute, this is sensitive data is step one. Step two is, how secure is it? Well, it’s sitting on a server that is wide open to the entire universe. OK, that’s a problem. How active is it? Nobody has actually accessed it since the Reagan administration, so we are OK. Actually no, there have been 10,000 hits on it from foreign countries in the last 15 minutes, so it’s a problem.
It’s not just a matter of knowing that it’s sensitive data, it’s knowing the level of access to the sensitive data and the level of activity around it. You combine those three things together to create a pretty good heat map that would say, I need to shut this down or I have a challenge or issue here. If I can reduce the threat surface and I have fewer locations where sensitive data lives, it gets a heck of a lot easier to manage it.
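The three dimensions just described — how sensitive the data is, how open the access to it is, and how much activity surrounds it — can be combined into a simple ranking. This is a hedged sketch of the heat-map idea, with made-up store names and an assumed 0–3 scale for each dimension; a real product would score these continuously from telemetry:

```python
from dataclasses import dataclass

@dataclass
class DataStore:
    name: str
    sensitivity: int  # 0-3: how sensitive the contents are
    exposure: int     # 0-3: how open the access controls are
    activity: int     # 0-3: how much recent access it sees

def risk_score(store: DataStore) -> int:
    """Multiply the three dimensions: a wide-open, highly active store of
    sensitive data ranks far above a locked-down or dormant one."""
    return store.sensitivity * store.exposure * store.activity

def heat_map(stores):
    """Rank stores hottest-first, so 'shut this down first' decisions
    fall out of the top of the list."""
    return sorted(stores, key=risk_score, reverse=True)
```

For example, a sensitive server that is wide open and drawing foreign hits in the last 15 minutes would sort to the top; the archive nobody has touched since the Reagan administration sorts to the bottom, though it still merits a look at its exposure.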
We had less impact than I expected from GDPR, which could have changed how we think about storing, securing, and using data, especially consumer data. Will we see further effects from GDPR or other legislation?
You see it in California already for sure with CCPA and CPRA. You have the New York Shield Act and 32 other states that are actively debating privacy legislation. With the election behind us now, there’s definitely privacy legislation that’s at a Congressional level as well. So you absolutely will, it will continue to shift. Even CCPA has changed three times since it went into effect last year. It will continue to shift and morph because privacy is fluid.
The wrong lens to look through is, how big have the fines been for GDPR? Well, there have been some massive ones. How many have been collected, and how many have made it through the courts? We’re waiting to go ahead and see.
You have to take a step back and say, what’s the right level of stuff to do from a privacy standpoint? If you show that you are trying to proactively get ahead of the problem, then more often than not, you’re going to be in pretty good stead with the regulators. It’s not trying to keep up with the regulations, but more trying to keep up with the culture, and that usually takes a rethinking of how you move and store data. That wake-up call doesn’t typically come until there’s a breach or something bad that could happen to you that you saw happen to the healthcare organization across town.
Are health systems funding and completing projects related to security, privacy, and data protection?
They are absolutely taking it more seriously. We’ve seen an uptick, even during these crazy times, over the last six months in healthcare because they recognize that it’s a journey that they have to start. They don’t have a panacea button that solves all their issues, but they start saying that they have to get the right processes in place and the right underpinning tools in place to start getting ahead of this problem.
Most healthcare organizations didn’t pop up overnight. They have been around for 50, 100, or 150 years. If you think about the technological age, every healthcare organization that I’ve walked into has equipment and systems dating from when the first building was built all the way to when the most recent building was built. They have a little bit of everything, and across that little bit of everything lies a lot of complexity. For a while, the answer was, we’re just going to throw our hands up because this is too hard to get our heads wrapped around. Now it has shifted into, we have to start somewhere, so let’s put a stake in the ground and let’s start pulling the thread through it.
It’s a hard problem, especially in healthcare. Healthcare is different. A lot of it is because there are a lot of legacy systems with a lot of legacy information that’s really, really important, but that weren’t designed to protect data the way it’s expected to be protected today.
How do you see that situation changing over the next 3-5 years?
The concept of data and sensitive data is at the core of both security and privacy. The next thing that goes around that is, what is the definition of sensitive as it pertains to privacy? Then also, what is the definition of identity as it pertains to security? I think that recognition is starting to happen, where people say, it’s not a matter of if I’m going to be breached, it’s a matter of when. The perimeter is not going to hold, so when they get in, what are they going to be able to do, and what are they going to be able to find? That gets back to the data part of the question.
People are starting to move in the right direction. They are starting to say, I need to get a handle on my sensitive data footprint so I know what the threat surface is. Then when I am compromised, I know what has happened or is happening and I can minimize the risk. I think you’ll continue to see over the next 3-5 years more and more efforts with a data-centric look at the overall infrastructure and security. That will spawn privacy. You cannot have privacy without security, but you can have security without privacy. We are already seeing that in how people are talking and thinking about how they are leveraging systems. It’s getting more and more prevalent.
Do you have any final thoughts?
When it comes to security and privacy and all the drama and all the noise that you hear about it and read about it, just boil it down to this — am I doing everything I can today to protect what matters most to the constituents I serve? And what matters most to them is their individuality. Recognizing that you hold the digital versions of those physical selves and treating those digital versions as you would treat the physical one is just as important, so make it personal.