The HIStalk Advisory Panel is a group of hospital CIOs, hospital CMIOs, practicing physicians, and a few vendor executives who have volunteered to provide their thoughts on topical industry issues. I’ll seek their input every month or so on important news developments and also ask the non-vendor members about their recent experience with vendors. E-mail me to suggest an issue for their consideration.
If you work for a hospital or practice, you are welcome to join the panel. I am grateful to the HIStalk Advisory Panel members for their help in making HIStalk better.
The question this time: When you think of potential HIPAA issues, what parts of your health system’s operation give you the most concern? What are your top HIPAA-related priorities?
Our top HIPAA concerns relate to the use of personal devices such as smartphones to transmit pictures and unsecured text. While we can and do provide secure alternatives, there is really nothing we can do to prevent a medical student from snapping a picture of a patient or patient data and sending it to several hundred of his closest friends.
HIPAA is an interesting concept. How do you balance providing sufficient access to critical information that can impact a patient’s health and still protect their privacy? It’s not easy. For many of the children we care for, privacy is not just a regulation to follow, it’s life and death – for children in custody disputes and victims of violence. The most significant challenges we face involve the fact that both the rules and the technology are changing at an ever-increasing pace. The people writing the rules aren’t always the ones with the most knowledge about how (or even whether it’s possible) to implement them.
It’s ironic that we are demanding that healthcare costs go down while simultaneously creating new and unfunded mandates that require enormous amounts of time and money to implement. The two things I worry about most: mobility of devices and data and staying current on vastly complex laws. Small hospitals outside of a larger system are still required to adhere to the same rules and regulations even if they have a fraction of the resources with which to do so.
Top HIPAA-related priorities and concerns for us center on secure communication between our staff and our clients and providers. Ensuring that the proper processes and technologies are used to secure communications via e-mail, instant message, or any other channel is paramount.
When it comes to protecting PHI, my biggest concern is the data that goes to our physicians’ offices for billing. There are many concerns, but how the practices and their billing services treat this data is my greatest. We have no way to audit how this data is used and disposed of. Practice adherence to HIPAA security and privacy is very minimal, as an independent practice has little knowledge of or resources to dedicate to this requirement.
HIPAA security requires complete control of PHI storage. There is so much distributed data acquisition going on that it’s difficult to ensure complete control. Example: digital photos taken in the clinic stored on memory cards. Clinical staff don’t see these cards as containing PHI, but they do. Thieves see the cameras as easy-to-pawn theft targets. When one is stolen, we have a privacy breach on our hands. In retrospect, we learned we lacked procedures to wipe the cards of data once the images are stored in the EHR. These novel data stores continue to pop up and represent control risks.
I lie awake at night thinking about unencrypted laptops. With all the other projects, this one keeps sliding down the priority list. The CFO all but refuses to fund it. We have a policy against keeping PHI on the PC, but I know no one follows this policy.
I’m glad you’re running my comments anonymously because I don’t want to advertise how many potential HIPAA vulnerabilities we have in our organization, ranging from PHI routinely sent via insecure text messages (and the Web-based paging system), workstations visible to the outside world that don’t lock properly, shared common Windows passwords, shared common remote login passwords, EHR printouts that aren’t shredded in a timely manner, etc. I’ll stop now before I trigger a subpoena coming your way.
Mobile device security and BYOD are probably our biggest concerns. We have a number of clinicians using their own devices to communicate and coordinate patient care. We are putting in place a comprehensive mobile device management system that will provide secure communications options. We are also in the process of encrypting laptops and securing USB ports.
General staff knowledge and awareness would be the first thing that comes to mind. We can write policy and implement all the controls we want, but people will find ways to circumvent them if they don’t understand the whys. Our top priorities in the coming year include establishing ongoing staff education, conducting an annual policy review, creating mobile device management strategies, and evaluating data loss prevention solutions.
We do a good job of educating our employees on HIPAA. We don’t see too many concerns with patients. We do get the occasional employee who looks at a relative’s records. Our greater concern is the office staff of independent providers who have access to our patient database by necessity. We rely on the physicians in those offices to provide initial and ongoing HIPAA training, and this breaks down. We also have the issue of employees leaving the physician office without the office informing us to cancel their access. We do a manual audit every 90 days.
There are really four classes of data we are charged with protecting. First, our current data, which may be stored locally or remotely. Second, the data we push out to others (patients, providers, and organizations). Third, the data we receive from others in various formats. Fourth, our archived data, which might be scanned, paper, or legacy digital formats. The diversity of the data itself poses its own challenges.
We often think of securing data as protection from security breaches such as device theft or hackers. Encryption has become the standard in this regard. However, the more common occurrence is end-user error — leaving devices without logging out or the dreaded exposed password. While much of our effort has to be on prevention of the "big event," we must still focus on end-user HIPAA training and routine auditing as the first-line deterrent to loss of PHI.
My biggest technical concerns are with mobile devices. We are pushing quite a bit of data to them in e-mail alone, and even with security policy in place, it is still a huge exposure. While internal threats like staff inappropriately accessing someone’s records may be larger, technical solutions to a threat like that are harder to address. Our privacy officer gets to lose sleep over those.
The inability to control what disgruntled employees can do with sensitive health information. Overly curious individuals snooping on celebrities or people they know are also a problem, but they typically would not compromise the sizable amounts of information that could be breached by someone with a grudge and/or a desire to sell information for money. Carelessness is also a major problem when people working with large data sets or spreadsheets as part of their job leave them on laptops or send them in unencrypted files via e-mail.
The use of workarounds to data security initiatives. The tighter the security lockdown, the greater the impingement on ordinary work and productivity, especially in comparison with what people are used to doing in other realms of life. Rather than helping with data security, the workarounds just seem to make matters a whole lot worse because then people exchange info surreptitiously by cell phone images, Gmail, and the like.
Since I’m not in management, my top priority is making sure that I keep the data of my own patients secure. Another goal is to educate residents and medical students about the importance of patient privacy. I also advocate for more enlightened approaches at a local and national level for protecting confidential information and for giving patients more say in the way their sensitive information is stored and shared with others.
Where to start? My biggest concern is not knowing what I don’t know. Our customers are doing all kinds of things that I can’t control. I’m sure that data is leaking like crazy and we’re doing all we can to contain it. I am hopeful that in the next 60 days we will have a much better understanding of what is occurring and that we will have better control. Our biggest HIPAA priorities are data loss protection and then preparing for the inevitable audits.
With the increasing use of clinical and other data (read: PHI), our concerns are growing around mobility and continued violations of our use policies. We are moving to our second mobile security platform/tool, but even after best efforts, we are not convinced that we are "safe." There will always be threats, and we have to continuously evaluate what those threats are and how to prioritize the work to protect our data.
Our organization has finally realized we are not impervious to breaches or attacks and is supporting new efforts to ensure we are doing what is appropriate to secure the environment. In addition, we are trying to play more "hard ball" with violators of policy on data use and access. I am afraid a few examples will have to occur before the majority of our users realize we are serious about this as an organization.
The biggest HIPAA issue would be a breach affecting more than 500 patients, which triggers a multitude of bad events. We take the approach of "when," not "if," so we are prepared, but we are also implementing technology and procedures to reduce the risk of occurrence. The biggest risk is related to PHI leaving the organization. That can happen in many ways (e.g., mobile devices, mobile media, viruses, and e-mail). We have implemented encryption in these areas to reduce this risk. We also have virus protection and a SIEM tool to monitor network attacks.
Our next effort is implementation of a data loss protection (DLP) tool. This tool maps the location of all PHI in your domain. Strict rules can then be applied to govern the movement of that PHI. Besides encryption, my feeling is that DLP will have the biggest impact in protecting an organization from a breach.
We had two significant reportable breaches last year, but neither was related to the electronic medical record or other electronic systems here. The first was a physician who e-mailed an Excel spreadsheet containing PHI to an external unsecured e-mail server. The other was a resident who took home paper copies of patient records for a lawsuit for which they were gathering potential evidence. In neither case was the patient information actually exposed, but they were reportable breaches nonetheless.
We are in the process of implementing a new clinical platform, so my focus is balancing the new, robust functionality with the safeguards that are needed to protect the information. Not an easy task.
Laptops. No matter what we do or what we say, folks will still copy and paste information and manage to store PHI on their laptops. We lock down the laptop as much as possible, train, and continuously educate and inform, but the laptop is still the weakest link in our chain.
New phones. With new phones and the applications written for them, I believe there is more opportunity to access PHI. If someone can clone a phone just by walking by its owner and picking up their information, what happens when that phone is receiving e-mails, updates, or questions containing PHI? I am not very informed in this area, but very concerned.
Top concerns: access controls within older non-core EHR systems, such as radiology, lab, and custom systems that we have developed. Providing appropriate levels of adolescent confidentiality. Opening access to psychiatric care visit information as much as legally possible.
Top priorities: dealing with the above. Getting lawyers and others to understand that data-sharing across legal entities for ongoing and potential future care is the same as "treatment" and therefore allowed by HIPAA. Physicians who are members of different legal entities but practice together (e.g., in an ACO) often need to use the same EMR database, and keeping two or more separate records in a system for a single patient (which is the lawyers’ idea of how to do this) is just dangerous.
Vulnerabilities that are rooted in human behavior or misbehavior concern me the most: apathy, naiveté, curiosity, theft, and vengeance. Continual education and empowering employees and physicians with scenario techniques for appropriately handling common situations is helpful. Without intending to scare or intimidate people into compliance, we share media stories of fines and prosecutions of healthcare systems that have had security or privacy breaches.
The proliferation of personal devices where clinical information can be accessed (smartphones, tablets). We’re working on how to best encourage provider access / patient engagement while still ensuring appropriate security and privacy.
Many vendors, including our eClinicalWorks vendor, are increasingly utilizing cloud technology. We’re working to be able to make best use of the new products while managing security.
The people. Information technology systems are relatively easy to secure, but people have this aggravating habit of not doing what you tell them or expect them to do. I’m functionally the assistant security officer, although my title doesn’t reflect it. I did about half of the facility education in 2003 for the Privacy Rule implementation and it still amazes me how many people don’t make basic information security and patient privacy a part of their day-to-day existence in healthcare.
In 2003, there were three groups of people: those who lived privacy, those who had heard of privacy but for whom it was an add-on to their daily life, and those who had never heard of privacy or the Privacy Rule. In 10 years, we’ve pretty much stamped out the "never heard of it" problem, but there are a lot of people who still treat patient privacy as something to think about when everything else is done. A text message to a friend here, a social media message to a friend there (even a private one) and you have opened yourself up to serious problems. Somehow we still have to convert those folks over to people whose lives include patient privacy. I’m still working on how.
Not misspelling HIPAA.
The use of HIPAA as a way to make life harder for physicians, such as CIOs and lawyers creating inane password policies or medical record clerks denying access to the results of a study I ordered without a written consent "because of HIPAA."
Stupid mistakes (e.g., having patient info on an unprotected medium that gets stolen). Interestingly, while this may result in embarrassment and financial penalties, it rarely actually compromises a patient’s medical information.
The reality is that HIPAA is simply a mandate of common sense (i.e., only share patient info with someone who should be able to see it for obvious clinical, operations, or payment reasons). Yet ironically, it winds up making people lose their common sense in how they deal with data and potentially hurts the quality of care by denying caregivers access to the data they need.
Downloading PHI to personal laptops or other mobile storage devices that are not encrypted and not secured with a strong password. All of our corporate laptops and portable storage devices (e.g., thumb drives) are encrypted and password protected, but that’s not the case with personal laptops, which inevitably are used by employees for work-related tasks. I’m also constantly concerned about insiders and trusted agents who engage in for-profit identity theft.
In our organization, a chief privacy officer has virtually shut down all research in the name of HIPAA and patient privacy. She has even begun to question the utility of quality improvement efforts and their need to review patient records.
Our health system is most vulnerable with the new culture of real-time information, which means that caregivers are texting, e-mailing, taking photos, etc. as part of the normal practice of patient care. Our EMS and cardiology service line had a great process in place to get information on the patient to the cardiologist prior to arrival by using a smartphone to take a picture of the EKG and text it to the physician. Great idea, but not vetted for patient privacy and security.
It is up to us to stay in front of this new culture and put the appropriate privacy and security measures into place. Our health system is developing its updated security program now and I’m concerned that some of these things are going on without our knowledge or preparation.