The Bon Secours Health System Convenes to Review the SAFER Guides
By Patricia P. Sengstack, DNP
Patient safety – have we fixed that yet? Apparently not. Fifteen years after “To Err is Human” was published, we still see errors leading to adverse events in our healthcare settings.
So let’s rely on health IT to take care of the problem. Hmmmm…. It seems that health IT can actually lead to new types of errors when not configured or implemented well. I liken it to a game of Whack-A-Mole. As a new error arises attributed to health IT, we change the system or a process to make it go away. Then a new one that we hadn’t considered pops up that we have to address: orders are written on the wrong patient, a default value is provided for a medication that is inappropriate for a patient in renal failure, a result from an outside lab is manually transcribed incorrectly into a patient’s electronic record.
As we deal with each issue, we hope to become a learning health system, continuously improving to ensure our patients get the best and safest care possible. In looking for resources to support continued safety improvement efforts, we see tools emerging from our industry experts and researchers.
One such tool, available on ONC’s website, is the set of SAFER Guides: nine self-assessment checklists covering safety-related areas such as patient identification, system-to-system interfaces, CPOE with CDS, and high-priority practices. If you’ve read the recent Sentinel Event Alert (#54) published by The Joint Commission, you know it recommends that organizations develop a proactive, methodical approach to health IT process improvement that includes assessing patient safety risks using tools such as the SAFER Guides.
To do just this, a multi-disciplinary team from across the entire Bon Secours Health System convened to perform a self-assessment and determine areas for health IT safety improvement using the High-Priority Practices SAFER Guide. We wanted to see what this guide was all about and decide if we wanted to move forward with reviewing the other eight guides.
The High-Priority Practices guide consists of 18 evidence-based recommended practices and includes examples of how successful organizations have improved patient safety in each area. A rating scale for each practice allows organizations to identify areas of vulnerability and helps prioritize follow-up activities. The ratings are Fully Implemented in All Areas, Partially Implemented in Some Areas, and Not Implemented.
Since this was the first exposure to the SAFER Guides for almost everyone gathered in the room, our intent was not to create a to-do list with assigned resources for follow-up, but simply to review the guide as a group of stakeholders, understand the guides’ intent and how to use them, and determine next steps. We had about 25 people in the room who represented clinical, IT, informatics, and patient safety roles from across our 14-hospital system.
We started with a discussion of recommended practice #1, “Data and application configurations are backed up and hardware systems are redundant,” then moved on to the next one, and so on. Every single recommended practice generated at least 20 minutes’ worth of discussion – all good. We had only gotten through recommended practice #11 when time ran out.
Not one of the recommended practices was scored as Fully Implemented in All Areas, but some were almost there. Those were the shorter discussions. We found ourselves wishing that the scale had another ranking. If just about everything is “partial” without any differentiation of “partiality,” then it’s hard for an organization to prioritize which partial recommendation to tackle first, second, and third. In other words, if we checked off everything as Partially Implemented, where do we focus?
I believe the group found the guides validating. Never before had they seen the importance of their work laid out in black and white, with references, in one concise checklist. They may have heard that a particular practice was the right thing to do, but having it in this tool provides the necessary focus on things that sometimes get pushed to the back burner in favor of system enhancements that are a bit more sexy and innovative. The list below highlights our self-assessment discussions as well as some of the questions they generated. These will help us focus over the next several months:
- Backup systems are currently adequate. We are in the process of moving some backup systems to a more remote location.
- Every downtime is different. If you’ve survived one downtime, you’ve survived one downtime.
- We need more practice at downtime – decision making, communication, and improvements to downtime forms. If only interfaces are down, should we take the system completely down for all users?
- Where appropriate, we need to ensure we are using SNOMED/LOINC terminologies; this needs assessment. Are there free-text areas that could be coded?
- Some of our naming conventions in radiology are unclear, making order entry problematic and error prone. We need to review and make improvements.
- How much do we police physician use of evidence-based order sets? Do we force their use without exception?
- Pharmacy build team embraces ISMP guidelines.
- How do we get our vendor to help us make improvements using this guideline? They should be at the table with us during the next discussion.
- End-user acceptance testing as well as production validation testing are happening, but we think we can improve. Problems occur when using test patients in production. (Do not assume there are no real patients in the system with the last name “Test.”)
- We strongly recommend using the patient’s picture for identification. If the system allows it, we should implement (and we have started in our inpatient settings).
- Usability of the system can be improved. Some of the language is unclear to the end user and can be misleading during charting. We need more inclusion of end users, at both the vendor and organization level, during design sessions.
- We need to develop a “Top 10 Optimization List” based on our safety review.
- We need a better method to assess end-user proficiency in order to develop effective, ongoing training programs.
At the end of the session, the group wanted to set up times to complete the remainder of the recommended practices in the High-Priority Practices guide and then move on to the Organizational Responsibilities guide. We have the next date scheduled and will continue our review.
At no other time in our organization’s history have we convened solely to discuss health IT safety. This exercise using the SAFER Guides has provided the impetus for valuable discussions that are only the beginning of our journey to improved patient safety.