
Readers Write: A Prescription for Poor Clinician Engagement with Health IT: Stop Communicating and Start Marketing

April 26, 2017 Readers Write 14 Comments

A Prescription for Poor Clinician Engagement with Health IT: Stop Communicating and Start Marketing
By David Butler, MD


David Butler, MD is associate CMIO for the Epic/GO project at NYC Health + Hospitals of New York, NY.

My first lesson in healthcare marketing came in the spring semester of my junior year at Texas A&M University, when I accepted a prestigious internship with a little company called Merck Pharmaceuticals. Believe it or not, I hadn’t even heard of this company, but I soon found out one of the many reasons for their meteoric rise.

That summer, Merck was releasing a new prostate drug. They posed the question to their young crop of interns: where should we market this drug? Field & Stream! Men’s Health! Cigar Aficionado! We shouted rapid-fire.

Wrong, wrong, and wrong again. Our instructor basked in our ignorance for a moment before he uttered the answer: Good Housekeeping. Targeting the significant others of the drug’s target audience was actually the smarter way to go. They were more likely to notice changes in their partner’s behavior and push them to go to the doctor.

Fast-forward 25 years, and healthcare is approaching physicians and nurses with the same non-WIIFM, non-behavioral-economics approaches my intern class suggested.

We spend hundreds of millions of dollars to implement technology for our best and brightest to leverage in caring for patients, yet we continue to let these transformative software changes enter their workflows without rollout efforts that match the investment or the desired results.

Healthcare needs to stop communicating and start marketing new health IT projects and improvements to existing provider-facing solutions. Too many initiatives fail not on the merit of the technology, but because the organization failed to successfully relay the value to the end users.

Here are five ways to help launch a full-fledged marketing campaign to capture your end users’ attention and effectively roll out new technology and important updates to current systems:

Change the mindset.

Health IT project teams need to think of their communication differently. It should not only inform, it should persuade. If you were going to sell something to physicians and get them to actually buy it, how would you change your communication? That question should be asked during the creation of every piece of project collateral. Who is your equivalent of the spouse reading Good Housekeeping in my opening example?

Get docs and nurses to want to do your desired action, or even better in some cases, understand why it would hurt not to do it.

Spotlight the value.

Too often healthcare organizations spend a bunch of R&D resources creating or improving something really cool, and then communicate that in an email with a laundry list of other changes that aren’t as meaningful. If you’ve added technology that will help save lives or otherwise have a profound impact on clinician efficiency, give it the spotlight it deserves.

For example, it used to be a policy at Sutter Health (my former organization) that if a nurse gave a patient insulin, a second nurse had to log in to double-check the dose. The organization finally changed the policy so that the second nurse’s verification was no longer needed. Some genius asked how many nursing clicks, how much time, and how many dollars this would save. We actually took the time to figure it out.

After factoring in the size of the organization and the number of insulin doses given each day, we figured that the policy change resulted in $400,000 in savings of nurses’ time—and that’s the value we marketed. Not only to the nurses, but also to the board. We told the nurses how much of their time we were giving back to them and told the board about the significant cost savings for the organization.
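
The arithmetic behind a number like that is simple enough to sketch. The inputs below are illustrative placeholders rather than Sutter Health’s actual figures, chosen so the result lands near the $400,000 cited above:

```python
# Back-of-the-envelope sketch of the nursing-time calculation described above.
# All inputs are assumed placeholder values, not actual Sutter Health data.
insulin_doses_per_day = 1_000      # doses given across the system each day (assumed)
minutes_per_double_check = 1.5     # second nurse's log-in and verification time (assumed)
nurse_cost_per_hour = 44.00        # fully loaded hourly nursing cost, in dollars (assumed)

hours_returned_per_year = insulin_doses_per_day * minutes_per_double_check * 365 / 60
annual_savings = hours_returned_per_year * nurse_cost_per_hour

print(f"Nursing hours returned per year: {hours_returned_per_year:,.0f}")  # ~9,125 hours
print(f"Estimated annual savings: ${annual_savings:,.0f}")                 # ~$401,500
```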

Once you find the value to spotlight, think about what that value means to different parties and market that ROI.

Devise a catchphrase.

If you want end user attention, you’re going to have to earn it. There are too many competing priorities for a busy physician’s or nurse’s attention. Have some fun and get some eyeballs by devising a catchphrase for your campaign.

For example, when I was helping roll out a secure messaging solution to thousands of physicians, we could have promoted it with “New! Secure Messaging” or even “Pagers to Smartphones” messaging. Instead, we used “Safe Text.” It was fun and catchy—there were plenty of good-natured jokes and buzz around the campaign—and it also tapped into their own motivation to protect PHI. Make your catchphrase not only descriptive, but also memorable. That’s marketing.

Include a call to action.

What do you want your audience—physicians, nurses, or whichever group it may be—to actually do after they’ve read your communication? Good marketing always includes a call to action, or CTA. After you create marketing for the group, ask yourself what the CTA should be. Do you want them to download an app or an update? Submit their feedback? Add an event to their calendar? Always make the CTA big, bold, and if possible, frictionless.

For example, include a link that can automatically add the event to their calendar, or seamlessly forward it to a friend or colleague. You can also think about the tools you already have and how you might get innovative with them to drive follow-through.
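
To make the frictionless point concrete, the sketch below generates a bare-bones calendar invite (.ics) file that most mail and calendar clients can open with a single click. The event details are placeholders invented for illustration:

```python
# Minimal sketch: produce a one-click "add to calendar" attachment (.ics) for a
# rollout event. Every event detail below is a made-up placeholder.
ICS_TEMPLATE = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//YourHealthSystem//RolloutCampaign//EN
BEGIN:VEVENT
UID:safe-text-golive-001@yourhealthsystem.example
DTSTAMP:20170426T120000Z
DTSTART:20170503T170000Z
DTEND:20170503T173000Z
SUMMARY:Safe Text go-live: 30-minute demo
LOCATION:Auditorium B (or join by web conference)
END:VEVENT
END:VCALENDAR
"""

# The iCalendar format expects CRLF line endings, so translate on write.
with open("safe_text_golive.ics", "w", newline="\r\n") as f:
    f.write(ICS_TEMPLATE)  # attach or link this file as the call to action
```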

One prominent health system in the Pacific Northwest used EHR alerts to creatively capture clinician attention at various points in the workflow. Clinicians were greeted by a respected physician leader — their CMO — whose image and quote reminded them to complete certain crucial activities within the EHR. Having his face staring at them alongside that CTA made it much more influential.

Rinse and repeat.

If a company you already like and engage with introduces a new product, they’re going to be marketing that to you on every channel they can: Direct mail, email, TV commercials, social media ads, display ads. Follow a similar approach for internal projects: Emails, flyers, reader boards, table tents in the cafeteria, digital banners on internal websites, announcements at town halls, free tchotchkes—anything you can think of where your end users might see it.

Physicians rarely understood why drug companies would provide free prescription pads, pens, and other items, insisting, “It doesn’t affect my prescribing patterns.” Years of research show that it actually does. So let’s wise up and follow marketing examples from other verticals to keep the messaging in front of them. It may take several exposures for the message to resonate, but you can keep it fresh by switching up the format, colors, and graphics.

Finally, don’t forget to ask for help if you need it. Most healthcare organizations have talented marketing teams that are consumer-facing, but may be willing to help out with internal initiatives. They’re just not always asked.

With these five strategies, you can help your organization’s IT team pivot from announcing new technologies in boring emails to running full-fledged campaigns that truly market the value to doctors and nurses and successfully bring them on board.

Readers Write: Deep Neural Networks: The Black Box That’s Changing Healthcare Decision Support

April 26, 2017 Readers Write 1 Comment

Deep Neural Networks: The Black Box That’s Changing Healthcare Decision Support
By Joe Petro


Joe Petro is SVP of research and development with Nuance Communications.

Don’t look now, but artificial intelligence (AI) is quietly transforming healthcare decision-making. From improving the accuracy and quality of clinical documentation to helping radiologists find the needle in the imaging haystack, AI is freeing clinicians to focus more of their brain cycles on delivering effective patient care. Many experts believe that the application of AI and machine learning to healthcare is reaching a crucial tipping point, thanks to the impact of deep neural networks (DNN).

What is a Neural Network?

Neural networks are designed to work in much the same way the human brain works. An array of simple algorithmic nodes—like the neurons in a human brain—analyze snippets of information and make connections, assembling complex data puzzles to arrive at an answer.

The “deep” part refers to the way deep neural networks are organized in many layers, with the intermediate (or “hidden”) layers focused on identifying elemental pieces (or “features”) of the puzzle and then passing what they have learned to deeper layers in the network to develop a more complete understanding of the input, which ultimately produces a valid answer. For example, a diagnostic image is submitted to the network and the output may be a prioritized worklist and the identification of a possible anomaly.
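
For readers who want to see the mechanics, here is a minimal sketch of input data flowing through successive layers, each one building on the features extracted by the layer before it. The weights are random placeholders rather than anything trained, so the anomaly score at the end is purely illustrative:

```python
# Minimal sketch of a layered ("deep") network: each layer transforms the
# previous layer's output and passes it deeper. Weights are random stand-ins;
# a real network would learn them from training data.
import math
import random

def dense_layer(inputs, n_outputs):
    # One fully connected layer with a sigmoid "neuron" at each node.
    weights = [[random.uniform(-1, 1) for _ in inputs] for _ in range(n_outputs)]
    return [
        1 / (1 + math.exp(-sum(w * x for w, x in zip(row, inputs))))
        for row in weights
    ]

pixels = [random.random() for _ in range(64)]  # stand-in for a small image patch
hidden1 = dense_layer(pixels, 16)              # elemental features (edges, textures)
hidden2 = dense_layer(hidden1, 8)              # higher-level combinations of features
output = dense_layer(hidden2, 1)               # e.g., likelihood of a possible anomaly
print(f"Illustrative anomaly score: {output[0]:.2f}")
```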

Like us humans, the network is not born with any real knowledge of a problem or a solution; it must be trained. Also known as “machine learning,” this is achieved by feeding the network large amounts of input data with known answers, effectively teaching the network how to interpret and understand various inputs or signals. Just like showing your child, “This is a car, this is a truck, this is a horse,” the network needs to be trained to interpret an input and convert it to an output.

For example, training a DNN for medical transcription might involve feeding it billions of lines of spoken narrative. The resulting textual output forms a truth set consisting of spoken words connected with transcribed text. This truth set expands over time as the DNN is subjected to more and more inputs. Over time, errors are corrected and the network’s ability to deliver the correct answer becomes more robust.
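
The correction loop itself can be sketched with a toy example. The single artificial neuron and tiny made-up truth set below stand in for a real DNN and billions of transcription pairs; the point is only to show the guess, correct, repeat cycle:

```python
# Toy sketch of supervised training against a "truth set": the model guesses,
# is corrected when wrong, and gradually converges on the known answers.
truth_set = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # inputs -> known answer
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

for epoch in range(20):
    for inputs, answer in truth_set:
        guess = 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0
        error = answer - guess  # the correction signal
        weights = [w + lr * error * x for w, x in zip(weights, inputs)]
        bias += lr * error

print([1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0
       for inputs, _ in truth_set])  # now matches the known answers: [0, 1, 1, 1]
```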

A key feature of a neural network is that when it gets something wrong, it is corrected. Just like a child, it becomes smarter over time.

The Black Box

Here’s where it gets interesting. Once the DNN has that baseline training and it begins to analyze problems correctly, its neural processes become a kind of black box. The DNN takes over the sophisticated, multi-step intelligence process and figures out how the inputs are connected or related to the outputs. This is a very powerful concept because we may not fully understand exactly how the network is making every little decision to arrive at an output, but we know it is getting it right.

This black box effect frees us from having to contemplate—and generate code for—all the complex intermediate variables and countless analytical steps required to get to a result. Instead, the DNN figures out all intermediate steps within the network, freeing the technologist from having to worry about every single one. And with every new problem we give it, we provide additional truth sets and the neural network gets a little bit smarter as it trains itself, just like a child learning its way in the world.

How smart is smart? One of the biggest challenges with speech recognition is accommodating language and acoustic models, the specific and very individual aspects of the way a person speaks—including accent, dialects, and personal speech anomalies. Traditionally, this has required creating many different language and acoustic models to cover a diverse range of speakers to ensure accurate speech recognition and improve the user experience across a large population of speakers.

When we started using special-purpose neural networks for speech recognition, we discovered something surprising. We didn’t need as many models as before. A single neural network proved robust enough to handle a wider range of speech patterns. The network essentially leveraged what it learned from the massive amounts of speech data we used as a training set to improve its accuracy and understand people across the entire speaker population, reducing the word error rate by nearly 30 percent.
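
Word error rate is itself a simple measurement: the number of word-level substitutions, insertions, and deletions between the recognizer’s output and a reference transcript, divided by the length of the reference. A rough sketch, using a made-up dictation snippet:

```python
# Rough sketch of how a word error rate (WER) figure is computed: word-level
# edit distance between hypothesis and reference, divided by reference length.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words (substitutions + insertions + deletions).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / len(ref)

reference = "patient denies chest pain or shortness of breath"   # what was said
hypothesis = "patient denies chest pain of shortness of breath"  # what was recognized
print(f"WER: {word_error_rate(reference, hypothesis):.1%}")      # 1 error / 8 words = 12.5%
```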

Anecdotally, I’ve heard from people seated across from a physician dictating with such a thick accent at such high speed that they could not comprehend what was said, yet DNN-driven speech recognition technology understood and got it right the first time.

It’s important to note that neural networks are not magic. DNNs require problems that have clear answers. If a team of trained humans agrees with no ambiguity and can repeat that agreement across a large set of inputs, this is the kind of problem that neural nets may help to solve. However, if the truth set has grey areas or ambiguity, the DNN will struggle to produce consistent results. The problems we choose and the availability of strong training data are key to the successful application of this technology.

Putting DNNs to Work in Healthcare

So how are DNNs changing the way healthcare is practiced? Neural networks have been used in advanced speech recognition technology for years, and that’s just the beginning. The potential applications are nearly endless, but let’s look at two: clinical documentation improvement (CDI) and diagnostic image detection.

Clinical documentation includes a wide range of inputs, from speech-generated or typed physician notes to labs, medications, and other patient data. Traditionally, CDI involves domain experts reviewing the documentation to ensure an accurate representation of a patient’s condition and diagnosis. This second set of eyes helps ensure patients receive the appropriate treatment and that conditions are properly coded so the hospital receives appropriate reimbursement. The CDI process requires time and resources and can be disruptive to physicians’ workflow, since the questions coming from CDI specialists are generally asynchronous with the documentation input.

Technology is used to augment the CDI process. Applications exist that capture and digitize CDI processes and domain expertise, creating a CDI knowledge base at the core. This involves processing clinical documentation, applying natural language processing (NLP) technology to extract key facts and evidence, and then running these artifacts through the knowledge base. The output of this complicated process is a context-specific query that fires for the physician in real time as she is entering patient documentation, linking, say, a relevant lab value with key facts and evidence from the case to indicate the possibility of an undocumented infection. This approach to addressing a common documentation gap is a technically arduous and complex processing task.

What if we applied neural networks to change the paradigm? Many institutions have been doing CDI manually for years, and we can leverage not only the existing clinical documentation (the input), but also the queries generated (the output) from those physician notes to create a truth set for training the neural network with a repeatable, deterministic process. The application of neural networks allows us to skip over the complexity of digitizing domain expertise and processing the inputs through a multi-step pipeline. Remember the black box concept? The DNN essentially determines the intermediate steps based on what it learned from the historical truth set. In the end, this helps improve documentation by having AI figure out the missing pieces or connections to advise physicians in real time while they’re still charting.
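
As a loose illustration of that shortcut, the sketch below learns directly from historical pairs of clinical note and the query that CDI specialists actually sent. The notes, queries, and the deliberately simplistic word-overlap scorer are all toy stand-ins for a real DNN and NLP pipeline:

```python
# Toy sketch of learning the note-to-query mapping from a historical truth set,
# rather than hand-building a knowledge base. All data below is invented.
from collections import Counter, defaultdict

historical_truth_set = [
    ("wbc 15.2 febrile on vancomycin",         "possible undocumented infection"),
    ("lactate 4.1 hypotensive fluids started", "consider documenting sepsis"),
    ("hemoglobin 6.9 transfused two units",    "consider documenting acute blood loss anemia"),
    ("temp 38.9 blood cultures drawn",         "possible undocumented infection"),
]

# "Training": count which words historically co-occur with which query.
word_counts = defaultdict(Counter)
for note, query in historical_truth_set:
    for word in note.split():
        word_counts[query][word] += 1

def suggest_query(new_note: str) -> str:
    # Score each known query by word overlap with the new note.
    scores = {q: sum(counts[w] for w in new_note.split()) for q, counts in word_counts.items()}
    return max(scores, key=scores.get)

print(suggest_query("febrile overnight wbc trending up"))  # -> possible undocumented infection
```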

The applications of neural networks are not limited to speech or language processing. DNNs are also changing the game for evaluating visual data, including radiological images. Reading the subtle variations in signal strength associated with identifying an anomaly requires a highly trained eye in a given specialty. With neural networks, we can leverage this deep experience by training the network with thousands of radiological images with known diagnoses. This enables the network to detect the subtle differences between a positive finding and a negative finding. The more images we feed through it, the more experienced and accurate the DNN becomes. This technology will streamline radiologists’ busy workflows and truly amplify their knowledge and productivity.

Augmenting, Not Replacing

While the possibilities for neural networks are incredibly exciting, it’s important to note that they should be viewed as powerful tools for augmenting human expertise rather than replacing it. In the case of diagnostic image detection, for example, a DNN can serve as a first line review of films, helping prioritize them so radiologists focus first on those that are most critical. Or it might serve as an automated second opinion, possibly spotting something that might have been overlooked.

Today, AI in healthcare decision support is still in its infancy. But with the exciting possibilities created by DNNs, that infant is poised to transition from crawling to walking and even running in the foreseeable future. That’s good news for providers and patients alike.

Readers Write: Top Health IT Marketing Trends From #HITMC

April 10, 2017 Readers Write No Comments

Top Health IT Marketing Trends from #HITMC
By John Trader


John Trader is VP of communications at RightPatient in Atlanta.


I had the opportunity to attend the Health IT Marketing & PR Conference in Las Vegas last week, and thought I’d share some of my top health IT marketing takeaways.


Content, Content, Content

Content was certainly king in terms of session topics. What works. What doesn’t work. How to establish a sound content-marketing strategy (even if you’re a small company with a shoestring budget). My biggest takeaway on content is that marketers need to start with the end in mind. Understand what content resonates with the demographic you target by listening first, and then developing a strategy that addresses customer needs and is strategically presented to them as they make their way down the sales funnel.


I enjoyed Sarah Davelaar’s (from The Signal Center for Health Innovation) session, where she outlined the key elements of content strategy. I also enjoyed a panel discussion featuring four physicians who shared their content consumption habits – where they go to find information, what content resonates with them, and what they like versus what they ignore. The million-dollar question for any health IT marketer is: What influences their decision to buy? Most docs said that conferences are a great place for them to discover new products. Those docs on social platforms like Twitter do pay attention to who shares their posts and who interacts with them. Catchy headlines are important, and most of them look for unique perspectives on issues as opposed to content extolling the virtues of a product.

Innovation Versus Value

Conference organizer and Netspective founder Shahid Shah’s opening presentation on day two was excellent (although the amount of information on his slides was a tad overwhelming). There was a lot of discussion at the conference about whether marketers should position themselves as innovators, since nothing we do is truthfully going to "disrupt" healthcare. The truth is, customers care a lot more about value than innovation. One of the best quotes from his presentation was, “Do customers care about what you think is innovation or will they care more about you when you care about what their innovation needs are?” 


Social

Although I didn’t attend any sessions dedicated to social media use or strategy, there were a few that addressed how to navigate the online universe and how to develop and execute effective social media strategies. “Go where your customers are” seemed to be the general takeaway from attendees of those sessions. Don’t chase the latest shiny social platform just for the sake of having a presence. Again, start with an end goal in mind (create leads and eventually sales) and make sure you are measuring your results (how will you be able to tell if your efforts are successful?). There was some discussion on how to effectively measure social to gain a better understanding of what works versus what doesn’t, and a lot of chatter about moving beyond brand awareness and into how social efforts are creating leads and sales.

Leveraging the Customer

A recurring theme was how to leverage existing customers to create new business. Kathy Sucich of Dimensional Insight delivered an excellent presentation, where she provided a case study on how she increased her own company’s “share of voice” (a term that was new to me), and gave sound advice on how to successfully leverage customers to create new content and increase brand visibility and messaging. The key takeaway for me here was that capturing and then bringing the customer’s voice to your messaging requires personal relationships with customers. You simply must spend the time to cultivate these relationships by establishing a set of expectations at the outset of the relationship that outlines your plan to work with your customers and get their story in front of others.


Video

There was a lot of buzz about creating more video as part of an effective marketing strategy. It continues to be a hot topic because it’s clear that people want to consume more of it. The key is making it resonate: keep it simple, short, and focused on addressing a problem instead of extolling the virtues of a product. Christine Slocumb’s (of Clarity Quest Marketing) session was excellent in reiterating the point that in this day and age, videos have to be personalized to be effective.

SEO Isn’t Dead

Kristine Schachinger of The Vetters Agency presented an excellent session covering modern SEO practices, soup to nuts. We talked about ways to analyze SEO performance, online SEO resources, ranking factors, inbound link tactics, do’s and don’ts for SEO, how to add Google Search Console to your site, how content affects SEO, and keyword research – just to name a few topics. There was a great deal of interaction between the presenter and the audience, and directly between audience members, which, in my opinion, is what makes this conference excellent. Questions were asked and topics brought up that were a great supplement to Kristine’s curriculum. This is perhaps what I like best about HITMC. It has a more intimate setting than most conferences I attend.

About That Other Conference

The buzz around the conference seemed to be the forthcoming HIMSS marketing conference (which, by the way, I don’t anticipate will be able to offer the intimate setting I mentioned above). Many said they heard through the rumor mill that attending it in lieu of supporting HITMC’s more grassroots efforts may be frowned upon by the marketing community. I talked to several people who have already signed up for the HIMSS event but seem to be keeping that information to themselves. The other buzz concerned the quality of HITMC – most people agree that it’s an excellent conference and gets better each year by addressing the topics most relevant to marketers.

The only drawbacks I found, aside from freezing temps in the conference rooms, were that the few tough questions I asked during Q&As weren’t answered as thoroughly as I would have liked and that there was a lack of substantial, real-world case studies to back up presenter assertions. Overall, I think the conference was a great investment. It’s always helpful for me to be around likeminded professionals eager to gain insight and tips on how we can do our jobs more effectively.

Readers Write: In a Fog About the Cloud?

March 27, 2017 Readers Write No Comments

In a Fog About the Cloud?
By Alan Dash


Alan Dash is senior advisor with Impact Advisors of Naperville, IL.

The use of a cloud to symbolize some magical spot where all the answers to the world’s questions are housed and an infinite amount of storage exists has been around since the 1970s. I reflect on my own career: while programming for the US Air Force in the early 1980s, I drew clouds in my diagrams to show that somewhere, out there, beneath the pale moonlight, someone was thinking of me and filling the void symbolized by my cloud with meaningful data.

Not exactly how Linda Ronstadt and James Ingram sang it, but that was my visual. Back then we called it what it was – centralized computing; output devices received data from centrally-located applications.

Then came PCs, placing those applications out onto the edge of the computing environment and away from the monster in the data center that threw off a dragon’s amount of heat and occasionally an equal amount of fire and brimstone. We called that de-centralized computing; everyone was free to process at their desk.

PCs became smaller, applications bigger. Soon we needed gigabytes of storage to hold the very applications that were to be fed with an obese amount of data. Ultimately PCs couldn’t handle the power and space needed, so centralized computing came back, only this time we called it “The Cloud” and it was good – good because we learned new acronyms like SaaS, DaaS, IaaS, NaaS, and RaaS.

So now that we understand the Cloud, kinda, manufacturers have introduced something new — very small sensors that can communicate and intercommunicate in such a way as to justify a new name: The Internet of Things (IoT), or The Internet of Everything (IoE).

Ostensibly, these little sensors and devices communicate with the Cloud in a two-way format, providing data and receiving instruction. Examples of IoT devices include sensors designed to control lights and blinds, HVAC systems and appliances, and security and energy efficiency systems. More recent additions include wearable medical technologies, wildlife movement monitoring, urban infrastructure monitoring (roads and bridges), and even intelligent collision-avoidance sensors in automobiles (both with a driver and without).

Back to the Cloud. Servers located remotely (in the Cloud) can, and do, communicate with IoT devices out on the edge of the network; centralized computing works for IoT devices. However, propagation delay (another ‘old’ term) has become a serious factor. Propagation delay is the length of time it takes to get a signal from a sender to a receiver and back. Under normal circumstances, while we are impacted by this delay, we don’t really experience it because of our reference point.

Here’s an example. You call a friend you are meeting at a restaurant and ask where they are, and then you see them walking around the corner. You see their mouth move, then you hear their voice in your phone. We always have this delay, but our reference point is such that we do not realize it, so it does not bother us.

Not so for IoT devices. These devices need to communicate and intercommunicate with other IoT devices instantly, and the process of these devices speaking to each other through the Cloud, while technically possible, adds way too much propagation delay to the mix. They become ineffective.
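
A back-of-the-envelope comparison makes the point. The distances, processing times, and fiber speed used below are assumptions chosen purely for illustration:

```python
# Toy illustration of why round-trip (propagation) delay matters for IoT:
# a device-to-device decision routed through a distant cloud pays the full
# round trip, while the same decision made at the edge (the "fog") does not.
SPEED_IN_FIBER_KM_PER_MS = 200  # light in fiber travels roughly 200 km per millisecond

def round_trip_ms(distance_km: float, processing_ms: float) -> float:
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS + processing_ms

cloud_path = round_trip_ms(distance_km=2_500, processing_ms=20)  # distant data center (assumed)
fog_path = round_trip_ms(distance_km=0.1, processing_ms=2)       # gateway down the hall (assumed)

print(f"Via the cloud: {cloud_path:.1f} ms round trip")  # ~45 ms
print(f"Via the fog:   {fog_path:.1f} ms round trip")    # ~2 ms
```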

This brings a new (old) concept back into play – de-centralized computing. Ahh, remember that? But we can’t call it de-centralized computing because it’s an old term that we were told does not work any longer, so for IoT to IoT device communication a new name had to be created. That name is … The Fog.

And yes, it makes sense. A fog is a cloud at ground level. A billion droplets of water vapor floating around at a low level, not relying on the cloud for existence. And that’s what the idea of intercommunicating IoT devices is. A billion little sensors bouncing around, intercommunicating, and not relying on the Cloud to perform that communication.

In healthcare, IoT is already here and located within wearable technologies monitoring biometric data, in the RFID systems used to track supplies and locate staff, and in mechanical controls for building automation. For hospitals, growth of wearable tech will be seen as the next step, and this growth will be the first impact on architecture from IoT.

Already we are seeing program space being set aside by hospitals to blend clinical engineering, clinical care providers, and IT departments who will work together to choose, fit, configure, and remotely monitor patients wearing sensors, smart clothing, even implants and prosthetics that will communicate back into the hospital network.

While large leaps into IoT and Fog Computing won’t be seen in the typical hospital for a few years, forthcoming IoT devices will route alarms from equipment to care providers, warn of fall risks, automate re-supply of equipment and meds, track clinical process flow, mitigate queuing, and heighten the use of autonomous robots for specimen collection, supply delivery, and remote telemedicine visits. Beyond that, as driverless cars make their way into the mainstream, hospital garages and wayfinding systems will ultimately communicate directly with these vehicles, perhaps even routing cars to appropriate entry points based on the current biometric readings of the passengers within.

The possibilities are, well, still foggy.

Readers Write: What Healthcare Can Learn From My Roofer

March 27, 2017 Readers Write 6 Comments

What Healthcare Can Learn From My Roofer
By Phelps Jackson


Phelps Jackson is CEO of Sirono of Berkeley, CA.

I had a leaky roof over my kitchen. In the dry season, it wasn’t a problem, but it was something I needed to take care of. I kept putting off the repairs because I dreaded the hassle of bids, estimates, and surprise expenses.

When the rainy season finally came, I started using my pots more for catching drips than for cooking. I had to do something. I looked online for the highest-rated roofing company in my area, got an estimate for repairs, and gave the go-ahead for the work.

About 30 minutes into the job, I got a call from the roofer. The wood beneath the shingles was ruined. It would add $1,200 to the repairs. When I asked why that cost wasn’t included in the initial estimate, he politely reminded me that he had warned about the possibility of additional costs.

When I asked why the price was so high, what I got was modern, high-quality customer service: on-the-spot pictures of the rotten sheathing, an email with the price breakdown, a follow-up phone call to see if I had any billing questions, and more pictures of progress as the repairs went on. Actual pictures!

In the end, I was comfortable paying the higher cost because I understood the real value of the service. Best of all, he kept me well informed throughout the whole process even though I was 1,000 miles away on a business trip.

So, if a guy standing on top of my house can offer omni-channel customer service and high-level billing support, why can’t a multimillion-dollar hospital with teams of representatives do the same?

That’s exactly what frustrated patients ask themselves every day. They don’t care about the complexity of medical claim processes. They just want to know how much they will owe and why. The reality is that 61 percent of patients find themselves surprised by out-of-pocket expenses because they were never told that pre-service estimates aren’t 100 percent accurate or, more likely, because they didn’t get an estimate in the first place.

In contrast to the customer billing support I was offered, what if three months after the repair I had gotten a roofing bill $1,200 higher than the estimate? I would have assumed that I was being ripped off, disputed the charges, and most likely left negative online reviews so others could avoid a similar experience.

It’s no different when patients receive unanticipated escalated medical bills, which is so often the case. They become suspicious of the additional charges, question their own financial liability, and delay payment or refuse to pay altogether. Even if patients are happy with their medical care and would be willing to accept additional fees, they probably assume that there was an error.

Fifty-seven percent of patients say their medical bills are confusing. Proactive outreach to explain balance changes shows patients that they are valued and respected. It clarifies the quality of the care received, expedites payment, and inspires customer loyalty.

Improving the patient billing experience is a must for every hospital. Utilizing the patient’s preferred methods of communication makes the process easier and far more patient-centered. In healthcare, as in every other industry, consumers want to interact with businesses the way they prefer, whether it is online, email, text, phone, or through the mail.

The ease of online shopping and service-oriented local businesses have raised customer service expectations and the average hospital doesn’t come close. As patient payments become increasingly critical to the revenue cycle, smart health systems will adapt and prosper. Those who don’t—won’t.

Readers Write: Data Security Comparison: Healthcare vs. Retail, Finance, and Government

March 15, 2017 Readers Write No Comments

Data Security Comparison: Healthcare vs. Retail, Finance, and Government
By Robert Lord


Robert Lord is co-founder and CEO of Protenus of Baltimore, MD.

In 2016, the healthcare industry experienced, on average, more than one health data breach per day, and these breaches resulted in 27,314,647 affected patient records. Clearly, criminals are targeting patients’ medical information with great frequency and success.

How has the healthcare industry responded to this continuing epidemic? Data suggests there is still a lot of work for healthcare organizations to do in order to improve the security of their patient data. It’s important to look closely at and analyze how healthcare organizations’ security practices and spending compare to retail, finance, and government — three industries known to have proactively advanced their security posture to protect their sensitive data.

Compared to the retail and finance industries, the state of healthcare data security is sorely lacking. Since 2015, 140 million patient records have been compromised, meaning roughly one in three Americans has had their health data inappropriately accessed. Ransomware attacks hit the healthcare industry especially hard: 88 percent of all ransomware attacks target a healthcare organization.

Criminals are increasingly targeting healthcare because patients’ medical information is incredibly profitable on the black market and it’s more easily accessible when compared to more protected industries, such as finance. Within the finance industry, if a customer’s credit card or bank account number is stolen, that information can simply be changed, rendering it useless to the criminal. Patient data, on the other hand, is a repository of information that can be used to steal an individual’s identity – Social Security numbers, DOB, and addresses.

When combined with sensitive medical information like diagnoses, claims history, and medications, it can create the perfect storm for wreaking havoc in a patient’s life. This kind of information cannot be easily changed, and because of the lagging security in the healthcare industry, this data is incredibly easy to obtain and increasingly vulnerable to criminals’ sophisticated attacks.

There is no question that when compared to other industries, healthcare falls short when it comes to data security. A 2015 survey found that only 31 percent of healthcare organizations used extensive methods of encryption to protect sensitive data and 20 percent used no encryption at all. Another study found that 58 percent of organizations in the financial sector used encryption extensively. These results are concerning because the information healthcare organizations must protect is far more sensitive and potentially damaging than the information retail and finance organizations gather, even though the latter group is more proactive in keeping its information safe.

Retail and financial service organizations have more experience protecting customer data from cyber criminals. This gives them an advantage over healthcare organizations, which are relatively new to the game and whose unique security challenges require specially designed solutions. It’s past time for healthcare organizations to invest substantially in protecting patient data. Sadly, according to KPMG, this has not yet occurred at the necessary scale, as IT security spending in the healthcare industry is just 10 percent of what other industries spend on security.

Incentives exist for healthcare organizations to improve their security posture because the cost of a healthcare breach is significantly higher than in other industries. The average cost per lost or stolen record is $158 across all industries. In the retail sector, the cost is $200 per record lost or stolen. In the financial sector, the cost is $264 per record.

Compare this to the healthcare industry, where the average cost per record lost or stolen is $402, double that of the retail sector. Why are healthcare data breaches so much more expensive? In the aftermath of a breach in a heavily regulated industry like healthcare, the breached organization must conduct a forensics investigation and notify any affected patients. These organizations must also pay any HIPAA fines or penalties incurred because of failure to comply with federal or state regulations. That is in addition to legal fees, lawsuits, lost patient revenue, and, most importantly, long-term damage to the affected organization’s brand reputation.
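
To see how quickly those per-record figures add up, multiply them by a hypothetical breach size. The 50,000-record incident below is an assumption for illustration; the per-record costs are the ones cited above:

```python
# Quick sketch of total breach cost by sector for one hypothetical incident.
cost_per_record = {"all industries": 158, "retail": 200, "finance": 264, "healthcare": 402}
records_breached = 50_000  # assumed size of a mid-sized hospital breach

for sector, cost in cost_per_record.items():
    print(f"{sector:>14}: ${cost * records_breached:,}")
# The same 50,000-record incident costs roughly $20 million in healthcare,
# versus about $10 million in retail.
```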

However, it’s important to note that healthcare is not the only industry to have fallen behind when it comes to data security. The US government has also struggled to institute effective data security practices. A study by SecurityScoreCard examined the security posture of 600 local, state, and federal government organizations and compared them to other industries. The study found that government organizations had some of the lowest security scores, trailing behind the transportation, retail, and healthcare industries. It also found that there were 35 major data breaches among the surveyed organizations from April 2015 to April 2016.

In the summer of 2015, the Office of Personnel Management (OPM) announced that it had suffered a massive data breach. The sensitive information of over 21 million people had been stolen, including fingerprints, Social Security numbers, and sensitive health information. A report from the House Committee on Oversight and Government Reform alleged that poor security practices and inept leadership enabled hackers to steal this enormous amount of sensitive data. OPM immediately began implementing changes aimed at improving its security posture and ensuring that such a massive breach would not happen again. However, one can’t help but consider how much less damage would have been done if OPM had made these changes as a proactive data security measure instead of a reactive one.

While healthcare organizations have had their fair share of data breaches, the OPM breach must serve as a lesson to the industry. Since that incident, the government has prioritized cybersecurity and focused on finding solutions to protect our nation’s sensitive information, data, and assets. Healthcare organizations must follow suit.

Here are five things healthcare organizations can do now to improve their health data security:

  1. Ideally, treat security risk assessments as an ongoing process rather than a once-per-year event, but at the very least ensure they are done annually.
  2. Encrypt data stored in portable devices.
  3. Assess third-party security risks.
  4. Proactively monitor patient data for inappropriate access.
  5. Educate and retrain staff on how to properly handle sensitive data.

Healthcare must make privacy and security top priorities, learning from the past, applying knowledge from other industries, and creating unique solutions specifically designed for the complicated healthcare clinical environment. This will ultimately provide healthcare organizations with the tools to keep sensitive patient information safe, maintain the organization’s brand reputation, and most importantly, increase patient trust.

Readers Write: Beyond the Buzzword: Survey Shows What EHR Optimization Means to Providers

March 15, 2017 Readers Write 3 Comments

Beyond the Buzzword: Survey Shows What EHR Optimization Means to Providers
By David Lareau


David Lareau is CEO of Medicomp Systems of Chantilly, VA.

I was intrigued by this recent KPMG CIO survey that found “EMR system optimization” was currently the top investment priority for CIOs. The survey, which was based on the responses of 112 CHIME members, revealed that over the next three years, 38 percent of the CIOs plan to spend the majority of their capital investment on EHR/EMR optimization efforts.

The key word here is “optimization,” since over 95 percent of hospitals already have an EHR/EMR, according to the Office of the National Coordinator (ONC). Given the high level of provider dissatisfaction with their EHRs/EMRs, it’s not surprising that CIOs are seeking ways to make their doctors happier with existing solutions, since starting over with a new system would require a major capital investment that few hospitals are willing or able to afford.

In the KPMG report, the authors suggested a few ways CIOs could optimize their EMRs/EHRs, including providing effective user training and making more technology available remotely and via mobile devices.

Coincidentally, at HIMSS this year, we conducted our own survey to get a better understanding of what providers find most frustrating about working in their EHR/EMR. I am the first to admit our survey wasn’t the most scientific – the primary reason that almost 700 people agreed to participate in the survey was because it allowed them to enter our drawing for a vacation cruise – but nevertheless, the results were compelling.

We asked HIMSS attendees the following question: What is most frustrating about working in your EHR? We then offered the following response choices:

  1. Relevant clinical information is hard to find
  2. Documentation takes too long
  3. Doesn’t fit into my existing workflow
  4. Negatively impacts patient encounters
  5. Doesn’t frustrate me
  6. My organization doesn’t use an EHR

A whopping 44 percent selected the response, “Documentation takes too long.” For the sake of comparison, the next-highest response was, “Relevant clinical information is hard to find” (18 percent), followed by, “My organization doesn’t use an EHR” (13 percent).

What I glean from these results – aside from the fact that CIOs would be well served to invest in solutions that improve documentation speed – is that CIOs and other decision makers may not be focused on the right solutions.

I am a big proponent of user training, but let’s be realistic: if you have a propeller-driven airplane, it’s never going to perform like a jet aircraft. CIOs must accept that even with all the training in the world, the documentation process within some legacy EHR systems will never be significantly faster, nor will it be particularly user friendly.

Rather than investing resources in trying to teach users how to make more efficient use of an inefficient system, why not consider investing in a solution that can easily be plugged into legacy systems and give clinicians the fast documentation tools they desire? CIOs can find technologies that work in conjunction with existing EHRs to alleviate provider frustration because they work the way doctors think, do not get in their way, and do not slow them down.

The KPMG survey confirms what most of us in healthcare IT have long known: EHRs have not yet achieved their full potential, providers are weary of the inefficiencies, and more resources must be spent to optimize the original investments. As CIOs and other decision-makers consider their next steps, I encourage them to assess what they now have and look for solutions that give clinicians what they want and need at the point of care.

Readers Write: Naked Cybersecurity

March 8, 2017 Readers Write 1 Comment

Naked Cybersecurity
By John Gomez

John Gomez is CEO of Sensato of Asbury Park, NJ.

Although the observations in this article are based on my direct experiences over the past four years working with healthcare organizations to secure their systems, I am sure that most of what I am going to share is wrong. I also will apologize upfront for presenting a viewpoint that I am sure is one-sided, and although I believe it to be reflective of the reality of cybersecurity in healthcare, it is probably wrong.

I also want to clarify who I hope will read this article, because it is certainly not meant for everyone. If you are of the belief that academic cybersecurity approaches, checkmark mentality, or putting your faith in things like commercial “trusted” security and privacy frameworks or national cybersecurity information sharing groups is a good idea, then this article is not for you. Reading it will be a total waste of your time.

In fact, if you think that what you have been doing in cybersecurity is right and spot on, this article will just annoy you. And yes, you guessed it, it will be a waste of your time.

On the other hand, if you stay up at night freaked out that despite your best efforts you are losing the battle against a well-armed and informed enemy, then brothers and sisters, you probably will find this article of interest. Yet I warn you — this is more about my opinion (as unqualified as that may be) than any academic, certified, highly-trusted approach you may find in the world of healthcare cybersecurity.

For those who are still reading along, let me drop (in the vernacular of our youth) a truth bomb. A truth bomb that I suspect anyone still reading will not find surprising, but is akin to that small child who once said, “But the emperor has no clothes.” The truth I share with you is that we are losing the cybersecurity war and losing badly. 

There, I said it. And yes, it is rather cathartic to be able to state that in public. Try it with me — I promise you will feel better and empowered. We are losing the cybersecurity war.

Despite our best efforts, despite the belief in fancy risk and security frameworks and the latest hyperbole regarding threat intelligence, advanced defenses, and the latest snake oil being peddled by cybersecurity vendors, we are losing ground by leaps and bounds.

If you ever wanted to know what it felt like to be on the receiving end of General Patton’s surge across Europe, just take a job in the world of healthcare cybersecurity. We have some great, passionate, talented people among our ranks, but regardless of how fast they are pedaling, the attacks are overrunning them and taking ground.

In 2016, per a PWC cybersecurity survey, organizations across industries increased their spending on cybersecurity by 20 percent. Yet despite deploying more frameworks, more technology, employing some cool AI stuff, expanding their staffs, and embracing the best practice of the day, we also learned that there was a 38 percent increase in cybersecurity attacks. The cost to remediate an attack rose by 23 percent over 2015.

Talk about a lousy return on investment. You increase spending by 20 percent, yet your efforts are not even closing the gap. In fact, on a cross-industry basis, we are seeing double-digit negative returns on cybersecurity investments.

Years ago, an experiment was conducted in which a monkey threw darts at a list of stocks. The goal was to see whether randomly selected stocks fared worse or better than those selected by professional, well-trained brokers. If I recall, the monkey’s picks fared better. Sadly, for those of us protecting healthcare organizations from attackers, we are seeing similar results. There is no — not one — strategy or best practice that will definitively prevent attackers from gaining access to your systems.

Speaking of attackers, just how painful has life become for their side of the seesaw? I mean, everyone is spending more money; cybersecurity is now a board-level issue; and per HIPAA, it is required that the CEO be intimate with the protection of patient data as it relates to security and privacy. Certainly all this increase in spending, resources, and attention must be making life so very hard for the cyberattacker.

Well, in 2016, the average cost of a highly sophisticated exploit kit was $1,367, a 44 percent decrease from 2015. Thanks to easy and cheap access to cloud computing (I am looking at you, Microsoft and Amazon), the cost of an attack has dropped 40 percent since 2015. We now have attacker markets that include RaaS (ransomware as a service), EaaS (espionage as a service), and DDoSaaS (distributed denial of service as a service). You can contract for any of these attack services from the comfort of your home recliner. We also have learned that the average length of time to successfully execute a breach is now less than 24 hours, a 72 percent decrease from 2015.

Net-net, attackers are winning and probably chilling out, sharing bottles of wine, nibbling on cheese, and laughing their butts off. Yet for those in the trenches, those who get up day to day fighting the good fight, none of this is new. I suspect that the front-line defenders know all of this, yet don’t have the data or podium to yell out, “The emperor has no clothes.”

Ultimately, I believe we are all united (vendors, defenders, management) in understanding that our current approaches are not working over the long term. I also suspect some will have counterarguments, point out that things aren’t that bad, and claim their solution is foolproof. As someone who works with attackers, I can tell you that you would be foolish to believe that your current approaches can thwart attackers, especially if those approaches date back to 2010, are based on complicated frameworks and tools, and require you to subscribe to checkmark practices.

Here is a final statistical truth bomb that you may find entertaining. About a decade ago, we could detect an attacker in our networks within hours. Over time, time-to-detection has stretched from hours to the current average of 265 days. If the attackers keep evolving, soon it will take over a year on average to detect an attacker, despite our increased spending and advanced defense capabilities.

We can point to advanced persistent threats (even though most attacks are not all that advanced) and the growing complexity of the networks and technology we defend as among the reasons attackers succeed. I am sure there is some truth in all those reasons, but you don’t win wars by explaining why you are losing. You win wars by gearing up, toughening up, and figuring out how to fight better and more effectively than your enemy.

I guess the foundational question this article poses is: is this a lost cause? Should we just wave the white flag and throw up our hands? That is one approach, but I have greater faith in all of you. You who stay awake at night wondering what else you can do to fight the good fight. You who take on your boards, push back against the egotistical physician, and fight to be heard for funding and attention — all to make it a little bit tougher for the attacker. I have tremendous faith in all of you who insist, “Not on my watch.”

I believe there is a lot we can do to turn the tide on the attackers. Right now, we are in a ground war, one that can benefit from technology, but that also requires us to really reconsider our core tactics and principles. One major piece of advice I would give you comes from Luke Cage of Marvel Comics — “…sometimes you have to throw out the science.”

A key approach that should be considered, debated, and tested is simplification. Rather than embrace the false sense of security that complexity may bring, we should focus on tactics that rely on low-tech solutions that work consistently. You should be establishing last lines of defense that are based on securing high-value targets. It is critical that you take an attacker-centric viewpoint and truly understand attacker motivations. Much of this advice comes from my personal experiences in cybersecurity and in training special operations teams to take the fight to the enemy.

Simply stated, you need to embrace an assertive posture toward your cybersecurity. This is not 2010. It is 2017, and we are now dealing with attackers employing 2020 approaches. We have just seen the release of MedJack 3.0, which bypasses antivirus. We are seeing malware that is polymorphic. We are seeing attackers embrace analytics and machine learning. The answer is not a framework that recommends changing your password every 90 days. A signature-based system is not going to keep an attacker out of your network.

We need to stop putting our faith in those solutions and approaches that are complex and increase complexity. Regardless of the technical solution or tactic, your goal should be to embrace simplicity, reduce excuses, and eliminate barriers to security.

Want to practically eliminate phishing attacks? Invest in a solution that adds the word “External:” to the subject line of any e-mail that comes from outside your organization. You would be surprised how this little low-tech investment dramatically drops the success of phishing attacks. Want to reduce the length of time an attacker is in your network? Learn what scares them most and target their fears (if you don’t know that answer, e-mail me). Turn the tables, get practical, fight back.
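
As a loose illustration of how low-tech that control is, here is a minimal sketch of the tagging logic written as a generic mail-filter function. Real deployments would apply this at the mail gateway (for example, a transport rule) rather than in application code, and the domain and message below are placeholders:

```python
# Minimal sketch: prepend "External:" to the subject of any message that does
# not originate from the organization's own domain. Domain is a placeholder.
from email.message import EmailMessage

INTERNAL_DOMAIN = "example-hospital.org"  # stand-in for your own mail domain

def tag_external_mail(msg: EmailMessage) -> EmailMessage:
    sender = str(msg.get("From", ""))
    subject = str(msg.get("Subject", ""))
    if not sender.lower().endswith("@" + INTERNAL_DOMAIN) and not subject.startswith("External:"):
        del msg["Subject"]                       # remove any existing Subject header
        msg["Subject"] = f"External: {subject}"  # and re-add it with the tag
    return msg

# Quick demonstration with a made-up phishing attempt.
msg = EmailMessage()
msg["From"] = "it-helpdesk@phishy-example.com"
msg["Subject"] = "Urgent: reset your password"
print(tag_external_mail(msg)["Subject"])  # -> External: Urgent: reset your password
```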

Practical real-world security doesn’t require huge expense or complicated approaches. The most critical first step is to become like a child. Open your eyes and realize that the emperor which is healthcare cybersecurity is in the buff.

Readers Write: It’s Time to Bring Back the Noise

March 1, 2017 Readers Write 1 Comment

It’s Time to Bring Back the Noise
By Andrew Mellin, MD


A very memorable moment for me at one of the first go-lives for a hospital EHR was when I stood on the unit and realized there was an eerie silence. While the beeps of the monitors and the drone of the overhead pages continued, the buzz of the caregivers talking to each other was gone as everyone was staring intently at a computer monitor.

As an implementation team, we quickly learned we needed to frequently remind the caregivers to keep talking to each other as part of our go-live training for future sites. But years later, it is clear the EHR has fundamentally changed the dynamics of how providers and care teams communicate.

The impact of this dynamic is well recognized. The change in communication patterns, sometimes called the "illusion of communication," is identified as one of the key unintended consequences of implementing an EHR. With today’s EHRs, we now have all the information we need at our fingertips, yet the ability for care teams to collaborate in an ongoing, continuous dialogue is not well supported by the systems’ encounter, inbox, and order-based models.

We still have noisy hospitals, but now we hear the wrong kind of noise: sounds that keep patients awake and beeps emitted from devices in stationary locations that caregivers must respond to, making it difficult to find a real signal that requires action.

It’s time to bring the right kind of noise back to patient care. Not the auditory noise that we hear, but the cognitive buzz that is generated when high-functioning teams are communicating in an effortless, asynchronous manner.

Think of how communication models like iMessage, WhatsApp, and SMS have changed the way we communicate in our personal lives. There’s very low effort required to initiate a simple message. We have the ability to share rich information — such as images, videos, or voice — as well as expressive notifications. We even have an ongoing transcript of the conversation and acknowledgement of message receipt.

Healthcare communications benefit from the same communications models, but require HIPAA compliance, message traceability, integration to other initiators of messages (e.g., the hospital operator), and EHR integration.

The actual messaging app, however, is simply the user window into communications technologies that not only improve care team collaboration, but more importantly, drive improved care team efficacy and patient outcomes.

For example, physicians work in shifts that are largely defined by an on-call schedule. When I worked as a hospitalist on weekends when the staff frequently changed, I needed to find an on-call schedule to determine which specialist would see my patient that day (usually I just asked the nurse or HUC to page a person for me because it was too hard to figure out who was on call.)

To solve this problem, a healthcare communications platform needs to support messaging to a role that resolves to the correct on-call individual. And secure mobile messaging is not only about person-to-person communication; rather, it is a way to notify an individual of any important piece of information about a patient, whether it is generated by a machine or a human.
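
To make the role-resolution idea concrete, here is a minimal sketch in Python. The schedule structure, clinician names, and the print statement standing in for secure delivery are illustrative assumptions, not any vendor's API.

  # Minimal sketch of role-based message resolution: a message addressed to a
  # role ("cardiology on call") is routed to whoever covers that role right now.
  # The schedule data and delivery stand-in are illustrative, not a vendor API.
  from datetime import datetime
  from typing import Optional

  # role -> list of (shift start, shift end, clinician)
  ON_CALL_SCHEDULE = {
      "cardiology": [
          (datetime(2017, 3, 4, 7), datetime(2017, 3, 4, 19), "Dr. Lee"),
          (datetime(2017, 3, 4, 19), datetime(2017, 3, 5, 7), "Dr. Patel"),
      ],
  }

  def resolve_role(role: str, when: datetime) -> Optional[str]:
      """Return the clinician covering the given role at the given time, if any."""
      for start, end, clinician in ON_CALL_SCHEDULE.get(role, []):
          if start <= when < end:
              return clinician
      return None

  def send_to_role(role: str, text: str, when: datetime) -> None:
      recipient = resolve_role(role, when)
      if recipient is None:
          raise LookupError(f"No one is on call for {role} at {when}")
      print(f"Secure message to {recipient}: {text}")  # stand-in for real delivery

  send_to_role("cardiology", "Please see the new consult in bed 12 today.", datetime(2017, 3, 4, 9))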

For example, when a CDS alert in an EHR is triggered to indicate that a patient may be becoming septic, a rapid response team can be automatically and immediately notified. When a device triggers an alarm, instead of a loud beep that has to be interpreted, a specific, detailed message with patient context is sent to the right person’s device with the appropriate sense of urgency.
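
A sketch of that routing step might look like the following. The alert types, target roles, and urgency levels are invented for illustration; a real system would resolve the target role to a person the same way an on-call role is resolved.

  # Minimal sketch: turn a raw device or CDS alert into a targeted notification
  # that carries patient context and urgency, instead of an anonymous beep.
  # Alert types, roles, and the resulting message shape are illustrative.
  ALERT_ROUTING = {
      "possible_sepsis": {"target_role": "rapid_response_team", "urgency": "STAT"},
      "telemetry_vtach": {"target_role": "charge_nurse_4W", "urgency": "STAT"},
      "iv_pump_occlusion": {"target_role": "primary_nurse", "urgency": "routine"},
  }

  def build_notification(alert_type: str, patient: dict) -> dict:
      route = ALERT_ROUTING[alert_type]
      return {
          "to_role": route["target_role"],
          "urgency": route["urgency"],
          "body": f"{alert_type.replace('_', ' ').title()}: "
                  f"{patient['name']}, room {patient['room']}. Please respond.",
      }

  print(build_notification("possible_sepsis", {"name": "A. Smith", "room": "412"}))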

All technologies have limited value unless directly leveraged to improve organizational goals, and communication tools are often an underrepresented element of process improvement initiatives due to the limited modes that exist without a modern communication infrastructure. I’ve seen dramatic operational and clinical improvements achieved when these tools are embraced, such as 30-minute reduction in admission times from the ED and material improvement in HCAHPS scores.

These tools do not eliminate the phone call that is essential in a complex situation or the need to document the care plan in the EHR. Rather, these tools augment the EHR and elevate the quality and cohesiveness of the care team collaboration. The magnitude of the value of healthcare communications is under-appreciated: One large academic medical center sends over 150,000 messages to the caregivers and support staff in their organization every week.

It’s time to give caregivers the communication tools they need to improve patients’ care experiences, outcomes, and care team efficacy while eliminating the auditory noise where care is delivered. And it’s time to bring in the kind of high-value noise in which caregivers rapidly interpret and respond to targeted messages on the go on their mobile devices.

Andrew Mellin, MD, MBA is chief medical officer of Spok of Springfield, VA.

Readers Write: Growing Contingent Workforce Benefits Both Healthcare Organizations and HIT Professionals

March 1, 2017 Readers Write No Comments

Growing Contingent Workforce Benefits Both Healthcare Organizations and HIT Professionals
By Frank Myeroff

image

There’s high growth in temporary workers, contractors, independent consultants, and freelancers within healthcare IT. New technologies, cost factors, and a whole new generation of HIT professionals wanting to work in a gig economy are fueling this growth. Growth of the contingent workforce is only expected to accelerate over the next few years, into 2020.

This dynamic shift to a contingent workforce makes sense for healthcare organizations, and the benefits are well worth it. With a contingent workforce, healthcare organizations boost efficiency, mitigate risk, and derive substantial cost savings in these ways:

  • The rise of managed service providers (MSPs) enables health systems to acquire and manage a contingent workforce. As contingent labor programs continue to grow, these partnerships will be one of the most important workforce solutions a health system can adopt to effectively manage risk and decrease healthcare hiring.
  • The use of vendor management systems (VMS) is a fast way to source and hire contingent labor. These systems make it easy to submit requisitions to multiple staffing suppliers.
  • Outsourced expertise will be able to assist healthcare facilities in meeting the January 2018 EHR system requirements. In addition, they often have the extensive knowledge needed when it comes to medical coding. For example, according to the AMA, 2017 ICD-10-CM changes will include 2,305 new codes, 212 deleted ones, and 553 revised ones.
  • Healthcare organizations can dial up or dial down staffing as needed without having to pay FTE benefits.
  • Providers gain improved visibility and stay in control through the use of structured reporting, governance processes, and dashboards.
  • Internal resources are freed up to focus on higher-priority, clinical-facing initiatives such as workflow optimization.

For HIT professionals, contingent work in the HIT space is attractive since opportunities are plentiful, the remuneration is desirable, and the work is rewarding. In addition, work is becoming more knowledge- and project-based, making healthcare organizations increasingly reliant on specialized HIT skills and expertise. According to Black Book Rankings Healthcare, this reliance will help fuel the growth of the global HIT outsourcing market, which should hit $50.4 billion by 2018.

However, making the change from an employee to a contingent worker takes thought and preparation before just jumping in. Here are a few suggestions:

  • Identify the niche where you have skills and expertise. Know your passion. Also, pinpoint what type of HIT services and advice you can offer that healthcare organizations are willing to pay for.
  • Obtain the required certifications. Getting certified is a surefire way to advance your career in the IT industry. Research IT certification guides to identify which ones you will need in the areas of security, storage, project management, cloud computing, computer forensics, and more.
  • Build your network and brand yourself. It’s important to start building your network once you’ve decided to be a consultant. A strong contact base will help you connect with the resources needed to find work. Also, position yourself as an expert, someone an organization cannot do without. Then combine your professional and social networks to help you spread the word faster.
  • Target your market and location. Determine what type of facility or organization you want to work with, and once decided, think about location. Do you want to work remotely or on site? Are you open to relocation or a commute via airline to and from work?
  • Decide whether to go solo or engage with a consulting and staffing firm. If you have the entrepreneurial spirit and want to approach a specific organization directly for a long-term gig, you might want to go solo. However, if you’re open to both short-term and long-term opportunities in various locations, a consultant staffing firm might be the answer.

The contingent workforce and gig economy will only continue to grow, and with them, opportunity. A consultant or contractor has more freedom than a regular employee to circulate within their professional community and to take on more jobs in more challenging environments. For healthcare facilities, a contingent workforce means acquiring the right HIT skills and expertise without the overhead costs associated with payroll benefits and administration. No doubt, a win-win situation for both.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

Readers Write: Automate Infrastructure to Avoid HIPAA Violations

March 1, 2017 Readers Write No Comments

Automate Infrastructure to Avoid HIPAA Violations
By Stephanie Tayengco

image

Every other week, news of HIPAA violations comes to light, bringing attention to the challenges of maintaining privacy in the ordinary course of doing business and providing care.

Take, for example, a recent HIPAA violation settlement. Illinois-based healthcare system Advocate Health Care agreed to pay a $5.5 million OCR HIPAA settlement in August after it was found that the company failed to conduct an accurate and thorough assessment of the potential risks and vulnerabilities to all of its ePHI. Earlier this summer, The Catholic Health Care Services of the Archdiocese of Philadelphia agreed to pay $650,000 for failing to implement appropriate security measures and address the integrity and availability of ePHI in its systems.

It is unclear in both cases whether infrastructure configurations were directly to blame. However, addressing the infrastructure-related elements of HIPAA and HITECH takes considerable time and effort, time that could be spent on the critical application- and mobile device-level security standards where the vast majority of violations occur. To refocus engineers away from time-consuming infrastructure compliance, the practices of infrastructure automation and continuous compliance are key.

Reduce the chance for human error

The foundation for compliant IT infrastructure is implementing strong standards and having guardrails in place to protect against changes that are inconsistent with those standards at the server, operating system, and application level. This is the next evolution of compliance — building a system that can self-correct errors or malicious changes and maintain continuous compliance.

In a recent survey, IT decision-makers shared that 43 percent of their companies’ cloud applications and infrastructure are automated, highlighting that while companies already recognize the tremendous value of system automation, they can do even more.

The road to automation must begin with an IT-wide perception shift — that manual work introduces risk. Any time an engineer is going into a single piece of hardware to perform a custom change, error is possible and system-wide conformity is threatened. This does not mean replacing engineers with robots. It means tasking engineers with creating the control systems. This is an equally challenging (but far less boring) technical task for engineers, but it creates more value.

Part of this control system will be configuration management at the infrastructure level and automation of application deployment. Equally important is the operational shift to train engineers not to make isolated changes to individual machines, but instead to use the control system in place and implement changes as code. Code can be easily changed and tested in non-production environments. Code can be versioned and rolled back. Software deployment tools provide an audit trail of changes and approvals that can be easily read by auditors.
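
As a hedged illustration of what "changes as code" can look like at its simplest, the sketch below applies one named change uniformly across a small fleet and writes an audit entry for each application. The change name, hostnames, and log format are assumptions for the example; real deployments would use a configuration management or deployment tool rather than hand-rolled scripts.

  # Minimal sketch of "changes as code": a change is expressed as a reviewable
  # function, applied through a single control path, and every application is
  # logged so auditors can see what changed, where, when, and by whom.
  import json
  import time

  AUDIT_LOG = "config_change_audit.jsonl"  # illustrative audit trail location

  def record_change(host: str, change_id: str, author: str) -> None:
      entry = {"ts": time.time(), "host": host, "change": change_id, "author": author}
      with open(AUDIT_LOG, "a") as f:
          f.write(json.dumps(entry) + "\n")

  def apply_change(host: str, change_id: str, author: str, apply_fn) -> None:
      """Apply a named, versioned change to a host and leave an audit trail."""
      apply_fn(host)  # the actual change, tested first in non-production
      record_change(host, change_id, author)

  def enable_disk_encryption(host: str) -> None:
      print(f"[{host}] ensuring volumes are encrypted")  # stand-in for real work

  # The same change applied uniformly across the fleet, never by hand.
  for h in ["app-01", "app-02", "db-01"]:
      apply_change(h, "CHG-2017-042-encrypt-at-rest", "ops-engineer", enable_disk_encryption)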

Invest in transparency

One of the main causes of non-compliance is a lack of transparency, usually in one or both of two key areas:

  • Lack of transparency into where critical data resides
  • Lack of transparency into current state of system configurations (i.e., how/where data is encrypted, who has access to that data, how privileges are maintained, etc.)

Many companies rely on manual processes and spreadsheets to track the configuration of their systems. In a cloud environment that changes frequently, this can be a real headache.

The single biggest change to make today to improve the visibility of data criticality and system configurations is to implement configuration management. Rather than relying on manual documentation after changes are made, configuration management tools allow describing a desired state and then creating and enforcing it across the infrastructure. Ideal configurations are coded in a single place, providing the current state of all systems at any time. This is a huge leap forward, and it applies whether you operate on bare metal or in the public cloud. Making long-term investments in operational transparency can help avoid HIPAA headaches.
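
For example, a desired-state check stripped to its essentials can be as small as the sketch below. The configuration keys, values, and host inventory are illustrative, and a real configuration management tool would also remediate the drift it finds rather than just report it.

  # Minimal sketch of desired-state configuration: the ideal settings are coded
  # once, each host's reported state is compared against them, and any drift is
  # surfaced for remediation. Keys, values, and hosts are illustrative.
  from typing import Dict, List

  DESIRED_STATE = {
      "disk_encryption": "enabled",
      "tls_min_version": "1.2",
      "audit_logging": "enabled",
  }

  def find_drift(host: str, reported: Dict[str, str]) -> List[str]:
      """Return human-readable drift findings for one host."""
      findings = []
      for key, want in DESIRED_STATE.items():
          have = reported.get(key, "missing")
          if have != want:
              findings.append(f"{host}: {key} is '{have}', expected '{want}'")
      return findings

  fleet = {
      "app-01": {"disk_encryption": "enabled", "tls_min_version": "1.2", "audit_logging": "enabled"},
      "db-01": {"disk_encryption": "disabled", "tls_min_version": "1.0", "audit_logging": "enabled"},
  }

  for host, state in fleet.items():
      for finding in find_drift(host, state):
          print(finding)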

Focus on mission-critical apps, not infrastructure

As healthcare companies improve IT operations, they should be focused on developing or delivering great patient-centered applications and services, not infrastructure maintenance and compliance.

Migrating to the cloud is the first step. Migrating to a public cloud platform like Amazon Web Services (AWS) provides the benefits of a government-grade data center facility that has already been audited for HIPAA and HITECH compliance. Signing a BAA with Amazon means that a portion of the physical security standards is taken care of (note: regular assessments are still required). That is a huge reduction in risk and cost burden right off the bat.

In addition, the cost of change is significantly reduced in the cloud. Adding, removing, or changing infrastructure can mean a few days of work, not months. That means systems engineers can focus on improving software delivery and the configuration management system, not on manually configuring hardware.

Just one word of caution. Beware of any cloud vendor or service provider that describes the cloud as “no maintenance.” It is true that cloud systems are more efficient to maintain, but maintenance is still necessary. The IT team will focus more of their time on maintenance tasks that are more critical to the business, like building a new testing ground for an application development team or refining the code deployment process, not on undifferentiated data center tasks.

It is only a matter of time before the industry witnesses its next HIPAA violation. Automating infrastructure can significantly reduce the cost and effort of maintaining infrastructure compliance, and can refocus IT on higher-impact areas such as device security.

As health IT evolves, expect to see these two key technologies, cloud and automation, driving the next wave of efficiencies in health IT.

Stephanie Tayengco is SVP of operations of Logicworks of New York, NY.

Readers Write: The Patient Experience Is Clinical

February 1, 2017 Readers Write No Comments

The Patient Experience Is Clinical
By Mark Crockett, MD

image

As quickly as healthcare began to focus on patient experience, the law of unintended consequences kicked in. While well received as a tool to help improve care, the focus on patient experience unintentionally gave rise to a consumer culture around patient treatment. Today’s value-based care arrangements call for providers to take a fresh look at patient experience.

While patients certainly deserve to be treated with dignity and listened to carefully, the top patient experience expectation is receiving safe, quality care. “Patient experience [is] not about making patients happy over quality,” says James Merlino, MD of the Association for Patient Experience. “It’s about safe care first, high-quality care, and then satisfaction.”

The best way to deliver on this expectation is for providers to view these issues of safety, risk, and compliance as a cohesive whole, thus enabling patients to receive the safe, quality care they expect, in the caring and supportive environment they deserve.

The Beryl Institute defines patient experience as “the sum of all interactions, shaped by an organization’s culture, that influence patient perceptions across the continuum of care.”

That’s a big job. Most providers lack the tools to make that happen. Where to start?

It begins with developing provider/patient and provider/organization relationships that encourage collaboration.

In 2013, a British Medical Journal review of 55 studies found that patient experience is “positively associated with clinical effectiveness and patient safety, and supports the case for the inclusion of patient experience as one of the central pillars of quality in healthcare. Clinicians should resist sidelining patient experience as too subjective or mood-oriented, divorced from the ‘real’ clinical work of measuring safety and effectiveness.”

What the BMJ study revealed, and my own anecdotal evidence bears out, is that if a patient experience is positive, the patient feels empowered and can enter into a therapeutic “alliance” with the provider. Patients are motivated to follow treatment plans and are less likely to withhold information if they don’t feel intimidated—or worse, ignored—by their provider and the hospital where treatment was rendered. This supports swifter diagnoses and improved clinical decision-making and leads to fewer unnecessary referrals or diagnostic tests.

Many hospital CFOs don’t need the BMJ study to know a positive patient experience is a clinical indicator that ties to financial outcomes. As outlined in the chart (Figure 1), patient experience is directly associated with a hospital’s Star Rating and patient outcomes:

image (Figure 1)

Creating a positive patient experience, and better clinical outcomes, begins with an understanding of what patients expect from providers. The primary expectation of any patient is, first and foremost, safety. To the unfamiliar, hospitals are scary places. Patients no doubt have read or heard stories (or watched doctor shows on TV) of medical errors and medication mix-ups or of being treated by an unqualified caregiver. Hospitals and other healthcare settings must communicate clearly that theirs is a safe place where patients can trust their caregivers.

If patients believe they are in a safe, trusted environment, their next expectation is, of course, to get better. To be healed. This requires consistent excellence across a wide variety of performance areas. Finally, patients expect to be treated with courtesy and respect.

How do we establish patient experience as one of the pillars of quality healthcare? Not surprisingly, it’s a judicious combination of technology, effective communication, employee engagement, and physician alignment.

Most patients assume all clinicians are highly qualified and fully credentialed. A robust credentialing platform helps providers deliver on that assumption. Another example of technology impacting patient experience is the ease of electronically submitting information to a Patient Safety Organization (PSO). Participating in a PSO not only provides federal protection under the Patient Safety and Quality Improvement Act (PSQIA), but also enables the organization to share and learn from peers on patient safety initiatives that most certainly impact patient experience.

Effective communication improves not just patient satisfaction, but also physician satisfaction. It boosts patient adherence and compliance and reduces medical errors and malpractice claims. The benefits of a culture that encourages open, honest, and direct communication among patients, providers, and staff go directly to the heart of patient experience.

There is a tremendous benefit to incorporating digital rounding (leveraging mobile technology to gather information in real time during the rounding process) into a health system’s employee engagement strategy to generate information from patient rounding, safety rounding, and leader rounding. There is much to be learned from the voices of providers, patients, and employees.

For example, although nurses and physicians generate an equal number of complaints, nurses are three times more likely to have positive reports as compared to MDs. However, physician complaints have higher severity and fewer resolutions.

Patient feedback gathered through a rounding process identifies critical focus areas including peer review events, compliance events (particularly in infection control), and patient and employee safety issues.

For one healthcare system, more than 50 percent of all peer review cases at its 30 facilities actually began in patient relations. In addition, validation audits from compliance organizations (specifically CMS) often stem from a patient complaint. Another reason to centralize data gathered from the feedback of patients, providers, and employees is to identify patterns that allow organizations to transform risk management from a reactive process to a proactive component of healthcare delivery.

image

Patient experience is clinical. It matters to value-based care and has direct impact on an institution’s long-term financial survival. Organizations that sideline patient experience, or simply meet the minimum standards required, do so at their peril.

Mark Crockett, MD is CEO of Verge Health of Charleston, SC.

Readers Write: No Easy Answers For Scheduling Physician On-Call Coverage

February 1, 2017 Readers Write 1 Comment

No Easy Answers For Scheduling Physician On-Call Coverage
By Suvas Vajracharya, PhD

image

Recent criticism of on-call scheduling practices in the retail sector suggests it may be time for healthcare operations leaders to review how on-call coverage is scheduled for their physician teams.

In recent weeks, the retail sector has experienced close scrutiny for on-call arrangements with their staff. According to Reuters, New York Attorney General Eric Schneiderman and “his counterparts in seven states, including California and Illinois, have sent letters to a number of companies in the last year requesting information about their scheduling practices.” In response, employers like Aeropostale and Walt Disney have begun discontinuing the practice of keeping hourly workers on call for last-minute shift changes to avoid further legal disputes.

In healthcare, on-call coverage is regulated under the Emergency Medical Treatment and Active Labor Act (EMTALA). Most medical institutions choose to pay on-call physicians to ensure appropriate coverage under these rules. According to a 2012 SullivanCotter report, nearly two-thirds of healthcare organizations provided call pay to at least some physicians, up from 54 percent in 2010. However, the EMTALA regulations are excessively vague, and language such as “in a manner that best meets the needs for the hospital’s patients” can be interpreted in ways that leave physicians feeling they’re receiving an unfair deal.

“In the MGMA’s 2013 Medical Directorship and On-Call Compensation Survey, primary care physicians reported a median on-call rate of $100 to $150 per day,” according to an article in Medical Economics.

From the physician perspective, these rates may not fairly balance the sacrifices they are making to provide on-call coverage during their days off — if they are receiving compensation at all. For retail employees, state officials concluded workers can be harmed by “unpredictable” schedules that can increase stress, strain family life, and make it harder to arrange child care or pursue an education. Fundamentally, to be on call as either a retail employee or a physician requires foregoing activities and flexibility with free time.

With physician burnout on the rise, heavy variation in the frequency of calls, and a wide range in the number of physicians participating in call rotations, health leaders should invest proactively in fair on-call strategies to ensure the hospital’s access to physicians and to prevent turnover. How do we fairly compensate a physician for remaining in close proximity to the hospital and being physically and mentally capable of providing direct patient care at a moment’s notice? How do we weigh the difficulty of taking calls on holidays or weekends, or of being on primary call versus backup call?

Providing adequate on-call coverage remains a constant challenge for most healthcare institutions. Making it a program that is seen as fair and respectful of physician staff can be a crucial first step. Using scheduling technology instead of a manual process not only removes the sense that personal bias may be influencing how on-call hours are assigned, but also provides transparency across teams and flexibility for swaps. Scheduling technology with advanced algorithms based on artificial intelligence can also ensure that the on-call schedule enforces work patterns in harmony with the circadian rhythms of physicians who need to work at any hour.
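
As a toy illustration of the kind of fairness logic involved (and not any vendor's algorithm), the sketch below spreads call days evenly and avoids assigning anyone two days in a row. Real schedulers also weigh weekends, holidays, primary versus backup call, and clinician preferences.

  # Toy sketch of fairness-aware on-call assignment: balance total call days per
  # physician and avoid back-to-back assignments. Purely illustrative.
  from datetime import date, timedelta

  physicians = ["Dr. A", "Dr. B", "Dr. C", "Dr. D"]

  def build_schedule(start: date, days: int) -> dict:
      counts = {p: 0 for p in physicians}
      schedule, last_assigned = {}, None
      for i in range(days):
          day = start + timedelta(days=i)
          # Pick the least-loaded physician who was not on call yesterday.
          candidates = [p for p in physicians if p != last_assigned]
          pick = min(candidates, key=lambda p: counts[p])
          schedule[day] = pick
          counts[pick] += 1
          last_assigned = pick
      return schedule

  for day, doc in build_schedule(date(2017, 2, 1), 7).items():
      print(day, doc)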

Healthcare operations leaders should want to follow the lead of companies like Gap, which proactively changed their policies to stay ahead of on-call criticism. Small policy changes can dramatically reduce risk for healthcare operations and improve physicians’ professional satisfaction.

Suvas Vajracharya, PhD is founder and CEO of Lightning Bolt Solutions of South San Francisco, CA.

Readers Write: Future Health Solution

February 1, 2017 Readers Write 5 Comments

Future Health Solution
By Toby Samo, MD

image

Health information technology (HIT) has made significant advances over the last two decades. While adoption is not necessarily a good marker of successful EHR usage, EHR adoption among office-based physicians has gone from about 20 percent to over 80 percent, and more than 95 percent of all non-federal acute care hospitals possess certified health IT. HIT implementation has led to improvements in quality and patient safety.

However, many of the goals of increased HIT implementation have been stymied by social and technical roadblocks. A “one size fits all” approach may help reduce training and configuration costs, but there are many approaches to patient care and unique workflows across specialties and among individual users.

Most EHRs are burdened with three major legacy issues:

  1. Technology. Present EHR systems are mostly built on what would now be considered old technology. Some ambulatory and smaller acute care products have moved to cloud-based architectures, but most are client-server. While hosting instances of a product reduces the technical expertise needed by the client and can lead to better standardization of implementation, it does not necessarily deliver the advantages of a native, cloud-based architecture.
  2. Encounter-based. EHRs have been built on the concept that interactions with patients (or members or clients) are associated with a specific encounter. This functions well for face-to-face visits and for specific events, but is limiting where longitudinal care is required.
  3. User experience. The user experience has for the most part taken a back seat to functionality in HIT software development. A quick look at most HIT systems shows a cluttered interface that does not draw the user’s eye to the areas that need the most attention. Most users access only a small percentage of the functionality present within the system, yet vendors continue to add functionality rather than clean up the interface.

Platforms have revolutionized the way business is conducted in many industries. Numerous examples have made household names out of companies like Airbnb, Uber, Facebook, YouTube, Amazon, and many more. A platform is not just a technology, but also “a new business model that uses technology to connect people, organizations, and resources in an interactive ecosystem.”

There is a need for a HIT platform that would support the multitude of components necessary to move the delivery of HIT into the next generation. The future health solution needs to use contemporary technology that will have the flexibility to adapt to ever-changing requirements and use cases of modern healthcare. Some of the characteristics of the future health solution are:

  • Open. One of the biggest complaints of users and regulators is the closed nature of many HIT systems. The future health solution needs to be built as a platform that is able to share and access not only data, but also workflows and functionality through APIs.
  • Apps and modules. A modular structure will enable components to be reused in different workflows and encourage innovation and specialization.
  • True, cloud-based architecture. Cloud computing delivers high performance, scalability, and accessibility. Upfront costs are reduced or eliminated, and the technical resources needed by the client are minimized. Management, administration, and upgrading of solutions can be centralized and standardized.
  • Multi-platform. Users expect access to workflows on their smartphones and tablets. Any solution must develop primary workflows for the mobile worker and ensure that the user interface supports these devices.
  • Scalable (up and down). To meet the needs of small and large organizations, the future health solution will need to scale to accommodate changes in client volumes.
  • Analytics, reporting, and big data. HIT systems have collected massive amounts of data. The challenge is not just mining that data, but presenting the information in a way that can be quickly absorbed by the individual user.
  • Searchable at the point of use. All the data that is being collected needs to be readily accessible. Universal search capabilities and the ability to filter and sort on the fly will facilitate easy access to information at the point of care.
  • Privacy and security. The core platform will need to be primarily responsible for the security and privacy of the data. The other modules built on the platform will need to comply with the platform’s security and privacy practices, but will not need to manage these issues themselves.
  • Interoperable. The platform needs to adopt present and future data-sharing standards such as FHIR. The open nature of the platform will facilitate access to data (see the sketch after this list).
  • Internationalization and localization. Internationalization ensures that the system is structured in such a way that supports different languages, keyboards, alphabets, and data entry requirements. Localization uses these technical underpinnings to ensure that the cultural and scientific regional differences are addressed to help with implementation and adoption.
  • Workflow engine. Best practices can change and can be affected by national and regional differences. An easy-to-use workflow engine will be a necessity to help make changes to the workflow as needed by the clients.
  • Task management. Every user has tasks that need to be identified, prioritized, and addressed. Therefore, a task management tool that extends beyond a single module or workflow will be needed.
  • Clinical decision support. Increasingly sophisticated decision support must be accommodated, from traditional CDS to artificial intelligence and diagnostic decision support. These capabilities need to be embraced by the platform, allowing external decision support engines to interface easily with the other modules.
  • Adaptable on the fly by the end user. Allowing the end user with proper security to make changes to templates and workflows would help improve adoption.
  • User experience. Probably the most significant barrier to adoption of HIT is the user experience. Other industries are way ahead of healthcare in adopting clean, easy-to-use interfaces. It is vital that a team of user experience experts be integrally involved in the development process, evaluating all user-facing interactions, screens, and workflows and recommending innovative ways for the user to interact with the system and for information to be displayed.
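
To ground the "open" and "interoperable" points above, here is a minimal sketch of what standards-based access can look like: retrieving a Patient resource over a FHIR REST interface. The server base URL and patient ID are illustrative, the exact media type and resource details vary by FHIR version, and a production deployment would add OAuth2 authorization.

  # Minimal sketch of standards-based interoperability: fetch a Patient
  # resource over a FHIR REST interface. Base URL and ID are illustrative;
  # a real integration would also handle authorization and error cases.
  import requests

  FHIR_BASE = "https://fhir.example-platform.org/fhir"  # illustrative endpoint

  def get_patient(patient_id: str) -> dict:
      resp = requests.get(
          f"{FHIR_BASE}/Patient/{patient_id}",
          headers={"Accept": "application/fhir+json"},
          timeout=10,
      )
      resp.raise_for_status()
      return resp.json()

  patient = get_patient("12345")
  print(patient.get("resourceType"), patient.get("id"))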

The HIT industry has hit a wall that is preventing it from developing innovative products that use the newest technology and have an exemplary user experience. A new platform has the potential to support a robust, flexible, and innovative series of products that can adapt to meet the needs of the various healthcare markets globally. Such a project would have to build slowly over time, as does any disruptive technology. The legacy systems and other HIT systems that exist do not have to be excluded, but rather can be integrated into this new platform.

Identifying technology that has privacy, security, data management, and an open structure at its core could lead to the next generation of healthcare management systems. While some of these characteristics are obvious to developers and users alike, it is the sum of the parts that is important. Integrating most if not all of these characteristics into a single model is what can enhance the value of HIT and the delivery of care.

Toby Samo, MD is chief medical officer of Excelicare of Raleigh, NC.

Readers Write: Are You Ready for the Quality Payment Program?

January 18, 2017 Readers Write 7 Comments

Are You Ready for the Quality Payment Program?
By Kory Mertz

image

With the start of the new year, the first performance period for the Quality Payment Program (QPP) is officially under way. The QPP, part of the MACRA legislation, was passed with strong bipartisan support in Congress and sends a clear signal of the federal government’s accelerating effort to move to value-based payments.

QPP creates two new tracks for Eligible Clinicians (ECs), as program participants are called: the Merit-based Incentive Payment System (MIPS) and the Alternative Payment Models Incentive Program.

MIPS

MIPS consolidates and sunsets three programs focused on ambulatory providers: the Physician Quality Reporting Program, the Value-Based Payment Modifier, and the Medicare EHR Incentive Program for eligible professionals. In 2017, ECs can receive a maximum payment adjustment of plus or minus 4 percent based on their performance in four categories. ECs who are new to Medicare or who bill less than $30,000 or see fewer than 100 Medicare beneficiaries during a year will be exempt from MIPS.

image

In response to significant feedback from the provider community, the Centers for Medicare and Medicaid Services (CMS) has simplified the requirements and made 2017 a transition year to help ECs get used to participating in MIPS. Providers have three general approaches they can take:

image

Alternative Payment Models Incentive Program

The second track of QPP is focused on increasing EC participation in Alternative Payment Models (APMs), such as Accountable Care Organizations and bundled payments, by offering a 5 percent bonus and exemption from MIPS for ECs who participate in an Advanced APM and meet certain participation thresholds. In 2017, ECs must have at least 25 percent of their Medicare payments or 20 percent of their Medicare patient panel in a CMS Advanced APM to receive the bonus and MIPS exemption. ECs who meet lower payment or patient thresholds have the option to be exempt from MIPS. CMS maintains the list of qualifying Advanced APMs on its website.
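
As a worked example of those thresholds (using made-up figures), an EC with 30 percent of Medicare payments flowing through an Advanced APM would qualify for the bonus and MIPS exemption:

  # Worked check of the 2017 Advanced APM thresholds described above: at least
  # 25 percent of Medicare payments or 20 percent of Medicare patients through
  # an Advanced APM. The input figures are illustrative.
  def qualifies_for_apm_track(apm_payments, total_payments, apm_patients, total_patients):
      payment_share = apm_payments / total_payments
      patient_share = apm_patients / total_patients
      return payment_share >= 0.25 or patient_share >= 0.20

  # 30 percent of payments through an Advanced APM -> qualifies.
  print(qualifies_for_apm_track(apm_payments=90_000, total_payments=300_000,
                                apm_patients=45, total_patients=260))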

Moving Forward

The overarching framework created in the legislation and initial rulemaking completed by the Obama Administration will continue unchanged in 2017. The Trump Administration will have a chance to put its own twist on the QPP in 2017 by filling in the program implementation details through sub-regulatory guidance (much like CMS has done with the Meaningful Use program) and in 2018 and beyond through rulemaking to establish future program requirements. If Representative Tom Price is confirmed as the Secretary of the Department of Health and Human Services, he may accelerate efforts to reduce provider burden and simplify the QPP.

As providers prepare to participate in the first year of QPP and HIOs prepare to support providers’ success, they should keep the following in mind.

  • While APMs have gained significant attention in recent years, CMS anticipates that the vast majority of providers will participate in MIPS in the early years of the QPP.
  • Providers just beginning to think about the QPP requirements should generate reports to determine which providers are likely to be ECs during the performance period and which will fall under the low-volume exclusion; map out the existing TIN/NPI structure of the organization to help support decision-making around group versus individual reporting; and undertake a scan across the organization to determine existing Advanced APM participation by ECs. If an organization participates in an Advanced APM, a report should be generated for all participating providers to determine whether participants will qualify for a bonus and MIPS exemption under the APM track. (A simple sketch of the low-volume check follows this list.)
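
A first pass at that low-volume report can be as simple as the sketch below, which applies the $30,000 and 100-beneficiary thresholds cited earlier; the provider figures are made up, and a real report would pull billing data from claims or the practice management system.

  # Simple sketch of the low-volume screen: flag providers who likely fall under
  # the MIPS low-volume exclusion (less than $30,000 in Medicare billing or
  # fewer than 100 Medicare beneficiaries, per the 2017 thresholds cited above).
  providers = [
      {"npi": "1111111111", "allowed_charges": 125_000, "beneficiaries": 420},
      {"npi": "2222222222", "allowed_charges": 18_500, "beneficiaries": 240},
      {"npi": "3333333333", "allowed_charges": 64_000, "beneficiaries": 85},
  ]

  def low_volume_excluded(p: dict) -> bool:
      return p["allowed_charges"] < 30_000 or p["beneficiaries"] < 100

  for p in providers:
      status = "excluded (low volume)" if low_volume_excluded(p) else "likely MIPS EC"
      print(p["npi"], status)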

HIOs have the opportunity to position themselves to support providers’ success in QPP. HIOs should ensure they have functionality that aligns with program requirements, including:

  • Implement certified tools to collect and submit electronic quality measures to CMS to support ECs and help them achieve bonus points for the quality performance category.
  • Support ECs’ success with a variety of ACI measures, including HIE (send and receive); view, download, and transmit; and submitting information to public health and clinical data registries. Key considerations in determining which measures to support include the existing exchange environment the HIO operates in, whether certified technology is required to meet the measure, whether the HIO’s technology meets the requirements (i.e., providing machine-readable C-CDAs), and the ability to provide ECs with necessary audit documentation.
  • Support improvement activities. For example, “Ensure that there is bilateral exchange of necessary patient information to guide patient care that could include one or more of the following: Participate in a Health Information Exchange if available; and/or use structured referral notes.” A key consideration for supporting improvement activities is whether the HIO has the ability to provide ECs with necessary audit documentation.

Kory Mertz is senior manager of Audacious Inquiry of Baltimore, MD.

Readers Write: Integrating EHRs and PDMPs: A Trend for 2017

December 21, 2016 Readers Write 1 Comment

Integrating EHRs and PDMPs: A Trend for 2017
By Connie Sinclair, RPh

image

The opioid epidemic will continue to be a big story in 2017 and the statistics get grimmer by the minute. We just learned from the government that more than 33,000 people died from opioid overdoses in 2015, making it the deadliest year ever.

In response, states will continue to enact legislation mandating that prescribers use the Prescription Drug Monitoring Program (PDMP) and encouraging electronic health records (EHRs) to become more interoperable with PDMPs by integrating access into prescriber workflows. For example, Massachusetts and Ohio are subsidizing statewide projects to integrate the state PDMP into the EHR solutions used by providers. PDMP usage has been associated with fewer overdose deaths, and lack of integration into prescriber workflow has been shown to be a barrier to utilization, so we anticipate more states will follow suit.

While PDMP and EHR integration is an important policy goal, making it a reality has been easier said than done. PDMPs are independent, state-run databases of controlled substance prescriptions that have been reported from pharmacy dispensers. They are operational in all states except Missouri. Because PDMP systems have evolved outside the health IT ecosystem, significant barriers to interoperability have resulted. In contrast to electronic prescribing, for example, there is not a standard method to exchange and integrate the prescription drug data available in PDMPs into EHRs.

That is changing. In 2013, the Office of the National Coordinator (ONC) created a pilot initiative to bring together the PDMP and health IT system communities. The goal was to standardize the data format, transport, and security protocols to exchange controlled substance history information between PDMPs and EHRs as well as pharmacy systems. 
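
Conceptually, in-workflow integration means the EHR can ask a gateway for a patient's controlled substance dispensing history and display it at the point of prescribing. The sketch below is purely hypothetical: the endpoint, request parameters, and response fields are invented for illustration and do not represent Appriss's, any state's, or any standard's actual interface.

  # Purely hypothetical sketch of in-workflow PDMP retrieval. The endpoint,
  # parameters, and response shape are invented for illustration only and do
  # not represent any real gateway, state PDMP, or standard.
  import requests

  GATEWAY_URL = "https://pdmp-gateway.example.org/v1/rx-history"  # hypothetical

  def fetch_pdmp_history(first: str, last: str, dob: str, state: str) -> list:
      resp = requests.post(
          GATEWAY_URL,
          json={"first_name": first, "last_name": last, "dob": dob, "state": state},
          timeout=15,
      )
      resp.raise_for_status()
      return resp.json().get("dispensations", [])

  for rx in fetch_pdmp_history("Jane", "Doe", "1975-02-11", "MA"):
      print(rx.get("drug"), rx.get("quantity"), rx.get("fill_date"), rx.get("pharmacy"))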

Those actions are beginning to bear fruit. The pilots have recently concluded, and seven of the 10 participating vendors are now moving PDMP functionality into production, leveraging the pilot’s final implementation guide. Appriss has indicated that many EHRs are indeed integrating with its PDMP gateway.

It is clear that 2017 will see increased legislative movement to require EHRs to integrate with PDMPs and prescriber workflows. The ONC pilots have shown a technical path forward. Now is the time for forward-thinking EHRs to capitalize on that progress and get ahead of the legislative curve. It will create competitive advantage, serve as a tremendous value-add to prescribers, act as a proactive means to improve patient care, and potentially save lives.

Connie Sinclair, RPh is director of the Regulatory Resource Center of Point-of-Care Partners of Coral Springs, FL.

Readers Write: Seven HIT Talent Trends to Watch in 2017

December 21, 2016 Readers Write No Comments

Seven HIT Talent Trends to Watch in 2017
By Frank Myeroff

image

Here are seven talent trends that are shaping the HIT workforce.

  1. C-level title of chief robotics officer rises. Expect more than half of healthcare organizations to have a chief robotics officer (CRO) by 2025. Since healthcare is an industry where robotics and automation play a significant role, the CRO will have a status similar to that of the CIO today within the next few years. The CRO and their team will manage the new set of challenges that comes with Robotics and Intelligent Operational Systems (RIOS). They will determine how to use this technology and how it links to customer-facing activities and, ultimately, to organizational performance.
  2. Talent raids to acquire HIT leaders. Top-tier HIT talent is a core factor in the success of any healthcare organization. Yet there is an insufficient talent pool from which to acquire IT leadership. This labor shortage is causing those on the front lines to poach talent from other healthcare organizations. Right now, the competition for highly qualified and experienced leaders is at an all-time high due to several factors, including underinvestment in leadership development and tighter operating margins that influence workforce strategies.
  3. Videoconferencing for telehealth grows in popularity and jobs. While not exactly new, videoconferencing is gaining popularity in healthcare due to the advances in HIT infrastructure and communication as well as the need to serve the aging population and those residing in remote areas. Healthcare practitioners are increasingly adopting these interactive video applications to offer better access to healthcare as well as deliver improved patient care at reduced prices. Additionally, patients are finding benefits to using this real-time, two-way interaction since it enables healthcare providers to extend their reach of patient monitoring, consultation, and counseling. The most popular HIT professionals sought after in videoconferencing are implementation specialists and telehealth directors.
  4. Burgeoning cybersecurity job market. Healthcare organizations of all sizes are in the hunt for skilled cybersecurity professionals. Just about every day there’s a story regarding a data breach incident within the healthcare industry. Many of these incidents could be attributed to unfilled cybersecurity jobs. Since the current demand is greater than the supply, a career in this sector can mean a six-figure salary, job security, and upward mobility. The cybersecurity industry as a whole is expected to grow from $75 billion in 2015 to $170 billion by 2020, according to Forbes.com. In addition, the demand for the cybersecurity workforce is expected to rise to 6 million by 2019 with a projected shortfall of 1.5 million.
  5. Working remotely fully takes off. Working from anywhere and at any time will become a normal, everyday thing. By 2020, it is expected that 50 percent of workers in the US will be working either from home or another remote location. Having virtual employees is not only a way to get things done around the clock, without commuting, and with hard-to-find skill sets, but is also a way to meet the needs of employees who don’t live near the organization.
  6. Boomerang employees more common. Boomerang employees are employees who leave an organization only to return to that same employer sometime later. Rehiring these former workers is on the rise. With HIT talent at a premium, it only makes sense. HIT managers know that hiring back someone they know is easier than recruiting new blood, plus it saves money on training and development. In addition, there’s an immediate ROI.
  7. 3D technology careers wide open. Everyone is talking about 3D printing these days. It is expected to be the top medical innovation in 2017 because it could change everything for transplants and prosthetics through customization. As the 3D industry continues to evolve in 2017, the job market is wide open. In fact, jobs are appearing faster than candidates can be recruited. Young HIT professionals, especially software developers, should see this market as having huge potential for beginning a new career.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.
