
HIStalk Interviews William Hersh, MD

August 5, 2009 Interviews 1 Comment

William Hersh, MD is professor and chair of biomedical informatics at Oregon Health & Science University, Portland, OR.


How many informatics people is it going to take to support projects launched due to HITECH?

I’ve actually not sat down and penciled that out. I published some research last year using the HIMSS Analytics database, which is an admittedly imperfect source of information. Extrapolating from it, moving everyone to Stage 4, which was CPOE and clinical decision support, would require another 40,000 people on top of the 109,000 or so presumably already working in health IT.

One of the big questions, though, is the role of people who do informatics. The HIMSS Analytics database focuses more on IT. There’s a growing recognition of people who work in informatics, who work at that intersection between IT and healthcare, and are working with IT but in more of an informatics role, with a focus on information. The estimates there vary. Chuck Friedman, who is now David Blumenthal’s deputy, came up with a back-of-the-envelope calculation of about 13,000. Don Detmer, the former president of AMIA, said it’s probably more like 50 to 70,000, so the number of informatics people is probably somewhere in that ballpark.

We’ve got a lot of people in the industry who learned on the job when there wasn’t any formal training available. What results will we get by using the formally trained people?

There are many ways to learn things. I don’t have a degree in informatics myself. A lot of fields begin with people who blaze a trail; they pick things up and learn on the job. When we’re talking about industrial-scale health IT, though, that’s probably not practical. Plus, there is a growing base of knowledge that many of those trailblazers had to learn on their own.

I think the world is changing and there’s going to be more of a need, just for efficiency reasons, to train people, to give them the knowledge and skills. The old-timers kind of learn that through the college of hard knocks, but to really scale things up is going to take more formal training.

There are so many ways to get training — ANCC certification for nurses, 10×10, graduate certificate programs, full-fledged undergraduate degrees, and graduate degrees. What are the disadvantages and advantages of all of those credentials compared to what the market needs?

I think the market will probably sort itself out. People ask me, “Will I need to have a credential to work?” And at least at this point in time, the answer is no, but that could change in the future, just like anything in our economic system. The value will be what people put on it.

There may be a time in the future when people go to apply for a job and potential employers might look at a group of resumes where one person has some formal knowledge that’s been validated by some sort of certification process, as opposed to someone who has just learned things on the job, and that may tip the scales.

If the people are otherwise equal, if the person has been in the job for 20 years, that may tip the scales the other way. So I think it will sort itself out over time.

Informatics is a profession but with different roles, such as a nurse who sits in front of a screen and builds order sets all day all the way up to a physician who is an architect for an entire system, yet each could call themselves an informatician. Is there a need for more granularity in what people are doing with their credentials rather than which credentials they have?

Yes there is. The reality is, in some ways the informatics field has some similarities with an MBA. People typically don’t come into an MBA program with a business degree. They come from all walks of life. I know of a couple of physicians who have MBAs who’ve gone on to management jobs in healthcare.

What I always tell people is that informatics is a very heterogeneous field. There are many different kinds of things and there are many heterogeneous pathways into the field. There are many heterogeneous pathways into jobs, although usually the job that you do is somewhat a function of your background. So for example, it’s pretty unusual to see a CMIO who’s not a physician, or at least not a clinician; typically a CMIO is a physician.

But there are other jobs, such as managing an EHR implementation, that are more suited for someone who has knowledge of healthcare but who isn’t necessarily a clinician. And there are also jobs in between, like the nurse who creates order sets or order entry screens, like you mentioned.

A physician colleague I know was in the OHSU graduate certificate program and said it was hard, with a lot of statistics and epidemiology. Is there a presumption that there is a base set of knowledge that would be more typically found in a physician?

I’m actually kind of surprised to hear that. In our graduate certificate program, they don’t have to take a statistics course, for example, like they do for our master’s program. I guess it depends on how you’re defining statistics and epidemiology.

Our certificate program, like a lot of the programs that are forming, is focused on giving people the knowledge of informatics: what you do with information to improve healthcare, to improve its quality, safety, etc., and then how you go about implementing systems to capture and use that kind of information.

There’s not a massive amount of statistics and epidemiology in our certificate program. I don’t have the exact numbers in front of me, but about 50% in our program are physicians. Of the remaining 50%, about 25% are in healthcare professions like nursing and pharmacy. The other 25% are from everything else, probably the majority of them IT. We do get a fair number of people with IT backgrounds who want to learn more about healthcare to be able to apply it.

Going back to your other question, “Do people need to be trained?” We actually have a lot of people who are already in CMIO kinds of jobs and then realize that they need to learn more informatics and enroll in our program. I chuckle about it sometimes because that’s not usually the way you go about learning a field — after you’ve gotten your job. You wouldn’t send a surgeon off to residency after they started doing operations.

If you’re reporting to a hospital executive, they probably don’t know enough about informatics to say, “Yes, I want my person to have a certificate, if not a degree”?

Yes, it’s true. This is still a new field that is sorting itself out. Another problem related to what you’re describing is that HR departments know very little about it, although in the more forward-looking healthcare organizations I spend time talking to, the HR departments are starting to learn about the value of this.

There’s a local hospital here in Portland, Providence Health Systems, and they are putting in a lot of effort, mostly internal development: sending people off to learn more, but also developing an informatics cadre internally, if you will.

It seems like the ideal appetizer for the training would be the 10×10 program, which you were involved in early on. How does that fit in now and is it going to meet the goal of 10,000 people by 2010?

The 10×10 program was started when AMIA was looking to have some kind of e-learning option. They had talked to some vendors and it would have been prohibitively expensive. We actually already had a broad-based course in our program here.

So I suggested to the AMIA people, “Why don’t we just take this course? We can repackage it as more of a continuing education course.” That’s how it came about. AMIA turned it into a sort of a program that let other universities offer 10×10 courses and so forth.

We won’t hit 10,000 people. In fact, there have been 750 people who have done the 10×10 course. But the main reason why we won’t hit the 10,000 people is that 10,000 people haven’t come forward and said, “We want to do this.” But about 750 have, and we have some published data saying that people find the experience worthwhile.

The way we’ve structured the 10×10 course here, it is essentially equivalent to our introductory course. People who do the 10×10 course can then get credit for the introductory course in any of our graduate programs, even all the way up to our PhD program. The graduate certificate program consists of eight courses, one of which is that course, so then they have to take seven more.

10×10 is a broad-based and intensive but introductory experience to informatics. I don’t know that anyone will become a high-end informatician with just that one course, unless they already have loads of experience.

The tough thing about establishing a credential is that you’ve got to market it to employers. Do you think vendors, given their emphasis on “just get stuff in and installed”, maybe don’t really care too much about the theoretical nature of informatics and are never really going to embrace a credential?

We’ve had vendors, including some of the big ones, send a few of their folks to learn about it. I agree, vendors are focused on getting your systems up and running.

I wouldn’t call 10×10 a theoretical course, because it’s pretty practical: issues with implementation, with standards, with quality measures, and things like that. It’s definitely an academic course, but it’s actually not highly theoretical.

We don’t know for sure, but at least half the people get their tuition paid by their employers. Typically, hospitals will send people, sometimes universities, and again, we have had a number of vendors who have sent some of their staff.

I think it would be a great course – obviously, I’m biased — for vendors to just get a bigger sense of the marketplace. With all the expectations of the stimulus package, the vendors are going to have to be a little — you probably know this as well as I do — more cognizant about standards and interoperability, because it’s going to be expected of them, whether they really deep down want to do it or not.

What do you think HITECH is going to do in terms of innovation?

I think that if we define Meaningful Use at a reasonably good level, a level that most people can hit, and we make interoperability a big part of it, that will drive the vendors in a way — I mean, I remember 15 years ago when people were saying, “Should I demand of my vendor that they speak HL7 Version 2?” It was really customers that drove that, and I think it’ll be the same way. 

It’ll be customers and the Meaningful Use guidelines, at least around things like interoperability, that vendors say, “If I want customers to be able to meaningfully use my EHR system, we’re going to have to do this interoperability thing whether we deep down want to do it or not.” I think it’s important for people to know what the issues are around interoperability standards and so forth which are the kinds of things that we teach in courses like this.
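To make the HL7 Version 2 discussion concrete, here is a minimal sketch of what “speaking HL7 Version 2” means at the wire level: messages are pipe-delimited segments (MSH for the message header, PID for patient identification, and so on). The message content and the parsing helper below are invented for illustration; a real interface engine handles escape sequences, repetitions, and many more segment types.

```python
# A hypothetical, minimal HL7 v2 ADT message: one segment per line,
# fields separated by "|", components within a field separated by "^".
ADT_MESSAGE = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|20090805||ADT^A01|MSG00001|P|2.3",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19700101|F",
])

def parse_segments(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [fields]} (sketch only)."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields[1:]
    return segments

parsed = parse_segments(ADT_MESSAGE)

# The PID segment carries the patient name in its fifth field,
# with components (family name, given name) separated by "^".
patient_name = parsed["PID"][4].split("^")
```

The point of the sketch is the interoperability argument in the interview: once two systems agree on this shared structure, a receiving EHR can extract the patient name or message type without knowing anything about the sending vendor’s internals.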

Maybe that’s part of the reason programs haven’t picked up — vendors aren’t really developing a lot of new products. Are people with all this formal training going to be disappointed when they go to work for a vendor and realize they won’t get to design a lot of fun new stuff and re-architect systems that have been on the market for 20 years?

That’s actually hard to know. We have about 250 alumni already in our program. That’s not 10×10; those are actually certificate or master’s program graduates. A small number of them work for vendors.

I think they probably have mixed feelings about their job. It probably varies from vendor to vendor in terms of what things people get to do. Of course, you might get a job with an innovative vendor, but you might get stuck on some project that you really don’t want to do anyway.

But I think one of the good things — maybe I’m a little idealistic about this — but if we come up with good, achievable definitions of Meaningful Use, that we can get the vendors or companies … I’ve never worked in the private sector but I certainly know a lot of people in the private sector, and at the end of the day, you’ve got to make a profit. But if we set the motivations for the vendors right, then hopefully we can make them do the right thing and keep their feet to the fire, just like some of the hospitals and physicians will have to be kept to the fire, too, in terms of implementing things that are Meaningful Use.

You were involved with clinical data sharing before it was a hip thing to do. What’s your vision of where it should go?

I think we need to be realistic about it. We need to recognize, for example, that the kinds of things we can do with clinical data sharing, when we have good, quality data to do it — quality measurement, for example — is a great thing. I don’t think anyone is opposed to it in principle, but the question is, can we get good enough data and meaningful quality measures and act on them?

I think that a lot of times people think that just because data is in electronic form, that means you can do anything with it, like it’s the gospel. The reality is that, for clinicians in the trenches, high-quality data is not their top priority. Usually their documentation is what stands between them and their getting home for the day.

I think we need to focus on trying to develop ways to help clinicians get the best data into the systems so we can do things with it like quality measurement and health information exchange, all the kinds of stuff we talk a lot about now. But all that depends on as good a quality of data as possible. I also think we need to be realistic in what we can and cannot do.
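The quality measurement described above can be sketched in a few lines: given structured clinical records, a measure is just a numerator over a denominator. The record format, field names, and the specific measure (diabetic patients with a recent HbA1c result) are all hypothetical, chosen only to illustrate why the measure is only as good as the underlying data.

```python
# Hypothetical structured records; in practice these would come from an
# EHR or health information exchange. A missing value for last_hba1c_days_ago
# means the result was never captured, which is exactly the data-quality
# problem discussed above.
patients = [
    {"id": 1, "diabetic": True,  "last_hba1c_days_ago": 90},
    {"id": 2, "diabetic": True,  "last_hba1c_days_ago": None},
    {"id": 3, "diabetic": False, "last_hba1c_days_ago": None},
]

def hba1c_measure_rate(records, window_days=180):
    """Fraction of diabetic patients with an HbA1c result inside the window."""
    denominator = [p for p in records if p["diabetic"]]
    numerator = [p for p in denominator
                 if p["last_hba1c_days_ago"] is not None
                 and p["last_hba1c_days_ago"] <= window_days]
    return len(numerator) / len(denominator) if denominator else 0.0

rate = hba1c_measure_rate(patients)
```

Note that patient 2 drags the rate down whether the test was truly never done or merely never recorded, which is the point the interview makes: you can’t distinguish poor care from poor documentation without good data.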

Is it skewed toward having physicians input their own information to create all this quality data that someone else gets to use?

Speaking as a physician, although I actually don’t do patient care these days, I sympathize with a physician when someone ends up imposing an extra hour onto their day in terms of entering data.

This is where I think the research comes in. How can we find ways to get the highest quality data and not increase the cost of getting it? If it truly takes an extra hour a day of physician time, ultimately we’re going to pay for that, and I’m not sure if the healthcare system or the payors are willing to do that.

I think we need to find ways of getting as good a quality of data as possible, but I think physicians are going to have to change their ways a little bit, too, and recognize that they can’t just scribble things, that we need a certain standard of quality for data. I think this could be a role for physician specialty societies — groups like ACP, AAFP that have initiatives — looking at these sorts of questions, like how do we get the best data without taking an inordinate amount of time?

Is it the right step to shoot the government’s wad on putting out electronic medical record systems that didn’t take advantage of any of that research and say, “Look: type or use a mouse, it’s up to you, but that’s how it goes in”?

There’s definitely a risk in what we’re doing. On the other hand, we need to be bold and make it happen, just like healthcare reform. There’s going to be no perfect healthcare reform because we have so many different competing interests, but I think we’ve got to do something because the status quo is not acceptable.

The same thing is true when it comes to information. We need to be bold, but again, I know there’s been a lot of arguments about what should and should not be in Meaningful Use, but I think that it’s a good bar that most people with the right amount of effort can hit. That’s what we ought to aim for.

What technologies that aren’t necessarily mainstream now can make a difference?

You know, it’s funny, because when I talk about informatics, I oftentimes say, “We can’t be too focused on technology; we need to be focused on information.” I think it would be technologies that help people enter high-quality data, so maybe there will be some role for things like speech recognition with real-time transcription, or structured data entry interfaces that don’t completely box you into choosing this checkbox or that checkbox.

Whether that’s going to be handheld devices — they’re obviously portable and convenient, and they’re wireless now. On the other hand, they have tiny screens, and typing on them is very difficult.

So it’s hard to predict which technology — again, I think the focus should really be on what we want to do. To me, the most important issues in informatics are getting high-quality, standardized, interoperable data — I’m actually less concerned about interoperable applications. Those will come if you have interoperable data.

We really need to accelerate trying to standardize clinical data and make it available, with all the security protections and so forth, but across applications. The rest of the interoperability, and also things like health information exchange and quality measurement, will come from that.

My last question, elicited from your previous answer — and this is an A or B answer only, there is no “all of the above” — is informatics about technology, or is it about people and organizations?

Unhesitatingly about people and organizations. That’s an easy one. [laughs] I mean, it’s what you do with the technology. You can’t be ignorant of the technology; you have to understand it and be facile with it, but informatics is about people and organizations, basically improving healthcare and improving people’s health.

And your programs focus on it in that way rather than about technology?

Absolutely. You can come here or online or whatever, and learn a lot about technology and get involved in projects that do a lot with technology, but at the end of the day, it has to have some value to health or healthcare, making people’s lives better. We emphasize that. I think that most informatics programs emphasize that point of view.

I think one thing that’s happened in the last few years is that the informatics field has kind of matured a little bit and recognized its role. Again, I don’t want to say that technology’s unimportant, because it’s very important, but it’s what you do with it that’s more important. I think that informatics has kind of recognized its role in that realm.

Any other comments?

Obviously I have a little bit of a bias toward the academic/education side of the field, but I do think that there is growing knowledge in this field and that people benefit from knowing it. That’s one of the roles that academic programs are going to play. I actually believe that informatics will mature as a profession as a result of that knowledge.

No matter what happens with ARRA, the trajectory was already to increase the use of health IT and I think that will continue, probably accelerated through ARRA.

One comment on this article:

  1. I’m not sure if physician administrators of EHR/decision support systems need formal I.T. training. I think a background in epidemiology would be a plus, but not a requirement. I’ve been designing production EHR systems for more than a decade, and apart from the physical implementation, in my experience, there is little ongoing role for I.T. in terms of maintenance or content development/evolution. It is unlikely that such a light I.T. requirement would be applicable at all vendor installations, however.

    A huge amount of time and energy can be wasted translating clinical I.T. functional requirements into terms developers understand, and then QA’ing the delivered product to see that the clinical requirements were delivered properly. Why not put together a set of high-level clinical administrative screens through which the EHR/decision support system is maintained entirely by clinicians, even to the point that their practice comes to comprise electronic system development and support, either within or adjacent to the vendor community? EHR/decision support systems are conceptually much more complex than the ADT, inventory, and billing systems that were the mainstay of vendor offerings previously, and are properly in the realm of academic medicine, and (some day) a legitimate field of specialization in their own right.

    In future, I believe EHR will be entirely maintained by clinicians expert in the translation of the logic of the patient record from paper to electronic medium. “I.T.” will primarily concern itself with the maintenance of physical infrastructure, similar to the way the phone company supplies the infrastructure for telephone service but doesn’t supply the apps on the iPhone or BlackBerry. I.T. will become further removed than one might think from the core competencies required to conceive, develop, or run EHR/decision support systems in the future.
