Top 10* Design Problems in Clinical Information Systems

Information density vs. simplicity

Simple interfaces are better. Many clinical users say they want a very high density of information display with fewer navigational steps, and this preference is often embodied in systems designed or built by end users. The problem with this approach is that a large amount of visible information may in fact slow users down, through cognitive overload and the effects of Fitts's Law.

Cognitive overload happens when the user is bombarded with too much unprioritized or unfiltered information that is typically not needed for the current task (but might be, in some obscure instance). This comes from failing to apply the 80/20 principle, and from laziness on the part of the designers and builders. Many excellent displays show large amounts of data in a summarized, easily grasped view; for an example, check out the LifeLines project at the University of Maryland.

Fitts's Law states that the time to acquire a target is a function of the distance to the target and its size. This means that if the screen is full of little controls, text boxes, and the like, the user will actually accomplish the task more slowly than if the controls are large and grouped together. For a good discussion of this, check out the AskTog site.
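To make the Fitts's Law point concrete, here is a minimal sketch using the common Shannon formulation, MT = a + b * log2(D/W + 1). The constants a and b are device- and user-dependent; the values below are illustrative assumptions, not measurements.

```python
import math

def fitts_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Shannon formulation of Fitts's Law: estimated movement time (seconds).

    a and b are empirically fitted constants; these defaults are
    illustrative only.
    """
    return a + b * math.log2(distance / width + 1)

# A tiny control across the screen vs. a large control in a nearby group:
small_far = fitts_time(distance=800, width=16)   # small, distant target
large_near = fitts_time(distance=200, width=80)  # large, grouped target
```

Whatever the exact constants, the small distant target always costs more time, which is why many small scattered controls slow users down compared with fewer large, grouped ones.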

Wrong patient

People use the wrong patient's chart: in a survey I did of support calls at a client site, the fourth most common reason for a call was that someone had charted on the wrong patient. This doesn't even count those who simply reviewed the wrong chart and were none the wiser. Designers may be tempted to save some pixels by making the patient information small, but at best this occasionally wastes a lot of time while the user corrects the error, and at worst it results in a patient getting the wrong treatment (or not getting the one they need). Another common scenario is an MDI application where only the title bar identifies the patient; that is not enough to cue busy users that they are in the right chart.
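One way to reduce wrong-patient errors is a prominent banner built from multiple identifiers, plus a guard that blocks a chart write when the active patient context doesn't match the record being written. The field names below are illustrative assumptions, not any particular system's data model:

```python
def banner_text(patient: dict) -> str:
    """Build a prominent patient banner from several identifiers.

    Showing name, DOB, and MRN together gives busy users more than one
    cue that they are in the right chart. Field names are hypothetical.
    """
    return f"{patient['name'].upper()}  DOB {patient['dob']}  MRN {patient['mrn']}"

def confirm_before_write(active_patient: dict, target_patient: dict) -> bool:
    """Allow a chart entry only when the active context matches the
    record being written; otherwise the UI should force re-confirmation."""
    return active_patient["mrn"] == target_patient["mrn"]

# Hypothetical demographics for illustration:
alice = {"name": "Alice Smith", "dob": "1970-04-01", "mrn": "0012345"}
bob = {"name": "Bob Smith", "dob": "1968-11-12", "mrn": "0067890"}
```

The point is not this exact check, but that patient identity should be both highly visible and actively verified at the moment of writing, not relegated to a title bar.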

Task Interruption

A big problem occurs when applications put up message boxes to tell users about every little thing. Not only does this annoy clinicians (who believe that only patient harm, or potential harm, is worth an interruption), but users stop reading the messages and just blindly click OK. In one usability test, the system put up nearly identical message boxes for deleting some text and for deleting an entire document; users stopped reading the message and more than once inadvertently deleted their entire document. The message box for deleting text could have been safely eliminated, and users would then have paid attention when it actually mattered.
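The lesson above can be expressed as an explicit interruption policy rather than ad-hoc message boxes. This is a sketch of one possible policy (my assumption, not a standard): interrupt only for destructive, irreversible actions with large scope; let reversible edits proceed silently since undo can recover them.

```python
def should_interrupt(action: str, reversible: bool, scope: str) -> bool:
    """Decide whether an action deserves a modal confirmation.

    Policy (illustrative): prompt only for destructive actions that
    cannot be undone and that affect a whole document, not a selection.
    """
    destructive = action == "delete"
    return destructive and not reversible and scope == "document"
```

Under this policy, deleting selected text (reversible via undo) raises no dialog, while deleting an entire document does, so the one dialog users see is the one worth reading.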

Indistinct information spaces

Users should be able to guess correctly where things are. Sometimes the "information spaces" in an application are broken out the way they are because of how the programming teams' tasks were allocated. This rarely corresponds to how users "chunk" information. Other times the labels and concepts used are correct from a data-modeling standpoint, but confusing to those who don't know how the underlying data is structured, or who don't think in abstract generalities. For example, in testing one application, I found that users had two areas where they could do work for many patients at once (vs. working in a "chart"). These were called the InBox and Patients. The Patients area was where various patient lists were viewed and maintained, while the InBox contained "to-do" type activities. Users got mixed up when looking for lab results to review, often going to Patients instead of InBox. The conclusion was that these spaces were more alike than different, so the design was modified to consolidate everything into one area.

Not my hierarchy

Related to the problem of indistinct information spaces is the problem of unfamiliar information hierarchies. This often crops up when users need to find items or information in a system that was set up with a hierarchy foreign to them. The prototypical example is a physician order entry system with a deep hierarchy that mimics the catalog of each ancillary department. In the paper world, physicians typically deal with simple forms or write-in orders, so the whole taxonomy of, for instance, the central supply department is profoundly unfamiliar to them, making the task of navigating the department's hierarchy frustrating and pointless. This is often compounded when the system uses the host department's name for an item rather than the common name, so searches fail to turn the item up. Think about the last time you tried to find a file on someone else's hard disk, and how hard that was. Note that not all hierarchies are bad; ones that people learn consistently in school (for example, that angina pectoris is a symptom of cardiac disease) are good candidates to use.
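One remedy for the department-name problem is a flat, synonym-aware search over the order catalog, so clinicians can find items by the names they actually use. The catalog entries and synonyms below are invented for illustration:

```python
# Hypothetical mapping of official (department) names to common synonyms.
CATALOG = {
    "Radiograph, thoracic, 2 views": ["chest x-ray", "cxr"],
    "Hemogram, automated": ["cbc", "complete blood count"],
}

def find_orders(query: str) -> list:
    """Flat, synonym-aware catalog search: match either the host
    department's official name or a clinician's common name, so users
    need not learn an unfamiliar taxonomy."""
    q = query.lower()
    hits = []
    for official, synonyms in CATALOG.items():
        if q in official.lower() or any(q in s for s in synonyms):
            hits.append(official)
    return hits
```

Searching "cxr" turns up the thoracic radiograph even though the department's catalog never uses that term; this is exactly the lookup that fails when only the host department's vocabulary is indexed.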

Doing what we know vs. doing what we need

There is a delicate balance between computerizing what people know and do already vs. giving them something new that leverages the power of the machine. Go too far in either direction and you risk usability problems, either through unfamiliar metaphors and workflows, or through perceived or actual inefficiencies. The most common example is the fundamental design question of how to show a patient's chart. Many companies slavishly follow the paper chart's tab metaphor, even preserving the conventional chart sections. The problem is that this layout was developed for the medical records staff's convenience, not for simplicity of data retrieval. In fact, common tasks like correlating the dose of a drug with the corresponding lab values are very difficult in a paper chart, so replicating this layout onscreen makes little sense; it wastes the power of the computer to reveal these relationships.
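The dose-vs-lab correlation that a sectioned paper chart obscures is trivial for a computer: interleave the two streams chronologically. A minimal sketch, with invented warfarin/INR data for illustration:

```python
from datetime import date

# Hypothetical data: medication doses and lab results stored separately,
# as they would be in a sectioned chart.
doses = [(date(2024, 1, 1), "warfarin", 5.0),
         (date(2024, 1, 3), "warfarin", 7.5)]
labs = [(date(2024, 1, 2), "INR", 1.8),
        (date(2024, 1, 4), "INR", 2.6)]

def timeline(doses, labs):
    """Merge doses and labs into one chronological view -- the
    correlation a paper chart's separate sections make hard to see."""
    events = [(d, "dose", name, value) for d, name, value in doses]
    events += [(d, "lab", name, value) for d, name, value in labs]
    return sorted(events)
```

The merged view shows each dose followed by the lab value it influenced, which is exactly the relationship a tab-per-section chart forces the user to reconstruct by flipping back and forth.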

Getting payback early

This is both a design and an implementation issue. Most of the benefit of an electronic record comes when users see historical data. This means two things. First, the "usage curve" for entering this data must not be too steep, or users will abandon the effort before they ever see the benefit. Second, when planning implementation, it is critical to migrate as much existing electronic data as possible to the new system to maximize early payback. Sometimes this may even require hiring clerical staff to back-fill information so the system is useful from day one. From a design perspective, this means enabling this kind of input (though perhaps not as part of the normal interface), as well as making small paybacks happen early to keep users hooked until the big ones come later.

Shifting the burden of work

Ah, politics. This has probably killed more medical systems than everything else combined. Inevitably, when implementing a computerized system, the burden of work shifts toward caregivers and away from clerical staff. If users perceive that they are simply taking on clerical work without getting any real benefit, it is the kiss of death for a system. This means there should be clear and direct benefits for users interacting with the system themselves. One study recommended that in order entry, Radiology orders be initiated first, as the "reason for exam" typically needs clinical input where lab orders often don't. Providing drug-drug and drug-allergy checking at the point of medication ordering is another good example of a tangible benefit that is lost if another person does the work.
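The drug-checking benefit mentioned above only works if the check fires for the person placing the order. A minimal sketch of the idea, with a hard-coded toy knowledge base (real systems use a maintained drug-interaction database, not pairs like these):

```python
# Illustrative, hard-coded knowledge base -- an assumption for the sketch.
INTERACTIONS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}
ALLERGY_CLASSES = {"penicillin": {"penicillin", "amoxicillin"}}

def check_order(drug, current_meds, allergies):
    """Return warnings for drug-drug and drug-allergy conflicts at the
    moment of ordering -- a benefit lost if a clerk enters the order later."""
    warnings = []
    for med in current_meds:
        reason = INTERACTIONS.get(frozenset({drug, med}))
        if reason:
            warnings.append(f"{drug} + {med}: {reason}")
    for allergy in allergies:
        # Check the allergy's drug class, falling back to the allergy itself.
        if drug in ALLERGY_CLASSES.get(allergy, {allergy}):
            warnings.append(f"{drug}: patient allergic to {allergy}")
    return warnings
```

Because the warning appears while the ordering clinician can still change the decision, the system pays the clinician back directly for doing the entry work themselves.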

* Ten problems more or less. Why get hung up on an arbitrary number?