1 Health Care Data Analytics, Unit 9: Usability. Hello, welcome to Component 24, Unit 9, Usability. This material (Comp 24 Unit 9) was developed by The University of Texas Health Science Center at Houston, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-sa/4.0/. Health IT Workforce Curriculum Version 4.0
2 Usability Lecture – Learning Objectives. Discuss the different threats to HIS usability; Determine a plausible analysis given a usability concern. The objectives for this unit are to discuss the different threats to Health Information System (HIS) usability and to determine a plausible analysis given a usability concern.
3 What is Usability? The effectiveness, efficiency, and satisfaction with which users can achieve tasks in a particular environment (Webster’s Online Dictionary). Human Factors; Human-Computer Interaction. Before we can discuss usability reasonably, we need a definition. According to Webster’s Dictionary, usability is the effectiveness, efficiency, and satisfaction with which users can achieve tasks in a particular environment. For the purposes of this unit, we are going to focus more on human factors and less on human-computer interaction, because we are really concerned with usability around data analytics.
4 Human Factors Objective. Focus: Human beings and their interactions with products/equipment, tasks, environments (micro, macro, ambient). Goal: Design systems and system components to match the capabilities and limitations of the humans who use them; optimize working and living conditions. So what is the central focus of human factors work? Broadly, it is people and their interaction with technologies and systems. The goal of this work is to optimize these technologies so that they match the capabilities and limitations of the people who use them.
5 Human Factors Ergonomics: 3 Major Domains. Physical Ergonomics; Cognitive Ergonomics; Organizational/Macro-ergonomics. Human factors ergonomics has three main domains. Ergonomics is broadly conceived as the study of work and the factors that affect it. The term is sometimes used interchangeably with human factors. In any case, we can characterize the three main domains that you see on this slide.
6 Cognitive Ergonomics. Concerned with mental processes. Topics: mental workload, decision making, skilled performance, HCI, work stress. Application to health: usability, designing training systems, usable interfaces. Examples: event report systems; implementing an incident analysis system. For this unit, we will focus on cognitive ergonomics, which is concerned with mental processes. Cognitive ergonomics includes mental workload, an issue we will come back to later, and the usability of systems. This lecture focuses predominantly on cognitive issues.
7 Human Attention. Selective mechanism; resource needed for information processing; limited; sharable; flexible. Human factors work addresses different dimensions of cognitive capacity, including memory, attention, and workload. Our perceptual system inundates us with more stimuli than the cognitive system can possibly process. Attention mechanisms enable us to selectively prioritize and attend to certain stimuli and attenuate others. Attentional resources are limited; they also have the property of being sharable, which enables us to multitask by dividing our attention between two activities. If we are driving on a highway, we can easily have a conversation with a passenger at the same time. However, as the skies get dark, the weather changes, or you suddenly find yourself driving through winding mountain roads, you will have to allocate more of your attentional resources to driving and less to the conversation. Most states have outlawed the use of handheld cellphones while driving because they divide one’s attentional resources and greatly increase the likelihood of accidents and highway fatalities. On the basis of studies thus far, it is not clear that using a hands-free cellphone reduces driving accidents; it still saps one’s needed attentional resources.
8 Information Overload. Speed stress; load stress; speed/accuracy tradeoff. Humans can easily go into information overload. This slide describes some of the conditions for overload. If you are under pressure to increase the pace of your performance, or you are burdened by a heavy information load, the quality or accuracy of your performance is likely to degrade. There is a speed-accuracy trade-off: as you increase your speed of performance beyond a certain threshold, you increase the probability that the quality or accuracy of your work will degrade.
9 Why Usability Matters: The Clinical Story. At least 40% of systems are either abandoned or fail to meet minimum requirements. HIMSS considers poor usability of clinical information systems as possibly the most important factor hindering adoption. Problems with adoption, productivity, and patient safety. So why does usability matter? I don’t think we need to make a case at this point for usability evaluation, but there is certainly one to be made on both the clinical and patient sides. At least forty percent of clinical information system implementations are either abandoned or fail to meet minimum requirements of use, and this is a rather conservative estimate. The Healthcare Information and Management Systems Society, better known as HIMSS, considers poor usability of clinical information systems as possibly the single most important factor hindering adoption. Usability problems are associated not only with poor system adoption, but with a lack of efficiency and productivity, and even with patient safety problems and medical error. These issues matter just as much for analytics: one can expect that clinical information systems that are difficult to use will be associated with user fatigue, frustration, and high error rates.
10 Why Usability Matters: The Story from Patients and Consumers. eHealth interventions offer significant promise to bridge the digital divide. Problems with usability and poor design disproportionately affect lower-computer-literacy users, exacerbate the digital divide, and possibly increase health disparities. Chronic illness and aging populations have special needs and are more susceptible to usability problems. There is a similar story to be told in terms of patients and health consumers. There are a growing number of high-quality eHealth interventions, and they offer significant promise to bridge the digital divide and reach out to users who are normally disenfranchised by such systems. However, problems associated with usability and inadequate design are likely to present significant problems for this population; these problems disproportionately affect users with lower computer literacy. Similarly, patients suffering from chronic illness and older adults have special needs, and they too are more susceptible to usability problems. The net effect of such interventions could be to further exacerbate the digital divide and possibly even increase health disparities, which would be a very unfortunate turn of events.
11 Usability Evaluation Methods. Interviews/focus groups; questionnaires/surveys; ethnographic observations; usability inspection methods; usability testing; controlled cognitive experiments. So given all this, if we are looking at the usability of data analytics or other systems, how do we evaluate it? Today we are going to talk about a few different usability evaluation methods. Ethnographic observation refers to observing users in the real world as they perform computing tasks. Controlled cognitive experiments, conducted in laboratory settings, may be employed to investigate how a given system impacts or transforms human performance.
12 Usability Evaluation for Data Analytics. There is another way to represent usability methods: it is easier to think of five larger classes or types of methods that can be further subdivided into a range of methods. We will talk a little about the first four classes, namely interviews, questionnaires, usability inspection, and usability testing. Interviews and focus groups involve talking to people in either a semi-structured or structured setting, trying to find out their feelings about whatever it is we are interviewing them about. Usability inspection involves cognitive walkthroughs and heuristic evaluation; we will talk a little more about that. Usability testing involves laboratory testing or field testing. Questionnaires and surveys generally use a Likert scale or some other type of ranking scale to elicit opinions about the item we are assessing. And as we said previously, ethnographic observation is simply what we observe when someone is trying to use a system. 1.1 Figure: Preece, J., Rogers, Y., & Sharp, H. (2007).
13 Usability Principles. Visibility of system status; Match between system and the real world; User control and freedom; Consistency and standards; Help users recognize, diagnose, and recover from errors; Minimize memory load; Emphasize recognition rather than recall; Flexibility and efficiency; Motivation and engagement. So what do we evaluate for when we are evaluating? Nielsen articulated a set of very general and broadly applicable heuristics. Visibility of system status refers to how easily one can determine the state of the system at a given moment in time. For example, if you just clicked a link on a webpage and it is taking a long time to load, you should be able to tell whether the server is slow or overloaded, or whether the page is no longer accessible; in most cases, you simply do not know what is going on. A match between the system and the real world suggests that the system should speak the users’ language, using words, phrases, and concepts familiar to the user rather than system-oriented terms; this was covered in the unit on “Displaying Our Data.” It should also follow real-world conventions; for example, you expect a button to be pressed and a scroll bar to slide up and down, to take a very simple case. Minimizing memory load suggests that a user should not have to memorize complex command sequences to use an application. A system that provides multiple cues in text or icons to guide the user will diminish memory load. When memory load is reduced, a user can devote more energy to the task at hand. This is very important when clinicians are interacting with clinical information systems: their energy should be devoted to tasks and activities that facilitate patient care rather than negotiating the vagaries of a clunky display. Errors are inevitable, but a good system should allow a user to recover from an error without the risk of disastrous consequences, such as loss of data.
Consistency and standards refers to the fact that a system should adhere to widely accepted standards, and there should be a measure of consistency across all displays and modules within a given system. Finally, an application that motivates users and is engaging and pleasurable to use will likely be more widely adopted than one that is not.
14 More Specific Heuristics. Automate unwanted workload: free cognitive resources for high-level tasks; eliminate mental calculations, estimations, comparisons, and thinking. Reduce uncertainty: display data in a clear and obvious manner. Reduce cognitive load by bringing together lower-level data into a higher-level summation. Present new information with meaningful aids to interpretation: use a familiar framework, everyday terms, metaphors. Some researchers have taken Nielsen’s heuristics and tried to make them more specific or tailored to a particular context. The heuristics proposed by Gerhardt-Powals are focused on judging a system in terms of how much energy expenditure is necessary to perform a set of tasks. As you can see here, they call for displaying data in a clear and obvious manner and for bringing together lower-level data into a higher-level summation. These principles are very important in data analytics.
15 More Specific Heuristics (cont’d). Use names that are conceptually related to function: context-dependent; attempt to improve recall and recognition. Group data consistently in meaningful ways to decrease search time. Limit data-driven tasks: reduce time assimilating raw data; appropriate use of color and graphics; include only the information the user needs at a given time. Gerhardt-Powals also suggested heuristics for grouping items effectively to reduce search and for minimizing cognitive load by aggregating lower-level data into summaries. Incidentally, there are several ongoing efforts to introduce summaries that aggregate patient data in electronic health records. This can save unnecessary search time sifting through different electronic documents, such as lab reports, to find the information of interest.
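The aggregation heuristic above can be illustrated with a minimal sketch. The data structure and field names here are hypothetical, chosen only to show how low-level results (for example, individual lab values) can be rolled up into one higher-level summary a user can scan at a glance:

```python
from statistics import mean

def summarize_labs(results):
    """Aggregate low-level lab results into a higher-level summary.

    `results` is a list of (date, value) tuples for a single test,
    a hypothetical structure used only for illustration.
    """
    values = [v for _, v in results]
    latest_date, latest_value = results[-1]
    return {
        "latest": latest_value,   # most recent value
        "latest_date": latest_date,
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
        "n": len(values),         # how many results were aggregated
    }

# Example: creatinine values (mg/dL) over several visits
creatinine = [("2015-01-10", 1.0), ("2015-06-02", 1.3), ("2016-01-15", 1.8)]
print(summarize_labs(creatinine))
```

Instead of sifting through every individual report, the user sees one line per test, which is exactly the kind of higher-level summation the heuristic calls for.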
16 Problem Severity. Combination of 3 factors: frequency with which a problem occurs (commonly or rarely); impact of the problem (easy or difficult to overcome); persistence of the problem (one-time problem or constant source of difficulty). Nielsen’s rating scale: 0: Don’t agree there is a problem; 1: Cosmetic problem only; 2: Minor usability problem, low priority; 3: Major usability problem; 4: Usability catastrophe, fix now! The problem-severity scale was developed by Nielsen to be used with heuristic evaluation, but it is an excellent tool that can be used with any usability evaluation method. Rating the severity of problems is a very important step in usability analysis. Severity is a combination of three factors: the frequency with which a problem occurs, the impact of the problem, and the problem’s persistence, i.e., does it quickly go away or is it a constant irritant? The rating scale ranges from a cosmetic problem that will not be prioritized to a usability catastrophe; the latter needs to be fixed immediately because it could cause clinicians immense frustration. In health information systems, the wrong display could even result in patient harm.
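The 0–4 scale above is Nielsen’s; how exactly the three factors combine into a rating is left to the analyst’s judgment. As a sketch only, one could score each factor 0 (low) to 2 (high) and map the sum onto the scale. This particular mapping is an illustrative assumption, not Nielsen’s formula:

```python
def suggest_severity(frequency, impact, persistence):
    """Suggest a Nielsen severity rating (0-4) from three factors,
    each scored 0 (low) to 2 (high).

    The 0-4 scale is Nielsen's; this way of combining the factors
    is an illustrative assumption, not a published formula.
    """
    score = frequency + impact + persistence  # ranges 0..6
    if score == 0:
        return 0  # don't agree there is a problem
    if score <= 2:
        return 1  # cosmetic problem only
    if score <= 3:
        return 2  # minor usability problem, low priority
    if score <= 5:
        return 3  # major usability problem
    return 4      # usability catastrophe: fix now!

# A frequent, high-impact, persistent problem rates as a catastrophe
print(suggest_severity(2, 2, 2))
```

In practice an evaluation team would rate each problem independently and discuss disagreements; the point of the sketch is only that all three factors feed into one priority number.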
17 Usability Testing. Gold standard for usability evaluation. Set of techniques to collect empirical data while observing representative end users using the system under study to perform representative tasks; video-recorded. Provides information that can lead to systems that are easy to learn and use, satisfying to use, and that provide utility and functionality valued by the target population; characterize task-specific competencies. Finally, we come to usability testing, which is widely believed to be the gold standard for usability evaluation. Usability inspection methods involve the judgment and speculation of analysts, and interviews, focus groups, and questionnaires are either subjective or involve some form of self-report. Usability testing, by contrast, provides hard evidence as to the nature of the difficulties that users encounter when interacting with a system. All of the evaluation methods are useful; however, usability testing is also the most time-consuming and costly of them.
18 Think-Aloud Protocol. Method broadly used in cognitive research and usability testing. The user verbalizes his/her thoughts while performing a task, reporting the contents of working memory. The session is audio- and/or video-recorded; the transcript of the think-aloud is coordinated with video analysis. The think-aloud protocol is a method broadly used in cognitive research as well as in usability testing. The user is asked to verbalize his or her thoughts while examining a data display. They are expected to report the contents of their working memory, basically whatever comes to mind as they examine the display. The user is discouraged from engaging in self-analysis, for example commenting on their own strategies. The session is audio- and/or video-recorded; the transcript of the think-aloud protocol is coordinated with video analysis and provides a rather complete picture characterizing the nature of the interaction.
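Coordinating the think-aloud transcript with the video typically means pairing timestamped utterances with on-screen events. A minimal sketch of that alignment step, with entirely hypothetical data structures (timestamps in seconds):

```python
def align(utterances, events, window=5.0):
    """Pair each think-aloud utterance with the video events that
    occur within `window` seconds of it.

    Both input structures are hypothetical, for illustration only:
    lists of (timestamp_seconds, text) tuples.
    """
    pairs = []
    for t_u, text in utterances:
        nearby = [e for t_e, e in events if abs(t_e - t_u) <= window]
        pairs.append((text, nearby))
    return pairs

utterances = [(12.0, "I can't find the lab results tab"),
              (40.5, "Why is this value red?")]
events = [(10.2, "opens Chart menu"),
          (13.1, "hovers over Results"),
          (41.0, "clicks creatinine cell")]

for text, nearby in align(utterances, events):
    print(text, "->", nearby)
```

Pairing each comment with what the user was doing on screen at that moment is what lets the analyst see not just that a user was confused, but exactly which display element caused the confusion.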
19 Field Usability Testing. Hybrid method: lab and ethnography/field study. Naturalistic setting; numerous constraints; prescribed set of tasks; quasi-experiment; video analysis is key; intrusive. Most usability studies are conducted in a lab setting; sometimes that is just the office of an investigator. Field usability testing can be rather challenging but very informative: you can study users in their naturalistic setting, whether it is a clinic or a patient’s home.
20 Readability. Must be able to scan quickly with high comprehension. 12 point or greater, always >9 point; allow users to change font size. Visual impairments are present in much of the population, i.e., respect system settings for color, size, and font. Sans serif is most readable on computer screens; black on white is most readable. Readability in data analytics is very important; it ensures that users can scan interfaces quickly with good comprehension, an important factor for clinicians or patients under time pressure. Systems must respect users’ settings for font size or color to accommodate those with visual impairments. Sans serif fonts are most readable on a computer screen, and black on white is generally most readable. Contrast and fading should be used appropriately.
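The claim that black on white is most readable can be quantified. The W3C’s WCAG accessibility guidelines (not part of this slide, but a widely used standard) define a contrast ratio from 1:1 (no contrast) to 21:1 (black on white). A sketch of that formula:

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255.0  # linearize the sRGB channel
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors: 1.0 (none) to 21.0 (max)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

black, white = (0, 0, 0), (255, 255, 255)
print(round(contrast_ratio(black, white), 1))  # black on white: 21.0
```

A display tool could run this check on its default text/background palette and warn when a pairing falls below the WCAG minimum of 4.5:1 for normal text.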
21 Information Presentation. Color: use to convey meaning, not decoration; consistency of color meaning; use a number of colors the user can remember; don’t contradict conventions, e.g., red = danger, stop; green = ok, go. Section 508: 8% of male users are colorblind; convey color meaning with a secondary method, e.g., underlining. It can be important to use color to convey meaning, following conventions common in the United States. Consistency of the colors assigned to meanings must also be part of the design. Colors for decoration should be limited to logos and similar branding aspects of the design. It is important to note that many people are colorblind and confuse red and green, or blue and yellow, so secondary methods of showing the same meaning should be used. For example, text highlighted in color can also be underlined for emphasis, and hatch marks or other fill patterns can be used to show differences between regions. One good practice is to print interfaces in grayscale to ensure that areas distinguished by color are intelligible without it.
22 Can you tell what this is about? So let’s examine some examples of poor design and then, hopefully, some better ones. This particular display is taken from data that was used in Unit 2 of this component. When we look at it, it can be difficult to tell what is going on. This is the 2012 death rate for males, but what death rate? What are the actual numbers? The colors are very odd, and the black background could make the display very difficult to see, especially if you were colorblind. So this could be a very difficult display for a data analytics result. 1.1 Chart: Data retrieved from the U.S. Department of Transportation.
23 Is this better? This one is somewhat better. As you can see, it uses a lighter background, and the title is clearer: it is the 2012 Motor Vehicle Occupant Death Rate for Males. A fill pattern is used for the bars, making it easier to determine what the numbers are. The display could be improved further by including the numbers at the ends of the bars. So there are many ways to do a display, and many ways to improve one. 1.2 Chart: Data retrieved from the U.S. Department of Transportation.
24 Minimize Cognitive Load. The same information presented graphically allows easy detection of patterns; perception, not calculation. Here we have the same information presented graphically as well as in table form. On the top is a table showing a patient’s creatinine values and renal function over several years. On the bottom are the same values graphed, allowing one to see patterns at a glance. This is why visual display of data is often so important. 1.2 Figure: Senathirajah, Y. (2010).
25 Usability Summary. Usability for data analytics can be very important for information understanding. Determining the usability of information can be complex. Presenting the information to minimize cognitive load requires the use of basic readability and information-formatting principles. This concludes Component 24, Health Care Data Analytics, Unit 9, Usability. In summary, usability for data analytics is similar to usability in other areas of health information technology, and it can be very important for true information understanding. Determining the true usability of information can be a very complex undertaking. Presenting information so that it minimizes cognitive load requires the use of basic readability and information-formatting principles.
26 Usability References. References: Bubb, H. (2012). Information Ergonomics. In Herczeg, M., & Stein, M. (Eds.), Information Ergonomics: A theoretical approach and practical experience in transportation. Berlin Heidelberg: Springer, p. 23. Kaplan, B., & Harris-Salamone, K. (2009). Health IT Success and Failure: Recommendations from Literature and an AMIA Workshop. J Am Med Inform Assoc, 16. Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., & Mack, R. L. (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons. Polson, P., Lewis, C., Rieman, J., & Wharton, C. (1992). Cognitive walkthroughs: A method for theory-based evaluation of user interfaces. International Journal of Man-Machine Studies, 36, 741–773. Charts, Tables, and Figures: 1.1 Figure: Preece, J., Rogers, Y., & Sharp, H. (2007). Interaction Design: Beyond Human-Computer Interaction (2nd ed.). West Sussex, England: Wiley. 1.2 Figure: Senathirajah, Y., Kaufman, D., & Bakken, S. (2010, November). Cognitive analysis of a highly configurable web 2.0 EHR interface. AMIA Annu Symp Proc, 732. 1.1 and 1.2 Charts: National Highway Traffic Safety Administration. (2012). Traffic Safety Facts. Retrieved from https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812032 No Audio.
27 Health Care Data Analytics, Usability. This material was developed by The University of Texas Health Science Center at Houston, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 90WT0006. No Audio.