Technology and Society
Saturday, March 23, 2013
The User Experience Equation
User Experience (UX) is emerging as the umbrella term for the aspects of design that concern the user. Two components determine UX: the user and the product interface.
Tuesday, September 20, 2011
Reading Notes: Complications
Complications is Atul Gawande's first book of insider stories and observations on the "imperfect science" of medicine. As the title implies, the focus is on areas of medicine, surgery in particular, where things are not straightforward: where human doctors make human mistakes, and where subjects such as pain, nausea, and obesity leave us with mysteries to be solved. While I found the whole book interesting and enlightening, the first section in particular has a lot to say about Human Factors in healthcare settings.
The biggest take-away point from Complications is that there is a sentiment in America that doctors are superhuman, trained and vetted so thoroughly that they can do no wrong, and that if they do wrong, they must be bad doctors. Gawande argues that all doctors make mistakes, and he recounts some of his own. The problem with the pervasive attitude of the infallible superdoctor is that it gets in the way of developing best practices to deal with the reality of the all-too-human doctor. Latent errors, that is, errors that are "waiting to happen" due to an imperfect system component, can be reduced if they are not written off as inevitable. Mistakes can be analyzed and accounted for if they are given the chance to be recognized and addressed.
The aviation industry has reduced the frequency of operational errors to one in a hundred thousand flights, and most of those errors have no harmful consequences. . . . Of course, patients are far more complicated and idiosyncratic than airplanes . . . Yet everything we've learned in the past two decades—from cognitive psychology, from "human factors" engineering, from studies of disasters like Three Mile Island and Bhopal—has yielded the same insights: not only do human beings err, but they err frequently and in predictable, patterned ways. And systems that do not adjust for these realities can end up exacerbating rather than eliminating error.
Gawande also notes, recounting the observation of James Reason in Human Error, that:
Disasters do not simply occur; they evolve. In complex systems, a single failure rarely leads to harm. Human beings are impressively good at adjusting when an error becomes apparent, and systems often have built-in defenses. . . . When things go wrong, it is usually because a series of failures conspires to produce disaster.
Saturday, August 20, 2011
Brief Thoughts - The User's Perspective
Can you see where the USB ports are?