IT, Risks and Ergonomics
The Modern Problem
Over recent centuries, technology has developed at a frantic pace - the top inventions of today will be museum pieces tomorrow. Yet the more technologically developed a society is, the more vulnerable it is to disruptions and accidents.
After all, humans have not evolved alongside technology. A legend about one of the earliest 'cars', the steam-powered 'truck' built by Nicolas-Joseph Cugnot around 1771, tells of what is considered one of the first known traffic accidents: one of Cugnot's prototypes accidentally rammed a stone wall of the Arsenal, knocking it over (with apparently no harm to the machine). Now imagine a modern Porsche Carrera hitting the same wall...
Or we can cite Kurt Vonnegut's Breakfast of Champions:
"A flying saucer creature named Zog arrived on Earth to explain how wars could be prevented and how cancer could be cured. He brought the information from Margo, a planet where the natives conversed by means of farts and tap dancing. Zog landed at night in Connecticut. He had no sooner touched down than he saw a house on fire. He rushed into the house, farting and tap dancing, warning the people about the terrible danger they were in. The head of the house brained Zog with a golfclub."
Finally, Russians have a proverb: "We wanted the best, but it turned out as always".
Risky Business
There is no such thing as 'zero risk' - even the most advanced technology is no match for the human capacity for creative error and malice (e.g. using passenger planes as cruise missiles). Risks can be reduced, though, mostly by addressing the human side. Most related courses so far have addressed environmental safety and ergonomics, while IT has often received only superficial attention. This may also be why the "Risk and Safety Management" course in the initial year of our college (2000) was met with hostility - the students had to pay a substantial tuition fee back then and did not tolerate 'being fed some pointless crap'. They mistook a poorly chosen approach to the subject for the whole subject being pointless. Too many people consider a plane crashing into a house a disaster, but a crashed server just an "IT problem". But what if the plane crashed because the server did? While IT risks have some specific traits, much is shared with other areas.
Some examples of IT risks:
- everything about electronic/online elections - the infamous Diebold machines in the US, the first years of Estonian e-elections being "Microsoft exclusive" (the voting software only worked in Internet Explorer), etc.
- the world of social engineering and online scams
- exposure of directly dangerous information (bomb-building instructions, torture/murder techniques etc)
- malware
- cyberwar and terror in its many forms
An important way of reducing risks is to make things easier and more comfortable to use. This is where ergonomics comes into play.
Making things comfortable
Ergonomics comes from the Greek ἔργον, meaning "work", and νόμος, meaning "natural law" - thus literally meaning "laws of work". Besides different aspects of IT, it involves biomechanics, physiology, psychology and a number of other disciplines. Yet the common (albeit a little simplified) definition could be "the art of making things comfortable". Ergonomic furniture is comfortable (functional, durable and aesthetically pleasing), so are ergonomic clothes (fit the person well, use quality materials) and ergonomic IT (easy to learn, well-documented, unambiguous). This is often just what is needed to reduce risks.
When someone says "man-machine system", we likely envision a cyborg, or at least the archetypal "computer nerd" using a toilet seat for sitting in order to cut back on toilet breaks. But actually, such a system is formed every time we sit at the computer and dismantled when we leave - and errors may occur on both sides.
Receiving information
Most people receive the bulk of their information visually, so it should be in a form understandable to as many people as possible. A good example is the pictograms used in international airports and other facilities that see many people from different backgrounds. Still, in order to receive visual information unhindered, we need:
- optimal space - the viewer must be far enough from the object to see the whole image, yet close enough not to miss any details
- optimal lighting - the object must be properly illuminated; in some cases, inner lighting is necessary (e.g. emergency lights in rooms without windows)
- optimal size - the visual information must be sized according to the surroundings
- intuitiveness/understandability - the symbols used must be easy to understand. While it sounds self-explanatory, a native from Brazilian heartland may have never seen a modern airliner, so the easy-for-us plane symbol can be totally alien to him/her. Or there are symbols which have different meanings in various parts of the world. A good example is the gesture forming a circle with thumb and index finger. Most Western countries use the American meaning of "OK sign", but the same symbols means money in Japan, and in some parts of the world it can land the user into trouble.
From the IT point of view, these requirements mostly mean adjusting the display to user needs, or using parameters that suit most users:
- space - the system considers the user's distance from the screen and adjusts the output accordingly
- lighting - both outer (room) and inner (screen lighting) and their combined effect; this also includes special cases like violent bursts of light in some action games that can trigger epileptic seizures in some people
- intuitiveness - uniform positions and shapes of the main user interface elements (windows, menus, icons), but also unambiguity and inoffensiveness of the metaphors used (e.g. the "thumbs-up" symbol is used for "Like" on some websites, but has a nasty meaning in some cultures).
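The "space" and "size" points above can even be quantified with the notion of visual angle: text is comfortable to read when its characters subtend a large enough angle at the viewer's eye. A minimal sketch in Python, assuming a 20-arc-minute guideline (the exact recommended figure varies between sources, so treat it as an adjustable parameter):

```python
import math

def char_height_mm(distance_mm: float, arc_minutes: float = 20.0) -> float:
    """Character height needed so that text subtends the given visual angle.

    arc_minutes=20 is a commonly cited comfortable-reading guideline;
    the exact figure is an assumption here, not a quoted standard.
    """
    theta = math.radians(arc_minutes / 60.0)   # arc minutes -> radians
    # Simple trigonometry: height = 2 * distance * tan(angle / 2)
    return 2.0 * distance_mm * math.tan(theta / 2.0)

# A viewer about 600 mm from the screen needs characters roughly 3.5 mm tall:
print(round(char_height_mm(600), 1))
```

The same relation also works in reverse: doubling the viewing distance doubles the character height required, which is why output should scale with the user's distance from the screen.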
However, there are cases where visuals do not work - either one of the conditions above is seriously lacking, or the user is visually impaired (we will come to this scenario later).
"What button is that?": switches, pointers and control
For input, today's technology mostly uses two major categories of controls: switches and pointers. In IT, a computer keyboard can be viewed as a set of switches, while the mouse is a typical pointing device. In all of these, ergonomics plays an important part. While "human inside a machine" may seem a cliché from science fiction movies (e.g. Luke and Han manning the gun turrets of the Millennium Falcon, fighting off attacking fighters in the original Star Wars), such examples actually carry an important notion - controls must be placed according to human physiology. Earthly examples include anti-aircraft guns on battleships (which actually served as models for the Star Wars scenes) or, in more modern times, the Head-Up Displays (HUDs) of today's fighter planes. Coming back to IT, we can look at the shape of the mouse (fitting the palm), the spacebar on keyboards, or the power switches on desktop computers - these have come a long way from inaccessible back corners to front panels, one possible reason being the changing role of computers from hi-tech equipment to everyday items.
Some IT-related remarks:
- again, the placement of controls - but this also includes furniture, light sources etc. Placement of the monitor and keyboard/mouse are clear cases - but what about the desktop case itself? This may well depend on, for example, whether we need to burn a lot of DVDs...
- choice of controls - e.g. a specialized vs a standard keyboard, a mouse vs a trackball, but also special access devices (like Braille technology), or furniture customized for small people or Shaquille O'Neal types
- clearly distinguished alternatives - a good example is the ordinary keyboard, where three special keys - Enter, Esc and space - are located as far from each other as possible and have different (tactile) shapes. User interfaces often stress such differences with position (left vs right), colour (green vs red) and shape, in addition to the written label.
Availability, usability and standards compliance
These three concepts are well known to security professionals, but are also used in a wider context:
- Availability means that the object is in a functional state at a given moment (e.g. a system keeps working even under a heavy load)
- Usability is the ease of use and learnability of the object, making it accessible to a wide circle of users (including elderly or disabled people, children etc)
- Standards compliance allows use of objects of the same class interchangeably, making them easy to replace and promoting better usability (if the user can use one, he/she can likely use objects in compliance with the same standard).
Interaction with devices
While the nature of interaction has grown more complex, the main points are still the same:
- strictly goal-oriented - small talk with devices is not (yet) a possibility
- both input and output must be suitable for the task at hand (e.g. printing out a video makes no sense)
- non-linear - may have choices, iterations and steps back
- commented - should give hints and explanations
- controlled by the human - e.g. a time limit for reading a text is not a good idea
- input and output chosen by the human - e.g. a blind user can pick Braille or sound instead of visuals
- suitable for user - age, education, profession etc.
- robust - should be highly error-tolerant
- polite - should not judge or nag the user
- instructive - should give feedback that helps correcting mistakes
- using suitable language - e.g. a commanding style ("Now do that") may support newbies but irritate advanced users
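Several of these points - robustness, instructive feedback, politeness and human control - can be illustrated with a small input routine. A hypothetical Python sketch (the function, its prompts and its limits are invented for illustration):

```python
def ask_age(ask=input, tell=print):
    """Minimal dialogue loop illustrating the interaction principles:
    robust (tolerates bad input), instructive (explains the mistake),
    polite (no blame), and human-controlled (user may quit at any time)."""
    while True:
        raw = ask("Your age (or 'q' to quit): ").strip()
        if raw.lower() == "q":      # the human stays in control
            return None
        try:
            age = int(raw)
        except ValueError:
            # Instructive and polite: name the problem, suggest a retry
            tell(f"Sorry, '{raw}' is not a whole number. Please try again.")
            continue
        if not 0 < age < 130:       # an assumed plausibility range
            tell("That age seems out of range; please check it.")
            continue
        return age
```

Passing `ask` and `tell` as parameters instead of calling `input`/`print` directly is a small design choice that makes the loop easy to test and to rewire for non-visual channels (e.g. speech output for a blind user).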
Two more points:
- As in other kinds of communication, the role of the channel (or interaction mechanism) should not be overlooked.
- Notably, people with disabilities are often a good 'litmus test' of usability, including in interaction design. Solutions designed to be accessible from the beginning are very likely to have good overall usability. Also, accessibility considerations implemented in the early stages of design add little to the cost of the product.
Alarm, alarm!
While risks are generally viewed as potential hazards, actual danger scenarios should be analyzed and prepared for as well. A negative scenario may involve a bunch of fanatics on a plane or a broken hard disk - but a skilled and calm response can reduce the damage significantly.
One of the most important factors is the signal itself - it should be clear, concise and informative, not panicky. Compare two error messages:
- "Alarm!!!! You computer has a virus!!! Switch off the machine AT ONCE or you will lose EVERYTHING!"
- "Attention! The CIH-4092 virus has been discovered in your computer. It can attack the file system and make a part of the disk unreadable. Try to run F-Prot or Eset, be sure to inform the tech department at 555-2424."
All alarms should be clearly distinguishable from the background, preferably also addressing several senses at once (most typically visual + audible - but remember that people have different abilities).
Finally, many alarms are only forwarded to those who can influence the situation. Ships and planes have innocent-sounding code phrases that only inform the crew who then can act without panicking passengers in the way. Likewise, IT alarms should typically involve administrators and other staff, not ordinary users.
Conclusion
Technology keeps evolving; humans... not so much. Thus "softer" disciplines like ergonomics and human-computer interaction design play an increasingly important role in reducing IT risks.