User needs, and Strategies for addressing those needs
This is a publication of the Trace Research and Development Center which is funded by the
National Institute on Disability and Rehabilitation Research of the Department of Education
under grant number H133E30012. The opinions contained in this publication are those of the
grantee and do not necessarily reflect those of the Department of Education.
Copyright 1998, Trace Center, University of Wisconsin-Madison, USA.

Table of Contents
User Needs
Strategies for addressing users' needs
Quick-reference Table
For more information
User Needs
Humans possess many abilities for interacting. Sometimes these abilities are reduced through
environmental factors, injury, disability, or natural degradation from aging. The key to accessing
electronic devices is to use the other abilities that we have when the preferred abilities are not
available to us:
Seeing
Seeing enables us to use visual interfaces such as computer screens, VCR
programming etc., and to be able to locate buttons. For example, being able to tell if
a number pad has the "1" key in the top left or bottom left.
An inability to see anything on an electronic device might come from it being dark,
from our eyes being occupied elsewhere (e.g. while driving a car), or from disease or
injury which causes blindness. Alternative means of interacting in this case include
speech input and output, using lists of interface elements (instead of having to feel
around for them), and tactile displays such as Braille.
A difficulty in seeing electronic devices might result from it being too dark, from us
leaving our glasses behind, from having color vision deficiencies, or from disease or
injury, which can create a very wide range of visual abilities (from losing half our
vision, to having patchy or blurred vision, to having tunnel vision etc.). Alternative
means of interacting in this case include those for an inability to see (speech input
and output, using lists, and tactile displays) as well as changing the way something
looks (enlarging text, changing colors or font types etc.).
Hearing
Hearing enables us to hold spoken conversations (for example, with another person
on the telephone), to listen to music, to watch TV, and to hear devices that beep to
tell us when something is finished (like microwave ovens).
An inability to hear the sounds coming from an electronic device might come from
working in a noisy office or on a shop floor, from being in a noisy bar, or from disease
or injury which causes total hearing loss. Alternative means of interacting in this case
include showing visual events for audible events (e.g. a flashing light as well as a
beep) and giving written text and annotation for spoken text and incidental sounds
(e.g. closed captioning).
A difficulty in hearing the sounds coming from an electronic device might come from
being in a moderately noisy environment, or in a necessarily quiet environment (e.g.
a library), or from disease or injury which causes decreased hearing (e.g. conductive
hearing loss) or interrupted hearing (e.g. tinnitus, a constant ringing sound in the
ears). Alternative means of interacting in this case include those for an inability to
hear (visual events for audible events), changing the way something sounds (for
example making it louder or changing the pitch), or connecting it directly to a hearing
aid.
Speaking
Speaking enables us to hold face-to-face or telephone conversations and to give
speech input (for example, to place a collect call using a computerized operator).
A difficulty or inability to speak (and be understood clearly by the listener) might
come from being in a noisy bar, or from being in a quiet library, from just returning
from dental surgery, from having a strong accent or dialect, or from disease or injury
which prevents the mouth from functioning in the normal way. Alternative means of
interacting with electronic devices in this case include using a keyboard or keypad to
enter information instead of or in conjunction with speech input.
Touching, manipulating
Being able to touch and manipulate items with our hands enables us to press
buttons, to move switches and dials, to make gestures etc.
An inability to touch and manipulate might come from being too far away (e.g. too
high or too far to reach, like a small child at a vending machine), or from disease or
injury which causes an inability to reach (e.g. paralysis below the neck). Alternative
means of interacting in this case include using remote-control devices, and using
speech input.
A difficulty in touching might come from having gloves on in cold weather, or from
disease (e.g. cerebral palsy, which might make hand control difficult) or injury (e.g.
having a big bandage on one's hand). Alternative means of making input in this case
include those for an inability to touch (remote control, speech input) and being able
to confirm selections (e.g. giving speech or highlighting feedback, and then requiring
confirmation of that selection using another button that is separate and distinct).
Understanding
Our ability to understand something is determined by the skills and knowledge we
possess to interpret and process what we experience. The ability to understand can
be reduced by such factors as load (e.g. how many tasks we are doing at one time),
stress (e.g. if we are panicking or under time pressure), or fatigue (e.g. being
awake too long or expending too much mental effort). Understanding can also be
hindered when something is not communicated at our level (e.g. an engineer might
understand a term that a layman does not; a physician might understand a
medical term that an engineer does not). Reduced ability to understand can also
come from disease or injury which affects mental processes. Alternative means of
interacting in this case include changing the way something looks (e.g. colors, text
sizes and fonts), changing the level of the language (e.g. simplification), changing
the way speech is presented (e.g. making it faster or slower), and using a remote
control which simplifies the interface.
Combinations of different needs
In addition, people might have combinations of reduced abilities, for example not
being able to see and hear well. These combinations of reduced abilities require us
to use our other abilities in sometimes unusual or innovative ways. There may be a
need to use a combination of the strategies above, or even different strategies
altogether.
Language
Our ability to understand each other comes from having a common language.
However, when we move to a foreign country, whether for a short period (a vacation
or business trip) or a longer one, we may be unable to understand the local
language, or at least have some difficulty in understanding it.
The need for general good design
Note that the alternative means of interacting with electronic devices described here
can be augmented by good design: for example, speech input might be an
alternative when something is too far to reach, but if it is possible to move the
interface elements within easy reach of everyone, then that is a far better and more
universally acceptable solution.
Strategies for addressing users' needs
There are many different ways to change the way an electronic device behaves to take account
of the varying needs of users. The following describes strategies for addressing the needs of
users with a wide variety of abilities and limitations.
If the user cannot see the device, make it say things so they can
use their ears
Things that cannot be seen can be said, using synthesized or pre-recorded speech.
For example, items on a touchscreen interface can be pressed and their name and
contents can be said aloud; button names on a device can be said aloud so that a
user can explore an interface before selecting items. Because of the need to
explore the interface by ear, it is important that items touched are not immediately
selected, and an alternative means of selection is necessary: this can be done by
requiring confirmation with another button (one that is off to the side or edge and can
easily be found and used without pressing other buttons), or by pressing and holding
the button down for a short delay time until it is selected.
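This touch-to-explore behaviour can be sketched as a small state machine. The Python below is only an illustration of the idea described above, not any particular product's code; the speak function, the item names, and the 1.5-second hold delay are placeholder assumptions.

```python
import time

HOLD_TO_SELECT_SECONDS = 1.5  # assumed hold delay; a real device would let users adjust this

def speak(text):
    """Placeholder for synthesized or pre-recorded speech output."""
    print(f"[speech] {text}")

class TalkingPanel:
    """Touching announces an item; selection needs a long hold or the separate confirm key."""

    def __init__(self, items):
        self.items = items          # item name -> action
        self.last_touched = None
        self.touch_started = None

    def touch(self, name):
        # Announce the item under the finger without activating it.
        self.last_touched = name
        self.touch_started = time.monotonic()
        speak(name)

    def release(self):
        # Releasing after a long hold counts as a deliberate selection.
        if self.last_touched and self.touch_started:
            if time.monotonic() - self.touch_started >= HOLD_TO_SELECT_SECONDS:
                self._select(self.last_touched)
        self.touch_started = None

    def confirm(self):
        # A separate confirm key (off to the side) selects the last announced item.
        if self.last_touched:
            self._select(self.last_touched)

    def _select(self, name):
        speak(f"{name} selected")
        self.items[name]()

if __name__ == "__main__":
    panel = TalkingPanel({"Deposit": lambda: speak("Starting deposit"),
                          "Withdraw": lambda: speak("Starting withdrawal")})
    panel.touch("Withdraw")   # announced only
    panel.release()           # released too quickly: nothing is selected
    panel.confirm()           # explicit confirmation selects it
```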
If the user cannot understand things that are said by the device, let
them change the way it says it
Synthesized speech is malleable in the same way that text is: it can be made faster
or slower, the pitch can be raised or lowered, and the basic voice can be altered
(e.g. male, female, or robotic).
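As a rough sketch of what "changing the way it says it" can mean in practice, the illustrative Python below groups the adjustable properties into a settings object handed to whatever speech engine the device uses; the property names and default values are assumptions, not a real TTS API.

```python
from dataclasses import dataclass

@dataclass
class SpeechSettings:
    """User-adjustable properties of the synthesized voice (values are illustrative)."""
    words_per_minute: int = 160   # slower or faster speech
    pitch: float = 1.0            # 1.0 = default; lower is deeper
    voice: str = "female"         # e.g. "male", "female", "robotic"

def speak(text, settings):
    """Stand-in for a speech engine call; a real device would pass these
    settings to its own synthesizer."""
    print(f"[{settings.voice}, {settings.words_per_minute} wpm, "
          f"pitch {settings.pitch}] {text}")

if __name__ == "__main__":
    settings = SpeechSettings()
    speak("Please select an option.", settings)

    # The user asks for slower, deeper speech they can follow more easily.
    settings.words_per_minute = 110
    settings.pitch = 0.8
    speak("Please select an option.", settings)
```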
If the user has difficulty seeing the device, let them change the way
it looks
Text is malleable depending upon the constraints of the visual interface. For example,
fonts can be enlarged, changed between serif and sans serif, made white on black or
any other color combination. A visual interface may be constrained in the maximum
size of the text, the colors that are available, and the clarity (resolution) possible.
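A similar sketch applies to the visual side: the illustrative settings object below models enlargeable text and invertible colors within an assumed maximum size imposed by the display. All names and limits are placeholders, not a real display API.

```python
from dataclasses import dataclass

MAX_POINT_SIZE = 48  # assumed hardware limit; real displays constrain size, colors, and resolution

@dataclass
class DisplaySettings:
    point_size: int = 12
    foreground: str = "black"
    background: str = "white"
    sans_serif: bool = True

    def enlarge(self, factor=2):
        # Grow the text but respect the display's maximum size.
        self.point_size = min(self.point_size * factor, MAX_POINT_SIZE)

    def invert(self):
        # Swap colors, e.g. white on black for users who find that easier to read.
        self.foreground, self.background = self.background, self.foreground

if __name__ == "__main__":
    settings = DisplaySettings()
    settings.enlarge()
    settings.invert()
    print(settings)  # point_size=24, foreground='white', background='black', ...
```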
If the user cannot locate the buttons on the device, let them use a
list with only 3 buttons
If the user is unfamiliar with or cannot reliably find or remember where all of the
buttons are on an interface, an alternative is to put all of the items onto a list. The list
has all of the interface items (buttons, switches, text etc.) arranged logically from top
to bottom. The list can be shown visually or auditorily. The list can be accessed by
using 3 buttons (up / down / select), or by sliding a finger along an edge (e.g. a
touchscreen). The list works because it takes a two-dimensional interface and makes
it one-dimensional. Although this is a cognitively more complex interface strategy, it
does allow access by people who are unable to locate interface elements
independently.
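A minimal sketch of such a list mode, assuming three inputs (up, down, select) and a placeholder speak function standing in for the auditory or visual presentation, might look like this:

```python
def speak(text):
    """Placeholder for speech output; a visual device could highlight instead."""
    print(f"[speech] {text}")

class ListMode:
    """All interface items flattened into one list, driven by up / down / select."""

    def __init__(self, items):
        self.items = items   # (label, action) pairs, ordered top to bottom
        self.index = 0
        self._announce()

    def _announce(self):
        label, _ = self.items[self.index]
        speak(f"{label}, item {self.index + 1} of {len(self.items)}")

    def up(self):
        self.index = (self.index - 1) % len(self.items)
        self._announce()

    def down(self):
        self.index = (self.index + 1) % len(self.items)
        self._announce()

    def select(self):
        label, action = self.items[self.index]
        speak(f"{label} selected")
        action()

if __name__ == "__main__":
    menu = ListMode([("Check balance", lambda: speak("Balance is shown")),
                     ("Withdraw cash", lambda: speak("Enter an amount")),
                     ("Exit", lambda: speak("Goodbye"))])
    menu.down()    # "Withdraw cash, item 2 of 3"
    menu.select()  # activates the highlighted item
```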
If the user has difficulty hearing the device, let them change the way
it sounds
Sound contains properties that can be altered, such as volume (loudness) and pitch.
Modifying these can help users who are unable to hear a device operating normally.
In addition, it is possible to directly connect hearing aids to sound sources, providing
a better listening system (e.g. headphone jack connection or telephone hearing aid
T-coil connection).
If the user cannot hear the sounds from the device, show the
sounds visually
Any sounds that a device makes can be shown visually, for example by making a
display or indicator light flash when a sound is made. Spoken text and sounds can
be shown in "caption" form, enabling someone who cannot hear at all to have access
to the same information as people who can easily hear.
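The essential pattern is to route every audible event through one place that also produces a visual equivalent. The sketch below illustrates this idea with placeholder functions for the tone, the flashing indicator, and the caption; none of these names come from a real device.

```python
def play_tone(name):
    """Stand-in for the device's normal audible output."""
    print(f"[sound] {name}")

def flash_indicator():
    """Stand-in for flashing a light or the whole display."""
    print("[visual] indicator flashes")

def show_caption(text):
    """Stand-in for an on-screen caption of speech or incidental sounds."""
    print(f"[caption] {text}")

def notify(sound_name, caption):
    """Every audible event is mirrored visually so it is never sound-only."""
    play_tone(sound_name)
    flash_indicator()
    show_caption(caption)

if __name__ == "__main__":
    notify("beep", "Cooking finished")
    notify("spoken prompt", "Please take your card")
```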
If the user cannot be sure of pressing the right button, allow them to
confirm button presses
If someone cannot reliably press individual buttons (for example because the buttons
are too small), then it might be easier to confirm button selections. This can be done
by highlighting, or saying aloud with synthesized or pre-recorded speech, which
button was pressed. For example, when using a cellular phone outside in the winter
with large gloves on, you could aim for the "dial" button but miss and hit the
"cancel" button; because no button is acted on until it is confirmed, nothing happens.
Instead, you try again until the "dial" button is highlighted, then press the confirm
button, which is off to the side of the phone away from all of the other buttons.
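A sketch of this confirm-before-acting pattern, using the gloved-hands phone example above with invented names throughout, might look like the following:

```python
def announce(text):
    """Placeholder feedback: highlight on screen and/or speak the key name."""
    print(f"[feedback] {text}")

class ConfirmedKeypad:
    """Key presses are only provisional until the separate confirm key is pressed."""

    def __init__(self, actions):
        self.actions = actions       # key label -> action
        self.pending = None

    def press(self, key):
        # A press (possibly a mis-hit with gloves on) just highlights the key.
        self.pending = key
        announce(f"{key} highlighted, press confirm to accept")

    def confirm(self):
        # The confirm key sits apart from the others so it is hard to hit by mistake.
        if self.pending:
            announce(f"{self.pending} confirmed")
            self.actions[self.pending]()
            self.pending = None

if __name__ == "__main__":
    phone = ConfirmedKeypad({"dial": lambda: announce("Dialing..."),
                             "cancel": lambda: announce("Call cancelled")})
    phone.press("cancel")   # missed the key: nothing irreversible happens
    phone.press("dial")     # try again until the right key is highlighted
    phone.confirm()         # only now is the call placed
```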
If the user cannot provide speech input, allow them to use buttons
instead
If someone cannot speak but an interface uses speech input, an alternative can be to
press buttons or keys. For example, at the beginning of a computer-controlled
operator telephone call, a starting prompt could be "press '1' to use your
touch-tone phone to control this call, or say 'OK' now to use speech control".
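A minimal sketch of accepting either kind of input for the same choice (the prompt wording, inputs, and return values are illustrative only) could be:

```python
def handle_opening_prompt(user_input):
    """Accept either a touch-tone key or a spoken keyword for the same choice.

    user_input is whatever the front end detected: a DTMF digit such as "1"
    or a recognized word such as "OK" (both values are illustrative).
    """
    if user_input.strip() == "1":
        return "touch-tone control"
    if user_input.strip().upper() == "OK":
        return "speech control"
    return "repeat prompt"

if __name__ == "__main__":
    print(handle_opening_prompt("1"))    # touch-tone control
    print(handle_opening_prompt("ok"))   # speech control
    print(handle_opening_prompt("..."))  # repeat prompt
```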
If the user cannot reach or touch the device, let them give
commands by speech
If someone is unable to reach or see an interface, they can control it using speech
input together with appropriate output (either spoken or visual). Words
spoken by the user are interpreted by the device and used as commands to control
the interface.
If the user can see, but can only use one or two switches for input,
let them step around the buttons using scanning
If someone can only use one or two switches (for example if they are paralyzed from
the neck down), it is possible to control the interface by having each item highlighted
(or said aloud) one by one. When the one that the user wants is highlighted, they
can select it using a single switch. With two switches they can use one switch to
advance the highlight and the other to select. The latter gives more flexibility and
control, but not everyone can use two switches, which is why single-switch scanning
is also available. Note: it is possible to scan using auditory feedback, but it is more
likely that such a user would use speech output and a list to interact with the device.
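The scanning pattern itself is simple to sketch. In the illustration below, advance stands in for either the automatic timer of single-switch scanning or the first switch of two-switch scanning; all names are placeholders, not any particular device's code.

```python
def highlight(label):
    """Stand-in for visually highlighting (or speaking) the current item."""
    print(f"[highlight] {label}")

class Scanner:
    """Steps a highlight through the interface items one by one."""

    def __init__(self, items):
        self.items = items   # (label, action) pairs
        self.index = 0
        highlight(self.items[self.index][0])

    def advance(self):
        # In single-switch scanning a timer calls this automatically;
        # in two-switch scanning the first switch calls it.
        self.index = (self.index + 1) % len(self.items)
        highlight(self.items[self.index][0])

    def select(self):
        # The only (or second) switch activates whatever is highlighted.
        label, action = self.items[self.index]
        print(f"[select] {label}")
        action()

if __name__ == "__main__":
    scanner = Scanner([("Channel up", lambda: print("channel + 1")),
                       ("Channel down", lambda: print("channel - 1")),
                       ("Power off", lambda: print("powering off"))])
    scanner.advance()   # highlight moves to "Channel down"
    scanner.select()    # the user hits their switch to pick it
```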
If the user wants to use their own customized type of input and
output, let them use a remote control
If a user cannot reach an interface, they can control it remotely using an infrared link.
The infrared link works in a standard way so that the remote control can be used to
control an electronic device. The user points the remote control at the device, and
the device sends the remote control its available commands. The
commands can be accessed as a list (which enables them to be converted to
hand-held speech output, Braille output, large text output etc.), or as a graphical
image with buttons similar to those on the device itself. The remote control can be
configured to meet the needs of the individual user: different displays and levels of
information can be shown on the remote control, making the interface simpler to use.
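The sketch below illustrates only the general idea described above, in simplified form: the device advertises its command names, and the user's own remote presents them in whatever form suits that user and sends back the chosen command. The infrared transport is omitted, and all class and method names are invented for the example.

```python
class Device:
    """An appliance that can describe its own commands to a remote control."""

    def __init__(self, name, commands):
        self.name = name
        self.commands = commands   # command name -> action

    def list_commands(self):
        # Sent to the remote when it points at the device.
        return list(self.commands)

    def execute(self, command):
        self.commands[command]()

class PersonalRemote:
    """The user's own controller, configured for their preferred output."""

    def __init__(self, present):
        self.present = present     # e.g. a speech, Braille, or large-text renderer

    def operate(self, device, choose):
        options = device.list_commands()
        for i, name in enumerate(options, 1):
            self.present(f"{i}. {name}")
        device.execute(choose(options))

if __name__ == "__main__":
    thermostat = Device("Thermostat", {
        "Raise temperature": lambda: print("temperature + 1"),
        "Lower temperature": lambda: print("temperature - 1"),
    })
    # This user's remote speaks the list aloud (print stands in for speech output)
    # and here simply picks the first command to keep the example self-contained.
    remote = PersonalRemote(present=lambda line: print(f"[speech] {line}"))
    remote.operate(thermostat, choose=lambda options: options[0])
```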
If the user cannot use the standard language, let them change the
language to one they can understand
All of the above means of interaction are of no use if the standard language of the
device is not the user's own and they cannot understand it, or have some difficulty in
understanding it. Multi-language support will depend upon the location of the device:
for example, in the USA the two predominant languages are English and
Spanish, but if a public information device were placed in a tourist area, then
Chinese, Japanese, German, French, Italian etc. would be useful to allow visitors
access to the information.
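A minimal sketch of per-language interface text with a fallback language (the language codes, strings, and fallback choice are illustrative only) could be:

```python
# Illustrative message table; a real kiosk would load its full interface text
# for each supported language rather than a single string.
MESSAGES = {
    "en": "Welcome. Touch the screen to begin.",
    "es": "Bienvenido. Toque la pantalla para comenzar.",
    "fr": "Bienvenue. Touchez l'écran pour commencer.",
}

def welcome(language):
    """Fall back to English when the requested language is not installed."""
    return MESSAGES.get(language, MESSAGES["en"])

if __name__ == "__main__":
    print(welcome("es"))
    print(welcome("ja"))   # not installed here, so the English text is shown
```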
Quick-reference Table
How and what to change on an electronic device to make it usable by someone with reduced
abilities... The following quick reference table is also available as linear text.
Can't see the device well
  Why can't I do it: too dark; too small; left glasses at home; low vision
  Solutions: use voice output; or, change the display settings; and/or, allow speech input

Can't see the device at all
  Why can't I do it: eyes on the road; eyes occupied elsewhere; too dark; blindness
  Solutions: use voice output; and/or, allow speech input

Can't locate buttons on the device (can't see where they are)
  Why can't I do it: eyes on the road; eyes occupied elsewhere; too dark; blindness
  Solutions: use a list mode requiring only 3 commands; or, enable users to confirm their selections

Can't read the device well
  Why can't I do it: unfamiliar language; dyslexia; poor cognitive abilities
  Solutions: use voice output; or, change the display settings

Can't read the device at all
  Why can't I do it: unfamiliar language; poor cognitive abilities
  Solutions: use voice output

Can't hear the device well
  Why can't I do it: in a noisy office; in a noisy bar; tinnitus; hard of hearing; deaf
  Solutions: show sounds visually; or, change the sound settings; or, use assistive listening systems

Can't hear the device at all
  Why can't I do it: in a noisy bar; tinnitus; deaf
  Solutions: show sounds visually

Can't hear the device well and can't see the device well
  Why can't I do it: left glasses at home and hearing aid batteries just ran out; deaf-blind
  Solutions: use voice output and change the sound settings; or, enlarge text on the display and change the sound settings; or, change the sound settings and change the display settings; or, use a remote control with a tactile display (e.g. Braille)

Can't reliably press only one button at a time on the device
  Why can't I do it: gloves; bandages; poor muscle control
  Solutions: enable users to confirm their selections; or, use voice output

Can't touch the device / can't reach the device
  Why can't I do it: too far away; too high; too low; too much to fit on one desk; poor reach capabilities
  Solutions: use a remote control; or, allow speech input

Can only use one or two buttons on the device
  Why can't I do it: muscles only allow limited movement; can't move below the neck
  Solutions: allow scanning input; or, allow speech input

Can't talk to the device
  Why can't I do it: chewing; just back from the dentist; in a noisy bar; strong accent or dialect; unable to speak; poor speech
  Solutions: allow an alternative to speech input

Can't understand the device
  Why can't I do it: unfamiliar accent; unfamiliar dialect; unfamiliar topic / level of detail; poor cognitive abilities
  Solutions: change the settings for voice output; or, change the settings for the display; or, use a simplified remote control

Can't understand the language
  Why can't I do it: on vacation or business in a foreign country
  Solutions: change the language

For more information
For more information on this document contact:
Chris Law Trace R&D Center
2107 Engineering Centers Bldg.
1550 Engineering Dr.
Madison, WI 53706
Tel: 608 263-2309
TTY: 608 263-5408
Fax: 608 262-8848
E-Mail: info@trace.wisc.edu