
Oral Exam of Stefan Marti

Feb 5th, 2002, 13:00-16:00, MIT Media Lab

Main area:

User Interface Design for Small Mobile Communication Devices

Chris Schmandt

Contextual: Human Interaction with Autonomous Entities

Technical:

Common Sense Reasoning and Intelligent User Interfaces

DCGS representative:

Bradley Rhodes

Henry Lieberman

Brian Smith

All materials related to this Qualifying Exam, including the paper on which this presentation is based, are at:
http://web.media.mit.edu/~stefanm/generals/

User Interface Design of Mobile Devices and
Social Impact of Mobile Communication:
How do they interact?

Structure of the talk


Two parts:
First part: theoretical foundations
Related work
My approach
Second part: suggestions, results
Relations between social phenomena and user interface design

Motivation: Why is this interesting at all?


Why should we care?

Mobile devices are ubiquitous, perhaps not yet in the States, but certainly in Europe and Asia.

Mobile communication has changed, or will change, our lives. Most of us profit from it and would not want to miss it.

User interfaces of mobile devices often sport the latest technology and have a fashionable design.
But did the designers also keep in mind how their
interfaces might impact our social lives?

What is social impact?


Related work

Social impact in a mobile computing setting

Classification of social context situations

Dryer et al. 1999

Rowson 2000

Social Impact in Mobile Computing

Dryer et al. 1999

Their perspective: Social computing in mobile computing systems

Social computing:
interplay between persons' social behavior and their interactions with computing technologies

Mobile computing systems: devices that are designed to be used in the presence of other persons

Depending on the design of such systems, they may either promote or inhibit social relationships

Possible relationships

interpersonal relationship among co-located persons


human-machine relationship (social behavior directed toward a machine)
machine mediated human-human relationship
relationship with a community

Lab study on influences of pervasive computer design on responses to social partners

Theoretical model, consisting of four components:



Social Impact in Mobile Computing, cont.


System design
e.g., UI design

Factors:
Users believe that device can be used easily
Device resembles a familiar device
Users can share the input with non-users
Users can share the output with non-users
Device appears useful in current context

Social attributions
How we explain for ourselves why
others behave in a certain way
(traits, roles, group memberships)
Factors:
Is partner agreeable or not
Is partner extro- or introverted
Is partner member of same group

Human behavior
What users usually do

Interaction outcome
Consequences of interaction,
both cognitive and affective
Factors:
Was interaction successful?
Are future similar interactions desired?
Did the user like the device?
Did the user like the partner?
Quantity and quality of work produced in a
social exchange

Factors: The device


makes user appear awkward
interferes with natural social behaviors
distracts nonusers from their social
interaction
alters distribution of interpersonal
control between partners
distracts user from social interaction

Social Impact in Mobile Computing, cont.

Empirical study to explore these relationships:


manipulate system design factors and assess their effect on social attributions, human behaviors, and interaction outcomes

Method: present participants with photos of persons using different mobile computing devices

Conditions: array of devices with different form factors: HMD, PDA, wearable (belt-worn subnotebook), laptop, no device

Dependent variables: questionnaires to assess the effects, looking for significant correlations among factors

Results:

Social Impact in Mobile Computing, cont.

[Diagram: the four-component model from above, now annotated with negative relationships. The system design factors (the device makes the user appear awkward, interferes with natural social behaviors, distracts non-users from their social interaction, alters the distribution of interpersonal control between partners, and distracts the user from the social interaction) relate negatively to social attributions, human behavior, and interaction outcomes.]

More social context situations

Dryer et al. looked at one situation: probably a work setting, involving two people working with mobile computing infrastructure. This is a very specific social context situation.

There is a much larger variety of social context situations. How can we classify them?

Rowson (2000) suggests a 2-dimensional space with the dimensions Role and Relationship.

Pragmatic, but useful.


Social context situations: Scenario Space


Rowson 2000

[Scenario Space diagram: a grid spanned by the Relationship dimension (Individual, Casual Team, Formal Team, Community) and the Role dimension (settings such as Family, Work, School, Recreation, Health management, Finances, Spiritual). Example scenarios placed in the grid include chat and friend finder, baseball team fan, Parent-Teacher Association, soccer team, note passer, mall encounter, homework, movies, group project, birds of a feather, church group, meetings, study, shopping, hallway chat, hospice, organizer, and prayer.]

Examples:

What kind of scenario is located in a work setting and in a casual team?

What kind of scenario is located in a school setting and as an individual?



How does this help us?

Dryer et al.: Mobile computing research suggests that user interface design has social impact on the interaction outcome, mainly via social attributions; the design of a system can either promote or inhibit social relationships.

Rowson: It seems useful to classify social context situations in a 2-dimensional space, with the dimensions role and relationship.


Back to the main question


How does the user interface design of mobile devices influence
the social impact of mobile communication?

My strategy to answer:
1. Define social impact = the influence on social relationships
2. Look at the different kinds of social relationships that are relevant in a mobile communication setting
3. Find social phenomena specific to those relationships. I call them Statements.
4. Make suggestions for UI design that enable these social phenomena, or at least do not get in their way!
Of course there are many other possible influences on social
relationships: personality of involved people, nature of task, culture, etc.

Gap of time and/or space

[Diagram, shown on three slides: Person 1, with a co-located person nearby, communicates with Person 2 across a gap of time and/or space through an interface (machine, medium). Each slide highlights one relationship:]

Class A: Social impact on the relationship between person and machine (medium)

Class B: Social impact on the relationship between person and co-located people

Class C: Social impact on the relationship between person and mediated people

Basic assumption
Each communication consists of two elements:
1. Initiation (alert)
2. Act of communication

More specifically

Unsuccessful initiations happen less and less: graceful degradation, awareness applications

Blurred distinction between alerts and acts of communication


e.g., caller ID, Nomadic Radio

Communication has neither a clear beginning nor a clear end, e.g., awareness communication modes (more about that later)


Relationship between human and machine (medium, device, etc.)
Class A

In human-computer relationships, we sometimes mimic human-human relationships. Only minimal cues are necessary to trigger such behavior.

e.g., Nass et al. 1993

These are: use of language, human-sounding speech, social role, remembering multiple prior inputs.

Computers (or machines, devices) as social actors

User satisfaction with a UI is determined not by effectiveness and efficiency alone, but also by affective reactions.

e.g., Shneiderman 1998


Statement 1:
The more human-like the interaction, the better the user's attitudinal responses.

Class A
User interface design suggestions; they are not orthogonal dimensions:

Interfaces that support common forms of human expression, also called Natural Interfaces, e.g., speech, pen, gesture

Abowd et al. 2000

Recognition based user interfaces (instead of buttons and sliders)

Myers et al. 2000

Multimodal interfaces: natural human communication is multimodal; also good for cross-checks, since recognition-based interfaces are error-prone
Interfaces that allow the user and the device to select the most appropriate modality depending on context

Architectures that allow for mixed-initiative interfaces (e.g., LookOut)

Interfaces that enable human-level communication: instead of controlling the machine, controlling the task domain

Suhm et al. 1999


Oviatt et al. 2000

Ruuska et al. 2001


Walker et al. 1998
Horvitz 1999
Nielsen 1993


Relationship between human and machine (cont.)
Class A

Humans probably like interacting with intelligent beings. Social intelligence probably makes us feel comfortable.

Human social intelligence is how we deal with relationships.

Artificial social intelligence is discussed in the framework of SIA(R)s

"Social Intelligence Hypothesis": primate intelligence originally evolved to solve social problems, and was only later extended to problems outside the social domain (math, abstract thinking, logic)

SIA(R)s have human-like social intelligence to address the emotional and interpersonal dimensions of social interaction.

Mechanisms that contribute to Social Intelligence: embodiment, empathy (scripts plus memory), autobiographic agency (dynamically reconstructing its individual history), narrative agency (telling stories about itself and others)

Dautenhahn 2000
Breazeal 2001

Dautenhahn 1998


Statement 2:
The more social intelligence a device has,
the more positive the social impact.

Class A
Many user interface design suggestions, here are just two:

Interfaces with a reduced need for explicit human-computer interaction, based on the machine's awareness of the social situation, the environment, and the goals of the user. In short: context-aware UI.

Interfaces that are invisible, both physically and mentally. This can mean: not controlled directly by the user, but also by the machine.

Dey et al. 2001


Weiser 1991
Lieberman et al. 2000

Nielsen 1993

This is a consequence of the function of the machine: Its role will not be to obey orders
literally, but to interpret user actions and do what it deems appropriate.
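A minimal sketch (in Python) of how such a context-aware interface might pick an interaction modality; the context fields, thresholds, and modality names below are illustrative assumptions, not part of any of the cited systems.

from dataclasses import dataclass

@dataclass
class Context:
    # Hypothetical sensor-derived values; a real system would have to infer these.
    in_conversation: bool       # is the user currently talking to someone?
    ambient_noise_db: float     # rough loudness of the environment
    display_visible: bool       # is the screen currently visible to the user?

def choose_output_modality(ctx: Context) -> str:
    """Pick the least disruptive modality that can still reach the user."""
    if ctx.in_conversation:
        return "tactile"        # silent cue; defer speech and sound
    if not ctx.display_visible:
        return "audio" if ctx.ambient_noise_db < 70 else "tactile"
    return "visual"             # screen is visible, use it

print(choose_output_modality(Context(True, 55.0, False)))   # -> tactile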


Relationship between human and co-located people (surroundings)
Class B

Each act of telecommunication disrupts the interaction with co-located people. In mobile communication, however, interruption is part of the design.


Statement 3:
The less telecommunication, the better for the
interaction with co-located persons.

The less we telecommunicate, the more we can attend to co-located people, and the more time we spend with them.

Class B

Just a single, wide-focus user interface design suggestion:

Interfaces that filter in a context-aware manner and therefore minimize the amount of telecommunication. The more the device (agent) knows about my social and physical environment, the fewer unnecessary distractions (more on this later).
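A toy sketch of such a context-aware filter; the scoring, the threshold values, and the three outcomes are invented for illustration and are not taken from Clues or any other cited filter.

def filter_incoming(message_priority: float,
                    caller_is_close_tie: bool,
                    user_with_people: bool) -> str:
    """Decide whether to deliver, defer, or drop an incoming communication.
    message_priority is assumed to be a score in [0, 1] from some upstream filter."""
    score = message_priority + (0.3 if caller_is_close_tie else 0.0)
    if user_with_people and score < 0.5:
        return "defer"      # hold it until the user is no longer engaged
    if score < 0.2:
        return "drop"       # not worth an interruption at all
    return "deliver"

print(filter_incoming(0.1, False, True))   # -> defer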

But


Statement 4:
Find a balance between useful interruptions and attention for co-located persons.

Class B

Interfaces that allow communication in parallel to ongoing co-located interactions, which in turn enables multiple concurrent activities (mobile communication happens in many different contexts).
Examples: simple speakerphone, Nomadic Radio
Interfaces that support multiple levels of intrusiveness, enabling background awareness applications.
Example: Audio Aura

Sawhney et al. 2001

Mynatt et al., 1998

Audio Aura: serendipitous information via background auditory cues; uses sonic
ecologies, embedding the cues into a running, low-level soundtrack so that the user is
not startled by a sudden sound; different worlds like music, natural sound effects
(ocean).
Adaptive background music selection system: each user can map events to songs, so
personal alerts can be delivered through the music being played in the public background


Class B

Interfaces that present information at different levels of the periphery of human attention.
Examples: Reminder Bracelet, and many systems in the domain
of ambient media and ambient alerting.
Minimal Attention User Interfaces (MAUI): the idea is to transfer mobile computing interaction tasks to interaction modes that take less of the user's attention away from their current activity. It is about shifting the human-computer interaction to unused channels or senses.
Limited divided attention and limited focus of attention are only indirectly relevant in our context: they are primarily psychological phenomena and influence social relationships only if co-located persons and the communication device are both seeking attention at the same time. The real issue is what effect the user's choice of focus of attention has on her social relationships. This is based on the assumption that the user interface gives the user the freedom to shift attention, and does not just override the user's conscious choice of focus.

Abowd et al. 2000


Hansson et al. 2000

Pascoe et al. 2000


Statement 5:
The less intrusive the alert and the act of
communication, the more socially accepted.

Class B

Interfaces that can adapt to the situation and allow for mixed-mode communication.
Example: Quiet Calls. An important problem to solve is how to map communication modes adequately, e.g., Quiet Calls uses a Talk-As-Motion metaphor: engage, disengage, listen.

Ramping interfaces, including scalable alerting.
Example: Nomadic Radio

Nelson et al. 2001

Rhodes 2000
Sawhney et al. 2001


Statement 6:
The more public the preceding alert, the more
socially accepted the following act of
communication.

Class B

Suggestion: a design space of notification cues for mobile devices with two dimensions: subtlety and publicity.

Hansson et al. 2001

Public and subtle cues are visible to co-located persons, and can therefore avoid unexplained activity (e.g., the user suddenly leaving a meeting).

[Diagram (Hansson et al. 2000): the subtle-intrusive and private-public axes, with tactile cues and the remembrance agent in the subtle/private region, auditory cues in the intrusive/public region, and the Reminder Bracelet in the subtle/public region.]

Interfaces that support and encourage public but subtle alerts.


Example: Reminder Bracelet
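The two-dimensional design space can also be written down as a small data structure; the coordinates below are rough guesses that only reproduce the relative placement sketched above, not measured values.

# Hypothetical positions on the two axes: subtlety (0 = intrusive, 1 = subtle)
# and publicity (0 = private, 1 = public).
notification_cues = {
    "auditory cue":       {"subtlety": 0.1, "publicity": 0.9},
    "tactile cue":        {"subtlety": 0.8, "publicity": 0.1},
    "remembrance agent":  {"subtlety": 0.7, "publicity": 0.2},
    "Reminder Bracelet":  {"subtlety": 0.8, "publicity": 0.7},   # subtle AND public
}

def subtle_and_public(cues, min_subtlety=0.5, min_publicity=0.5):
    """Return cues that are subtle yet visible to co-located people."""
    return [name for name, pos in cues.items()
            if pos["subtlety"] >= min_subtlety and pos["publicity"] >= min_publicity]

print(subtle_and_public(notification_cues))   # -> ['Reminder Bracelet']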

Statement 7:
The more obvious the act of communication,
the more socially accepted.

This statement is about the act of communication (the previous one was about the alert).

The "talking alone" phenomenon: soon communication devices will be so small that co-located people can't see them, so a user will appear to talk to herself. That looks strange, and is socially not acceptable.

Class B
Fukumoto et al. 1999

Interfaces that support private communication without concealing the act of communication from the public.
Example: Whisper, a wearable voice interface that is used by inserting a fingertip into the ear canal. This grasping posture avoids the talking alone phenomenon.

Statement 8:
A mobile device that can be used by a single user as well as by a group of any size is more likely to be socially accepted by co-located persons.

Class B
In other words: a device whose user interface has the option to adapt to the group size of the social setting (from individual to community) will be a better device.

Rowson 2001

Interfaces that can adapt to a particular user group size, from an individual to a group. This extends their usability, spanning more social context situations.
social context situations.
Example: TinyProjector for mobile devices; the projection size is scalable and can adapt from a group of a few, using a table as a projection surface, up to large groups of hundreds of people, using the wall of a building as a projection screen.


Mediated human-human relationships
Class C

"The Medium is the Message": how a message is perceived is defined partially by the transmitting medium. How about "The Interface is the Message"?

Early theories of the effects of a medium on the message and on the evaluation of the communicating parties:
(1) Efficiency of the interaction process: having different numbers of channels, and being able to transmit different kinds of nonverbal cues.
(2) Media differ in the possible amount of nonverbal communication.

McLuhan


Social Presence and Immediacy

A better heuristic to classify communication media and their social impact: Social Presence (SP)

SP is a subjective quality of a medium; a single dimension that represents a cognitive synthesis of several factors:

capacity to transmit information about facial expression

direction of looking

posture

tone of voice

non-verbal cues, etc.

These factors affect the level of presence, the extent to which a medium is perceived as sociable.

Class C
Short et al. 1976


Class C

SP theory says that the medium has a significant influence on both the evaluation of the act of communication and the evaluation of the communication partner (interpersonal evaluation and attraction), which means: high social impact.

The nonverbally richer media, the ones with higher Social Presence, lead to better evaluations than the nonverbally poorer media: the transmitted nonverbal cues tend to increase the positivity of interpersonal evaluation.

Immediacy: nonverbally richer media are perceived as more immediate, and more immediate media lead to better evaluations and positive attitudes.

Williams 1975
Chapanis et al. 1972

Mehrabian 1971


Statement 9:
The higher the Social Presence and Immediacy
of a medium, the better the attitudinal
responses to partner and medium.

Class C

[Diagram: media ordered by increasing immediacy of the medium: e-mail, phone, videophone, face-to-face.]

User interfaces that support as many channels as possible, and that can transmit non-verbal cues.
This is probably simplistic.


That might be true for generic tasks. But what if the task requires the partners to disclose themselves?

Class C

Hypothesis 1:

If the task requires the partners to disclose themselves extensively, their preferences shift and might even reverse: they prefer media that are lower in immediacy.

This might be explained by a drive to maintain the optimum intimacy equilibrium in a given relationship.

Argyle et al. 1965

Compensatory behaviors: personal distance, amount of eye contact, smiling, etc.

Example: If a person's distant cousin dies, she would rather write to the parents (a low-immediacy medium) than stop by (a high-immediacy medium), because stopping by might be too embarrassing (since she doesn't know them at all).

Class C
[Diagram: positive attitude towards partner and medium plotted against the immediacy of the medium (e-mail, phone, videophone, face-to-face). One curve for tasks that require only low intimacy between partners, one for tasks that require high intimacy, for which the preference shifts toward lower-immediacy media.]


That might be true if the partners don't know each other well. What if they do?

Class C

Hypothesis 2:

If the task requires the partners to behave in an intimate way and the partners know each other well, the preferences might shift back again, making higher-immediacy media preferred.

Example: If a person's father dies, she will choose the medium with the highest immediacy (which is face-to-face) to communicate with her mother.


Class C
[Diagram: preferred immediacy of the medium (e-mail, phone, videophone, face-to-face) as a function of the amount of intimacy the task requires from the partners. One curve for a well-acquainted partner, one for an unknown partner.]
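The two hypotheses can be condensed into a small decision rule; the media ordering and the branches below are only a paraphrase of the two figures, not an empirically fitted model.

# Media ordered by increasing immediacy, as in the figures above.
MEDIA_BY_IMMEDIACY = ["e-mail", "phone", "videophone", "face-to-face"]

def preferred_medium(task_needs_intimacy: bool, knows_partner_well: bool) -> str:
    """Hypothesis 1: high-intimacy tasks with an unknown partner push the
    preference toward low-immediacy media. Hypothesis 2: if the partners know
    each other well, the preference shifts back to high-immediacy media."""
    if not task_needs_intimacy:
        return MEDIA_BY_IMMEDIACY[-1]    # generic task: richer is better
    if knows_partner_well:
        return MEDIA_BY_IMMEDIACY[-1]    # Hypothesis 2: shift back up
    return MEDIA_BY_IMMEDIACY[0]         # Hypothesis 1: keep some distance

print(preferred_medium(True, False))   # -> e-mail (the distant-cousin example)
print(preferred_medium(True, True))    # -> face-to-face (the father example)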


Statement 10:
The user's attitudinal responses depend on how well the partners know each other, and on whether the communication task requires them to disclose themselves extensively.

Class C

Interfaces that are aware of the existing relationships between the communicating parties and adapt, suggesting communication modes that support the right level of immediacy and social presence.
Example: an agent that is not only aware of all communication history, but also keeps track of the user's most important communication partners and current interaction themes, perhaps applying commonsense knowledge to log files and filling in the blanks with natural language understanding.

Interfaces that are aware of the task the communication partners want to solve, either by inferring it from the communication history, or by looking at the communication context.


Statement 11:
The more the user is aware of the social
context of the partner before and during the
communication, the better.

Class C

Interfaces that let the user preview the social context of the
communication partner. This could include interfaces that give
the user an idea where the communication partner is, or how
open and/or available she is to communication attempts
Examples: Live Address Book, ConNexus and Awarenex,
Hubbub

Interfaces that allow the user to be aware of the social context of the communication partner. This refers to interfaces that enable the participants to understand each other's current social context during the act of communication.

Milewski et al., 2000


Tang et al., 2001
Isaacs et al., 2002


Special Case:
Awareness communication

Person 1 does not communicate directly with person 2, but with an outer layer of person 2.
Outer layer: a personal agent that acts on behalf of person 2.

Class C

Example agents:

An agent anticipates arrival time while traveling and radiates this info to trusted users
Electric Elves/Friday: multi-agent teams for scheduling and rescheduling events

Tambe et al. 2001
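A minimal sketch of the first example agent; the crude distance/speed estimate and the idea of simply building one message per trusted user are assumptions for illustration, not the Electric Elves/Friday API.

import datetime

def radiate_arrival_estimate(distance_km: float, avg_speed_kmh: float,
                             trusted_users: list[str]) -> dict[str, str]:
    """Estimate the arrival time from a crude distance/speed model and
    'radiate' it to trusted users (here: just build one message per user)."""
    eta = datetime.datetime.now() + datetime.timedelta(hours=distance_km / avg_speed_kmh)
    return {user: f"Expected arrival around {eta:%H:%M}" for user in trusted_users}

print(radiate_arrival_estimate(12.0, 40.0, ["alice", "bob"]))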


Statement 12:
Receiving information from the outer layer of
a person about her current context simplifies
awareness between the partners, and has
positive social impact.

Class C (special)

Interfaces that are open to and actively request information from the context layer of communication partners. Such information is most likely to be displayed in the periphery of human attention.
Examples for UI design: Reminder Bracelet, LumiTouch, ComTouch, Personal Ambient Display

Hansson et al., 2000; Chang et al., 2001; Chang, 2001; Wisneski, 1999

Related to interfaces of Class A relationships: interaction happens between a person and a machine, e.g., a personal software agent. Therefore, some design suggestions of that class are relevant:
The UI should allow the user to select the most appropriate modality depending on the physical context
The UI has to adapt to the user's current social context: ramping interfaces

Statement 13:
Mobile communication happens continuously,
everywhere and anytime, and therefore is used
in many different social context situations.

Mobile communication

The user interface has to adapt to this variety of social context situations.

Interfaces with small form factors. This is a direct consequence of the everywhere-anytime paradigm of mobile communication. The smaller the device and its interface, the more likely they will get used. Wearability is a major theme: devices that will be attached to the body, wrist-top and arm-top metaphors.

Distributed interfaces that are not only part of the mobile device, but
also of our environment. This includes a modular approach for user
interfaces that dynamically connect to the available communication
devices and channels

Interfaces with varying input and output capabilities.
Example: wearable keyboards like FingeRing.

Interfaces that allow for continuous interactions. Important for ubiquitous computing, but also relevant for the always-on metaphor of mobile computing: systems that continue to operate in the background without any knowledge of on-going activity.

Ruuska et al. 2001

Weiser 1991

Fukumoto et al. 1997

Abowd et al. 2000


Conclusions

Social impact = the influence on social relationships

3 classes of relevant social relationships in the mobile communication setting

13 statements: social phenomena, specific to a class of relationships

28 design suggestions: how to design a UI for mobile devices in order to support the statements, or not to violate the social phenomena described in the statements, or simply to make the social impact of mobile communication positive


Thanks! :-)

All materials related to my Qualifying Exams, including the paper on which this presentation is based, are at:
http://web.media.mit.edu/~stefanm/generals/


The following slides show larger pictures and more descriptions of some of the prototypes that were mentioned in the presentation.


Reminder Bracelet

Hansson et al. 2000

The Reminder Bracelet is an experiment in the search for complementary ways of displaying notification cues. It is a bracelet, worn on the wrist and connected to a user's PDA. The LEDs embedded in the Reminder Bracelet act as a display for conveying notifications. The reason for using light was to allow for more subtle, less attention-demanding cues, and also to make the notifications public to a certain degree.
The Reminder Bracelet is placed on the wrist, a location that generally rests in the periphery of the user's attention, and is also a familiar place for an informational device. When a notification occurs, it is first perceived in the periphery of the user's vision and then it might move into the center of attention.
In an effort to reduce user interaction and to convey notifications in a consistent and easily interpreted manner, the Reminder Bracelet always notifies its user 15 minutes ahead of scheduled events in the PDA.
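A minimal sketch of the fixed 15-minute rule described above; the calendar representation and the check itself are stand-ins, since the actual PDA link and LED hardware interface are not described here.

import datetime

NOTIFY_LEAD = datetime.timedelta(minutes=15)    # the fixed lead time described above

def should_light_leds(now: datetime.datetime,
                      events: list[datetime.datetime]) -> bool:
    """True if some scheduled event starts about 15 minutes from now,
    i.e. the moment the bracelet's LEDs should start their cue."""
    return any(datetime.timedelta(0) <= (event - now - NOTIFY_LEAD)
               < datetime.timedelta(minutes=1)
               for event in events)

now = datetime.datetime(2002, 2, 5, 13, 45)
meeting = datetime.datetime(2002, 2, 5, 14, 0)
print(should_light_leds(now, [meeting]))   # -> True (15 minutes before the meeting)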


Whisper
Fukumoto et al. 1999

Grasping posture
Outside noise is shut out
Users hear themselves better (don't raise their voice) because the ear is covered


Whisper's Talking Alone Phenomenon

Fukumoto et al. 1999

Talking alone: today's earphone-microphone units are large enough to be visible, so the surrounding people can easily notice their presence.

However, it is clear that almost invisible ear-plug style devices, integrating telephone and PDA functionality, will be feasible sometime soon. Such devices can be easily overlooked by co-located people, and it will appear to these people as if the user is talking to herself.

The phenomenon of talking alone might look very strange, and is certainly socially not acceptable. Fukumoto et al. even hypothesize that the stigma attached to talking alone has hindered the spread of the wearable voice interface. Therefore, social aspects are important issues that must be addressed when designing and implementing wearable voice interfaces.

The talking alone phenomenon does not occur if the user seems to hold a tiny telephone handset, even if the grasped object is too small to be seen directly. Basically, this effect can be achieved by just mimicking the grasping posture.

Whisper, a wearable voice interface that is used by inserting a fingertip into the ear canal, satisfies the socially necessary need not to conceal the act of communication.


Nomadic Radio's Soundbeam by Nortel

Sawhney et al. 2001

Nomadic Radio explores the space of possibly parallel communication in the auditory domain. The system, a wearable computing platform for managing voice- and text-based messaging in a nomadic environment, employs a shoulder-worn device with directed speakers that make cues audible only to the user (without the use of socially distracting headphones). This allows for a natural mix of real ambient audio with the user-specific local audio.
To reduce the amount of interruptions, the system's notification is adaptive and context-sensitive, depending on whether the user is engaged in a conversation, her recent responses to prior messages, and the importance of the message derived from content filtering with Clues.
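A rough sketch of that kind of scaling; the weights, thresholds, and level names are invented, and in the real system the message importance comes from content filtering with Clues.

def notification_level(in_conversation: bool,
                       ignored_recent_alerts: int,
                       message_importance: float) -> str:
    """Scale how strongly an incoming message is presented (illustrative only)."""
    score = message_importance                       # assumed to be in [0, 1]
    score -= 0.3 if in_conversation else 0.0         # back off while the user talks
    score -= 0.1 * min(ignored_recent_alerts, 3)     # back off after ignored alerts
    if score > 0.7:
        return "foreground speech"    # read the message aloud
    if score > 0.4:
        return "spoken summary"       # short summary only
    if score > 0.2:
        return "auditory cue"         # subtle earcon
    return "ambient only"             # no interruption

print(notification_level(True, 3, 0.9))   # -> auditory cue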


Lookout

Horvitz 1999

LookOut: parses incoming email and tries to find out if the user wants to schedule an event, based on this email.

More precisely, it computes the probability that the user wants to open the calendar or even schedule an appointment. It either waits (does nothing), asks the user if she needs the agent's service, or goes ahead and schedules an appointment for the user.
The idea of mixed-initiative systems is well known in robotics, and related research is done in the areas of human-robot symbiosis, mixed-initiative problem solving, and co-habited mixed realities.
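A toy version of that three-way choice; the probability would come from LookOut's classification of the email, and the two cutoffs here are arbitrary placeholders rather than Horvitz's decision-theoretic thresholds.

def mixed_initiative_action(p_wants_scheduling: float,
                            ask_threshold: float = 0.4,
                            act_threshold: float = 0.8) -> str:
    """Map the inferred probability that the user wants to schedule an event
    to one of the three behaviors described above."""
    if p_wants_scheduling >= act_threshold:
        return "open the calendar and propose an appointment"
    if p_wants_scheduling >= ask_threshold:
        return "ask the user whether help is wanted"
    return "do nothing"

for p in (0.1, 0.6, 0.9):
    print(p, "->", mixed_initiative_action(p))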


Quiet Calls

Nelson et al. 2001

http://www.fxpal.com/ACM1/qc.htm

Quiet Calls: allows telephone users to respond to a call without talking aloud. The prototype QC-Hold has three buttons for responding to a call, sending three types of pre-recorded audio messages directly into the phone. The most important problem to solve is how to map communication modes adequately.
Talk-As-Motion metaphor for Quiet Calls: communication is supported in three directions: move into the call by engaging the caller verbally; move out of the call by disengaging; and, in between these opposites, stay in place by listening to the caller. This design is implemented as a state transition process, overloading the buttons with multiple meanings over the course of the call. The three buttons trigger three different kinds of messages: engage, disengage, and listen.
It enables the user to respond on a meta level, which is coarser-grained than real speech, but still precise enough to control the mixed-mode conversation, letting the device decide about the wording of the sentences. This solution is preferable to the manual selection of a specific answer, e.g., via a long list of canned replies that are difficult to manage and browse.
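A sketch of the three-button, state-based interaction described above; the message texts and state names are placeholders for QC-Hold's actual pre-recorded audio and internal states.

# Hypothetical stand-ins for the pre-recorded audio snippets.
MESSAGES = {
    "engage":    "Hi, I can't talk freely right now, but go ahead.",
    "listen":    "I'm still here, please continue.",
    "disengage": "I have to go now, I'll call you back later.",
}

def press(button: str, state: str) -> tuple[str, str]:
    """Return (audio to play into the phone, new call state) for a button press."""
    if button == "engage":
        return MESSAGES["engage"], "engaged"     # move into the call
    if button == "listen":
        return MESSAGES["listen"], state         # stay in place
    if button == "disengage":
        return MESSAGES["disengage"], "ended"    # move out of the call
    raise ValueError(f"unknown button: {button}")

audio, state = press("engage", "ringing")
print(state)   # -> engaged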


Comtouch

Chang, 2001

http://www.media.mit.edu/~anjchang/COMTOUCH/CHI/ComTouchl.rm
ComTouch uses the haptic modality. It allows a
handheld device to register the force of pressure
from each finger as the object is squeezed. At the
receiving end, vibrations under each finger
represent the transmitted force.
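A minimal sketch of that mapping; the linear scaling and the maximum force are invented, and a real device would calibrate per finger.

def pressure_to_vibration(finger_forces: list[float],
                          max_force: float = 10.0) -> list[float]:
    """Map the squeeze force under each finger to a vibration intensity in
    [0, 1] under the corresponding finger on the receiving device."""
    return [max(0.0, min(1.0, force / max_force)) for force in finger_forces]

print(pressure_to_vibration([0.0, 2.5, 10.0, 12.0, 5.0]))
# -> [0.0, 0.25, 1.0, 1.0, 0.5]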


LumiTouch

Chang et al., 2001

LumiTouch is a pair of interactive picture frames that are cross-connected so that when one user touches her picture frame, the other picture frame lights up. It is a semi-ambient display that can transition seamlessly from periphery to foreground, in addition to communicating emotional content.


Personal Ambient Display

Wisneski, 1999

Personal Ambient Displays are small, physical devices worn to display information to a person in a subtle, persistent, and private manner. Such personal ambient displays are small enough to be carried in a pocket (e.g., as a key ring accessory), worn as a watch, or even as jewelry. Information is displayed through extended tactile modalities such as heating and cooling, movement and vibration, and change of shape.


Awarenex

Tang et al., 2001

Hubbub

Isaacs et al., 2002

Hubbub: an instant messenger that runs on a wireless Palm and a PC, enabling people to maintain background awareness of others and send them quick messages. It uses a novel concept of "sound instant messages," i.e., earcons that have meaning, such as "Hi" or "Thanks." Each user has a Sound ID that announces their sound messages and their changes in availability. Users can protect their privacy and control sound overload.
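A toy representation of the sound instant message idea; the note sequences below merely stand in for Hubbub's actual earcons and Sound IDs.

# Hypothetical earcons: short pitch sequences standing in for real audio clips.
EARCONS = {"Hi": [660, 880], "Thanks": [523, 659, 784]}
SOUND_IDS = {"alice": [440], "bob": [494]}    # per-user identifying prefix

def sound_instant_message(sender: str, message: str) -> list[int]:
    """Prefix the sender's Sound ID to the earcon for the message text."""
    return SOUND_IDS[sender] + EARCONS[message]

print(sound_instant_message("alice", "Hi"))   # -> [440, 660, 880]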

ConNexus: integrates awareness information, instant messaging, and other communication channels
in an interface that runs on a desktop computer.

Awarenex: extends that functionality to wireless handheld devices, such as a Palm. A speech interface also enables callers to make use of the awareness information over the telephone.

Live Address Book

Milewski et al., 2000


FingeRing

Fukumoto et al. 1997

The idea was to create tiny interface devices, similar to watches or glasses, that are worn all the time and are used to control a PDA. FingeRing is a wearable input device, a wearable keyboard, for PDAs and possibly musical instruments. The user can type on any surface, including the knee. The information from the 5 accelerometers is transmitted via a "direct coupling" method that uses the human body for the signal and the air for the ground loop. The authors also developed a new chording method with which expert users (piano players) can input 52 different symbols, at 200 symbols per minute.


Samsung Scurry


Arm-top metaphor

Wrist-top metaphor

Ruuska et al. 2001


Hypothesis 1 (from Marti, 1992)


Hypothesis 2 (from Marti, 1992)


References


[1] Abowd, G.D., and Mynatt, E.D. (2000). Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction (ToCHI), 7(1), March 2000, pp. 29-58.
http://www.acm.org/pubs/articles/journals/tochi/200071/p29abowd/p29abowd.pdf
[2] Argyle, M. & Dean, J. (1965). Eye-contact, distance, affiliation. Sociometry, 28, 289-304.
Abstract: http://nimbus.ocis.temple.edu/~mlombard/Presence/argyle65.html
http://links.jstor.org/sici?sici=0038-0431%28196509%2928%3A3%3C289%3AEDAA%3E2.0.CO%3B2-Q
[3] Bannon, L. (1995). Editorial, Commentaries and a response in the Suchman-Winograd debate. Computer Supported Cooperative Work (CSCW), 3(1), Netherlands: Kluwer Academics, pp. 29-95.
[4] Biocca, F., Burgoon, J., Harms, C., Stoner, M. (2001). Criteria And Scope Conditions For A Theory And Measure Of Social Presence. In Proceedings of Presence 2001, 4th Annual International Workshop, May 21-23, 2001, Temple University, Philadelphia, PA.
http://nimbus.ocis.temple.edu/~mlombard/P2001/Biocca1.pdf
[5] Breazeal, C. (1999). Robot in Society: Friend or Appliance? In Agents99 Workshop on Emotion-Based Agent Architectures, Seattle, WA, pp. 18-26.
http://www.ai.mit.edu/projects/sociable/publications.html
[6] Chang, A. (2001). ComTouch: A Vibrotactile Emotional Communication Device. Technical report, MIT Media Lab (unpublished paper).
http://web.media.mit.edu/~anjchang/DS2001/comtouch.htm
[7] Chang, A., Resner, B., Koerner, B., Wang, X., and Ishii, H. (2001). LumiTouch: An Emotional Communication Device. In Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI '01), Seattle, Washington, USA, March 31 - April 5, 2001, ACM Press, pp. 313-314.
http://tangible.media.mit.edu/papers/LumiTouch_CHI01/LumiTouch_CHI01.pdf
[8] Chapanis, A., Ochsman, R.B., Parrish, R.N., and Weeks, G.D. (1972). Studies in interactive communication: I. The effect of four communication modes on the behaviour of teams during cooperative problem-solving. Human Factors, 14(6), 487-509.
[9] Clark, H. H. (1996). Using language. New York, NY: Cambridge University Press.
Review: http://www.massey.ac.nz/~alock/virtual/clarke.htm
[10] Dautenhahn, K. (1998). The Art of Designing Socially Intelligent Agents: Science, Fiction, and the Human in the Loop. Special Issue on Socially Intelligent Agents, Applied Artificial Intelligence Journal, Vol. 12, 7-8, pp. 573-617.
http://orawww.cs.herts.ac.uk/~comqkd/papers.html
[11] Dautenhahn, K. (2000). Socially Intelligent Agents and The Primate Social Brain - Towards a Science of Social Minds. Proceedings of the AAAI Fall Symposium "Socially Intelligent Agents - The Human in the Loop", AAAI Press, Technical Report FS-00-04, pp. 35-51.
http://orawww.cs.herts.ac.uk/~comqkd/papers.html
[12] Dey, A., Ljungstrand, P., and Schmidt, A. (2001). Distributed and Disappearing User Interfaces in Ubiquitous Computing. A workshop held at CHI 2001. In Extended Abstracts of Computer-Human Interaction (CHI) 2001, ACM Press, 2001, pp. 487-488.
http://www.teco.edu/chi2001ws/disui.pdf
[13] Dix, A., Rodden, T., Davies, N., Trevor, J., Friday, A., and Palfreyman, K. (2000). Exploiting space and location as a design framework for interactive mobile systems. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), Sept. 2000, pp. 285-321.
http://www.acm.org/pubs/articles/journals/tochi/200073/p285dix/p285dix.pdf
[14] Dryer, D.C., Eisbach, C., and Ark, W.S. (1999). At what cost pervasive? A social computing view of mobile computing systems. IBM Systems Journal, Volume 38, No. 4.
http://www.research.ibm.com/journal/sj/384/dryer.pdf


[15] Flores, F., Graves, M., Hartfield, B., and Winograd, T. (1988). Computer Systems and the Design of Organizational Interaction. ACM Trans. Office Info. Systems, April 1988, pp. 153-172.
http://portal.acm.org/citation.cfm?id=45943&coll=ACM&dl=ACM&CFID=1282938&CFTOKEN=25128089
[16] Fukumoto, M. and Tonomura, Y. (1999). Whisper: A Wristwatch Style Wearable Handset. ACM CHI '99 Proceedings, pp. 112-119.
http://www.acm.org/pubs/articles/proceedings/chi/302979/p112fukumoto/p112fukumoto.pdf
[17] Fukumoto, M., and Tonomura, Y. (1997). Body Coupled FingeRing: Wireless Wearable Keyboard. ACM CHI '97 Proceedings, pp. 147-154.
http://www.atip.or.jp/Akihabara/links/johanwear/ntt/fkm.htm
[18] Gong, L., and Lai, J. (2001). Shall We Mix Synthetic Speech and Human Speech? Impact on Users' Performance, Perception and Attitude. ACM CHI 2001 Proceedings, pp. 158-165.
http://www.acm.org/pubs/articles/proceedings/chi/365024/p158gong/p158gong.pdf
[19] Hansson, R. and Ljungstrand, P. (2000). The Reminder Bracelet: Subtle Notification Cues for Mobile Devices. In: Extended Abstracts of CHI 2000 (Student Poster), ACM Press, pp. 323-325.
http://www.viktoria.se/~rebecca/artiklar/chi2000poster.pdf
[20] Hansson, R., Ljungstrand, P., and Redström, J. (2001). Subtle and Public Notification Cues for Mobile Devices. In: Proceedings of UbiComp 2001, Atlanta, Georgia, USA.
http://www.viktoria.se/~rebecca/artiklar/Notification_final.pdf
[21] Hjelm, J. (2000). Designing Wireless Information Services. New York, NY: John Wiley & Sons.
http://www.wireless-information.net/
[22] Horvitz, E. (1999). Principles of Mixed-Initiative User Interfaces. ACM CHI '99 Proceedings, pp. 159-166.
http://www.acm.org/pubs/citations/proceedings/chi/302979/p159-horvitz/
[23] Isaacs, E., Walendowski, A., and Ranganathan, D. (2002). Hubbub: A sound-enhanced mobile instant messenger that supports awareness and opportunistic interactions. To be published April 2002 by ACM in the Proceedings of the Conference on Computer-Human Interaction (CHI), Minneapolis, MN.
http://www.izix.com/pro/lightweight/hubbub.php
[24] Ishii, H., and Ullmer, B. (1997). Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms. In Proceedings of CHI '97, pp. 234-241.
http://tangible.media.mit.edu/papers/Tangible_Bits_CHI97/Tangible_Bits_CHI97.pdf
[25] Lenat, D.B. (1995). Cyc: A Large-Scale Investment in Knowledge Infrastructure. Communications of the ACM, 38(11): 32-38.
http://www.acm.org/pubs/articles/journals/cacm/1995-38-11/p33-lenat/p33-lenat.pdf
[26] Luff, P., and Heath, C. (1998). Mobility in collaboration. ACM CSCW '98 Proceedings, pp. 305-314.
http://www.acm.org/pubs/articles/proceedings/cscw/289444/p305-luff/p305-luff.pdf
[27] Marx, M., and Schmandt, C. (1996). CLUES: Dynamic Personalized Message Filtering. Proceedings of CSCW '96, November 1996, pp. 113-121.
http://portal.acm.org/citation.cfm?doid=240080.240230
[28] Mehrabian, A. (1971). Silent Messages. Belmont, CA: Wadsworth.
[29] Milewski, A. and Smith, T. M. (2000). Providing Presence Cues to Telephone Users. Proceedings of CSCW 2000, Philadelphia, PA.
http://www.research.att.com/~tsmith/papers/milewski-smith.pdf
[30] Myers, B., Hudson, S.E., and Pausch, R. (2000). Past, Present and Future of User Interface Software Tools. ACM Transactions on Computer-Human Interaction (ToCHI), 7(1), March 2000, pp. 3-28.
http://www.acm.org/pubs/articles/journals/tochi/2000-7-1/p3-myers/p3-myers.pdf

[31] Mynatt, E.D., Back, M., Want, R., Baer, M., and Ellis, J.B. (1998). Designing Audio Aura. ACM CHI '98 Proceedings, pp. 566-573.
http://www.acm.org/pubs/articles/proceedings/chi/274644/p566-mynatt/p566-mynatt.pdf
[32] Nass, C., Steuer, J., Tauber, E., and Reeder, H. (1993). Anthropomorphism, Agency, & Ethopoeia: Computers as Social Actors. Presented at INTERCHI '93, Conference of the ACM/SIGCHI and the IFIP, Amsterdam, Netherlands, April 1993.
http://www.acm.org/pubs/citations/proceedings/chi/259964/p111-nass/
[33] Nelson, L., Bly, S., and Sokoler, T. (2001). Quiet calls: talking silently on mobile phones. ACM CHI 2001 Proceedings, pp. 174-181.
http://www.acm.org/pubs/articles/proceedings/chi/365024/p174-bly/p174-bly.pdf
[34] Nielsen, J. (1993). Noncommand user interfaces. An updated version of a paper that appeared in Communications of the ACM, 36(4), April 1993, pp. 83-99.
http://www.useit.com/papers/noncommand.html
[35] Norman, D. A. & Draper, S. W. (Eds.) (1986). User centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum Associates.
[36] Oppermann, R. and Specht, M. (1998). Adaptive support for a mobile museum guide. Proceedings of Interactive Applications of Mobile Computing '98, Rostock, Germany.
http://www.rostock.igd.fhg.de/veranstaltungen/workshops/imc98/Proceedings/imc98-SessionMA3-2.pdf
[37] Oviatt, S. and Cohen, P. (2000). Multimodal Interfaces That Process What Comes Naturally. Communications of the ACM, Vol. 43(3), March 2000, pp. 45-53.
http://www.acm.org/pubs/articles/journals/cacm/2000433/p45oviatt/p45oviatt.pdf
[38] Pascoe, J., Ryan, N., and Morse, D. (2000). Using while moving: HCI issues in fieldwork environments. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), September 2000, pp. 417-437.
http://www.acm.org/pubs/articles/journals/tochi/200073/p417pascoe/p417pascoe.pdf
[39] Rhodes, B. (2000). Just-In-Time Information Retrieval. Ph.D. Dissertation, MIT Media Lab, May 2000.
http://www.media.mit.edu/~rhodes/Papers/rhodes-phd-JITIR.pdf
[40] Rice, R. E. (1992). Task analyzability, use of new medium and effectiveness: A multi-site exploration of media richness. Organization Science, 3(4), pp. 475-500.
Abstract: http://nimbus.ocis.temple.edu/~mlombard/Presence/rice92.html
http://links.jstor.org/sici?sici=1047-7039%28199211%293%3A4%3C475%3ATAUONM%3E2.0.CO%3B2-A
[41] Rowson, J. (2001). The Social Media Project at Hewlett Packard Laboratories. Talk at the Stanford Networking Seminar of November 1, 2001, Center for the Study of Language and Information (CSLI), Stanford University.
http://netseminar.stanford.edu/sessions/2001-11-01.html
[42] Ruuska-Kalliokulju, S., Schneider-Hufschmidt, M., Väänänen-Vainio-Mattila, K., Von Niman, B. (2001). Shaping the Future of Mobile Devices. Results of the Workshop on Future Mobile Device User Interfaces at CHI 2000. SIGCHI Bulletin, January/February 2001.
http://www.acm.org/sigchi/bulletin/2001.1/mobile_cont.pdf
[43] Sallnäs, E. L. (1999). Presence in multimodal interfaces. Proceedings of the Second International Workshop on Presence, University of Essex, Colchester, UK.
http://www.nada.kth.se/~evalotta/Presence/IWVP.html


[44] Sawhney, N. and Schmandt, C. (2000). Nomadic Radio: Speech & Audio Interaction for Contextual Messaging in Nomadic Environments. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), Sept. 2000, pp. 353-383.
http://www.acm.org/pubs/articles/journals/tochi/2000-7-3/p353-sawhney/p353-sawhney.pdf
[45] Schmidt, A., Gellersen, H.-W., and Beigl, M. (1999). Matching Information and Ambient Media. In Proceedings of CoBuild '99, Second International Workshop on Cooperative Buildings, Pittsburgh. LNCS 1670. Springer: Heidelberg, 1999.
http://www.comp.lancs.ac.uk/~albrecht/pubs/pdf/schmidt_cobuild99_ambient.pdf
[46] Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human-Computer Interaction. Third Edition, Reading, MA: Addison-Wesley.
[47] Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: John Wiley.
[48] Singer, A., Hindus, D., Stifelman, L., White, S. (1999). Tangible Progress: Less Is More In Somewire Audio Spaces. In Proceedings of CHI '99, ACM, 1999, pp. 104-111.
http://portal.acm.org/citation.cfm?doid=302979.303007
[49] Steuer, J. (1995). Self vs. Other; Agent vs. Character; Anthropomorphism vs. Ethopoeia. In Vividness and Source of Evaluation as Determinants of Social Responses Toward Mediated Representations of Agency, doctoral dissertation, Stanford University, advised by Nass and Reeves.
http://www.cyborganic.com/People/jonathan/Academia/Dissertation/theory1.html
[50] Suhm, B., Myers, B., and Waibel, A. (1999). Model-based and empirical evaluation of multi-modal interactive error correction. ACM CHI '99 Proceedings, pp. 584-591.
http://www.acm.org/pubs/articles/proceedings/chi/302979/p584suhm/p584suhm.pdf
[51] Tang, J., Yankelovich, N., Begole, J., Van Kleek, M., Li, F., and Bhalodia, J. (2001). ConNexus to Awarenex: Extending awareness to mobile users. In Proceedings of ACM CHI 2001, Seattle, Washington, March 31 - April 5, 2001, pp. 221-228.
http://www.sun.com/research/netcomm/papers/CHI2001Proc.pdf
[52] Walker, M.A., Fromer, J., Di Fabbrizio, G., Mestel, C., and Hindle, D. (1998). What can I say?: Evaluating a Spoken Language Interface to Email. ACM CHI '98 Proceedings, pp. 289-290.
http://www.acm.org/pubs/articles/proceedings/chi/274644/p582walker/p582walker.pdf
[53] Weiser, M. (1991). The computer for the 21st Century. Scientific American, Volume 265, Number 3, September 1991, pp. 94-104.
http://nano.xerox.com/hypertext/weiser/SciAmDraft3.html
[54] Wickens, C.D. (1992). Engineering Psychology and Human Performance. New York, NY: HarperCollins.
http://vig.prenhall.com/catalog/academic/product/1,4096,0321047117,00.html
[55] Williams, E. (1975). Medium or message: Communications medium as a determinant of interpersonal evaluation. Sociometry, 38(1), pp. 119-130.
http://links.jstor.org/sici?sici=0038-0431%28197503%2938%3C119%3AMOMCMA%3E2.0.CO%3B2-P
[56] Williams, E. (1977). Experimental comparisons of face-to-face and mediated communication: A review. Psychological Bulletin, 84(5), pp. 963-976.
[57] Wisneski, C. A. (1999). The Design of Personal Ambient Displays. Master's Thesis, Massachusetts Institute of Technology.
[58] Wisneski, C., Ishii, H., Dahley, A., Gorbet, M., Brave, S., Ullmer, B. and Yarin, P. (1998). Ambient Displays: Turning Architectural Space into an Interface between People and Digital Information. In Proceedings of CoBuild '98, International Workshop on Cooperative Buildings, Darmstadt, Germany, February 1998, Springer, pp. 22-32.
http://tangible.media.mit.edu/papers/Ambient_Disp_CoBuild98/Ambient_Disp_CoBuild98.pdf
