Main area:
Chris Schmandt
Technical:
DCGS representative:
Bradley Rhodes
Henry Lieberman
Brian Smith
All materials related to this Qualifying Exam, including the paper on which this presentation is based, are at:
http://web.media.mit.edu/~stefanm/generals/
Dryer et al. 1999
Social computing:
the interplay between people's social behavior and their interactions with computing technologies
Possible relationships
Factors:
- Users believe that the device can be used easily
- The device resembles a familiar device
- Users can share the input with non-users
- Users can share the output with non-users
- The device appears useful in the current context

Social attributions: how we explain to ourselves why others behave in a certain way (traits, roles, group memberships)
Factors:
- Is the partner agreeable or not?
- Is the partner extro- or introverted?
- Is the partner a member of the same group?

Human behavior: what users usually do

Interaction outcome: consequences of the interaction, both cognitive and affective
Factors:
- Was the interaction successful?
- Are future similar interactions desired?
- Did the user like the device?
- Did the user like the partner?

Quantity and quality of work produced in a social exchange
Results:
[Diagram residue: the model above annotated with the study results; several of the relationships between device factors, social attributions, human behavior, and interaction outcome were negative.]
Rowson 2001
Examples:
[Table residue: example social applications organized by role (Family, Work, Spiritual) and activity (Finances, Meetings, Study, Shopping, Recreation, Movies, Homework, Prayer, Health management, School); entries include chat/friend finder, baseball team fan, Parent-Teacher Association, soccer team, note passer, mall encounter, group project, birds of a feather, church group, hallway chat, hospice, and organizer.]
My strategy to answer
[Diagram: Person 1 uses an Interface to a Machine/Medium that connects to Person 2; a Co-located person stands nearby.]
Class A:
Social impact on the relationship between person and machine (medium)
Class B:
Social impact on the relationship between person and co-located people
(same diagram as Class A)
Class C:
Social impact on the relationship between person and mediated people
(same diagram as Class A)
Basic assumption
Each communication consists of two elements:
1. Initiation (alert)
2. Act of communication
More specifically:
Unsuccessful initiations happen less and less: graceful degradation, awareness applications
Statement 1:
The more human-like the interaction, the better the user's attitudinal responses.
Class A
User interface design suggestions; they are not orthogonal dimensions:
SIA(R)s have human-like social intelligence to address the emotional and interpersonal dimensions of social interaction.
Dautenhahn 2000
Breazeal 2001
Dautenhahn 1998
Statement 2:
The more social intelligence a device has,
the more positive the social impact.
Class A
Many user interface design suggestions, here are just two:
Nielsen 1993
This is a consequence of the function of the machine: Its role will not be to obey orders
literally, but to interpret user actions and do what it deems appropriate.
Statement 3:
The less telecommunication, the better for the
interaction with co-located persons.
Class B
But
Statement 4:
Find a balance between useful interruptions and attention for co-located persons.
Class B
Interfaces that allow communication in parallel to the ongoing co-located interactions, which in turn enables multiple concurrent activities (mobile communication happens in many different contexts).
Examples: Simple speakerphone, Nomadic Radio
Interfaces that support multiple levels of intrusiveness, enabling background awareness applications.
Examples: Audio Aura
Audio Aura: serendipitous information via background auditory cues; uses "sonic ecologies," embedding the cues into a running, low-level soundtrack so that the user is not startled by a sudden sound; different sound worlds such as music or natural sound effects (ocean).
Adaptive background music selection system: each user can map events to songs, so personal alerts can be delivered through the music being played in the public background.
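The event-to-song mapping described above could be sketched roughly as follows; `MusicAlertMapper`, the user names, and the song titles are all illustrative assumptions, not part of the actual system.

```python
# Illustrative sketch of an adaptive background-music alert system: each user
# registers a personal mapping from event types to songs, so an alert can be
# delivered by switching the shared background music. All names are made up.

class MusicAlertMapper:
    def __init__(self):
        # per-user mapping: user -> {event_type: song}
        self._mappings = {}

    def register(self, user, event_type, song):
        """Let `user` choose which song signals `event_type` to them."""
        self._mappings.setdefault(user, {})[event_type] = song

    def song_for(self, user, event_type):
        """Return the song that signals `event_type` to `user`, or None."""
        return self._mappings.get(user, {}).get(event_type)

mapper = MusicAlertMapper()
mapper.register("alice", "new_email", "Moonlight Sonata")
print(mapper.song_for("alice", "new_email"))  # -> Moonlight Sonata
```

Because each mapping is private to its owner, the alert stays personal even though the music plays in a public space, which is the point of the design.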
Class B
Statement 5:
The less intrusive the alert and the act of
communication, the more socially accepted.
Class B
Interfaces that can adapt to the situation and allow for mixed-mode communication.
Example: Quiet Calls. An important problem to solve is how to map communication modes adequately; e.g., Quiet Calls uses a Talk-As-Motion metaphor: engage, disengage, listen.
Rhodes 2000
Sawhney et al. 2001
Statement 6:
The more public the preceding alert, the more
socially accepted the following act of
communication.
Class B
[Diagram: notification cues arranged along public and intrusive dimensions (e.g., auditory cues).]
Hansson et al. 2000
Statement 7:
The more obvious the act of communication,
the more socially accepted.
Class B
Fukumoto et al. 1999
Statement 8:
A mobile device that can be used by a single user as well as by a group of any size will more likely be socially accepted by co-located persons.
Class B
In other words: a device whose user interface can adapt to the group size of the social setting (from the individual to the community) will be a better device.
Rowson 2001
Mediated human-human relationships
Class C
McLuhan
direction of looking
posture
tone of voice
Class C
Short et al. 1976
Class C
Williams 1975
Chapanis et al. 1972
Mehrabian 1971
Statement 9:
The higher the Social Presence and Immediacy
of a medium, the better the attitudinal
responses to partner and medium.
Class C
[Diagram: media ordered by Immediacy of medium: phone < videophone < face-to-face.]
Class C
Hypothesis 1:
[Graph: positive attitude towards partner and medium plotted against Immediacy of medium (phone, videophone, face-to-face), for tasks requiring only low intimacy between partners versus tasks requiring high intimacy.]
Class C
Hypothesis 2:
[Graph: attitude plotted against Immediacy of medium (phone, videophone, face-to-face), for a well-acquainted partner versus an unknown partner.]
Statement 10:
The user's attitudinal responses depend on how well the partners know each other, and on whether the communication task requires them to disclose themselves extensively.
Class C
Statement 11:
The more the user is aware of the social
context of the partner before and during the
communication, the better.
Class C
Interfaces that let the user preview the social context of the communication partner. This could include interfaces that give the user an idea of where the communication partner is, or how open and/or available she is to communication attempts.
Examples: Live Address Book, ConNexus and Awarenex, Hubbub
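The kind of context preview these interfaces expose might be modeled like this; the `Presence` record and its fields are hypothetical illustrations, not the actual Live Address Book or Awarenex data model.

```python
# Hypothetical sketch of a presence record that lets a caller preview a
# partner's social context before initiating communication. Field names and
# the decision rule are invented for illustration.

from dataclasses import dataclass

@dataclass
class Presence:
    location: str        # e.g. "office", "commuting"
    availability: str    # e.g. "open", "busy", "do-not-disturb"

    def should_attempt_call(self) -> bool:
        # A caller might only initiate when the partner signals openness.
        return self.availability == "open"

partner = Presence(location="office", availability="busy")
print(partner.should_attempt_call())  # -> False
```

The value of the preview is that the decision to interrupt moves to before the alert fires, rather than after the partner has already been disturbed.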
Special Case:
Awareness communication
Class C
Example agents:
Statement 12:
Receiving information from the outer layer of
a person about her current context simplifies
awareness between the partners, and has
positive social impact.
Class C (special)
Statement 13:
Mobile communication happens continuously, everywhere and anytime, and therefore is used in many different social contexts.
Mobile
communication
The user interface has to adapt to this variety of social contexts.
Distributed interfaces that are not only part of the mobile device but also of our environment. This includes a modular approach for user interfaces that dynamically connect to the available communication devices and channels.
Weiser 1991
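One way to read the modular approach above is as a small router that delivers a message through whichever registered environmental channel is currently available; all channel names and priorities here are invented for illustration.

```python
# Sketch, under assumed names, of a distributed-interface idea: output
# channels in the environment register themselves, and a message is routed
# dynamically to the highest-priority channel that is available.

class ChannelRouter:
    def __init__(self):
        self._channels = []  # list of (priority, name, is_available)

    def register(self, name, priority, available=True):
        """Announce an output channel; lower priority number = preferred."""
        self._channels.append((priority, name, available))

    def route(self, message):
        """Deliver via the highest-priority available channel, or None."""
        usable = [c for c in self._channels if c[2]]
        if not usable:
            return None
        _, name, _ = min(usable)
        return f"{name}: {message}"

router = ChannelRouter()
router.register("wall display", priority=2)
router.register("wearable earpiece", priority=1, available=False)
print(router.route("meeting in 5 min"))  # -> wall display: meeting in 5 min
```

Because the earpiece is unavailable, the router falls back to the wall display, which is the dynamic-connection behavior the slide describes.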
Conclusions
Thanks! :-)
All materials related to my Qualifying Exams, including the paper on which this presentation is based, are at:
http://web.media.mit.edu/~stefanm/generals/
Reminder Bracelet
Whisper
Fukumoto et al. 1999
Grasping posture
Outside noise shut out
Users hear themselves better (don't raise their voice) because the ear is covered
Talking alone: today's earphone-microphone units are large enough to be visible, so the surrounding people can easily notice their presence.
However, it is clear that almost invisible ear-plug style devices (integrating telephone and PDA functionality) will be feasible sometime soon. Such devices can easily be overlooked by co-located people, and it will appear to these people as if the user is talking to herself.
The phenomenon of talking alone might look very strange, and is certainly not socially acceptable. Fukumoto et al. even hypothesize that the stigma attached to talking alone has hindered the spread of the wearable voice interface. Therefore, social aspects are among the important issues that must be addressed when designing and implementing wearable voice interfaces.
The talking-alone phenomenon does not occur if the user seems to hold a tiny telephone handset, even if the grasped object is too small to be seen directly. Basically, this effect can be achieved by just mimicking the grasping posture.
Whisper, a wearable voice interface that is used by inserting a fingertip into the ear canal, would satisfy the social need not to conceal the act of communication.
Lookout
Horvitz 1999
Quiet Calls
http://www.fxpal.com/ACM1/qc.htm
Quiet Calls: allows telephone users to respond to a call without talking aloud. The prototype QC-Hold has three buttons for
responding to a call, sending three types of pre-recorded audio messages directly into the phone. The most important
problem to solve is how to map communication modes adequately.
Talk-As-Motion metaphor for Quiet Calls: Communication is supported in three directions: move in to the call by engaging
the caller verbally; move out of the call by disengaging; and in between these opposites, stay in place by listening to the
caller. This design is implemented as a state transition process and overloading buttons with multiple meanings over the
course of the call. The three buttons trigger three different kinds of messages: engage, disengage, and listen.
It enables the user to respond on a meta level, which is coarser-grained than real speech but still precise enough to control the mixed-mode conversation, letting the device decide on the wording of the sentences. This solution is preferable to manual selection of a specific answer, e.g., via a long list of canned replies that are difficult to manage and browse.
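The state-transition design described above can be read as a tiny state machine; the states, button names, and canned sentences below are schematic guesses for illustration, not the actual QC-Hold implementation.

```python
# Schematic state machine in the spirit of Quiet Calls' Talk-As-Motion design:
# each button press sends a pre-recorded message and may change the call
# state, so the same three buttons mean different things over the call.
# States, buttons, and message wording are invented placeholders.

TRANSITIONS = {
    # (state, button) -> (next_state, pre-recorded message)
    ("ringing", "engage"):    ("engaged", "Hello, I can't talk aloud right now."),
    ("engaged", "listen"):    ("engaged", "Go on, I'm listening."),
    ("engaged", "disengage"): ("ended",   "I'll call you back later, bye."),
}

def press(state, button):
    """Return (next_state, message) for a button press; unknown presses
    leave the state unchanged and send nothing."""
    return TRANSITIONS.get((state, button), (state, None))

state, msg = press("ringing", "engage")
print(state, "-", msg)
```

Encoding the call as states keeps the button count at three while still covering engage, listen, and disengage, which is the overloading the paper describes.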
Comtouch
Chang, 2001
http://www.media.mit.edu/~anjchang/COMTOUCH/CHI/ComTouchl.rm
ComTouch uses the haptic modality. It allows a
handheld device to register the force of pressure
from each finger as the object is squeezed. At the
receiving end, vibrations under each finger
represent the transmitted force.
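The pressure-to-vibration mapping could be sketched as a simple per-finger scaling; the sensor and motor ranges below are assumptions for illustration, not ComTouch's actual hardware values.

```python
# Toy sketch of the ComTouch mapping: per-finger squeeze pressure on the
# sender becomes vibration intensity under the matching finger on the
# receiver. The ranges are assumed (10-bit sensor, 8-bit motor), not real.

MAX_PRESSURE = 1023   # assumed maximum pressure-sensor reading
MAX_AMPLITUDE = 255   # assumed maximum vibration-motor level

def pressures_to_vibrations(pressures):
    """Map each finger's pressure reading to a vibration amplitude."""
    return [round(p / MAX_PRESSURE * MAX_AMPLITUDE) for p in pressures]

# Five fingers squeezing with different forces:
print(pressures_to_vibrations([0, 256, 512, 768, 1023]))
```

Keeping the mapping per-finger preserves the spatial correspondence that lets the receiver feel which finger squeezed.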
LumiTouch
Chang et al. 2001

Wisneski, 1999
Awarenex
Hubbub
Hubbub: instant messenger that runs on a wireless Palm and a PC, enabling people to maintain
background awareness of others and send them quick messages. It uses a novel concept of "sound
instant messages," i.e., earcons that have meaning, such as "Hi" or "Thanks." Each user has a Sound ID
that announces their sound messages and their changes in availability. Users can protect their privacy
and control sound overload.
ConNexus: integrates awareness information, instant messaging, and other communication channels
in an interface that runs on a desktop computer.
Awarenex: extends that functionality to wireless handheld devices, such as a Palm. A speech interface also enables callers to make use of the awareness information over the telephone.
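Hubbub's idea of sound instant messages (meaningful earcons announced by a sender's Sound ID) might be sketched like this; the tone sequences and user names are invented placeholders, not Hubbub's actual sounds.

```python
# Sketch of Hubbub-style sound instant messages: a short earcon carries a
# fixed meaning, and the sender's Sound ID is played first so listeners know
# who it is. The tone sequences (in Hz) here are made up for illustration.

EARCONS = {"hi": [440, 550], "thanks": [660, 550, 440]}   # message -> tones
SOUND_IDS = {"ellen": [330, 330]}                          # user -> signature

def sound_message(sender, message):
    """Return the tone sequence: sender's Sound ID, then the earcon."""
    if message not in EARCONS or sender not in SOUND_IDS:
        raise KeyError("unknown sender or message")
    return SOUND_IDS[sender] + EARCONS[message]

print(sound_message("ellen", "hi"))  # -> [330, 330, 440, 550]
```

Because meaning lives in a small fixed vocabulary, the message can be understood in the audio background without demanding visual attention.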
FingeRing
Samsung Scurry
Arm-top metaphor
Wrist-top metaphor
References
[1] Abowd, G.D., and Mynatt, E.D. (2000). Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction (ToCHI), 7(1), March 2000, pp. 29-58.
http://www.acm.org/pubs/articles/journals/tochi/200071/p29abowd/p29abowd.pdf
[2] Argyle, M. & Dean, J. (1965). Eye-contact, distance, affiliation. Sociometry, 28, 289-304.
Abstract: http://nimbus.ocis.temple.edu/~mlombard/Presence/argyle65.html
http://links.jstor.org/sici?sici=0038-0431%28196509%2928%3A3%3C289%3AEDAA%3E2.0.CO%3B2-Q
[3] Bannon, L. (1995). Editorial, Commentaries and a response in the Suchman-Winograd debate. Computer Supported Cooperative Work (CSCW), 3(1), Netherlands: Kluwer Academics, pp. 29-95.
[4] Biocca, F., Burgoon, J., Harms, C., Stoner, M. (2001). Criteria and Scope Conditions for a Theory and Measure of Social Presence. In Proceedings of Presence 2001, 4th Annual International Workshop, May 21-23, 2001, Temple University, Philadelphia, PA.
http://nimbus.ocis.temple.edu/~mlombard/P2001/Biocca1.pdf
[5] Breazeal, C. (1999). Robot in Society: Friend or Appliance? In Agents99 Workshop on Emotion-Based Agent Architectures, Seattle, WA, pp. 18-26.
http://www.ai.mit.edu/projects/sociable/publications.html
[6] Chang, A. (2001). ComTouch: A Vibrotactile Emotional Communication Device. Technical report, MIT Media Lab (unpublished paper).
http://web.media.mit.edu/~anjchang/DS2001/comtouch.htm
[7] Chang, A., Resner, B., Koerner, B., Wang, X., and Ishii, H. (2001). LumiTouch: An Emotional Communication Device. In Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI'01), Seattle, Washington, USA, March 31-April 5, 2001, ACM Press, pp. 313-314.
http://tangible.media.mit.edu/papers/LumiTouch_CHI01/LumiTouch_CHI01.pdf
[8] Chapanis, A., Ochsman, R.B., Parrish, R.N., and Weeks, G.D. (1972). Studies in interactive communication: I. The effect of four communication modes on the behaviour of teams during cooperative problem-solving. Human Factors, 14(6), 487-509.
[9] Clark, H.H. (1996). Using language. New York, NY: Cambridge University Press.
Review: http://www.massey.ac.nz/~alock/virtual/clarke.htm
[10] Dautenhahn, K. (1998). The Art of Designing Socially Intelligent Agents: Science, Fiction, and the Human in the Loop. Special Issue on Socially Intelligent Agents, Applied Artificial Intelligence Journal, Vol. 12, 7-8, pp. 573-617.
http://orawww.cs.herts.ac.uk/~comqkd/papers.html
[11] Dautenhahn, K. (2000). Socially Intelligent Agents and the Primate Social Brain - Towards a Science of Social Minds. Proceedings of the AAAI Fall Symposium Socially Intelligent Agents - The Human in the Loop, AAAI Press, Technical Report FS-00-04, pp. 35-51.
http://orawww.cs.herts.ac.uk/~comqkd/papers.html
[12] Dey, A., Ljungstrand, P., and Schmidt, A. (2001). Distributed and Disappearing User Interfaces in Ubiquitous Computing. A workshop held at CHI 2001. In Extended Abstracts of Computer-Human Interaction (CHI) 2001, ACM Press, 2001, pp. 487-488.
http://www.teco.edu/chi2001ws/disui.pdf
[13] Dix, A., Rodden, T., Davies, N., Trevor, J., Friday, A., and Palfreyman, K. (2000). Exploiting space and location as a design framework for interactive mobile systems. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), Sept. 2000, pp. 285-321.
http://www.acm.org/pubs/articles/journals/tochi/200073/p285dix/p285dix.pdf
[14] Dryer, D.C., Eisbach, C., and Ark, W.S. (1999). At what cost pervasive? A social computing view of mobile computing systems. IBM Systems Journal, Volume 38, No. 4.
http://www.research.ibm.com/journal/sj/384/dryer.pdf
[15] Flores, F., Graves, M., Hartfield, B., and Winograd, T. (1988). Computer Systems and the Design of Organizational Interaction. ACM Trans. Office Info. Systems, April 1988, pp. 153-172.
http://portal.acm.org/citation.cfm?id=45943&coll=ACM&dl=ACM&CFID=1282938&CFTOKEN=25128089
[16] Fukumoto, M. and Tonomura, Y. (1999). Whisper: A Wristwatch Style Wearable Handset. ACM CHI'99 Proceedings, pp. 112-119.
http://www.acm.org/pubs/articles/proceedings/chi/302979/p112fukumoto/p112fukumoto.pdf
[17] Fukumoto, M., and Tonomura, Y. (1997). Body Coupled FingeRing: Wireless Wearable Keyboard. ACM CHI'97 Proceedings, pp. 147-154.
http://www.atip.or.jp/Akihabara/links/johanwear/ntt/fkm.htm
[18] Gong, L., and Lai, J. (2001). Shall We Mix Synthetic Speech and Human Speech? Impact on Users' Performance, Perception and Attitude. ACM CHI 2001 Proceedings, pp. 158-165.
http://www.acm.org/pubs/articles/proceedings/chi/365024/p158gong/p158gong.pdf
[19] Hansson, R. and Ljungstrand, P. (2000). The Reminder Bracelet: Subtle Notification Cues for Mobile Devices. In: Extended Abstracts of CHI 2000 (Student Poster), ACM Press, pp. 323-325.
http://www.viktoria.se/~rebecca/artiklar/chi2000poster.pdf
[20] Hansson, R., Ljungstrand, P., and Redström, J. (2001). Subtle and Public Notification Cues for Mobile Devices. In: Proceedings of UbiComp 2001, Atlanta, Georgia, USA.
http://www.viktoria.se/~rebecca/artiklar/Notification_final.pdf
[21] Hjelm, J. (2000). Designing Wireless Information Services. New York, NY: John Wiley & Sons.
http://www.wireless-information.net/
[22] Horvitz, E. (1999). Principles of Mixed-Initiative User Interfaces. ACM CHI'99 Proceedings, pp. 159-166.
http://www.acm.org/pubs/citations/proceedings/chi/302979/p159-horvitz/
[23] Isaacs, E., Walendowski, A., and Ranganathan, D. (2002). Hubbub: A sound-enhanced mobile instant messenger that supports awareness and opportunistic interactions. To be published April 2002 by ACM in the Proceedings of the Conference on Computer-Human Interaction (CHI), Minneapolis, MN.
http://www.izix.com/pro/lightweight/hubbub.php
[24] Ishii, H., and Ullmer, B. (1997). Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms. In Proceedings of CHI'97, pp. 234-241.
http://tangible.media.mit.edu/papers/Tangible_Bits_CHI97/Tangible_Bits_CHI97.pdf
[25] Lenat, D.B. (1995). Cyc: A Large-Scale Investment in Knowledge Infrastructure. Communications of the ACM, 38(11): 32-38.
http://www.acm.org/pubs/articles/journals/cacm/1995-38-11/p33-lenat/p33-lenat.pdf
[26] Luff, P., and Heath, C. (1998). Mobility in collaboration. ACM CSCW'98 Proceedings, pp. 305-314.
http://www.acm.org/pubs/articles/proceedings/cscw/289444/p305-luff/p305-luff.pdf
[27] Marx, M., and Schmandt, C. (1996). CLUES: Dynamic Personalized Message Filtering. Proceedings of CSCW'96, November 1996, pp. 113-121.
http://portal.acm.org/citation.cfm?doid=240080.240230
[28] Mehrabian, A. (1971). Silent Messages. Belmont, CA: Wadsworth.
[29] Milewski, A. and Smith, T.M. (2000). Providing Presence Cues to Telephone Users. Proceedings of CSCW 2000, Philadelphia, PA.
http://www.research.att.com/~tsmith/papers/milewski-smith.pdf
[30] Myers, B., Hudson, S.E., and Pausch, R. (2000). Past, Present and Future of User Interface Software Tools. ACM Transactions on Computer-Human Interaction (ToCHI), 7(1), March 2000, pp. 3-28.
http://www.acm.org/pubs/articles/journals/tochi/2000-7-1/p3-myers/p3-myers.pdf
[31] Mynatt, E.D., Back, M., Want, R., Baer, M., and Ellis, J.B. (1998). Designing Audio Aura. ACM CHI'98 Proceedings, pp. 566-573.
http://www.acm.org/pubs/articles/proceedings/chi/274644/p566-mynatt/p566-mynatt.pdf
[32] Nass, C., Steuer, J., Tauber, E., and Reeder, H. (1993). Anthropomorphism, Agency, & Ethopoeia: Computers as Social Actors. Presented at INTERCHI'93, Conference of the ACM/SIGCHI and the IFIP, Amsterdam, Netherlands, April 1993.
http://www.acm.org/pubs/citations/proceedings/chi/259964/p111-nass/
[33] Nelson, L., Bly, S., and Sokoler, T. (2001). Quiet calls: talking silently on mobile phones. ACM CHI 2001 Proceedings, pp. 174-181.
http://www.acm.org/pubs/articles/proceedings/chi/365024/p174-bly/p174-bly.pdf
[34] Nielsen, J. (1993). Noncommand user interfaces. An updated version of a paper that appeared in Communications of the ACM, 36(4), April 1993, pp. 83-99.
http://www.useit.com/papers/noncommand.html
[35] Norman, D.A. & Draper, S.W. (Eds.) (1986). User centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum Associates.
[36] Oppermann, R. and Specht, M. (1998). Adaptive support for a mobile museum guide. Proceedings of Interactive Applications of Mobile Computing '98, Rostock, Germany.
http://www.rostock.igd.fhg.de/veranstaltungen/workshops/imc98/Proceedings/imc98-SessionMA3-2.pdf
[37] Oviatt, S. and Cohen, P. (2000). Multimodal Interfaces That Process What Comes Naturally. Communications of the ACM, Vol. 43(3), March 2000, pp. 45-53.
http://www.acm.org/pubs/articles/journals/cacm/2000433/p45oviatt/p45oviatt.pdf
[38] Pascoe, J., Ryan, N., and Morse, D. (2000). Using while moving: HCI issues in fieldwork environments. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), September 2000, pp. 417-437.
http://www.acm.org/pubs/articles/journals/tochi/200073/p417pascoe/p417pascoe.pdf
[39] Rhodes, B. (2000). Just-In-Time Information Retrieval. Ph.D. Dissertation, MIT Media Lab, May 2000.
http://www.media.mit.edu/~rhodes/Papers/rhodes-phd-JITIR.pdf
[40] Rice, R.E. (1992). Task analyzability, use of new medium and effectiveness: A multi-site exploration of media richness. Organization Science, 3(4), pp. 475-500.
Abstract: http://nimbus.ocis.temple.edu/~mlombard/Presence/rice92.html
http://links.jstor.org/sici?sici=1047-7039%28199211%293%3A4%3C475%3ATAUONM%3E2.0.CO%3B2-A
[41] Rowson, J. (2001). The Social Media Project at Hewlett Packard Laboratories. Talk at the Stanford Networking Seminar of November 1, 2001, Center for the Study of Language and Information (CSLI), Stanford University.
http://netseminar.stanford.edu/sessions/2001-11-01.html
[42] Ruuska-Kalliokulju, S., Schneider-Hufschmidt, M., Väänänen-Vainio-Mattila, K., Von Niman, B. (2001). Shaping the Future of Mobile Devices. Results of the Workshop on Future Mobile Device User Interfaces at CHI 2000. SIGCHI Bulletin, January/February 2001.
http://www.acm.org/sigchi/bulletin/2001.1/mobile_cont.pdf
[43] Sallnäs, E.L. (1999). Presence in multimodal interfaces. Proceedings of the Second International Workshop on Presence, University of Essex, Colchester, UK.
http://www.nada.kth.se/~evalotta/Presence/IWVP.html
[44] Sawhney, N. and Schmandt, C. (2000). Nomadic Radio: Speech & Audio Interaction for Contextual Messaging in Nomadic Environments. ACM Transactions on Computer-Human Interaction (ToCHI), 7(3), Sept. 2000, pp. 353-383.
http://www.acm.org/pubs/articles/journals/tochi/2000-7-3/p353-sawhney/p353-sawhney.pdf
[45] Schmidt, A., Gellersen, H.-W., and Beigl, M. (1999). Matching Information and Ambient Media. In Proceedings of CoBuild'99, Second International Workshop on Cooperative Buildings, Pittsburgh. LNCS 1670. Springer: Heidelberg, 1999.
http://www.comp.lancs.ac.uk/~albrecht/pubs/pdf/schmidt_cobuild99_ambient.pdf
[46] Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human-Computer Interaction. Third Edition, Reading, MA: Addison-Wesley.
[47] Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: John Wiley.
[48] Singer, A., Hindus, D., Stifelman, L., White, S. (1999). Tangible Progress: Less Is More In Somewire Audio Spaces. In Proceedings of CHI'99, ACM, 1999, pp. 104-111.
http://portal.acm.org/citation.cfm?doid=302979.303007
[49] Steuer, J. (1995). Self vs. Other; Agent vs. Character; Anthropomorphism vs. Ethopoeia. In Vividness and Source of Evaluation as Determinants of Social Responses Toward Mediated Representations of Agency, doctoral dissertation, Stanford University, advised by Nass and Reeves.
http://www.cyborganic.com/People/jonathan/Academia/Dissertation/theory1.html
[50] Suhm, B., Myers, B., and Waibel, A. (1999). Model-based and empirical evaluation of multi-modal interactive error correction. ACM CHI'99 Proceedings, pp. 584-591.
http://www.acm.org/pubs/articles/proceedings/chi/302979/p584suhm/p584suhm.pdf
[51] Tang, J., Yankelovich, N., Begole, J., Van Kleek, M., Li, F., and Bhalodia, J. (2001). ConNexus to Awarenex: Extending awareness to mobile users. In Proceedings of ACM CHI 2001, Seattle, Washington, March 31-April 5, 2001, pp. 221-228.
http://www.sun.com/research/netcomm/papers/CHI2001Proc.pdf
[52] Walker, M.A., Fromer, J., Di Fabbrizio, G., Mestel, C., and Hindle, D. (1998). What can I say?: Evaluating a Spoken Language Interface to Email. ACM CHI'98 Proceedings, pp. 289-290.
http://www.acm.org/pubs/articles/proceedings/chi/274644/p582walker/p582walker.pdf
[53] Weiser, M. (1991). The computer for the 21st Century. Scientific American, Volume 265, Number 3, September 1991, pp. 94-104.
http://nano.xerox.com/hypertext/weiser/SciAmDraft3.html
[54] Wickens, C.D. (1992). Engineering Psychology and Human Performance. New York, NY: HarperCollins.
http://vig.prenhall.com/catalog/academic/product/1,4096,0321047117,00.html
[55] Williams, E. (1975). Medium or message: Communications medium as a determinant of interpersonal evaluation. Sociometry, 38(1), pp. 119-130.
http://links.jstor.org/sici?sici=0038-0431%28197503%2938%3C119%3AMOMCMA%3E2.0.CO%3B2-P
[56] Williams, E. (1977). Experimental comparisons of face-to-face and mediated communication: A review. Psychological Bulletin, 84(5), pp. 963-976.
[57] Wisneski, C.A. (1999). The Design of Personal Ambient Displays. Master's Thesis, Massachusetts Institute of Technology.
[58] Wisneski, C., Ishii, H., Dahley, A., Gorbet, M., Brave, S., Ullmer, B. and Yarin, P. (1998). Ambient Displays: Turning Architectural Space into an Interface between People and Digital Information. In Proceedings of CoBuild'98, International Workshop on Cooperative Buildings, Darmstadt, Germany, February 1998, Springer, pp. 22-32.
http://tangible.media.mit.edu/papers/Ambient_Disp_CoBuild98/Ambient_Disp_CoBuild98.pdf