Ben Norton
Masters Dissertation in Creative Systems University of Sussex: Department of Informatics September 2011
Acknowledgements
Firstly I would like to grant many thanks to my dissertation supervisor Dr. Thor Magnusson for his expert guidance and direction. I would also like to thank all at the University of Sussex who have imparted their considerable expertise, instruction and enlightenment: Dr. Chris Thornton, Dr. Pablo Romero, Dr. Nick Collins, Andrew Duff, Mary Krell, Dr. Paul Newbury and Dr. Christopher Frauenberger, as well as the others on Creative Systems: Andrew Lambert, Matt Garland, Steve Mansfield and Adrija Dey. I would further like to thank all participants in the design and evaluation process: David, Penny and Jen Ames, Matt Garland, Andrew Lambert, Thor Magnusson, Alastair and Lesley Norton, Maureen Relf, Maria and Morgan Scottow, and Hannah and Asher Walker whose contributions were invaluable, insightful and often surprising. I would finally like to thank my friends and family for all their support.
Abstract
This dissertation will examine recent developments in human-computer interaction. The core focus is on tangible and embodied interaction within the context of a creative system. The research will explore the constraints and affordances of the tangible interaction paradigm, informed by research in the cognitive sciences into embodied, extended and distributed cognition. A number of user-centred design practices, such as ethnography and participatory design, will be used in the production of a creative system based upon this research. The paper describes a creative system in the realm of computer music production and performance, and digital instrument design. As an activity intimately tied to realtime performance, much computer music research has been dedicated to the expressive control of digital and hybrid devices, and the major areas of practice and associated theoretical approaches will be examined herein. This paper will further analyse synchronic versus diachronic forms of joint creative activity and relate this to avenues of research within ubiquitous and physical computing and group creativity.
Table of Contents
Introduction 1
Chapter 1 Considerations from Philosophy and Cognitive Science 3
1.1 Phenomenology 3
1.1.1 Being in the World 4
1.1.2 Inhabiting the Body 4
1.2 The Embodied Mind 5
1.2.1 Embodied Concepts 5
1.2.2 Enaction 6
1.3 Extended Cognition 7
1.3.1 Distributed Cognition 7
Chapter 2 Tangible Interaction: The Tangible User Interface 9
2.1 The Broader Field of Tangible Interaction 9
2.2 Implementations and Application Domains 10
2.3 Ecology of Tangible Interaction: Constraints and Affordances 12
2.3.1 The Tangible and the Social: Embodied Interaction 13
2.4 Group Creativity 14
2.4.1 Realtime and Non-realtime Group Creativity 15
Chapter 3 The System 16
3.1 Design Considerations and Requirements Analysis: An Embodiment Perspective 18
3.1.1 Ethnomethodology and Technomethodology 19
3.1.2 User-Centered Design 20
3.1.3 Participatory Design 20
3.2 Results from the Participatory Design Sessions 20
3.3 Overview of the System 23
Chapter 4 Evaluation 25
4.1 Ecology of the System: Constraints and Affordances 27
Conclusions and Future Directions 29
Appendix I 31
Appendix II 37
Appendix III 39
Appendix IV 41
Bibliography 81
Introduction
This thesis will describe a creative system within the burgeoning field of tangible and embodied interaction, prevalent in current HCI research. The system is a set of novel interaction modalities, allowing the user to utilise physical objects to interface with the computer. The major concern of this field is a reconsideration of what it means to interact with computational technologies as an embodied person: situated within a wider social and technological culture, and interacting in a more physical, direct and natural manner. There is a growing body of research on the central role that the body plays in our cognitive processes, on how knowledge can be tacitly understood, and on how our environment shapes the way we think. The prevailing model in the history of modern computing has been the window, icon, menu, pointer (WIMP) interface. This office-based metaphor was developed from the conceptual framework outlined by Doug Engelbart at the Stanford Research Institute and was implemented in the oN-Line System (NLS). Engelbart defined four key elements in the framework which contribute to the goal of augmenting the human intellect (Engelbart 1962, cited in Moggridge 2007, pp. 30-37):

1. Artifacts: physical objects designed to provide for human comfort, for the manipulation of things or materials, and for the manipulation of symbols.
2. Language: the way in which the individual parcels out the picture of his world into the concepts that his mind uses to model that world, and the symbols that he attaches to those concepts and uses in consciously manipulating the concepts ("thinking").
3. Methodology: the methods, procedures, strategies, etc., with which an individual organizes his goal-centered (problem-solving) activity.
4. Training: the conditioning needed by the human being to bring his skills in using means 1, 2, and 3 to the point where they are operationally effective.

The NLS system radically changed the way in which people could manipulate information and interact with computers.
Up until that point most interaction was via batch processing on mainframe systems with punch cards, or through command-line interfaces. What is implicit in this new form of interaction enabled by Engelbart's framework is that it brings the body of the user into play in a physical, present and performative way: the interaction is a form of realtime computing, and the mouse, keyboard and keypad become extensions of the user's hands, operating in concert with the visual feedback from the graphical user interface (Figure 1.1: The NLS keyboard, keypad and mouse). Indeed the first element Engelbart defines in his framework is the use of artifacts: physical objects which can be augmented with computational power, thus generating new affordances. So we see that the interaction paradigm of the graphical user interface (herein GUI), mouse and keyboard, which was implemented in the NLS system and has dominated in ever-evolving forms to
this day, is only one of innumerable possible configurations for putting into practice a mode of interaction which implements the four core elements of Engelbart's framework. The model of the networked, attention-maximising terminal with the desktop-based graphical interface is now being augmented with, and in some cases replaced by, very different forms of interaction. Much current theory and practice in the field of HCI is focused upon these alternative methods for interacting with information. A particularly prominent area of research is being conducted in the field of tangible user interfaces (herein TUI). A prime example of a recent TUI is the reacTable, devised by the Music Technology Group at Pompeu Fabra University in Barcelona. It is a table-based collaborative musical instrument which utilises a computer vision system to track fiducial markers and multitouch input on its surface. It is a visually appealing and radically new way of performing live music using computational technologies (Geiger 2010). We will now consider the philosophical foundations and the developments in cognitive science which have led to this reconsideration of the way that our body shapes our mental abilities, how this can affect other people and objects in the environment, and how this can in turn mediate different forms of interaction with digital information.
1.1 Phenomenology
A number of key elements from the philosophy of phenomenology have been appropriated and recuperated into the disciplines of cognitive science in recent decades, following Hubert Dreyfus' sustained critiques from the perspective of phenomenology: Alchemy and Artificial Intelligence (1965) and What Computers Can't Do (1972) (cited in Boden 2006, pp. 838-841). What is central to current HCI practice in the development of tangible interfaces is the redress that ensues from reintroducing the body of the user and the context in which the interaction takes place. The core aim of phenomenology was to develop a complete understanding of the nature of first-person experience. It emerges with Edmund Husserl's interpretation of his teacher Franz Brentano's conception of 'intentionality', which states that our mental world is always directed toward and trained upon phenomena out in the world (1874, pp. 88-89). For Husserl the important first step in understanding the world from the first-person perspective was taken by 'bracketing' out elements of the experience (Husserl 2001). By stripping a moment of conscious awareness of the various preconceptions and prejudices we may have concerning that moment, and subsequently reacquainting ourselves with a properly transformed understanding of the experience, we would arrive at the fundamental phenomenal understanding of that moment. This level of understanding is one of induction, whereby the facets of the thing perceived-as-itself (what Husserl would term the noetic content) are an existence proof in themselves of the validity and authenticity of our intentionality directed towards them, whether that thing is a physical object or an abstract concept.
1.1.1 Being-in-the-World

One of Husserl's students, Martin Heidegger, devised a critical reappraisal of this version of phenomenology. In Being and Time (1962) Heidegger stresses the importance of context, the situation of being-in-the-world, which he termed 'Dasein'. Heidegger's version of phenomenology stressed not only the embedded nature of being-in-the-world but also the active and incorporated nature of the experiential phenomena which an embedded being perceives and acts upon. This notion is expressed in the concepts of 'ready-to-hand' and 'present-at-hand'. For Heidegger the most common stance in everyday existence toward any object of consciousness is that of the ready-to-hand. Heidegger uses the example of a carpenter using a hammer to illustrate this. The hammer is considered ready-to-hand when the knowledge of the use of the hammer is fully incorporated into the Dasein, that is to say that the act of hammering is unconscious, automated, and the being who is doing the hammering can direct their attention elsewhere. The present-at-hand is the kind of intentional state we exhibit when, for example, a hammer is broken or unusual in some way. In the present-at-hand mode we occupy a conscious position akin to the concept of bracketing laid out by Husserl. In this state the hammer presents to our consciousness many divergent aspects which can potentially enter into our consideration: the size, the weight, the material make-up of the hammer, any distinguishing marks, alternate uses and so on. This is true of the break-down of any element in our world which we would normally engage from the procedural, incorporated stance of the ready-to-hand.

1.1.2 Inhabiting the Body

Heidegger, however, devotes little attention to the medium of being-in-the-world: the material locus of our interactions with the world.
It is in Maurice Merleau-Ponty's Phenomenology of Perception (2002) that we find the most complete account of the role that the body plays in the construction of the phenomenological subject: Consciousness is being-towards-the-thing through the intermediary of the body. A movement is learned when the body has understood it, that is, when it has incorporated it into its 'world', and to move one's own body is to aim at things through it ... We must therefore avoid saying that our body is in space, or in time. It inhabits space and time (ibid., pp. 138-139, italics in original). Merleau-Ponty makes explicit that the phenomenological reduction espoused by Husserl would be constrained by the nature of being a body in time and space, with the contingencies that entails, i.e., the relative distance from which the perception occurs, the sensory organs mobilised to attend to the phenomena, the temporal aspect of the perception, and so on. The everyday nature of our perceptions is grounded by the fact that we tend to perceive things at a middle distance, that we can move around objects or, if they afford it, manipulate them with our hands. We utilise our sensorimotor capacities to form a phenomenal intuition, and this is a central design concern in TUIs, which will be explored in the third chapter. The situatedness of our perception of time is also a fundamental concern, as Merleau-Ponty states: Time presupposes a view of time. It is, therefore, not like a river, not a flowing substance. The fact that the metaphor based on this comparison has persisted from the time of Heraclitus to our
own day is explained by our surreptitiously putting into the river a witness of its course (ibid., p. 411). He goes on to add: let us no longer say that time is a 'datum of consciousness'; let us be more precise and say that consciousness deploys or constitutes time (ibid., p. 414). This notion of temporality is central to Heidegger's phenomenology, and it is clear that being-in-the-world is a continuum and that the body is a process over time. The practical engagement with the world of Dasein is necessarily oriented temporally, be it through past, present or future, and considerations of the representation of time in a TUI will be a central design concern laid out in chapter 3. We next consider the extent to which many theorists believe our cognitive abilities are embodied.
Condition 4: It is the level at which most of our knowledge is organized. This basic-level categorisation applies not only to objects but also to actions, emotions, and so on. The implication here is that the nature of our embodiment constrains and dictates the kinds of concepts which we are capable of having. This is seen clearly in the spatial relations apparent in the prepositions used in language. Lakoff and Johnson give a list of some examples used in the languages of the world: part-whole, center-periphery, link, cycle, iteration, contact, adjacency, forced motion (e.g., pushing, pulling, propelling), support, balance, straight-curve, and near-far. Orientations also used in the spatial-relations systems of the world's languages include vertical orientation, horizontal orientation, and front-back orientation (ibid., p. 35). These are forms of what Lakoff and Johnson term 'image schemas', derived from the experiential phenomena of being in the world, as a motile body with sensorimotor capacities for registering and reacting to movement and spatial orientation. The conclusion which Lakoff and Johnson arrive at is the possibility (borne out as an existence proof by neural modelling techniques) that conceptual inferences are a part of the sensorimotor system in the brain and that these categorical systems are learned unconsciously, emerging from the bodily interaction with the world (ibid., pp. 31-39). This entails a collapse of the distinction between concept and percept. Gallagher considers the problem in a similar vein from the perspective of theory of mind. He states that: It is not clear that we represent, explicitly or implicitly, the sorts of rules (causal-explanatory laws) that would summarize what we know of human situations and that would operate as the basis for a theoretical understanding of the other person (2005, p. 211).
He considers an experimental situation where the subject is required to observe and then simulate an action performed by another person. In the experiment fMRI is used, and the results show a significant overlap between observation of the action and simulation of the action in the supplementary motor area, the dorsal premotor cortex, the supramarginal gyrus, and the superior parietal lobe (ibid., p. 222). The implication is in line with the findings of Lakoff and Johnson regarding the potential for the sensorimotor system to perform perceptual and conceptual operations without any representation, but rather as one process.

1.2.2 Enaction

In the account detailed by Varela et al. in The Embodied Mind (1991, p. 173) there are two preliminary points defining enaction: (1) perception consists in perceptually guided action, and (2) cognitive structures emerge from the recurrent sensorimotor patterns that enable action to be perceptually guided. In the enactive view the key to understanding perception is understanding the structure of the embodiment which enables the embodied being to modulate its action in response to environmental factors. Moreover the embodied being has an evolutionary and developmental history in conjunction with the environment, and thus the environment and the being are mutually 'enacted' by reciprocal specification and selection (ibid., p. 174). Varela et al. go on to consider the evolutionary history of colour in various species, from the trichromatic colour vision of the primate to the tetrachromatic colour vision of goldfish and turtles (ibid., pp. 181-184). Each phylogenetic pathway is locally optimal for each species; therefore cognitive abilities and possibilities are optimally adapted to the local environment by structural coupling. This entails that the ecological affordances outlined by Gibson (1986) are not provided by the environment but co-dependently enacted between the environment and the perceiving-acting being (ibid., p. 204).
Varela et al. note
two levels of description which can be used to articulate the enactive stance: focusing on the structure of the system by describing it as composed of various subsystems ... or focus on the behavioural interactions of the system by describing it as a unity capable of various forms of coupling (ibid., p. 206). To properly understand cognition in this framework requires looking out onto the structurally coupled unity of an enacted being in its environment, a being-in-the-world. Indeed, if we wish to recover common sense, then we must invert the representationist attitude by treating context-dependent know-how not as a residual artifact that can be progressively eliminated by the discovery of more sophisticated rules but as, in fact, the very essence of creative cognition (ibid., p. 148, italics in original). In this view the representationalist accounts of cognitive science have been merely considering the residue, the vapour trails of cognition.
In his view the temporal nature of the interaction has an important role to play. Procedures can be sequentially unconstrained, that is, the procedure is not going to be derailed by operations happening out of sequence; and procedures can also be sequentially constrained, such that operations out of sequence could derail it (ibid., p. 198). Sequentially constrained operations require control through planning or backtracking. Consequently there exist many different levels of enaction, from the intrapersonal and interpersonal to higher orders of interacting groups, and all of these levels are mediated and modulated by the constraints of the individual body, by elements within the environment, and by technological apparatus, most of which are already evolutionarily black-boxed in such a way as to optimise the interactions between the elements in the cognitive process. This over-arching framework is absolutely central to the research field of tangible user interfaces (Dourish 2001). We now consider the history of how this field emerged.
1) Tangible Manipulation:
Haptic Direct Manipulation: physically moving objects with the hands.
Lightweight Interaction: reliable feedback from the system.
Isomorph Effects: understanding the relationship between actions and effects.

2) Spatial Interaction:
Inhabited Space: the meaning which emerges in the context of the interaction.
Configurable Materials: the degree to which the interactional elements can be configured.
Non-fragmented Visibility: a clear line of sight for all participants.
Full-Body Interaction: the role the whole body might play.
Performative Action: communication using gesture.

3) Embodied Facilitation:
Embodied Constraints: the way the system couples with the users' bodies.
Multiple Access Points: affording equal participation.
Tailored Representation: drawing on the users' tacit knowledge and inviting the interaction.
temperature sensors, and actuators which can produce many different types of motion (ibid., pp. 80-81). Examples of RFID TUIs include those of the Tangible Media Group, such as mediaBlocks (Ullmer et al. 1998), which consists of a set of small, wooden, electronically tagged blocks that facilitate containment and manipulation of online digital media; and the Senseboard (Jacob et al. 2002), which operates through the placement of tagged pucks on a whiteboard, allowing for the organisation of information and for executing commands. Similarly Smart Blocks is a system of tagged blocks which are designed to be connected together, whereupon the system can compute the volume and surface area of the resultant 3-D structure (Shaer & Hornecker 2010, p. 76). Examples of computer vision based TUIs include the DigitalDesk (Wellner 1993), the Urban Planning Workbench (Ishii et al. 2002), Illuminating Clay (Ishii et al. 2004), SandScape (ibid.) and the reacTable (Geiger et al. 2010). The Urban Planning Workbench, like the DigitalDesk, utilised an overhead camera to track physical objects placed on the work surface and a projector to overlay digital information. The application domain was urban planning, and it included functions for computing wind flow, shadows and window reflectance. Illuminating Clay and SandScape utilise an overhead laser and infra-red camera system respectively to compute ridges and troughs in surfaces made of clay and sand (Figure 2.4: Illuminating Clay). Like the Urban Planning Workbench, a projector is used to overlay digital information onto the clay and sand surfaces to represent states such as the flow of water over the landscape. The reacTable is a modular synthesis system which utilises physical objects with printed fiducials. These symbols are recognised and tracked by the computer vision system, with control data being transmitted according to the position, orientation and proximity of the symbols.
The systems mentioned here are all table-top systems, and the main reason is the affordances these systems provide for group participation. Rogers and Lindley (2004, cited in Sharp et al. 2007, p. 275) have found that horizontal display surfaces promote more collaboration and turn-taking practices in collocated groups than vertical display surfaces. There is, of course, no necessity for the computer vision systems to be table-top based. Examples of microcontroller based TUIs include Posey, an optocoupled poseable hub and strut construction system (Weller et al. 2008); Senspectra, a physical modeling toolkit for sensing and visualising structural strain (Leclerc et al. 2007) (Figure 2.5: Senspectra); Easigami, a reconfigurable folded-sheet TUI (Huang et al. 2009); and Block Jam (Newton-Dunn et al. 2003), a polyrhythmic sequencer utilising blocks which can be combined to create musical phrases. These TUI systems use a wide range of microcontrollers and sensors to enable rich and diverse interactions. One of the major constraints is that it is difficult to generate physical feedback. Most of the physical feedback is through the use of LEDs, while the coupled computer display provides richer multimedia digital feedback (ibid., p. 77).
interaction is a central aspect of music performance. The core advantages of space multiplexing are that two-handed operations can be performed and incorporated as procedural motor skill memory. This frees up attention to focus on other elements, and vision is also partially freed up as the tangible interaction becomes procedurally incorporated. One of the major constraints of TUIs is the physical form which the tangible elements take (Baskinger & Gross 2010). The tangible parts of the system have aesthetic and sociocultural significance to be taken into consideration. The TUI is also grounded in the sense that, unlike a virtual display, there is no ability to flip between windows, to access elements which employ various differentiated image schemas (think of the standard drop-down menus in most software applications and the various types of information which can be represented in each one). This constrains the interface to be necessarily consistent, but some TUIs have been combined with GUI elements (specifically mouse and keyboard operations) to enable tasks which require accessing extraneous levels of information which cannot be readily represented in the TUI (Ishii et al. 2002). Another constraint on a TUI is the amount of physical clutter, which can build exponentially as the application domain and data sets increase (Shaer & Hornecker 2010, pp. 106-107). As the tangible elements are not as malleable as graphical objects, they require more careful consideration of their operation, reusability, and the range of functions which they can perform. Perhaps the greatest strength of the TUI is its potential for use in co-located collaborative tasks (Shaer & Hornecker 2010, pp. 97-98). A major affordance of in situ shared tangible interaction is the capacity to convey information among participants with a greater degree of interpersonal intelligibility (Suchman 1987, p. 180).
The prevalence of tabletop interfaces, particularly circular ones, for shared work is an expression of the natural F-formation transaction space employed in co-present social interactions (Kendon 1990, cited in Hornecker 2006a, p. 32), whereby the shared overlapping region directly in front of two or more people oriented toward one another creates a specific kind of interaction space, affording egalitarian access and encouraging dialogue. This is distinct from the social interactions afforded by a traditional computer monitor or wall-based display, and thus encourages the kind of roundtable dialogues intended to be open and inclusive.

2.3.1 The Tangible and the Social: Embodied Interaction

The affordances for cooperative interaction in co-located collaborative work are revealing in the light of the concept of embodied interaction outlined in Paul Dourish's Where the Action Is (2001). Dourish argues that embodied interaction is a single research program encompassing both tangible and social interaction (ibid., p. 17). In his formulation context is the core focus. The reconsideration undertaken by Varela et al. (1991, pp. 151-157 and 200-214) of the Gibsonian model of affordances is useful in this respect. In their view the affordances an environment grants to an animal are not simply presented but enacted in a co-evolution of structural coupling. The forms which the structural coupling of agents and the environment take thus define the properties of the interaction. The idea of structural coupling is nuanced, being more than just input/output relations between an agent and its environment: it is a dynamic, adaptive and emergent process. Dourish lays out six principles which are central to the program of embodied interaction (2001, pp. 162, 166, 170, 177 & 183, italics in original):
1) Computation is a medium.
2) Meaning arises on multiple levels.
3) Users, not designers, create and communicate meaning.
4) Users, not designers, manage coupling.
5) Embodied technologies participate in the world they represent.
6) Embodied interaction turns action into meaning.

Meaning is an enactive process emerging from the interaction of people and their shared environment, as Dourish puts it (ibid., p. 206): Embodied interaction is about the relationship between action and meaning, and the concept of practice that unites the two. Action and meaning are not opposites. From the perspective of embodiment, they form a duality. Action both produces and draws upon meaning; meaning both gives rise to and arises from action ... This relationship between action and meaning implies a similar relationship between the physical and the symbolic. As embodied forms of interaction take place, the symbolic and the physical are bound together as image schemas are formed and attached to physical objects. This is a hermeneutic relationship (Ihde 1990, pp. 80-97) between technological objects and people. It is in the interaction of the physical and the symbolic in context that meaning becomes incorporated into practice. This interactive praxis becomes part of a community activity, and as its terms are negotiated a language-game emerges (Wittgenstein 1953).
embodied knowledge of day-to-day sensorimotor locomotion (Lakoff & Johnson 1999, pp. 139-161). Altering the conditions of the space involves a radical revision of the way we saw an idea before, suggesting that many forms of creativity are possible only if they accurately fit types of embodied cognition such as conceptual metaphors and image schemas. The heuristics are thus defined by our enacted history. In this view the symbolic language used to represent facets of the world comprises points defining the outline of a vastly more complex totality, which is shaded in tacit knowledge. This aligns with the notion of 'conceptual blending' outlined by Fauconnier and Turner in The Way We Think (2002), detailing the creative complexity involved in 'blending' conceptual structures in the analogical and creative thoughts which we deploy daily with effortlessness, as imaginative musings and through mental rehearsals of situations. Similar to Koestler's bisociation of matrices, but in greater detail, Fauconnier and Turner define conceptual blending as occurring when there is a 'network integration' of intersecting conceptual spaces (ibid., pp. 39-58). Conceptual blending theory, when applied to technology (ibid., pp. 195-215), can be considered somewhat analogous to a dynamic, soft-assembly version of theories of hermeneutic relations, as the hermeneutic interpretations may exist in a state of attentional flux, with creative conceptual blending operations being deployed and modulated by environmental factors. In light of the ideas from distributed cognition, what is essential is for individuals to be able to adapt creatively in group situations with a limited bandwidth of input.
Some problems for group creativity involve stopping the group from settling into conformist behaviours once stimuli have been collectively integrated, providing aggregation mechanisms for collective decisions, and allowing for both high levels of control and equal participation of all users (Fischer 2011, p. 46). In multi-user computational systems, the roles which users play can be highly flexible due to the malleability of the system (Jordà 2005, pp. 1-2). In the computer music domain an ensemble of laptop musicians can dynamically alter their instruments in ways inconceivable with a traditional orchestra. In addition it is important to maintain democratic engagement among the participants to avoid one member driving the interaction (Sawyer 2003, p. 9).

2.4.1 Realtime and Non-realtime Group Creativity

An improvising musician must both maintain coherence with the genre and the prior flow of the performance, while creating something novel. These are both necessary components of the improvisational process, and they do not operate in isolation but rather continually interact with each other during the generation of the improvisation (Clarke 1988, cited in Sawyer 2003, p. 91). We now analyse the ways in which creative interaction spaces emerge in realtime group creativity (synchronic) versus non-realtime product creativity (diachronic) (ibid., pp. 122-149). As was considered earlier with regard to distributed cognition, group operations can be sequentially constrained or sequentially unconstrained. In a live musical performance most actions of the performers are sequentially constrained. In Csíkszentmihályi's (1992) conception of flow he documents the deeply involved state an expert performer of an activity enters as they become immersed in the process, colloquially described as 'losing oneself' or being 'in the zone'.
In a 'synchronic' group musical performance, on-the-fly improvising of compositions, or 'jamming', is a kind of co-present social interaction, much like a conversation, with the various emergent dynamics that entails (Sawyer 2003, pp. 124-125).
The main differences between diachronic and synchronic creativity as postulated by Sawyer are that the diachronic is considered to be the 'macrohistorical' level and the synchronic the 'microinteractional' level (ibid., p. 128). There is an interplay between the two, as each is dependent upon the other. Synchronic creativity can refer to individual realtime improvisation as well as group improvisation; likewise, diachronic creativity is present in the pre-existing artifacts and cultural heritage performers utilise in an improvisational setting. What is key is the difference in time scales of the two. The ability displayed by people as they encounter each other and converse naturally is an astonishing feat of realtime creativity, often taken for granted. In a tangible interaction or TUI setting, it is the mediation of the dynamic equilibrium of these constraints, and the potential for fostering the right kinds of interaction, which will be uppermost in the design considerations for co-present collaborative interactions. This framework will now be considered in relation to the design of a tangible creative system.
3 The System
It was decided that the system would take the form of a computer vision-based tabletop tangible and multi-touch auditory interface. This interaction framework is supported by open source software, the hardware can be implemented cost-effectively, and it allows for rapid prototyping of interaction methods. The framework used in the system utilises the TUIO protocol developed by Martin Kaltenbrunner, Till Bovermann, Ross Bencina and Enrico Costanza.
The TUIO (Tangible User Interface Objects) protocol (Kaltenbrunner et al. 2005) enables the development of tangible user interfaces by transmitting control data between a TUIO-enabled tracker and client applications. TUIO is based on the transport-independent messaging protocol OSC (Open Sound Control), often used for networked communication between computers, sound synthesizers and multimedia devices. There are a number of TUIO tracker applications available, such as reacTIVision, Community Core Vision (CCV) and touchlib, as well as TUIO client implementations for common programming languages and development environments. The setup for the tabletop part of the system is a mid-tech assembly constructed from a circular glass table, a PlayStation Eye webcam with its infrared blocking filter replaced by an infrared bandpass filter, infrared LED lamps for illumination, and a screen diffuser made of tracing paper. It uses rear diffuse illumination (DI) similar to the reacTable. Objects placed on the surface reflect the diffused infrared light and the reflections appear as illuminated spots in the camera image; these regions of light intensity (technically called 'blobs') are then tracked by the computer vision system. The drawback of this technique is that the fidelity of object recognition by the computer vision system is not as high as in a frustrated total internal reflection (FTIR) system, but conversely it affords the combination of multi-touch finger input with physical object and fiducial marker tracking, which is not possible with FTIR (Han 2005). This system differs from similar DI setups, such as the reacTable, through the decision not to include a multimedia projector in the assembly (see fig 3.1), thus disabling visual feedback to the user via an overlaid GUI. This decision was made in order to constrain the feedback to purely auditory means.
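As a concrete illustration of the control data TUIO carries, the following sketch decodes the argument list of a /tuio/2Dobj 'set' message, whose layout follows the TUIO 1.1 specification. It is written in Python for brevity (the system itself used SuperCollider and Processing), and the function and field names are illustrative labels, not part of the protocol:

```python
def decode_2dobj_set(args):
    """Turn the OSC argument list of a /tuio/2Dobj 'set' message into a dict."""
    # TUIO 1.1 object profile: session id, fiducial (class) id, position,
    # angle, velocities, and motion/rotation acceleration.
    keys = ("session_id", "fiducial_id", "x", "y", "angle",
            "vel_x", "vel_y", "vel_angle", "motion_accel", "rotation_accel")
    return dict(zip(keys, args))

# Example: fiducial marker 3 resting at the centre of the surface.
state = decode_2dobj_set([12, 3, 0.5, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
```

The same pattern applies to the /tuio/2Dcur finger-cursor profile, which simply omits the fiducial id and the rotation-related fields.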
This places the system within the overlapping regions of HCI research into the coupling of tangible interfaces with both auditory displays utilising model-based sonification for representing data (de Campo et al. 2011, pp. 381-405), and gestural controllers for synthesis and composition in digital musical instruments (Miranda & Wanderley 2006).
Figure 3.2: The table interface.
In model-based sonification data can be aurally represented by a number of different means, allowing data points in high-dimensional spaces to be mapped to various parameters of our auditory capacity. Some sonification types include alarm signals, audification (e.g. mapping values in a time series to a waveform's amplitude), auditory icons (mapping a familiar sound to a control object or icon), and parameter mapping (where multiple data values are mapped to various attributes of a sound, such as duration, frequency, waveform, envelope, etc.) (Hermann & Ritter 1999). An example of this interaction model is the AudioDB system developed by the Neuroinformatics group at Bielefeld University (Bovermann 2009, pp. 132-144), which allows users to sort, group and select items in a database of auditory representations of data. The individual sounds are represented by movable tangible objects on a 2-D surface. In contrast to the analysis of data through sonification is the expressive production of audio synthesis and composition through the control of digital musical instruments. This is an active research field, through conferences such as New Interfaces for Musical Expression (NIME) and the International Computer Music Conference (ICMC), and through institutions such as the Studio for Electro-Instrumental Music (STEIM) and the Institut de Recherche et Coordination Acoustique/Musique (IRCAM). In the realm of digital musical instruments there exists a distinction between the augmented instrument (an acoustic instrument which has been modified with the addition of sensor devices to create an electro-acoustic hybrid; Collins 2010, pp. 213-214) and the purely digital instrument, which has none of the physical-material coupling of the control interface to the sound-producing elements evident in acoustic instruments. As Miranda and Wanderley note (2006, p. 4), the gestural controller and the sound generating unit can be treated entirely independently in digital musical instruments, allowing for far greater reconfigurability. The distinction in digital instrument design is between the physical controller and the patcher: the sound engine for the production of synthesis, such as Csound, Max/MSP, Pure Data or SuperCollider (Magnusson 2009, pp. 208-211). With the advances in computational power, and these software patchers to develop synthesis algorithms, allied to the affordability and availability of portable, sensor-rich technology such as smart-phones and modern laptops, the combinational possibilities are vast. As there is no natural mapping of sonic output from the patcher to the materiality of the instrument in digital instrument design, it is the task of the designer to carefully choose the artistic and ergonomic relations of instrument interface to output. The system would be developed using SuperCollider and Processing. SuperCollider is a dynamic and interactive programming language ideal for rapidly developing interaction methods, and it utilises OSC messaging at its core for communication between the client-side SC language and the SC server, which is used for audio digital signal processing. Processing is useful for rapidly developing program sketches and is well supported through third-party libraries. It supports server communication with SuperCollider through the p5_sc library, OSC communication through the oscP5 and netP5 libraries, and the TUIO protocol through the TUIO library.
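Because these interface-to-sound mappings must be chosen explicitly, a common building block is a linear range mapping (SuperCollider exposes this as linlin). The sketch below is an illustrative Python version; the control_to_sound function and its parameter ranges are hypothetical, not taken from the system described here:

```python
def lin_map(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from the range [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mapping: a tangible's normalised y position chooses a pitch
# between 220 Hz and 880 Hz; its normalised size chooses a note duration.
def control_to_sound(y_pos, size):
    freq = lin_map(y_pos, 0.0, 1.0, 220.0, 880.0)
    dur = lin_map(size, 0.0, 1.0, 0.1, 2.0)
    return freq, dur
```

In practice a designer would compose many such mappings, often exponential rather than linear for perceptual parameters like pitch and loudness.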
related to the requirements of the user. In gathering data for requirements it is kept in mind that designs will likely be unsuccessful if they are not adequately usable (Sharp et al. 2007, pp. 473-526). As the system is intended to be utilised as a tool in CSCW, it is essential that the design supports multiple-user interaction and that the types of interaction possibilities are ascertained with a focus on the end-user. The embodied nature of tangible interaction lent itself to the use of embodied methods in the design process.
3.1.1 Ethnomethodology and Technomethodology
Dourish and Button (1998) outline the definition of technomethodology as a sociological framework in system design derived from ethnomethodology. Ethnomethodology is a variant of sociological inquiry whereby the emergence of social structures and order is studied as the outcome of a set of methods employed by engaged members interacting in a social arena. A reflexive accountability (Garfinkel 1967, pp. 1-2) of activity, implicitly understood by the participants, comes forth as a negotiated and comprehensible practice. For the ethnomethodologist the qualitative data of ethnographic study, contextually grounded and informed by indexical expressions (ibid., pp. 4-6), are put to use in a systems-level analysis of an overarching social infrastructure. These analyses have become central within much research in HCI. If social activity is a complex emergent structure of interacting agents, whereby a commonly understood practice is performed, then the introduction of a technological system into this established order can result in deleterious effects, or, to use a biological analogy, outright rejection, rather than the intended integration: What ethnomethodology tells us is that the production of an account of action is an indexical (or situated) phenomenon.
In other words, a user will encounter a system in myriad settings and circumstances, and will attempt to find the system's behaviour rational and sensible with respect to whatever those infinitely variable circumstances might be, day to day and moment to moment (Dourish & Button 1998, p. 16). This is a problem of strata of abstraction: the levels of black-boxed behaviours which are utilised in modular systems design. Not only should the conceptual metaphor which the system imposes upon the user, such as the desktop interface, be consistent, but it should also make itself accountable. In Dourish and Button's analysis this is evident at the metalevel interface. Technomethodology is a means for discerning foundational equivalence between the ethnomethodological data and the proposed technological system, in any context.
Figure 3.3: (a) Clients' interaction with traditional black-box abstractions through standard abstraction barriers. (b) Open implementations also reveal inherent structure (Dourish & Button 1998, p. 18).
In the field of tangible user interface design the use of abstraction layers is problematised in three ways. First, the immutability of the physical object can limit the number of properties available in the display (e.g. multiple windows cannot be overlaid as with the GUI). Secondly, physical objects have physical properties which are tacitly incorporated into our cognitive understanding of the world: as users are manipulating physical objects, maintaining the consistency and expectancy of a conceptual metaphor in the interaction imposes strict constraints (although the potential for implementing surprising 'magical' properties should not be overlooked). And thirdly, accountability may be
difficult to implement when using physical objects, where interface real-estate may be at a premium. In many tangible interaction systems specific parts of the system are operated through a separate GUI, as there is often no obvious or natural way to integrate certain types of information into the TUI itself.
3.1.2 User-Centred Design
A number of techniques from user-centred design practices would be employed in deciding upon the interaction methods of the system. There are a number of constraints already in place: the tangible interface is a 2-D surface of fixed dimensions; the tracker and client interface uses the TUIO protocol, which contains a set of defined profiles for cursor, object and blob descriptors, each with specific attributes; and the system would output only audio as perceptual feedback to the user. To explore the affordances of these specific constraints, and to carry out requirements analysis, the potential end-users would be engaged in the design process from the outset. Specific user-centred design methods were employed to this end.
3.1.3 Participatory Design
Participatory design (PD) is a design practice intended to involve the end-users of a system as equal partners in the design process (Muller 2003). It is an inherently democratic approach to systems design, with its broad emphasis on ensuring a multiplicity of voices are heard from the inception of a system's design right through to release. The ideal consequence in PD practice is the negotiation and resolution of a language-game in which end-users, designers and potentially any other stakeholders define and delineate possible solutions in a discursive space primed for co-creation. The major affordance is the opportunity for the designers to learn something we didn't know we needed to know (ibid., p. 1054, italics in original). With this goal in mind it is essential that the co-design process with the potential users of a system take into account their tacit knowledge.
In order to reveal and understand this knowledge, which is implicit and often difficult to impart verbally, the designers should aim to negotiate the system's mechanics with the potential end-users through active cooperation and through crafting the illusion of actually working with the system. This serves as a means of assisting in the conceptualisation of the system in the round, and encourages creative feedback from the participants (Bødker, Grønbæk & Kyng 1993, pp. 157-175).
The tangible primitives provided were: spheres, cubes, discs, tori, containers, string and malleable dough. The participants were given the premise along with some of the constraints imposed by the system and could construct interaction models in any configuration they desired from these primitives. They were also encouraged to sketch their ideas on paper (see appendix I).
The participants ranged in age from 4 to 80 years and had a wide variety of levels of experience playing musical instruments, both acoustic and digital (see appendix II). This was deemed appropriate as the most likely use of a TUI is as an educational tool. The PD sessions were conducted in various home settings and one educational institutional setting, and they involved from one to five participants at a time. It was hoped that the elements would not be used in ways strictly isomorphic to GUI widgets, but rather in ways appropriate to the materiality of the elements themselves.
Figure 3.5: The author with a young participant in the design process.
A number of interaction styles materialised within the constraints and affordances of the elements provided, as participants engaged with and developed upon the proposed system. These would form the models on which to base the interaction prototypes. Events were associated with activities such as:
1) Direct manipulation: Most often positioning by hand, but also including throwing, wearing, blowing, rolling, spinning and colliding.
2) On-ness, in motion: The table constraint meant that placing an object on top would usually act as a trigger. This was often modulated by movement of the object.
3) Position in cartesian coordinate system: Usually along the x and y axes, but also along the z axis. An axis was often used to conceptualise a time dimension.
4) Content-container relationship: Content-locative operations (event triggered by placing objects in the container, or imprinting solid objects into the malleable dough) and container-locative operations (event triggered by removing an upturned container from an object, or surrounding an object with a torus).
5) Breaking into pieces/multiplication/dependencies: A large object would be separated into many smaller objects, each retaining characteristics of the original but differing in some respects. This action only occurred with non-quantised (mass noun) elements, such as dough.
6) Speed of motion and acceleration: Altering parameters through the velocity and acceleration of an object.
7) Aspect of an event: Instantaneous events, as in a collision of objects (telic); open-ended events, as in a repeated musical phrase or pattern which can be modified on the fly (atelic); and culmination, as in the completion of a predefined task, i.e., forming a specific shape. These were often combined to form complex behaviours.
8) Causation: Causing, as in a collision between agonist and antagonist; allowing or enabling, as in a path between two nodes; blocking, as in a container over an object.
9) Chance: Often accompanied with a throwing, rolling or spinning motion, allowing for dynamically random events through, e.g., throwing a block, spinning a disc or torus, or having an object break into parts upon collision, where the nature of the break makes a specific event occur.
10) Networks/constructions: Spatial relationships occur between multiple objects serving as nodes in a network or construct, sometimes explicitly linked with physical paths and sometimes simply linked by nearest-neighbour relations. This action occurred with both quantised (count noun) and non-quantised (mass noun) elements, such as cubes or dough.
11) Symbols and signs: Objects of a certain shape or colour, or coupled with a symbol such as a number, would represent a more complex element: a musical instrument type, or a rest period.
All of the participants had preconceptions about the relationships between sound and physicality, so that specific correlations recurred, such as lower pitch frequency assigned to larger objects, and pitch frequency ramping mapped to movement along an axis.
Some of the participants combined these typical associations and methods to generate sensible use cases and complex conceptual systems for interaction, with solutions for specific tasks such as:
1) Learning a musical scale.
2) Tangible composition on a stave.
3) Constructing musical sequences from chains of objects.
4) Ordering food in a restaurant using an auditory menu.
5) Constructing audio waveforms from malleable dough and morphing between them.
6) Playing a game with the objective of navigating a terrain using only auditory cues.
7) Playing a game of forming objects to be identified by a shape recognition system.
Tangible interaction modalities were then implemented to control parameters of the synthesis graphs, using qualitative data from the participatory design sessions as guidelines for prototyping. A tangible system for manipulating sound samples was developed from an observation of participants exploring a means for navigating a timeline. A piece of string was utilised as a playhead and passing it over objects would play back a sample. A program was implemented to allow a number of tangible elements to be used in a similar way, incorporating tangible interaction affordances via direct manipulation, movement, positioning in a cartesian coordinate system and speed of motion. Each tangible has its ID (assigned when it is coupled with the system) associated with a sound file. The sound files could be assigned to a buffer via a graphical user interface with ten sample banks, and the option was provided for loading a sample or directly recording via microphone or line-in. These objects could then be independently manipulated to play back Hann-windowed envelopes of sections of the sound file through synchronous granular synthesis when the tangible element is moved. By moving the tangible element left to right along the x axis the system would spawn grain envelopes from subsequent regions of the sample's wavetable, directly mapping the time dimension of the sample to a spatial dimension on the table, and the velocity of movement along the x and y axes would speed up, slow down or reverse the playback rate of the sample accordingly. In addition the triggering frequency of the grain envelope and the duration of the envelope were parametrically linked on the y axis. As a number of tangibles can be used at any one time, interesting polychromatic cloud formations are possible through repositioning and manipulation.
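The position-to-grain mapping described above might be sketched as follows. This is an illustrative Python rendering rather than the actual SuperCollider implementation, and the scaling constants are assumptions:

```python
def grain_params(x, y, vel_x, sample_frames):
    """Map a tangible's normalised position and x velocity to grain parameters."""
    read_pos = x * sample_frames        # x axis -> read position in the wavetable
    rate = vel_x * 4.0                  # velocity -> playback rate; sign reverses
    trig_freq = 1.0 + y * 49.0          # y axis -> grain triggers per second
    grain_dur = 1.0 / trig_freq         # envelope duration linked to trigger rate
    return read_pos, rate, trig_freq, grain_dur
```

Each returned tuple would drive one Hann-windowed grain voice, so several tangibles moving at once yield overlapping grain streams.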
A tangible interaction modality for sequencing musical phrases was also implemented, incorporating interaction affordances via direct manipulation, on-ness, position in a cartesian coordinate system, atelic aspect and networks/constructions. A number of participants in the design sessions suggested the use of musical sequencing operations, from composing on a tangible stave to a system where tangible objects representing notes and rest periods were connected vertically and placing a torus shape over the stack would trigger a looping sequence. A novel method of step sequencing was developed from the constraint of the circularity of the tabletop interface. In the case of the sample sequencer the tangible element is mapped to a sampled waveform. Placing a tangible on the surface triggers playback of the sample with re-triggering at
regular intervals. The frequency of the re-triggering is determined by the tangible element's proximity to the centre of the table: it increases stepwise and exponentially as the tangible is moved towards the centre, through beats of 1 bar, 1/2, 1/4, 1/8 and 1/16 of a bar. Adding tangibles to the table starts new trains of isochronous beats, and the angular position relative to the centre of the table determines which of the ten samples is triggered (again these are either loaded or directly recorded through the use of a GUI). The trigger rates allow for repetitive rhythms in simple meter for each tangible, but the timing of the placement of the tangible on the table affects where the initial beat falls, and as this is not quantised, and as more than one block can be used to trigger any sample, it affords syncopated and polyrhythmic beat possibilities. In addition this sequencing method was employed with the Karplus-Strong and analogue modelling synthesis algorithms. In these cases the linear 'note' range of the instrument was mapped to polar coordinates such that the angle to the centre of the table determined which note would be played. An interesting consequence of the participatory design sessions was the common concern that the gestural control of music often requires multiple interaction abstraction layers. Most musical instruments include control parameters separate from the playing interface (analogous to a means of switching modes from ready-to-hand to present-at-hand). These range from the time-multiplexed graphical user interface of a software instrument to the space-multiplexed foot pedals on a piano.
Figure 3.7: The Nintendo Wii remote.
This separation into abstraction layers in the system could have been achieved through a graphical user interface; however, in the context of embodied
interaction a useful multipurpose input device is the Nintendo Wii remote, which can communicate over OSC using the DarwiinRemoteOSC application and has been analysed as a digital musical instrument controller (Kiefer, Collins & Fitzpatrick 2008); for this reason it was decided to include it in the interaction prototypes. The Wii remote has eleven buttons and affords continuous control through the accelerometer data derived from adjusting its pitch and roll axes. All of the interaction modalities employed the use of the Wii remote. In the granular system it was utilised to control a reverb effect, the pitch axis affecting the size of the reverberant space and the roll axis controlling the dry/wet mix. In the sample sequencer system the remote controlled a reverb as well as a master clock for the tempo; this was sped up or slowed down by holding a button and rotating the remote through its roll axis, allowing for embodied accelerando and rallentando. In the Karplus-Strong system the remote controlled the reverb and the tempo clock, as well as a dynamically adjustable physical model of a resonant chamber and timbral qualities such as the sustain of the note. In the analogue modelled synth system the remote could be used to adjust envelope parameters and a resonant low-pass filter. Although these interaction types employ basic elements found in numerous popular digital musical instruments in the western tradition, such as the sample-based step sequencer, it was hoped that the different interaction styles employed in utilising them would open up novel ways of creating musical patterns from very simple foundations.
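Two of the mappings described in this section, the circular sequencer's polar layout and the Wii remote's tilt control, can be sketched as follows. Again this is illustrative Python, not the original SuperCollider code; the axis conventions, ranges and names are assumptions:

```python
import math

DIVISIONS = [1.0, 1/2, 1/4, 1/8, 1/16]   # retrigger intervals, in bars

def sequencer_mapping(x, y, num_banks=10):
    """Map a tangible's (x, y) on the unit-square table to (interval, sample bank)."""
    dx, dy = x - 0.5, y - 0.5                     # offset from the table centre
    radius = min(math.hypot(dx, dy) / 0.5, 1.0)   # 0 at centre, 1 at the rim
    # Halving subdivisions towards the centre give the stepwise, exponential rise.
    step = min(int((1.0 - radius) * len(DIVISIONS)), len(DIVISIONS) - 1)
    angle = math.atan2(dy, dx) % (2 * math.pi)    # angular position -> sample bank
    return DIVISIONS[step], int(angle / (2 * math.pi) * num_banks) % num_banks

def tilt_to_reverb(ax, ay, az):
    """Map accelerometer readings (gravity units) to (room_size, dry/wet mix)."""
    pitch = math.atan2(ay, az)                    # tilt about the remote's x axis
    roll = math.atan2(ax, az)                     # tilt about the remote's y axis
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(pitch / math.pi + 0.5), clamp(roll / math.pi + 0.5)
```

A tangible at the rim retriggers once per bar, while one near the centre retriggers every sixteenth; holding the remote flat leaves the reverb at its midpoint.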
From the data logged from the semantic differential questionnaire we can see that most users found the interface easy and pleasant to use. The Wii remote scored highly but also had a relatively high standard deviation, which might be expected as it is a successful and highly usable commercial product, but also carries a cultural significance which may colour some participants' views. The system's speed of operation scored highly which is important in a musical system as low latency is of paramount consideration. The system's reliability did not score so well as it sometimes failed in operation, and this result had a high standard deviation which is to be expected with occasional failures.
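For reference, the per-item mean and spread reported from the semantic differential questionnaire can be computed as below; the ratings shown are illustrative placeholders on a 1-7 scale, not the study's data:

```python
import statistics

def summarise(scores):
    """Mean and sample standard deviation of one questionnaire item's ratings."""
    return statistics.mean(scores), statistics.stdev(scores)

# Placeholder ratings: a high mean can coexist with a high standard deviation,
# as with the Wii remote item discussed above.
mean, sd = summarise([7, 7, 6, 3, 7, 2])
```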
The level of dependency on prior experience for ease of use scored poorly. Without obvious feedback from the system (such as displaying the currently playing note or meter) it took users time to understand the constraints and affordances. A frequently noted constraint was that there should be feedback, either through marked zones on the surface of the table, or through information displayed when a tangible is introduced, which would speed up the learning of the system. Feedback could also be relayed haptically, in a similar way to an acoustic instrument such as a guitar. The lack of feedback about the system state breaks one of the main heuristics of usability (Nielsen & Molich 1990); however, there is a trade-off in systems of this kind between how obvious they are to use and the scope for discovering new features and for mastering a device as you would an acoustic instrument. In addition the learning of advanced features and remembering control gestures and sequences to complete tasks did not score quite so highly. This might signal that the level of feedback from the system is insufficient once a user's understanding progresses beyond a certain point, and that without this feedback for tracking the system's state there is an increased cognitive load. One of the more pleasing outcomes was that when more than one user was present during the evaluation they collaborated in using the instrument without prompting. Sometimes one would operate the Wii remote while the other utilised the table; at other times they would take turns in adding tangibles to the table.
Table 4.1: Frequent comments from a survey of the positive and negative aspects of acoustic and digital musical instruments.
The tangible user interface incorporates elements of the acoustic and the digital. Though openness and adaptability are often sacrificed, inspiration can be gained and new pathways of affordances opened up through the materiality of the interface.
In the interaction modalities explored with the system it is evident that the lack of system state feedback was a problem for many users. Currently the best methods for recognising objects in a computer vision based tangible system are through the use of fiducials or through training the system with a support vector machine. The original intention was to develop the system so that it could be used with any object in an ambient environment; however, that introduces the constraint that some advanced capabilities may not be possible, as it is sometimes more practical and convenient to parcel out complex operations to a graphical user interface. Also, for the human users, tracking in working memory which tangible is associated with which sample/note often became confusing when there were more than a few tangibles in play, though this might be alleviated as users interact with the system over a length of time and their understanding becomes incorporated into procedural memory (Krakauer & Shadmehr 2006). The labelling of control parameters is also problematic without visual feedback, increasing the time taken to learn the system (though this may afford a degree of interest in discovering and remembering the relationships 'blind', as it were).
Appendix I
Participants' sketches which were generated in the participatory design process:
Age of participant: 50-59yrs. Interaction described: tangible composition on a stave, where x co-ordinate determines note pitch, y co-ordinate determines sequence of notes, a colour represents note length and the shape of the tangible element represents the instrument type. In addition expressive control of the instrument was to be performed separately on a dedicated region of the table.
Age of participant: 20-29yrs. Interaction described: using colour, shape and size co-ordination to learn musical scales.
Age of participant: 0-9yrs. Interaction described: torus shape with a kind of pressure or flexible sensor manipulated to determine the sound properties. Image on the left is a form of musical sequence.
Age of participant: 0-9yrs. Interaction described: Making a wooden chime sound when a tangible element is dropped into a small container.
Age of participant: 0-9yrs. Interaction described: Moving an object between two containers. Either of the containers can contain malleable dough.
Age of participant: 0-9yrs. Interaction described: An arrangement of string, with onomatopoeic sounds.
Age of participant: 0-9yrs. Interaction described: Different elements inside a container produce different sounds. Arrows display movement.
Age of participant: 20-29yrs. Interaction described: Shaping sounds with malleable dough.
Age of participant: 20-29yrs. Interaction described: A game where navigating with a tangible element is performed using only auditory cues.
Age of participant: 20-29yrs. Interactions described: 1) containment of sound filters volume, frequency, etc. 2) breaking an object into many parts generates complex interacting sounds from a simple one.
Appendix II
Demographic breakdown of participants by sex, age, experience of playing acoustic and digital musical instruments, and the types of instruments played:
Sex:
[Bar chart: number of participants by sex (Male/Female).]
Age:
[Bar chart: number of participants by age group (0-9yrs, 10-19yrs, 20-29yrs, 30-39yrs, 40-49yrs, 50-59yrs, 60+yrs).]
Experience of playing acoustic musical instrument/s, using software-based/digital musical instruments and using computers in general:
[Bar chart: years of experience (up to 1yr, 1-3yrs, 4-5yrs, 6-9yrs, 10+yrs) of playing acoustic musical instrument/s, using software-based/digital musical instrument/s, and using computers in general.]
[Chart: types of instruments played: keyboard, clarinet, piano, drums, flute, Ableton.]
Appendix III
Responses gathered from the experience sampling method.
Log recordings after ten minutes with the granular synthesis system:
I found the concept very interesting and state of the art. Could be used perhaps in entertainment field. Interactive video games? Child development? Something for blind people maybe. Found the ability to use more than one object at a time intriguing DJ system with different songs fade in/fade out. Increase/decrease volume. Being able to use any object could be useful. Science museum. You need to make sure the shapes are marked as I quickly forgot which was which. I liked the bird song best, the singing was difficult to hear as a song. The echoiness didn't really help much. The constant noise when the object(s) were not moving was confusing and distracting. The echo effect was very obvious but it didn't seem to matter which direction the Wiimote was turned. The parameters for where the blocks work could do with being marked so you know when you are out of the area. The noises/sounds are really fun to work with you kind of get a ghostly horror film soundtrack effect! Not sure if I like the Wii remote, simply because I was too engrossed in using both hands to move the blocks. Great fun nice to experiment with new sounds. Would be good as a learning device for children. Very inventive system. Never seen anything like it so unique
Log recordings after twenty minutes with the sample sequencer system:
Could see more potential as a useable concept. Again suited to children/entertainment/DJ etc. Easily recognised new sounds added and could hear them speeding up/slowing down. Good for blind people to make music but would need different shapes for different sounds. The different sounds, speed of beat and variation in reverberation were obvious and controllable. However, it would be better if the objects were tagged or named so I could remember which one was causing a particular sound.
Loved this one again as you know where the blocks can work you can build a whole rhythm as a group or alone. As a group it would be excellent practice for getting rhythm and timing skills to develop, and encourage team building. Then once you have a rhythm going you can move one of the blocks and throw the whole rhythm out and it's then the task of the team to create a new rhythm around the new position of said block. In a teaching environment there 39
could be cards printed with specific starting positions for the first block teaching time signatures, tempo and beat strength again it would then be the group's task to create a working rhythm around that pre-determined block. This could be a really exciting tool to build an enthusiasm for creating music. The use of the Wii remote here was useful because you could adjust the tempo and then put it down, so it didn't interfere with the action on the table. Marked zones on the playing surface would enable faster 'learning' of the system. No expert, but imagine great potential for children, esp. those with learning difficulties. Good to use, would be interesting to see how hard and long it would take to make melodies. System was easy to use and the Wii remote was also easy to use (sorry couldn't think of a better phrase). Log recordings after thirty minutes the plucked string sequencer system: Good being able to change notes. Would be better if had points of reference to see what note you're playing. Could see it being mixed with the drum beats. Liked being able to use fingers, felt more interactive. Felt the Wii remote really came into its own by affecting the sound. Impressed with how like a proper guitar it sounded and able to change the frequency of notes easily. Shapes or blocks need to be coloured or different shapes. Would get bored using blocks as they are. If easy to identify next step could be to write a piece of music that could be followed. A lot more variations were possible with this and some interesting notes and beats were obtained. A really fun and interactive way to explore pitch and note combinations, scales, intervals etc. With the two programs combined you could teach rhythm and pitch, so a classroom could have two units one set up to do each and have a whole class write an entire song in an easy accessible way. You would literally have your own table top orchestra. 
Not all people possess the ability to play something like a guitar or a drum kit but packaged in this way anyone could become a pro. It would even encourage people who never thought they could play or would want to play an instrument. Quite addictive and strangely calming. Lighting behind the playing surface would make even more fun. With practice could be used as an instrument. Would be novel and eye catching on stage. I think this could be developed as a tool to help children with hand eye coordination, especially children with visual impairments or other special needs.
Appendix IV
// MSc Dissertation
// Ben Norton. MSc Creative Systems.
// Date: 07/09/2011.
// SuperCollider program to store SynthDefs to memory.
// Run this to load UGen graphs
(
s.waitForBoot({
    {
        {Server.default = Server.local};
        s.sync;

        // Store the Synth Definitions to memory
        (
        // A synth graph for emulating a plucked string
        SynthDef(\karplusStrong, { arg outbus=0, midi=69, trigrate=1, dur=1,
                amp=0.2, pan=0.0, tempo=0.5, att=0, dec=0.001, delayTime;
            var burstEnv, out;
            delayTime = [midi, midi + 12].midicps.reciprocal;
            burstEnv = EnvGen.kr(Env.perc(att, dec),
                gate: Impulse.kr(trigrate/tempo));
            out = PinkNoise.ar([burstEnv, burstEnv]);
            out = CombL.ar(out, delayTime, delayTime, dur, add: out);
            Out.ar(outbus, Pan2.ar(out, pan));
        }).store;

        // A synth graph for granular synthesis
        SynthDef(\syncGrain, { arg outbus=0, trigrate=100, bufNo=0, rate=1,
                centerPos=0, duration=0.025, pan=0.0, amp=1, interpol=2;
            var dens, bufLength;
            dens = SinOsc.ar(trigrate);
            Out.ar(outbus, (TGrains.ar(2, dens.lag, bufNo,
                rate.lag(0.5) + TRand.kr(0, 0.01, dens),
                BufDur.kr(bufNo)*centerPos + TRand.kr(0, 0.005, dens),
                duration.lag + TRand.kr(0, 0.001, dens),
                WhiteNoise.kr(pan), amp.lag(0.5), interpol)));
        }).store;

        // A synth graph for analogue modelling synthesis
        SynthDef(\analogLF, { arg outbus=0, freq=60, trigrate=1, tempo=0.5,
                rez=3200, rq=0.0, randLo=0.975, randHi=1.025, amp=0.2, gate=1,
                aT=0.0, sT=0.2, rT=0.6, sL=0.8;
            var signal, env, analogRand;
            env = EnvGen.kr(Env.linen(aT, sT, rT, sL, 'sine'),
                gate: Impulse.kr(trigrate/tempo), doneAction: 0);
            analogRand = TRand.kr(randLo, randHi, gate);
            signal = BLowPass4.ar(Mix.fill(2, {
                LFSaw.ar([freq*analogRand, freq,
                    freq*analogRand*0.5, freq*0.5,
                    freq*analogRand*0.25, freq*0.25].midicps.round, 2)
                }.distort.distort*amp*env),
                rez.lag, rq.lag + 0.1);
            Out.ar(outbus, signal)
        }).store;

        // A synth graph for an arpeggiator using analogue-modelling synthesis
        SynthDef(\analogArp, { arg outbus=0,
                per1=0.5, per2=0.5, per3=0.5, per4=0.5,
                freq1=40, freq2=40, freq3=40, freq4=40,
                filt1=3200, filt2=3200, filt3=3200, filt4=3200,
                port=0.05, temp=0.5, amp=0.1;
            var freq, filter, source, signal;
            freq = Duty.kr(Dseq([per1,per2,per3,per4]*temp, inf), 0,
                Dseq([freq1,freq2,freq3,freq4], inf)).midicps.round;
            filter = Duty.kr(Dseq([per1,per2,per3,per4]*temp, inf), 0,
                Dseq([filt1,filt2,filt3,filt4], inf));
            source = Mix.new(
                LFSaw.ar([1.0,0.75,0.5,0.25,1.1,0.99,1.01]*freq.lag(port)));
            signal = BLowPass4.ar(source, filter.lag(0.0625), 0.5);
            Out.ar(outbus, Pan2.ar(signal)*amp.lag(1))
        }).store;

        // A synth graph for playing a recorded sample waveform
        SynthDef(\sampler, { arg outbus=0, bufNo=0, startPos=0, rate=1,
                tempo=0.5, loop=0, trigrate=1.0, done=0, pan=0.0, amp=0.4,
                aT=0.01, aL=1, dT=0.08, dL=0.8, sT=1, sL=0.8, rT=2, rL=0;
            var in, trigger;
            trigger = Impulse.kr(trigrate/tempo);
            in = PlayBuf.ar(1, bufNo, BufRateScale.kr(bufNo)*rate,
                trigger, startPos, loop, done);
            Out.ar(outbus, Pan2.ar(in, pan, amp));
        }).store;

        // A synth graph for physically modelling the resonant mode of an instrument
        SynthDef(\resonator, { arg inbus=0, outbus=0,
                freq1=110, freq2=220, freq3=440, freq4=880,
                klAmp1=0.0, klAmp2=0.0, klAmp3=0.0, klAmp4=0.0,
                ring1=0.1, ring2=0.1, ring3=0.1, ring4=0.1;
            var input, effect;
            input = In.ar(inbus, 2);
            effect = Limiter.ar(
                DynKlank.ar(`[[freq1,freq2,freq3,freq4].lag(0.2),
                    [klAmp1,klAmp2,klAmp3,klAmp4].lag(0.2),
                    [ring1,ring2,ring3,ring4].lag(0.2)], input),
                0.8, 0.01);
            Out.ar(outbus, effect);
        }).store;

        // A synth graph for a reverb fx unit
        SynthDef(\reverb, { arg inbus=0, outbus=0, mix=0.5,
                room=0.5, damp=0.5;
            var input, effect;
            input = In.ar(inbus, 2);
            effect = FreeVerb.ar(input, mix, room, damp);
            Out.ar(outbus, effect);
        }).store;
        // A synth graph for a delay fx unit
        SynthDef(\delay, { arg inbus=0, outbus=0, maxdelay=1,
                delaytime=0.1, decay=1.0;
            var input, effect;
            input = In.ar(inbus, 2);
            effect = CombL.ar(input, maxdelay, delaytime, decay);
            Out.ar(outbus, effect);
        }).store;

        // A synth graph for recording from input to the soundcard
        SynthDef(\recorder, { arg outbus=0, recLvl=1.0, preLvl=0, bufnum=0;
            var in;
            in = SoundIn.ar(0);
            RecordBuf.ar(in, bufnum, 0.0, recLvl, preLvl, 1, 1, 1);
        }).store;

        // A synth graph for defining node execution order
        SynthDef(\placeholder, { arg inbus=0, outbus=0;
            Out.ar(outbus, InFeedback.ar(inbus, 2));
        }).store;
        );
    }.fork
})
)
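The \karplusStrong definition above sets its comb-filter delay to the reciprocal of the note frequency (`.midicps.reciprocal`), i.e. the period of the string's fundamental. A minimal standalone sketch of that arithmetic (the class and method names are illustrative, not part of the listing):

```java
public class KarplusDelay {
    // MIDI note number to frequency in Hz, as SuperCollider's .midicps:
    // 440 Hz at MIDI 69 (A4), doubling every 12 semitones.
    static double midicps(double midi) {
        return 440.0 * Math.pow(2.0, (midi - 69.0) / 12.0);
    }

    // The comb-filter delay is the period of the fundamental: 1/frequency.
    static double delayTime(double midi) {
        return 1.0 / midicps(midi);
    }

    public static void main(String[] args) {
        // MIDI 69 = A4 = 440 Hz, so the delay line is 1/440 s long.
        System.out.printf("%.6f%n", delayTime(69)); // ≈ 0.002273
    }
}
```

The listing also feeds `[midi, midi + 12]` through the same conversion, producing a stereo pair of strings an octave apart.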
// SuperCollider program to create a GUI for loading samples to buffers
// and to record sounds through the line in
(
var w, file, soundPath;
~buffers = List[];

// Create a main frame window.
w = Window("Samples", Rect(20, 400, 540, 340), resizable: false);

// Bank of buttons to assign ten samples to ten buffers
~samples = Array.fill(10, { arg i;
    Button(w, Rect((i%5)*100+20, i.div(5)*150+20, 100, 50))
    .states_([[ "Sample " ++ (i+1), Color.black, Color.white ]])
    .action_({ arg button;
        if(button.value == 0) {
            ~load = Dialog.getPaths({ arg paths;
                paths.do({ arg soundPath;
                    ~buffers.insert(i, Buffer.readChannel(s, soundPath,
                        channels: [0], bufnum: i));
                })
            }, allowsMultiple: false);
        };
    })
});
// Bank of buttons to record 4 second input samples to ten buffers
~recs = Array.fill(10, { arg i;
    Button(w, Rect((i%5)*100+20, i.div(5)*150+70, 100, 50))
    .states_([
        [ "Record " ++ (i+1), Color.black, Color.white ],
        [ "Stop", Color.white, Color.red ]
    ])
    .action_({ arg button;
        if(button.value == 1) {
            ~buffers.insert(i, Buffer.alloc(s, s.sampleRate*4.0,
                numChannels: 1, bufnum: i));
            ~rec = Synth(\recorder, [\bufnum, i]) // args passed as an array
        };
        if(button.value == 0) { ~rec.free }
    });
});

w.front; // brings the window to the front
w.onClose = { ~buffers.free; ~rec.free }; // clears buffer
)
/**
 * Ben Norton
 * MSc Creative Systems.
 * Date: 07/09/2011.
 * This system implements a tangible granular synthesis instrument
 */
import TUIO.*;
import supercollider.*;
import oscP5.*;
import netP5.*;
HashMap synths = new HashMap();
Group group = new Group();
Synth synth = new Synth("syncGrain");
Synth place = new Synth("placeholder");
Synth rvrb = new Synth("reverb");
TuioProcessing tuioClient;
WiiController wiiController;

// these are some helper variables which are used
// to create scalable graphical feedback
float cursor_size = 30;
float object_size = 60;
float table_size = 760;
float scale_factor = 1;
PFont font;

void setup(){
  size(1024, 768);
  loop();
  frameRate(30);
  hint(ENABLE_NATIVE_FONTS);
  font = createFont("Helvetica.vlw", 16);
  textFont(font);
  scale_factor = height/table_size;
  // we set initial parameters of synth objects
  // to assign buses and node and group execution order
  place.set("inbus", 20);
  place.addToHead();
  rvrb.set("inbus", 20);
  rvrb.set("mix", 0.0);
  rvrb.set("room", 0.0);
  rvrb.addToTail();
  group.create();
  group.addToTail();
  // we create an instance of the WiiController
  wiiController = new WiiController();
  // we create an instance of the TuioProcessing client
  // since we add "this" class as an argument the
  // TuioProcessing class expects an implementation
  // of the TUIO callback methods (see below)
  tuioClient = new TuioProcessing(this);
}

// within the draw method we retrieve a Vector (List) of
// TuioObject and TuioCursor (polling) from the TuioProcessing
// client and then loop over both lists to draw the graphical feedback.
void draw(){
  background(0);
  textFont(font, 18*scale_factor);
  float obj_size = object_size*scale_factor;
  float cur_size = cursor_size*scale_factor;
  Vector tuioObjectList = tuioClient.getTuioObjects();
  for (int i=0; i<tuioObjectList.size(); i++) {
    TuioObject tobj = (TuioObject)tuioObjectList.elementAt(i);
    stroke(0);
    fill(0);
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());
    rect(-obj_size/2, -obj_size/2, obj_size, obj_size);
    popMatrix();
    fill(255);
    text(""+tobj.getSymbolID(), tobj.getScreenX(width),
         tobj.getScreenY(height));
  }
  Vector tuioCursorList = tuioClient.getTuioCursors();
  for (int i=0; i<tuioCursorList.size(); i++) {
    TuioCursor tcur = (TuioCursor)tuioCursorList.elementAt(i);
    Vector pointList = tcur.getPath();
    if (pointList.size()>0){
      stroke(255,0,0);
      TuioPoint start_point = (TuioPoint)pointList.firstElement();
      for (int j=0; j<pointList.size(); j++) {
        TuioPoint end_point = (TuioPoint)pointList.elementAt(j);
        line(start_point.getScreenX(width), start_point.getScreenY(height),
             end_point.getScreenX(width), end_point.getScreenY(height));
        start_point = end_point;
      }
      stroke(192,192,192);
      fill(192,192,192);
      ellipse(tcur.getScreenX(width), tcur.getScreenY(height),
              cur_size, cur_size);
      fill(0);
      text(""+tcur.getCursorID(), tcur.getScreenX(width)-5,
           tcur.getScreenY(height)+5);
    }
  }
  // the Wii remote button 'A' plus roll and pitch
  // adjusts the reverb parameters
  if(wiiController.buttonA){
    rvrb.set("mix", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0));
    rvrb.set("room", map(wiiController.pitch, 92.0, -90.0, 0.0, 10.0));
  }
}

// these callback methods are called whenever a TUIO event occurs

// called when an object is added to the scene
void addTuioObject(TuioObject tobj) {
}

// called when an object is removed from the scene
void removeTuioObject(TuioObject tobj) {
}

// called when an object is moved
void updateTuioObject (TuioObject tobj) {
}

// called when a cursor is added to the scene
void addTuioCursor(TuioCursor tcur) {
  int id = tcur.getCursorID()%10;
  Synth synth;
  if (id == 0) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 0);
  } else if (id==1) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 1);
  } else if (id==2) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 2);
  } else if (id==3) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 3);
  } else if (id==4) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 4);
  } else if (id==5) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 5);
  } else if (id==6) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 6);
  } else if (id==7) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 7);
  } else if (id==8) {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 8);
  } else {
    synth = new Synth("syncGrain");
    synth.set("bufNo", 9);
  }
  synths.put(tcur.getSessionID(), synth);
  synth.set("outbus", 20);
  synth.set("amp", 0.2);
  synth.set("trigrate", map(tcur.getY(), 0.0, 1.0, 70.0, 0.1));
  synth.set("centerPos", tcur.getX());
  synth.addToHead(group);
  synths.put(tcur.getCursorID(), synth);
}
// called when a cursor is moved
void updateTuioCursor (TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  // the granular synthesis parameters are adjustable
  // through positioning in x and y co-ordinates and
  // the speed of the cursor in motion
  synth.set("centerPos", tcur.getX());
  synth.set("duration", tcur.getY()*0.1);
  synth.set("trigrate", map(tcur.getY(), 0.0, 1.0, 200.0, 0.001));
  synth.set("rate", map(tcur.getXSpeed(), -10.0, 10.0, -1, 3));
  synth.set("rate", map(tcur.getYSpeed(), -10.0, 10.0, -1, 3));
  // the amplitude of the individual sample playback is
  // audible only if the cursor crosses a motion threshold
  if(tcur.getMotionSpeed() > 0.05){
    synth.set("amp", 1.0);
  }
  else{
    synth.set("amp", 0.0);
  }
}

// called when a cursor is removed from the scene
void removeTuioCursor(TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  synth.free();
  if(tuioClient.getTuioCursors().size() == 0) {
    group.freeAll();
  }
}

// called after each message bundle
// representing the end of an image frame
void refresh(TuioTime bundleTime) {
  redraw();
}

// called after the program is stopped
// frees all SC server-side objects
public void stop(){
  group.free();
  place.free();
  rvrb.free();
  synth.free();
}
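Throughout these listings, Processing's map() rescales the Wii remote's roll and pitch readings (roughly -85 to +91 degrees here) onto synthesis parameter ranges. A minimal standalone sketch of that linear re-mapping, assuming the standard map() semantics (the class name is illustrative, not from the listing):

```java
public class WiiMap {
    // Linear re-mapping of a value from one range to another,
    // equivalent to Processing's map() as used for Wii roll/pitch.
    static float map(float value, float start1, float stop1,
                     float start2, float stop2) {
        return start2 + (stop2 - start2) * ((value - start1) / (stop1 - start1));
    }

    public static void main(String[] args) {
        // Roll of -85 degrees maps to a reverb mix of 0.0, +91 to 1.0.
        System.out.println(map(-85.0f, -85.0f, 91.0f, 0.0f, 1.0f)); // 0.0
        System.out.println(map(91.0f, -85.0f, 91.0f, 0.0f, 1.0f));  // 1.0
    }
}
```

Note that some calls in the listings invert the input range (e.g. pitch mapped from 92.0 down to -90.0), which simply reverses the direction of control; the same formula handles both cases.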
/**
 * Ben Norton
 * MSc Creative Systems.
 * Date: 07/09/2011.
 * This system implements a tangible sample-based synthesis instrument
 */
import TUIO.*;
import supercollider.*;
import oscP5.*;
import netP5.*;
HashMap synths = new HashMap();
Group group = new Group();
Synth place = new Synth("placeholder");
Synth rvrb = new Synth("reverb");
Synth synth = new Synth("sampler");
TuioProcessing tuioClient;
WiiController wiiController;

// these are some helper variables which are used
// to create scalable graphical feedback
float cursor_size = 30;
float object_size = 60;
float table_size = 760;
float scale_factor = 1;
PFont font;
void setup(){
  size(1024, 768);
  loop();
  frameRate(30);
  hint(ENABLE_NATIVE_FONTS);
  font = createFont("Helvetica.vlw", 16);
  textFont(font);
  scale_factor = height/table_size;
  // we set initial parameters of synth objects
  // to assign buses and node and group execution order
  place.set("inbus", 20);
  place.addToHead();
  rvrb.set("inbus", 20);
  rvrb.set("mix", 0.0);
  rvrb.set("room", 0.0);
  rvrb.addToTail();
  group.create();
  group.addToTail();
  // we create an instance of the WiiController
  wiiController = new WiiController();
  // we create an instance of the TuioProcessing client
  // since we add "this" class as an argument the
  // TuioProcessing class expects an implementation
  // of the TUIO callback methods (see below)
  tuioClient = new TuioProcessing(this);
}
// within the draw method we retrieve a Vector (List) of
// TuioObject and TuioCursor (polling) from the TuioProcessing
// client and then loop over both lists to draw the graphical feedback.
void draw(){
  background(0);
  textFont(font, 18*scale_factor);
  float obj_size = object_size*scale_factor;
  float cur_size = cursor_size*scale_factor;
  Vector tuioObjectList = tuioClient.getTuioObjects();
  for (int i=0; i<tuioObjectList.size(); i++) {
    TuioObject tobj = (TuioObject)tuioObjectList.elementAt(i);
    stroke(0);
    fill(0);
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());
    rect(-obj_size/2, -obj_size/2, obj_size, obj_size);
    popMatrix();
    fill(255);
    text(""+tobj.getSymbolID(), tobj.getScreenX(width),
         tobj.getScreenY(height));
  }
  Vector tuioCursorList = tuioClient.getTuioCursors();
  for (int i=0; i<tuioCursorList.size(); i++) {
    TuioCursor tcur = (TuioCursor)tuioCursorList.elementAt(i);
    Vector pointList = tcur.getPath();
    if (pointList.size()>0){
      stroke(255,0,0);
      TuioPoint start_point = (TuioPoint)pointList.firstElement();
      for (int j=0; j<pointList.size(); j++) {
        TuioPoint end_point = (TuioPoint)pointList.elementAt(j);
        line(start_point.getScreenX(width), start_point.getScreenY(height),
             end_point.getScreenX(width), end_point.getScreenY(height));
        start_point = end_point;
      }
      stroke(192,192,192);
      fill(192,192,192);
      ellipse(tcur.getScreenX(width), tcur.getScreenY(height),
              cur_size, cur_size);
      fill(0);
      text(""+tcur.getCursorID(), tcur.getScreenX(width)-5,
           tcur.getScreenY(height)+5);
    }
  }
  // the Wii remote 'A' button plus roll and pitch adjust reverb parameters
  if(wiiController.buttonA){
    rvrb.set("mix", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0));
    rvrb.set("room", map(wiiController.pitch, 92.0, -90.0, 0.0, 1.0));
  }
}

// these callback methods are called whenever a TUIO event occurs

// called when an object is added to the scene
void addTuioObject(TuioObject tobj) {
}

// called when an object is removed from the scene
void removeTuioObject(TuioObject tobj) {
}

// called when an object is moved
void updateTuioObject (TuioObject tobj) {
}

// called when a cursor is added to the scene
void addTuioCursor(TuioCursor tcur) {
  // assign specified buffers to cursor instances
  // when they're introduced to the scene
  int id = tcur.getCursorID()%2;
  Synth synth;
  if (id == 0) {
    synth = new Synth("sampler");
  } else {
    synth = new Synth("sampler");
  }
  synths.put(tcur.getSessionID(), synth);
  // set initial parameters of the synth objects
  synth.set("outbus", 20);
  synth.set("amp", 1.0);
  // set initial envelope triggering rate relative to
  // the cursor distance to centre of table
  if(tcur.getDistance(0.5,0.5) < 0.15){
    synth.set("trigrate", 1.0);
  } else if(tcur.getDistance(0.5,0.5) < 0.275){
    synth.set("trigrate", 0.5);
  } else if(tcur.getDistance(0.5,0.5) < 0.4){
    synth.set("trigrate", 0.25);
  } else if(tcur.getDistance(0.5,0.5) < 0.5){
    synth.set("trigrate", 0.125);
  } else {
    synth.set("trigrate", 0.0625);
  }
  // set initial sample buffer relative to
  // the cursor angle to centre of table
  if(tcur.getAngleDegrees(0.5,0.5) < 36){
    synth.set("bufNo", 0);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 72){
    synth.set("bufNo", 1);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 108){
    synth.set("bufNo", 2);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 144){
    synth.set("bufNo", 3);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 180){
    synth.set("bufNo", 4);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 216){
    synth.set("bufNo", 5);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 252){
    synth.set("bufNo", 6);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 288){
    synth.set("bufNo", 7);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 324){
    synth.set("bufNo", 8);
  } else {
    synth.set("bufNo", 9);
  }
  synth.addToHead(group);
  synths.put(tcur.getCursorID(), synth);
}
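The angle tests in addTuioCursor partition the 360-degree polar angle around the table centre into ten 36-degree sectors, one per sample buffer. The whole conditional chain reduces to a single division; a standalone sketch (hypothetical helper, not part of the listing):

```java
public class SectorMap {
    // Map a polar angle in degrees [0, 360) to one of ten sample
    // buffers, each buffer owning a 36-degree sector; the clamp
    // guards against an input of exactly 360.
    static int bufferForAngle(double degrees) {
        return Math.min((int)(degrees / 36.0), 9);
    }

    public static void main(String[] args) {
        System.out.println(bufferForAngle(10.0));  // 0
        System.out.println(bufferForAngle(100.0)); // 2
        System.out.println(bufferForAngle(350.0)); // 9
    }
}
```

The explicit chain in the listing behaves identically for angles in [0, 360); the division form is simply more compact if the sector count changes.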
// called when a cursor is moved
void updateTuioCursor (TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  // set envelope triggering rate relative to
  // the cursor distance to centre of table
  if(tcur.getDistance(0.5,0.5) < 0.2){
    synth.set("trigrate", 1.0);
  } else if(tcur.getDistance(0.5,0.5) < 0.325){
    synth.set("trigrate", 0.5);
  } else if(tcur.getDistance(0.5,0.5) < 0.425){
    synth.set("trigrate", 0.25);
  } else if(tcur.getDistance(0.5,0.5) < 0.5){
    synth.set("trigrate", 0.125);
  } else {
    synth.set("trigrate", 0.0625);
  }
  // set sample buffer relative to
  // the cursor angle to centre of table
  if(tcur.getAngleDegrees(0.5,0.5) < 36){
    synth.set("bufNo", 0);
  }
  else if(tcur.getAngleDegrees(0.5,0.5) < 72){
    synth.set("bufNo", 1);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 108){
    synth.set("bufNo", 2);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 144){
    synth.set("bufNo", 3);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 180){
    synth.set("bufNo", 4);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 216){
    synth.set("bufNo", 5);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 252){
    synth.set("bufNo", 6);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 288){
    synth.set("bufNo", 7);
  } else if(tcur.getAngleDegrees(0.5,0.5) < 324){
    synth.set("bufNo", 8);
  } else {
    synth.set("bufNo", 9);
  }
  // the Wii remote 'Home' button plus roll assigns overall tempo
  if(wiiController.buttonHome){
    synth.set("tempo", map(wiiController.roll, -85.0, 91.0, 2.0, 0.25));
  }
}
// called when a cursor is removed from the scene
void removeTuioCursor(TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  synth.free();
  if (tuioClient.getTuioCursors().size() == 0) {
    group.freeAll();
  }
}
// called after each message bundle
// representing the end of an image frame
void refresh(TuioTime bundleTime) {
  redraw();
}
// called after the program is stopped
// frees all SC server-side objects
public void stop(){
  group.free();
  place.free();
  rvrb.free();
  synth.free();
}
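The distance tests in this listing form concentric rings around the table centre, halving the envelope trigger rate as a cursor moves outward. A standalone sketch of that mapping, using the thresholds from updateTuioCursor above (the class and method names are illustrative):

```java
public class TrigrateRings {
    // Map the cursor's normalised distance from the table centre
    // (0.0 = centre, ~0.7 = corner) to an envelope trigger rate;
    // each outer ring halves the rate of the ring inside it.
    static double trigrate(double distanceFromCentre) {
        if (distanceFromCentre < 0.2)   return 1.0;
        if (distanceFromCentre < 0.325) return 0.5;
        if (distanceFromCentre < 0.425) return 0.25;
        if (distanceFromCentre < 0.5)   return 0.125;
        return 0.0625;
    }

    public static void main(String[] args) {
        System.out.println(trigrate(0.1));  // 1.0
        System.out.println(trigrate(0.45)); // 0.125
    }
}
```

Because the rates are successive powers of two of a common pulse, cursors at different radii stay rhythmically locked, which is what makes the group-rhythm behaviour described in the feedback logs possible.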
/**
 * Ben Norton
 * MSc Creative Systems.
 * Date: 07/09/2011.
 * This system implements a tangible physical-modelling
 * synthesis of a plucked string instrument
 */
import TUIO.*;
import supercollider.*;
import oscP5.*;
import netP5.*;
HashMap synths = new HashMap();
Group group = new Group();
Synth place = new Synth("placeholder");
Synth rvrb = new Synth("reverb");
Synth synth = new Synth("karplusStrong");
Synth reson = new Synth("resonator");
TuioProcessing tuioClient;
WiiController wiiController;

// these are some helper variables which are used
// to create scalable graphical feedback
float cursor_size = 30;
float object_size = 60;
float table_size = 760;
float scale_factor = 1;
PFont font;

void setup(){
  size(1024, 768);
  loop();
  frameRate(30);
  hint(ENABLE_NATIVE_FONTS);
  font = createFont("Helvetica.vlw", 16);
  textFont(font);
  scale_factor = height/table_size;
  // we set initial parameters of synth objects
  // to assign buses and node and group execution order
  place.set("inbus", 20);
  place.addToHead();
  rvrb.set("inbus", 20);
  rvrb.set("mix", 0.0);
  rvrb.set("room", 0.0);
  rvrb.addToTail();
  reson.set("inbus", 20);
  reson.addToTail();
  group.create();
  group.addToTail();
  // we create an instance of the WiiController
  wiiController = new WiiController();
  // we create an instance of the TuioProcessing client
  // since we add "this" class as an argument the
  // TuioProcessing class expects an implementation
  // of the TUIO callback methods (see below)
  tuioClient = new TuioProcessing(this);
}

// within the draw method we retrieve a Vector (List) of
// TuioObject and TuioCursor (polling) from the TuioProcessing
// client and then loop over both lists to draw the graphical feedback.
void draw(){
  background(0);
  textFont(font, 18*scale_factor);
  float obj_size = object_size*scale_factor;
  float cur_size = cursor_size*scale_factor;
  Vector tuioObjectList = tuioClient.getTuioObjects();
  for (int i=0; i<tuioObjectList.size(); i++) {
    TuioObject tobj = (TuioObject)tuioObjectList.elementAt(i);
    stroke(0);
    fill(0);
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());
    rect(-obj_size/2, -obj_size/2, obj_size, obj_size);
    popMatrix();
    fill(255);
    text(""+tobj.getSymbolID(), tobj.getScreenX(width),
         tobj.getScreenY(height));
  }
  Vector tuioCursorList = tuioClient.getTuioCursors();
  for (int i=0; i<tuioCursorList.size(); i++) {
    TuioCursor tcur = (TuioCursor)tuioCursorList.elementAt(i);
    Vector pointList = tcur.getPath();
    if (pointList.size()>0){
      stroke(255,0,0);
      TuioPoint start_point = (TuioPoint)pointList.firstElement();
      for (int j=0; j<pointList.size(); j++) {
        TuioPoint end_point = (TuioPoint)pointList.elementAt(j);
        line(start_point.getScreenX(width), start_point.getScreenY(height),
             end_point.getScreenX(width), end_point.getScreenY(height));
        start_point = end_point;
      }
      stroke(192,192,192);
      fill(192,192,192);
      ellipse(tcur.getScreenX(width), tcur.getScreenY(height),
              cur_size, cur_size);
      fill(0);
      text(""+tcur.getCursorID(), tcur.getScreenX(width)-5,
           tcur.getScreenY(height)+5);
    }
  }
  // the Wii remote 'A' button plus roll and pitch
  // adjusts reverb parameters
  if(wiiController.buttonA){
    rvrb.set("mix", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0));
    rvrb.set("room", map(wiiController.pitch, 92.0, -90.0, 0.0, 10.0));
  }
  // the Wii remote direction buttons plus roll and pitch set
  // resonant mode frequency and amplitude
  if(wiiController.buttonLeft){
    reson.set("freq1", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp1", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonUp){
    reson.set("freq2", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp2", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonRight){
    reson.set("freq3", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp3", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonDown){
    reson.set("freq4", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp4", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
}

// these callback methods are called whenever a TUIO event occurs

// called when an object is added to the scene
void addTuioObject(TuioObject tobj) {
}

// called when an object is removed from the scene
void removeTuioObject(TuioObject tobj) {
}

// called when an object is moved
void updateTuioObject (TuioObject tobj) {
}

// called when a cursor is added to the scene
void addTuioCursor(TuioCursor tcur) {
  // assign synth to cursor instance
  // when introduced to the scene
  int id = tcur.getCursorID()%2;
  Synth synth;
  if (id == 0) {
    synth = new Synth("karplusStrong");
  } else {
    synth = new Synth("karplusStrong");
  }
  // set initial envelope triggering rate relative to
  // the cursor distance to centre of table
  if(tcur.getDistance(0.5,0.5) < 0.15){
    synth.set("trigrate", 1.0);
  } else if(tcur.getDistance(0.5,0.5) < 0.275){
    synth.set("trigrate", 0.5);
  } else if(tcur.getDistance(0.5,0.5) < 0.4){
    synth.set("trigrate", 0.25);
  } else if(tcur.getDistance(0.5,0.5) < 0.5){
    synth.set("trigrate", 0.125);
  } else {
    synth.set("trigrate", 0.0625);
  }
  synths.put(tcur.getSessionID(), synth);
  // set synth object initial parameters
  synth.set("midi", map(tcur.getAngleDegrees(0.5, 0.5),
      0.0, 360.0, 21, 108));
  synth.set("outbus", 20);
  synth.set("amp", 0.2);
  synth.addToHead(group);
  synths.put(tcur.getCursorID(), synth);
}

// called when a cursor is moved
void updateTuioCursor (TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  // set synth object midinote to polar co-ordinate from centre of scene
  synth.set("midi", map(tcur.getAngleDegrees(0.5, 0.5),
      0.0, 360.0, 21, 108));
  // set envelope triggering rate relative to
  // the cursor distance to centre of table
  if(tcur.getDistance(0.5,0.5) < 0.15){
    synth.set("trigrate", 1.0);
  } else if(tcur.getDistance(0.5,0.5) < 0.275){
    synth.set("trigrate", 0.5);
  } else if(tcur.getDistance(0.5,0.5) < 0.4){
    synth.set("trigrate", 0.25);
  } else if(tcur.getDistance(0.5,0.5) < 0.5){
    synth.set("trigrate", 0.125);
  } else {
    synth.set("trigrate", 0.0625);
  }
  // the Wii remote 'B' button plus roll and pitch assign
  // note sustain and delay time
  if(wiiController.buttonB){
    synth.set("dur", map(wiiController.roll, -85.0, 91.0, 0.0, 5.0));
    synth.set("delayTime", map(wiiController.pitch,
        92.0, -90.0, 0.0001, 0.5));
  }
  // the Wii remote 'Home' button plus roll assigns overall tempo
  if(wiiController.buttonHome){
    synth.set("tempo", map(wiiController.roll, -85.0, 91.0, 2.0, 0.25));
  }
  println("update cursor "+tcur.getCursorID()+
      " ("+tcur.getSessionID()+") "+tcur.getX()+" "+tcur.getY()+
      " "+tcur.getMotionSpeed()+" "+tcur.getMotionAccel());
}

// called when a cursor is removed from the scene
void removeTuioCursor(TuioCursor tcur) {
  Synth synth = (Synth)synths.get(tcur.getSessionID());
  synth.free();
  if (tuioClient.getTuioCursors().size() == 0) {
    group.freeAll();
  }
}

// called after each message bundle
// representing the end of an image frame
void refresh(TuioTime bundleTime) {
  redraw();
}

// called after the program is stopped
// frees all SC server-side objects
public void stop(){
  group.free();
  place.free();
  rvrb.free();
  synth.free();
  reson.free();
}
/**
 * Ben Norton
 * MSc Creative Systems.
 * Date: 07/09/2011.
 * This system implements a tangible analogue-modelling
 * synthesis instrument
 */
import TUIO.*;
import supercollider.*;
import oscP5.*;
import netP5.*;
HashMap synths = new HashMap();
Group group = new Group();
Synth place = new Synth("placeholder");
Synth synth = new Synth("analogLF");
Synth reson = new Synth("resonator");
TuioProcessing tuioClient;
WiiController wiiController;

// these are some helper variables which are used
// to create scalable graphical feedback
float cursor_size = 30;
float object_size = 60;
float table_size = 760;
float scale_factor = 1;
PFont font;

void setup(){
  size(1024, 768);
  loop();
  frameRate(30);
  hint(ENABLE_NATIVE_FONTS);
  font = createFont("Helvetica.vlw", 16);
  textFont(font);
  scale_factor = height/table_size;
  // we set initial parameters of synth objects
  // to assign buses and node and group execution order
  place.set("inbus", 20);
  place.addToHead();
  reson.set("inbus", 20);
  reson.addToTail();
  group.create();
  group.addToTail();
  // we create an instance of the WiiController
  wiiController = new WiiController();
  // we create an instance of the TuioProcessing client
  // since we add "this" class as an argument the
  // TuioProcessing class expects an implementation
  // of the TUIO callback methods (see below)
  tuioClient = new TuioProcessing(this);
}

// within the draw method we retrieve a Vector (List) of
// TuioObject and TuioCursor (polling) from the TuioProcessing
// client and then loop over both lists to draw the graphical feedback.
void draw(){
  background(0);
  textFont(font, 18*scale_factor);
  float obj_size = object_size*scale_factor;
  float cur_size = cursor_size*scale_factor;
  Vector tuioObjectList = tuioClient.getTuioObjects();
  for (int i=0; i<tuioObjectList.size(); i++) {
    TuioObject tobj = (TuioObject)tuioObjectList.elementAt(i);
    stroke(0);
    fill(0);
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());
    rect(-obj_size/2, -obj_size/2, obj_size, obj_size);
    popMatrix();
    fill(255);
    text(""+tobj.getSymbolID(), tobj.getScreenX(width),
         tobj.getScreenY(height));
  }
  Vector tuioCursorList = tuioClient.getTuioCursors();
  for (int i=0; i<tuioCursorList.size(); i++) {
    TuioCursor tcur = (TuioCursor)tuioCursorList.elementAt(i);
    Vector pointList = tcur.getPath();
    if (pointList.size()>0){
      stroke(255,0,0);
      TuioPoint start_point = (TuioPoint)pointList.firstElement();
      for (int j=0; j<pointList.size(); j++) {
        TuioPoint end_point = (TuioPoint)pointList.elementAt(j);
        line(start_point.getScreenX(width), start_point.getScreenY(height),
             end_point.getScreenX(width), end_point.getScreenY(height));
        start_point = end_point;
      }
      stroke(192,192,192);
      fill(192,192,192);
      ellipse(tcur.getScreenX(width), tcur.getScreenY(height),
              cur_size, cur_size);
      fill(0);
      text(""+tcur.getCursorID(), tcur.getScreenX(width)-5,
           tcur.getScreenY(height)+5);
    }
  }
  // the Wii remote direction buttons plus roll and pitch
  // adjust resonant mode frequency and amplitude parameters
  if(wiiController.buttonLeft){
    reson.set("freq1", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp1", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonUp){
    reson.set("freq2", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp2", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonRight){
    reson.set("freq3", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp3", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
  if(wiiController.buttonDown){
    reson.set("freq4", map(wiiController.roll,
        -85.0, 91.0, sq(0.0), sq(30.0)));
    reson.set("klAmp4", map(wiiController.pitch, 92.0, -90.0, 0.0, 0.5));
  }
}

// these callback methods are called whenever a TUIO event occurs
66
// called when an object is added to the scene void addTuioObject(TuioObject tobj) { } // called when an object is removed from the scene void removeTuioObject(TuioObject tobj) { } // called when an object is moved void updateTuioObject (TuioObject tobj) { } // called when a cursor is added to the scene void addTuioCursor(TuioCursor tcur) { // assign synth to cursor instance // when introduced to the scene int id = tcur.getCursorID()%2; Synth synth; if (id == 0) { synth = new Synth("analogLF"); } else { synth = new Synth("analogLF"); } // set initial envelope triggering rate relative to // the cursor distance to centre of table if(tcur.getDistance(0.5,0.5) < 0.15){ synth.set("trigrate", 1.0); } else if(tcur.getDistance(0.5,0.5) < 0.275){ synth.set("trigrate", 0.5); } else if(tcur.getDistance(0.5,0.5) < 0.4){ synth.set("trigrate", 0.25); } else if(tcur.getDistance(0.5,0.5) < 0.5){ synth.set("trigrate", 0.125); } else { synth.set("trigrate", 0.0625); } synths.put(tcur.getSessionID(),synth); // set synth object initial parameters synth.set("freq", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 108));
67
synth.set("outbus",20); synth.set("amp", 0.2); synth.addToHead(group); synths.put(tcur.getCursorID(),synth); } // called when a cursor is moved void updateTuioCursor (TuioCursor tcur) { Synth synth = (Synth)synths.get(tcur.getSessionID()); // set synth object midinote to polar co-ordinate from centre of scene synth.set("freq", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 108)); // set envelope triggering rate relative to // the cursor distance to centre of table if(tcur.getDistance(0.5,0.5) < 0.15){ synth.set("trigrate", 1.0); } else if(tcur.getDistance(0.5,0.5) < 0.275){ synth.set("trigrate", 0.5); } else if(tcur.getDistance(0.5,0.5) < 0.4){ synth.set("trigrate", 0.25); } else if(tcur.getDistance(0.5,0.5) < 0.5){ synth.set("trigrate", 0.125); } else { synth.set("trigrate", 0.0625); } // the Wii remote 'A' button plus roll and pitch assign // the upper and lower deviations in the mixed waveforms // to create analogue detuning effect if(wiiController.buttonA){ synth.set("randLo", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0)); synth.set("randHi", map(wiiController.pitch, 92.0, -90.0, 1.0, 2.0)); } // the Wii remote 'B' button plus roll and pitch assign // the low pass filter cutoff frequency and bandwidth ratio if(wiiController.buttonB){ synth.set("rez", map(wiiController.roll, -85.0, 91.0, 200.0, 12800.0)); synth.set("rq", map(wiiController.pitch, 92.0, -90.0, 2.0, 0.01 )); }
68
// the Wii remote 'Minus' and 'Plus' buttons plus roll and pitch // set the attack time, release time, sustain time and sustain level // of the synth's envelope if(wiiController.buttonMinus){ synth.set("aT", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0)); synth.set("rT", map(wiiController.pitch, 92.0, -90.0, 0.0, 1.0)); } if(wiiController.buttonPlus){ synth.set("sT", map(wiiController.roll, -85.0, 91.0, 0.0, 1.0)); synth.set("sL", map(wiiController.pitch, 92.0, -90.0, 0.1, 1.0)); } // the Wii remote 'Home' button sets the overall tempo if(wiiController.buttonHome){ synth.set("tempo", map(wiiController.roll, -85.0, 91.0, 2.0, 0.25)); } println("update cursor "+tcur.getCursorID()+ " ("+tcur.getSessionID()+ ") "+tcur.getX()+" "+tcur.getY()+ " "+tcur.getMotionSpeed()+ " "+tcur.getMotionAccel()); } // called when a cursor is removed from the scene void removeTuioCursor(TuioCursor tcur) { Synth synth = (Synth)synths.get(tcur.getSessionID()); synth.free(); if (tuioClient.getTuioCursors().size() == 0) { group.freeAll(); } } // called after each message bundle // representing the end of an image frame void refresh(TuioTime bundleTime) { redraw(); } // called after the program is stopped // frees all SC server-side objects public void stop(){ group.free(); place.free(); synth.free(); reson.free(); }
69
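Note: the listings in this appendix rely throughout on Processing's built-in map() function to re-scale sensor ranges linearly, for example mapping Wii remote roll in degrees onto resonant-mode frequency parameters. For readers working outside Processing, a minimal standalone equivalent is sketched below; the class name LinearMap and the example values are illustrative only, not part of the system.

```java
// Minimal re-implementation of Processing's map() for illustration.
// Processing provides this function itself; the class wrapper is ours.
public class LinearMap {

    // Linearly re-scales value from the range [start1, stop1] to the
    // range [start2, stop2]; like Processing's map(), it does not clamp
    // values that fall outside the input range.
    public static float map(float value, float start1, float stop1,
                            float start2, float stop2) {
        return start2 + (stop2 - start2) * ((value - start1) / (stop1 - start1));
    }

    public static void main(String[] args) {
        // a roll of -85 degrees maps to the bottom of the target range
        System.out.println(map(-85.0f, -85.0f, 91.0f, 0.0f, 900.0f)); // 0.0
        // a roll of 91 degrees maps to the top
        System.out.println(map(91.0f, -85.0f, 91.0f, 0.0f, 900.0f));  // 900.0
    }
}
```

Because the input range is written as (start, stop) rather than (min, max), the same call also inverts a mapping: the listings exploit this when they pass the pitch range as (92.0, -90.0).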
/**
 * Ben Norton
 * MSc Creative Systems
 * Date: 07/09/2011
 * This system implements a tangible analogue-modelling
 * synthesis arpeggiator
 */

import TUIO.*;
import supercollider.*;
import oscP5.*;
import netP5.*;
HashMap synths = new HashMap();
Group group = new Group();
Synth place = new Synth("placeholder");
Synth synth = new Synth("analogArp");
TuioProcessing tuioClient;
WiiController wiiController;

// these are some helper variables which are used
// to create scalable graphical feedback
float cursor_size = 30;
float object_size = 60;
float table_size = 760;
float scale_factor = 1;
PFont font;

void setup() {
  size(1024, 768);
  loop();
  frameRate(30);
  hint(ENABLE_NATIVE_FONTS);
  font = createFont("Helvetica.vlw", 16);
  textFont(font);
  scale_factor = height/table_size;

  // we set initial parameters of synth objects
  // to assign buses and node and group execution order
  place.set("inbus", 20);
  place.addToHead();
  synth.set("outbus", 20);
  synth.set("amp", 0.0);
  synth.create();
  synth.addToHead();
  group.create();
  group.addToTail();

  // we create an instance of the WiiController
  wiiController = new WiiController();

  // we create an instance of the TuioProcessing client
  // since we add "this" class as an argument the
  // TuioProcessing class expects an implementation
  // of the TUIO callback methods (see below)
  tuioClient = new TuioProcessing(this);
}

// within the draw method we retrieve a Vector (List) of
// TuioObject and TuioCursor (polling) from the TuioProcessing
// client and then loop over both lists to draw the graphical feedback
void draw() {
  background(0);
  textFont(font, 18*scale_factor);
  float obj_size = object_size*scale_factor;
  float cur_size = cursor_size*scale_factor;

  Vector tuioObjectList = tuioClient.getTuioObjects();
  for (int i = 0; i < tuioObjectList.size(); i++) {
    TuioObject tobj = (TuioObject)tuioObjectList.elementAt(i);
    stroke(0);
    fill(0);
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());
    rect(-obj_size/2, -obj_size/2, obj_size, obj_size);
    popMatrix();
    fill(255);
    text(""+tobj.getSymbolID(), tobj.getScreenX(width), tobj.getScreenY(height));
  }

  Vector tuioCursorList = tuioClient.getTuioCursors();
  for (int i = 0; i < tuioCursorList.size(); i++) {
    TuioCursor tcur = (TuioCursor)tuioCursorList.elementAt(i);
    Vector pointList = tcur.getPath();
    if (pointList.size() > 0) {
      stroke(255, 0, 0);
      TuioPoint start_point = (TuioPoint)pointList.firstElement();
      for (int j = 0; j < pointList.size(); j++) {
        TuioPoint end_point = (TuioPoint)pointList.elementAt(j);
        line(start_point.getScreenX(width), start_point.getScreenY(height),
             end_point.getScreenX(width), end_point.getScreenY(height));
        start_point = end_point;
      }
      stroke(192, 192, 192);
      fill(192, 192, 192);
      ellipse(tcur.getScreenX(width), tcur.getScreenY(height), cur_size, cur_size);
      fill(0);
      text(""+tcur.getCursorID(), tcur.getScreenX(width)-5, tcur.getScreenY(height)+5);
    }
  }

  // the Wii remote 'B' button sets amplitude
  if (wiiController.buttonB) {
    synth.set("amp", 0.2);
  }
  else {
    synth.set("amp", 0.0);
  }

  // the Wii remote 'A' button plus pitch sets portamento
  if (wiiController.buttonA) {
    synth.set("port", map(wiiController.pitch, 92.0, -90.0, 0.0, 4.0));
  }

  // the Wii remote direction buttons plus pitch set
  // the low pass filter cutoff frequency
  // (the map() arguments were lost at a page break in the printed
  // listing; the ranges below are a reconstruction following the
  // cutoff mapping used in the companion sketch)
  if (wiiController.buttonLeft) {
    synth.set("filt1", map(wiiController.pitch, 92.0, -90.0, 200.0, 12800.0));
  }
  if (wiiController.buttonUp) {
    synth.set("filt2", map(wiiController.pitch, 92.0, -90.0, 200.0, 12800.0));
  }
  if (wiiController.buttonRight) {
    synth.set("filt3", map(wiiController.pitch, 92.0, -90.0, 200.0, 12800.0));
  }
  if (wiiController.buttonDown) {
    synth.set("filt4", map(wiiController.pitch, 92.0, -90.0, 200.0, 12800.0));
  }

  // the Wii remote 'Home' button plus pitch sets overall tempo
  if (wiiController.buttonHome) {
    synth.set("temp", map(wiiController.pitch, -85.0, 91.0, 0.1, 4.0));
  }
}

// these callback methods are called whenever a TUIO event occurs

// called when an object is added to the scene
void addTuioObject(TuioObject tobj) {
}

// called when an object is removed from the scene
void removeTuioObject(TuioObject tobj) {
}

// called when an object is moved
void updateTuioObject(TuioObject tobj) {
}

// called when a cursor is added to the scene
void addTuioCursor(TuioCursor tcur) {
  // set frequency and duration parameters with cursor instances
  int id = tcur.getCursorID()%4;
  if (id == 0) {
    synth.set("freq1", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per1", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.01, 2));
  }
  else if (id == 1) {
    synth.set("freq2", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per2", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.01, 2));
  }
  else if (id == 2) {
    synth.set("freq3", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per3", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.01, 2));
  }
  else {
    synth.set("freq4", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per4", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.01, 2));
  }
}

// called when a cursor is moved
void updateTuioCursor(TuioCursor tcur) {
  // set frequency and duration parameters with cursor instances
  int id = tcur.getCursorID()%4;
  if (id == 0) {
    synth.set("freq1", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per1", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.0, 2));
  }
  else if (id == 1) {
    synth.set("freq2", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per2", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.0, 2));
  }
  else if (id == 2) {
    synth.set("freq3", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per3", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.0, 2));
  }
  else {
    synth.set("freq4", map(tcur.getAngleDegrees(0.5, 0.5), 0.0, 360.0, 21, 81));
    synth.set("per4", map(tcur.getDistance(0.5, 0.5), 0.0, 0.5, 0.0, 2));
  }
}

// called when a cursor is removed from the scene
void removeTuioCursor(TuioCursor tcur) {
}

// called after each message bundle
// representing the end of an image frame
void refresh(TuioTime bundleTime) {
  redraw();
}

// called after the program is stopped
// frees all SC server-side objects
public void stop() {
  group.free();
  place.free();
  synth.free();
}
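Note: both instruments position each voice in polar co-ordinates around the table centre: a cursor's angle (getAngleDegrees) selects pitch, and its distance from the centre (getDistance) selects rate, quantised to a ladder of trigger rates in the first listing and mapped continuously in the second. The sketch below restates that mapping as two pure functions; the class and method names are illustrative only, not part of the system.

```java
// Illustrative restatement of the polar cursor mapping (not part of
// the listings; class and method names are ours).
public class PolarMapping {

    // Cursor angle around the table centre (0-360 degrees) mapped
    // linearly onto a MIDI-style note range, as in
    // synth.set("freq1", map(angle, 0.0, 360.0, 21, 81)).
    public static float angleToNote(float degrees) {
        return 21.0f + (81.0f - 21.0f) * (degrees / 360.0f);
    }

    // Distance from the table centre (0.0 at the centre, 0.5 at the
    // edge midpoints) quantised to an envelope trigger rate, mirroring
    // the if/else ladder in the first listing's addTuioCursor().
    public static float distanceToTrigRate(float d) {
        if (d < 0.15f)  return 1.0f;
        if (d < 0.275f) return 0.5f;
        if (d < 0.4f)   return 0.25f;
        if (d < 0.5f)   return 0.125f;
        return 0.0625f;
    }

    public static void main(String[] args) {
        System.out.println(angleToNote(0.0f));         // 21.0: low end of range
        System.out.println(angleToNote(360.0f));       // 81.0: high end
        System.out.println(distanceToTrigRate(0.1f));  // 1.0: fast near the centre
        System.out.println(distanceToTrigRate(0.45f)); // 0.125: slower at the edge
    }
}
```

Quantising distance in the first instrument keeps concurrent voices rhythmically locked to simple ratios, whereas the arpeggiator's continuous mapping trades that lock for finer control.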
/**
 * This class enables OSC communication using the Wii Remote
 */

import oscP5.*;
import netP5.*;

/*
 darwiinremoteOSC address space

 /wii/connected , i
 /wii/mousemode , i
 /wii/button/a , i
 /wii/button/b , i
 /wii/button/up , i
 /wii/button/down , i
 /wii/button/left , i
 /wii/button/right , i
 /wii/button/minus , i
 /wii/button/plus , i
 /wii/button/home , i
 /wii/button/one , i
 /wii/button/two , i
 /wii/acc , fff
 /wii/orientation , ff
 /wii/irdata , ffffffffffff
 /wii/batterylevel , f
 /nunchuk/joystick , ff
 /nunchuk/button/z , i
 /nunchuk/button/c , i
 /nunchuk/acc , fff
 /nunchuk/orientation , ff
*/
class WiiController {

  OscP5 osc;
  boolean buttonA, buttonB, buttonUp, buttonDown, buttonLeft, buttonRight;
  boolean buttonOne, buttonTwo, buttonMinus, buttonPlus, buttonHome, buttonC, buttonZ;
  boolean isConnected;
  float roll, pitch;
  float nRoll, nPitch;
  Acceleration acc;
  float x, y;
  float nX, nY;
  float pNx, pNy;
  Acceleration nAcc;
  boolean isNunchuck = false;
  float batterylevel;
  boolean DEBUG = true;
  IRdata[] ir;
  String remoteAddress;
  int remotePort;

  WiiController() {
    // by default darwiinremoteOSC sends OSC messages to port 5600
    osc = new OscP5(this, 5600);
    // the address and the port of darwiinremoteOSC
    remoteAddress = "127.0.0.1";
    remotePort = 5601;

    acc = new Acceleration();
    nAcc = new Acceleration();
    ir = new IRdata[4];
    // populate the array so the ir() callback can write into it
    for (int i = 0; i < ir.length; i++) {
      ir[i] = new IRdata();
    }

    osc.plug(this, "connected", "/wii/connected");       // i
    osc.plug(this, "mousemode", "/wii/mousemode");       // i
    osc.plug(this, "buttonA", "/wii/button/a");          // i
    osc.plug(this, "buttonB", "/wii/button/b");          // i
    osc.plug(this, "buttonUp", "/wii/button/up");        // i
    osc.plug(this, "buttonDown", "/wii/button/down");    // i
    osc.plug(this, "buttonLeft", "/wii/button/left");    // i
    osc.plug(this, "buttonRight", "/wii/button/right");  // i
    osc.plug(this, "buttonMinus", "/wii/button/minus");  // i
    osc.plug(this, "buttonPlus", "/wii/button/plus");    // i
    osc.plug(this, "buttonHome", "/wii/button/home");    // i
    osc.plug(this, "buttonOne", "/wii/button/one");      // i
    osc.plug(this, "buttonTwo", "/wii/button/two");      // i
    osc.plug(this, "acceleration", "/wii/acc");
    osc.plug(this, "orientation", "/wii/orientation");
    osc.plug(this, "ir", "/wii/irdata");
    osc.plug(this, "batterylevel", "/wii/batterylevel");
    osc.plug(this, "joystick", "/nunchuk/joystick");
    osc.plug(this, "buttonZ", "/nunchuk/button/z");
    osc.plug(this, "buttonC", "/nunchuk/button/c");
    osc.plug(this, "nunchukAcceleration", "/nunchuk/acc");
    osc.plug(this, "nunchukOrientation", "/nunchuk/orientation");
  }

  void connected(int theValue) {
    isConnected = (theValue == 0) ? false : true;
  }

  void oscEvent(OscMessage theEvent) {
    //println(theEvent.addrPattern()+" / "+theEvent.typetag());
  }

  void acceleration(float theX, float theY, float theZ) {
    acc.x = theX;
    acc.y = theY;
    acc.z = theZ;
    //if(DEBUG) {
    //  println("acceleration x:"+acc.x+" y:"+acc.y+" z:"+acc.z);
    //}
  }

  void orientation(float theRoll, float thePitch) {
    roll += (theRoll - roll)*0.04;
    pitch += (thePitch - pitch)*0.04;
    //if(DEBUG) {
    //  println("orientation roll:"+roll+" pitch:"+pitch);
    //}
  }

  // darwiinremoteOSC sends 12 floats containing the x, y and size values
  // for the 4 IR spots the wiimote can sense. values are between 0 and 1
  // for x and y; values for size are 0 and bigger. if the size is 15 or 0,
  // the IR point is not recognized by the wiimote.
  void ir(
    float f10, float f11, float f12,
    float f20, float f21, float f22,
    float f30, float f31, float f32,
    float f40, float f41, float f42
  ) {
    ir[0].x = f10;
    ir[0].y = f11;
    ir[0].s = f12;
    ir[1].x = f20;
    ir[1].y = f21;
    ir[1].s = f22;
    ir[2].x = f30;
    ir[2].y = f31;
    ir[2].s = f32;
    ir[3].x = f40;
    ir[3].y = f41;
    ir[3].s = f42;
  }

  void joystick(float theX, float theY) {
    // the origin xy coordinates for the joystick are theX = 125, and theY = 129
    nX = theX;
    nY = theY;
    isNunchuck = true;
  }

  // (the body of this method fell at a page break in the printed listing;
  // it is reconstructed here to mirror acceleration() above)
  void nunchukAcceleration(float theX, float theY, float theZ) {
    nAcc.x = theX;
    nAcc.y = theY;
    nAcc.z = theZ;
  }

  void nunchukOrientation(float theRoll, float thePitch) {
    nRoll += (theRoll - nRoll)*0.04;
    nPitch += (thePitch - nPitch)*0.04;
    //if(DEBUG) {
    //  println("NUNCHUCK orientation roll:"+roll+" pitch:"+pitch);
    //}
  }

  void buttonA(int theValue) {
    buttonA = (theValue == 1) ? true : false;
  }

  void buttonB(int theValue) {
    buttonB = (theValue == 1) ? true : false;
  }

  void buttonOne(int theValue) {
    buttonOne = (theValue == 1) ? true : false;
  }

  void buttonTwo(int theValue) {
    buttonTwo = (theValue == 1) ? true : false;
  }

  void buttonMinus(int theValue) {
    buttonMinus = (theValue == 1) ? true : false;
  }

  void buttonPlus(int theValue) {
    buttonPlus = (theValue == 1) ? true : false;
  }

  void buttonUp(int theValue) {
    buttonUp = (theValue == 1) ? true : false;
  }

  void buttonDown(int theValue) {
    buttonDown = (theValue == 1) ? true : false;
  }

  void buttonLeft(int theValue) {
    buttonLeft = (theValue == 1) ? true : false;
  }

  void buttonRight(int theValue) {
    buttonRight = (theValue == 1) ? true : false;
  }

  void buttonC(int theValue) {
    buttonC = (theValue == 1) ? true : false;
  }

  void buttonZ(int theValue) {
    buttonZ = (theValue == 1) ? true : false;
  }

  void buttonHome(int theValue) {
    buttonHome = (theValue == 1) ? true : false;
  }

  void batterylevel(float theValue) {
    //println("BatteryLevel: "+theValue);
    batterylevel = theValue;
  }

  class Acceleration {
    float x, y, z;
    float speedX = 0, speedY = 0, speedZ = 0;
  }

  class IRdata {
    float x, y, s;
  }

  void requestBatterylevel() {
    osc.send("/wii/batterylevel", new Object[] {}, remoteAddress, remotePort);
  }

  void forcefeedback(boolean theFlag) {
    int n = 0;
    if (theFlag) {
      n = 1;
    }
    osc.send("/wii/forcefeedback", new Object[] { new Integer(n) },
             remoteAddress, remotePort);
  }

  void led(int[] n) {
    osc.send("/wii/led", new Object[] {
      new Integer(n[0]), new Integer(n[1]),
      new Integer(n[2]), new Integer(n[3]) }, remoteAddress, remotePort);
  }
}
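Note: the orientation() and nunchukOrientation() callbacks above smooth the raw roll and pitch readings with a one-pole low-pass filter (an exponential moving average): each new reading moves the stored value a fixed fraction (0.04) of the way toward the raw sensor value, damping hand jitter at the cost of a little lag. The standalone sketch below isolates that update rule; the class name OrientationSmoother is illustrative only, not part of the system.

```java
// Illustrative sketch of the smoothing rule used in WiiController.
// The class name is ours; the update rule is the one in the listing.
public class OrientationSmoother {

    private float state = 0.0f;
    private final float alpha; // 0.04 in the listing above

    public OrientationSmoother(float alpha) {
        this.alpha = alpha;
    }

    // Same update as: roll += (theRoll - roll) * 0.04;
    public float update(float raw) {
        state += (raw - state) * alpha;
        return state;
    }

    public static void main(String[] args) {
        OrientationSmoother s = new OrientationSmoother(0.04f);
        // a step input converges gradually toward the target value
        float v = 0.0f;
        for (int i = 0; i < 100; i++) {
            v = s.update(90.0f);
        }
        System.out.println(v > 85.0f && v < 90.0f); // true: near, not at, 90
    }
}
```

A smaller alpha gives steadier but laggier control; a larger one tracks gesture more tightly but passes more sensor noise through to the synthesis parameters.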
Bibliography
Baskinger, M. and Gross, M. (2010) 'Tangible Interaction = Form + Computing', Interactions, 17(1), pp. 6-11.

Boden, M.A. (2004) The Creative Mind: Myths and Mechanisms. 2nd edn. London & New York: George Weidenfield and Nicolson Ltd.

Boden, M.A. (2006) Mind as Machine: A History of Cognitive Science. Oxford: Oxford University Press.

Bødker, S., Grønbæk, K. and Kyng, M. (1993) 'Cooperative Design: Techniques and Experiences from the Scandinavian Scene'. In D. Schuler and A. Namioka (eds.) Participatory Design: Principles and Practices. NJ, USA: Erlbaum.

Bovermann, T. (2009) 'Tangible Auditory Interfaces: Combining Auditory Displays and Tangible Interfaces'. PhD Thesis. Bielefeld University.

Brentano, F. (1995) Psychology from an Empirical Standpoint. 2nd edn. London: Routledge and Kegan Paul.

Chin, J. P., Diehl, V. A. and Norman, K. L. (1988) 'Development of an Instrument Measuring User Satisfaction of the Human-Computer Interface'. Proceedings of SIGCHI '88, New York: ACM/SIGCHI, pp. 213-218.

Clark, A. (1998) Being There. Cambridge, MA: MIT Press.

Clark, A. and Chalmers, D. J. (2000) 'The Extended Mind'. In Chalmers, D.J. (ed.) Philosophy of Mind: Classical and Contemporary Readings. Oxford: Oxford University Press.

Csíkszentmihályi, M. (1992) Flow: The Psychology of Optimal Experience. New York, NY: Harper & Row.

de Campo, A., Rohrhuber, J., Bovermann, T. and Frauenberger, C. (2011) 'Sonification and Auditory Displays in SuperCollider'. In Wilson, S., Cottle, D. and Collins, N. (eds.) The SuperCollider Book. Cambridge, MA: MIT Press.

Dourish, P. (2001) Where the Action Is. Cambridge, MA: MIT Press.

Engelbart, D. C. (1962) Augmenting the Human Intellect: A Conceptual Framework. [Online] Available from: www.liquidinformation.org/ohs/62_paper_full.pdf (accessed 23/06/11).

Fauconnier, G. and Turner, M. (2003) The Way We Think: Conceptual Blending and the Mind's Hidden Complexity. New York, NY: Basic Books.

Fischer, G. (2011) 'Understanding, Fostering, and Supporting Cultures of Participation', Interactions, 18(3), pp. 42-45.
Fitzmaurice, G., Ishii, H. and Buxton, W. (1995) 'Bricks: Laying the Foundations for Graspable User Interfaces'. Proceedings of CHI '95, pp. 442-449.

Gallagher, S. (2005) How the Body Shapes the Mind. Oxford: Oxford University Press.

Gallagher, S. and Cole, J. (1995) 'Body Image and Body Schema in a Deafferented Subject'. Journal of Mind and Behavior, 16(4), pp. 369-390.

Garfinkel, H. (1984) Studies in Ethnomethodology. 2nd edn. Cambridge, UK: Blackwell Publishing Ltd.

Geiger, G., Alber, N., Jordà, S. and Alonso, M. (2010) The Reactable: A Collaborative Musical Instrument for Playing and Understanding Music. [Online] Available from: www.reactable.com/files/Her&Mus_Reactable.pdf (accessed 06/06/2011).

Gibson, J. J. (1986) The Ecological Approach to Visual Perception. 2nd edn. Hillsdale, NJ: Lawrence Erlbaum Associates Ltd.

Han, J. Y. (2005) 'Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection'. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology.

Heidegger, M. (1962) Being and Time. Oxford: Blackwell Publishers.

Hermann, T. and Ritter, H. (1999) 'Listen to Your Data: Model-Based Sonification for Exploratory Data Analysis'. In Lasker, G. E. (ed.) Advances in Intelligent Computing and Multimedia Systems. Baden-Baden, Germany, pp. 189-194.

Hornecker, E. (2006) 'A Design Theme for Tangible Interaction: Embodied Facilitation'. Proceedings of the 9th European Conference on Computer Supported Cooperative Work (ECSCW'05), Kluwer/Springer, pp. 23-43.

Hornecker, E. (2006) 'An Encompassing View on Tangible Interaction: A Framework'. Position Paper for CHI 2006 Workshop: What is the Next Generation of Human-Computer Interaction? Workshop proceedings TR-2006-3, April 2006, Tufts University.

Huang, Y., Gross, M. D., Yi-Luen Do, E. and Eisenberg, M. (2009) 'Easigami: A Reconfigurable Folded-Sheet TUI'. Proceedings of the 3rd International Conference on Tangible and Embedded Interaction (TEI '09), Feb 16-18, Regent, United Kingdom, pp. 107-122.

Husserl, E. (2001) Logical Investigations. 2nd edn. London: Routledge.

Hutchins, E. (1995) Cognition in the Wild. Cambridge, MA: MIT Press.

Ihde, D. (1990) Technology and the Lifeworld: From Garden to Earth. Bloomington, IN: Indiana University Press.

Ishii, H. and Ullmer, B. (1997) 'Tangible Bits: Towards Seamless Interfaces Between People, Bits, and Atoms'. Proceedings of the ACM Conference on Human Factors in Computing Systems, CHI '97, pp. 234-241.

Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E., Yeung, L. and Kanji, Z. (2002) Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation. [Online] Available from: http://tangible.media.mit.edu/papers/Urban_Planning_Workbench_ISMAR02.php (accessed 06/06/2011).

Ishii, H., Ratti, C., Piper, B., Wang, Y., Biderman, A. and Ben-Joseph, E. (2004) 'Bringing Clay and Sand into Digital Design: Continuous Tangible User Interfaces', BT Technology Journal, 22(4), pp. 287-299.

Jacob, R. J. K., Ishii, H., Pangaro, G. and Patten, J. (2002) A Tangible Interface for Organizing Information Using a Grid. [Online] Available from: http://www.dourish.com/classes/ics203bs04/10-Jacob.pdf (accessed 11/07/11).

Jordà, S. (2003) 'Sonigraphical Instruments: From FMOL to the reacTable*', Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada, pp. 70-76.

Jordà, S. (2005) Multi-user Instruments: Models, Examples and Promises. [Online] Available from: www.mtg.upf.edu/files/publications/9b7c65-NIME2005-Jorda.pdf (accessed 07/06/11).

Kaltenbrunner, M., Bovermann, T., Bencina, R. and Costanza, E. (2005) TUIO: A Protocol for Table-Top Tangible User Interfaces. [Online] Available from: www.rossbencina.com/static/writings/tuio_gw2005.pdf (accessed 06/06/2011).

Karplus, K. and Strong, A. (1983) 'Digital Synthesis of Plucked-String and Drum Timbres'. Computer Music Journal, 7(2) (Summer), pp. 43-55.

Kiefer, C., Collins, N. and Fitzpatrick, G. (2008) 'HCI Methodology For Evaluating Musical Controllers: A Case Study'. Proceedings of the 8th International Conference on New Interfaces for Musical Expression (NIME08), June 4-8, Genova, Italy, pp. 87-90.

Kirsh, D. and Maglio, P. (1994) 'On Distinguishing Epistemic from Pragmatic Action', Cognitive Science, 18(4), pp. 513-549.

Koestler, A. (1976) The Act of Creation. 2nd edn. London: Hutchinson & Co.

Krakauer, J. W. and Shadmehr, R. (2006) 'Consolidation of Motor Memory', TRENDS in Neurosciences, 29(1), pp. 58-64.

Lakoff, G. and Johnson, M. (1999) Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought. New York, NY: Basic Books.

Larson, R. and Csikszentmihalyi, M. (1983) 'The Experience Sampling Method', New Directions for Methodology of Social and Behavioral Science, 15, pp. 41-56.

LeClerc, V., Parkes, A. and Ishii, H. (2007) 'Senspectra: A Computationally Augmented Physical Modeling Toolkit for Sensing and Visualization of Structural Strain'. Proceedings of CHI 2007, April 28-May 3, San Jose, CA, USA, pp. 801-804.

Lewis, G. (2000) 'Too Many Notes: Complexity and Culture in Voyager'. Leonardo Music Journal, 10, pp. 33-39.

Magnusson, T. (2009) 'Epistemic Tools: The Phenomenology of Digital Musical Instruments'. PhD Thesis. University of Sussex.

Magnusson, T. and Hurtado, E. (2007) 'The Acoustic, the Digital and the Body: A Survey on Musical Instruments'. Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07), New York, NY, USA, pp. 94-99.

Merleau-Ponty, M. (2002) Phenomenology of Perception. 2nd edn. London: Routledge Classics.

Miranda, E.R. and Wanderley, M.M. (2006) New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. Middleton, WI: A-R Editions Inc.

Moggridge, B. (2007) Designing Interactions. Cambridge, MA: MIT Press.

Muller, M.J. (2003) 'Participatory Design: The Third Space in HCI'. In J. Jacko and A. Sears (eds.) The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Newton-Dunn, H., Nakano, H. and Gibson, J. (2003) 'Block Jam: A Tangible Interface for Interactive Music'. Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME'03), Montreal, Canada, pp. 170-177.

Nielsen, J. and Molich, R. (1990) 'Heuristic Evaluation of User Interfaces'. CHI '90: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Empowering People, April 1-5, Seattle, Washington, USA, pp. 249-256.

Norman, D.A. (2002) The Design of Everyday Things. 2nd edn. New York, NY: Basic Books.

O'Hara, K., Kjeldskov, J. and Paay, J. (2011) 'Blended Interaction Spaces for Distributed Team Collaboration', ACM Transactions on Computer-Human Interaction, 18(1), Article 3, 28 pages.

Polanyi, M. (1966/2009) The Tacit Dimension. London: The University of Chicago Press Ltd.

Roads, C. (1996) The Computer Music Tutorial. Cambridge, MA: MIT Press.

Roads, C. (2001) Microsound. Cambridge, MA: MIT Press.

Sawyer, R. K. (2003) Group Creativity: Music, Theater, Collaboration. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Schank, R.C. and Cleary, C. (1993) 'Making Machines Creative'. In Smith, S., Ward, T.B. and Finke, R.A. (eds.) The Creative Cognition Approach. Cambridge, MA: MIT Press.

Shaer, O. and Hornecker, E. (2010) 'Tangible User Interfaces: Past, Present, and Future Directions', Foundations and Trends in Human-Computer Interaction, 3(1-2), pp. 1-137.

Shaer, O. and Jacob, R.J.K. (2009) 'A Specification Paradigm for the Design and Implementation of Tangible User Interfaces', ACM Transactions on Computer-Human Interaction, 16(4), Article 20.

Sharp, H., Rogers, Y. and Preece, J. (2007) Interaction Design: Beyond Human-Computer Interaction. Chichester: John Wiley & Sons Ltd.

Suchman, L. (1987) Plans and Situated Actions. New York, NY: Cambridge University Press.

Ullmer, B., Ishii, H. and Glas, D. (1998) mediaBlocks: Physical Containers, Transports, and Controls for Online Media. [Online] Available from: http://tangviz.cct.lsu.edu/papers/ullmersiggraph98-mediablocks.pdf (accessed 12/07/2011).

Varela, F.J., Thompson, E. and Rosch, E. (1991) The Embodied Mind. Cambridge, MA: MIT Press.

Weiser, M. (1991) 'The Computer for the 21st Century', Scientific American, 265(3), pp. 94-102.

Weller, M. P., Yi-Luen Do, E. and Gross, M. D. (2008) 'Posey: Instrumenting a Poseable Hub and Strut Construction Toy'. Proceedings of the Second International Conference on Tangible and Embedded Interaction (TEI'08), Feb 18-20, Bonn, Germany, pp. 39-46.

Wellner, P. (1993) 'Interacting with Paper on the DigitalDesk', Communications of the ACM, 36(7), pp. 87-96.

Wittgenstein, L. (1953) Philosophical Investigations. Oxford: Blackwell Publishers.

Zhang, J. and Norman, D. A. (1994) 'Representations in Distributed Cognitive Tasks', Cognitive Science, 18(1), pp. 87-122.