This project strives toward an ontological reframing of the relationship between our bodies, the environment and technologically mediated experience through computational sensing, processing and actuation. Weather Inflections explores issues embedded within the philosophical landscape of phenomenology, specifically how we as embodied beings experience changing weather patterns and our environment. In facilitating a temporary rupture in an individual’s perceptual framework, the Weather Inflections installation seeks to expand its audience’s vocabulary of awareness and challenge established paradigms of sensory cognizance. By ‘crossing the perceptual wires’ and initiating a synaesthetic experience, an individual may observe spatially and temporally mapped digital data within their physical self, manifested as a tactile and aural experience. This interactive installation further seeks to critically question how our relationship to computing technology is effected and affected through mediation with embodied interaction modalities.
The aesthetic approach of this project is inspired by the neurological phenomenon of synaesthesia. When exposed to some stimulation of one sensory or cognitive pathway, a person who experiences synaesthesia may have an automatic and involuntary experience in a second pathway (Cytowic, Eagleman, & Nabokov, 2009, p. 309). For example, looking at a picture may evoke a particular smell or listening to a passage of music may conjure an image of wavering colours. This work does not attempt to engage with any theories or arguments surrounding synaesthesia, rather it merely draws upon synaesthesia as a source of aesthetic inspiration for the artwork. The focus of this project is to explore notions of embodied interaction through the lens of phenomenology by expressing our interactional relationship with computer data and its mediation of the experience of the climate / environment.
Climate change is a fiercely debated topic both in mainstream media commentary and in wider global and national political discourses. With governments and industry each pouring significant investment into research that furthers their own political agendas, many competing positions have emerged as to what can only be construed as ‘the interpreted reality’ of climate change, or the lack thereof. As embodied beings, our phenomenological reality is that while we may be able to acutely sense climate changes in the present moment, it is much more difficult to recall and make accurate comparisons between sensory perceptions across different spatial-temporal mappings. Weather Inflections aims to help bridge this perceptual gap by allowing users to have an immediate tactile experience of changes in climate data through the process of sonification. In addition to providing a novel, visceral representation of the data, this approach may also help an individual literally ‘internalize’ the experience of climate change. Tammy Freiler, a researcher in the field of Adult Learning and Somatics, suggests that for many in the Western world, describing and interpreting the experience of embodiment remains awkward and challenging. Our society’s preoccupation with physical appearance, body consciousness and, to some extent, disembodied virtual interactions goes some way toward accounting for why many people are becoming increasingly disconnected from and inattentive to their bodies (2008, p. 39). Indeed, Freiler goes so far as to suggest that one day our very survival may depend on embodiment in our personal spaces and sensory awareness of our surrounding environment (2008, p. 45).
To investigate the concepts outlined above, we propose to build a multi-channel, interactive, tactile transduction installation entitled Weather Inflections. This work will feature a somatic sonification of climatic and environmental data mapped across spatial-temporal parameters. By translating changes and patterns in the data set into tactile sonifications, Weather Inflections aims to provide a physically tangible inference of co-locatedness: through spatial mapping within the mediation of our cognitive experience of the world, and through more direct interactions with our environment as physically embodied agents.
Sonification is described as “the use of nonspeech audio to convey information” (Thomas Hermann, 2008). Adderley & Young further elucidate that sonification aims to “render complex and multidimensional data susceptible to intuitive or more methodical appraisal and analysis, potentially via an interactive interface” (2009, p. 5). Due to the spatial and temporal characteristics of the Weather Tunnel’s sensor input data, the technique of sonification offers some potential advantages over more traditional data visualisation techniques, as the human auditory system is capable of attending to both subtle variations in data and overall global attributes, even where multiple simultaneous streams of information are presented (Adderley & Young, 2009, p. 5).
Furthermore, sonification as a novel mode of data representation offers a range of characteristics that may be positively leveraged in the context of this work:
- Sound represents frequency responses in an instant (as timbral characteristics)
- Sound represents changes over time, naturally
- Sound allows microstructure to be perceived
- Sound rapidly portrays large amounts of data
- Sound alerts listeners to events outside their current visual focus
- Sound holistically brings together many channels of information (Thomas Hermann, n.d.)
The sonification algorithm for Weather Inflections will incorporate the auditory display techniques of Parameter Mapping Sonification and Model Based Sonification.
y = f(x) = σ(Ax)
In Parameter Mapping Sonification (PMS), data streams are transformed into acoustic streams using the above mapping function, where the acoustic attribute vector y is calculated by applying a linear transformation A followed by a nonlinear distortion function σ. The components of y are sound synthesis parameters such as frequency, amplitude or modulation indices (T. Hermann, Höner, & Ritter, 2006, p. 314).
In contrast, Model Based Sonification (MBS) utilises a dynamic model that acts as a mediator between the input data and the output sound. The input data determines the setup of a dynamic system whose temporal evolution is the only process that generates the sound (T. Hermann et al., 2006, p. 314). Each technique offers its own advantage: in Parameter Mapping Sonification the output sound has a direct mathematical relationship with the input data, whereas Model Based Sonification may produce acoustic relationships that are more intuitively understood by the listener.
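The PMS mapping y = σ(Ax) can be sketched in a few lines. The matrix values, the choice of tanh as the distortion function σ, and the output ranges below are illustrative assumptions for this sketch, not the installation’s actual parameters.

```python
import numpy as np

def parameter_mapping_sonification(x, A, sigma=np.tanh):
    """Map a data vector x to synthesis parameters y = sigma(A @ x)."""
    return sigma(A @ x)

# Example: map 3 weather variables (e.g. temperature, humidity, wind
# speed, each normalised to [0, 1]) to 2 synthesis parameters.
x = np.array([0.6, 0.3, 0.9])
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.2, 1.5]])
y = parameter_mapping_sonification(x, A)

# Scale the distorted outputs (tanh keeps them in (-1, 1)) onto audible
# ranges -- a hypothetical frequency band in Hz and an amplitude in [0, 1].
freq = 220.0 + 660.0 * (y[0] + 1.0) / 2.0
amp = (y[1] + 1.0) / 2.0
```

Because the transformation is linear up to the final distortion, each synthesis parameter remains a traceable combination of the input data streams, which is the property that distinguishes PMS from the model-mediated MBS approach.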
The sonification algorithm of Weather Inflections will retrieve historical weather data for Perth, Western Australia dating back to the early 1900s, as recorded by the Australian Bureau of Meteorology, and compare it with the current sensor data recorded at Curtin University. The variance between the two data sets will then be calculated, processed and translated into electrical impulses output by tactile transducers.
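The historical-versus-current comparison step might look something like the following sketch, where the function name, the z-score normalisation and the sample figures are assumptions for illustration rather than the installation’s actual processing chain.

```python
import numpy as np

def variance_signal(historical, current):
    """
    Compare a current sensor reading against a historical baseline and
    return a normalised deviation suitable for driving a tactile
    transducer as an audio-rate gain.
    """
    baseline = np.mean(historical)
    spread = np.std(historical)
    if spread == 0:
        return 0.0
    # z-score of the current reading relative to the historical record
    z = (current - baseline) / spread
    # squash into (-1, 1) so the value can be used directly as a gain
    return float(np.tanh(z))

# Hypothetical data: historical mean annual temperatures (degrees C)
# versus a warmer present-day reading.
historical_temps = np.array([17.8, 18.0, 17.9, 18.1, 18.3, 18.2, 18.5])
signal = variance_signal(historical_temps, current=19.2)
```

A positive signal here corresponds to a current reading above the historical baseline; in the installation this deviation would modulate the electrical impulses sent to the tactile transducers.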
The goal of Weather Inflections is not simply to browse the Weather Tunnel data set, but rather to allow users to navigate and tangibly connect with the data in a tactile manner, thereby providing an arguably more phenomenological, intimate experience of the information and the variance patterns contained within it.
For a brief description of the Weather Inflections installation experience, please refer to the attached “WeatherInflections-Experience.pdf” document.
Installation Technical Requirements
- 6 x i-mu resonance speakers placed into purpose-built audio-dampened enclosure / stand
- 1 x LCD Touchscreen or 1 x iPad built into the above stand (For audience interaction)
- 1 x Mac Mini (Running custom Sonification Software)
- 3 x sets of over-the-head style ear protectors (For audience interaction)
- 1 x MOTU Ultralite mk3: USB/Firewire Multichannel Audio Interface
- 6 x 1/4 inch to 1/8 inch audio cables, 5 metres long: Mono to Stereo connectors
- 1 x Australian Powerboard with at least 10 ports available
- 1 x Power Plug converter to change above Powerboard to the native plug shape of installation country. (Not required for China)
- 1 x USB 2.0 Cable, 2 metres long: Standard Type A to Type B connectors or
- 1 x Firewire 800 to Firewire 400 converter, plus
- 1 x Firewire 400 cable, 2 metres long: Standard 6 pin to 6 pin connectors
Additional equipment required during setup:
- Wired connection (RJ-45) to the internet for Pachube data input
- 1 additional monitor, keyboard & mouse for initialization of the system
- 1 x HDMI to DVI or HDMI to VGA cable to connect the above monitor
Duplicate backup of above system.
- Adderley, W. P., & Young, M. (2009). Ground-breaking: Scientific and Sonic Perceptions of Environmental Change in the African Sahel. Leonardo, 42(5), 404-411. doi:10.1162/leon.2009.42.5.404
- Cytowic, R. E., Eagleman, D. M., & Nabokov, D. (2009). Wednesday is Indigo Blue: Discovering the Brain of Synesthesia. The MIT Press.
- Freiler, T. J. (2008). Learning through the body. New Directions for Adult and Continuing Education, 2008(119), 37-47. doi:10.1002/ace.304
- Hermann, T., Höner, O., & Ritter, H. (2006). AcouMotion–An Interactive Sonification System for Acoustic Motion Control. Gesture in Human-Computer Interaction and Simulation, 312-323.
- Hermann, Thomas. (2008). Taxonomy and Definitions for Sonification and Auditory Display. Proceedings of the 14th International Conference on Auditory Display. Paris, France. Retrieved from Proceedings/2008/Hermann2008.pdf
- Hermann, Thomas. (n.d.). An Overview of Auditory Displays and Sonification. SONIFICATION.DE. Retrieved March 25, 2011, from http://sonification.de/son
Joel Louie is a PhD Candidate at Curtin University. Joel’s research and creative practice seek to explore how our relationship to computing technology is effected and affected through mediation with embodied interaction modalities. Influenced by the aesthetic of synaesthesia, Joel enjoys making interactive artifacts that translate one set of sensory input into a different set of sensory output. In addition to being a new media artist and composer, Joel is also a dedicated educator and researcher in the area of Human-Computer Interaction. His PhD dissertation is entitled, “Embodied Interaction: Tangible Human-Computer Interfaces using the corporeal frameworks of Maurice Merleau-Ponty and Rudolf Laban.”
Jan L Andruszkiewicz completed a Bachelor of Arts (Fine Art, Sculpture Major) at Curtin University in 1992 and a Bachelor’s degree in Computer Science at Edith Cowan University in 2007. He is in the process of completing a Master of Philosophy (Creative Arts) degree at Curtin University, Perth, WA.
His thesis, “Re-imaging visual information complexity: A creative approach to information entropy, perception, identification and understanding”, is due for completion in late 2011. Jan is a member of the Australian Computer Society (ACS), a Professional Member of the Association for Computing Machinery, and an Associate Member of Leonardo/ISAST.
Bryan J Mather is a polymath with two specific fields of expertise, Information Technology and Fine Art, and since 1981 he has alternated between these two careers. Bryan’s current research is into the effect of computer language (algorithmic) structure on digital reality. It extends the idea that we are constrained to a reality that exists in language, and proposes the digital simulacrum we live within is constrained by the algorithms we use to speak it into existence. The subject of this research is the mass uncontrolled experiment of digital culture, and its fundamental changes to human cognition, behaviour and communication.
Kevin Raxworthy is the senior technician in the Studio of Electronic Arts (SEA) at Curtin University of Technology and has been working in the area of media art since 1983. He was the technical support officer for the Biennale of Electronic Art Perth in 2002 and 2004. Raxworthy has been working in collaboration with Paul Thomas on the Midas Project, which was exhibited at Enter 3 Prague in 2007. For their current project, Nanoessence, he is writing an algorithm based on cellular automata; the algorithm is affected and stimulated by information gained from sensors that read the user’s breath. Kevin’s work looks at the nexus between artificial life, code space and art. He has recently completed a Master of Art (Electronic Art).
Julian Stadon completed a BA (Fine Arts) and an MA (Electronic Arts) at Curtin University. He currently works as an Associate Lecturer for Open Universities Australia, as a web development and e-learning researcher for Curtin Art Online, and as a Research Assistant for NOMAD. He recently completed an ANAT Emerging Artists Mentorship at the Interface Culture Lab, Linz, Salford University, Manchester and Curtin University, Perth, and recently received an Australian Postgraduate Award Scholarship along with a Curtin University Research Scholarship to continue his ongoing research practice.
Associate Professor Paul Thomas currently holds a joint position as Head of Painting at the College of Fine Art, University of New South Wales and Head of Creative Technologies, Centre for Culture and Technology, Curtin University of Technology. In 2009 he established Collaborative Research in Art Science and Humanity (CRASH) at Curtin: http://crash.curtin.edu.au. Paul is a practising electronic artist whose work has been exhibited internationally and can be seen on his website http://www.visiblespace.com