‘Virtually Sensuous (Geographies): Archiving, VR and Participatory Installations’.
Rambert School of Contemporary Dance / University of Chichester
I speak to you as a choreographer who has been actively involved for several years in the creation of choreographic installations – themselves another form of live dance event. For at least half a century the archiving of live dance events has relied on video documentation. Over the years, techniques for camera work and direction have been devised and refined to the point at which the somatic responses integral to the experience of watching live dance can, to some degree, be evoked on screen.
But today I am taking you into another domain. For many years, dance and choreography have been taking steps beyond the stage. One manifestation of this has been the development of a choreographic practice that takes the form of interactive immersive installations driven by a choreographic sensibility. These works set up the conditions for a first-person engagement with an audiovisual environment, one which allows the sonic and visual details of the installation to be experienced kinaesthetically. Video documentation has not proven adequate for archiving such installations.
Immersive installations span the art forms. Some constitute 360º surround-sound environments; others are multiscreen environments featuring a multiplicity of video images. An example of the latter can be found in Isaac Julien’s multi-screen installation Ten Thousand Waves (2010). SLIDE 2 In a work such as this the viewers ‘choreograph’ their own experience by deciding when and how to look at the multiple screens and how to move between and/or through them. In still others, created mainly by choreographers, the screen imagery is designed to emphasise the sensate experience that immersion in the audiovisual environment brings. SLIDE 3 Examples are trajets by Susan Kozel and Gretchen Schiller (v1 1999 / v2 2007) and SLIDE 4 Sensuous Geographies by myself and composer Alistair Macdonald (2003).
In some such installations the audiovisual imagery may be pre-set and unchanging; in others the system is interactive (or responsive), allowing participants to modulate the properties of the imagery they encounter. In these installations the visitors’ actions can change any or all of the following properties of the audiovisual environment: scale, colour, timbre; velocity, volume, density; the presence or absence of particular images or sounds; and the order in which the imagery appears. In this way visitors can orchestrate and reconfigure their environment through their behaviour. Because we experience a continual interplay between our bodies and the environment, at the same time as participants are manipulating the world of the work they are implicitly manipulating their sensate responses to it.
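To make this kind of responsive mapping concrete, here is a minimal, purely illustrative sketch of how a tracked visitor’s position and speed might be mapped onto parameters of the kind listed above. The function name and the particular mappings are my own invention, not drawn from any installation’s actual software.

```python
def map_to_params(position, velocity):
    """Map a tracked position (x, y in metres) and a speed (m/s)
    to illustrative audiovisual parameters in the range 0..n."""
    x, y = position
    speed = abs(velocity)
    return {
        "volume": min(1.0, speed / 2.0),       # faster movement -> louder
        "timbre_brightness": x / 4.0,          # left-right position shifts timbre
        "image_scale": 1.0 + y / 4.0,          # depth in the space scales imagery
        "density": min(1.0, speed),            # movement thickens the visual texture
    }

# A visitor standing 2 m across and 1 m into the space, moving at 0.5 m/s:
params = map_to_params(position=(2.0, 1.0), velocity=0.5)
```

The point of the sketch is only that each behavioural variable feeds one or more audiovisual variables continuously, so the environment is reconfigured by behaviour rather than by discrete controls.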
Clearly, the complex, ever changing perceptual effects engendered by such environments cannot be recorded or archived from outside, for this can only provide a third person perspective on a work that is dependent on first person engagement.
If future dance and media historians and artists are to gain access to this facet of the history of choreographic thinking they need the opportunity to share in a semblance of the act of engaging actively with the worlds of the installations being created.
To do this, new strategies for archiving such installations need to be devised.
SLIDE 4 SG
This presentation reflects on potential strategies for creating an embodied archive of Sensuous Geographies, an example of such immersive choreographic installations, created by myself and an electroacoustic composer. As part of the history of a moment in the expansion of the concept of the choreographic, Sensuous Geographies has a presence in books (Digital Performance, Dixon, 2008; Identity, Performance and Technology: Practices of Empowerment, Embodiment and Technicity, Broadhurst and Machon, 2007), in conference presentations and academic papers from disciplines as diverse as computer science, geography, music, dance, theatre and media, and on the web via video documents. None of the extant documents, however, gives access to the experience of engaging with this installation environment.
By virtue of its structures, the installation raises several issues for those involved in archiving immersive interactive installations for future generations. SLIDE 5
For example, it comprises multiple layers in terms of structures, media and artistic intent.
With respect to the latter, it was intended that the installation would:
- use sensate, as well as conscious, awareness to guide movement behaviour;
- facilitate collaborative interactivity between participants;
- generate emergent sound worlds and spatial choreosonic events;
- SLIDE 9
- and allow participants both to engage interactively with the installation, and to view the performance of the interactive event taking place.
But how to document this? Until recently the idea of providing documentation of the first-person experience of an installation has been an unattainable dream. However, recent developments in VR technologies (Malakach, 2017) suggest that the dream is becoming more and more attainable. In this presentation I will rehearse some of those developments, and suggest that even installations such as Sensuous Geographies can be successfully archived as a record of a moment in choreographic history.
Before moving on to what such an archive might comprise, it is worth describing how the installation worked.
Sensuous Geographies comprised an inner and an outer environment. The outer environment was the first to be encountered. Although not responsive, it was experiential through its use of surround sound for the sonic environment. SLIDE 11 Thus, whilst visitors viewed the informal choreography that was emerging from the active participants’ behaviour, they also had a sense of being within the space.
There was also an inner environment, which was electronically sensitised and responsive to the visitors’ movement. SLIDE 12 On entering this area visitors crossed the threshold from looking to hearing, from looking to feeling. Each visitor was allocated a particular sound when they entered the inner space. This sound followed their trajectories as they navigated the space. At the same time the sonic textures of the sound strand changed in response to the direction and velocity of the participants’ movement. The combined assemblage of visitors’ sounds not only created a rich and sensate sonic environment, but also enabled the participants to locate others moving towards them, away from them or circling them. Over time, different actions, for example shifting proximities between participants, became additional means of modulating the soundworld – the system echoing the progressive levels of a computer game. With repeated exposure, participants could learn the rules of the game and ‘play’ the installation like a huge musical instrument.
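The core mechanism – a sound strand allocated to each visitor, following their position, with proximity between visitors modulating the shared soundworld – can be sketched as follows. The class, names and the proximity radius are illustrative assumptions, not the installation’s actual code.

```python
import math

class SoundStrand:
    """A visitor's allocated sound, whose spatial position tracks the visitor."""

    def __init__(self, sample_name, position):
        self.sample = sample_name
        self.position = position  # (x, y) in metres

    def move_to(self, position):
        self.position = position

def proximity_gain(a, b, radius=1.5):
    """Raise a shared texture's gain as two visitors approach within `radius` metres."""
    d = math.dist(a.position, b.position)
    return max(0.0, 1.0 - d / radius)

# Two visitors enter and are each allocated a sound:
alice = SoundStrand("bell_loop", (0.0, 0.0))
bea = SoundStrand("voice_drone", (3.0, 0.0))

bea.move_to((1.0, 0.0))             # Bea walks towards Alice
gain = proximity_gain(alice, bea)   # closer proximity -> stronger modulation
```

In this sketch, as in the installation, locating another participant is done by ear: the gain rises continuously as two sound strands converge.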
Through their activity participants were creating on the one hand an observable composite choreosonic event and on the other a rich affective environment.
As can be imagined, documenting this multidimensional sensory installation satisfactorily presents something of a challenge.
In 2013, 3D documentation seemed a possible solution to at least one aspect of the installation: that of navigating the installation environment. As such it offered a minimal form of first-person experience, but it could not offer the immersive experience, as the installation was being viewed on a screen from a third-person perspective. Thus, as with video, 3D could not generate the kind of embodied experience initiated by Sensuous Geographies.
My archival dream has been to find a way of simulating, within a virtual version of Sensuous Geographies, the kind of embodied experience generated through participants’ engagement with it, and the collective interactive engagement that was central to this installation.
I believe VR can now take us a long way towards this through:
- SLIDE 13 the advent of affordable Head Mounted Displays (HMDs);
- SLIDE 14 advances in audiovisual rendering technologies;
- SLIDE 15 advances in the interactive capabilities of multiuser games engines;
- SLIDE 16 the increasing attention being paid by VR developers to the interplay between the perceptual senses, and the role this plays in our embodied experience of the VR environments we occupy.
When using an HMD, players are embraced by the virtual world’s material environment. Just as in the real world, there is no ‘edge of vision’. Simply by moving their heads players can not only see the world, but also get a corporeal, or kinaesthetic, sense of its breadth and depth. In VR there is no break in the ‘flow’ of the sense of the body being enfolded in the environment, as can happen in a CAVE, for example, when the flow of embodied vision is broken by an edge.
But where would a VR re-presentation of an immersive art installation experienced through an HMD stand in relation to the real-world version? Just as we engage in an intricate interplay with our environment in the real world, which affects not only how and where we move, but also our perceptions and our sense of being, so too do we in VR. Whilst we cannot replicate individual experiences of a real-world phenomenon in Virtual Reality, we can create a model which provides the relevant conditions for generating that experience.
And it is to these perceptual conditions that I now turn my attention.
The importance of the way our perceptual systems operate cannot be overemphasised. The five basic senses are vision, hearing, smell, taste and touch. However, the haptic sense system (touch) incorporates the kinaesthetic senses and proprioception. These are an integral part of all perceptual experience.
Perception is not simply a combination of parallel perceptual inputs; it is generated by an intricate, multimodal, dynamic network of sensory systems. In play, this results in a complex, highly integrated system that absorbs, interprets and responds to complex environmental information coming from all of the senses simultaneously (Gibson 1979).
SLIDE 18 Further, all the perceptual channels simultaneously act upon, and are acted upon by, each other – and SLIDE 19 of equal significance, they are being acted upon from all directions by the environment.
In the real world, as we move around, changes in orientation and proximity to the features of the environment change what we see and hear, enabling us to estimate position and orientation and shaping our affective responses. SLIDE 19 Significantly, we use the same perceptual cues in a multi-sensory virtual world as we do in the real world.
To build a virtual world with the affective tone that gives the rich embodied response that characterises the real-world Sensuous Geographies, and which can be experienced from without and from within, we would need to create:
SLIDE 21 a realistic visual re-presentation of the installation environment, which will SLIDE 22 allow participants to navigate the world and encounter other users in some way, and which displays
SLIDE 23 the direction and play of light that give rise to changes of intensity and affective tone. This is what VR developers call the plenoptic function (Adelson et al. 1991; Wong et al. 2002).
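For reference, the plenoptic function introduced by Adelson and colleagues is conventionally written as a seven-dimensional function: the intensity of light reaching a viewpoint, for every viewing direction, wavelength and moment in time.

```latex
% The seven-dimensional plenoptic function (Adelson et al. 1991):
% light intensity P observed from viewpoint (V_x, V_y, V_z),
% looking in direction (\theta, \phi), at wavelength \lambda, at time t.
P = P(\theta, \phi, \lambda, t, V_x, V_y, V_z)
```

Rendering a virtual installation convincingly amounts to approximating this function well enough, from wherever the participant happens to stand and look.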
a rendering of a multi-layered sonic environment which is SLIDE 25 spatially distributed and simulates the spatio-temporal acoustic pressure that flows from the volume and shaping of the material environment SLIDE 26. This is known as the plenacoustic function, or the environment’s acoustic footprint/sound field (Stones 2014).
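At its very simplest, spatially distributed sound rendering reduces to attenuating and delaying each source by its distance from the listener. The following is a minimal free-field sketch of that idea, under assumptions of my own (a point source, inverse-distance attenuation); a genuine plenacoustic rendering would additionally model the reflections produced by the volume and shaping of the material environment.

```python
SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 °C

def render_source(distance_m, reference_m=1.0):
    """Return (gain, delay_s) for a point source `distance_m` from the listener.

    Gain follows inverse-distance (1/r) attenuation beyond the reference
    distance; delay is the travel time of sound over that distance.
    """
    gain = reference_m / max(distance_m, reference_m)
    delay_s = distance_m / SPEED_OF_SOUND
    return gain, delay_s

# A source 3.43 m away arrives 10 ms later and noticeably quieter:
gain, delay = render_source(3.43)
```

Even this crude model gives a listener the distance cues the talk describes: moving towards a sound makes it louder and more immediate, moving away softens and distances it.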
SLIDE 27 It must also give a realistic sense of kinaesthetic involvement in the shifts and changes in the VR environment, SLIDE 28 which is generated both by the play of light in the environment and by the sensations generated by the sonic environment.
SLIDE 29 We also need to give players full control of their movement in the virtual space by providing
SLIDE 30 a navigational interface that allows them to negotiate the space intuitively as they do in the real world
The latter is achieved through the activation of the proprioceptive and kinaesthetic systems (Nitzsche 2004).
In order to create the conditions that would allow for this in VR, Norbert Nitzsche built an electronically sensitised real-world user environment. Users in a small real-world space donned an HMD and then navigated the virtual world they were seeing through the HMD by walking. This was made possible by a Motion Compression algorithm, which allowed users in a restricted real-world space to navigate the virtual world intuitively using not only the visual and auditory systems, but also the proprioceptive and kinaesthetic systems (this is known as the plenhaptic function). SLIDE 32 Developments of Nitzsche’s interface have given rise to Extended Range Telepresence, which is increasingly being incorporated into VR systems, SLIDE 33 including for archival purposes in the visual arts (for example, Nam June Paik’s installation Versailles Fountain (2013-14)).
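To give a flavour of how walking in a small tracked space can cover a larger virtual one, here is a deliberately simplified sketch of one redirection idea: a uniform translation gain that stretches each real step in the virtual world. This is a related, cruder technique than Nitzsche’s actual Motion Compression, which preserves path lengths and instead imperceptibly bends the path’s curvature; the extents below are invented for illustration.

```python
def compress_step(real_step_m, real_extent_m=4.0, virtual_extent_m=20.0):
    """Scale one real-world step into the virtual world.

    With a 4 m tracked area mapped onto a 20 m virtual space,
    every real metre walked covers five virtual metres.
    """
    scale = virtual_extent_m / real_extent_m
    return real_step_m * scale

virtual_step = compress_step(0.6)  # a 0.6 m stride, scaled by 5x
```

The perceptual point is the same in both techniques: the user’s proprioceptive and kinaesthetic systems stay engaged, because navigation is done by actually walking rather than by pushing a thumbstick.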
Finally, the installation’s bespoke interactive system needs to be imported into the VR system to simulate the users’ modulation of the installation’s audiovisual environment.
And a system must be devised so that players can engage in collective interactivity, and progressively advance to more difficult levels of interactivity.
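The progressive, game-like structure described here can be sketched as a simple state machine: the system advances to a richer level of interactivity each time the participants act together. The level names are invented for illustration and do not come from the installation’s actual system.

```python
# Hypothetical levels of interactivity, from individual to collective play.
LEVELS = ["solo_modulation", "proximity_play", "collective_orchestration"]

class InteractionState:
    """Tracks which level of interactivity the participants have unlocked."""

    def __init__(self):
        self.level = 0

    def register_collective_event(self):
        """Advance one level when participants act together (capped at the top level)."""
        if self.level < len(LEVELS) - 1:
            self.level += 1

    @property
    def name(self):
        return LEVELS[self.level]

state = InteractionState()
state.register_collective_event()  # participants begin to play off one another
```

As with the original installation, repeated exposure lets players learn the rules and move from individual modulation towards orchestrating the environment collectively.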
SLIDE 38 The technology is in place for the development of VR re-presentations of experiential installations that allow visitors both a first-person embodied and interactive experience, and a third-person, spectatorial one.
So this is my proposal for using new technologies in the documentation of choreographic, and indeed any interactive immersive installations. Were it to be subjected to actual rather than virtual research, it has the potential to lead to advances in the archiving of immersive and certain kinds of participatory installations.
Initially this entailed modulating and spatializing sound individually, weaving it through and around the sound strands of others. The modulation of those sounds would then take place through collective behaviour, including control of the spatial proximity between individual players. In this way the players could orchestrate in realtime a unique rendering of the nascent sound environment that formed the compositional plane from which Sensuous Geographies was brought into being.
However, Larsson and his colleagues (2001) have revealed that the affective tone and spatial awareness of a VR environment are contingent on information derived from both visual and aural channels, to such an extent that a mismatch between the sonic and visual perceptual cues in a virtual world measurably reduces the sense of presence of its visitors.