As an Engineer and Artist I work to bridge the gap between these fields and find ways in which each can benefit the other. I am currently in the process of developing a Mixed Reality (MR) Keyboard which will immerse viewers in a virtual environment by enhancing their real senses. This keyboard will be the focal point of an interactive installation which experiments with virtual interfaces and explores the potential of Virtual Reality (VR) as an artistic medium.
The technologies from which cinema developed were highly experimental and almost nothing like modern film-making, but without curious and novel inventions the medium would have never been able to develop into what it is today. These sometimes bizarre contraptions, such as the chronophotographic gun, created the initial framework from which cinema could find its niche. It was clear that the temporal nature of film could take it far beyond the constraints of photography, but intermediaries were needed to discover its potential. VR is currently in this same stage of development—it has gained considerable popular interest, and has the potential to grow beyond any of the extant media, but is in need of creative experimentation to do so. Virtual reality has the potential to free cinema from its physical and temporal determinism, to allow the viewer the freedom not only to move about a virtual space, but also to move between numerous possibilities within the virtual environment. Currently many pursuits in this field seek to find ways of applying it to cinema, such as the “VR movies” available through Facebook, Oculus and other such platforms; however, just as cinema divided itself from its photographic origins, just as television diverged from radio, so too must VR establish itself as a new, independent medium.
Hansen suggests that human perception is an embodied experience, one which cannot be reduced to any one sense, but exists through the complex interaction of all of the senses.1 VR is a medium which seeks to remediate the entirety of human perception rather than being limited to vision and sound as film and television are. As such, VR must engage all of the senses, including proprioception and tactility. According to Hansen, “with the flexibility brought by digitization, there occurs a displacement of the framing function of medial interfaces back onto the body from which they themselves originally sprang.”2 VR is capable of taking full advantage of the flexibility endowed by digital technology, removing the visible frame entirely, immersing the body of the viewer completely within the experience by connecting directly to their perceptual system.
This project seeks to contribute to both the technological and theoretical development of the emerging medium of VR. As Hansen suggests, “the vast majority of VR systems [to date]… work with an impoverished conception of experience as above all visual.”3 The primary objective of this Mixed Reality Keyboard Installation is to integrate the other senses and synchronize them with the visual to produce a more complete and convincing experience. According to H.G. Hoffman, integrating real objects into a virtual experience is an effective way to improve immersion by creating a sense that virtual objects belong to the user’s reality.4 This produces a mixed reality experience known as Augmented Virtuality5 in which a virtual environment is made to feel more real.
This installation will serve as an opportunity to expose the general public to this emergent technology, while also allowing them to influence its future development. In addition to explicitly experimenting with methods of improving perceptual connectivity, the work will simultaneously unveil the nature of perception itself to the viewer. The content of the installation will center on the various levels of abstraction through which we perceive the reality that surrounds us. For example, using the keyboard, the viewer will be able to control the position and brightness of a light source – depending on the “abstraction” layer viewed, they might see a fairly ordinary light, the data processed by the retinal ganglion cells, or a visualization of the quantum nature of light itself. As the project develops an understanding of perceptual interaction with virtual environments, so too will the viewer begin to glimpse the very systems that drive their perception of reality. The project also seeks to challenge assumptions made about the very nature of perception, establishing reality as something more than what can be seen and felt, as suggested in Donald D. Hoffman’s “Interface Theory of Perception.”6
The physical work will consist of an integrated virtual reality system with numerous inputs and outputs. It will use a custom-designed MIDI keyboard, an HTC Vive VR system, a Leap Motion hand tracker, a Microsoft Kinect, a VR-capable computer, and up to four projectors.
The MIDI keyboard will be affixed to the floor at the center of the room, taking on a sculptural quality through its minimalist wood and acrylic frame. It will be built at a height that facilitates interaction, and a matching stool may be included to ensure it can be used by all who enter the space. The HTC Vive will float weightlessly above the keyboard, suspended by a counterbalanced cable, encouraging visitors to interact with the system. While the headset is meant primarily for direct interaction with the keyboard, users will also be able to move about the space with the headset on, to interact with other spaces, people and possibly objects.
The Kinect and projectors will be positioned high, out of immediate view of viewers. The Kinect will be used to detect people entering the space, enabling interaction outside of the Vive headset, as well as creating a link between a viewer using the headset and the spectators outside. The projectors will be used to convert the space into a VR cave, projecting imagery related to spectators’ movements and the actions of the headset viewer and their keyboard inputs. The intent is to make the room an intermediary between the real outside world and the virtual one inside the headset. A single VR-capable computer will be used as the hub of the system, hidden inside or outside the room, or perhaps integrated into the keyboard stand depending on the requirements of the space.
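The hub computer described above can be thought of as an event router: each device (Kinect, headset, keyboard) feeds it input, and it updates one shared scene state that drives the projectors and headset. A minimal sketch of that routing pattern is shown below; the event names, payload fields, and class names are illustrative assumptions for this proposal, not part of the actual build.

```python
from dataclasses import dataclass, field

@dataclass
class SceneState:
    occupants: int = 0            # bodies detected by the Kinect
    headset_active: bool = False  # whether the Vive is being worn
    projector_layers: list = field(default_factory=list)  # pending visuals

class Hub:
    """Routes input events from each device into one shared scene state."""

    def __init__(self):
        self.state = SceneState()
        # Hypothetical event sources, one per physical device.
        self.handlers = {
            "kinect_presence": self._on_presence,
            "vive_headset": self._on_headset,
            "midi_input": self._on_midi,
        }

    def dispatch(self, source, payload):
        """Apply one device event and return the updated scene state."""
        self.handlers[source](payload)
        return self.state

    def _on_presence(self, payload):
        self.state.occupants = payload["count"]

    def _on_headset(self, payload):
        self.state.headset_active = payload["worn"]

    def _on_midi(self, payload):
        # Keyboard input drives the projected imagery rather than sound.
        self.state.projector_layers.append(payload["note"])

hub = Hub()
hub.dispatch("kinect_presence", {"count": 2})  # two spectators enter
hub.dispatch("midi_input", {"note": 60})       # middle C pressed
```

Centralizing the state in one hub keeps the headset view, the room projections, and presence detection synchronized from a single source of truth, which matches the single-computer design described above.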
Some functional elements may be made visible in order to give visitors a glimpse into how the system works, but much will be hidden, just as many functional elements of reality are hidden by perception. The room itself will be physically empty other than the components required for the function of the system, giving the outlandish virtual environment precedence over the austere real space it occupies, and creating space for interaction.
The virtual environment will be completely controlled by spectators’ actions. Interaction will occur in varying degrees, beginning as soon as a spectator enters the room: through positional tracking, their presence alone will affect the virtual space. By moving about the space they produce further effects and thus become more intertwined with the virtual. They may also play with the keyboard without donning the headset, which will produce more direct effects on the space. Finally, by putting on the HTC Vive they become completely immersed in the experience. The keyboard will enable the most direct interactions, but it will not operate as a conventional keyboard: it will produce visuals instead of sound. Sound is likely to be an important aspect of the experience, but it will be detached from the keyboard’s typical association with a piano. A slider on the keyboard will allow the viewer to move through various levels of “abstraction,” from a quantum-particle view at one end to the fully abstracted state of human perception at the other; as elementary particles are processed from their real state toward perception, the viewer can choose the stage at which they intercept the information. The slider will also affect the visuals projected into the room; these visuals, however, will be necessarily simplified, encouraging interaction with the full experience contained within the center of the display.
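The control mapping described above can be sketched as two small functions: one quantizing the slider’s MIDI value into a discrete abstraction level, the other mapping a key press to the light’s position and brightness. The level names, the 88-key range, and the function names are assumptions made for illustration; only the 0–127 value range is fixed by the MIDI specification.

```python
# Illustrative abstraction levels, ordered from "real state" to human perception.
ABSTRACTION_LEVELS = [
    "quantum",      # elementary-particle view
    "photon",       # light as it propagates
    "retinal",      # data as processed by retinal ganglion cells
    "perceptual",   # the ordinary, fully abstracted scene
]

def slider_to_level(cc_value):
    """Quantize a 0-127 MIDI control-change value to a discrete level."""
    index = min(cc_value * len(ABSTRACTION_LEVELS) // 128,
                len(ABSTRACTION_LEVELS) - 1)
    return ABSTRACTION_LEVELS[index]

def key_to_light(note, velocity):
    """Map a key press to light position (by pitch) and brightness (by velocity)."""
    position = (note - 21) / (108 - 21)  # normalize across an 88-key range
    brightness = velocity / 127          # MIDI velocity is 0-127
    return position, brightness
```

For example, the slider at its minimum selects the quantum view and at its maximum the perceptual view, while the highest key struck at full velocity places the light at the far end of its range at full brightness.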
To promote further technical development in this field, participants will be encouraged to complete a survey as they leave the space, and may opt to be interviewed further in order to gather data about their experience. This will allow the piece to be distilled into useful information that can in turn inform the progress of virtual reality as a whole.
Present State of the Work
The work is currently a functional prototype which uses most of the physical components mentioned above, with the exception of the Kinect. The current design is being used to experiment with various physical and virtual interfaces and will serve as the basis for the finished work. A refined keyboard made for public display will be produced for the installation. Much of the development yet to be completed lies in the virtual environment itself, which will be designed between May and August 2017.
Materials & Budget
The following list of materials and budget assumes all components need to be purchased specifically for this installation. For temporary installations, many of these components will be borrowed, cutting the cost roughly in half. For a permanent installation, the cost could as much as double to ensure durability and reliability.
| Material | Cost |
|---|---|
| Wood and acrylic for frame | $700 |
The system will be designed to move from location to location and to keep installation as simple as possible. The installation process is expected to take under two days and will involve mounting the components in the room and calibrating the system to the space.
As an interdisciplinary project, the installation will be displayed in technical and cultural centers both locally and internationally.
Local Gallery Installations
- MAP Maker Space. The work will initially be presented in an open house in the MAP Maker Space sometime between August and December 2017. This preliminary display will provide the opportunity to present the work to a smaller audience in a more intimate environment and to gather feedback prior to a more public display.
- Fifth Parallel Gallery. The Fifth Parallel will be targeted as the first public display of the installation, between November 2017 and February 2018, running for one week to one month. This location is ideal for both its size and its visibility to University students.
- Neutral Ground/SOIL. Other than the Fifth Parallel, Neutral Ground is likely the most ideal location for this piece. Both venues will be sought if possible. SOIL Media Art & Technology will also be considered as a route to have a display at Neutral Ground.
- Art Gallery of Regina. The Art Gallery of Regina will be considered as an alternative location; however, the space is larger than ideal for the installation and would require special consideration.
- Mackenzie Art Gallery. The Mackenzie Art Gallery will be contacted to determine whether this installation would be viable, but is not expected to be a good fit.
- The Hague Gallery. The Hague Gallery is the least ideal location due to its limited public hours and the likely conflict of dedicating the space for the full duration of the exhibition.
Conferences

- CMMR 2017, ICMC 2017. A paper detailing the work has been submitted to these conferences and will be presented pending acceptance.
- Audio Mostly 2017. An installation demo will be submitted to Audio Mostly in London to demonstrate the technical potential of this project. If accepted, this demonstration will occur in late August 2017, although the piece is likely to be presented as a work in progress.
- SIGGRAPH 2018. I also intend to submit the finished piece to SIGGRAPH 2018 in Vancouver along with any other relevant art and technology conferences in 2018.
This installation will be a cutting-edge display of technology and an artistic exploration of the capacity of a new medium. It will explore perceptual immersion of viewers while simultaneously allowing them to explore their own perception. It will enable various levels of immersion and will encourage interaction and participation from all who enter. Furthermore, by exposing this developing technology to the public and providing a venue for discussion, it will allow for a collaborative shaping of this emerging field.
1. Mark B.N. Hansen, New Philosophy for New Media, (Cambridge, Mass.: MIT Press, 2004), 100.
2. Ibid., 22.
3. Ibid., 162.
4. H.G. Hoffman, “Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments,” in Proceedings of the IEEE Virtual Reality Annual International Symposium ’98, Atlanta, GA (Los Alamitos, CA: IEEE Computer Society, 1998), 59–63.
5. Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information and Systems E77-D, no. 12 (1994): 1321–1329.
6. Donald D. Hoffman, “The Interface Theory of Perception,” Current Directions in Psychological Science 25, no. 3 (2016): 157–161, doi:10.1177/0963721416639702.