Third Life Project is a networked international arts-based research collaboration between Territorium KV (AT), Simula Research Lab (NO), MIH Media Lab/Stellenbosch University (ZAF) and Embedded Systems/University of Duisburg-Essen (DE). The project explored the potential of virtual actions to perform real actions, causing extravirtual physical effects on physical objects (output devices) as well as effects on the bodily and mental states and behaviours of persons (emotions, sensory impressions, beliefs, etc.) in a performative setting.
In October 2015, the team presented three performance lectures at WUK Vienna, staging the video game Minecraft in real time within a theatrical performance. The performance was not built around a specific narrative; it focused instead on extravirtual avatar interactions with performers and objects in the physical world. The goal of the project, both artistic and technological, was to devise a distributed, hybrid and distinctive performance while creating a platform for sharing knowledge. Each performance was directly followed by a discussion with the spectators to give insight into how the artists and the experts work, and to exchange ideas about performing with mixed-reality and ubiquitous technologies.
The technological interface combined Minecraft environments, novel tracking technologies and connected objects of the Internet of Things (IoT), most of them designed and developed by the project members. Minecraft also defined the overall aesthetics of the set design and of the ubiquitous objects embedded in the physical world. A computer server developed as part of the FiPS project was present on stage to host the Minecraft virtual world. The smart stage allowed the first performer to view the virtual world through a head-mounted display, the Oculus Rift DK2. To move in the virtual world, a single camera worn by the performer observed a set of pre-installed fiducial markers (CCTags). The performer walked barefoot on a soft carpet and, to cover longer virtual distances, stepped off it into what we named the scrolling area. The carpet provided haptic feedback for the transition between the areas and constantly reminded him of the borders of his physical area. The off-the-shelf Leap Motion controller, mounted on the front of the Oculus and integrated into the Minecraft client, allowed the performer to select, place, break or otherwise manipulate virtual objects.
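The carpet/scrolling-area locomotion described above can be sketched roughly as follows. All names, sizes and speeds here are hypothetical; the source does not describe the project's actual integration with the tracking pipeline or the Minecraft client.

```python
import math

# Hypothetical parameters, not the project's actual values.
CARPET_HALF_SIZE = 1.0   # metres from the carpet centre to its edge
SCROLL_SPEED = 2.0       # virtual metres per second while in the scrolling area

def avatar_step(x, y, dt):
    """Map one tracking update to an avatar displacement.

    (x, y): performer position relative to the carpet centre (metres).
    dt: time since the previous update (seconds).
    """
    on_carpet = abs(x) <= CARPET_HALF_SIZE and abs(y) <= CARPET_HALF_SIZE
    if on_carpet:
        # inside the carpet area: direct 1:1 mapping of physical motion
        return ("direct", x, y)
    # in the scrolling area: keep moving the avatar in the direction the
    # performer stepped out, so long virtual distances fit a small stage
    norm = math.hypot(x, y)
    return ("scroll", SCROLL_SPEED * dt * x / norm, SCROLL_SPEED * dt * y / norm)
```

The haptic contrast between the soft carpet and the floor around it is what lets the blindfolded-by-HMD performer feel which of the two mapping modes he is in.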
Whereas the first performer was fully immersed in the virtual world, the second one experienced it through a traditional 2D screen and could perform activities that would be difficult or potentially dangerous for the first one. For example, he jumped on a real trampoline with embedded sensors to make the avatar jump and dance. He could also teleport the avatar between virtual locations by carrying a physical block with embedded sensors to different locations onstage. Overall, these applications immersed the second performer in the game and opened space for an interesting power dynamic between the performers. The first one had the power to activate, through specific virtual actions in Minecraft, all the connected physical objects onstage, which otherwise stayed unresponsive. The second performer could act as a kind of guardian angel for the first one: if the avatar could not get free or needed to get away quickly, he could initiate a teleport, and if the avatar got stuck in a hole in the virtual world, the performers could combine their actions to bounce the avatar out. We only conducted initial experiments, but our case shows that real-life multiuser cooperation can enrich immersion by giving interactions with virtual environments an interpersonal dimension.
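The routing from physical sensor events (trampoline bounces, the sensor-equipped block) to avatar commands can be sketched as a simple dispatcher. The event schema, sensor names and command format below are assumptions for illustration; the source does not describe the project's actual IoT protocol.

```python
# Hypothetical sketch of routing events from the connected stage objects
# to Minecraft avatar commands. Every field name here is an assumption.

def route_event(event):
    """Map a physical sensor event to an avatar command, or None."""
    kind = event.get("sensor")
    if kind == "trampoline" and event.get("bounce"):
        # a bounce on the instrumented trampoline makes the avatar jump
        return {"command": "jump"}
    if kind == "block_tag":
        # the physical block's onstage position selects a teleport target
        return {"command": "teleport", "target": event["location"]}
    # idle or unknown sensors trigger nothing
    return None
```

The coupling also ran the other way round: specific virtual actions by the first performer switched the otherwise unresponsive physical objects on, which the same dispatch pattern covers with virtual events in and physical commands out.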
For various production reasons the performance was not devised for the audience to actively participate and interact with the technology. That stimulated many thoughts in the discussions on the nature of performing with command-and-control technologies. The interactions were reserved almost exclusively for the two performers, and the audience simply viewed these interactions in a way that might be no different from viewing any other type of theatre performance. The only exceptions were two situations in which one of the performers used an embedded camera to read QR codes that had been given to audience members with their theatre tickets, allowing them to appear virtually as non-player characters in the Minecraft worlds.
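The ticket QR-code mechanic amounts to decoding a payload and spawning a non-player character for that audience member. The payload format and the spawn interface below are invented for illustration; the source does not specify how the project encoded or consumed the codes.

```python
# Hypothetical sketch of the ticket-to-NPC mechanic. The "TLP:<id>:<name>"
# payload format and the list-based world model are assumptions.

def spawn_npc_from_ticket(payload, world):
    """Register an NPC for the audience member encoded in a QR payload."""
    prefix, ticket_id, name = payload.split(":", 2)
    if prefix != "TLP":
        raise ValueError("not a Third Life ticket code")
    npc = {"id": ticket_id, "name": name, "kind": "npc"}
    world.append(npc)  # the virtual world is modelled here as a plain list
    return npc
```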
Most of the development was conducted via weekly video conferencing meetings, where the team discussed all aspects of the research design and development, from initial ideas to prototyping, problem solving and evaluation. With the exception of the interface integration and the testing, which required meeting in person, the online chats prompted regular updates on each member's individual progress while keeping an appropriate balance of control between the artists and the researchers. The design iterations moved loosely through the phases of concept and fundraising (2014) and development, user testing and evaluation (2015), with the last two repeated several times, especially towards the end of the project.
The general objectives of Third Life Project:
The objectives challenging the existing paradigms of aesthetics and critique:
Concept / Dramaturgy / Scenography / Virtual environments / Performance: Otto Krause & Milan Loviska (Territorium KV, Vienna)
Scientific and technological realization:
Minecraft expertise and gesture control: Herman Engelbrecht, Jason Bradley Nel (MIH Media Lab, Stellenbosch University, South Africa)
Tracking: Carsten Griwodz, Lilian Calvet (Simula Research Lab & LABO Mixed Realities, Norway)
Cyberphysical devices and Non-Player-Characters: Gregor Schiele, Alwyn Burger, Stephan Schmeißer, Christopher Cichiwskyj (Embedded Systems, University of Duisburg-Essen, Germany)
Server: René Griessl (Bielefeld University, Germany)
Premiere @ WUK Vienna, 08 October 2015
A co-production of Territorium KV and WUK performing arts in Vienna