Virtual DJ is a complex piece, consisting of five levels and more than 80 maps divided among them, all heavily populated with various combinations of light and sound. While Gibson had developed and performed the piece for over three years and was intimately familiar with the locations of the sound and light elements across the 80 maps, Grigar, new to both motion-tracking performance and the piece itself, had to learn how to work in the space and with the piece before she could begin rehearsing for the performance.

Because GAMS lets users see one another's relative locations on the map displayed on the screen of the computer controlling the data (via Flash Track), Grigar was able to memorize the position of her body in the space relative to the media elements programmed into it. For example, having learned that a certain arpeggio was located on the map downstage on the right side of the room, about 1.85 meters above the ground, she could reach her hand above her head to "touch it" with the tracker.
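The idea of "touching" an element can be pictured as a simple proximity test between the tracked hand and a point programmed into the map. The following is a minimal sketch, not GAMS code; the function names, coordinates, and trigger radius are all illustrative assumptions.

```python
import math

def within_trigger(hand, element, radius=0.25):
    """Return True if the tracked hand is within `radius` metres
    of a media element programmed at a 3-D point on the map."""
    return math.dist(hand, element) <= radius

# Hypothetical element downstage right, about 1.85 m above the floor.
arpeggio = (3.0, 1.0, 1.85)

# A hand raised overhead near that point would trigger the sound.
print(within_trigger((3.1, 0.9, 1.80), arpeggio))  # True
```

The trigger radius stands in for however much spatial tolerance the real system allows around each programmed element.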

The cameras in her studio, sensing her hand at that place on the map, would send her location, articulated three-dimensionally, to her PC. The PC would then send it to her Mac, where the sound, produced with Reason, was housed. At the same time, the PC would also transmit the data to the central computer in Canada, where the database linked to the programmed maps resides. That computer would send the appropriate data about the arpeggio to Gibson's PC and Mac in his studio and back to Grigar in hers, allowing her not only to evoke the element and produce the sound (evoking and producing it in Gibson's studio almost simultaneously) but also to see the location of that element represented visually on her PC screen as a point in 3D space on that specific map.
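The fan-out described above can be sketched in miniature. This is an assumed model, not the actual GAMS or Flash Track protocol: each endpoint is represented simply as a list that records the tracker readings it receives.

```python
def send_position(position, local_mac, central_server, remote_studio):
    """Relay one 3-D tracker reading from the local PC outward:
    to the local Mac (which produces the sound), to the central
    map database, and on to the remote studio."""
    local_mac.append(position)        # local Mac running Reason plays the element
    central_server.append(position)   # central computer matches it against the map
    remote_studio.append(position)    # remote studio evokes the same element

local_mac, server, remote = [], [], []
send_position((3.0, 1.0, 1.85), local_mac, server, remote)
print(local_mac == server == remote)  # True: all endpoints saw the same reading
```

The point of the sketch is only that a single gesture produces near-simultaneous effects at every node, which is what makes the shared performance possible.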

With visual recognition of space reinforcing kinesthetic involvement in that space, Grigar learned Virtual DJ well enough to rehearse it in less than ten hours. In that sense, embodied telepresence made it possible for Grigar to learn the piece, rehearse it, and eventually perform it online with Gibson. They have performed the piece over the net several times since this early performance, most recently at BC.net, held in Vancouver, Canada, in November 2007.

4 Conclusion

Motion tracking technology enhances telepresence and collaboration in performance and installations by making it possible for users to manipulate not only data objects like images, video, sound, and light but also hardware and equipment, such as computers, robotic lights, and projectors, with their bodies in a 3D space across a network. As the two examples of its use show, applications of this technology can do much to promote collaboration on digital media projects where hardware, software, and peripherals must be controlled in real time by teams working together at a distance, or where physical computing research is undertaken.


Works Cited

Frayn, Michael. Copenhagen. New York: Anchor Books, 1998.

Gibson, Steve. 2002. Virtual DJ. January 16, 2006 <http://www.telebody.ws/Virtual DJ/>.

Gibson, S. and Grigar, D. 2005. When Ghosts Will Die. January 16, 2006 <http://www.telebody.ws/Ghosts>.

O'Sullivan, Dan and Igoe, Tom. Physical Computing: Sensing and Controlling the Physical World with Computers. Boston, MA: Thomson, 2004.

Rokeby, David. 2000. Installations: A Very Nervous System. January 15, 2006 <http://homepage.mac.com/davidrokeby/vns.html>.

Wilson, Stephen. Information Arts: Intersections of Art, Science, and Technology. Cambridge, MA: The MIT Press, 2002.
