Interactive Tech Demonstrations for Live Performance
Historical Note: Analog vs. Digital
Starting back in 1965, I think Variations V is a prelude to things to come:
Cage, Cunningham, VanDerBeek (& Moog and Tudor): Variations V
Video Tracking
Frieder Weiss, creator of the EyeCon software used in Chunky Move’s GLOW
Demonstration of EyeCon software (Windows)
HARDWARE: Depth Sensing Cameras
The Xbox Kinect and similar depth-sensing cameras are at the root of many gesture- and movement-controlled artworks and performances.
Rashaad Newsome uses a Kinect for the Five SFMOMA piece; I will demonstrate how that piece works.
Below are a few basic examples so you can begin to see the correlation between body and camera. Some samples are long, so feel free to scrub through these video examples.
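The trick a depth camera adds is a distance reading for every pixel, which makes separating a performer from the background much simpler than with an ordinary camera. Here is a hedged Python/NumPy sketch of that idea only; this is not Kinect SDK code, and the tiny "depth frame" and near/far thresholds are invented for illustration:

```python
import numpy as np

def performer_mask(depth_mm, near=500, far=3000):
    """Keep only pixels whose depth (in millimeters) falls inside a
    near/far band -- a crude way to isolate a body from the background."""
    return (depth_mm >= near) & (depth_mm <= far)

# Synthetic 4x4 "depth frame": 8000 mm = back wall, 1500 mm = a body.
frame = np.full((4, 4), 8000)
frame[1:3, 1:3] = 1500
mask = performer_mask(frame)
print(mask.sum())  # 4 pixels belong to the "performer"
```

Once you have a mask like this, everything downstream (triggering sound, drawing, lighting) is just a question of what you map those pixels to.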
SOFTWARE: Max MSP
I teach the software Max/MSP from Cycling ’74, so most of these examples are rooted in the use of that software.
(other software of interest: Isadora, MadMapper, TouchDesigner, Processing, openFrameworks)
Demonstration: Computer vision with Max/MSP Jitter.
Demonstration: Motion detection with Max/MSP Jitter and a web camera.
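Under the hood, basic motion detection is usually frame differencing: compare the current camera frame to the previous one and measure how many pixels changed. The class demo uses Max/MSP Jitter, but the underlying logic can be sketched in a few lines of Python/NumPy (the frames and threshold here are synthetic, for illustration only):

```python
import numpy as np

def motion_amount(prev, curr, threshold=30):
    """Fraction of pixels whose brightness changed by more than
    `threshold` between two grayscale frames."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return (diff > threshold).mean()

prev = np.zeros((4, 4), dtype=np.uint8)   # last frame: all dark
curr = prev.copy()
curr[0, :] = 200                          # one row of pixels "moved"
print(motion_amount(prev, curr))          # 0.25
```

In a patch, that single motion number is what you would route to volume, video effects, or lighting.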
Color Tracking with a camera.
Below are two samples of using the color red, painted on hands, to create audio effects:
https://www.youtube.com/watch?v=BWf_sy87qNI
https://www.youtube.com/watch?v=ZveY_8fqh18
Demonstration: color tracking with Max Jitter
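Color tracking boils down to masking the pixels that match a target color and taking their centroid, which gives you a single point to follow. A rough Python/NumPy sketch of that idea (the thresholds and the tiny test frame are invented; a Jitter patch does the equivalent with matrix operators):

```python
import numpy as np

def track_color(frame_rgb, min_r=200, max_gb=80):
    """Return the (row, col) centroid of pixels that look 'red':
    high red channel, low green and blue."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    mask = (r >= min_r) & (g <= max_gb) & (b <= max_gb)
    if not mask.any():
        return None                      # no red in view
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

frame = np.zeros((5, 5, 3), dtype=np.uint8)
frame[2, 3] = (255, 0, 0)                # a single red "hand"
print(track_color(frame))                # (2.0, 3.0)
```

The centroid point can then be mapped to an audio parameter, exactly as in the red-hands videos above.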
Kinect Sensor Cameras
Basic Kinect with movement:
https://www.youtube.com/watch?v=z9hofZb3zyU
https://www.youtube.com/watch?v=h3ZVbeXWq7g
https://www.youtube.com/watch?v=ISKV1BeB3pM
More advanced samples:
Claire Bardainne & Adrien Mondot (https://www.am-cb.net/) (https://vimeo.com/145201272)
Zach Lieberman has a lot of wonderful artworks; have a look at his website:
Here a Kinect is used to control lighting effects in real time (combine this with the heartbeat monitor!)
https://www.youtube.com/watch?v=Y4CsZrIhWzo
Demonstration: Kinect sensor and drawing with Max Jitter and Windows
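To draw with a Kinect, you take a tracked joint position in the sensor's coordinate space and map it onto screen or canvas pixels. A hedged sketch of that mapping, assuming normalized joint coordinates in the range -1..1 with the origin at the center of view (real Kinect SDKs report their own units, so treat this as the idea only):

```python
def joint_to_screen(x, y, width=1920, height=1080):
    """Map a normalized joint position (x, y in -1..1, origin at
    center) to pixel coordinates on a width x height canvas."""
    px = int((x + 1) / 2 * (width - 1))
    py = int((1 - y) / 2 * (height - 1))  # flip y: screen grows downward
    return px, py

print(joint_to_screen(-1.0, 1.0))  # top-left corner: (0, 0)
print(joint_to_screen(1.0, -1.0))  # bottom-right corner: (1919, 1079)
```

In the Max/Jitter demonstration, the same scaling happens in the patch before the hand position drives the drawing.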
Physical Computing involves interactive systems that can sense and respond to the world around them.
Heartbeats are always an interesting way to make the unseen physical, as we saw in the Sandro Masai video.
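To turn a heartbeat into something a patch can respond to, you typically convert the time between detected beats into beats per minute. A small Python sketch of that arithmetic (the beat timestamps here are invented, standing in for what a pulse sensor on an Arduino might report over serial):

```python
def bpm_from_beats(beat_times):
    """Estimate beats per minute from a list of beat timestamps
    (in seconds), using the average inter-beat interval."""
    if len(beat_times) < 2:
        return None                      # need at least two beats
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

print(bpm_from_beats([0.0, 1.0, 2.0, 3.0]))  # one beat per second -> 60.0
```

That single BPM value is what you would map to tempo, light intensity, or any other parameter in performance.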
Arduino, the staple of Physical Computing
Sound artist Dafna Naphtali uses gesture and voice with Wii remotes and robots:
https://dafna.info/robotica/
The hardware and software mirrors of artist Daniel Rozin are clear and entertaining examples of human computer interaction.
In this video, it’s interesting to watch how people interact with the works.
Here, Danny Rozin discusses one of his mirrors from his studio.
Interactive Lighting with DJ equipment
HARDWARE:
Demonstration of Interactive Lighting with DMX and Max
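At the data level, DMX is simple: one "universe" is 512 channels, each holding a byte from 0-255 (a dimmer level, a color value, and so on), refreshed continuously to the fixtures. A minimal Python sketch of that model (the channel assignments are hypothetical; in the demonstration, Max fills these values and a USB-DMX interface sends them out):

```python
def make_dmx_universe():
    """A DMX512 universe is just 512 byte-sized channel values."""
    return bytearray(512)

def set_channel(universe, channel, value):
    """Channels are numbered 1-512; values are clamped to 0-255."""
    if not 1 <= channel <= 512:
        raise ValueError("DMX channels run from 1 to 512")
    universe[channel - 1] = max(0, min(255, value))

uni = make_dmx_universe()
set_channel(uni, 1, 255)   # say, a dimmer at full
set_channel(uni, 2, 128)   # say, a red LED at half
print(uni[0], uni[1])      # 255 128
```

Interactive lighting is then just deciding which sensor values (motion, heartbeat, color position) write into which channels.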
The Coded Gaze: Face Tracking
Tracking bodies and color is the groundwork for artificial intelligence and machine learning. Face tracking is ubiquitous these days and, disturbingly, embedded with algorithmic bias. Joy Buolamwini (the Poet of Code) uses face tracking and art as research to illuminate the social implications of artificial intelligence.
Joy’s website is here for more exploration and information on her artwork: The Aspire Mirror. Joy’s work with students using face tracking to make video graffiti and projection mapping is documented here.
In-Ear sound monitors
These systems can be a little pricey; however, you want a good system when using them in performance!
Here is a link to an in-ear monitor system
Singers and musicians use in-ear monitors for pitch information and more.
For Annie Dorsen’s piece Yesterday Tomorrow, in-ear monitors provided a metronome and pitches for the singers while they performed live sight-reading of a musical score.
Watch an example here:
https://vimeo.com/194698057
More info on that project here:
https://anniedorsen.com/projects/yesterday-tomorrow/
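The pitch and metronome cues such a system delivers reduce to two small formulas: equal-tempered frequency from a MIDI note number, and evenly spaced click times from a tempo. A Python sketch of both (this is the general math, not a claim about how the Yesterday Tomorrow system was built):

```python
def midi_to_hz(note):
    """Equal-tempered pitch: MIDI note 69 = A4 = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def click_times(bpm, beats):
    """Timestamps (in seconds) of metronome clicks at a given tempo."""
    return [i * 60.0 / bpm for i in range(beats)]

print(midi_to_hz(69))        # 440.0 (A4)
print(click_times(120, 4))   # [0.0, 0.5, 1.0, 1.5]
```

A pitch reference tone plus clicks at these times is, in essence, what the singers hear in their ears while sight-reading.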