The photon detector has arrived from IDQ, our new partner in Geneva. They have graciously provided us with this very complex and sensitive piece of equipment. Perhaps this will be the first stage performance to use the arrival of photons (light in its particle form rather than its wave form) as an active, participating element. The 10-nanosecond signals it passes on from a filtered light source (the candle) will give the rhythm of the music, influence visual effects, and maybe even determine the length of the show.
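To give a flavour of how photon arrivals could set a musical rhythm: the 10 ns pulses themselves are far too fast to hear, but the *intervals between* detections of a dim, filtered source are on a human timescale. Here is a minimal sketch of that idea; the function name, window size, and timestamp format are my assumptions for illustration, not part of our actual patch:

```python
import statistics

def pulses_to_tempo(arrival_times_s, window=16):
    """Estimate a tempo (in BPM) from recent photon arrival times.

    arrival_times_s: increasing timestamps (seconds) of detection
    events, as a counting card might report them. We average the
    gaps between the last `window` detections and read the mean
    detection rate as a beat rate. (Hypothetical mapping.)
    """
    recent = arrival_times_s[-window:]
    if len(recent) < 2:
        return None  # not enough events yet to infer a rhythm
    gaps = [b - a for a, b in zip(recent, recent[1:])]
    mean_gap = statistics.mean(gaps)  # seconds per detection
    return 60.0 / mean_gap            # detections per minute

# Detections every half second would read as 120 BPM.
tempo = pulses_to_tempo([0.0, 0.5, 1.0, 1.5])
```

Averaging over a window is one way to smooth the randomness of individual photon arrivals while still letting the candle's behaviour drift the tempo over time.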
Sometimes, when there is no video image projected and the light is cared for, the origami and the space give a feeling of peace.
Open the space around the zenith ground projection
Stitching two images together into one for the zenith-type projection.
Vincent working on the code sofa.
This image shows the basic technical setup in terms of systems integration.
Five computers (maybe fewer if we can get the perfect, most powerful one we need), all constantly exchanging information through OSC. The info will come from two Kinect 3D cameras, two multi-axis wireless bracelets, various internal sound-calculation processes (Live and Max), and the processes in Unity (the 3D game engine that we are going to use live on stage). EBA, a sequencer for MaxMSP written by Yacine Sebti, will serve as an interchange for all info to all systems. Six video projectors, with info being sent and shown even in HTML5. Six-channel sound with playback and interactive, generative sound manipulation… etc., etc… And I almost forgot: the signal from a photon detector (if we can find one; if you have any ideas, please contact me).
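To give a flavour of how one interchange can fan info out to all the systems, here is a minimal in-process sketch of the publish/subscribe pattern behind this kind of setup. In the real rig the messages travel as OSC packets over the network between machines (via a library such as python-osc or Max's udpsend/udpreceive); every name here (`Interchange`, the address strings) is invented for illustration and is not EBA's actual API:

```python
class Interchange:
    """Toy stand-in for an EBA-style hub: each system registers the
    OSC-like address prefixes it cares about, and every incoming
    message is delivered to all matching subscribers."""

    def __init__(self):
        self.subscribers = []  # list of (address_prefix, callback)

    def subscribe(self, prefix, callback):
        self.subscribers.append((prefix, callback))

    def send(self, address, *args):
        # Fan the message out to every subscriber whose prefix matches.
        for prefix, callback in self.subscribers:
            if address.startswith(prefix):
                callback(address, *args)

hub = Interchange()
log = []
# Pretend Unity listens for Kinect data, Max for bracelet data.
hub.subscribe("/kinect", lambda addr, *a: log.append(("unity", addr, a)))
hub.subscribe("/bracelet", lambda addr, *a: log.append(("max", addr, a)))

hub.send("/kinect/1/skeleton", 0.4, 1.2)      # reaches only the Unity handler
hub.send("/bracelet/2/accel", 0.1, 0.0, 9.8)  # reaches only the Max handler
```

OSC's slash-separated address space makes this prefix routing natural, which is one reason it suits a show where many heterogeneous systems need to eavesdrop on the same streams.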
There are also six DMX-controlled winches that will move our mobile origami screen scenery, which will itself be interacting with my body through the other systems.
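As a rough idea of what driving those winches involves: DMX512 sends a universe of 512 byte-valued channels, so six winch positions become six (or more) bytes in one frame. The one-channel-per-winch layout below is my simplifying assumption; a real winch controller's DMX chart would dictate the actual mapping, and actually transmitting the frame (over Art-Net, an Enttec interface, etc.) is left out:

```python
def dmx_frame(winch_heights, start_channel=1):
    """Pack winch heights (0.0 = fully up .. 1.0 = fully down) into one
    DMX512 universe, preceded by the standard 0x00 start code.
    Assumes one 8-bit channel per winch; real rigs often use several
    channels per motor (coarse/fine position, speed)."""
    universe = bytearray(512)
    for i, height in enumerate(winch_heights):
        level = max(0, min(255, round(height * 255)))  # clamp to 0-255
        universe[start_channel - 1 + i] = level
    return bytes([0x00]) + bytes(universe)  # start code + channel data

# Six winches at various heights, starting on channel 1.
frame = dmx_frame([0.0, 0.25, 0.5, 0.75, 1.0, 0.5])
```

A controller would rebuild and retransmit this frame continuously (DMX refreshes up to ~44 times per second), so the origami screens can follow the performer's body in near real time.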