Practice with the 1-, 2- and 3-dimensional joint-distance objects that I created in Max, and see which ones are interesting for which kind of sound manipulation. Select instruments and effects to go forward with in the project. Map it all together. Test fabrics in the Kinect and start designing the costume.
KINECT TO MAX
I still had some tasks left from last week, so I explored the KinectPV2 library a bit more, but the documentation is really not the best: I found four mistakes in the first two sections of their reference tutorial. I also took a look at Kinectron, but a browser didn't seem like the best option for me, so I finally decided to get the [dp.kinect2] object. I figured I had learned all that I wanted to learn by trying to access that data myself, and the Max object is clearly more stable and capable than anything I could build in the time I have, so it was the right call.
Having bought the object, I was back where I left off at my midterm presentation. I worked on organizing the joints in a more visual way, so that all the data I could possibly need is within reach and I only have to plug in my comparison objects to do the mapping. I processed each joint's data following the patch below, which makes it easy to calibrate the minimum and maximum values and creates a variable for each joint position.
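The per-joint processing could be sketched roughly like this in JavaScript (as one might write it inside a Max [js] object). This is a hypothetical reconstruction of the patch, not the patch itself: the function name and the 0–1 output range are my assumptions, mimicking what Max's [clip] and [scale] objects would do.

```javascript
// Hypothetical sketch of the per-joint processing: clamp a raw joint
// coordinate to the calibrated min/max range, then normalize it to 0..1,
// roughly what a [clip] + [scale] pair would do in the actual patch.
function normalizeJoint(raw, calMin, calMax) {
  var clamped = Math.min(Math.max(raw, calMin), calMax); // clip to range
  return (clamped - calMin) / (calMax - calMin);         // scale to 0..1
}
```

Each joint axis (e.g. headY, torsoZ) would get its own calibrated min/max pair, and the normalized value becomes the variable used later in the mapping.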
This is a full view of the code after I organized the joints into subpatchers. The mess in the bottom left was a few mapping tests.
For the video, I basically merged in the patcher I used for Surface, stripping out the drawing part and making it more suitable for this project. Then I tried a few mapping options and new video mixes, and ended up with the following program:
With both video and sound, this is the mapping I have at the moment. Anything in square brackets is something I still have to work on. There's a lot more I could do, but the mappings listed here are the simplest system that would be interesting enough for the final version of this performance for class. [dist1], [dist2], [dist3], and ">" (representing [monitorChange]) refer to the subpatchers mentioned in the last post.
headY — bpm
torsoZ > 400 — create beats
r_handY near headY — clear sequencer
[r_handY — note volume]
[l_handY — note duration]
dist3 l_hand / r_foot > 8 — pitch
[headY — note volume]
[dist1 r_shoulderX / l_footX — note duration]
[ — reverb]
[ — delay]
dist3 r_ankle / l_ankle — pitch
waistX — feedback anchor_x
waistY — feedback anchor_y
dist1 headY r_footY — xfade
headZ — randomFrame toggle + metro
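To make the list above more concrete, here are hypothetical JavaScript sketches of the comparison subpatchers it relies on: the 1-, 2- and 3-dimensional distances, and an edge-triggered threshold like "torsoZ > 400". The function names and signatures are my assumptions for illustration; the real versions are Max subpatchers.

```javascript
// [dist1]: distance along one axis, e.g. headY vs r_footY.
function dist1(a, b) {
  return Math.abs(a - b);
}

// [dist2]: distance between two joints in a 2D plane.
function dist2(ax, ay, bx, by) {
  var dx = ax - bx, dy = ay - by;
  return Math.sqrt(dx * dx + dy * dy);
}

// [dist3]: full 3D distance between two joints given as [x, y, z].
function dist3(a, b) {
  var dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Edge detector like ">" / [monitorChange]: returns true only on the
// frame where the value first crosses above the threshold, so
// "torsoZ > 400" creates one beat per crossing, not a stream of them.
function makeCrossingDetector(threshold) {
  var wasAbove = false;
  return function (value) {
    var isAbove = value > threshold;
    var crossed = isAbove && !wasAbove;
    wasAbove = isAbove;
    return crossed;
  };
}
```

Conditions like "r_handY near headY — clear sequencer" would then just be a dist1 compared against a small tolerance.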
From the experience I had with Surface, I believe this project asks me to be as vulnerable on stage as my filmed self. For that reason, I thought working with see-through materials would be an interesting idea. Also, at Fractal Fantasy @ Sónar, Zora Jones wore a see-through bodysuit that interacted with the Kinect image, so I decided to try some materials on the camera.
Unfortunately, you have to be very close to the camera for the fabric to show any difference, so it doesn't exactly work for the technical purpose I intended, but I believe it still works conceptually, so I'll continue to develop the idea in the sketch below.
For next week my assignment is to work on the costume and concept. I wrote in the execution plan that I'd add live-looping mechanics, but I don't think I'll go down that road for this first performance. Instead, I'll focus on practicing, so the mapping is well set and I have some time to work on the movement.