For the first step of my project, I had to learn how to use the Kinect, since it is the device the game is played with. I had to connect the Kinect to the computer and use software that tracks the body and communicates that data to Max/MSP.
First of all, I managed to do the bodytracking with Synapse. This way joints such as elbows, hands, knees were being tracked. The link below is telling all about what to do to get Kinect track your body:
Later, I had to learn how to transfer the information from these tracked points to Max/MSP. Synapse sends this information to Max over the OSC protocol. OSC (Open Sound Control) is a content format for messaging among computers, sound synthesizers, and other multimedia devices, optimized for modern networking technology. To receive these messages from Synapse, I used the 'udpreceive' object. Once it was in place, I could see the numbers in the Max window. Again, the link below was a useful reference:
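To make the OSC step more concrete, here is a minimal Python sketch of what 'udpreceive' does with each incoming packet: an OSC message is a null-padded address string, a type-tag string, and big-endian float arguments. The address and values below are placeholders for illustration, not necessarily Synapse's exact message names.

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message: address, type tags, then float arguments."""
    def read_padded_string(buf, offset):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        # OSC strings are null-terminated and padded to a 4-byte boundary
        offset = end + 1
        offset += (-offset) % 4
        return s, offset

    address, offset = read_padded_string(data, 0)
    tags, offset = read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
    return address, args

def pad(s: bytes) -> bytes:
    """Null-pad a string to the next 4-byte boundary (at least one null)."""
    return s + b"\x00" * (4 - len(s) % 4)

# Simulate a joint-position message like the ones Synapse streams out
# (address name and coordinates here are made up for the example):
packet = (pad(b"/righthand_pos_body")
          + pad(b",fff")
          + struct.pack(">fff", 120.0, -35.5, 800.0))
address, coords = parse_osc_message(packet)
print(address, coords)
```

In the real patch, 'udpreceive' does this decoding for you and outputs the address followed by the three coordinates as a Max message.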
Since only the right and left hands would be enough for my game, I didn't need to use all of the body point coordinates. I then used the 'unpack' object to separate these numbers into x, y and z coordinates.
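The 'unpack' step is just splitting a three-element list into its parts. As a quick sketch (the numbers are placeholders):

```python
# A tracked joint arrives as a list of three floats; 'unpack f f f' in Max
# routes them to separate outlets. In Python this is tuple unpacking:
position = [120.0, -35.5, 800.0]  # placeholder hand coordinates
x, y, z = position
print(x, y, z)
```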
My aim was to get a value of 1 when a hand was in a certain area of the screen, so that I could use this 1 as the trigger for my animation. For this, I used the four corners of the screen. Both hands had to give a 1 when they were in that specific corner, and 0 otherwise. That's why I used the 'split' object and created specific coordinate intervals, then connected it to the 1 and 0 messages. Finally, I used the '&&' object, whose job is to act as a logical 'and': if both values (x and y) are 1, the number object outputs 1 too.
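The 'split' plus '&&' chain can be sketched as plain logic. The interval bounds below are placeholder values, not the actual coordinates from my patch:

```python
def in_interval(value, low, high):
    """Like Max's 'split' object: 1 if the value falls in [low, high], else 0."""
    return 1 if low <= value <= high else 0

def corner_trigger(x, y, x_range, y_range):
    """Like feeding two 'split' results into '&&': both coordinates
    must be inside their interval for the trigger to fire."""
    return in_interval(x, *x_range) & in_interval(y, *y_range)

# Example: a (made-up) top-left corner region
print(corner_trigger(10, 20, x_range=(0, 100), y_range=(0, 100)))   # hand in corner
print(corner_trigger(150, 20, x_range=(0, 100), y_range=(0, 100)))  # hand outside
```

One such check per corner, each with its own intervals, gives the four triggers for the animation.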
P.S. Since I didn't need the z values yet, I didn't connect them to anything; I left them there in case I need them in the future.
Here is how my patch looks: