This week I looked into integrating a Kinect camera with our openFrameworks project, focusing on best practices and limitations. The Kinect and openFrameworks work from Zachary Lieberman and Dan Wilcox helped me start understanding the process of taking camera input and turning it into usable data. Two add-on libraries are required to achieve this: OpenNI and the included ofxKinect.
However, I wasn’t able to acquire a Kinect camera for initial setup and testing. I did begin a test project and set up basic functions; the initial project setup in Xcode was seamless, with no compile errors. I intend to begin testing with actual hardware as of Friday, Jan 27. Next week I aim to start testing ways of interpreting interaction.
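As a starting point, here is a minimal sketch of what that test project skeleton might look like. This is untested without the hardware, and it assumes the standard ofxKinect addon API (`init`, `open`, `isFrameNew`, `getDistanceAt`, `drawDepth`); treat it as a rough outline rather than working code.

```cpp
// ofApp.h -- minimal ofxKinect test sketch (assumed API, not yet run on hardware)
#pragma once
#include "ofMain.h"
#include "ofxKinect.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        kinect.setRegistration(true); // align the RGB and depth images
        kinect.init();                // grab both RGB and depth streams
        kinect.open();                // open the first available device
    }

    void update() {
        kinect.update();
        if (kinect.isFrameNew()) {
            // Depth becomes the "usable data": e.g. distance in mm
            // at the center of the 640x480 depth image.
            centerDist = kinect.getDistanceAt(320, 240);
        }
    }

    void draw() {
        kinect.drawDepth(0, 0, 640, 480); // grayscale depth map
        kinect.draw(640, 0, 640, 480);    // RGB camera image
        ofDrawBitmapString("center distance (mm): " + ofToString(centerDist), 20, 500);
    }

    void exit() {
        kinect.close();
    }

private:
    ofxKinect kinect;
    float centerDist = 0;
};
```

Once hardware arrives, reading `getDistanceAt()` at tracked points should be enough to prototype simple proximity-based interaction before moving to full gesture tracking.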
The main takeaway from this week is that we need to redesign our interaction gesture so that it is easy to track while remaining natural for users to perform.
STUDIO for Creative Inquiry (CFA), “3B: Using the Kinect with OpenFrameworks,” http://artandcode.com/3d/workshops/3b-using-the-kinect-with-pure-data/