This project page was constructed mostly in a diary-like format. The project was abandoned after it became apparent that I wasn't making any art; I was only troubleshooting a device which ate dollar bills of effort and spat out cents of results.
I am currently experimenting with EyesWeb and Processing to develop a motion tracking system. It will obviously not be of the highest order, but it will be cheap compared with the immediate alternatives. Source files for this project are below. The pipeline works like this: video is captured from a camera and fed into EyesWeb, which extracts coordinates from the feed and sends them via OSC (Open Sound Control) to another PC running Processing. Processing decodes the string of numbers it receives and creates a graphical display. Also included below is a document on how I set up communication between the two computers.
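As a sketch of the decoding step, here is a minimal parser for a raw OSC message containing float arguments. This is an assumption about the wire format: OSC messages carry a null-terminated, 4-byte-aligned address string (the address "/xy" below is hypothetical, not what EyesWeb actually sends), a type-tag string such as ",ff", then big-endian 4-byte floats. A real sketch would more likely use a Processing OSC library, but this shows what the decoding amounts to.

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class OscDecode {
    // Read a null-terminated OSC string, then skip padding so the total
    // bytes consumed (string + nulls) is a multiple of 4.
    static String readOscString(ByteBuffer buf) {
        StringBuilder sb = new StringBuilder();
        byte b;
        while ((b = buf.get()) != 0) sb.append((char) b);
        while (buf.position() % 4 != 0) buf.get();
        return sb.toString();
    }

    // Decode the float arguments of one raw OSC message packet.
    static List<Float> decodeFloats(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet); // OSC is big-endian, the ByteBuffer default
        String address  = readOscString(buf);     // e.g. "/xy" -- address is an assumption
        String typeTags = readOscString(buf);     // e.g. ",ff"
        List<Float> args = new ArrayList<>();
        for (int i = 1; i < typeTags.length(); i++) {
            if (typeTags.charAt(i) == 'f') args.add(buf.getFloat());
        }
        return args;
    }
}
```

In practice the receiving sketch would read these packets from a UDP socket and map the decoded floats to screen coordinates.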
Any suggestions are welcome.
Update: I have tested the sending patch and ironed out its issues; the example above does not work. The Processing receiver GUI has the added functionality of creating a shell around the path and exporting it to an .obj file. Because we are working in 3DMax, I have unfortunately had to use a limited .obj import plugin that does not support splines, so I have resorted to boxes.
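For the curious, the box-based .obj export amounts to writing one small cube per path point in Wavefront OBJ text. The helper below is a hypothetical reconstruction, not the actual exporter: it emits 8 `v` (vertex) lines and 6 `f` (quad face) lines per cube, with the 1-based global vertex indices OBJ requires.

```java
import java.util.Locale;

public class ObjBoxes {
    // Emit one axis-aligned cube of half-size s centred at (x, y, z).
    // base = number of vertices already written (OBJ indices are 1-based and global).
    static String cube(float x, float y, float z, float s, int base) {
        StringBuilder sb = new StringBuilder();
        int[][] corners = {{-1,-1,-1},{1,-1,-1},{1,1,-1},{-1,1,-1},
                           {-1,-1,1},{1,-1,1},{1,1,1},{-1,1,1}};
        for (int[] c : corners)
            sb.append(String.format(Locale.US, "v %f %f %f\n",
                x + c[0]*s, y + c[1]*s, z + c[2]*s));
        // Bottom, top, and four side quads of the cube.
        int[][] faces = {{1,2,3,4},{5,8,7,6},{1,5,6,2},{2,6,7,3},{3,7,8,4},{4,8,5,1}};
        for (int[] f : faces) {
            sb.append("f");
            for (int i : f) sb.append(' ').append(base + i);
            sb.append('\n');
        }
        return sb.toString();
    }
}
```

Calling `cube(...)` once per captured point, incrementing `base` by 8 each time, and concatenating the results gives a valid multi-box .obj file.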
Update (31/10/05): I have succeeded in a working capture session. I found I was successful so long as the lights were off and the capture was trained on a glowstick. There seems to be an issue with loss of data on either camera: the patch is currently set to freeze the marker if any data is lost, but this results in no capture at all if tracking is lost on one of the cameras. I hope to iron this out. Below are some very crude 3D drawings made with this package.
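One possible fix for the dropout problem, sketched below as a hypothetical tracker (not the actual EyesWeb patch), is to hold the last valid reading per camera rather than freezing the whole marker when either feed drops out. The "triangulation" here is a deliberate stand-in, since I don't know how the real patch combines the two views.

```java
public class MarkerTracker {
    // Last valid (x, y) reading from each camera; null until first seen.
    float[] lastA = null, lastB = null;

    // camA / camB are null when that camera has lost the marker.
    // Returns a 3D point, or null until both cameras have reported once.
    float[] update(float[] camA, float[] camB) {
        if (camA != null) lastA = camA.clone(); // hold last value on dropout
        if (camB != null) lastB = camB.clone();
        if (lastA == null || lastB == null) return null;
        // Stand-in for real triangulation: x/y from camera A, depth from camera B.
        return new float[] { lastA[0], lastA[1], lastB[0] };
    }
}
```

With this approach a single-camera dropout degrades the capture (the stale axis stops updating) instead of halting it entirely.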