
DGP project update

A couple of weeks ago the innovative Dance and Technology group Troika Ranch did a workshop and performance here at Columbia.  If you have ever heard of the program called Isadora, this is the group that created it.  Isadora does much of what I am working with now in terms of computer vision capabilities.  I urge you to check out their website to see some of their videos of the program and performers in action.

The workshop was a very intensive way of learning the basics of Isadora…it was computer vision in 3 hours.  While we didn’t really get into the nitty-gritty of what the program can do, we did look at some of the simpler things like frame differencing, background subtraction and the like.  Those are pretty standard functions in many programs with computer vision capabilities built in.  One of the biggest differences with Isadora is that the interface is modeled on the theater-tech style of production, where you have scenes, actors and transitions.  Each “scene” contains a set of “actors” (frame differencing, video players and so on) and you can move back and forth between different sets.
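For anyone curious what frame differencing and background subtraction actually do under the hood, here is a minimal Python sketch using OpenCV (my choice for illustration; Isadora and EyeCon hide all of this behind their actors and fields). Frame differencing flags pixels that changed since the previous frame, while background subtraction flags pixels that differ from a learned model of the empty scene.

```python
# Minimal sketch, assuming OpenCV is installed and a webcam sits at index 0.
# This is not how Isadora or EyeCon are built; it just shows the two techniques.
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Background subtractor: learns the static scene and flags foreground pixels.
bg_sub = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Frame differencing: pixels that changed since the previous frame.
    diff = cv2.absdiff(gray, prev_gray)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev_gray = gray

    # Background subtraction: pixels that differ from the learned background.
    fg_mask = bg_sub.apply(frame)

    cv2.imshow("frame difference", motion_mask)
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Either mask can then drive whatever reaction you want, which is essentially what Isadora’s actors and EyeCon’s fields package up for you.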

As I’ve been working through this semester with Max 5 and EyeCon (and trying to set up OSC communication between them) I’ve been spending a good deal of time just learning the interface of EyeCon.  I was already familiar with the Max environment, but EyeCon has a very different style.  First of all, it feels very “PC”-ish, which isn’t always the most user-friendly.  But that isn’t what the program is necessarily aiming for.  It’s aiming for functionality, which is pretty darn amazing.  So far in EyeCon I’ve been able to set up what are called “dynamic fields,” which detect variables in a camera’s view such as direction of movement and whether a virtual line is “broken.”  With the latter I’ve created a sort of ‘invisible optical keyboard’: a range of piano/keyboard notes is assigned to different points along a line, and as you move “through” the line, the corresponding notes play.  The first amusing moment came when I showed the project to a class.  In that demonstration I used the tilt of the studio’s floor to roll a marker ‘up the scale’ of the keyboard!
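To make the ‘invisible optical keyboard’ idea concrete, here is a rough Python sketch of the mapping I’m describing: a 0-to-1 position along the virtual line gets quantized to a note and sent as an OSC message, the same kind of message I’m trying to pass from EyeCon into Max. The package (python-osc), the OSC address, the port and the scale are all just assumptions for the example, not EyeCon’s or Max’s actual settings.

```python
# Hypothetical sketch of the 'invisible optical keyboard' mapping.
# Assumes the python-osc package; address, port and scale are illustrative only.
from pythonosc.udp_client import SimpleUDPClient

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers, one octave

client = SimpleUDPClient("127.0.0.1", 8000)  # a Max patch listening on this port

def crossing_to_note(position: float) -> int:
    """Map a 0.0-1.0 position along the virtual line to a note in the scale."""
    position = min(max(position, 0.0), 1.0)
    index = min(int(position * len(C_MAJOR)), len(C_MAJOR) - 1)
    return C_MAJOR[index]

def on_line_broken(position: float) -> None:
    """Called whenever tracking reports the line was 'broken' at this position."""
    client.send_message("/keyboard/note", crossing_to_note(position))

# e.g. a marker rolling 'up the scale' across the line:
for p in (0.05, 0.2, 0.4, 0.55, 0.7, 0.9):
    on_line_broken(p)
```

On the Max side, something like a udpreceive object feeding makenote and noteout would turn those messages into sounding notes.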

And that is really one of the main things I want to get at with this work.  There is an immediacy to the interface: someone can walk into the room, create some reaction that is recognizable and then access the content.  In the case of this work I am still sorting out the content as I work through these technological problems/projects.  Much of the projected content branches off of earlier projects that deal with astronomical space and the scale of the universe…but how do I make it fun???
