.::..Nicholas Sagan..::. …:.::..artworks and experiments…::.:…nicholassagan@gmail.com

Digital Universe Experiments

A few years back software developers at the Hayden Planetarium developed a program called Partiview, which is the access interface to their Digital Universe Atlas.  This is a pretty amazing bit of software where you can essentially fly around the galaxy, picking and choosing what specific features to observe.  It’s really handy if you want to search out all the exoplanets or just all the recorded red dwarfs.  It is also FREE for both Mac and Windows.  Highly recommended.

When I first began exploring this program and its functionality I was struck by the ability to customize the visual environment.  Not only is the whole thing a composite of images of the Milky Way and all the constituent planets, stars, nebulae, etc., its graphic overlay system is complex and exciting!  That may sound a little schoolboy giddy but it’s true!  All of the visual components of this program blend together really nicely to create an aesthetic experience worthy of note.  This is but another example of how the visual languages of astronomy can overlap into a realm that relies on aesthetic interpretation.  Lev Manovich’s Info Aesthetics and Steven Wilson’s Information Arts discuss this form of aesthetics at length: the visualization of data can have a real-world impact on our understanding of the flow of data at certain scales, and the more aesthetic forms of visualization also invite more enjoyment and engagement, and thus learning or understanding at a deeper level.

Now this program, because of its richness and potential for aesthetic exploration and appropriation, seems like a good place to experiment with some forms of interactivity.  This would at least begin to move in the direction I wanted to go when developing Waking the Invisible, both in terms of its connection to actual science (as opposed to fiction) and its aim toward a more immersive, interactive environment than that piece achieved.  There are several ways I could go about experimenting with and building on this program.

I speak in terms of potential because there are certain thresholds to be crossed before I can really get moving on the experiments.  But before I get to the first and arguably largest obstacle, the general accessibility of the basic interface must be addressed.  As you can see in the images below, there are quite a few variables to be selected.  For my uses it is not necessary to know exactly which buttons correspond to which types of stars, but rather just how to turn certain groupings on and off.  The same goes for setting the luminosity of all the stars, toggling the grid overlay systems, and handling general perspective, positioning, and maneuvering within the environment.  All of these variables are important and necessary in a program such as this because of the vastness of the data represented.  BUT the interface is a bit clunky and, dare I say, counter-intuitive.  Then again, orientation in outer space (which is essentially isotropic without objects) is not really an intuitive thing, and in many ways it really does depend on multi-variable mediation.  As a side note, one of the more visually interesting elements is the inclusion of the Sloan Digital Sky Survey images, which add some very nice textures and colors to the environment.

Now back to the big main obstacle: access to this interface.  I’ve attempted to contact the software developers to see if there are any entry points to the atlas other than the main buttons on the user interface.  In fact, there are even a few buttons included that do not have an assigned function, which leads me to wonder what they had in mind.  My hope is that those blank spots could be turned into an OSC port, where incoming OSC data can control certain other variables within the environment.  If I can find a way to get this program to accept OSC data, I can not only make the interface more interactive, opening up the possible creation of a Cave-style immersive environment, but at the most fundamental level this program would be opened up in the best sense.
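To make the OSC idea concrete, here is a minimal sketch of what one of those control messages would look like on the wire, built with only the Python standard library.  OSC messages are just an address string, a type-tag string, and big-endian arguments, each padded to 4-byte boundaries, so they are easy to generate from any sensor or controller patch.  The address path /partiview/lum is purely hypothetical; Partiview does not (yet) listen for anything like this, which is exactly the point of the experiment.

```python
import struct

def _osc_string(s: str) -> bytes:
    # OSC strings are null-terminated and padded out to a 4-byte boundary.
    data = s.encode("ascii")
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    # Type-tag string: a comma followed by one 'f' (float32) per argument.
    typetags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", a) for a in args)
    return _osc_string(address) + _osc_string(typetags) + payload

# Hypothetical example: a packet asking Partiview to set star luminosity to 0.75.
packet = osc_message("/partiview/lum", 0.75)
```

In practice the packet would be sent over UDP to whatever port the program listens on; the encoding above is the whole protocol for simple float-valued controls.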

Thinking back to ways of continuing the development of Waking the Invisible’s interactivity, this program could be used to generate patterns for star movements, etc.  One thing I would like to test is how small bundles of fiber optics perform against the screen of an iPod or iPad.  If the screen on one of those devices can generate enough light for the contrast to be noticeable, that will be a step in the right direction.  But one of the issues is that aligning a single point of light on a screen with a single point of light at the end of a strand of fiber could prove difficult, if not impossible, since the idea of the fiber optics is to disperse the light across much larger, three-dimensional fields.  So one of the first steps in this phase of experimentation is to use a video with a high-contrast, on/off look to it.  One such video is the eclipse, which I projected onto painted surfaces in a project earlier this summer.  Eventually, I’d like to try placing a large bundle of fibers in front of a projector, which emits much more light than the screen of an iPod.  Now, if I can get Partiview to accept OSC, and get some fiber bundles to react to light from screens, then I can create an interactive experience that is representative of the Digital Atlas…if not, I could always work on an interactive abstract video to hook up to the fiber arrays.  (Keep an eye out for that this fall!)
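One way to decide whether the fiber-tip light is "noticeable" is to put a number on it: Michelson contrast, (Lmax − Lmin) / (Lmax + Lmin), is a standard measure for exactly this kind of lit-versus-unlit comparison.  A quick sketch, with purely hypothetical meter readings standing in for actual measurements off a fiber bundle:

```python
def michelson_contrast(l_max: float, l_min: float) -> float:
    # Michelson contrast: (Lmax - Lmin) / (Lmax + Lmin), ranging from 0 to 1.
    if l_max + l_min == 0:
        return 0.0
    return (l_max - l_min) / (l_max + l_min)

# Hypothetical luminance readings (cd/m^2): fiber tip lit vs. unlit.
lit, unlit = 120.0, 8.0
c = michelson_contrast(lit, unlit)  # here 0.875 -- a strong contrast
```

Anything near 1.0 means the on/off flicker should read clearly through the fibers; values closer to 0 would suggest the iPod screen is too dim and the projector test is the way to go.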

