In a previous blog post, I talked about DynViz, our awesome Processing.org-based visualization tool for viewing worldwide DNS queries. Since then, we have made some pretty awesome modifications to it, not only to support our ever-growing stable of worldwide data centers but also to incorporate cutting-edge technology and increase ease of use.
A brief recap
The tool itself was created using the Java-based Processing.org framework, displaying a 3D globe which can be rotated to view any region, with each DNS packet's origin color-coded to the global datacenter filling the request. The actual data can be fed from a stored pcap capture file (generated by a tool like Wireshark) or from a more useful (and cooler) real-time data stream sent from the recording facilities themselves.
We include a sample Perl script which will add the geolocation to the packets before sending them to the DynViz tool. If you want a more in-depth look at the how and the setup, check out our original post.
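The actual project ships a Perl script for this step, but the idea is simple enough to sketch in Java. Everything here is illustrative, not the project's code: the `GeoTagger` class name, the in-memory lookup table standing in for a real GeoIP database, and the `ip,lat,lon` output format are all assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the geolocation step: tag each DNS query's
// source IP with a lat/lon pair before streaming it to DynViz.
// The lookup table below stands in for a real GeoIP database.
public class GeoTagger {
    private final Map<String, double[]> geoDb = new HashMap<>();

    public GeoTagger() {
        // Example entry only (RFC 5737 documentation address, Tokyo coords).
        geoDb.put("203.0.113.7", new double[]{35.68, 139.69});
    }

    // Returns "ip,lat,lon" for known IPs, or "ip,0.0,0.0" when unknown.
    public String tag(String srcIp) {
        double[] ll = geoDb.getOrDefault(srcIp, new double[]{0.0, 0.0});
        return srcIp + "," + ll[0] + "," + ll[1];
    }
}
```

In practice the Perl script does this against a real geolocation database before the packets ever reach the visualization.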
The new funness
(Yeah, I know that’s not a word. Neither was Twitter 10 years ago!)
The first thing that was added was support for the new datacenters, bringing us to 17 worldwide. It was tough finding enough unique colors for that many, but it was done and now the regions are properly accounted for on the lovely spinning globe.
The other change was much bigger, and a lot more fun from a design/develop/test standpoint: the ability to control your DynViz globe via Microsoft Kinect.
You can now stand in front of the Kinect hooked up to your DynViz system, grab hold of the globe and turn it to view DNS queries on any side of the world, or just give it a spin for some fun.
The technical challenges of first getting the Kinect to run on a PC and then getting it recognized within Processing were dealt with using the really cool OpenNI framework, the drivers and middleware from PrimeSense, and the SimpleOpenNI library for Processing.
As long as you run the correct installs in the correct order and use compatible versions, it all goes together smoothly.
(Don’t worry: that order and those versions are in the README for the project on GitHub.)
Once the Kinect setup was accomplished, the next challenge was translating the movements into something useful and worthwhile for controlling the globe. Just haphazardly spinning the globe was useless, but being able to look at regions by moving the globe, stop its spin, and zoom in or out were all helpful when trying to glean information from it.
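The mapping itself boils down to scaling hand-position deltas into globe transforms. This is a minimal sketch of that idea, not the actual DynViz code: the `GlobeControl` class, the sensitivity constants, and the method names are all assumptions.

```java
// A minimal sketch (not DynViz's actual code) of mapping tracked hand
// coordinates onto globe rotation and zoom. Sensitivity values are
// made up for illustration.
public class GlobeControl {
    static final float ROTATE_SENSITIVITY = 0.005f; // radians per mm of hand travel
    static final float ZOOM_SENSITIVITY   = 0.01f;  // scale change per mm of push

    float rotationY = 0f; // current globe rotation (radians)
    float zoom      = 1f; // current zoom factor

    // Horizontal hand movement spins the globe around its vertical axis.
    void onHandMove(float dxMillimeters) {
        rotationY += dxMillimeters * ROTATE_SENSITIVITY;
    }

    // Pushing toward the sensor zooms in; pulling back zooms out,
    // clamped so the globe never inverts or vanishes.
    void onHandPush(float dzMillimeters) {
        zoom = Math.max(0.1f, zoom + dzMillimeters * ZOOM_SENSITIVITY);
    }
}
```

In a Processing sketch, `rotationY` and `zoom` would then feed straight into `rotateY()` and `scale()` calls in the draw loop.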
To do this, some of the built-in circle and push gesture functions were used, and for everything else, NITE hand tracking was perfect for mapping coordinates, ultimately allowing open versus closed hands to be distinguished for grasp detection.
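Grasp detection can be thought of as a tiny per-frame state machine: the globe only follows the hand while it is closed, and opening the hand releases it. The sketch below is an assumption about how that might look, not the project's logic; the `GraspTracker` name and the gain constant are hypothetical.

```java
// A toy grasp-detection state machine (an assumption, not DynViz's
// actual logic): hand movement only turns the globe while the hand
// is classified as closed; opening the hand releases the globe.
public class GraspTracker {
    private boolean grasping = false;
    private float lastX = Float.NaN;

    // Feed each frame's open/closed classification and hand x position (mm).
    // Returns the rotation (radians) to apply this frame; 0 when not grasping.
    public float update(boolean handClosed, float handX) {
        float delta = 0f;
        if (handClosed && grasping && !Float.isNaN(lastX)) {
            delta = (handX - lastX) * 0.005f; // assumed gain, radians per mm
        }
        grasping = handClosed;
        lastX = handX;
        return delta;
    }
}
```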
Overall, I am sure I will be constantly tweaking and modifying the code (what can I say, I am never quite satisfied), but have fun trying out this new interface, and let us know if you extend it to do anything cool and creative yourself!