Some months ago, a couple of colleagues at Fraunhofer ESK Munich and I duct-taped a mobile projector to a webcam, aiming at a software solution that combines the intuitive handling of physical information resources, e.g. books or papers (for research or coursework), with the benefits of online research, e.g. effective search or cross-referencing.
After some weeks of trial-and-error cowboy coding, we finally came up with a reading-lamp-like device that annotates physical objects with metadata. The augmented reality flashlight system (physically similar to MIT's SixthSense) first recognizes an object, then searches the internet for relevant data, and finally projects a seamless interface right next to the object.
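The recognize, search, project loop can be sketched as a minimal pipeline. This is an illustrative skeleton only; the function names, the `Detection` fields, and the stubbed return values are placeholders, not our actual implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Detection:
    """Position of a recognized object in the camera frame (illustrative)."""
    label: str
    x: int
    y: int


def recognize(frame) -> Optional[Detection]:
    # Placeholder: the real system matches the camera frame against
    # known objects using feature-based recognition.
    return Detection("example-book", 120, 80)


def fetch_metadata(label: str) -> dict:
    # Placeholder: the real system queries online sources for data
    # about the recognized object.
    return {"title": label}


def render_overlay(detection: Detection, meta: dict) -> str:
    # Compose the projected interface right next to the object,
    # here simply offset to the right of its frame position.
    return f"[{meta['title']}] at ({detection.x + 40}, {detection.y})"


def flashlight_step(frame) -> Optional[str]:
    """One recognize -> search -> project iteration of the loop."""
    det = recognize(frame)
    if det is None:
        return None
    return render_overlay(det, fetch_metadata(det.label))
```

In the real device, `render_overlay` would of course drive the projector rather than return a string; the string stands in for the composited interface.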
Based on OpenCV, especially its support for Speeded Up Robust Features (SURF) and motion tracking, and drawing conceptual input from projects such as "Multi-user interaction using handheld projectors", "Motion-based finger tracking for user interaction with mobile devices", "Fast 2D Hand Tracking with Flocks of Features and Multi-Cue Integration" and "Map torchlight: a mobile augmented reality camera projector unit", to name just a few, we created the system shown in the video above, working without the need for any special markers.
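The markerless recognition rests on matching local feature descriptors (such as SURF's 64-dimensional vectors) between a stored object and the live frame. A common way to filter matches is Lowe's ratio test, sketched here with plain NumPy and synthetic descriptors so it stands alone without OpenCV; the descriptor layout and the 0.7 ratio are conventional choices, not our tuned parameters:

```python
import numpy as np


def ratio_match(desc_query, desc_train, ratio=0.7):
    """Nearest-neighbour matching with Lowe's ratio test.

    A query descriptor is matched to its closest train descriptor only
    if that neighbour is clearly closer than the second-closest one,
    which discards ambiguous matches on repetitive texture.
    """
    matches = []
    for i, d in enumerate(desc_query):
        # Euclidean distance from this query descriptor to every
        # train descriptor.
        dists = np.linalg.norm(desc_train - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches


# Synthetic demo: 64-dimensional descriptors, query descriptors are
# slightly perturbed copies of the first three train descriptors.
rng = np.random.default_rng(0)
train = rng.normal(size=(10, 64))
query = train[:3] + 0.01
```

With this setup, `ratio_match(query, train)` recovers the pairs `(0, 0)`, `(1, 1)`, `(2, 2)`, since each perturbed copy is far closer to its original than to any other descriptor.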
Even though this is work in progress and the software is pre-alpha, we'll be presenting the system, and primarily its interaction technique, at UbiComp 2009 in October to share some thoughts about it and to discuss ideas for several application areas.
Posted in: Project, Video