Since the release of Microsoft's Kinect and ASUS's Xtion PRO, we've seen a lot of neat applications crop up as the community explores the new depth-sensing technology for the first time.
One such application that caught our eye is a demonstration of a holographic game engine developed by programming4fun, which combines a Kinect unit running the new official Kinect SDK with a mobile phone and a projector to produce a pseudo-holographic experience. The Kinect tracks the user's position, and the engine adjusts the in-game rendering angle accordingly to create the feeling of a holographic image; the mobile phone is used to control the game remotely.
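To give a sense of how this kind of head-coupled perspective works in principle, here is a minimal sketch of the usual approach: from the tracked head position, compute an asymmetric (off-axis) view frustum so the rendered scene stays geometrically consistent with the viewer's eye. The function name, coordinate conventions, and screen dimensions below are our own illustration, not taken from programming4fun's code; it assumes the physical screen sits in the z = 0 plane, centred on the origin, with the viewer at positive z looking toward it.

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Compute asymmetric frustum bounds for a viewer at `eye`.

    Illustrative sketch: the screen is assumed to lie in the z = 0
    plane, centred on the origin, x to the right and y up, with the
    tracked eye at (ex, ey, ez), ez > 0, looking toward -z.
    Returns (left, right, bottom, top) at the near clipping plane,
    suitable for an OpenGL-style asymmetric perspective projection.
    """
    ex, ey, ez = eye
    # Scale offsets from the screen plane down to the near plane.
    s = near / ez
    left = (-screen_w / 2 - ex) * s
    right = (screen_w / 2 - ex) * s
    bottom = (-screen_h / 2 - ey) * s
    top = (screen_h / 2 - ey) * s
    return left, right, bottom, top

# With the head centred, the frustum is symmetric; as the viewer
# moves sideways, it skews, shifting the apparent viewpoint.
print(off_axis_frustum((0.0, 0.0, 1.0), 2.0, 1.5, 0.1))
print(off_axis_frustum((0.5, 0.0, 1.0), 2.0, 1.5, 0.1))
```

In a real pipeline, the eye position would come from the Kinect's skeletal tracking each frame, and the returned bounds would feed the renderer's projection matrix, so the projected image appears anchored in space as the viewer moves.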
This concept isn't completely new and is similar to head-tracking as used in Gran Turismo 5, where the PlayStation Eye tracks the gamer's head movements and adjusts the in-game camera angle for a more immersive experience; however, we hope you'll agree that the demonstration shown in this post takes matters a step further on the coolness scale.
In theory, the same result as witnessed in the video could have been achieved with a monitor laid flat on the table, though without a flush fit, and with the bezel in the way, the experience may have been somewhat diminished. Currently such an approach can only work for one user at a time, as the image must be tailored to the individual's position; however, we'd be mighty interested to see what could be possible in the future with a specially produced parallax barrier or lenticular autostereoscopic screen, or perhaps even the more immediate application of active 3D monitors capable of displaying multiple feeds.
One thing you can't pull off with a monitor alone, and really do need a projector for, was a recent Sony advert built on the same core concept: the PlayStation Move was used to track the camera's position and adjust the angle of a projected image, resulting in something rather special, all realised in real-time.