Until now, input on projected images has relied mainly on proximity sensors and cameras that track where a touch lands.  Skinput takes the projected-UI idea a step further, using skin taps as the way of interacting with the interface.  An armband both projects the interface onto your hand or arm and analyzes the vibrations traveling through your skin to pinpoint where each tap was made.  Take a look at the demo; the accuracy they reach is pretty incredible, thanks to research into how vibrations propagate through soft tissue, bones and joints.  Turning your hand into a phone would be pretty cool… if only they could make the device a bit less bulky.
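To get a feel for the general idea (this is a toy sketch, not Skinput's actual pipeline): each tap produces a characteristic vibration signature at the armband's sensors, and a classifier trained on example taps maps that signature back to a location. The locations, feature vectors, and centroid values below are made-up illustrative numbers.

```python
import math

# Hypothetical "trained" centroids: average per-sensor vibration
# amplitudes recorded for taps at three example locations.
CENTROIDS = {
    "thumb": [0.9, 0.2, 0.1],
    "palm":  [0.4, 0.8, 0.3],
    "wrist": [0.1, 0.3, 0.9],
}

def classify_tap(features):
    """Return the location whose training centroid is closest
    (Euclidean distance) to the observed feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda loc: dist(features, CENTROIDS[loc]))

# A new tap whose sensor readings resemble the "thumb" profile:
print(classify_tap([0.85, 0.25, 0.15]))  # prints "thumb"
```

The real system extracts richer features from the vibration waveforms and uses a proper machine-learning classifier, but the mapping from "tap signature" to "on-skin location" follows this same pattern.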