Google pushed out an update to its MyGlass app today that turns your Android device into a screencast remote control for the wearable. The update, which also includes bug fixes, lets users control Glass with touch, swipe and tap gestures on their smartphone, so they can navigate the HUD without calling attention to their actions.
While Google has boasted that Glass will help relieve us of our dependence on the rectangular computers in our pockets, the truth is that, at least for now, controlling Glass is far more awkward and conspicuous than swiping at a smartphone. The new MyGlass update should correct this, letting users take the sneakiest photos and videos yet without giving away what they’re doing.
Besides snapping unnoticed pictures, there doesn’t seem to be much real purpose to screencasting Glass to your Android phone. Sure, controlling the new technology can take some getting used to, but once you’re using a smartphone, the HUD module becomes a redundant second screen. If anything, the new update is designed to help users get their footing with Glass, turning your handset into a set of training wheels for the wearable technology.