Robot Control through Virtual Reality: A Look at the Present, Potential and Future Applications
Immersive media and entertainment aren't the only applications of Virtual Reality (VR) technology. An often-overlooked one is its capacity to provide an intuitive interface for controlling real-world devices, primarily robots. It may seem straightforward to realize, but consider what's in development and, chances are, you'll see there's a broad range of potential applications, likely arriving in the foreseeable future.

Full 6 DOF Testing from DORA on Vimeo.
There have already been plenty of demonstrations of VR technology being used as an intuitive interface. 219 Design, for instance, has shown a simple proof of concept in which they controlled a small, 3D-printed robotic arm through VR. A virtual image, or 'proxy,' of the arm is displayed through a VIVE headset, which the viewer can 'grab' and 'move' around. Any movement made with the proxy is mimicked by the real arm in real time, allowing the person to 'directly' manipulate it. They can even teach it precise movements without writing any code.
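The proxy-to-arm mapping behind a demo like this can be sketched in miniature. The snippet below is a hypothetical illustration, not 219 Design's actual code: it converts a grabbed proxy position into joint angles for a simple two-link planar arm via inverse kinematics, and shows how a sequence of recorded proxy poses could be replayed as a taught movement.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Position of the VR proxy's end effector in the arm's plane (metres)."""
    x: float
    y: float


def ik_2link(target: Pose, l1: float = 0.2, l2: float = 0.2):
    """Inverse kinematics for a two-link planar arm.

    Returns (shoulder, elbow) angles in radians that place the
    arm's tip at the target pose (elbow-up solution).
    """
    d2 = target.x ** 2 + target.y ** 2
    # Clamp the cosine for numerical safety on borderline-reachable targets.
    c2 = max(-1.0, min(1.0, (d2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)))
    elbow = math.acos(c2)
    shoulder = math.atan2(target.y, target.x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow


def record_and_replay(recorded_poses):
    """'Teaching without coding': replay recorded proxy poses as joint targets."""
    return [ik_2link(p) for p in recorded_poses]
```

In a real system the joint targets would be streamed to servo controllers each frame; here they are simply returned so the mapping itself is easy to inspect.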
A more complex example is the Telesar V robot developed at Keio University in Japan, a robotic puppet controlled through a VR 'suit.' The robot not only mirrors any upper-body movements made by the pilot, but also lets them see, and even feel, objects around it; yes, that means haptic feedback is involved. In effect, the pilot can have a robotic avatar operate in an entirely different location.
There are even VR-controlled drones, like FLYBi, which was funded in 2015 through Indiegogo and lets its operator have a bird's-eye view of the world below. Countries such as Russia are also developing more militarized versions.
All these examples show only an inkling of the applications this kind of interfacing can have, with an obvious major one being remotely controlled robots in dangerous environments. The Telesar V is one example: one aim for it is use in Fukushima-style disasters, where radiation is simply too hazardous for human presence. Although robots for such scenarios already exist, none of them has the motor coordination of a person, or, in this case, of the Telesar V.
Johns Hopkins University's Computational Interactive Robotics Laboratory is also aware of this application and has been researching an Immersive Virtual Robotics Environment (IVRE). IVRE runs along similar lines to the 219 Design demonstration: a person can control a robot in real time via a virtual proxy, or work with an entirely simulated one. The latter would let people safely perform routine checks and programming on industrial robots, which are otherwise too dangerous to work with directly, while the former would be for scenarios like the Telesar V's.
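The idea of one VR proxy driving either a simulated robot or real hardware can be sketched with a simple backend interface. All class and function names below are hypothetical illustrations under that assumption, not the laboratory's actual API:

```python
from abc import ABC, abstractmethod


class RobotBackend(ABC):
    """What the VR proxy talks to; it needn't know which kind it is."""

    @abstractmethod
    def move_to(self, joint_angles):
        ...


class SimulatedRobot(RobotBackend):
    """A pure software twin, safe for training and routine program checks."""

    def __init__(self):
        self.angles = [0.0] * 6

    def move_to(self, joint_angles):
        self.angles = list(joint_angles)


class RealRobotStub(RobotBackend):
    """Would forward commands to hardware; stubbed here for illustration."""

    def __init__(self):
        self.sent = []

    def move_to(self, joint_angles):
        self.sent.append(list(joint_angles))


def handle_proxy_move(backend: RobotBackend, joint_angles):
    # The VR proxy emits the same command regardless of backend,
    # so an operator can rehearse on the simulation and then switch.
    backend.move_to(joint_angles)
```

The design point is that the operator's workflow is identical in both modes: only the backend object changes, which is what makes safe offline rehearsal on a simulated robot possible before touching the real one.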
Other applications for this interfacing include education, medicine, communication via telepresence, and more. The only limit to the possibilities is the human imagination.
These options will only become more apparent as we push the immersive envelope of VR. Plenty of industrial companies, institutes, and even hobbyists are aware of the useful applications this interfacing could have in society, and many have already demonstrated working prototypes. As VR becomes more refined and more easily accessible, we could see VR-controlled devices far more prominently in the foreseeable future.