Machine learning gives users 'superhuman' ability to open and control tools in virtual reality

Recently, researchers in Cambridge developed a virtual reality app that allows users to open and control a range of 3D modeling tools with just the movement of their hands.

Researchers from the University of Cambridge used machine learning to develop "HotGestures", an analogue of the hotkeys (shortcut keys) found in many desktop applications. HotGestures lets users build graphics and shapes in virtual reality without having to interact with menus, helping them stay focused on a task without losing their train of thought.

The idea of being able to open and control tools in virtual reality has been the stuff of movies for decades, and the researchers say this is the first time this "superhuman" ability has been made a reality. The results are published in the journal IEEE Transactions on Visualization and Computer Graphics.

Virtual reality (VR) and related applications have been hailed as game-changers for years, but outside of gaming their potential has yet to be fully realized. "Users gain some qualities when using VR, but few people are willing to use it for an extended period of time," said Per Ola Kristensson, a professor of engineering at the University of Cambridge, who led the research. Visual fatigue and ergonomic issues aside, VR doesn't really offer anything you can't get in real life.

Most desktop software users will be familiar with the concept of hotkeys: command shortcuts such as Ctrl+C for copy and Ctrl+V for paste. These shortcuts spare users from opening a menu to find the right tool or command, but they rely on the user remembering and issuing the right combination.

Kristensson and his colleagues at the Center for Artificial Intelligence wanted to take the concept of hotkeys and turn it into something better suited to virtual reality - something that doesn't rely on the user already having a shortcut in their head. Their answer, HotGestures, lets users open and control the tools they need in a 3D virtual reality environment with gestures instead of hotkeys.

"HotGestures" can help users complete operating instructions in a simple and convenient way. For example, performing a cutting motion will open the scissors tool, and performing a spraying motion will open the spray can tool. Users don't need to open menus to find the tools they need, or remember specific shortcuts. Users can seamlessly switch between different tools by performing different gestures during tasks, without having to pause their work to navigate menus or press buttons on a controller or keyboard.

"In the real world, we all interact with our hands, so it makes sense to extend this form of interaction into the virtual world," Christensen said.

In this study, the researchers built a neural-network gesture recognition system that makes predictions on an incoming stream of hand-joint data. The system is designed to recognize ten gestures related to building 3D models: pen, cube, cylinder, sphere, palette, spray, cut, zoom, copy and delete.
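To make the idea concrete, here is a minimal sketch of what such a recognizer might look like, written in Python with PyTorch. The ten gesture names come from the article; the joint count, window length, network architecture and the extra "no gesture" class are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Gesture vocabulary taken from the article; everything else here
# (joint count, window length, architecture) is an assumption.
GESTURES = ["pen", "cube", "cylinder", "sphere", "palette",
            "spray", "cut", "zoom", "copy", "delete"]

NUM_JOINTS = 26                      # assumed number of tracked hand joints
FEATURES_PER_FRAME = NUM_JOINTS * 3  # x, y, z coordinates per joint


class GestureRecognizer(nn.Module):
    """Toy recurrent classifier over a stream of hand-joint frames."""

    def __init__(self, hidden_size: int = 128):
        super().__init__()
        self.rnn = nn.GRU(FEATURES_PER_FRAME, hidden_size, batch_first=True)
        # One logit per gesture plus an extra "no gesture" class, so ordinary
        # hand movement can be rejected instead of being forced into a command.
        self.head = nn.Linear(hidden_size, len(GESTURES) + 1)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, FEATURES_PER_FRAME)
        _, last_hidden = self.rnn(frames)
        return self.head(last_hidden[-1])  # (batch, len(GESTURES) + 1)


# Example: classify a two-second window of joint data sampled at 30 Hz.
model = GestureRecognizer()
window = torch.randn(1, 60, FEATURES_PER_FRAME)   # placeholder joint data
probs = torch.softmax(model(window), dim=-1)[0]
if probs[-1] > 0.5:
    print("no gesture")
else:
    print(GESTURES[probs[:len(GESTURES)].argmax().item()])
```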

The team conducted two small user studies in which participants used HotGestures, menu commands, or a combination of the two. The gesture-based technique provided fast, efficient shortcuts for selecting and using tools. Participants found HotGestures distinctive, fast and easy to use, and a good complement to traditional menu-based interaction. The researchers also designed the system to resist false activations, meaning it correctly distinguishes command gestures from ordinary hand movements. Overall, the gesture-based system was faster than the menu-based one.
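The article does not describe how false activations are avoided, but one common approach is to act on a prediction only when it is confident and sustained. The sketch below assumes that design; the threshold, streak length and the "no gesture" label are made-up values for illustration.

```python
from collections import deque

# A minimal sketch, assuming (not taken from the paper) that a tool is only
# opened after several consecutive, confident predictions of the same gesture.
CONFIDENCE_THRESHOLD = 0.9   # assumed value; tune for the tracking hardware
REQUIRED_STREAK = 5          # consecutive confident frames before activating

_recent = deque(maxlen=REQUIRED_STREAK)


def maybe_activate(gesture: str, confidence: float):
    """Return the tool to open, or None if the motion looks like ordinary hand use."""
    if gesture == "no gesture" or confidence < CONFIDENCE_THRESHOLD:
        _recent.clear()          # an unconfident frame breaks the streak
        return None
    _recent.append(gesture)
    if len(_recent) == REQUIRED_STREAK and len(set(_recent)) == 1:
        _recent.clear()
        return gesture           # e.g. "cut" opens scissors, "spray" the spray can
    return None
```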

No commercially available VR system can currently do this, Kristensson said: "If using VR is just like using a keyboard and a mouse, then what's the point of using it? It needs to give you almost superhuman powers - something you can't get anywhere else."

The researchers have made the source code and dataset available so that designers of VR applications can incorporate it into their products.

"I hope this becomes a standard way of interacting with VR," Kristensson said. "The filing cabinet-based approach that has been used for decades no longer meets the needs of the times. We need new ways of interacting with technology, and we think this is a step in that direction. If done right, VR can work like magic."

The research was partly supported by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).


Source: blog.csdn.net/Hinyeung2021/article/details/134458119