Hand tracking in VR is a very appealing feature: no need to learn the placement and functions of controller buttons anymore. How convenient and enjoyable would it be to move your real and virtual hands in sync through your HMD? Just a move of your hand and voilà! Real-time hand tracking has a lot of potential for both casual and professional VR, especially because it replaces the controller and makes the VR experience even more immersive.
The thing is: our hands are a bit too much for current VR systems. They grab objects, point at things, type… There is even an entire language made of hand signs! In short: our hands are too nimble and too fast for computers to understand them properly… Or are they?
This article will give you an overview of what a hand tracking system in VR is, how it will help your industry and how it works:
- What is VR hand tracking?
- How will hand tracking in VR help your industry?
- How does VR hand tracking work?
What is VR hand tracking?
Hand tracking in VR means you can use your hands as input, instead of your controllers. No need to touch a button or type a command. However, hands are hard to track: they can come in many shapes and sizes, take many poses, and make fast and complex movements in various degrees of freedom (DoFs).
Keep in mind that hand tracking and gesture recognition are not the same thing. Gesture recognition technologies can only recognize a few specific gestures (a thumbs up, a wave…), while hand tracking uses the full range of hand movements to interpret the human-machine interaction (clicking a button, grabbing a tool…).
Hand tracking in VR and finger tracking in VR are different as well. Hand tracking only needs a few landmarks to track movements (like the knuckles, and the fingertips of the index finger and thumb). Finger tracking tracks all of your fingers' movements, which is very useful for specialized tasks like typing on a virtual keyboard.
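To make the distinction concrete, here is a minimal sketch in Python. The landmark names are hypothetical, chosen for the example; they do not come from any real tracking SDK:

```python
# Hypothetical landmark sets, to illustrate the difference in granularity
# between hand tracking and finger tracking.
HAND_TRACKING_LANDMARKS = {
    "wrist", "thumb_tip", "index_tip",
    "index_knuckle", "middle_knuckle", "ring_knuckle", "pinky_knuckle",
}

# Finger tracking also needs the intermediate joints and tips of every finger
# (mcp/pip/dip are the knuckle, middle and end joints of a finger).
FINGER_TRACKING_LANDMARKS = HAND_TRACKING_LANDMARKS | {
    f"{finger}_{joint}"
    for finger in ("thumb", "index", "middle", "ring", "pinky")
    for joint in ("mcp", "pip", "dip", "tip")
}

print(len(HAND_TRACKING_LANDMARKS), len(FINGER_TRACKING_LANDMARKS))  # → 7 25
```

A real skeleton model typically tracks around two dozen joints per hand; the point is simply that finger tracking multiplies the number of landmarks, and therefore the amount of data the system must resolve per frame.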
How will hand tracking in VR help your industry?
Do you remember when touchscreen technology came out? Suddenly, we (mostly) didn't need remotes or buttons on our TVs, our cars and phones became "smart", and interaction got much faster and more intuitive. Now, imagine that you don't even need the instrument panel as a physical interface: you just need hand gestures (and sometimes voice commands, but that's another subject).
Remember that hand-tracking in VR is not just a gadget. For some industrial tasks, like ergonomics studies, tracking a user's movements is a critical feature (whether you track the full body, the hand or the fingers). VR controllers alone can only reproduce primitive interactions, like targeting, selecting or activating an object and manipulating it. These interactions are limited and unnatural: you'll always find yourself trying to remember which little button enables you to do this or that. And this is problematic, because it will probably introduce design bias into your product conception. Hand-tracking enables you to use VR more finely and intuitively, and to involve non-expert users in the design process.
Project review in VR
VR is a powerful tool for design reviews. It enables engineers to view their project at real size and interact with it, without relying on a physical prototype. Now, imagine that you don't even need controllers to manipulate your product. Neat, right? You also get a much better sense of proportions, and of how people are likely to interact with the final product.
Maintenance training in VR
Whether the work environment is hostile to workers, or the equipment involves a lot of bespoke adaptations, maintenance staff will require specific training. They won't have VR controllers in their hands when they perform maintenance operations in real life. So it is more effective to train them with hand-tracking in VR.
Ergonomics studies in VR
Being immersed in a workstation in VR is one thing: you can check whether everything is within hand's reach. With hand tracking, you can also grab and manipulate everything with actual virtual hands, which enables you to conduct precise ergonomics studies of your workstations.
Automotive design review in VR
With hand-tracking in VR, you can interact with all the elements inside a vehicle and make a precise review of your car 3D model's ergonomics. You can check that all commands are accessible by hand, and verify the efficiency and safety of the virtual prototype. If you're using TechViz fusion, you can also combine the ergonomics review with a driving scenario, and track the user's hand movements and timing.
Worker training in VR
Training workers on a set of tasks will always be more effective in VR, because they are immersed in the training scenario. Adding hand-tracking is the cherry on top: they are more focused and they learn better. Adding haptics (if you're using VR gloves) can even help them feel the virtual objects they interact with.
How does VR hand tracking work?
Here’s how hand recognition in VR works:
- The devices detect hand movement
- The input is segmented to find the hand pose
- The input is analyzed by the VR software
- When the VR software interprets a gesture, it is rendered in real time in the virtual environment. If the user interacts with another object in VR, the software triggers the correct reaction
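The four steps above can be sketched as a toy pipeline. Everything here is illustrative: the function names, the fake "depth frame", and the centroid-as-landmark shortcut are assumptions made for the example, not a real SDK API:

```python
def capture_frame():
    # Step 1 stand-in for a camera/glove read: a fake 10x10 depth frame
    # where the "hand" is a blob of 1s.
    return [[1 if 2 <= r <= 5 and 3 <= c <= 6 else 0 for c in range(10)]
            for r in range(10)]

def segment_hand(frame):
    # Step 2: isolate the pixels belonging to the hand.
    return [(r, c) for r, row in enumerate(frame)
            for c, v in enumerate(row) if v]

def estimate_pose(hand_pixels):
    # Step 3: reduce the segmented region to landmarks.
    # Here we keep a single landmark (the centroid of the blob); a real
    # system would estimate a full hand skeleton.
    n = len(hand_pixels)
    return {"palm_center": (sum(r for r, _ in hand_pixels) / n,
                            sum(c for _, c in hand_pixels) / n)}

def interpret_and_render(pose, scene_objects):
    # Step 4: map the pose onto the virtual scene and trigger the
    # interaction (here, "grab" any object within 1 unit of the palm).
    pr, pc = pose["palm_center"]
    return [name for name, (r, c) in scene_objects.items()
            if abs(r - pr) < 1 and abs(c - pc) < 1]

pose = estimate_pose(segment_hand(capture_frame()))
print(interpret_and_render(pose, {"lever": (3.5, 4.5), "button": (9, 9)}))
# → ['lever']
```

In a real system this loop runs many times per second, which is why fast segmentation and pose estimation matter so much.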
How are hand movements captured by the VR hardware?
The ideal hand tracking system in VR would require nothing but the human hand. As of today, it is still a bit more complicated than that, but we're getting there. For now, you will mostly find two types of systems for capturing hand movements:
- Device-based hand tracking that uses gloves and/or wearable cameras
- Camera-based hand tracking that relies on one or several cameras, with or without markers to help with detection
Current hand tracking devices are far from perfect. Gloves and wearable cameras can hinder natural movements. Camera systems can have a hard time distinguishing individual fingers and hand poses; they can lose tracking during fast movements or under changing lighting conditions, and they are susceptible to occlusion (especially when collaborating in VR in a CAVE or in front of a PowerWall).
What VR headsets support hand tracking?
Not all headsets support hand tracking natively. Only the Oculus Quest has offered controller-free hand tracking natively since May 2020, and it is limited to some built-in VR games. The Oculus Rift S will probably be equipped with a similar feature. However, Facebook considers hand-tracking better suited to casual VR.
For those who don’t own a Quest or a Rift S, the alternatives are proprietary add-ons that require additional hardware or software, like ManoMotion or Ultraleap (added to Pimax and Varjo headsets). For Vive and Vive Pro users, HTC released an SDK in 2019, but it is still in early access.
How is the VR hand interpreted by the VR software?
Most often, VR software enabling hand tracking will use machine learning algorithms for hand pose estimation, such as:
- Discriminative approaches: image features are extracted and examined by the VR system to determine hand movements (frame by frame), then matched against a hand pose database. This database can be built with machine learning, by training the system to recognize the different hand poses.
- Generative approaches: these employ an existing 3D model of a hand and generate the pose of the model that best corresponds to the input (glove, markers or camera)
- Hybrid methods: these combine both approaches, relying on frame-by-frame tracking along with a good capability for re-initialization using a hand model
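As a toy illustration of the discriminative idea, here is a nearest-neighbor matcher against a tiny hand pose database. The landmark vectors and pose labels are made up for the example; a real system would extract thousands of features per frame with a trained model:

```python
import math

# Made-up "pose database": each entry is a flattened list of (x, y)
# landmark coordinates for a known hand pose.
POSE_DB = {
    "open_palm": [0.0, 0.0, 0.2, 1.0, 0.4, 1.1, 0.6, 1.0, 0.8, 0.9],
    "fist":      [0.0, 0.0, 0.1, 0.3, 0.2, 0.3, 0.3, 0.3, 0.4, 0.2],
    "pinch":     [0.0, 0.0, 0.3, 0.4, 0.5, 1.0, 0.7, 1.0, 0.9, 0.9],
}

def classify_pose(features):
    # Compare the extracted features against every database entry and
    # return the label of the closest match (Euclidean distance).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(POSE_DB, key=lambda label: dist(features, POSE_DB[label]))

# A noisy observation that should land closest to the "fist" entry.
observed = [0.0, 0.0, 0.1, 0.35, 0.2, 0.25, 0.3, 0.3, 0.45, 0.2]
print(classify_pose(observed))  # → fist
```

The generative and hybrid approaches differ in that they fit a 3D hand model to the input rather than (or in addition to) looking poses up in a database.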
If you want to go deeper on the difference between the different methods, here’s a nice paper on Generative and Discriminative models.
Once the hand pose and movement are identified, they are rendered in 3D by the VR system in real time, so that you can move your hands in sync in the real and virtual worlds.