Industry 4.0 and the digital revolution have brought many innovations, new methodologies and tools to the manufacturing industry. Virtual Reality technologies enable businesses to design products and improve their production processes by simulating 3D prototypes in a virtual environment. Besides saving time and resources, this approach also makes it possible to optimize manufacturing lines and improve workplace ergonomics.
Designing workplaces with VR enables a human-centered approach to their design. With VR simulation and full body tracking, engineers can validate their designs through direct observation of the worker's movements. Find out how full body tracking can improve the ergonomics of your industry in this article:
Full Body Tracking (or motion tracking) is a technology that collects information about the user's posture and movements and converts it into reproducible posture data. In most VR applications, the user's head and hands are already tracked by the headset and controllers. The most common types of body tracking technologies are:
All these tracking technologies are different and require specific hardware and setup. This is why, in other resources, you will see a clear distinction between full body tracking, hand tracking and eye tracking.
Curious about how VR Full Body Tracking could fit easily into your processes? Take a look at our workstation ergonomics Case Study to discover why companies like GMI and AGCO Corp chose us to improve workstation ergonomics on any AR / VR system.
The full-body experience is the holy grail for VR developers seeking to create an immersive experience for their users. A VR avatar needs a minimum of 5 sensors attached to the head, hands and feet; the more sensors you add, the more accurate the tracking gets. As one of the trending technologies of 2020, the full-body VR experience has made an impact in the gaming industry, but it also applies to other fields. For instance, the Teslasuit, as well as the tracking suits created by ART, TEA and Xsens, were designed for industrial use and not only gaming.
Hand tracking for Virtual Reality maps the entire movement of the hand onto a digital skeleton, with input data based on the movement or pose of the hand. It allows for natural movement when manipulating digital objects, and it can involve tracking devices such as sensor-equipped gloves.
Finger tracking relies on specific technologies that are not necessarily included in (full) body tracking. Recent innovations in finger tracking, such as those on the Oculus Quest, suggest that quick improvements are coming in this field. Why? Because it lets you use your hands naturally during the VR experience, rather than relying on controllers. Facebook research teams used model-based tracking and deep neural networks to infer where your hands and fingers are and what they are doing, then reconstruct your hands in VR.
Hand pose recognition is a research field of its own. It deals with linking events in the virtual environment to specific hand signs of the user, like grabbing or releasing an object, selecting something, or any other common action in VR, when using a controller-free VR system. Even though it requires less processing power than robust hand tracking, it can get tedious for users, because they have to reproduce specific hand signs instead of moving naturally.
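To make the idea concrete, here is a minimal, rule-based sketch of hand pose recognition. The data format is hypothetical: each finger reports a curl value normalized to [0, 1], where 0 means fully extended and 1 means fully curled. Real systems use richer skeletal data and learned classifiers.

```python
def classify_hand_pose(finger_curls):
    """Map five finger curl values (thumb..pinky) to a named hand sign.

    Curl values are assumed normalized: 0 = extended, 1 = fully curled.
    """
    thumb, index, middle, ring, pinky = finger_curls
    if all(c > 0.7 for c in finger_curls):
        return "grab"      # closed fist: grab or hold an object
    if index < 0.3 and all(c > 0.7 for c in (middle, ring, pinky)):
        return "point"     # index extended, others curled: select something
    if all(c < 0.3 for c in finger_curls):
        return "release"   # open palm: release an object
    return "unknown"

print(classify_hand_pose([0.9, 0.8, 0.85, 0.9, 0.8]))  # grab
```

A fixed-threshold scheme like this is cheap to run, which illustrates why pose recognition needs less processing power than continuous hand tracking, but also why it forces users into specific hand signs.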
Where the user is looking can be an important piece of information, especially for anticipating intent or interest. Eye tracking can be very interesting to combine with other types of tracking, for example to triangulate the position of an object using the line of sight and the hand movement. It relies on specific hardware that detects where the user is looking.
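One simple way to combine gaze and hand data, sketched below with hypothetical inputs: project the tracked hand position onto the gaze ray, so the point on the line of sight closest to the hand serves as a cheap estimate of what the user means. The positions and axes are illustrative, not tied to any specific tracking SDK.

```python
import math

def point_of_interest(eye_pos, gaze_dir, hand_pos):
    """Closest point on the gaze ray (eye_pos + t * gaze_dir) to the hand."""
    n = math.sqrt(sum(c * c for c in gaze_dir))
    d = tuple(c / n for c in gaze_dir)  # unit gaze direction
    # Distance along the ray where the hand is nearest the line of sight
    t = max(0.0, sum((h - e) * dc for h, e, dc in zip(hand_pos, eye_pos, d)))
    return tuple(e + t * dc for e, dc in zip(eye_pos, d))

poi = point_of_interest((0.0, 1.6, 0.0),   # eye at head height
                        (0.0, 0.0, 1.0),   # looking straight ahead (+z)
                        (0.2, 1.4, 0.5))   # hand slightly forward
print(poi)  # point on the gaze ray nearest the hand: (0.0, 1.6, 0.5)
```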
There are a few ways body motion can be tracked, but in short there are two main categories:
Optical tracking generally involves the use of one or several cameras. The tracked user wears optical markers, in the HMD or sometimes as dots of reflective material on known points of their body. The aim is to map the user and their body movements in 3D, for example with two cameras aimed at the same dots.
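The two-camera idea can be sketched with basic geometry: each camera contributes a ray from its own position towards the marker it sees, and the marker sits near the closest point between the two rays. The camera positions and directions below are hypothetical.

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest approach between two rays o + t * d."""
    w0 = tuple(a - b for a, b in zip(o1, o2))
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b                 # near zero means parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = tuple(o + t * dd for o, dd in zip(o1, d1))
    p2 = tuple(o + s * dd for o, dd in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))

# Two cameras 2 m apart, both seeing a marker at (0, 0, 2):
marker = triangulate((-1, 0, 0), (1, 0, 2), (1, 0, 0), (-1, 0, 2))
print(marker)  # ≈ (0.0, 0.0, 2.0)
```

Real systems refine this with camera calibration and many markers per frame, but the core principle is exactly this intersection of sight lines.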
Most consumer virtual reality systems don't use active optical tracking (with LED lights), but passive marker methods in the HMD and controllers. Active optical tracking requires either a tethered headset or a separate power supply, and both can limit the user's movements. HMDs with markers carry passive ones that can be detected by your VR system; however, they don't offer depth tracking.
HMDs and controllers contain micro-electromechanical sensors such as gyroscopes, accelerometers, and magnetometers. These sensors produce motion data that is transmitted to your VR display software. They can also be combined with optical methods like passive reflector tracking or infrared.
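A common way these sensors are combined is sensor fusion; the complementary filter below is one simple, illustrative technique (not necessarily what any particular headset uses). The gyroscope gives smooth short-term rotation but drifts, while the accelerometer gives a noisy but drift-free gravity reference; blending the two yields a stable pitch estimate.

```python
import math

def fuse_pitch(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Blend integrated gyro rate with an accelerometer-derived angle."""
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular rate
    accel_pitch = math.atan2(accel_y, accel_z)   # gravity-based estimate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
# 100 samples at 100 Hz; accelerometer consistently reports a 0.1 rad tilt
for _ in range(100):
    pitch = fuse_pitch(pitch, gyro_rate=0.0,
                       accel_y=math.sin(0.1), accel_z=math.cos(0.1), dt=0.01)
print(pitch)  # converges towards the 0.1 rad accelerometer reading
```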
Usually, full body tracking in VR relies not only on the sensors inside the HMD and controllers, but also on tracking gloves or suits that give even more precise information about the user's movements, which can then be reproduced more accurately in the virtual environment.
In some industries, such as manufacturing, workers adopt awkward postures and motions due to poorly designed workstations. This can lead to accidents, which not only affect your productivity but also increase your production costs.
When it comes to checking the ergonomics of a workplace, or laying out the interior of a room with a 3D model, tracking the full body or the hands of the user becomes very important. Full body tracking can give engineers information about:
In the real world, engineers would need a lot of equipment to measure these different actions. With VR technologies, they can run several simulations and anticipate ergonomics problems before they happen.
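As one hedged example of what such a simulation can compute: from three tracked positions (shoulder, elbow, wrist), an engineer can derive the elbow flexion angle and flag postures outside a comfort range. The thresholds below are illustrative placeholders, not an official ergonomics standard.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, between segments b->a and b->c."""
    u = tuple(x - y for x, y in zip(a, b))
    v = tuple(x - y for x, y in zip(c, b))
    cos_t = sum(p * q for p, q in zip(u, v)) / (
        math.hypot(*u) * math.hypot(*v))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def is_awkward_elbow(shoulder, elbow, wrist, lo=60.0, hi=160.0):
    """Flag elbow flexion outside an illustrative comfort range."""
    return not (lo <= joint_angle(shoulder, elbow, wrist) <= hi)

# Right angle at the elbow: within the comfort range, so not flagged
print(is_awkward_elbow((0, 1.4, 0), (0, 1.1, 0), (0.3, 1.1, 0)))  # False
```

Running the same computation on every frame of a tracked session is what turns raw motion data into an ergonomics report.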
TechViz is ideal for assessing ergonomics in the manufacturing industry. With VR tracking devices, you can study how a real person would interact with the environment you designed. It works by placing a virtual human in your 3D model and controlling it "puppet style", or by following a real user wearing VR body tracking devices. You can even track a work posture in real time and compare it with your company's ergonomics standards.
For instance, you can use this feature to: