Collision detection is an essential aspect of AR/VR simulation. Does that new Ikea sofa you have your eye on really fit into that funky-shaped corner of your living room? That is the question you want answered when using the AR app on your tablet. Other industries face the same question when fitting machine tools into factories.
In industries such as engineering, manufacturing and architecture, one of the biggest benefits of testing designs in VR is the ability to assess how different 3D elements integrate with each other, or how the human factor integrates with the CAD model. Collision detection can also enhance the realism of the VR experience, making it more effective for training and evaluation. In this article, we will define what 3D collisions are and how they can be used across many industries.
3D collisions are collisions that occur between two virtual objects, from any direction. They can be hard to calculate: 3D CAD models are composed of many connected triangles, and any triangle of one model can collide with any triangle of another.
The more triangles used to represent a surface (also called a mesh), the more realistic the rendering, but also the more complex the collision computation.
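Because testing every triangle of one mesh against every triangle of another is expensive, collision systems typically start with a much cheaper bounding-box test and only examine triangles when the boxes overlap. Here is a minimal sketch of that broad-phase check; the function names and coordinates are illustrative, not from any specific engine:

```python
# Broad-phase collision check: compare axis-aligned bounding boxes (AABBs)
# before doing any expensive per-triangle work.

def mesh_aabb(vertices):
    """Smallest axis-aligned box enclosing a list of (x, y, z) vertices."""
    xs, ys, zs = zip(*vertices)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def aabbs_overlap(box_a, box_b):
    """Two AABBs overlap iff their intervals overlap on every axis."""
    (min_a, max_a), (min_b, max_b) = box_a, box_b
    return all(min_a[i] <= max_b[i] and min_b[i] <= max_a[i] for i in range(3))

part_a = mesh_aabb([(0, 0, 0), (1, 1, 1)])
part_b = mesh_aabb([(0.5, 0.5, 0.5), (2, 2, 2)])
part_c = mesh_aabb([(3, 3, 3), (4, 4, 4)])

print(aabbs_overlap(part_a, part_b))  # True: boxes intersect, so test triangles next
print(aabbs_overlap(part_a, part_c))  # False: no collision possible, skip entirely
```

Only the pairs whose boxes overlap go on to the precise (and costly) triangle-level tests, which is why denser meshes make collision detection more expensive.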
A part of your model collides with another part. It is important to detect these collisions during a design review, so that you can see where parts clash and catch errors before reaching the production stage.
The VR hand collides with the 3D model. These collisions can be conveyed visually with a red outline where the parts intersect, haptically with vibrations, or audibly with sound.
The simulation of your real hands penetrates the 3D model. This type of simulation is even more realistic when combined with tracking sensors, such as VR tracking gloves.
In your VR simulation, your simulated tool collides with parts of the model. This is very useful for testing maintenance scenarios in realistic conditions. If the virtual tool and the model collide, it may indicate that you lack the space to properly mount and dismount the equipment.
The mannequin collides with parts of the virtual objects as it executes a task. This is essential when performing reachability and ergonomics studies. You can get more accurate data by using a full-body tracking suit in VR.
Simulating the collisions between two models allows engineers to study the behavior of these objects in various environments, and see how different factors such as velocity and acceleration can influence the outcome of a collision.
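To see how one factor such as impact velocity changes the outcome, physics engines often use a coefficient of restitution. The sketch below is illustrative only (the values and function name are made up), showing that the same material responds very differently at different speeds:

```python
# Coefficient of restitution e: 1.0 = perfectly bouncy, 0.0 = no bounce at all.
# The separation speed after impact is v_out = e * v_in.

def rebound_speed(approach_speed, restitution):
    """Speed at which two bodies separate after a head-on impact."""
    return restitution * approach_speed

print(rebound_speed(10.0, 0.8))  # 8.0: lively rebound for an elastic material
print(rebound_speed(10.0, 0.2))  # 2.0: same impact speed, deadened material
```

A real engine combines this with each object's mass and direction of travel, but the principle is the same: the inputs (velocity, material properties) determine the outcome of the collision.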
Some use cases for the intersection of virtual objects include design reviews, virtual assembly tests, and reachability and ergonomics studies.
However, without taking physics into account, simple intersections between CAD models will not be realistic enough for some use cases. What happens when you need to run a realistic simulation? Let’s go beyond simple 3D collisions and look into physics.
There are two main approaches to approximating 3D collisions with realistic physics:
Source: developer.mozilla.org
For instance, for the TechViz Virtual Assembly feature, the TechViz software integrates Haption’s IPSI engine for rigid-body physics simulation with force feedback, which ensures the non-interpenetration of virtual objects. The physics engine voxelizes the virtual objects: it covers each 3D object with a regular grid of small axis-aligned cubes (voxels) that capture its approximate shape. The size of the voxel unit can be changed in TechViz.
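The idea behind voxelization can be sketched in a few lines. This is a simplified illustration, not the IPSI algorithm: a real engine voxelizes full triangle meshes, while here we snap a handful of surface sample points to a grid.

```python
# Illustrative voxelization: map points on a model's surface to the integer
# coordinates of the grid cube (voxel) that contains them.

def voxelize(points, voxel_size):
    """Return the set of integer voxel coordinates occupied by the points."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}

surface_points = [(0.1, 0.2, 0.0), (0.9, 0.2, 0.0), (2.3, 0.1, 0.0)]
coarse = voxelize(surface_points, voxel_size=1.0)  # bigger cubes, rougher shape
fine = voxelize(surface_points, voxel_size=0.5)    # smaller cubes, finer shape

print(sorted(coarse))              # [(0, 0, 0), (2, 0, 0)]
print(len(fine) > len(coarse))     # True: smaller voxels resolve more detail
```

Two voxelized objects collide when their occupied-voxel sets intersect, which is a much cheaper test than exact mesh geometry; shrinking the voxel size trades speed for shape fidelity, which is why the unit size is exposed as a setting.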
Imagine that you have two parts of a product you want to assemble: an engine, an aircraft, a ship… One part of the model has a hole, and the other has a protrusion of the same shape and size. Detecting collisions enables you to check whether all the measurements are correct when the two parts are connected. It also allows you to simulate constraints between the two groups of virtual objects.
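At its simplest, such a fit check reduces to comparing dimensions with a required clearance. The sketch below uses made-up dimensions and a hypothetical tolerance purely for illustration:

```python
# Hedged sketch: does a cylindrical peg fit its hole with enough clearance?
# All dimensions (in mm) and the tolerance are illustrative values.

def peg_fits(peg_diameter, hole_diameter, min_clearance=0.05):
    """The peg fits if the hole is larger by at least the required clearance."""
    return hole_diameter - peg_diameter >= min_clearance

print(peg_fits(peg_diameter=9.95, hole_diameter=10.00))   # True: 0.05 mm of play
print(peg_fits(peg_diameter=10.00, hole_diameter=10.00))  # False: no clearance
```

A VR collision check generalizes this from two numbers to full 3D shapes: instead of subtracting diameters, the engine reports wherever the two meshes interpenetrate.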
Picture a robotic arm programmed to pick up and move objects from one conveyor belt to another. When the robot arm comes into contact with an object, it applies a force to grip the object and move it to the desired location. If the object is moving too quickly or unexpectedly, the robot arm can also apply an impulse to adjust its trajectory and prevent a collision with other objects on the conveyor belt.
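The impulse mentioned above follows directly from Newtonian mechanics: impulse equals mass times the change in velocity, and the average force needed is that impulse divided by the time available. The numbers below are illustrative, not from a real robot:

```python
# Impulse J = m * Δv, average force F = J / Δt.
# Illustrative values for an object arriving on a conveyor belt.

mass = 2.0                 # kg, object on the belt
v_in, v_target = 0.8, 0.0  # m/s: arriving speed vs. held at rest by the gripper
dt = 0.5                   # s, time the gripper has to act

impulse = mass * (v_target - v_in)  # kg·m/s
avg_force = impulse / dt            # N (negative: it opposes the motion)

print(impulse)    # -1.6
print(avg_force)  # -3.2
```

Halving the time available doubles the required force, which is why an object moving "too quickly or unexpectedly" forces the controller to apply a sharper corrective impulse.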
Let’s say a car crashes into a truck. Both vehicles would have different masses and centers of mass, and their behavior during the collision would be determined by their respective velocities, the angle at which they collide, etc. The 3D collision would show the transfer of kinetic energy between vehicles, and the motion of both objects would be affected by the collision.
In this situation, you can also add virtual mannequins to simulate the passengers and the consequences of the crash on the human body.
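The transfer of motion in such a crash is governed by conservation of momentum. Here is a hedged one-dimensional sketch of a perfectly inelastic collision (the vehicles lock together); the masses and speeds are illustrative:

```python
# Perfectly inelastic 1-D collision: momentum is conserved,
# (m1*v1 + m2*v2) = (m1 + m2) * v_final, but kinetic energy is not.

m_car, v_car = 1500.0, 20.0      # kg, m/s: car at 72 km/h
m_truck, v_truck = 9000.0, 0.0   # kg, m/s: truck at rest

v_final = (m_car * v_car + m_truck * v_truck) / (m_car + m_truck)

ke_before = 0.5 * m_car * v_car**2 + 0.5 * m_truck * v_truck**2
ke_after = 0.5 * (m_car + m_truck) * v_final**2

print(round(v_final, 2))     # 2.86 m/s: both vehicles move together slowly
print(ke_after < ke_before)  # True: the missing kinetic energy deformed metal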
A good example of visualizing complex data in VR is a building structure undergoing an earthquake. The simulation would show the deformation of walls, columns and other elements under the earthquake’s forces, up to the failure of the building. Engineers could use this simulation to study the behavior of building structures during earthquakes and to design more resilient ones.
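Structural engineers often reduce such a problem to a single-degree-of-freedom oscillator before running a full simulation. The sketch below integrates that reduced model with semi-implicit Euler; the mass, stiffness, damping and ground motion are illustrative values, not from a real study:

```python
# A building idealized as one mass m on a spring k with damping c, driven by
# ground acceleration a_g(t): m*x'' + c*x' + k*x = -m*a_g(t).
import math

m, k, c = 1.0e5, 4.0e6, 2.0e4   # kg, N/m, N·s/m (illustrative)
dt, steps = 0.005, 4000          # 20 s of simulated shaking

x, v = 0.0, 0.0                  # displacement (m), velocity (m/s)
peak = 0.0
for i in range(steps):
    t = i * dt
    a_ground = 2.0 * math.sin(2.0 * math.pi * 1.0 * t)  # 1 Hz, 2 m/s² shaking
    a = (-c * v - k * x) / m - a_ground   # acceleration from the equation of motion
    v += a * dt                            # semi-implicit Euler update
    x += v * dt
    peak = max(peak, abs(x))

print(peak > 0.0)  # True: the structure deflects under the quake
```

Because the chosen shaking frequency sits near this oscillator's natural frequency (about 1 Hz), the response builds up strongly, which is exactly the resonance effect engineers look for when assessing resilience.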
A car manufacturer can check whether the spare wheel fits in its designated space in the trunk, and a warehouse manager can see how many pallets of goods fit inside a storage unit.