VR in engineering: nearly everything you need to know

Posted by Sandrine Lasserre on Nov 25, 2020 1:35:08 PM
Updated July 21, 2022
VR in engineering by TechViz
Virtual Reality has become a common fixture in today’s society, whether for entertainment or for business, for technological reasons or safety requirements.
VR is an innovation that has been around since the 90s and, even though companies are increasingly concerned about digital trust, it is now widely used in the professional sphere. It is revolutionizing today, and gives everyone with an engineering background the possibility to shape tomorrow.

Now, let’s dive into what VR engineering really entails:

What is virtual reality?


Virtual reality (VR) is a computer-generated simulation where the user, in full immersion, can interact with the virtual environment. The viewer sees the virtual world through a virtual reality headset or a projection-based device (powerwall or CAVE), and interacts with the 3D objects using controllers.

In the real world, we use our different senses to perceive our environment. Beyond the five you learn about in school, there are many more we use to determine the environment we are in. Virtual reality mostly relies on visual and auditory inputs (and sometimes touch, through haptics and controller vibrations) to emulate the sensation of presence, and to make you believe you really are in the virtual environment.

Virtual and augmented reality must be distinguished: they are two sides of the same coin. Augmented reality overlays virtual objects on the real world, while virtual reality immerses the user in a virtual environment. Both can rely on the same sensors and headsets, but they don’t serve the same purpose.

16 important virtual reality concepts

We have long held the desire to escape the limitations of a computer screen, keyboard and mouse. With virtual technologies, we are learning more and more how to create intuitive forms of human-machine interaction (HMI), also called human factors integration. VR is a new way of communicating with machines.

Before you can delve further into VR engineering, it is important to grasp some core concepts about virtual reality. Here’s a little glossary of VR concepts you should know.

1.     Computer Aided Design (CAD) and Computer Aided Engineering (CAE)

Computer Aided Design (CAD) and Computer Aided Engineering (CAE) software are programs used for engineering and design that help create parts or complete virtual prototypes. CAD and CAE software now benefit from the advantages of virtual reality systems in terms of visual collaboration, which helps reduce the time and resources engaged in product development.

2.     Degree of Freedom (DoF)

In mechanics, the degree of freedom is the minimal number of independent variables required to define the position of an object in space. In other words, the DoF determines the number of directions in which an object can move. In 3D, these variables are:

  • the angles from the planes XY, XZ and YZ (also known as pitch, yaw and roll)
  • the distances from the origin along the x, y and z axes

When you are immersed in a virtual environment, you can either be in a fixed position (like sitting), or moving around. 3-DoF headsets track only rotational movements, which means they will register whether you look left or right, up or down, or pivot your head at different angles. 6-DoF headsets also track translational motion, which means they allow you to physically move through the virtual space: forward, backwards, sideways and even up and down.


In short, 3-DoF headsets like Google Cardboard, Oculus Go or Samsung Gear VR are best suited to consuming 360-degree content. If you are working with 3D models and need to move around, 6-DoF headsets such as the Oculus Rift or HTC Vive are better for your use case. And if you need even more powerful head-mounted displays, check out the Pimax 8K or Varjo VR-2.
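To illustrate the six variables above, here is a minimal sketch (not tied to any particular headset SDK, all names are illustrative) of how a tracked pose could be represented:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # 3 rotational degrees of freedom (tracked by both 3-DoF and 6-DoF headsets)
    pitch: float  # look up/down, degrees
    yaw: float    # look left/right, degrees
    roll: float   # tilt the head sideways, degrees
    # 3 translational degrees of freedom (tracked only by 6-DoF headsets)
    x: float = 0.0  # sideways, metres
    y: float = 0.0  # up/down, metres
    z: float = 0.0  # forward/backward, metres

# A 3-DoF headset only ever updates the first three fields:
seated = Pose6DoF(pitch=-10.0, yaw=45.0, roll=0.0)

# A 6-DoF headset also registers the user physically stepping forward:
walking = Pose6DoF(pitch=0.0, yaw=0.0, roll=0.0, x=0.0, y=1.7, z=0.5)
```

A 3-DoF device simply leaves the translational fields at their defaults, which is why it cannot tell whether you leaned or stepped forward.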

3.     Dollhouse view

The dollhouse view shows a 3D model fully zoomed out, usually from a top-down perspective. It enables engineers and designers to look at a (large-scale) CAD model and see what it should look like once it is entirely built.

4.     Eye-tracking

Eye-tracking is a process used by some headsets to follow the user’s gaze. It provides accurate information on what the viewer is looking at, which can for example be used to bring one virtual object or another into focus, furthering the immersion.

5.     Field of View (FoV)

The field of view is the number of degrees visible from a certain point of view. The human FOV is generally about 200° (binocular vision only covers about 120°). A VR headset’s FOV is limited by its lenses: to get a broader field of view you need either bigger lenses (so a bigger headset) or lenses closer to the eyes (which might cause headaches). Most headsets on the market have a 100-120° FOV, but most vendors are working towards expanding the FOV capabilities of their HMDs.

6.     Frame rate (fps)

The frame rate is the frequency at which an image (or frame) is replaced by another on the screen. This is how you get the illusion of movement. A low frame rate gives the impression of choppy movements, while a high frame rate gives smooth movements. 60 fps means 60 images are displayed in 1 second. Frame rates depend on your CPU (Central Processing Unit) and GPU (Graphics Processing Unit).

In order to prevent the users from getting motion sickness and headaches, VR needs to maintain a high frame rate (at least 90 fps).
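Another way to look at that 90 fps floor is as a rendering budget per frame, a quick calculation:

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

print(frame_budget_ms(60))  # 60 fps -> ~16.7 ms per frame
print(frame_budget_ms(90))  # 90 fps (the VR minimum) -> ~11.1 ms per frame
```

In other words, at 90 fps the CPU and GPU together have barely 11 milliseconds to produce each image, which is why VR demands much more powerful hardware than flat-screen rendering.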

Work on the Digital Twin of your 3D factory with TechViz VR software

7.     Haptics

Haptics provide feedback to users by physically simulating their interactions with the virtual environment, for example a vibration effect on the controllers. Using other wearable gear like gloves can further the experience. With haptics, VR closes the gap between the real world and the virtual world even more, by adding the sense of touch to the immersive experience.

8.     Head tracking

Head tracking consists of tracking the position and orientation of the user’s head. This information allows the point of view to follow the user’s movements in the virtual world. In the context of full immersion, head tracking is essential to give users complete freedom of movement and a realistic, interactive virtual reality experience.

9.     Immersion vs presence

Two concepts often appear in VR articles and papers: immersion and presence. Immersion in a VR system depends on sensory immersion, which is “the degree to which the range of sensory channels is engaged by the virtual simulation” (Kim and Biocca 2018). Presence is the perceptual illusion the user has of being in the virtual reality: they react to changes in the virtual world, even though they cognitively know this is not reality.

These two terms are really close, and they are often used interchangeably. Most of the time, papers refer to immersion as the objective level of sensory fidelity and to presence as the user’s subjective experience of being in the virtual environment.

Note that a high level of immersion is not necessarily good for collaborating in VR. Being able to see your real body while experiencing VR (for instance with a Powerwall or in a CAVE), using body tracking software, is less disorienting and reduces the risk of bumping into another VR user.

10.     Inside-out tracking vs outside-in tracking

Inside-out and outside-in tracking are two different approaches to motion tracking in the real world to replicate them in virtual reality:

  • Inside-out tracking uses sensors placed inside the VR headset. The cameras on the headset record fixed points in your real surroundings and use them as reference points for your movements. Markerless inside-out tracking is also possible and uses positional tracking of objects, but it lacks accuracy despite being much cheaper.
  • Outside-in tracking uses external tracking devices (like cameras or lighthouses) to delimit the “virtual box” where the user is. It is very accurate but doesn’t track anything outside the line of sight of the sensors.
TechViz VR software with Xsens body Tracking

11.     Judder

Judder (or smearing) is the manifestation of motion blur and the perception of more than one image simultaneously. It is caused by a low refresh rate or dropped frames. Judder may cause simulator sickness.

12.     Latency

Latency is the delay between the input (visual or auditory) and the output (through the screen, earphones or haptics). It is mostly a technical problem that should resolve itself as technology improves. High latency can cause VR sickness, due to the unnatural feeling of your brain knowing you’re interacting with the virtual environment but not receiving the corresponding information in time.

13.     Monoscopic vs stereoscopic videos

Monoscopic videos (180° and 360°) are the easiest to produce for consumer VR. Basically, a single image is directed to both eyes, stitched together to give the illusion of being inside a cylinder or a sphere. The image seems completely flat, but this makes it possible to watch it on another device such as a computer screen (using drag and drop to navigate) or a smartphone (using motion sensors).

On the other hand, stereoscopic content displays two sets of videos (one for each eye). This type of content much more closely resembles the way we see the real world, as each eye gets a slightly different angle on the same scene. This enables the brain to get a sense of depth in 360° content. Stereoscopic videos can only be watched on HMDs.
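The “slightly different angle” comes from rendering the scene from two virtual cameras separated by the user’s interpupillary distance (IPD). A minimal sketch, assuming an illustrative average IPD of about 63 mm (the function name and default are assumptions, not a vendor API):

```python
def eye_positions(head_x: float, ipd_m: float = 0.063):
    """Horizontal positions (metres) of the left and right virtual cameras,
    each offset by half the interpupillary distance from the head centre."""
    half = ipd_m / 2.0
    return head_x - half, head_x + half

# Two cameras render the same scene from slightly shifted viewpoints;
# the brain fuses the two images into a single view with depth.
left, right = eye_positions(0.0)
print(left, right)
```

This is also why most HMDs offer an IPD adjustment: if the lens spacing doesn’t match the user’s eyes, the stereo effect degrades and can cause eye strain.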

14.     Resolution

The resolution (of a screen, most of the time) is the degree of detail an image has, depending on its number of pixels. High resolution means more pixels to render more details. Of course, it also depends on the size of the screen: a low-resolution image will seem detailed enough on a small screen (e.g. a smartphone), but will immediately look pixelated or blurry on a larger screen.

On a VR device, the image resolution often results in a lower level of detail, because the surface on which the content is rendered appears much larger than on a flat rectangular screen, which means the image is stretched over a wider area. Also note that in stereoscopic view, each eye only sees half of the actual resolution.
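One rough way to quantify this stretching is pixels per degree: the horizontal pixels available to one eye divided by the field of view they cover. A hedged sketch, with illustrative numbers rather than any specific headset’s specs:

```python
def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Approximate angular pixel density for a single eye."""
    return horizontal_pixels / fov_degrees

# A 2160-pixel-wide panel split between two eyes, stretched over a 100° FOV:
per_eye = 2160 // 2
print(pixels_per_degree(per_eye, 100.0))  # -> 10.8 px/degree
```

For comparison, a desktop monitor viewed at arm’s length delivers far more pixels per degree, which is why the same panel resolution looks much sharper on a desk than inside an HMD.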


15.     Refresh rate

The refresh rate is the number of times per second your monitor can redraw the image on the screen. It is measured in Hertz (Hz). Just like the frame rate, the refresh rate can cause latency, and it is very important for a realistic virtual environment. Your refresh rate should match your frame rate, otherwise the images displayed by your monitor will not match the images generated by your computer.
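That matching rule can be sketched as a small check (a simplified illustration, not how any particular display driver works): the rates line up cleanly when they are equal, or when the refresh rate is a whole multiple of the frame rate so each frame can simply be shown more than once.

```python
def rates_match(frame_rate: float, refresh_rate_hz: float) -> bool:
    """True when the rendered frames divide evenly into the display's
    refresh cycles (equal rates, or a whole-number multiple)."""
    ratio = refresh_rate_hz / frame_rate
    return abs(ratio - round(ratio)) < 1e-9

print(rates_match(90, 90))  # True: the ideal VR setup
print(rates_match(45, 90))  # True: each frame is displayed twice
print(rates_match(60, 90))  # False: frames and refreshes drift apart (judder)
```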

16.     Room-Scale VR

Room-scale VR is a sub-category of VR in which the virtual environment is the size of a room. Instead of a stationary VR experience, users can walk around a “box of freedom” in which they can freely navigate. Most headsets recommend a minimum space of 2 meters by 2 meters for a comfortable virtual reality experience. For safety purposes, some headsets ask you to set up a safety boundary (the Guardian on Oculus Quest) and to clear the floor of any obstacles.
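Conceptually, such a safety boundary is just a test of whether the tracked position is still inside the play area. A minimal sketch for a rectangular area centred on the user’s starting point (the function and its defaults are illustrative, not any headset’s actual API):

```python
def inside_play_area(x: float, z: float,
                     width_m: float = 2.0, depth_m: float = 2.0) -> bool:
    """True when the tracked floor position (x, z, metres, centred on the
    play area) is still inside the recommended width x depth boundary."""
    return abs(x) <= width_m / 2 and abs(z) <= depth_m / 2

print(inside_play_area(0.5, 0.5))  # True: well within a 2 m x 2 m area
print(inside_play_area(1.2, 0.0))  # False: time to display the safety grid
```

Real systems use arbitrary user-drawn polygons rather than a fixed rectangle, but the principle, warning the user before they leave the cleared space, is the same.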


What devices are necessary for 3D visualization?

The autostereoscopic device

Autostereoscopic devices cover a family of screens that display stereoscopic images without the use of special headgear or glasses. The binocular perception of 3D effect can be produced through different technologies such as parallax barrier or lenticular array. These devices are inherently the easiest to set-up. As such, they are well-suited for commercial spaces with a lot of foot traffic. On the flip side, they are also the type of devices with the least immersive experience.

The head-mounted display (HMD)

The head-mounted display (HMD) refers to the virtual reality headset. It is basically a pair of lenses built in front of a screen the size of a smartphone. It has the form of a helmet (or sometimes goggles), and can be strapped around your head for comfort. There are three types of headsets to consider for VR:

  • Smartphone-powered headsets, which are entry-level headsets, mostly for 3-DoF applications
  • Standalone or all-in-one headsets, which are limited in use
  • Tethered or PC-powered headsets, which rely on a high-end computer system

The immersive room (or CAVE)

A CAVE (cave automatic virtual environment) is a projection-based display for virtual environments, made of at least two screens. The users wear 3D goggles with stereoscopic view that enable them to be completely immersed in the virtual experience. CAVEs are more immersive than a Powerwall, and are better suited than VR headsets for collaborating on large-scale 3D models with high image quality.

The drawbacks of the immersive room are the steep price and the space requirement: you will need at least 500K to set up an installation. For VR first-timers, trying a smaller system for your use case might be a good idea.

But if you want a much cheaper investment while still collaborating with various people throughout the world, then try cloud VR. You’ll only need one master PC (which could entirely be in the cloud). It is a cloud solution that executes the VR software directly on cloud servers and streams only the necessary resources to the end users. Thus there is no need for the powerful hardware, data storage and image processing capacity required by classic AR/VR systems.

The powerwall

The PowerWall is a projection-based VR display system composed of one or several screens that display two slightly different sets of images to the viewer. The users need special 3D glasses that shutter in sync with the images to enjoy a three-dimensional view. Powerwalls are less immersive than virtual reality headsets, but they are better suited to promoting teamwork, as the users are not in full immersion and can communicate more easily with coworkers.

Lately, it has become increasingly popular to create Powerwalls using LED tiles instead of projectors. LED tiles can take less space than a projector and can be fitted in any meeting room, which is ideal for a presentation in VR. However, the image will look pixelated when you stand close to the screen.

So to use VR professionally, you’ll need to choose between all the existing devices, but you’ll also need to decide whether you need on-premise or SaaS deployment of VR.

How did virtual reality really begin?

Virtual reality history is not exactly recent. The technology is built upon ideas that date back to the 1800s. Many inventors wanted people to get “inside” stories, or to experience simulated environments through sensory stimulation. However, many of these inventions failed because the technologies virtual reality relies upon (like computer graphics, processors, internet bandwidth, 3D rendering…) did not exist yet, or were not powerful enough.


The idea of being immersed in a created universe isn’t new. The term “virtual reality” was coined in 1987 by Jaron Lanier, founder of VPL Research, as he was inventing the first VR gear (including virtual reality glasses and tracking gloves). The 90s were the first time virtual reality became popular, as well as viable for industry use and research purposes. But it was still too expensive and lacked the proper technologies to properly trick the brain into believing the virtual simulation was actually “real”.

Today’s virtual reality owes a great deal to the inventors who paved the way for high-end VR equipment, yet accessible for many use cases. Of course, the technology is still evolving, and the specific applications can be still very costly.

Is virtual reality just a hype?

Ever since virtual reality started to make itself known, back in the 90s, it has been placed on the Gartner Hype Cycle. But the hype cycle merely classifies emerging technologies according to the degree of attention they receive. In other words, the hype cycle quantifies the buzz around new technologies. So what about the virtual reality hype?

how to read a gartner hype cycle

If you take a look at past hype cycles, VR has always been there, most often in the “Trough of Disillusionment”, which it never seemed to leave. But with today’s improved processors and the arrival of affordable HMDs, virtual reality has experienced a revival. The difference is that VR is no longer just a niche sector for big industrial groups: now, smaller companies can afford virtual reality hardware and software. So, is VR just hype? Not really, but it still needs more than a few success stories from early adopters to really take off.

How is virtual reality shaping today's work?

Now that VR has become affordable, more and more companies are adapting their workplace for VR collaboration. That has become necessary in today’s world, when many workers have to work from home. With complex projects, however, working from a distance may not be possible with videoconferencing tools such as Zoom, Skype or Teams. By immersing people in a virtual world, VR opens new possibilities, like reviewing the design of a product, iterating and making decisions around a technical issue, and promoting better teamwork.

Virtualization of workplace by 2025 with virtual reality

By 2025, virtual reality will have completely reshaped the way we work, especially for remote settings. Of course, many companies still have to adapt, because there’s a gap between their expectations and efficient VR-based work sessions. However, the main areas you should focus on to keep the quality of work both in the virtual and real world are:

  1. Giving your employees the right equipment (VR hardware and computer)
  2. Training the workers to use the VR technology
  3. Choosing the right VR software for your use case

What are virtual reality use cases?

Since VR has become an important part of business, let’s review the main use cases currently covered by VR software for engineers:

1.     Running a project review in virtual reality

Design reviews are necessary in many projects, in order to gather feedback and correct assumptions from your teams, other departments or even customers. They put everyone on the same page about the project. Many companies rely on CAD models to convey design ideas, but these are a limited tool. Running a project review in virtual reality enables you to display and interact with your 3D data at 1:1 scale, which is cheaper than a physical prototype and more intuitive than looking at CAD software on a computer screen.

For more information, check out these 6 essential steps for a project review in VR.

2.     Conducting ergonomics studies

Ergonomic analysis can be very useful, especially when designing interior spaces. Whether we are talking about a building or a vehicle, it is important to know how future customers will react to the design. For example, with a car you want to know if they can reach every control and have good visibility. With virtual reality, optimizing the ergonomics can be done in an instant, reducing costs and time-to-market.

3.     Identifying errors at the early stages of conception

One of the problems with creating physical prototypes is that you can’t detect errors before the prototype is built. That means more resources are spent on product models that might still present errors or risks for the users. Besides, when it comes to large-scale models, it’s near-impossible to get a sense of the true scale of the product and find the errors on the model. Virtual reality enables you to interact with the 3D model at 1:1 scale, helping you avoid costly rework and identify risks and design errors early, even in the case of bespoke adaptations.

4.     Collaborating with workers all around the world

Teamwork is a challenge for companies that have employees working remotely, or subsidiaries in different countries. To answer their needs, virtual reality offers the possibility to create shared digital workspaces where engineers from different locations can work on the same CAD model in real-time. Collaboration in VR enhances communication between teams and accelerates the design-to-market processes.

5.     Evaluating risks for maintenance operators

Before sending technicians or engineers into a potentially dangerous installation, it is important to give them proper training to assess risks and ensure the safety of the workers. Virtual reality helps you set up training for situations that are otherwise too dangerous or too expensive to reproduce in real conditions.

6.     Visualizing and interacting with complex data


Sometimes, flat rendering is not enough to fully visualize and interpret complex sets of data. Interactive virtual reality helps you see and analyze phenomena that would otherwise be invisible to the naked eye.

7.     Assessing and designing workstations

In the context of the digital factory and Industry 4.0, virtual reality is a key technology to optimize, simulate and engineer a production line, and interact with it in collaboration. With the 3D model of the workstation (or the entire manufacturing line), virtual reality can be a powerful tool to assess the operator’s safety and well-being and the workstation ergonomics. These studies can be further optimized with the use of a digital twin and a full body tracking gear to visualize the operator in action, both in the real world and in the virtual world.

What is the ROI of using VR in business?

Virtual reality can adapt to many use cases and improve processes in your company, but what is the return on investment (ROI) of using VR? Many benefits are drawn from VR in engineering, like saving time and money at different stages of product development. But the gains are not just quantitative: they are also qualitative, like improving process and product quality, fostering an innovation mindset and refining workers’ skills.

What other technologies and devices combine well with VR?

Virtual reality is a powerful tool. In industry, the computer-generated world generally bases its simulation and content on models from 3D modelling software, converted into a format the VR software can use (note that TechViz does not convert data). There can also be other inputs coming into the VR world that can benefit engineers, depending on their use case:

  • 3D point clouds created with LiDAR or photogrammetry technology, which help you combine existing buildings or environments with VR technology
  • Artificial intelligence, because it can generate insightful data and run different simulations
  • Tracking devices, whether full-body tracking, VR hand tracking, finger tracking or eye-tracking, which add new input and information on how the user reacts to the virtual simulation or objects
  • Digital twins, which allow people to access information on the real asset via a virtual simulation and can be supplemented by predictive simulations in VR
  • Robotics, a very large field that intertwines with VR, as it ranges from sensors generating data to co-bots sharing a workspace with the technician

Many of these technologies are very trendy right now, and will benefit any company trying to go through their digital transformation.

How do I become a VR engineer?

If you already are an engineer, the answer is very easy: just buy a VR headset. There are multiple plug-ins that interface seamlessly with the CAD software you already use, bringing most of the benefits of VR with very low time and financial investment.

To become a true VR expert, there is obviously a whole different set of skills to master, starting with programming languages. Virtual reality development often uses C and C++, and there are a few libraries available with tools and features to integrate VR through programming. You will probably also need to be familiar with development environments such as the Unity and Unreal 3D engines. If that fits your profile, don’t hesitate to apply for a job at TechViz! We are always looking for highly professional and passionate experts to help us grow our solutions.



Other blog posts that could interest you:

Topics: vr for engineering