What sensors are used in AR/VR systems?

May 22, 2022 By Jeff Shepard

A wide range of sensor technologies is required to support augmented reality and virtual reality (AR/VR) systems. Today’s AR/VR implementations focus mostly on visual and audio interfaces and rely on motion tracking and listening/voice-recognition sensors. That is beginning to change as new types of sensors and various forms of haptics are introduced. This FAQ begins by reviewing several of the basic position-sensing technologies used in current AR/VR systems, presents a proposed AR/VR system for neurorehabilitation and quality-of-life improvement, reviews how thermal sensing technologies are being developed to provide more complete AR/VR environments, and closes by looking at emerging haptics technologies for thermal- and touch-based feedback.

AR involves the creation of an environment that integrates the existing surroundings with virtual elements. In a virtual reality environment, the system creates the complete ‘reality’ and only needs to know about the person’s relative movements and orientation. A basic VR system uses an inertial measurement unit (IMU) that can include an accelerometer, a gyroscope, and a magnetometer. An AR system needs to know where the person is, but it also needs to understand what the person is seeing, what the person is hearing, how the environment is changing, etc. As a result, AR uses more complex sensing, beginning with an IMU and adding time-of-flight sensors, heat mapping, structured light sensors, etc.

Figure 1: Augmented reality that integrates existing surroundings with virtual elements requires more complex sensing than creating a completely virtual environment. (Image: Association for Computing Machinery)

Both AR and VR are immersive environments; for VR in particular, complete immersion is necessary for a good user experience. Motion sickness is a challenge when developing AR/VR systems, so AR/VR devices must capture user movements quickly and accurately, and the IMUs and other motion-sensing technologies must be highly stable with very low latencies. IMUs used in AR/VR systems combine an accelerometer, a gyroscope, and a magnetometer to facilitate error correction and quickly produce accurate results, enabling the system to track head movement and position.
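
As a rough illustration of how that error correction works, the sketch below blends a gyroscope's integrated angular rate (fast but drifting) with the tilt derived from an accelerometer's gravity vector (noisy but drift-free) using a complementary filter. This is a minimal, hypothetical example, not code from any AR/VR product; the sample readings and blend factor are illustrative assumptions.

```python
import math

def complementary_filter(pitch_deg, gyro_rate_dps, accel_x, accel_z, dt, alpha=0.98):
    """Blend integrated gyro rate (fast, drifts) with accelerometer tilt
    (noisy, drift-free) into a stable pitch estimate in degrees."""
    pitch_gyro = pitch_deg + gyro_rate_dps * dt                # integrate angular rate
    pitch_accel = math.degrees(math.atan2(accel_x, accel_z))   # tilt from gravity vector
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

# Hypothetical 1 kHz update loop; low latency helps avoid motion sickness
pitch = 0.0
for gyro_dps, ax_g, az_g in [(5.0, 0.02, 0.99), (4.8, 0.03, 0.99)]:
    pitch = complementary_filter(pitch, gyro_dps, ax_g, az_g, dt=0.001)
print(round(pitch, 4))
```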

In addition, both AR and VR systems can benefit from various forms of user feedback and interaction including gesture and voice recognition. Gesture recognition can be based on real-time video analysis (which can be energy and compute-intensive) or more advanced technologies such as e-field sensing, LIDAR, and advanced capacitive technologies. For a discussion of gesture recognition, see the FAQ “How can a machine recognize hand gestures?”

One key for AR environments is to present computer-generated images of the actual environment accurately, quickly, and continuously. Most AR headsets rely on one or more special types of imaging sensors, including time-of-flight (ToF) cameras, vertical-cavity surface-emitting laser (VCSEL) based light detection and ranging (LiDAR), binocular depth sensing, or structured-light sensors; some use a combination of these sensors.

ToF cameras combine a modulated IR light source with a charge-coupled device (CCD) image sensor. They measure the phase delay (time of flight) of the reflected light to calculate the distance between the illuminated object and the camera. Thousands or millions of these measurements are combined into a ‘point cloud’ database that represents a three-dimensional image of the surrounding area. VCSEL-based LiDAR, a more recently developed technology, can produce higher-fidelity point clouds. In addition, VCSEL technology is used in smart glasses and other wearable devices to produce more compact, lower-power displays and gesture-recognition systems.
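
For a continuous-wave ToF camera, the per-pixel phase-to-distance conversion follows the standard relation d = c·Δφ/(4π·f_mod). The sketch below is a minimal, assumed illustration of that calculation; the 20 MHz modulation frequency and the measured phase are hypothetical values, not figures from the article.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(phase_rad, f_mod_hz):
    """Distance from modulation phase delay: d = c * phase / (4 * pi * f_mod).
    Results are unambiguous only up to c / (2 * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# Hypothetical pixel: 20 MHz modulation and 90 degrees of phase delay -> ~1.87 m
print(round(tof_distance_m(math.radians(90.0), 20e6), 3))
```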

Structured light sensors project a defined pattern of light (IR or visible) onto the surroundings. The distortion of the projected pattern is analyzed mathematically to triangulate the distance to various points: the camera pixel data (a type of primitive point cloud) is compared with the projected pattern, and the difference between the two, together with the camera optics and other factors, is used to determine the distances to objects and surfaces in the surroundings. While a single structured light sensor can be used, more accurate results are obtained when the outputs of two structured light sensors are combined.
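
In the simplest rectified case, the triangulation step reduces to the classic disparity relation Z = f·B/d, where f is the focal length in pixels, B is the projector-to-camera (or camera-to-camera) baseline, and d is the observed shift of a pattern feature. The sketch below is an assumed, simplified illustration; the focal length, baseline, and disparity values are hypothetical.

```python
def depth_from_disparity_m(disparity_px, focal_px=800.0, baseline_m=0.06):
    """Rectified projector-camera triangulation: Z = f * B / d."""
    if disparity_px <= 0.0:
        return float("inf")  # feature not displaced or not matched
    return focal_px * baseline_m / disparity_px

# Hypothetical pattern stripe shifted by 24 pixels -> a surface about 2 m away
print(depth_from_disparity_m(24.0))
```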

Additional sensors found in AR systems include directional microphones (which may be displaced in the future by bone-conduction directional audio transducers), various biosensors such as thermal sensors, ambient light sensors, and forward- and rear-facing video cameras. In addition, a wireless link is needed to download the sensor data and upload the video information for the real-time creation of an immersive and dynamic environment.

Neurorehabilitation

AR/VR technologies are being developed for a range of medical applications. In one instance, a system for immersive neurorehabilitation exercises using virtual reality (INREX-VR) is being developed based on VR technologies. The INREX-VR system captures real-time user movements and evaluates joint mobility for both upper and lower limbs, recording training sessions and saving electromyography data. A virtual therapist demonstrates the exercises and the sessions can be configured and monitored using telemedicine connectivity.

Figure 2: Hardware components of the experimental INREX-VR system for immersive neurorehabilitation. (Image: MDPI sensors)

The INREX-VR software was developed to use off-the-shelf VR hardware and is suited for use by patients suffering from a neurological disorder, therapists responsible for supervising the rehabilitation of patients, and trainees. The system can evaluate the emotional condition of the user through heart-rate monitoring and their level of stress using skin conductance. Facial recognition can also be implemented when the system is used with the VR headset.

Heating up future AR/VR

Devices that can add a thermal dimension to AR/VR environments are a ‘hot topic.’ Various thermal sensing and thermal haptics technologies are being pursued (Figure 3). Thermal sensing technologies proposed for AR/VR systems include thermoresistive, pyroelectric, thermoelectric, and thermogalvanic devices. While these sensors convert thermal energy into electric currents or voltages, the power levels are too low to support energy harvesting but are sufficient to enable thermal sensing. In particular, polymer composites and conducting polymers are expected to find use in future thermal sensors. The mechanical flexibility of polymers makes them especially suitable for wearable thermal sensors. Some of the proposed polymer-based thermal sensors can function in both contact and non-contact modes, further increasing flexibility. Performance stability and the reaction delays that result from thermal hysteresis are two challenges that must be addressed before these thermal sensors are ready for use in AR/VR systems.
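
For the thermoresistive case, the raw quantity is a temperature-dependent resistance that firmware must convert to a temperature reading. The sketch below uses the common NTC beta-equation conversion as an assumed illustration; the 10 kΩ reference resistance and beta value are typical datasheet numbers, not values from the article or from any polymer sensor.

```python
import math

def ntc_temperature_c(r_ohm, r25_ohm=10_000.0, beta=3950.0):
    """Beta-equation conversion: 1/T = 1/T0 + ln(R/R0)/beta, with T0 = 298.15 K (25 C)."""
    t_kelvin = 1.0 / (1.0 / 298.15 + math.log(r_ohm / r25_ohm) / beta)
    return t_kelvin - 273.15

# Hypothetical reading: 8 kohm corresponds to roughly 30 C
print(round(ntc_temperature_c(8_000.0), 1))
```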

Figure 3: A range of technologies are being developed to implement thermal sensing and thermal haptics in AR/VR environments. (Image: Advanced Functional Materials)

Thermal haptics, or thermal stimulation, is another emerging area of AR/VR that will provide an enhanced level of immersion in AR/VR environments. Thermal haptics are expected to rely on thermoresistive heaters or Peltier devices, which control the temperature of a target area but are more power-hungry than polymer-based thermal sensors. On the plus side, active thermal haptics may have faster response times than the proposed polymer-based thermal sensors. With thermal haptics, users will be able to feel the temperature of a virtual object and have a more realistic interaction with their environment.

Thermoresistive heaters may have lower hysteresis than Peltier devices, but thermoresistive heaters only provide heating, while Peltier devices can provide both heating and cooling. A key factor in developing thermal haptics will be the need for user controls that enable the selection of a range of temperatures that provide stimulation without causing discomfort or burns. New materials will be needed for wearable Peltier devices that are both flexible and lightweight. Materials development is a key activity to enable both thermal sensing and thermal haptics in future AR/VR systems.
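
One way to frame that safety requirement is to clamp the user-selected setpoint to a fixed comfort window before any drive signal is computed. The sketch below is a minimal, assumed illustration of that idea for a generic heater or Peltier driver; the temperature limits and proportional gain are placeholder values, not published safety specifications.

```python
SAFE_MIN_C, SAFE_MAX_C = 15.0, 43.0  # placeholder comfort window, below pain/burn levels

def haptic_drive(setpoint_c, skin_temp_c, kp=0.05):
    """Return a drive command in [-1, 1]; positive heats, negative cools (Peltier).
    The user's requested setpoint is clamped to the safe window first."""
    target = min(max(setpoint_c, SAFE_MIN_C), SAFE_MAX_C)
    error = target - skin_temp_c
    return max(-1.0, min(1.0, kp * error))

# A request for 50 C is clamped to 43 C; with skin at 33 C the drive is 0.5 (heating)
print(haptic_drive(50.0, 33.0))
```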

Ultrasound haptics

Ultrasound-based haptics are built around an array of speakers that send out precisely timed ultrasound pulses. The time differences are designed so that the ultrasound waves from the various speakers arrive at the same place simultaneously and create a pressure point that can be felt by human skin (Figure 4). The focal point where the ultrasound waves meet can be controlled in real time and changed from moment to moment based on hand positioning or other factors.
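
In practice, this means delaying each transducer's emission by the difference between its path length to the focal point and the longest path in the array, so that all wavefronts arrive together. The sketch below is an assumed illustration of that delay calculation; the 2x2 array geometry and focal-point coordinates are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def focus_delays_s(transducer_xy, focal_xyz):
    """Delay each element by (longest_path - its_path) / c so arrivals coincide
    at the focal point, forming a single tactile pressure point."""
    paths = [math.dist((x, y, 0.0), focal_xyz) for x, y in transducer_xy]
    longest = max(paths)
    return [(longest - p) / SPEED_OF_SOUND for p in paths]

# Hypothetical 2x2 array (meters) focusing 20 cm above and slightly off-center
array_xy = [(-0.05, -0.05), (0.05, -0.05), (-0.05, 0.05), (0.05, 0.05)]
print(focus_delays_s(array_xy, (0.02, 0.0, 0.20)))
```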

Figure 4: The combined vibrations of the ultrasound waves at the focal point create a pressure point that can be felt by human skin. (Image: ultraleap)

A tracking camera serves the dual purposes of gesture recognition and locating the exact position of a person’s hand, enabling the focal point to be positioned at a specific spot on the hand. By controlling the motion of the focal point, it’s possible to create multiple tactile effects. Haptic feedback is expected to take AR/VR systems beyond gaming, for example, into system controls and interactions for shopping kiosks and vending machines, automotive interiors, building automation, and other areas.

Summary

Sensors are a key technology enabling compelling and immersive AR and VR environments. Because AR environments must stitch the surrounding reality together with virtual elements, AR sensor systems are more complex than those used for VR. New sensor modalities are being developed to increase the realism of AR/VR systems, including thermal sensors and thermal and touch haptics.

References

Consumer AR Could Have Saved Lives, Economy, Association for Computing Machinery
Emerging Thermal Technology Enabled Augmented Reality, Advanced Functional Materials
Flexible Virtual Reality System for Neurorehabilitation and Quality of Life Improvement, MDPI sensors
Sensors for AR/VR, Pressbooks
Touch is going virtual, ultraleap
Magnetometers: A Comprehensive Guide, GMW
