How are we tapping into the potential of VR?

July 27, 2020 By John Koon

Virtual Reality (VR) technology was once used primarily by video gamers to play 3D games. Not anymore. The technology has advanced so much over the years that medical, sports, training, industrial, and many other applications now use it, and it carries significant business potential.

So what is VR?

Manufacturer STMicroelectronics, which has spent a great deal of time researching the subject, defines VR and related technologies as follows:

  • Virtual Reality (VR) – An immersive experience with a fully occluded view with content that is rendered (i.e., CGI) or captured video and/or images or combinations thereof.
  • Augmented Reality (AR) – A fully transparent headset or glasses through which the viewer will see the real world with superimposed information in the user’s field-of-view. The content is typically overlaid text, images, or graphics.
  • Mixed or Merged Reality (MR) – Here, there are some differences in definition. Still, a commonly accepted one is a device that combines a camera with a VR headset to deliver live video/image feed to the display. The “mixing” or “merging” is the integration of the real-world images from the camera with the rendered or captured video/images.
  • eXtended Reality (XR) – A fully transparent device/headset that delivers realistic, 3D, holographic-like content in the user’s visual field-of-view. Here virtual and physical objects are indistinguishable.

 

Figure 1: The VR hardware platform includes a high-performance MCU processor, memory, audio, and optic electronics integrated into a single chip or chipset. (Source: STMicroelectronics)

A VR hardware platform is a head-mounted display (HMD) or goggles with embedded electronics and VR software that interfaces with video, audio, and sensors to create an environment that a human being can interact with.

As shown in Figure 1, the VR hardware platform includes a high-performance MCU, memory, audio, and optic electronics integrated into a single chip or chipset. Because the platform is battery-powered, designers must make it energy efficient while still delivering high performance. In real-life applications, latency has to be kept below four milliseconds; otherwise, the user will experience motion sickness because the displayed video cannot keep up with the real environment as the user’s head turns. A rough version of that latency budget is sketched below.
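To make the four-millisecond figure concrete, here is a minimal Python sketch of a motion-to-photon latency budget. The stage names and per-stage timings are illustrative assumptions, not figures from STMicroelectronics or this article; the point is simply that every stage between head motion and the updated pixels must fit inside the budget.

```python
# Minimal sketch with illustrative (assumed) numbers: add up the stages of a
# hypothetical VR head-tracking pipeline and check the total against the
# 4 ms motion-to-photon latency budget mentioned above.

LATENCY_BUDGET_MS = 4.0  # upper bound cited in the article

# Hypothetical per-stage latencies for one head-pose update, in milliseconds
pipeline_stages_ms = {
    "imu_sampling": 1.0,      # read the gyro/accelerometer
    "sensor_fusion": 0.5,     # estimate the new head pose
    "render_update": 1.5,     # re-render or re-project the view
    "display_scanout": 0.8,   # push the updated frame to the panel
}

total_ms = sum(pipeline_stages_ms.values())
print(f"Total motion-to-photon latency: {total_ms:.1f} ms")

if total_ms <= LATENCY_BUDGET_MS:
    print("Within budget: the displayed video keeps up with head motion.")
else:
    print(f"Over budget by {total_ms - LATENCY_BUDGET_MS:.1f} ms: "
          "risk of motion sickness.")
```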

What is the VR market potential?

According to ABI Research, a technology research and advisory firm, 35 million VR head-mounted displays (HMDs) will ship in 2024, with market revenues reaching $25 billion. The revenue will come mostly from the consumer segment (media/entertainment), with notable enterprise revenue in education. Revenue from other segments is expected to grow in the future.

The SyncThink technology: Using VR for insight into the brain

Figure 2: The EYE-SYNC platform includes wireless, Bluetooth-enabled, custom-designed VR goggles, a smartphone, and a tablet. (Source: SyncThink)

In the early 2000s, neurosurgeon Jamshid Ghajar, MD, Ph.D., had an idea. He wanted to find a way to objectively measure people’s attention span through eye movement. Early in his studies, he found eye movement abnormalities to be very common in patients with concussion. A few years later, he presented these findings to the US Department of Defense, was awarded several grants to study the military population, and began developing EYE-SYNC technology (Figure 2).

In 2014, after years of studying military personnel returning from the Middle East with deficits after traumatic brain injury (TBI), Ghajar arrived at Stanford to study student-athletes. There, he met Scott Anderson, at that time the head of the Stanford Sports Medicine program, which oversees the medical care provided to student-athletes. In 2015, Anderson began using a prototype device based on EYE-SYNC technology in clinical trials. The trials examined how eye movements, brain trauma, and cognitive functions relate to one another (Figure 3). Soon after, Anderson integrated the technology into the clinical care of more than 900 athletes across 36 athletic programs. In 2016, EYE-SYNC technology was awarded its first FDA clearance as a Class II medical device.

Figure 3: The EYE-SYNC technology has given us a window to the brain. (Source: SyncThink)

Ghajar later founded SyncThink and commercialized the EYE-SYNC technology as a product in 2017, following 15 years of clinical research. The EYE-SYNC product has evolved into an award-winning digital brain health platform that includes wireless, Bluetooth-enabled, custom-designed VR goggles, a smartphone, and a tablet (Figure 2). Top academic, research, clinical, and sports organizations across North America use EYE-SYNC.

Our human brain interacts with its environment by predicting incoming visual information to determine how to react. Using the eyes to receive information, the brain constantly tries to synchronize a person’s behavior with the information received. If this synchronization is off by 0.25 seconds, a healthy brain will react a certain way; a brain with a problem will respond differently. The difference between the normal and abnormal reaction is called “variance” or “error,” and EYE-SYNC measures that variance. Over the years, SyncThink has developed a very comprehensive data set of eye signatures for clinicians to use in assessing brain function.
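As an illustration of the idea, and not SyncThink’s actual algorithm, the following Python sketch simulates a dot moving along a circular path, generates gaze samples that trail the dot by an assumed tracking delay with some noise, and then computes the mean error and variance between where the eyes should be looking and where they actually look. All of the numbers (sample rate, delay, noise, motion frequency) are hypothetical.

```python
# Minimal sketch (not SyncThink's actual algorithm): measure the error between
# where the eyes are expected to look (a dot moving on a circle) and simulated
# gaze samples that follow the dot with an assumed delay and noise.
import math
import random

random.seed(0)
SAMPLE_RATE_HZ = 60          # assumed eye-tracker sample rate
DURATION_S = 2.0             # length of the simulated trial
TRACKING_DELAY_S = 0.05      # assumed lag between stimulus and gaze
NOISE_DEG = 0.3              # assumed measurement noise, degrees of visual angle

def target_position(t):
    """Stimulus position (degrees of visual angle) on a circular path."""
    angle = 2 * math.pi * 0.4 * t   # 0.4 Hz circular motion
    return 10 * math.cos(angle), 10 * math.sin(angle)

errors = []
for i in range(int(SAMPLE_RATE_HZ * DURATION_S)):
    t = i / SAMPLE_RATE_HZ
    tx, ty = target_position(t)
    # Simulated gaze: follows the target with a delay plus jitter
    gx, gy = target_position(t - TRACKING_DELAY_S)
    gx += random.gauss(0, NOISE_DEG)
    gy += random.gauss(0, NOISE_DEG)
    errors.append(math.hypot(gx - tx, gy - ty))

mean_error = sum(errors) / len(errors)
variance = sum((e - mean_error) ** 2 for e in errors) / len(errors)
print(f"Mean gaze error: {mean_error:.2f} deg; variance: {variance:.2f} deg^2")
```

A healthy tracker/subject pair would show small, consistent errors; a larger or more erratic error pattern is the kind of deviation from the expected signature that a clinical platform would flag for further review.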

“EYE-SYNC has given us a window to the brain. When a clinician gives the patient a set of instructions to follow, the patient will move his eyes accordingly. Using multiple built-in cameras, EYE-SYNC can detect the movement of the patient’s eyes. The captured data will then be compared to a set of well-defined eye signatures. With this new ability to compare findings, clinicians now have information that was once unavailable,” commented Anderson, Chief Clinical Officer of SyncThink. “It is indeed remarkable!”
