Devices that can detect and convey a sense of touch may have applications in telemedicine
UT Dallas researchers are extending the borders of virtual reality, going beyond virtual spaces in which people can see and hear each other to an environment that adds the sense of touch.
The technology would make it possible for physical therapists, for example, to work with patients in other locations. When a patient pushes down on a device, a doctor’s device in another location would also move down with the same force, as if the patient were physically pressing the doctor’s hand.
Professors in the Erik Jonsson School of Engineering and Computer Science are creating a multimedia system that uses multiple 3-D cameras to create avatars of humans in two different places, and then puts them in the same virtual space where they can interact.
In traditional telemedicine, a doctor and patient both appear on the same screen and are able to talk, but they are not in the same physical space.
“With in-home rehabilitation, doctors ask a patient if he or she has done the exercises, but the patient may not be doing them correctly,” said Dr. Balakrishnan “Prabha” Prabhakaran, professor of computer science at UT Dallas and a principal investigator of a $2.4 million project funded by the National Science Foundation to create the system.
“It is one thing for a patient to say he or she did the exercises, but it is another to watch them in action, feel the force exerted, be able to correct them on the spot and get an immediate response.”
Large amounts of data, such as tracked images or movements, could create significant lag or transmission delays. The grant funds the creation of the algorithms and software needed to transmit the data over the internet in real time. Experts in the Jonsson School are researching four major areas of the system.
Haptic Devices
Haptic devices are pieces of equipment with resistance motors that apply force, vibration or motion to the user to provide feedback. For example, touching a virtual stone with a haptic device would feel hard, while touching a virtual sponge would provide less resistance and feel more pliable.
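One common way such differences are rendered is with a simple spring model: the device pushes back with a force proportional to how far its tip has sunk into the virtual surface, and a stiffness constant determines how hard the object feels. A minimal sketch of the idea in Python (the stiffness values and names are illustrative, not taken from the UT Dallas system):

    def feedback_force(penetration_depth_m, stiffness_n_per_m):
        """Spring-style force rendering: push back with F = k * x once the
        device tip has penetrated the virtual surface by x metres."""
        if penetration_depth_m <= 0.0:
            return 0.0  # no contact, no resisting force
        return stiffness_n_per_m * penetration_depth_m

    # Illustrative stiffness values: a "stone" resists strongly, a "sponge" gives easily.
    STONE_STIFFNESS = 3000.0   # N/m
    SPONGE_STIFFNESS = 150.0   # N/m

    print(feedback_force(0.005, STONE_STIFFNESS))   # 15.0 N -- feels rigid
    print(feedback_force(0.005, SPONGE_STIFFNESS))  # 0.75 N -- feels pliable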
If both the doctor and the patient have haptic devices in their respective physical environments, the force applied by one can be sent to the other. A doctor could feel the strength of a patient’s muscle, for example.
“Each device sends lots of data and combining that information in real time is a big challenge,” Prabhakaran said.
Prabhakaran has expertise in multimedia systems and in the real-time use of haptic devices.
Teleoperation and Control
Anyone who has used a service such as Skype has likely experienced a delay in communication – suddenly words get lost or are slow to transmit. A similar effect could happen with haptic devices.
“We absolutely do not want instability,” Prabhakaran said.
Dr. Mark W. Spong, dean of the Jonsson School and holder of the Lars Magnus Ericsson Chair in Electrical Engineering and the Excellence in Education Chair, is a leading researcher in control and teleoperation, the operation of machines at a distance. He is developing techniques to eliminate instability in communicating the data from the haptic devices over the network.
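One well-known remedy from the teleoperation literature, related to Spong’s earlier work on delayed force-reflecting systems (this release does not specify the project’s actual algorithms), is to exchange “wave variables” instead of raw force and velocity, which keeps the communication channel passive, and hence stable, under a constant delay. A rough sketch of that transform, with an illustrative wave impedance value:

    import math

    B = 1.0  # wave impedance, a tuning parameter (illustrative value)

    def encode_wave(velocity, force, b=B):
        """Forward wave variable sent over the network: u = (b*v + F) / sqrt(2b)."""
        return (b * velocity + force) / math.sqrt(2.0 * b)

    def decode_wave(incoming_wave, local_force, b=B):
        """Recover the commanded velocity on the receiving side:
        v = (sqrt(2b) * u - F) / b, the inverse of the encoding above."""
        return (math.sqrt(2.0 * b) * incoming_wave - local_force) / b

    # Transmitting the encoded waves in each direction, rather than raw force
    # and velocity, prevents the delayed feedback loop between the two haptic
    # devices from injecting energy and becoming unstable.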
3-D Data Compression
To minimize the amount of data that needs to be exchanged, sophisticated algorithms need to be created. That’s where Dr. Xiaohu Guo, associate professor of computer science at UT Dallas and a project co-principal investigator, comes in. He’s an expert in computer graphics, animation and modeling.
Guo is refining techniques not only to transmit the data between haptic devices over the network more efficiently, but also to create 3-D visual images of the original movements in real time.
“We do not only want the person to be moving the device, we want them to have a visual feel of what the movement is causing,” Prabhakaran said.
Guo has had success transforming large amounts of data using what are known as spectral transformation techniques. These techniques rely on manifold harmonics to first transform 3-D images into points that represent the surface of an object. The data is then compressed into a smaller form that can be sent faster over networks.
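In rough outline, a manifold-harmonics approach treats the low-frequency eigenvectors of a Laplacian built from the surface connectivity as a spectral basis, projects the vertex coordinates onto that basis, and transmits only the leading coefficients. The toy sketch below, using dense NumPy matrices, illustrates the idea; it is not Guo’s actual implementation:

    import numpy as np

    def spectral_compress(vertices, laplacian, k):
        """Project n x 3 mesh vertex coordinates onto the k lowest-frequency
        eigenvectors of the mesh Laplacian and keep only those coefficients."""
        _, eigvecs = np.linalg.eigh(laplacian)   # eigenvectors sorted by eigenvalue
        basis = eigvecs[:, :k]                   # n x k low-frequency "harmonics"
        coeffs = basis.T @ vertices              # k x 3 spectral coefficients to transmit
        return coeffs, basis

    def spectral_reconstruct(coeffs, basis):
        """Rebuild an approximate surface from the truncated spectrum."""
        return basis @ coeffs                    # n x 3 smoothed reconstruction

    # Because the receiver can rebuild the same basis from the static mesh
    # connectivity, only the small k x 3 coefficient block needs to cross the
    # network each frame instead of every vertex position.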
Body Sensors
People using this platform would use body sensors similar to those built into smartphones, which can tell whether the user is holding the device in portrait or landscape orientation.
“If we put body sensors on a patient, then his or her movements can be tracked with high accuracy,” Prabhakaran said. “The advantage of the sensors is that the data generated is only a few bytes in size, so it is easily transmitted over the network.
“You need a 3-D model to provide visual perspective, but if you are dealing with a lousy network and cannot have consistent visual perspective, the body sensors could provide that information.”
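To give a sense of scale, a single orientation reading from an inertial sensor can be packed into a record of roughly a dozen bytes. The format below is purely illustrative, not the project’s actual protocol:

    import struct
    import time

    def pack_sensor_sample(sensor_id, qw, qx, qy, qz):
        """Pack one orientation reading (a unit quaternion) into 13 bytes:
        1-byte sensor id, 4-byte timestamp, four 16-bit quaternion components."""
        scale = 32767
        return struct.pack(
            "<BIhhhh",
            sensor_id,
            int(time.time()) & 0xFFFFFFFF,
            int(qw * scale), int(qx * scale), int(qy * scale), int(qz * scale),
        )

    sample = pack_sensor_sample(3, 0.707, 0.0, 0.707, 0.0)
    print(len(sample))  # 13 bytes -- small enough to send even over a poor network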
Dr. Roozbeh Jafari, assistant professor of electrical engineering at UT Dallas and a co-principal investigator of the project, is an expert in cyber-physical systems. He has built wearable computers for monitoring different aspects of human health, behavior and thought, and is developing sensors for this project.
Researchers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign are working on other aspects of the system, such as refining the overall user experience and coordination of the cameras used to visually capture the movements and interactions. Rehabilitation specialists at the Dallas VA Medical Center will test the system on patients.
While the main goal of the research, which is about halfway complete, is telemedicine, other applications include dance instruction or any type of education in which people need to be in the same space, Prabhakaran said.
Original release: https://www.eurekalert.org/pub_releases/2013-02/uota-udr020513.php