With an eye toward a not-so-distant future where some people spend most or all of their working hours in extended reality, researchers from Rice University, Baylor College of Medicine and Meta Reality Labs have found a hands-free way to deliver believable tactile experiences in virtual environments.
Users in virtual reality (VR) have typically needed hand-held or hand-worn devices like haptic controllers or gloves to experience sensations of touch. The new “multisensory pseudo-haptic” technology, described in an open-access study published online in Advanced Intelligent Systems, uses a combination of visual feedback from a VR headset and tactile sensations from a mechanical bracelet that squeezes and vibrates the wrist.
“Wearable technology designers want to deliver virtual experiences that are more realistic, and for haptics, they’ve largely tried to do that by recreating the forces we feel at our fingertips when we manipulate objects,” said study co-author Marcia O’Malley, Rice’s Thomas Michael Panos Family Professor in Mechanical Engineering. “That’s why today’s wearable haptic technologies are often bulky and encumber the hands.”
O’Malley said that’s a problem going forward because comfort will become increasingly important as people spend more time in virtual environments.
“For long-term wear, our team wanted to develop a new paradigm,” said O’Malley, who directs Rice’s Mechatronics and Haptic Interfaces Laboratory. “Providing believable haptic feedback at the wrist keeps the hands and fingers free, enabling ‘all-day’ wear, like the smart watches we are already accustomed to.”
Haptic refers to the sense of touch. It includes both tactile sensations conveyed through skin and kinesthetic sensations from muscles and tendons. Our brains use kinesthetic feedback to continually sense the relative positions and movements of our bodies without conscious effort. Pseudo-haptics are haptic illusions, simulated experiences that are created by exploiting how the brain receives, processes and responds to tactile and kinesthetic input.
“Pseudo-haptics aren’t new,” O’Malley said. “Visual and spatial illusions have been studied and used for more than 20 years. For example, as you move your hand, the brain has a kinesthetic sense of where it should be, and if your eye sees the hand in another place, your brain automatically takes note. By intentionally creating those discrepancies, it’s possible to create a haptic illusion that your brain interprets as, ‘My hand has run into an object.’
“What is most interesting about pseudo-haptics is that you can create these sensations without hardware encumbering the hands,” she said.
While designers of virtual environments have used pseudo-haptic illusions for years, the question driving the new research was: Can visually driven pseudo-haptic illusions be made to appear more realistic if they are reinforced with coordinated, hands-free tactile sensations at the wrist?
Evan Pezent, a former student of O’Malley’s and now a research scientist at Meta Reality Labs in Redmond, Washington, worked with O’Malley and colleagues to design and conduct experiments in which pseudo-haptic visual cues were augmented with coordinated tactile sensations from Tasbi, a mechanized bracelet Meta had previously invented.
Tasbi has a motorized cord that can tighten and squeeze the wrist, as well as a half-dozen small vibrating motors — the same components that deliver silent alerts on mobile phones — which are arrayed around the top, bottom and sides of the wrist. When and how much these vibrate and when and how tightly the bracelet squeezes can be coordinated, both with one another and with a user’s movements in virtual reality.
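The kind of coordination described above can be pictured as a simple mapping from the state of a virtual interaction to actuator commands. The sketch below is purely illustrative, assuming hypothetical command levels and mappings; the function name and the specific rules are not from the published system.

```python
def bracelet_commands(press_depth, max_depth, stiffness):
    """Map how far a virtual button is pressed to wrist cues.

    press_depth, max_depth: meters of button travel.
    stiffness: 0.0 (soft) to 1.0 (stiff).
    Returns (squeeze, vibration) command levels in [0, 1].
    """
    depth = min(max(press_depth, 0.0), max_depth)
    # Squeeze tension grows with displacement, scaled by stiffness,
    # standing in for the reaction force a fingertip would feel.
    squeeze = stiffness * (depth / max_depth)
    # A brief vibration marks the moment of first contact, like a click.
    vibration = 1.0 if 0.0 < depth < 0.1 * max_depth else 0.0
    return squeeze, vibration
```

In this toy version, a stiffer button produces a proportionally tighter squeeze for the same press depth, while the vibrotactors fire only in a narrow window at the start of the press to signal contact.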
In initial experiments, O’Malley and colleagues had users press four virtual buttons programmed to simulate varying degrees of stiffness, and volunteers were able to reliably distinguish the buttons’ stiffness. To further demonstrate the range of physical interactions the system could simulate, the team then incorporated it into nine other common types of virtual interactions, including pulling a switch, rotating a dial, and grasping and squeezing an object.
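The visual half of this stiffness illusion is often described with a control/display ratio: the rendered hand is displaced less than the real hand when pressing a stiff object, so the brain infers resistance. The scaling rule below is a minimal sketch of that idea, assuming a simple linear mapping; it is not the study’s actual rendering model.

```python
def rendered_press_depth(real_depth, stiffness):
    """Scale the visually rendered press depth by button stiffness.

    real_depth: how far the user's physical hand has moved (meters).
    stiffness: 0.0 (soft, visuals track the hand exactly) to
    1.0 (rigid, visuals barely move past the button surface).
    """
    # A stiffer button shows less visual travel for the same real motion,
    # creating the visual-kinesthetic discrepancy behind the illusion.
    return real_depth * (1.0 - stiffness)
```

Pairing a smaller visual displacement with a tighter squeeze and a contact vibration at the wrist is, in essence, the multisensory reinforcement the study evaluates.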
“Keeping the hands free while combining haptic feedback at the wrist with visual pseudo-haptics is an exciting new approach to designing compelling user experiences in VR,” O’Malley said. “Here we explored user perception of object stiffness, but Evan has demonstrated a wide range of haptic experiences that we can achieve with this approach, including bimanual interactions like shooting a bow and arrow, or perceiving an object’s mass and inertia.”
Study co-authors include Alix Macklin of Rice, Jeffrey Yau of Baylor and Nicholas Colonnese of Meta.
The research was funded by Meta Reality Labs Research, and Macklin’s work was supported by a National Science Foundation training grant (1828869).
- Peer-reviewed paper
“Multisensory pseudo-haptics for rendering manual interactions with virtual objects” | Advanced Intelligent Systems | DOI: 10.1002/aisy.202200303
Authors: Evan Pezent, Alix Macklin, Jeffrey M. Yau, Nicholas Colonnese and Marcia K. O’Malley
DESCRIPTION: Researchers from Rice University, Baylor College of Medicine and Meta Reality Labs have demonstrated a new hands-free approach to convey realistic haptic feedback in virtual reality. Their “multisensory pseudo-haptics” technology uses a combination of headset visuals and tactile feedback from a wrist bracelet to convey sensations of touch. (Video courtesy of MAHI Lab/Rice University)
- Image downloads
CAPTION: Tasbi, a mechatronic bracelet, can deliver hands-free tactile feedback in virtual reality (VR). Combining that feedback with pseudo-haptic visuals from a VR headset allowed researchers from Rice University, Baylor College of Medicine and Meta Reality Labs to create virtual haptic experiences that users found believable. (Image courtesy of MAHI Lab/Rice University)
CAPTION: Illustration of the components in the Tasbi tactile bracelet. (Image courtesy of MAHI Lab/Rice University)
CUTLINE: Marcia O’Malley is Rice University’s Thomas Michael Panos Family Professor in Mechanical Engineering, professor of computer science and of electrical and computer engineering, and director of Rice’s Mechatronics and Haptic Interfaces Laboratory. (Photo by Jeff Fitlow/Rice University)
- Related stories
Rice tapped to develop 3D-printed ‘smart helmets’ for the military – Nov. 10, 2021
Houston Methodist, Rice U. launch neuroprosthetic collaboration – April 6, 2021
Can you feel what I’m saying? – Oct. 29, 2018
Tactile feedback adds ‘muscle sense’ to prosthetic hand – May 30, 2017
Gentle vibe designed to give docs smoother moves – Sept. 6, 2016
- About Rice
Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 4,552 undergraduates and 3,998 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 1 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.