Where virtual reality is taking us next

January 24, 2022

Whether or not the metaverse ever materializes, virtual reality is poised to change the way we live and learn.

Graphic by Violet Dashi

Visit UM-Dearborn Assistant Professor Fred Feng’s lab and you’ll catch a glimpse of how virtual reality is already changing how researchers approach their work. Feng, who specializes in active, sustainable transportation and spends a lot of time thinking about how to make streets safer for cyclists and pedestrians, can now study such questions without always using actual roads. His cycling simulator setup does revolve around an actual bike. Its rear wheel is fixed to a stationary cycling trainer so research subjects can pedal without propelling themselves across the lab. While they do, they wear a VR headset that projects a virtual cycling experience zooming along at a speed commensurate with their pedaling. But there are some smaller, subtler features in Feng’s setup that are worth noting. A suspension platform simulates the side-to-side, forward-and-backward motion of normal cycling. He even recently added a fan that blows on you in proportion to your speed to simulate wind resistance, all to make the ride feel more like the real thing.

Feng’s effort to simulate the wind in your hair isn’t just for fun or flair. It’s essential to his research. “Typically, the reason we use virtual reality is because it provides an immersive environment compared with other kinds of simulations,” he explains. “So if we want to study how people would behave and react on a real road, it has to feel like a real road.” In fact, his focus on details that transcend the VR headset’s visuals and audio reveals a lot about the fundamental challenges of creating satisfying virtual reality experiences. Vision may feel primary in most people’s sensory experience of the world, which is likely why the initial wave of wearable VR technologies has focused on headsets. But it turns out all our senses quietly contribute to an experience that we would label as real. Indeed, many people start to get nauseated after wearing a VR headset for a little while. This so-called “cybersickness” results from the moving VR visual experience being at odds with what our inner ear is telling us is happening to our actual stationary bodies. It’s a problem, Feng says, that still doesn’t have a complete technological fix, though researchers think the solution may lie in simulating, and more importantly, coordinating multiple sensory experiences. Interestingly, this may be why the simple addition of a fan seems to reduce participants’ nausea in Feng’s cycling simulator.

This challenge of making VR multisensory is pushing the technology in some interesting new directions. Professor Bruce Maxim, who’s been working and playing with VR technology in some form or another for 15 years, says simulated touch is the next big frontier. The VR prototype glove that Facebook (now Meta) debuted recently, when it announced its jump into the metaverse, would allow users to feel virtual objects using lots of sensors and tiny puffs of air. But Maxim says this trend has been around for years. For example, at a technology convention a few years back, he ran across a flak jacket for military games that gave you a little punch in the chest if you were hit by a bullet. “That never made it commercially. Believe it or not, people didn’t like the feeling of getting shot,” Maxim says. “And now we’re seeing lickable screens to simulate taste. They’re even trying to bring Smell-O-Vision back,” he says, referring to a mid-20th-century technology that paired a movie with synchronized scents and met a quick end.

Maxim finds these technologies interesting, but he says it’s difficult to predict which combinations and applications might provide something approximating a “real” experience. The question he finds more worthwhile is whether users would find these experiences useful or entertaining. Because of VR’s current reliance on vision-based simulations, Maxim doesn’t see much difference between what you can do in a VR environment now and what you could do 15 years ago in a game like Second Life. A visionary game, Second Life allowed users to create an avatar and interact with other people logging in from around the world in simulated environments that were rendered on computer screens and navigated with keyboards, mice and other touch-based interfaces. Without VR goggles, Second Life was definitely a “flatter” experience. But it undoubtedly provided millions of users with a fulfilling alternative reality to live and play in. “Things are fun when they’re fun,” Maxim says. “And I’ve seen some incredible things, like rollercoaster rides and snowball fights with VR headsets. But how much replayability is there with that? Once you’ve been in a snowball fight for two or three hours, are you going to want to do that again?” In fact, if anything, Maxim says gaming is headed in the opposite direction, toward games that you can play with just your phone.

Interestingly, one area where Maxim and Feng say VR could be more of a near-future proposition is education. The pandemic has taught us a lot about remote learning, including its limitations for interactivity. Maxim, for example, imagines a Second Life-like virtual reality classroom, where instructors can walk around, observe students doing group work, and write on virtual whiteboards. Students could break out into groups, talk out problems, and do virtual labs. On the latter front, one of the exciting new developments is the ability to convert engineering designs from CAD software directly into Unity, a game engine widely used to build virtual reality environments. In fact, Feng says UM-Dearborn faculty are developing VR-related teaching materials for engineering students in the undergraduate and graduate human-centered design programs, where students could use VR headsets to interact with 3D virtual prototypes. “Another application I find interesting is using VR to visualize point cloud data,” Feng says. “So you could wear VR goggles and actually walk through the data, which would be a really interesting, immersive way to approach spatial data visualization.”
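
To make the point-cloud idea a bit more concrete, here is a minimal, purely illustrative sketch of how such a walkthrough could be wired up in Unity. It is not drawn from Feng’s actual course materials: the file name, the plain-text "x y z" format, and the PointCloudViewer class are assumptions made for the example, and a real dataset in a format like PLY or LAS would need a proper importer.

using System.Collections.Generic;
using UnityEngine;

// Illustrative only: renders a point cloud as a point-topology mesh so it can be
// explored in a VR-enabled Unity scene. Assumes a plain-text file with one
// "x y z" triple per line; real scan formats (.ply, .las) need a real importer.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class PointCloudViewer : MonoBehaviour
{
    // Hypothetical path; swap in whatever file your scan produces.
    public string pointFile = "Assets/Data/scan.xyz";

    void Start()
    {
        // Read every "x y z" line into a vertex list.
        var vertices = new List<Vector3>();
        foreach (string line in System.IO.File.ReadAllLines(pointFile))
        {
            string[] parts = line.Split((char[])null, System.StringSplitOptions.RemoveEmptyEntries);
            if (parts.Length < 3) continue;
            vertices.Add(new Vector3(
                float.Parse(parts[0]),
                float.Parse(parts[1]),
                float.Parse(parts[2])));
        }

        // Build a mesh whose primitives are points: one index per vertex.
        var mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        mesh.SetVertices(vertices);

        var indices = new int[vertices.Count];
        for (int i = 0; i < indices.Length; i++) indices[i] = i;
        mesh.SetIndices(indices, MeshTopology.Points, 0);

        GetComponent<MeshFilter>().mesh = mesh;
    }
}

Attached to an empty GameObject with a simple unlit material on its MeshRenderer, a script along these lines would let someone in a headset walk or teleport through the data much as Feng describes, though any production version would add coloring, level-of-detail handling, and a far more robust loader.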

Even without a fully fledged metaverse, VR’s potential to transform the way we live, work and learn looks interesting indeed.

###

Story by Lou Blouin. The VR cycling simulator in Feng's lab was developed by IMSE doctoral student Ayah Hamad as part of her master's thesis.