Here’s how researchers and developers are making VR experiences astonishingly real and useful in our daily lives.
When the web was still in its infancy, virtual reality — an immersive computer-simulated lifelike experience first developed during the era of mainframe computers — was a hot (if murky) topic of conversation.
VR’s 40-year journey from sci-fi to reality is finally gathering the momentum to hit the mainstream.
Investment in VR is reaching a fever pitch as its potential stretches from entertainment to the manufacturing, tourism and real estate industries. Intel’s acquisition of Replay Technologies, which delivers 360-degree VR replays to sports fans, is just one example behind Goldman Sachs Group’s estimate that VR will be an $80 billion market by 2025.
Tech giants like Facebook, Google and Samsung are also supporting cutting-edge research to help improve a user’s virtual experience. As researchers and developers at universities and in the private sector race to make VR meaningful for people’s lives, it’s a good time to explore several projects aimed at unlocking new VR experiences.
Coming out of the Cave
VR has helped change the way that many scientists do their job. Field work can be expensive, difficult and, in some cases, impossible.
For years, Brown University’s computer science department has had a VR room, known as “the CAVE” (for Cave Automatic Virtual Environment). The CAVE is an 8-foot cube with projected images on three walls and the floor. Using LCD glasses that provide stereo depth perception, tracking devices and positional audio, researchers could enter a world they wouldn’t otherwise be able to reach.
“There’s something about being in one of these 3D virtual reality rooms that’s super compelling. It’s like, ‘Whoa! Wow! Amazing!’” said David H. Laidlaw, professor of Computer Science at Brown University and head of the Visualization Research Lab and its new virtual environment, known as the YURT.
The University’s astronomy department used the CAVE to explore the surface of Mars. Archaeologists were able to relive past digs. The CAVE even provided a close-up look at the brain’s wiring and a realistic proxy for fetal surgery.
“We’ve had people looking at 3D fluid flow, like blood flow through coronary arteries,” explained Laidlaw. “We’ve had researchers looking at airflow over bat wings.”
While the CAVE’s three-wall display and projected floor provided a realistic environment for its users, Laidlaw wasn’t satisfied. He set about improving on the experience.
“The main thing we tried to do was get to the level of human perception in as many of the factors of the virtual reality experience as we could,” he said.
The result is Brown’s new virtual environment, the YURT (Yurt Ultimate Reality Theater), whose rounded walls and conical roof eliminate corners and flat walls, which interrupt the VR experience.
The YURT uses 69 projectors, each delivering about 2 million pixels. Even accounting for the overlap between neighboring projected images, the room’s overall resolution is still well over 100 million pixels.
“The resolution of the screens in the YURT now is about the same resolution as your eye,” he said, noting that it matches the level of brightness and contrast humans can distinguish. “It’s like a retina display. You don’t really need to go beyond that because humans can’t detect it. We are completely saturating the human perceptual ability.”
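The pixel budget described above can be checked with quick arithmetic. A minimal sketch, assuming the figures quoted in this article (69 projectors at roughly 2 megapixels each) and an illustrative, invented overlap factor for the blended edges:

```python
# Rough pixel-budget arithmetic for a tiled projection room like the YURT.
# Figures from the article: 69 projectors, ~2 million pixels each.
projectors = 69
pixels_each = 2_000_000                      # ~2 MP (e.g. 1920x1080 is about 2.07 MP)
raw_total = projectors * pixels_each         # raw pixel count across all projectors

# Neighboring projector images overlap and are blended at the seams, so the
# effective resolution is lower than the raw total. The 25% loss here is an
# assumed, illustrative figure, not a published specification.
overlap_loss = 0.25
effective = raw_total * (1 - overlap_loss)

print(f"raw: {raw_total:,}  effective: {effective:,.0f}")
```

Under these assumptions the raw total is 138 million pixels, and the effective count still clears the "well over 100 million" mark the article cites.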
In addition to the limits imposed by what humans can see, the YURT is also constrained by hardware. Right now, especially for video, the room can display far more detail than most cameras can capture.
“Even cameras with multiple lenses, capable of capturing the whole sphere, are still only doing it from the perspective of one viewer. For VR, we need the whole 3D world to be defined,” said Laidlaw.
“If you can imagine watching a real live football game in the YURT and being able to walk around on the field, between the players and dodge them, we don’t have that yet. It would be cool, though — especially because you wouldn’t have to worry about them hitting you. You couldn’t feel it.”
By the time image capture is able to provide the view of a football game that Laidlaw described, it’s possible users might be able to feel the players running into them. Nullspace, a technology company founded by three University of Rochester undergraduates, is perfecting a haptic suit that allows VR users to get tactile feedback from their virtual worlds.
Lucian Copeland and Jordan Brooks were seniors and Morgan Sinko was a junior when the trio first came up with the concept for the Nullspace suit.
“The original idea came from Morgan watching his brother trying to play a dodgeball game on Microsoft Kinect,” Brooks explained. “And he watched his brother get increasingly frustrated with it because he couldn’t interact physically with the balls in the game.”
The group set out to design a tool that would incorporate a user’s sense of touch into the virtual world. The result was a jacket and gloves that include 32 independently activated vibration pads, allowing users to feel sensations in their fingers, hands, arms, chest, abdomen and shoulders.
Nullspace says that the suit allows users to feel close to 120 different sensations. This haptic feedback, Brooks explained, is similar to a vibrating cell phone.
“Only it’s a little more powerful, so you can feel it through your clothing,” he said.
Eventually, the suit could provide a wide range of other sensations, including the ability to model the weight of a virtual object. For now, the suit is based on vibration feedback.
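As a rough illustration of how per-pad vibration feedback like this might be driven, here is a minimal sketch. The pad names, positions, and intensity falloff are invented for illustration; this is not NullSpace’s actual layout or API:

```python
import math
from dataclasses import dataclass

@dataclass
class Pad:
    name: str
    x: float  # pad position on the torso, in arbitrary coordinates (invented)
    y: float

# A toy subset of the suit's 32 vibration pads (positions are illustrative).
PADS = [
    Pad("chest_left", -0.2, 0.5), Pad("chest_right", 0.2, 0.5),
    Pad("abdomen", 0.0, 0.0), Pad("shoulder_left", -0.4, 0.8),
]

def impact_intensities(hit_x, hit_y, strength=1.0, falloff=1.5):
    """Map a virtual impact point to a per-pad vibration level in [0, 1].

    Pads nearer the impact vibrate harder; intensity decays with distance,
    mimicking how a localized hit would feel spread across nearby pads.
    """
    levels = {}
    for pad in PADS:
        dist = math.hypot(pad.x - hit_x, pad.y - hit_y)
        levels[pad.name] = round(min(1.0, strength / (1.0 + falloff * dist)), 2)
    return levels

# A virtual dodgeball striking near the left side of the chest:
print(impact_intensities(-0.2, 0.45))
```

In a real suit, each computed level would be sent to a motor driver; the distance-falloff mapping shown here is just one simple way to make a point impact feel localized across discrete pads.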
The initial reaction to the suit proved that there’s a lot of interest in being able to interact in virtual environments.
“We took it to several Maker Faires and competitions, and everywhere we went, there was tremendous excitement,” Brooks said. “At our first Cornell Cup, there was a line of people wanting to try the suit that wrapped all the way around the convention hall.”
The applications for gaming are obvious, but there are other areas where the Nullspace suit can pay dividends.
“We’ve talked to one or two people at NASA about training astronauts for space walks,” Brooks said. “We’ve talked to people at the VA about treating veterans with PTSD through immersion therapy. The suit lets you get in and immerse yourself in the experience, to let you relive it and process it in a healthy way.”
Brooks said the technology could even be used to control robots remotely.
“Our goal is to make haptic feedback accessible to everybody in virtual reality,” he added. “We want to give everyone the sense of actually being present in virtual reality, instead of just being a ghost.”
(Don’t want to be) All by Myself
Virtual reality can be a lonely business. Conventional wisdom holds that once a user puts on the headset, the physical world and the user’s real-world companions drop away.
The computer science department at New York University isn’t looking to advance VR technology itself, but rather to explore what can be done with it once it becomes widespread. In essence, the researchers want to create the first VR social network by allowing multiple users to share the same experience.
The ongoing project, which combines social interaction with VR, is called the Holojam. It allows up to four users to share a 32- to 34-foot virtual space without being tethered by wires, as other multi-user systems require.
NYU Computer Science professor Ken Perlin, director of the University’s Games for Learning Institute, told UploadVR that he’s most interested in the impact the technology will have on future societies, when everyone uses VR as part of their daily lives.
From a virtual experience that rivals the capabilities of the human eye to a fully tactile immersion in the world, VR technology is becoming more powerful by the day. But as intense an experience as the technology can provide, it needs to be shareable before people will fully embrace it. This research will help ensure the “next big thing” is finally here to stay.