The Quest to Create Utterly Normal Virtual Reality Experiences
In his new book, Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do (W.W. Norton & Co., 2018), Jeremy Bailenson argues that virtual reality (VR) has the potential to transform work — from the way in which we train for difficult assignments to how we communicate with others.
As founding director of Stanford University’s Virtual Human Interaction Lab, Bailenson has been at the forefront of developing immersive VR simulations and bringing them from the academic world to the real world. Bailenson, who is also the Thomas More Storke Professor in the Department of Communication at Stanford, is a cofounder of STRIVR, an immersive technology training company based in Menlo Park, California, whose clients include corporations such as Walmart, HTC, and Microsoft, and professional sports organizations such as the National Football League.
MIT Sloan Management Review spoke with Bailenson about how VR is being used for training, the challenge of creating intimacy via VR, and why meetings conducted via avatars could be more effective than those that take place via Skype. Freelance journalist Frieda Klotz conducted the interview, and what follows is an edited and condensed version of their conversation.
MIT Sloan Management Review: Why is now such a key time for VR?
Bailenson: I’ve been doing this for 20 years, and up until very recently, the biggest VR company employed maybe 20 engineers. What you’re seeing now is that the tech giants are getting involved — Samsung and Facebook, Sony and Google — so you’ve literally got hundreds of engineers working on identifying problems and coming up with solutions.
At the same time, the technology is becoming cheaper, and the quality has drastically improved. Now we’ve got VR headsets, or head-mounted displays, that cost hundreds of dollars instead of tens of thousands. All the tech parameters — things like low latency, meaning minimal delays; high update rate, meaning how quickly the image refreshes; and accurate tracking, meaning how precisely VR can measure body movements, along with a wide field of view and high image resolution — are drastically improving. So this is a very special time simply due to the availability and quality of the hardware.
How are companies using these technologies?
Bailenson: Right now, it’s clearly training. Virtual reality comes from the flight simulator: In the late 1920s, people wanted to learn how to fly, but they didn’t want to do that from a training manual. Flying is dangerous, and making mistakes is costly when you’re in the air. Hence you have a flight simulator.
The killer app for corporations has been “flight simulation” of the job — learning how to talk to people, how to operate dangerous equipment, how to do all these things in a virtual scenario where you’re encouraged to try and fail and get feedback when you make mistakes. Virtual reality is complete mental transportation: Your senses are entirely replaced by computer-generated images so that you don’t see or hear the real world. You’ve gone, say, to the bottom of the ocean; you feel like you’re in a completely different place. It’s the kind of thing that you do in short bursts, for 10 or 15 minutes. You have an experience, and you learn from it, and then you move on.
Last year, 150,000 Walmart employees trained in virtual reality, where they practiced their strategy for the holiday rush. They learned about safety violations and scanning the room to find things they should be attending to. A number of corporations are using VR for soft-skill training — not just teaching workers to operate tools on a factory line, but to do interviews, to be able to hire and fire, to have those difficult conversations.
Do you see VR changing the way we work and communicate professionally?
Bailenson: My dream is to reduce the need to commute. The idea that we all get into metal boxes in a line and go to our offices, sit at our desks, pound on a computer for eight hours, and then get into the box again and drive home — it’s absurd.
But the problem with the videoconferencing tools that many companies use at the moment is that they aren’t good enough to get the magic handshake effect — that special social presence you get when you’re around other people. What I’ve been working on for 20 years are VR systems that give you this connection in avatar-based communication — things like eye contact and posture and facial expressions — to create the intimacy and nonverbal behavior that you get face to face.
Obviously, some meetings are so important you have to have them, and then there’s the watercooler effect, the role of informal conversations. But if we created conditions in which employees commuted two days a week instead of five, we would be doing a Herculean service for the world.
So how can that sense of in-person intimacy be achieved in virtual reality?
Bailenson: Over the past 17 or 18 years, I’ve conducted 40 or 50 experiments, developing ways to reproduce nonverbal behavior in a VR environment and then measuring its psychological impact. How much should we focus on eye contact as opposed to finger movements? What features are needed to produce what we call “interactional synchrony,” which is a psychological term for this very tight coupling of nonverbal behavior between people?
There’s no way we’re going to perfect all the nuance that you have face to face, so the question is, what do we prioritize? Of course, eye contact is critically important, and within facial expression, the mouth and smiling have a bigger effect than other parts of the face. It’s kind of a brute-force approach to figure out what’s the best way to produce the sense of intimacy.
If I look at a video screen, I can still see a speaker’s eyes and mouth. But you’re saying good communication needs much more than that?
Bailenson: Yes. For a one-on-one conversation, you might be looking at an image of your friend on the screen, but then you’re not looking at the camera. The camera tends to sit at the top or bottom of the screen, so your friend never gets direct eye contact: She sees you looking down or up, because you’re looking at her image and not directly at the camera. So even in a one-on-one, there’s often not direct eye contact.
It’s once you scale up to multiparty conversations that these sorts of videoconferences really fail hopelessly. Imagine a typical conversation at work, where it’s two people at one table and a third person sitting on the other side of the room. Then think about how important that sidelong glance is when one person looks to another, what that signals. So much work in psychology shows how critically important these micro-movements are. There’s a huge literature on it, and yet they get smothered when these multiparty conversations are taking place [via video].
But with VR, when everybody’s wearing their goggles at an individual location, what they’re seeing is a room filled with avatars. The physics of what they’re looking at and where they’re standing is preserved properly, because in VR you’re tracking everybody’s movements and you’re all drawing them into one unified scene. And you can actually create things like eye contact.
I know it sounds counterintuitive, but the level of realism that can be preserved when expressions are being drawn on avatars is much better than when you’re using multiple cameras to film people in different locations. And VR can go beyond that. My lab has developed data visualizations of speakers’ heartbeats. When people see into their conversation partner’s physiology like this, it can make interactions even more compelling than those conducted in real life.
What about augmented reality?
Bailenson: Augmented reality is when you’re wearing, say, special glasses that allow light and sound to come in from the real world, but you’re putting a digital layer over it, which provides access to extra information. For example, if you’re in a room with a group of people, they could have name tags floating over their heads so you know who is who.
Augmented reality is still searching for its killer application. I don’t think we’ve seen it yet.
In your book, you say that VR will be worth $60 billion in the next decade. What does the business community need to know in terms of the risks involved in VR?
Bailenson: One of the more practical and mundane considerations is safety. VR takes users out of their environment, so you need to make sure the space in which they’re using it is safe. Cactus plants, plate glass tables, running fans — all those should be removed.
But the biggest challenge in introducing VR is to use it correctly. Often, companies gratuitously throw VR at situations in which a normal computer would work just fine. Really good use cases start with a problem that needs to be solved, where VR genuinely fits that problem.
I’m at Stanford, and many of the students here come up with ideas about forming startups. When people pitch ideas about what we should use VR for, typically, 19 out of 20 of them are ideas where I’ll say, “You know what, that’s a great idea, but video works really well for that,” or “The written word works just fine for that, and it doesn’t need to be in VR.” The challenge for companies is to leverage VR for things that are uniquely amazing in VR and that wouldn’t be better served by [two-dimensional] video.
If you think about the holiday rush for Walmart — there are people everywhere, and it’s a super intense and arousing experience. When you’re in VR, your palms literally start sweating, and you have to look around and use the entire space. It’s a use case that only works in VR. Those are actually fairly rare. The key question is: Is there a problem that you are having trouble solving?