

Virtual Human Interaction Lab: High-end Research on Presence and Immersion in Virtual Reality

Situated conveniently near Stanford University’s historic Main Quad, the Virtual Human Interaction Lab (VHIL) is a remarkable new research facility built by Associate Professor Jeremy Bailenson and his research team, where they conduct experiments on the dynamics of user experience and social interaction in immersive virtual environments (VEs). Last week, we had the opportunity to visit this exciting place under the guidance of Cody Karutz, manager of the lab, who explained the lab’s technological infrastructure and its perspectives on social research. Although Virtual Reality (VR) seems to be at the center of attention, defining VHIL only as a ‘VR lab’ would not be fair, as it houses many different user interface devices and immersive visualization technologies, such as a Cave Automatic Virtual Environment (CAVE), optical motion tracking devices, and audio-visual/haptic feedback systems.

The leader of the VHIL team, Professor Jeremy Bailenson, recently co-authored a book entitled Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution with Jim Blascovich, Distinguished Professor of Psychological and Brain Sciences. In their new book, Professors Blascovich and Bailenson present a comprehensive analysis of the (history of) technologies surrounding the VR paradigm, with their social, psychological, and cognitive aspects, well supported by empirical findings from years of research experience. After visiting Professor Bailenson’s class ‘Virtual People’ at Stanford University and having the chance to attend a guest lecture by Professor Blascovich, a guided tour of VHIL was the next logical step in grasping this magical realm. Indeed, strolling around the virtual VHIL and lowering myself into the virtual pit, while wearing a Head-Mounted Display (HMD) with ambient sound so powerful I could physically feel it under my feet, was exactly the kind of experience I was looking for!

For more detailed information on current research projects at VHIL, click here.

In the entrance hall of VHIL, a Microsoft Kinect setup welcomes visitors, tracking every movement we make in the 3D space that is Karutz’s office and visualizing it on a stereoscopic display in 3D, without the need for glasses. As Karutz explained during the tour, the lab is still under construction and there are more comprehensive technologies yet to be installed. Although the VHIL team has a large collection of, and years of experience with, various forms of user tracking technology, Karutz stated that VR is a rapidly changing field, and the industry is pushing towards more comprehensive and accessible motion tracking devices such as Microsoft’s Kinect (or the Wii Remote, PlayStation Move, etc.). Nonetheless, VHIL’s main focus is on experiences of presence and copresence in immersive VEs, so the tour continues with the CAVE setting and the VR rooms.

Everything at the VR lab is monitored and manipulated from an elaborate control room. There are two separate rooms for testing immersive user experiences, tracking movement in physical space, and translating it into virtual space. The smaller tracking room has eight cameras for optical tracking markers, while the larger room has a more accurate tracking infrastructure. Although the Kinect and other motion tracking devices are relatively affordable for home use, immersive audio-visual systems and HMDs are still very expensive for the average user. Karutz attributes this to industry priorities and the lack of development in end-user applications, which limit the design and production of affordable, mass-produced devices. Although current HMDs are tethered by cables that monitor the location and movement of users, VHIL will demo its first wireless HMD this summer.

VHIL also contains a CAVE setting. In the first room, the two-walled CAVE renders 3D images on 70-inch (diagonal) 3D displays. The CAVE enables users to experience immersion and interactivity in VEs without the need for HMDs, although its motion sensitivity is lower than that of the other systems.

In the larger multisensory tracking room, VHIL offers more accurate tracking and more immersive VR experiences, provided by a new HMD with a wider and more realistic field of view, tactile feedback from the floor, which shakes with sound and movement in the virtual space, and a 23-channel surround sound system that pushes the boundaries of realism in environmental simulation. Another very promising development at VHIL’s new premises is that the team now has two tracking rooms, which means they will be able to study multiple participants simultaneously, in their actual social contexts.

During Cody Karutz’s guided tour, we had the chance to experience the VR environment by falling down trap pits in the virtual VHIL room and chopping virtual trees. Using a customized haptic user interface device originally designed for first-person shooters, VHIL’s VR environment also supports the study of interaction with new tools that act on virtual objects, such as cutting trees in a virtual forest, to examine how experiences in immersive VEs might change real-world behavior. Bailenson and his research team use these interactions and their real-world consequences as sources of data to observe how learning through immersive VE experiences could change people’s recycling and consumption behaviors. Their research also casts light on other aspects and uses of these technologies, such as avatar-based (transformed) social interaction, and collaborative work on clinical applications or on building virtual spaces together.

The VHIL team claims that our future lives will incorporate more technologies of immersive virtual social interaction, which is a reasonable assumption given many recent use cases. Jeremy Bailenson and his colleagues are trying to understand this emerging world of VR and how it will affect societies. Their research on user behavior and experience in immersive VEs is inspirational for other forms of online worlds in which participants collaboratively design and build their own communication environments, and it is a successful model for many research establishments, such as RUC’s Experience Lab.




4 Responses


  1. Ron T Blechner says

    As soon as I see the wired, bulky VR headset, I sigh, because it’s like the industry has not learned anything in the 20 years since VR was first around. People don’t want to use bulky gear.

    The same setup can be achieved with a sub-$200 set of HMD glasses and a Gyro-mouse. I was personally doing this with colleagues in 2006, and again last year.

    So I ask: What’s “remarkable” or “new” about this facility?
    I also am curious – the author writes extensively about the setup but doesn’t specify at all *what kind* of experiments will be done. What are the folks at this facility trying to examine that hasn’t already been examined and published before? Simply sticking an HMD on someone and walking around a virtual environment is 20-year-old research. I’m sorry if that sounds harsh, but amnesia and a lack of historical perspective are rampant in the virtual reality / virtual worlds industry.

  2. Ates Gursimsek says

    Dear Ron Blechner. It is always good to be critical, and I appreciate your comments on my experience at VHIL. You may be right in your criticism of how we, as young researchers, suffer from a historical amnesia (or maybe I do, personally). I should say that this “amnesia” is partly caused by “curiosity and enthusiasm”, as first-hand experience with such impressive multimodal settings is not always easy to obtain. My inspiration partly resulted from my own interest, as I had been looking forward to visiting Professor Bailenson’s lab and learning about their new research facility for quite some time.

    As I’ve written, and included in the video, what was particularly interesting about VHIL (to us) was the high-end immersive environment, which provides not only audio-visual feedback but also haptic responses from the physical environment itself (and from other interface devices). The level of immersiveness in our experience was definitely not something you could compare with $200 VR goggles (and they are still implementing wireless HMDs). As you know, the experience of presence in immersive VR systems is closely related to how the environment is modeled for interaction and how feedback is delivered to the user across a range of senses, which is clearly a matter of the design of the setting. Another significant affordance of the lab, as I’ve also mentioned above, is the possibility of observing multiple participants in various interaction contexts, gathering a multitude of visualization, interaction, and tracking technologies in one research environment. I believe Bailenson and his colleagues did a successful job of designing this place specifically for the needs of their particular research field; it also serves as a fine teaching space (maybe I should’ve mentioned that). As Karutz stated, the lab is still in the making, and I will try to follow their work to see what’s “new” in their new lab space.

    However, what inspired me about the lab was not only the technology itself, but VHIL’s positive approach in opening its doors to fellow researchers, and their welcoming attitude in sharing their experiences and settings. Now that I read your comment, I think I could have included more information on their research projects, instead of mostly providing links to their actual pages (which include their research objectives and results, as well as other reviews of the projects). This was probably because I was trying to tell colleagues from a variety of fields about the lab’s design and observation settings, rather than to give an overview of the VR research field. In fact, I specifically saved some of the reviews of VHIL’s research for another blog post, which I was planning to write after finishing the book. Meanwhile, it is always good to discuss and learn. Thank you for initiating the discussion.

  3. Ron T Blechner says

    Hello Ates, thanks for replying. A few thoughts:
    “as first-hand experience with such impressive multimodal settings is not always easy to obtain.”

    I’m skeptical about what the extra gear – haptic feedback, eight extra cameras, etc. – actually does to improve immersiveness. I’ve read a number of studies that illustrate that people are sufficiently immersed even without HMDs. But even haptic feedback is not a new research field.

    I also noticed that the researcher in the video asked a lot of leading questions. Where’s the control group? Where’s the quantitative analysis? This qualitative, “isn’t this cool?” research is 20 years old, and again, it’s my belief that it’s on researchers to discover the kind of work that has already been done, so that they can both advance the field and give credit to those who came before them.

    “Now that I read your comment, I think I may have included more information on their research projects, instead of mostly providing links to their actual pages”

    I don’t want to poo-poo this research – quite the contrary. I challenge you and I challenge researchers so that the research will move further forward. But your opening to this blog post was about how this was “remarkable” and “new” – something you’ve yet to describe.

    If you want to impress me, or other researchers already in the VR/VW space, putting it in the context of what’s really new about this is key. Are there new conclusions the researchers came to based on quantitative study of participants? Were previous suggestions from earlier researchers able to be tested by newer, cheaper technology? Are there controlled experiments illustrating the value of haptic devices as part of an immersive experience? These are a few of the things that would be valuable to the field.

    Suggested reading:
    http://slfuturesalon.blogs.com/second_life_future_salon/2006/01/phillip_torrone.html

  4. ates says

    Dear Ron,

    Thank you for the interesting read; I’ll be looking more into innovative new designs such as this one. First, I need to make some clarifications: as you might have guessed by now, I am not associated with VHIL and their research projects, and the video I posted was not a research video; it was merely a demo that they provided during the tour. It’s me wearing the HMD, and my wife behind the camera (only the last part features another visitor). None of the questions were asked to obtain data, only to learn more about the setting. Also, although I recognize the importance of knowing this field, I am not a VR researcher (I study collaborative design in VWs), which is why I hesitated to discuss their findings in a blog post based on our reflections from a guided tour. Again, I may have overdone the tone of my writing, as we were ourselves enthusiastic about the visit.

    But I always enjoy a challenge! So, here is some of the research that made me interested in visiting the new VHIL lab:

    The reason I find this team’s research interesting is not that they come up with cheaper or more useful devices and design ideas, but that they focus on the social implications of being present in an immersive VE. The VR project we discussed in detail during the tour was on environmental consciousness and how long-term, playful experiences in VR could change real-life behavior. The researchers asked their participants to try out the tree-chopping experience, after which they followed up with the participants and found significant changes in their consumption and waste behavior. This project is supported by the Dept. of Energy, which seems to be planning to develop educational simulations and immersive training environments.

    Other projects conducted by Bailenson and his colleagues include quantitative analyses and on-site observations of ‘transformed social interactions’ in VR, where they investigate how the design of user-tracking VR settings for new learning environments or collaborative building spaces should draw on knowledge of social and non-linguistic codes. In their previous VR lab, the VHIL crew investigated the limits of visual and behavioral realism, as well as characteristics of virtual agents/avatars, such as plasticity or attractiveness, that could affect cognition. I have tried to compile some of their papers that represent these research perspectives, although I do not think any of these studies were conducted in the new lab. You may have already read some of them, but I thought it would be nice to share some related readings for fellow readers of this discussion. For more of the VHIL team’s research studies, you can visit: http://vhil.stanford.edu/pubs/.

    - Bailenson, J.N., Beall, A.C., Blascovich, J., Loomis, J., and Turk, M. (2004). Transformed social interaction: Decoupling representation from behavior and form in collaborative virtual environments, Presence: Teleoperators and Virtual Environments, 13(4), 428-441. (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.3.9760&rep=rep1&type=pdf)

    - Bailenson, J.N., Beall, A.C., Blascovich, J., Loomis, J., and Turk, M. (2005). Transformed social interaction, augmented gaze, and social influence in immersive virtual environments, Human Communication Research, 31, 511-537. (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.3896&rep=rep1&type=pdf)

    - Bailenson, J.N. (2006). Transformed social interaction in collaborative virtual environments, in P. Messaris and L. Humphreys (Eds.), Digital Media: Transformations in Human Communication (pp. 255-264). New York: Peter Lang. (http://vhil.stanford.edu/pubs/2006/bailenson-social-interaction.pdf)

    - Bailenson, J.N., Yee, N., Merget, D., and Schroeder, R. (2006). The effect of behavioral realism and form realism of real-time avatar faces on verbal disclosure, nonverbal disclosure, emotion recognition, and copresence in dyadic interaction, Presence: Teleoperators and Virtual Environments, 15(4), 359-372. (http://people.oii.ox.ac.uk/schroeder/wp-content/uploads/2007/01/face%20disclosure%20presence.pdf)

    - Bailenson, J.N., and Beall, A.C. (2006). Transformed social interaction: Exploring the digital plasticity of avatars, in R. Schroeder and A.-S. Axelson (Eds.), Avatars at Work and Play: Collaboration and Interaction in Shared Virtual Environments (pp. 1-16), London: Springer. (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.4515&rep=rep1&type=pdf)

    - Yee, N., Bailenson, J.N., Urbanek, M., Chang, F., and Merget, D. (2007). The Unbearable Likeness of Being Digital: The Persistence of Nonverbal Social Norms in Online Virtual Environments, CyberPsychology & Behavior, 10(1), 115-121. (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.119.9840&rep=rep1&type=pdf)

    - Yee, N., Bailenson, J.N., and Rickertsen, K. (2007). A meta-analysis of the impact of the inclusion and realism of human-like faces on user experiences in interfaces, Proceedings of CHI 2007, April 28-May 3, San Jose, CA, pp. 1-10. (http://www.takebay.net/data/chi07/docs/p1.pdf)

