This post is part of an ongoing monthly series focused on our current efforts in the Museum’s Science Bulletins team to create and test prototypes of Hall-based digital interactions in AR and VR, built from our scientists’ digital science data, and to share some of the lessons we learn along the way.
After iterating our AR Shark in October (more here), and learning at least one approach to turning scientific data into a digital interactive, we shifted focus to a different set of biological data. We created a digital interactive based on a scan of a creature that, compared with a cuddly shark, was so terrifying it caused the occasional visitor to scream and rip off their virtual reality gear.
By November we had started to fall into an effective rapid prototyping routine: pick a set of digitized data, work with a scientist and our crack team to bring it into an AR or VR device, share the interactive prototype in our Halls with general visitors, and evaluate their response (and, if desired, iterate again and again until we learn something useful and are ready to start from scratch with something new). Friday mornings are set aside for public sessions; Friday afternoons for processing what we learned that week.
In November we chose to focus on insects and to explore what we might learn in virtual reality. Using the HTC Vive, we created a virtual weevil, placed within a virtual orange grove, that offers visitors the opportunity to explore insect respiration (weevils have no lungs!) by clicking on the critter and taking it apart.
As this was just a 2-3 minute long prototype, albeit an effectively engaging and immersive one, we didn’t fill in the details for every transition. For example, the visitor begins at the human scale, looking around the grove, spotting a tiny weevil on a leaf. However, upon clicking the leaf, the visitor is shrunk down to the insect scale, without much warning or explanation. As a result, many a poor visitor, looking away from the weevil during the transition and thus unaware they’d changed scales, would turn their head back to find a monster looming overhead, as if poised to attack and devour them. Not our intention, for sure, but their shrieks of delight certainly attested to the deep immersion one can achieve within even a roughly sketched VR experience.
Our experience with the VR Weevil led me to reflect upon what makes VR so engaging and how that might relate to its learning outcomes. My current suspicion is that, with VR, engagement comes first, as the foundation, creating the context for learning. So what makes VR so engaging? VR engagement comes through offering visitors an experience that is both (a) embodied and (b) immersive. This is about both place – a location other than one’s own – and self – that you yourself are the subject within that space.
(a) Embodied means you feel like you’re inside the experience, using aspects of your body and your senses, as opposed to experiencing something from the outside. That’s the difference between watching a movie and being in the movie; playing with a game controller versus being the game controller. Users of the VR Weevil often used language that described its embodied nature.
(b) Immersion happens when what you currently see, hear and/or touch is replaced with sights, sounds, and objects that are someplace else (whether real or virtual).
- With immersive views, that can mean a 360° field of view, where to see everything you need to turn your head or walk around.
- With immersive audio, that could mean sound effects, music, and 360° audio (where sound is locked to a location in virtual space).
- With interactivity, that means more than just touch. It means being able to affect the world around you, perhaps with a gaze, or the sound of your voice.
When there is engagement, when the visitor feels like “they are there” (with equal emphasis on both “they” and “there”), all sorts of learning can happen. When visitors felt engaged by their ability to walk around, view, and interact with a virtual weevil, their curiosity was piqued, and learning could begin. And while 2-3 minutes is a rather short period of time, the vast majority of visitors left the experience able to report something new they had learned about the vast differences between human and insect respiration.
Some (and far from all) of the key lessons we took away from this round of prototyping:
- IMMERSIVE encounters support engaged learning. Few if any of the students understood in advance that they were about to encounter a giant weevil. But once they did, the immersive encounter generated a “need to know,” and most picked up the educational content provided by the experience (about how insects breathe and the structure of the weevil’s wings).
- EMBODIED interactions increase engagement. The ability to project one’s body into an experience – in the case of the VR Weevil, through virtual hands and the ability to walk around the 360° scene – increased visitors’ connection with the virtual world and its content.
- BRIEF engagements can still be satisfying when they are intense. The VR Weevil was designed to last no more than three minutes. Yet even after waiting up to 30 minutes for a turn, children responded as if it had all been worth it. We surmise the intensity of the immersion contributed to their satisfaction.
- IMMERSION can be isolating. As one child reported, “Most people playing a video game are outside. Here you are inside the game.” Another child agreed, reporting, “I was in another world, not just the museum.” The immersive nature of the experience was exciting for these children, but if we aim to deepen their connection with the Museum’s spaces and content, and maintain their connection with the people in their social group, this is a challenge we must confront moving forward.
Come back soon to see some of what we did, and learned, in December.