This post is part of an ongoing monthly series focused on the current efforts of the Museum’s Science Bulletins team to create and test prototypes of Hall-based AR and VR interactions built from our scientists’ digital science data, and to share some of the lessons we learn along the way.
This past year we’ve been exploring how our eyes, and sometimes our ears, can be invited to play Let’s Pretend: imagine you are seeing a CT scan of a shark in front of you, or imagine you are hearing the HVAC in the Big Bone Room. Our hands, so far, have been left out of this virtual party: imagine you can touch the shark, or pick up that dinosaur fossil. We’ve used HTC Vive’s controllers to manipulate things – imagine you can click here to make the weevil turn transparent – but we haven’t had a tool that invites visitors to (pretend to) touch augmented objects with their own hands.
That is, until now.
One of the products coming out of this year’s Consumer Electronics Show was the Holocube, from Merge:
We knew its developers (a chunk of the team that worked with us on such augmented reality experiences as MicroRangers now works at Merge). When we received our developer version, and access to the SDK, we were excited to learn what we might find if we brought the same digital specimens we’ve been porting into such platforms as Hololens, Tango, and Vive into something designed for a visitor to hold in her hand.
The cube is designed to work with their Merge VR viewer – a snazzy, museum-friendly Google Cardboard-style device – but can also work with an unadorned mobile device. With a nice weight in your hand, and slightly squishy like the free swag you often find at a conference, the cube offers a visual cue to the mobile device’s camera. The app decides what to do with what it sees. In other words, the Holocube is dumb – just a collection of prettified QR codes – with all the intelligence residing in the app. And that’s what makes it so compelling – the technology is invisible to the user (like the paper cards in Disney World’s Sorcerers of the Magic Kingdom). And code can always be updated, the possibilities only limited by our imagination (and resources).
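The “dumb cube, smart app” architecture described above can be sketched in miniature: the cube’s faces act only as fiducial markers (the “prettified QR codes”), and the app holds a lookup table deciding what each marker triggers. This is a hypothetical illustration, not Merge’s actual SDK; the marker IDs and experience names are assumptions.

```python
# Hypothetical sketch of the "dumb cube, smart app" idea.
# The cube contributes only marker IDs; all behavior lives in this table,
# which can be updated without ever touching the physical cube.

MARKER_TO_EXPERIENCE = {
    0: "mako_shark_circling",  # cube controls the rock column the shark circles
    1: "mako_shark_block",     # the cube IS the shark
    2: "foram_split",          # microfossil that opens into two halves
    3: "bat_skull",
}

def handle_marker(marker_id: int) -> str:
    """Return which digital specimen to render for a detected marker."""
    return MARKER_TO_EXPERIENCE.get(marker_id, "no_experience")

# Re-purposing the same physical cube is just a code change:
MARKER_TO_EXPERIENCE[3] = "bat_skull_updated"
```

In a real app, the marker ID would come from the camera pipeline’s marker-detection step; the point is that the cube itself carries no logic.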
We started with our mako shark and created two different experiences. The first surrounds the cube with a rock column, which the user can rotate or turn over, as the shark ominously circles round. In this example, the user is not holding the shark but rather using the Holocube as a device to control its movement. In the second experience, the cube IS the shark – and invites the visitor to play with it, as one would with a wooden block. You can turn the shark upside down, move it in the air like a toy to eat your friend, or move it through the camera to reveal the layers within the CT scan. We also added some touch features – click on the screen (not the cube, which would probably be better) to watch the jaws open and close.
The third experience comes from our recent youth course on microfossils. The youth went into our collections and researched some previously unstudied forams (the size of a grain of sand). One of their CT scans was turned into a digital specimen you can hold and, with a click, look inside as it separates into two halves. And the final experience is a bat skull, which, like the previous two digital specimens, you can observe and interact with through physical manipulation.
Below is a short video of all four, on my desk:
It took just a few days to code the app but, once it was up and running, we took it out to the Hall of Biodiversity, where we just happen to have the mako shark overhead and a bat on the wall. Located between the two, we set up an iPad on a stand and invited passers-by to “hold a shark in their hand.”
After months of anxiously handing over devices that cost hundreds, if not thousands, of dollars to children, it was quite a relief to watch them fight over a block of foam. And yes, people loved it. When you work with a new piece of technology, you need to spend time and energy learning how to operate it. But everyone knows how to “operate” a block. Its design is an invitation to play, and that’s what people did. They picked up the Holocube and marvelled at the digital specimens in their hand. And while they played with the specimen or its animations we offered facilitation that connected the toy in their hand back to the scientist who produced it, the tools they used to create it, and the research questions they used it to explore.
We tried other directions as well – like accessing the front camera, rather than the back, so you could see yourself with the object, as well as making a smaller one-cubic-inch cube (thank you 3D printer!) to see if children preferred the shorter distance. But after observing and interviewing around 150 people, here are some of the key lessons we took away from this round of prototyping:
- HANDABLES ARE COMPELLING: Okay, it’s not a real word (at least not yet) but visitors LOVE “handables” – augmented objects you can hold in your hand.
- HANDABLES ARE INTUITIVE: It was very satisfying to offer visitors an experience with a high level of innovation but a low learning curve to master, as its interaction design intuitively builds on visitors’ prior knowledge of working with blocks.
- PLAY IS ENGAGING: As much as visitors enjoyed the moment of designed discovery – the shark swimming around your hand, the microfossil that opens – they were equally engaged, if not more so, with their ability to simply explore the specimen through non-directed play.
- UNADORNED ASSETS ARE EDUCATIONAL: Offering the object on its own, with context provided by live facilitation, provided visitors with a direct line to achieving our intended learning objectives.