Oct Update: Prototyping Biological Data Interactives through an AR Shark

This is the first in a new monthly series of posts on our current efforts in the Museum’s Science Bulletins team to create and test prototypes of Hall-based digital interactives using AR and VR, built from our scientists’ digital science data, and to share some of the lessons we learn along the way.

In late September and October we began working together, as a new team in the Museum’s Science Bulletins department, around a CT scan of a Mako shark. How could we turn this CT scan into an engaging interactive? How could we integrate it into a Hall experience? And could we evaluate the prototype to learn what it might teach us about using digital tools to connect visitors with science data?

First, we turned the shark into an animation – it could swim, it could turn, it could bite. We designed it for our Hall of Biodiversity, with its static, overhead model of a Mako shark that was just crying out to be augmented. (It turns out, when the dioramas come alive at night, that’s what they all say to the security staff: “Augment me!”) And we worked with two AMNH scientists, who had supplied us with the scan (John G. Maisey is pictured below), to make sure the ways we modified it were scientifically accurate.

[Photo: John G. Maisey with the Mako shark CT scan]

We created one version for Google Tango, a tablet platform: hold the tablet up to see the shark floating in the air and, by touching the screen, make it do tricks. We created a second version for Microsoft’s HoloLens: look up at the physical shark, watch the CT scan augment it, then interact with your own copy (depending on the iteration, through hand or voice commands).

The arc of the work began to come into focus: rapid prototype, test with the public, process the lessons learned, then iterate for the next step. Our first public engagement took advantage of our booth at Maker Faire in Queens, where we offered two AR Sharks on the HoloLens, as you can see below:

[Photo: Visitors trying the AR Shark on HoloLens at Maker Faire]

Returning to the Museum, we began offering the experience to visitors passing through the Hall. We found that we could turn CT data into an engaging AR interactive (visitors particularly loved the animations) and that visitors enjoyed both the Tango and HoloLens experiences, though in different ways. After a number of iterations and public testing sessions, we had gathered plenty of data, both qualitative and quantitative, to inform our understanding of this type of public engagement.

Some (and far from all) of the key lessons we took away from this round of prototyping:

  • WE can leverage science assets. Research-quality CT scans can be turned into interactive, public-facing digital assets.
  • VISITORS love it. Random visitors within the Hall found the experience of viewing and interacting with an augmented shark engaging, exciting, and a fresh way to experience content within the Museum.
  • WEARABLES work. Visitors were comfortable putting on a wearable device.
  • AR can enhance Hall content. The AR shark helped visitors make a connection with the shark model hanging overhead.
  • AR need not isolate visitors. While the AR Shark was a single-user experience, the visitors’ social group often observed and made comments.
  • VISITORS are willing to learn something new in order to experience something new. The innovation delivered, however, has to be commensurate with the effort put in.

(Personally, I found this last observation to be quite profound, and I explored it in this post: A Suggested Model for Future-proofing Digital Interactives)

Come back soon to see some of what we did, and learned, in November.

About Barry

Innovating solutions for learning in a digital age.
This entry was posted in From My Work.