A Visit to the Future: oblong’s Mezzanine & Greenhouse

What you picture when you hear the phrase “Minority Report” says a lot about you.

For some it refers to the short story by science fiction writer Philip K. Dick. For others it refers to the Steven Spielberg/Tom Cruise blockbuster based less on the story than on a profound passion for imagining the near future.

And for many (as a simple search on YouTube will reveal) it refers to the different way of imagining human/computer interaction represented in the film, which has become a common point of reference for interface designers as one shape of our possible future.

Now, the folks who designed the CONCEPT for the film have created a company, oblong industries, to make that future today. And yesterday a few folks from the museum and I traveled to their NYC demo location to get a tour.

First, to make sure we are on the same page, here is a clip of the concept in action from the film Minority Report (for context, Cruise’s character is watching video on the computer screen of a predicted future in which a murder is committed; his job is to learn the identity of the murderer and prevent the crime before it is committed; to his surprise, the murderer is himself):

oblong had two different sets of tools to show us: Mezzanine and Greenhouse. I will describe each below and then share a video I made of both in action.

Mezzanine is a tool set to support creative collaboration in a multitasking, digitally-connected Eden.

In the example we saw, there are three horizontal screens with two “corkboard” vertical screens on the sides.

ANYONE can control the content on the screen, at the same time, using different devices, and in different ways. You can download the iOS app to your iPhone or iPad to drag content around the screen, download it to your device, annotate it, and then send it back up to the screen.

Or you can project from your Mac or Windows machine. You can write on a physical whiteboard and then have an image capture sent to the screen, where it can be cropped and displayed. Or… bum ba bum… you can use the wand:

The wand (and there can be more than one) lets you select items on the screen to resize them, drag them from one part of the display to another, rotate them, and more. Just point, click, and drag or rotate. You can even use it to control the machines projecting onto the screens, such as clicking links on a projected web page.

The integration of the wands, handhelds, laptops and more was mind-boggling and seamless. It offered a praxis for a way of working quite antithetical to the modes of interacting and connecting promoted by the one-board, one-pen, one-connected-laptop model we are forced to experience every day. Rather than centralizing control over what receives our attention in one person, Mezzanine democratizes it.

That is exciting and scary. I am hard pressed to think of a time I worked regularly with a group that could take full advantage of the expectation that we would be multitasking non-stop (paying attention to multiple screens, both private and public, with multiple shifting data streams) and be empowered, not hobbled, by the deep digital integration it requires (actually, what comes to mind is when I produced a simulcast of a live speech by Kofi Annan onto the web and into four different virtual worlds at the same time). But I do want that, and I can easily imagine today’s youth expecting this level of multitasking, collaboration, and use of their digital savvy in the same way my 3-year-old expects my laptop screen to respond to her touch. The tools of the future might be here, but I’m not sure yet if we can handle them.

That said, I can see it being used in an amazing high-tech room by a skilled facilitator, presenter, or educator, integrating the mobile devices of the students or audience so they can interact with the content. Also, one location’s Mezzanine can be integrated with a Mezzanine at another location, taking multi-site programming to a new level.

Greenhouse was, for me, the more exciting of the two. Any reader of my blog knows I have been thinking a lot recently about interactive science visualizations. I had seen videos, but it was amazing to do it in person, using finger-level gestural control of a data visualization. Greenhouse, as described on their web site, “enables creative coders and engineers to create and rapidly prototype spatial applications: multi-screen, multi-user, multi-device interfaces with gestural and spatial interaction.” If you watched me and just me, I’d look like a concert conductor gone mad.
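I don’t have the Greenhouse API itself in front of me to quote, but the pattern it enables looks roughly like the sketch below: a stream of gesture events (pinch, swipe, point) driving the pan and zoom of a shared visualization. Every name here (GestureEvent, GestureViewer, the event kinds) is my own hypothetical illustration, not oblong’s code.

```python
# Hypothetical sketch of gesture-driven visualization control.
# None of these names come from the Greenhouse SDK; they only
# illustrate the idea of mapping gestures to view transforms.

from dataclasses import dataclass

@dataclass
class GestureEvent:
    kind: str           # "pinch", "swipe", or "point"
    dx: float = 0.0     # horizontal motion, normalized to screen width
    dy: float = 0.0     # vertical motion, normalized to screen height
    scale: float = 1.0  # pinch scale factor (>1 zooms in)

class GestureViewer:
    """Keeps the pan/zoom state for one shared data visualization."""

    def __init__(self):
        self.zoom = 1.0
        self.pan_x = 0.0
        self.pan_y = 0.0

    def handle(self, event: GestureEvent) -> None:
        if event.kind == "pinch":        # fingers spread/close -> zoom
            self.zoom *= event.scale
        elif event.kind == "swipe":      # open-hand swipe -> pan the view
            self.pan_x += event.dx / self.zoom
            self.pan_y += event.dy / self.zoom
        elif event.kind == "point":      # point-and-click -> recenter there
            self.pan_x, self.pan_y = event.dx, event.dy

# The "conductor" part: a stream of gestures played against one view.
viewer = GestureViewer()
viewer.handle(GestureEvent(kind="pinch", scale=1.5))
viewer.handle(GestureEvent(kind="swipe", dx=0.2, dy=-0.1))
print(viewer.zoom, viewer.pan_x, viewer.pan_y)
```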

And did I mention it’s free?

This is what Minority Report looks like and now I can say, having visited oblong industries, feels like. As you can see towards the end of the video (that’s my hand) I had a physical connection with the content of the computer through a gesture-based language that was simple and intuitive. It didn’t feel like a work in progress. It felt good. (As I write this now on my laptop but view it on my 27″ Thunderbolt Display, I almost expect to be able to use my fingers to move everything around, both within and between the two. And after experiencing Greenhouse, why shouldn’t I?).

Greenhouse is an exciting direction for interactive science visualizations. Since it is a tool for developers, you can decide what content you want to look at, and how. You decide what data sets you want to explore. And because it is free to use for experimenting, prototyping, teaching, making art, or other types of non-profit work, I can’t wait to see what people do with it (and if we, some day, have anything to offer as well).
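Staying with the hypothetical sketch from above (again, none of this is oblong’s actual API), “deciding what data sets you want to explore” would simply mean pointing the gesture layer at your own data and letting the current pan/zoom state pick the slice that is on screen:

```python
# Continuing the hypothetical GestureViewer sketch: the data set is
# whatever you choose; gestures only decide which slice is in view.
import csv

def visible_rows(path, viewer):
    """Yield the rows whose x value falls inside the viewer's pan/zoom window."""
    half_width = 0.5 / viewer.zoom           # window narrows as you zoom in
    lo = viewer.pan_x - half_width
    hi = viewer.pan_x + half_width
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if lo <= float(row["x"]) <= hi:  # assumes a numeric "x" column
                yield row
```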

And now, a two-minute video of our visit where you can see both tools in action:
