
How Oblong helped IBM build its ‘immersion rooms’ with giant displays

John Underkoffler is best known as the science adviser for the landmark sci-fi film Minority Report, in which actor Tom Cruise uses "data gloves" and gesture controls to manipulate a transparent computer. As CEO of Oblong Industries, Underkoffler has been trying to bring that vision to life.
He founded the company in 2006 and in 2012 launched Mezzanine, software that lets enterprise collaborators interact with screens via gestures. I visited Oblong's warehouses in downtown Los Angeles in May 2017. There, I saw huge video walls, curved to immerse the user in a visual experience that made data and the connections between objects easy to see. He showed me how you could grab an image and seamlessly toss it from a computer screen to a big video wall.
Now the company has shown off the technology with its partner, IBM. Big Blue recently took the wraps off its IBM Watson Experience Centers, which feature an "immersion room" whose 45 high-definition displays total 93 million pixels, all acting as one screen.
One of the centers, the IBM Watson Immersive AI Lab, opened in Austin, Texas. Under the hood is Oblong’s g-speak spatial operating environment. To date, IBM has taken more than 15,000 people through the Watson Experience Centers, helping them turn “terabytes into insights.”
I spoke with Underkoffler and Pete Hawkes, Oblong’s director of interaction design.
VentureBeat: You’re doing something with IBM now?
John Underkoffler: And have been for a long time. I’m sitting here with my fabulous colleague Pete Hawkes, who is Oblong’s director of interaction design. He’s been the prow of the ship on most of the IBM work.
VentureBeat: They have a facility in Austin that they're opening up now for visits?
Pete Hawkes: Right. I was there all last week. They call it — let's see, they have various names for it. It's basically an immersive design lab. I can give you their proper name for it later. But the lab is at IBM's fairly old complex there. It's been there a long time. They have a few different facilities. There's a very significant design presence in one of the buildings, where many of the thousand or so designers they've hired over the last few years reside.
The immersive design lab has a nice large Mezzanine space, alongside a reconfigurable room they use for testing and prototyping experiences at their three executive briefing centers, which are at Astor Place in New York, Cambridge in Boston, and the Financial District in San Francisco.
Underkoffler: As a bit of background, we've been doing heavy work with IBM since probably 2012. When you visited our warehouse, where we do all the large-scale prototyping for those IBM projects and similar projects with other customers, you almost certainly saw some of that work.

VentureBeat: I wonder how big some of these displays are that they’re using, the scale of what they’re visualizing.
Hawkes: At Astor Place, the first large immersive space they set up — there’s a large display wall there, a 41-foot display wall. Three sections of 7,200 by 2,700 Christie MicroTile display pixels. Then it has an immersion room of 45 HD planar displays, which tally to 93.3 million pixels. It’s a lot of space to work with. That space wraps around you in 300 degrees.
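(For reference, a quick sanity check of that figure, assuming the 45 displays are standard 1,920-by-1,080 HD panels: 45 × 1,920 × 1,080 = 93,312,000 pixels, which matches the 93.3 million Hawkes cites.)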
The first space is a little bit more architecturally grounded. It's five sides of a hexagon. The newer spaces are now rounded out into 15 panels. It's more of a curved space, 15 columns with the same pixel count. We simulate at a similar scale at our warehouse. Austin has a unique configuration in that they've installed their production displays on a rolling mechanism that allows them to flatten out and then re-curve the wall live.
Underkoffler: It's the Transformer space.
Hawkes: They were a little space-constrained at their lab in Austin. Up until about a year ago they were very tight-lipped. They kept their cards very close as to the content and the specifics of what you could see in this space. They released a few teaser things for press, but they really wanted folks to schedule time to come and visit the space rather than sharing much about what they were doing.
Fortunately we received permission to submit the work we’ve been producing for the spaces for national and international design competitions this year. A film crew went through and documented a bunch of the work. We were shortlisted for the UX Design Awards in Europe, and then also tapped for Austin Design Week this last week as a stop on one of many tours throughout the city. We hosted a two-hour event that demonstrated a bit of the content and also went into, from a design standpoint, how we produced content at that scale.
Underkoffler: The beginning of that whole thing was like many of our beginnings. We sort of inherited someone else's design for the space. That's often how we end up starting with a client. A bunch of architects had designed this hexagonal room with 45 giant displays, and the separate 41-foot pixel wall, by the time IBM came to us and said, "Uh, let's work together and put something in this."
As we all know, the world is filled with giant display walls. But almost none of them are interactive in any way. We’re kind of the only people in the world who can turn that stuff on, who can make it fully interactive. There are hardware-based companies that will let you do — not even Dragon’s Lair things. More like pre-programmed branch points. That’s hardly interactive. In order to tell their story around Watson and other novel and difficult-to-explain offerings, IBM knew they needed an experience that was as live and real-time and interactive as the stuff they were trying to get the world to understand and buy.
They came to us. We'd already transformed their primary research facility, the T.J. Watson Center in Yorktown Heights. We built half a billion pixels, something like that, an enormous number of interoperable pixels, in the main demo facility there on the ground floor. They came back to us and said, "We have this new problem. We have these enormous pixel spaces. We need to tell the story of AI, the story of cognitive computing."

Above: Pete Hawkes showing an Immersion Room.

Image Credit: IBM

It was a process of working with their designers and their communications people to put together an evocation of these very abstract ideas around AI and cognitive computing, and do it in a visceral way using the Oblong spatial wands for control. It’s a uniquely great way to deal with a 41-foot wall when you’re not going to use touch. You can’t run up and down the wall. You need that laser-pointer action at a distance. All of our expertise up to that time got focused and funneled into building a set of experiences that made it worth having that many linear feet, that many square feet of pixels.
VentureBeat: What are they using that for as far as problems to solve?
Hawkes: They've shifted over the years. Initially it was primarily a marketing experience. There were a lot of explanatory videos trying to create some hype, but also some understanding, in the AI space. What we've brought with live software is the ability to work with actual data sets and integrate Watson capabilities, services, and other things directly into the software.
They still have a narrative slant to a lot of their primary experiences. These are tailored to specific industries. The ones that they’re using today are — there’s a story around disaster response, a hurricane that is approaching New York City. There’s a separate story around the financial services sector, specifically identifying fraud and how AI can help in that manner. We just released a new experience this year around supply-chain management and the implications of AI in that space.
In addition to that, some of the more exciting work we've done isn't just pure storytelling — it's a software piece that's new and different every single time. One of the more popular experiences is called News Discovery, where they analyze all of the news they can scrape from the world's servers — upwards of 40,000 to 50,000 articles in a single day — and then analyze the content of those articles for high-level concepts and sentiment. Are they primarily positive or negative? It then lets you look at related concepts and view them geographically, so you can see how those concepts and that sentiment stack up differently in Asia versus North America. You can link concepts together and dive directly into the web-based news content, which is viewable in the space.
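To make the shape of that pipeline a little more concrete, here is a minimal, hypothetical sketch of the kind of processing Hawkes describes: tag each scraped article with concepts and a sentiment score, then aggregate by region so a concept can be compared across geographies. The class, function, and field names are illustrative assumptions, not IBM's or Oblong's actual code, and the concept/sentiment analysis is stubbed out where a real system would call an NLP service.

# Hypothetical sketch of a News Discovery-style pipeline (not IBM/Oblong code).
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Article:
    url: str
    region: str                                   # e.g. "Asia", "North America"
    text: str
    concepts: list = field(default_factory=list)  # filled in by analysis
    sentiment: float = 0.0                        # -1.0 (negative) to +1.0 (positive)

def analyze(article: Article) -> Article:
    """Stub for concept extraction and sentiment scoring; a real system would
    call an NLP service here instead of keying off a few words."""
    lowered = article.text.lower()
    for concept in ("hurricane", "fraud", "supply chain"):
        if concept in lowered:
            article.concepts.append(concept)
    article.sentiment = -0.5 if "crisis" in lowered else 0.2
    return article

def aggregate_by_region(articles: list) -> dict:
    """Average sentiment per (region, concept): the regional roll-up used to
    compare how a concept reads in one part of the world versus another."""
    totals = defaultdict(lambda: [0.0, 0])        # (region, concept) -> [sum, count]
    for a in articles:
        for c in a.concepts:
            totals[(a.region, c)][0] += a.sentiment
            totals[(a.region, c)][1] += 1
    return {key: s / n for key, (s, n) in totals.items()}

if __name__ == "__main__":
    day = [analyze(a) for a in (
        Article("https://example.com/1", "North America", "Hurricane crisis nears the coast"),
        Article("https://example.com/2", "Asia", "Supply chain recovery continues"),
    )]
    print(aggregate_by_region(day))

On a 45-screen immersion room the interesting part is the rendering and the wand-driven filtering, but the underlying bookkeeping is roughly this shape: per-article concepts and sentiment, then regional roll-ups that can be re-sliced live.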

Above: Oblong’s g-speak spatial operating environment powers this room.

Image Credit: IBM

That’s a lot of fun from a software standpoint, but the way we’ve utilized the space to sift and sort through that massive amount of data is what makes it truly unique. If you haven’t seen that particular application, I highly recommend finding some time the next time you’re in any one of these hubs.
VentureBeat: Have they credited this for helping them accomplish particular goals?
Hawkes: Right now they're not selling it — it's not something they're offering as a "buy one of these rooms with this service" product. Their underlying goal is primarily to demystify AI and start a conversation with executives. That's their primary audience for these experiences.
I can only speak from personal use of the software, but it's highly gratifying. Compared to, say, what Google gives you when you search the news for the day — you get what Google thinks are the 10 or 12 best things in a list. That's all you can see at a time. Many of those results are fed to you because of dollars people have spent to put them in front of you, instead of being relevant to your query or your actual intent.
What’s unique about this particular experience is that when they hand people the wand to use it, or take them through the experience, you don’t have to have a predefined path or course, because it changes. As you see the data, you respond in real time. Your queries change based on a more visual and visceral response to the results that you’re seeing. The filtering changes as well based on that.
Source: VentureBeat