IBM Watson / Experience

At the heart of each IBM Watson Experience Center is an immersion room that surrounds its audience with 45 HD displays and 93+ million pixels. Since the opening of IBM Watson's headquarters at Astor Place in NYC, Oblong has worked closely with IBM to develop new spatial visualizations and group interactions for these unique spaces, which host over 15,000 visitors every year.


Collaborators: Mirada (2014-2015), Local Projects (2016-2020)
Role: Director of Interaction Design, Engineering Management, Account Management

The goal of each Watson Experience Center—located in New York, San Francisco, Cambridge, and Munich—is to demystify AI and challenge visitors’ expectations through tangible demonstrations of Watson technology. Visitors are guided through a series of narratives and data interfaces, each grounded in IBM’s current capabilities in machine learning and AI. These sit alongside a host of Mezzanine rooms where participants collaborate further to build solutions together.


The process for creating each experience begins with dynamic, collaborative research. Subject matter experts take members of the design and engineering teams through real-world scenarios—disaster response, financial crimes investigation, oil and gas management, product research, world news analysis—where we identify and test applicable data sets. From there, we move our ideas quickly to scale.

Access to the immersive pixel canvas for everyone involved is key to the process. Designers must be able to see their ideas outside the confines of 15″ laptops and prescriptive software. Using tools tuned for rapid iteration at scale, our team of designers, data artists, and engineers works side by side to envision and define each experience. The result is more than a polished marketing narrative; it's an active interface for exploring data through accurate demonstrations of Watson’s capabilities—one that customers can see themselves in.

Under the Hood

Underlying the digital canvas is a robust spatial operating environment, g-speak, which allows our team to position real data in a true spatial context. Every data point within the system, and even the UI itself, is defined in real-world coordinates. Gestures, directional pointing, and proximity to screens help us create interfaces that more accurately infer user intent and more effectively humanize the UI.
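To make the idea of real-world coordinates concrete, here is a minimal sketch of how a physical display can be described by its position and size in room space, so that a point on the wall maps directly to a pixel. The names and structure are purely illustrative assumptions, not g-speak's actual API.

```python
from dataclasses import dataclass

# Illustrative sketch only: a display described in room (real-world) coordinates.
# In a spatial operating environment, UI and data live in these coordinates,
# and pixel positions are derived from physical geometry.

@dataclass
class Display:
    origin_mm: tuple    # room-space position of the panel's top-left corner (x, y), in mm
    size_mm: tuple      # physical width and height of the panel, in mm
    resolution: tuple   # pixel width and height

    def to_pixels(self, point_mm):
        """Map a room-space point on the wall plane (mm) to this display's pixel grid."""
        px = (point_mm[0] - self.origin_mm[0]) / self.size_mm[0] * self.resolution[0]
        py = (point_mm[1] - self.origin_mm[1]) / self.size_mm[1] * self.resolution[1]
        return (round(px), round(py))

# A hypothetical 1920x1080 panel, roughly 42" (930 x 523 mm), whose top-left
# corner sits 500 mm from the room origin along the wall.
panel = Display(origin_mm=(500, 0), size_mm=(930, 523), resolution=(1920, 1080))
print(panel.to_pixels((965, 261.5)))  # a point at the panel's physical center → (960, 540)
```

Because every panel carries its own physical placement, a pointing gesture resolved to a room-space location can be routed to whichever display contains that point—no per-screen special-casing required.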

This award-winning collaboration with IBM is prototyped and developed at scale at Oblong’s headquarters in Los Angeles as well as IBM’s Immersive AI Lab in Austin. While these spaces are typically invite-only, IBM is increasingly open to sharing the content and the unique design ideas that drive its success with the public.