Making of SFO's Flight Deck Data Visualization


As passengers walk into the newly renovated Terminal 3E at San Francisco International Airport (SFO), they are greeted by the Flight Deck: a 16-foot-wide glowing data visualization that hovers above six multi-touch screens. The Flight Deck highlights SFO's global reach, services, amenities, and museum exhibits. We want to share the backstory of how the interactive data visualization component of the Flight Deck was developed.

The Projection

Our goal was to create a visual beacon in an airport that stood out from other digital displays and could be seen from both sides. We envisioned a uniquely shaped, bezel-free floating image that was constantly moving to capture the attention of passengers, whether arriving or departing. To meet this challenge, we experimented with several concepts but decided on attempting a 24/7 rear-projection in a bright airport terminal. Rear-projections in backlit environments are difficult since projectors don't project black. Initial tests (images below) showed that a projection in the space would be possible given the right projector, a dark grey rear-projection film, and the appropriate window treatment. We worked with the architects to specify a frit for the windows immediately behind the projection. We used a 20,000-lumen projector and the brightest rear-projection film we could find, and optimized the placement of the glass and projector for the visitor approach. Visitors approach the Flight Deck parallel to the projector, so they experience the brightest portion of the projection while walking toward it.

The Experience

We wanted to create an ever-changing experience in which a Kinect camera uses people's presence to alter the background and real-time flight-path data highlights the global reach of SFO from moment to moment. Rewards are also synchronized with the touch screens, so visitors who collect hidden rewards within the touch-screen experience can take over the entire projection. The software was developed using Cinder and uses OSC for network communication. An early decision was to heavily thread our architecture so that we could maintain a minimum of 60 fps for low interaction latency and smooth animation. Unfortunately, several components in Cinder, such as the Console and the Kinect class, were not written for heavy multithreading, so we added locks and properly threaded components as needed. Early interaction prototypes tested the limits of our frame rate and latency to determine areas for optimization.
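To illustrate the kind of threading involved, here is a minimal sketch of a worker thread that fetches data off the render thread and hands it over behind a mutex, so the draw loop never blocks on I/O. It is not taken from the actual codebase; the class, member names, and the `fetchFromNetwork` placeholder are illustrative assumptions.

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

// Illustrative flight position sample; the real data model is not shown here.
struct FlightSample {
    double latitude;
    double longitude;
    double altitude;
};

class FlightFeed {
public:
    void start() {
        mRunning = true;
        mWorker = std::thread([this] {
            while (mRunning) {
                // Blocking network work happens here, away from the render thread.
                std::vector<FlightSample> latest = fetchFromNetwork();
                {
                    std::lock_guard<std::mutex> lock(mMutex);
                    mSamples = std::move(latest);
                }
                std::this_thread::sleep_for(std::chrono::milliseconds(250));
            }
        });
    }

    void stop() {
        mRunning = false;
        if (mWorker.joinable()) mWorker.join();
    }

    // Called from the render thread every frame; copies under the lock so
    // drawing is never stalled by the feed.
    std::vector<FlightSample> snapshot() {
        std::lock_guard<std::mutex> lock(mMutex);
        return mSamples;
    }

private:
    std::vector<FlightSample> fetchFromNetwork() {
        // Placeholder: the real system would parse the live flight feed here.
        return {};
    }

    std::vector<FlightSample> mSamples;
    std::mutex mMutex;
    std::thread mWorker;
    std::atomic<bool> mRunning{false};
};
```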

Kinect Interaction

As curious visitors approach the Flight Deck, a Kinect camera captures their silhouette and creates triangles and shapes that play with their form using an attraction-based physics system, as seen in the image above. As visitors move from left to right, the triangles leave trails on screen that orbit the space, creating constantly changing visuals. Visitors can also use their arms to attract and control the shapes around them. A rear-mounted Kinect camera captures approaching visitors and uses computer vision algorithms to convert the depth image into blobs. This allows us to track any number of people, versus the two people supported by skeleton tracking. We also use depth data from a range that is normally unreliable for skeleton tracking: we modified the Kinect API for Cinder to use an extended depth range, capturing people up to 16 feet away and within a 15-foot-wide area.
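The sketch below shows one common way to turn a depth frame into blobs: threshold a depth band into a binary mask, then extract contours and their centroids. It assumes OpenCV, which the article does not name, and the depth limits and area threshold are illustrative, not the production values.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Turn a raw 16-bit Kinect depth frame into person "blobs" by thresholding a
// depth band and extracting contours. Depth limits are in millimeters.
std::vector<cv::Point2f> findPeopleBlobs(const cv::Mat &depth16u,
                                         uint16_t nearMm = 800,
                                         uint16_t farMm  = 4800)   // ~16 feet
{
    // Keep only pixels inside the usable depth band.
    cv::Mat mask;
    cv::inRange(depth16u, cv::Scalar(nearMm), cv::Scalar(farMm), mask);

    // Clean up speckle noise before contour extraction.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    // Each remaining contour of sufficient area is treated as one person.
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point2f> centers;
    for (const auto &c : contours) {
        if (cv::contourArea(c) < 1500.0)   // ignore small noise blobs
            continue;
        cv::Moments m = cv::moments(c);
        centers.emplace_back(static_cast<float>(m.m10 / m.m00),
                             static_cast<float>(m.m01 / m.m00));
    }
    return centers;
}
```

Because the blobs carry no identity, the number of tracked people is limited only by what fits in the frame, which is what lets the piece respond to a crowd rather than two individuals.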

Data Visualization

The screen visualizes live flight data as well as inbound and outbound activity at SFO over the last 24 hours. Positional data from flights is received in a polar coordinate system and converted to a Cartesian coordinate system on the GPU. Our visualization uses the actual positional data so that flight paths can be drawn with their real bends and waviness rather than as perfectly smoothed arcs. The challenge was that live positional data is not available for every point along a flight, so interpolated positions had to be mixed with the actual data. We used SLERP (spherical linear interpolation) to interpolate positions and combined real data with interpolated data using a gated smoothing algorithm. This approach allowed us to properly visualize flights, for example flights to Asia, whose positions are occasionally lost and then picked up again along the route.
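A minimal sketch of the two geometric pieces mentioned above: converting latitude/longitude to Cartesian coordinates on a sphere, and SLERP between two known positions so interpolated segments follow a great circle. The axis convention, GLM usage, and function names are assumptions for illustration; the production version runs this conversion in a shader.

```cpp
#include <cmath>
#include <glm/glm.hpp>

// Convert a latitude/longitude pair (degrees) into a point on a sphere.
glm::vec3 latLonToCartesian(float latDeg, float lonDeg, float radius = 1.0f)
{
    const float lat = glm::radians(latDeg);
    const float lon = glm::radians(lonDeg);
    return radius * glm::vec3(std::cos(lat) * std::cos(lon),
                              std::sin(lat),
                              std::cos(lat) * std::sin(lon));
}

// Spherical linear interpolation between two points on the sphere, so the
// in-between positions stay on a great-circle arc instead of cutting through it.
glm::vec3 slerpPosition(const glm::vec3 &a, const glm::vec3 &b, float t)
{
    const glm::vec3 na = glm::normalize(a);
    const glm::vec3 nb = glm::normalize(b);
    const float cosOmega = glm::clamp(glm::dot(na, nb), -1.0f, 1.0f);
    const float omega = std::acos(cosOmega);
    if (omega < 1e-5f)                      // points nearly coincide
        return glm::mix(na, nb, t);
    const float s = std::sin(omega);
    return (std::sin((1.0f - t) * omega) * na + std::sin(t * omega) * nb) / s;
}
```

In this sketch, a gated smoothing step would then blend each incoming real sample against the interpolated prediction, rejecting samples that jump implausibly far, but that logic is specific to the feed and not reproduced here.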

Looking Forward

The interactive data visualization represents only one aspect of the Flight Deck, which also includes a multi-touch experience and a mobile experience, and was a collaboration of creative and technical talent. Since launch, we've heard a lot of feedback comparing the Flight Deck to people's favorite sci-fi shows and movies. We hope we can continue to create forward-thinking experiences for the public.
