Published on Tuesday, 19 June 2012
Identity FX has created a 2D interactive web commercial campaign for Nike, designed to deliver a user-controlled, 170-degree transmedia experience for the web, live events and broadcast.
On the 'CP3 Jump Man' website, starring basketball player Chris Paul, users can participate interactively in the Nike launch commercial by operating a camera, so that no two viewings are the same.
Identity FX provided post production services over a twelve-week schedule, spanning data management, editorial, pre-comp, visual effects, colour correction and online finishing, using the SGO Mistika editorial and finishing system. On set, the production team used a large array of nine arc-mounted RED EPIC cameras, moved once with a single overlapping position, for a total of 17 camera positions. The Identity FX team then created a 1,000-frame final project comprising the 17 individual camera positions, each capturing up to twelve green screen layers, in a 170-degree arc. In total, 11 hours of 5K footage were captured over a two-and-a-half-day shoot.
The live action footage was to incorporate a chain reaction of interdependent scenes, including a man on fire, a mounted policeman rearing up, a fan falling from the announcer's booth, and a child nearly trampled by a horse. The production therefore realised early on that it would not be possible to capture everything in a single take across the 18-position array.
To plan the production’s field of view, it was essential that Identity FX build the camera rig in Maya first, planning the distance between cameras and their height on the array to maximise the viewing angle for users. Doing this also created parity throughout the post production pipeline. The Maya camera rig, once approved, was handed to the camera crew, who constructed their array rig from its measurements and values; the Maya cameras could in turn be exported as .fbx files and imported into Nuke, creating the link between previs, production and post.
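The geometry behind that previs planning can be sketched in a few lines. The following is a minimal, hypothetical illustration — the radius, camera height and evenly-spaced layout are illustrative assumptions, not Identity FX's actual measurements — showing how 17 camera positions might be distributed along a 170-degree arc, each aimed back at the subject:

```python
import math

def arc_camera_positions(num_cams=17, arc_degrees=170.0, radius_m=6.0, height_m=1.6):
    """Place cameras evenly along a horizontal arc, all aimed at the arc's centre.

    Returns a list of (x, y, z, yaw_degrees) tuples. The radius, height and
    even spacing are illustrative assumptions, not production values.
    """
    positions = []
    half = arc_degrees / 2.0
    for i in range(num_cams):
        # Spread cameras from -85 deg to +85 deg around the subject at the origin.
        theta = math.radians(-half + i * arc_degrees / (num_cams - 1))
        x = radius_m * math.sin(theta)
        z = radius_m * math.cos(theta)
        yaw = math.degrees(math.atan2(-x, -z))  # aim each camera back at the origin
        positions.append((round(x, 3), round(height_m, 3), round(z, 3), round(yaw, 3)))
    return positions
```

Positions computed this way could then be applied to cameras in a Maya scene and exported as .fbx for Nuke, keeping previs and post on identical values.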
As production got underway the compositing team was also brought on set – specifically an artist working in mocha for tracking, After Effects artists creating quick alignments as they went, and Nuke artists. Whenever an especially effective take was captured, the Identity FX team would advise the director and script supervisor, request the footage from all cameras, and complete an on-the-spot composite to ensure spatial integrity between all the layers in their build. Whenever a successful performance was locked, its trajectory was plotted on a map of the set obtained from the Art Department, to help when moving the array for further takes.
Matching the Action
The team also faced the task of isolating the moments of action in the basketball play, shot on green screen, into separate layers across the array. The camera array also had to be moved backwards and forwards for each layer, which raised further issues: matching the action across multiple takes and array moves, across up to a dozen different layers, while processing the extra volume of 5K RED EPIC material at 48fps across the timeline.
Mistika provided a non-destructive pipeline, allowing editorial decisions to be made in post. The team needed to be able to lock camera positions, so the ability to stack all camera views and match the action in Mistika was critical. They could then debayer in real time, transcoding from R3D to DPX or EXR formats. By shooting 5K, they were also prepared for any output resolution, which is valuable for a transmedia project.
The Identity FX team eliminated brightness disparities between the cameras, and optimised the green screen for extractions, using the RGB Correction function in Mistika. The R3D parameter tool allowed the camera metadata to be ingested and the footage debayered at high quality. The warp and de-noise tools were used to control lens distortion and noise, especially in the blue channel. The time warp function refined the performance line-up across the multiple takes, and the infinite timeline made it possible to line up all 17 cameras vertically to synchronise the performances layer by layer and eliminate drift problems in compositing.

www.identityfx.com   www.sgo.es