Brickyard Animates Smooth-Talking ‘Animals’ for Adobe
Brickyard VFX delivered animation for the Adobe spot ‘Animals’, which premiered online and features a chimpanzee and a horse on a film set, sharing their thoughts on huge Super Bowl ad spends as they prepare to perform in a new round of spots for the 2013 Big Game. The Brickyard team, led by VFX Supervisor Mandy Sorenson and Head of CG David Blumenfeld, animated the two animal stars. The live action was shot using a real horse and an actor in a chimpanzee suit. See the spot here.
David was the on-set VFX Supervisor for the shoot, assisted by Chris Sonia, one of Brickyard’s compositors, and noted that working with live animals brings particular challenges on set. “To support the CG work, we captured HDR photography of the various environments for all lighting setups and obtained the relevant camera and set measurements,” he said. “The chimp wasn’t tagged with tracking markers, but we did affix four white stickers to his mug to help with tracking where a 2D solution was adequate for the talking gag.”
Reference Photography & Scans
As well as capturing abundant reference photography and video footage, David ensured they had enough takes of the different types needed to create realistically talking animals, either in-camera or in post via 2D compositing or 3D facial replacement. Different takes were shot with the chimp mouth puppeteered in various styles, and with the horse performing his ‘talking’ in different ways. Some of these included chewing carrots or celery, as well as rigging the horse's mouth with monofilament so his lips could be manipulated to look more like talking.
As extra insurance, they made sure that clean plates were shot for almost every setup to allow paint-back of the background where CG elements were incorporated over the top, and more flexibility for split screen composites.
Brickyard also digitally captured the heads of the chimpanzee and horse to extract geometric and textural data of the practical animals. To do this, while on set, rough measurements of the horse and the chimp suit and mask were taken along with a large amount of photographic and video reference. David Blumenfeld said, “After filming was completed, we subcontracted Jan Huybrechs, CEO and founder of Another World Studios, to travel out to the puppeteer facility and horse ranch and perform 3D scans of the chimp mask and the live horse.
“Jan's technique allowed him to digitally extract textures during scanning for the chimp, but because the horse is a live animal and couldn’t stand still for the scanning process, multiple photographs were taken and could then be projected back onto the surface using ZBrush.
“After cleanup and delivery of the scans and textures, additional work was done to enhance the maps and create new ones during look development. The chimp was shaded with numerous texture maps in the shader, while we used a combination of texture maps and plate projection for the horse.” Armed with all of this data, the team of modellers, character animators and compositors were able to complete precise 3D animated face replacements to synchronize the animals’ mouths with the dialogue.
During the shoot, the lines were played back on stage for the actors to work with. However, because they didn’t expect to have to match this performance perfectly with what they were creating, they hadn't set up witness cameras for this process. “Unfortunately, the puppeteering of the chimp mouth never really synced properly with the audio track, so in the end, we had some additional compositing work to replace the mugs with digitally created 3D elements, either by warping, frame replacement, or clever masking and blending,” said David.
“In retrospect, some additional takes without any facial puppeteering may have been beneficial, but at the time of the shoot, we expected that most takes could be mixed and matched to create the full effect in-camera through clever editing or via some minor 2D warping. It wasn't until after the edit was complete that it became clear we would need CG replacements in nearly every talking shot.”
Playing the Monkey
Fortunately, the team were prepared. In the early stages of post-production, David had decided that it would be helpful later during facial animation to have a recorded reference of himself speaking the lines. To do this, he first had the shots converted to QuickTime movies and placed on an iPad so he could view them easily during looped playback. “Then we set up our in-house Brickyard shooting stage with basic lighting and a three-camera setup using Canon 5D cameras with 85mm lenses to capture the performance from the front, left diagonal, and right diagonal.
“I sat and played back the movies and rehearsed the lines until I could speak them in sync, and then performed them multiple times while recording video of it. This footage was then synced and post-processed into image sequences with shot number and a running frame counter that could be loaded into the Maya animation scene with synced audio for animation reference. It provided a great starting point for the animation, and was quick and easy to capture and use.”
The animators at Brickyard managed to keep both animals looking like animals, even though many looks and gestures recall what people do. The on-set actor who performed in the chimp suit was an expert able to give a performance matching the dialogue, and because the horse was real, his on-set behaviour was already good reference. The animators also watched old ‘Mr. Ed’ TV clips and internet videos of real chimpanzees. David said, “I generally begin with filmed reference of a real person speaking - myself in this case - and then modify the animation to suit the type of animal performing.
“For example, chimpanzees are very similar to humans, with an articulate mouth, so their speech looks more natural when it resembles a person's. But a horse’s facial performance must look drastically different to feel natural.” David described two possible options. “The first is to over-accentuate the motion, such as high, flapping gum movement. While this looks a bit strange at first, viewers recognise this style of talking horse from old TV shows like ‘Mr. Ed’. It looks comical but fits into the suspension of disbelief.
“The other, less comical route, which we chose, was to minimise the movement while ensuring the mouth opens and the lips curl enough to appear to make the proper sounds. In the rigging phase of the 3D models, we created a skeletal jaw structure and a fully articulated IK tongue. The rest was created with blend shapes. While they can sometimes look like they move linearly without using some sort of radial basis function or pose-space deformation, using them in combination with each other and with a skin-cluster-bound skeleton usually solves this problem.
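The blend-shape deformation described here boils down to adding weighted per-vertex deltas on top of a base mesh. As a rough, generic sketch (the vertex data, target names and weights below are illustrative, not Brickyard's actual rig):

```python
import numpy as np

# Base mesh: a handful of vertex positions (x, y, z).
base = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
])

# Each blend-shape target stores the same vertices in a sculpted pose;
# the deformer works with per-vertex deltas from the base.
targets = {
    "jaw_open": base + np.array([[0.0, -0.5, 0.0]] * 3),
    "lip_curl": base + np.array([[0.0,  0.1, 0.2]] * 3),
}

def evaluate(weights):
    """Linearly combine target deltas on top of the base mesh."""
    result = base.copy()
    for name, w in weights.items():
        result += w * (targets[name] - base)
    return result

# Half jaw-open plus a touch of lip curl.
pose = evaluate({"jaw_open": 0.5, "lip_curl": 0.25})
```

Because each shape moves vertices along a straight line between base and target, combining several shapes (and layering them over a skinned skeleton, as David describes) is what keeps the motion from reading as linear.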
“We modelled the standard facial phonemes, and then added some extreme ‘ooo’ and ‘o’ sounds. We added various mouth-corner pulls and extensions, and built the rig with a live target using a wire deformer so shapes could be added quickly in scene if needed. Because our CG pipeline uses a fully referenced system, model changes, rig updates, and other modifications were pushed out to animation scenes in minutes.”
Mandy Sorenson colour corrected the spot in Autodesk Lustre and composited the 3D facial components into the live action using Flame for a photorealistic result. The Brickyard team used Maya for all 3D animation, and completed matchmoving with PFTrack from The Pixel Farm. The matchmove was performed using portions of the faces from the 3D scans, and could be locked down tightly. Where too much motion blur or partially obscured footage interfered, the image was hand finessed, either in Maya or with warping or frame copying in Flame during compositing. In the CG renders, done in RenderMan, separate layers for the face surface and fur allowed compositing over the top of the blended CG for smoother integration.
"I'm very proud of what our team accomplished with this spot," said David. "We only had about a week to complete two full facial rigs from start-to-finish, and when you see the commercial, it looks like you’re watching real animals having an actual conversation." Tips he gave for building rigs quickly include ensuring that your rig is well suited for the task it has to perform and is simple and straightforward to use. This way, it will operate more quickly.
“Nothing holds up animation like a slow, barely interactive rig,” David said. “Rigs should be easily modifiable and allow quick updates when necessary. While it's impossible to future-proof a rig, and decisions must be made early about what to exclude or which techniques to use, proper use of naming, grouping and connection allows changes and upgrades without breaking things.
“An additional feature we employed that was very helpful was head snapping. The rig can be quickly set either to stick to the matchmoved face in the plate, or to sit perfectly still, front and centre, for performing lip-sync animation. Nobody wants to animate complex facial dialogue with a rig that turns and moves all over the place.”
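In essence, the head-snapping switch chooses between two sources for the rig root's transform: the tracked plate data or a fixed front-and-centre pose. A minimal sketch of the idea, using an illustrative translation-only transform rather than Brickyard's Maya setup:

```python
# "Head snapping" switch: the facial rig either follows the matchmoved
# head transform from the plate, or is pinned front and centre so
# lip-sync can be animated on a stationary head.
# Function and parameter names here are illustrative assumptions.

def head_transform(matchmove_per_frame, frame, snap_to_plate):
    """Return the rig-root translation for a frame.

    matchmove_per_frame: dict of frame -> (tx, ty, tz) from the tracker.
    snap_to_plate: True to stick to the plate, False to pin at origin.
    """
    if snap_to_plate:
        return matchmove_per_frame[frame]
    return (0.0, 0.0, 0.0)  # stationary, front and centre

track = {1: (0.1, 0.0, 5.0), 2: (0.2, 0.1, 5.1)}
plate_pose = head_transform(track, 2, snap_to_plate=True)     # follows track
lipsync_pose = head_transform(track, 2, snap_to_plate=False)  # pinned
```

Animating lip-sync against the pinned pose and then flipping the switch back lets the same facial keys ride on top of the tracked head motion.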
He also noted that what really matters is the two-dimensional composited image created at the end of the process. “Some quite advanced looks can be achieved with 2D techniques, for example, sticky lips. This can be incorporated into the rig to be automatic with a fair amount of work, but on a project with only seven dialogue shots, it can be accomplished with two clusters during the character-finaling phase of animation. Shortcuts like these can be taken without sacrificing quality or functionality and, along with proper planning, contribute to a quick, high-quality, low-aggravation turnaround.” www.brickyardvfx.com
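The sticky-lips effect amounts to pulling matching upper- and lower-lip points toward each other near the mouth corners, so the lips appear to peel apart from the centre as the mouth opens. A toy sketch of that falloff (the point counts, linear falloff and function names are illustrative assumptions, not the two-cluster Maya setup described above):

```python
# Toy "sticky lips": near the corners the upper and lower lip points
# are blended toward their shared midpoint, strongest at the corners
# and fading to zero at the centre of the mouth.

def sticky_lips(upper, lower, stickiness):
    """Blend lip point pairs toward their midpoint near the corners.

    upper, lower: lists of y-heights for matching lip points,
                  ordered corner -> centre -> corner.
    stickiness:   0 (no effect) .. 1 (lips fully sealed at corners).
    """
    n = len(upper)
    new_upper, new_lower = [], []
    for i, (u, d) in enumerate(zip(upper, lower)):
        # Weight is strongest at the corners, zero at the centre.
        t = abs(i - (n - 1) / 2) / ((n - 1) / 2)
        w = stickiness * t
        mid = (u + d) / 2.0
        new_upper.append(u * (1 - w) + mid * w)
        new_lower.append(d * (1 - w) + mid * w)
    return new_upper, new_lower

# Fully sticky: corner points seal together, the centre stays open.
up, lo = sticky_lips([1, 2, 3, 2, 1], [-1, -2, -3, -2, -1], 1.0)
```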
Published on Wednesday, 20 February 2013