Published on Thursday, 14 August 2014


Students’ Robot Love Comes to Life on Set with Xsens MVN

At the Dodge College of Film & Media Arts at Chapman University in California, students produced a live-action/CG short comedy using the Xsens motion capture suit. Titled ‘Some Like It Bot’, the film depicts a human’s first date with his electronic love interest. Produced rapidly with motion capture, the film was recently an official selection at the Comic-Con International Film Festival.


The film’s primary set is a robot-only bar that the lead character has to sneak into in order to meet his girlfriend. This set piece allowed the team to design a cast of interesting background characters. But in the process, they soon realised they’d need to take a new approach to capturing them on screen.

“We had some fun ideas like bartenders that served from the ceiling and intimidating 12-foot-tall robot bouncers. A cast like that could have cost us quite a bit of time and money,” said Alex Griffin, a recent graduate of Dodge College of Film & Media Arts and new filmmaker. “Using the Xsens MVN suit, our team could act out their moves anywhere and then feed that data into MotionBuilder to bring the characters we had visualised to life.”


The cast and crew spent the next few weeks defining a plan that would help speed up production of the film. Alex paid extra attention to the interaction of actors with the cameras and their Xsens MVN suit, capturing the live-action shots as naturally as possible.

“Our actors - myself included - knew from the rehearsals and shot plans where the cameras were going to be, and more importantly, where their sightlines were,” said Alex. “This way, we could use motion capture for the actors to play off the digital characters on set. It’s one thing to animate a whole world, but blending live action with digital counterparts is a different challenge.”


Not being limited to a motion capture studio was very helpful. Because Xsens MVN is a mobile system, Alex could bring it wherever the actual set was and perform mocap there. For example, in the bar scenes, the actors could perform together against the same background Alex wanted to film. In post, the team could then layer the CG robot over the mocap data from the actress playing her.

A team from Xsens visited the set to help them start off the shoot. At the beginning of each day of production, the crew calibrated the system before shooting. Calibration aligns the body segments to the sensors, which can pick up the slightest movements. A basic set-up takes about ten minutes and involves putting on the suit and measuring the actor’s body height and foot size. Xsens’ team also offered tips on how to raise the output quality: ideally, the performance area has no obstructing walls, avoids large metal objects, and keeps the performer relatively close to the operator.


The students started to experiment with the system’s capabilities and successfully completed the shoot over three days. “We tried some real-time filming with the Xsens MVN suit, often doing a shot with an actor as the robot and another with our live human interacting with them from a distance, in order to keep our frame clean,” said Alex. Since they were aiming for a certain composition in each shot, they didn’t always want the human character in the actual frame - for instance, in a close-up of the robot talking to the human. By acting at a distance, the actors could keep the same engagement while also keeping the shot clean.

The crew liked using this technique for over-the-shoulder shots and close-ups. Everyone had a basic idea of where they were positioned in the shot and was able to match eyelines on set instead of spending time re-shooting or editing in post. 3ds Max was used as the main post-production application because it handled the motion capture data the most effectively. From there the animation was rendered in V-Ray and sent to Nuke for compositing.


Their data from Xsens’ inertial system came through very clean. Inertial sensors capture motion directly from the person and relay the information back to the computer. As well as removing the need for a dedicated mocap studio, the system requires no cameras to monitor markers on set: Xsens MVN can be used indoors, outdoors or anywhere the crew needs it. A real-time preview of what is being recorded can also be displayed. The students on the project, who were used to keyframing for hours after filming, particularly appreciated the system.

“Since we were shooting live-action, the ability to work on set made a huge difference for us,” said Alex. “We can now understand that Chapman’s optical motion capture room really only works well for full animation. In an animated project, the characters and environment are created together, but with an integration piece like 'Some Like It Bot', the characters have to align with an existing environment.”


Using Xsens MVN, the students could create characters on the actual set they wanted to use as the environment for their scene, setting this system apart from a room full of cameras and tracking markers. Another example of this would be the film ‘Ted’, where the bear’s actor wore the MVN suit and acted out the role of Ted off-camera.