
Raging Bulls

Oscar winner Mauro Fiore, ASC, jumps back into the VFX ring for the new robot-boxing drama, Real Steel

In Real Steel, Charlie Kenton (Hugh Jackman) is a boxer who, in trade parlance, “could have been a contender.” But in the not-too-distant future, after human prizefighting is outlawed, boxing becomes a ‘robots only’ sport, forcing Charlie to keep his hand in the only way he can: managing low-end battle ‘bots. Reuniting with his estranged pre-teen son (Dakota Goyo), Charlie is inspired to build a fighting machine that can knock everyone’s block off, and go the distance.

While evoking every classic boxing movie from The Champ to Rocky, Real Steel’s storyline actually derives from Science Fiction Hall of Fame inductee Richard Matheson’s short story Steel, which the author adapted into a poignant Twilight Zone episode starring Lee Marvin. Nearly 40 years ago, Matheson’s adaptation of his story Duel resulted in a memorably harrowing TV-movie that launched the career of Steven Spielberg, who executive produced Real Steel and convinced Shawn Levy to direct. Cinematographer Mauro Fiore, ASC, came aboard shortly thereafter, as he recalls: “I had a two-hour meeting where Shawn described the film conceptually. He had this melancholy view of the future that involved looking back at imagery of a nostalgic past.”

Levy showed Fiore boxing images of Muhammad Ali, along with classic 1950s-era photographs, and (even after his Oscar-winning turn on Avatar) Fiore was reluctant to use digital capture, given Levy’s film-only references. “I had to be convinced by the visual effects department,” he admits. “I think there’s a bit more flexibility in the DI when originating on film, but for this picture’s technical needs, digital was the way.”

Those needs revolved around a digital workflow that incorporated motion-capture material of human performers and 8-foot-tall CGI robots, played back at the Detroit-area locations via monitors. Executive Producer/First AD Josh McLaglen, who has worked on Levy’s last four movies, explains that the industry has always catered to a strict division of labor – prep, shoot and post – which didn’t fit the unusual demands of Real Steel. “We’ve tried to alleviate post issues up front by pre-producing our movies to a state of readiness that enables the director and VFX house to determine the number of shots and their duration, which saves time and dollars on the back end,” McLaglen says. “Digital Domain, under VFX supervisor Erik Nash [see Exposure, this issue], was interactive and proactive, improving shots as well as finding solutions.”

In fact, Real Steel’s in-house virtual art department first devised robot concepts, then built those assets with 3D rigging. “DD would clean up the render, ensuring the range of motion worked for each 3D character – avoiding collision interpenetration between clavicle and cervical, for example – before kicking it back to us for data acquisition,” McLaglen adds.
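The interpenetration check McLaglen mentions can be illustrated with a toy calculation. The sketch below is not Digital Domain’s pipeline; it simply treats two bones – labelled clavicle and cervical here, with invented coordinates and radii – as chains of spheres and flags any overlap as a collision.

```python
import numpy as np

def bones_interpenetrate(a0, a1, ra, b0, b1, rb, samples=32):
    """Approximate two bones as chains of spheres along their segments
    and report whether any sphere pair overlaps (i.e. the bones collide)."""
    ts = np.linspace(0.0, 1.0, samples)
    pts_a = a0[None, :] + ts[:, None] * (a1 - a0)[None, :]   # sample points on bone A
    pts_b = b0[None, :] + ts[:, None] * (b1 - b0)[None, :]   # sample points on bone B
    # pairwise distances between the two point sets
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return bool((d < (ra + rb)).any())

# Hypothetical clavicle and cervical segments in character space (metres)
clavicle = (np.array([0.00, 1.45, 0.0]), np.array([0.18, 1.48, 0.0]))
cervical = (np.array([0.00, 1.50, 0.0]), np.array([0.00, 1.62, 0.0]))
print(bones_interpenetrate(*clavicle, 0.04, *cervical, 0.05))
```

A rig validator would run a test like this across every pose in the character’s range of motion before signing off on the asset.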

Motion capture sessions were done at Giant Studios, where performers enacted boxing sequences – even though principal photography would not begin until months later. Video reference of the fights allowed Levy to quickly assemble a cut before making selects from the real-time mocap, which was then motion-edited to create the proper mass/speed impression for much larger figures, since the robots stand about two feet taller than the humans portraying them.
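The article doesn’t detail how that motion editing was done, but one common rule of thumb for selling extra mass is dynamic similarity: durations stretch with roughly the square root of the size ratio. The sketch below, with made-up values, retimes a captured joint curve on that assumption.

```python
import numpy as np

def retime_for_scale(times, values, scale):
    """Stretch a captured animation curve so a character `scale` times larger
    appears to move with plausible weight. Dynamic similarity (duration grows
    with sqrt of the size ratio) is an assumption, not the film's actual recipe."""
    stretch = np.sqrt(scale)                      # e.g. 8 ft / 6 ft -> ~1.15x slower
    new_times = times * stretch
    # resample the stretched curve at the original frame times
    # (the move now takes longer, so the clip covers less of it)
    return np.interp(times, new_times, values)

# toy example: a one-second elbow swing captured at 24 fps
t = np.linspace(0.0, 1.0, 24)
elbow = np.sin(np.pi * t)                         # joint angle in degrees, made up
slowed = retime_for_scale(t, elbow, scale=8 / 6)  # robot ~2 ft taller than performer
```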

“With motion editing and scale offsets done, we’d have robots with appropriate lumbering movements, rendered at 8 feet tall, before going to Virtual Camera,” McLaglen continues. “VC shoots the coverage in the style used on the Zemeckis animated films [see ICG November 2009 and March 2011]. We’d often shoot a whole sequence virtually, creating a roadmap for the live-action to follow. In addition to the mocapped robots, we could put in proxies for Hugh and Dakota and cards to represent the crowd, then shoot the various angles on virtual camera platforms with appropriate focal lengths, cutting Avid events or storyboards that made it seem like we’d lived it before going out on location.”

McLaglen says this previz-minded workflow allowed other aspects of production to be addressed during prep. “When holding what we called complexity meetings with department heads,” he continues, “I’d project a previsualization so we could all talk out a sequence. If somebody saw they’d need more space to get a rainbar into a particular location, then another department would know to order a bigger crane to avoid crowding him. And we had the locations scouted in the computer, enabling us to time-lapse across a whole day to determine where the sun will be. Mauro might need to shoot backlit coverage of the kid, who has shorter hours of availability, and at a glance I can tell him what time of day works for that light, and schedule around that constraint.”
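The production’s virtual scouts handled the sun-tracking; the standalone sketch below only shows the kind of arithmetic involved, approximating solar elevation for a latitude, day of year, and local solar hour. Detroit’s latitude and an August date are assumptions for illustration, not values from the article.

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) for a latitude, day of year,
    and local solar time -- a rough scheduling aid, not the production's tool."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)          # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_elev = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_elev))

# Detroit sits at roughly 42.3 degrees N; step through an August shooting day
for hour in range(6, 21, 2):
    print(hour, round(solar_elevation(42.3, 227, hour), 1))
```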

Fiore says he tested several cameras, including the Genesis and ARRI D21, finding the latter’s sensitivity to light a bit low for his taste (“though,” he says, “the optical viewfinder was enticing”). He settled on Sony’s F35, which featured genlock – an absolute necessity given the need to synchronize mocap with live-action. After experimenting with the newer 12-bit F35, Fiore elected to stay with the proven version, provided by Otto Nemenz, rating the image at an ASA equivalent of 400 and framing for a 2.4:1 aspect ratio. “We benefitted from the efforts of Ryan Sheridan at Nemenz, who had committed to a fiber [optic] cable system that let us record to Codex with perfect transmission,” Fiore says, “which was significant because of all the metadata, such as Simulcam information.”

The F35 cameras were tethered via fiber cable to a mobile lab set up off set, where a dailies colorist prepared one-light dailies using a pair of Codex Digital Labs. The Codex boxes were used to create deliverables (linked to the original camera data) for review, editorial, visual effects and backup purposes, and to make metadata available to the various post departments. The latter included not just simple shot information, but also detailed notes made by Levy and Fiore about specific shots and scenes.
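The shape of such a deliverable record isn’t specified in the article; the hypothetical sketch below merely illustrates the kind of per-shot linkage described – pointers back to the original camera media, an associated Simulcam take, and the director’s and DP’s notes travelling with the clip.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class ShotRecord:
    """Hypothetical per-shot metadata record of the kind described:
    every deliverable points back to the original camera files, and
    notes travel with the clip to editorial, VFX, and backup."""
    reel: str
    clip: str
    timecode_in: str
    timecode_out: str
    source_files: List[str]             # original F35/Codex media this deliverable links to
    simulcam_take: Optional[str]        # associated virtual/mocap playback, if any
    notes: Dict[str, str] = field(default_factory=dict)   # e.g. {"Fiore": "..."}

shot = ShotRecord(
    reel="A017", clip="A017_C003",
    timecode_in="14:22:31:05", timecode_out="14:23:02:17",
    source_files=["A017_C003.rmf"], simulcam_take="fight2_beat04",
    notes={"Fiore": "hold the warm key for the arena entrance"},
)
```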

Real Steel crewmembers (whom McLaglen calls the film’s “brain bar”) had previously worked on Avatar, for which the Simulcam system was developed by Videohawks Virtual Production Supervisor Glenn Derry. Simulcam freed up James Cameron to shoot his actors on the mocap stage while seeing, on a monitor in real time, roughly how they would appear as their alien characters in an approximation of the appropriate landscape. “For this film, we relied on Simulcam-B,” McLaglen says, “which involves playing back previously acquired performance data into a live-action environment. We used it maybe a dozen times on Avatar, but Real Steel had ten times as many shots. We couldn’t play back a full Avid event assembly, but it could handle a section at a time, so while on location, the director would be able to see on his monitor the previously acquired performance of the robots beginning to square off.”

Live-action work was based at Raleigh Studios Detroit. And because Fiore prefers to initiate color correction on set, a location trailer with Avid Unity and Truelight systems was set up, utilizing miles of the Nemenz fiber to link cameras with video assist via their own Videohubs.

“Even recording [4:4:4] RAW,” Fiore explains, “I like making the look more polished, with an indication of where the image is going in timing.” But on-location correction proved unwieldy, so a timer was brought in to manage color issues from the controlled environment of an office, ensuring dailies would reflect the intended look. “As a DP I didn’t want to spend my time in a tent. I prefer to work right next to the director, finding solutions to problems as they arise. Shawn really moves things along pretty efficiently, so we’re there to work, without much time for contemplation.”
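The article doesn’t name the mechanism Fiore used to carry that early look, but a common way to express a primary correction of this sort is an ASC CDL-style slope/offset/power adjustment; the sketch below uses invented values, not the film’s grades.

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power):
    """ASC CDL-style primary correction, per channel:
    out = (in * slope + offset) ** power, clamped to the 0-1 range."""
    rgb = np.asarray(rgb, dtype=np.float64)
    slope, offset, power = (np.asarray(p, dtype=np.float64) for p in (slope, offset, power))
    graded = np.clip(rgb * slope + offset, 0.0, None) ** power
    return np.clip(graded, 0.0, 1.0)

# a made-up "cooler, slightly lifted shadows" look applied to a mid-grey pixel
pixel = [0.18, 0.18, 0.18]
look = apply_cdl(pixel,
                 slope=[0.95, 1.0, 1.08],
                 offset=[0.01, 0.01, 0.02],
                 power=[1.0, 1.0, 0.98])
print(look)
```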

Many of the film’s boxing matches take place in different arenas. In addition to the Silverdome and a dried-out polar bear pool at the local zoo, fights were staged with multiple redressings of Detroit’s famed Cobo Center, while a nearby firehouse was transformed into a gym set. Steadicam operator David Emmerichs reports that “our grip and lighting crew made the decayed-industry look really work for us. There were a couple of sets built in old warehouses, but mostly we worked in practical locations.”

According to Fiore, each location had its own complications, due primarily to the need for customization. “We shot at the historic [Ford] Model-T factory [in Highland Park], which required a full tent – about 600 feet by 80 feet – for blocking daylight. The building also had to be prepared for rigging, which was difficult in places.” At Fiore’s suggestion, Key Grip John Janusek and Gaffer Chris Culliton, and their teams, surveyed locations prior to prep, providing preliminary budgeting data that helped determine costs and complications beforehand while building on the virtual landscape data already in the system.

Since this film’s future wasn’t all touch-screens and jetpacks, only light touches were used to convey a different look. “It is a future that looks like the past,” Fiore notes. “A kind of sentimental thing with ‘50s design elements, even in the vehicles. For the boxing matches, I didn’t want the standard concert-with-moving-lights thing, and avoided photographing any trusses because that suggests concerts too. I achieved most of the different looks through LED light ribbons, which gave unique accents to each venue, depending in part on the architecture of the location. At the top of the ring, I had fluorescents inside metal grates; whenever possible we preferred designing new kinds of lights rather than use a practical as-is, just to give a special flair to each setting.” Vari-Lites were also employed for the bouts.

A number of scenes involving robots either walking around or entering the ring required a different capture methodology, one implemented on location. “Image-based capture [IBC] allowed us to record a physical presence on set that the actors could see and relate to,” McLaglen explains. “We’d put stilts on human performers to get them up to the appropriate height to provide an eyeline. Like Simulcam-B, you can see the composition, and it is all right there in front of you, so there’s no chance you’ll give the animated character a haircut through blind framing.”
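McLaglen’s “haircut” quip boils down to checking that the CG character’s full height projects inside the frame. The pinhole-projection sketch below illustrates that check; the focal length and sensor dimensions are assumptions for illustration, and this is not the Simulcam code.

```python
def in_frame(point_cam, focal_mm, sensor_w_mm, sensor_h_mm):
    """Project a point given in camera space (x right, y up, z forward, metres)
    through an ideal pinhole and report whether it lands inside the frame."""
    x, y, z = point_cam
    if z <= 0:
        return False                      # behind the camera
    u = focal_mm * x / z                  # image-plane position in mm
    v = focal_mm * y / z
    return abs(u) <= sensor_w_mm / 2 and abs(v) <= sensor_h_mm / 2

# top of an 8-foot (2.44 m) robot head, 6 m from camera, lens at eye height (1.7 m);
# sensor dimensions approximate a Super 35-sized 16:9 imager
head_top = (0.0, 2.44 - 1.7, 6.0)
print(in_frame(head_top, focal_mm=32, sensor_w_mm=23.6, sensor_h_mm=13.3))
```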

An array of Sony EX3s was used on set to provide IBC video reference from all sides, which would allow Digital Domain to composite a CG robot over the human stand-in, an approach employed for much of the South African indie hit District 9. “Glenn Derry had infrared LEDs on our camera, plus infrared cameras to track our camera and Technocrane,” David Emmerichs states. “Plus there was great lighting reference available from the full-size physical robots.” Legacy Effects provided nineteen robots, many of which could be puppeteered on camera for close-ups, augmenting the CG effort.

While Fiore used Cooke S4 glass for most of the shoot, the Steadicam boxing action employed Angenieux’s short zooms. “Those lenses were very high quality and let us change focal lengths quickly for the fights,” Fiore says. “Steadicam seemed to be the right tool to replicate what was done on the mocap stage with the Simulcam system.”

Working on elevated parallels – platforms that deploy like a folding chair – arranged at different heights around the ring, Emmerichs could descend or ascend while executing circling Steadicam moves. “The parallels let me get close to robot-eye level in the ring,” he explains. “In addition to being able to see the mocap on monitors to get a precise idea of the fight choreography, we’d also get an overhead view of the ring, indicating the path of the camera and the speed we’d need to move to get from one position to the next. This ensured the mocap really matched with the drama.”
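That overhead ring diagram amounts to simple arithmetic: each leg’s required walking speed is its length divided by the time the mocap cut allows. The sketch below uses invented waypoints and cue times.

```python
import math

def segment_speeds(waypoints, cue_times):
    """Given overhead (x, y) waypoints in metres and the cue time (seconds) at
    which the operator must reach each one, return the speed needed on each leg."""
    speeds = []
    for (x0, y0), (x1, y1), t0, t1 in zip(waypoints, waypoints[1:], cue_times, cue_times[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / (t1 - t0))
    return speeds

ring_path = [(0.0, 0.0), (3.0, 1.0), (5.5, 4.0), (3.0, 6.5)]   # invented arc around the ring
cues = [0.0, 4.0, 9.0, 13.0]                                   # seconds into the round
print([round(s, 2) for s in segment_speeds(ring_path, cues)])  # metres per second per leg
```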

Acknowledging that the film required an epic quality in key sections, Fiore relied on the Technocrane nearly every day, citing the lack of flexibility in fixed-arm units. “[The Technocrane] allowed us to show off the various venues and put across this past/future look in a big way,” he notes.

The zoo location was one such epic moment, playing host to aerial shooting by David B. Morell, who, like Emmerichs, benefitted from seeing the virtual fighters played back on set before composing his frames. Morell also shot helicopter footage of Kenton’s truck making a cross-country trek to his next bout. “That was some dynamite stuff,” he reports. “We followed this green-covered transport through a beautiful dusk setting pretty closely. Spacecam stabilization is superior, and together with a RAMS articulated arm, which banks the camera opposite from the copter’s banks, we can track along either side or come straight in at a subject while mounted in the nose position.” Low light levels forced Morell to abandon the f/3.5 10:1 zoom lens used in daylight shooting, in favor of a 4.7:1 17-80mm Angenieux Optimo that afforded him a stop and a third more in exposure.
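Morell’s stop-and-a-third figure checks out against the standard relation – the gain in stops between two maximum apertures is 2 · log2(slower/faster) – if the 17-80 Optimo is taken at its commonly quoted T2.2 maximum (the T-stop is an assumption; the article only gives the f/3.5 of the daylight zoom).

```python
import math

def stop_difference(slower_stop, faster_stop):
    """Exposure gain, in stops, when moving from the slower lens to the faster one:
    stops = 2 * log2(slower / faster)."""
    return 2 * math.log2(slower_stop / faster_stop)

# f/3.5 daylight zoom vs. the 17-80 Optimo, assuming its usual T2.2 maximum aperture
print(round(stop_difference(3.5, 2.2), 2))   # ~1.34 stops, i.e. about a stop and a third
```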

After editorial and VFX wrapped, Fiore supervised the digital intermediate, timed by Skip Kimball at Modern VideoFilm. (Modern had also calibrated monitors on-set and provided color-corrected dailies.) “The DI went very smoothly, due to our VFX producer Ron Ames having ridden herd on color issues,” Fiore recounts. “He had our timer go through everything again, in case there was a day where the correction hadn’t been all the way there, so micromanaging color issues really paid off.”

Bringing in a VFX-heavy feature for $80 million is no mean feat, and Josh McLaglen, a veteran of some of the most logistically complex films in recent memory, points to the unique workflow as a big part of that process. “If you’ve got two ordinary people talking over coffee or even fighting each other for most of your movie, our production paradigm is unnecessary,” he concludes. “But if two of the four main characters are paranormal, it makes really good sense, and is a cost-effective methodology for budgeting complex films. Avatar was translated virtual-to-virtual, only rarely with humans being interactive. But this one is going virtual-to-live-action in the real world. To accomplish that, I think we used the tools better this time.”

By Kevin H. Martin / photos courtesy of DreamWorks SKG
