
Star Fall

New emotional worlds (and workflows) burst from high above our planet in the stunning new stranded-in-space drama, Gravity

During the early days of planning for Alfonso Cuarón’s Gravity, cinematographer Emmanuel “Chivo” Lubezki, ASC, AMC, remembers the director offering to let him exit the project. “Once we realized how much of it was going to be CG, Cuarón actually said, ‘Chivo, I’ll understand if you don’t want to do this. It’ll probably be horribly boring for you to deal with all the green screen. But I really don’t want anybody else but you to light it, and I think this could be something unique.’ When invited to do something I haven’t done before, it’s incredibly exciting, so I was hooked.”

Gravity details the travails of a newbie space traveler, Dr. Ryan Stone (played to perfection by Sandra Bullock), who is lost in low-Earth orbit. While out on a spacewalk, debris from a satellite decimates the space shuttle she shares with veteran astronaut Matt Kowalsky (the reliably glib George Clooney), squashing plans for a ride home and forcing a desperate scramble toward the International Space Station (ISS). The project required a workflow designed to address specific color-management issues, innovative lighting and capture methodologies, and motion-controlled robotics driven by extensive previsualization, all of which challenged this already visionary partnership between director and cinematographer to reach a new level of storytelling.

“Previs began simply, with Alfonso doing storyboards,” recounts Lubezki, who has known Cuarón since film school in their native Mexico and shot nearly all of his features. “Then Framestore’s [VFX supervisor] Tim Webber got deeply involved in what became an even more successful collaboration than the three of us had on Children of Men, where we had a CG baby that was as good as anything I’ve seen.”

Lubezki likens the experience of learning to use CG lighting, with numerous animators, to having 12 gaffers on set. “The process involved endless relighting,” he remarks, “because if you change the animation, a face might not read right anymore. We wanted a look as real as possible, so we set up rules and limitations, which also kept me from going insane over all the potential options.”

High-end digital tools like Maya allowed Lubezki to set bounce and key light, which he says were essential to lock in up front. “Whether the ISS was seen in direct sunlight or in darkness,” he adds, “it wouldn’t look like it was in an environment apart from the astronauts or another craft. We spend the whole movie looking back at Earth and the spacecraft against a starfield, which might not allow for much variety. Being able to go from one lighting scenario to another helped make the movie feel more like a journey, to convey the idea that we’re traveling from one point to another.”

It took Framestore and U.S. previs house The Third Floor months to create a feature-length piece of animation – replete with sound effects – that was such a convincing template that Lubezki’s daughters thought it was the finished movie. “We became involved in nearly all aspects of production, from production design to reference gathering,” Webber recalls. “There was a learning curve for our animators, who are used to putting weight into characters and had to break from that habit to work in space.”

Typical tricks of the VFX trade to sell scale and movement were off the table. “You can’t rely on aerial perspective because without atmosphere there is no attenuation of image due to distance,” Webber continues. “And the lack of reference points [in space] can get you into trouble, like not being able to tell if the character is coming toward you or the camera is moving toward him. Even though we played it straight with respect to science and realism, we did put in more stars than you’d be able to see in daylight, just so there’d be some frame of reference to gauge movement.”

When the rain of debris decimates the ISS, Framestore elected to run dynamic simulations of the event, with rules based on physical laws driving the motion. “It’s difficult for an animator to work out all the technical aspects of the motion of a cord in zero gravity when you’ve got all this going on,” Webber explains. “So seeing the simulations gave us a lot of inspiration, plus the occasional surprise that was so nice we’d incorporate the idea into the scene. Eventually we got all the movements worked out and Chivo had the lighting the way he wanted it, and we worked to translate all that information into directions that would drive the operations on-set.”

Among the most noteworthy aspects in the Cuarón/Lubezki canon is their increasing reliance on sustained single shots. “Alfonso and I first became excited about solving scenes in a single shot on Y Tu Mamá También,” Lubezki notes. “At that point it was just finding a way to make the scene more immersive and draw the audience in. It starts out very objective, like you’re a fly on the wall, but from an uncomfortable perspective where you can’t yet see their faces, yet the boy’s nervousness is captured. We move closer in after he begins to approach [an older woman], taking in her reaction and then we go wide again, with an elasticity that encompasses both objective and subjective. It was an intuitive approach, and one we didn’t pause to think out. We’ve learned more since then, as on Children of Men, and now with Gravity, that exploration goes further.”

Shooting digital was also uncharted territory. “We love film and prefer to shoot on film, but the grain would have been a problem, given that we were going to release in 3D,” Lubezki explains. “Capturing on Alexa [with Zeiss Master Primes] gave us the latitude I needed, especially with very harsh, hard highlights that I didn’t want to look digital, and all the scenes in near-total darkness.” The duo did manage to incorporate a shot-on-film sequence into the movie. “We are back down on Earth at the end, so to give that a different dimension and gravity, we shot it in 65 millimeter, which has its own kind of hyper-reality. You can almost breathe the wind as you watch that scene.”

Digilab Services was engaged to manage the on-set data, and lab supervisor James Eggleton says Gravity was the rarest of creatures: a show with plenty of prep time. “Arriraw had not yet been released, so there were no workflow solutions,” Eggleton shares. “Part of the wariness of taking on a new camera and file format is over the possibility of not being able to handle it in post, but after evaluating the processing tools at Shepperton Studios to determine which produced the best-quality images, we locked in a methodology – using Codex Onboard and Studio recorders – and settled on an algorithm to convert Arriraw to DPX files for Framestore, which allayed a lot of fears.”
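
The conversion step itself is simple to picture: sweep the offloaded Arriraw frames and hand each one to a converter that writes DPX for the VFX vendor. The sketch below is purely illustrative – the directory paths are invented and “arriraw_to_dpx” is a hypothetical command standing in for whichever processing tool the lab actually settled on.

```python
# Purely illustrative: batch-convert offloaded Arriraw frames to DPX.
# "arriraw_to_dpx" is a hypothetical converter; paths are invented.
import subprocess
from pathlib import Path

SOURCE = Path("/mnt/codex_offload")      # hypothetical Codex offload location
DEST = Path("/mnt/framestore_delivery")  # hypothetical delivery location

def convert_day(source=SOURCE, dest=DEST):
    dest.mkdir(parents=True, exist_ok=True)
    for frame in sorted(source.glob("*.ari")):       # Arriraw frame files
        out = dest / frame.with_suffix(".dpx").name
        subprocess.run(["arriraw_to_dpx", str(frame), str(out)], check=True)

if __name__ == "__main__":
    convert_day()
```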

While Digilab’s initial effort focused on wrangling camera data, the show’s exacting color-management needs soon became apparent. “Nearly every completed shot would include multiple sections captured separately under different lighting conditions, and they all had to blend seamlessly,” Eggleton laughs. “The human eye picks up on subtle differences in qualities of white, so very finely tuned LUTs were required. We devised several camera settings with rules to govern color reproduction: setting one for use on this stage with that lighting, and so on. This approach got us 90 percent of the way on technical color matching, and then we worked with Chivo on look-up tables to bridge that last bit and apply his creative looks on top.”
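
To picture that per-setup matching concretely: the approach amounts to a lookup keyed on stage and lighting rig, with a finely tuned 3D LUT applied per combination. The sketch below is a minimal numpy/scipy illustration with invented setup names and identity tables standing in for the show’s actual (non-public) LUTs.

```python
# Minimal illustration of per-setup LUT matching. Stage and rig names are
# invented; the identity tables stand in for the finely tuned LUTs the lab
# actually built for each stage/lighting combination.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

LUT_SIZE = 17
# Identity 3D LUT: the value stored at grid point (r, g, b) is (r, g, b).
IDENTITY_LUT = np.stack(
    np.meshgrid(*[np.linspace(0.0, 1.0, LUT_SIZE)] * 3, indexing="ij"),
    axis=-1)

# Hypothetical registry keyed on (stage, lighting rig).
LUT_REGISTRY = {
    ("stage_a", "led_cube"): IDENTITY_LUT,
    ("stage_b", "varilite_rig"): IDENTITY_LUT,
}

def apply_setup_lut(rgb, stage, lighting):
    """Trilinearly interpolate the setup's 3D LUT over an (H, W, 3) image."""
    lut = LUT_REGISTRY[(stage, lighting)]
    axes = [np.linspace(0.0, 1.0, n) for n in lut.shape[:3]]
    coords = np.clip(rgb, 0.0, 1.0).reshape(-1, 3)
    out = np.empty_like(coords)
    for channel in range(3):  # interpolate each output channel separately
        interp = RegularGridInterpolator(axes, lut[..., channel])
        out[:, channel] = interp(coords)
    return out.reshape(rgb.shape)

frame = np.random.rand(4, 4, 3)  # stand-in for a debayered camera frame
matched = apply_setup_lut(frame, "stage_a", "led_cube")
```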

The groundbreaking workflow began with the camera original fed to a FilmLight Truelight color-processing box on-set, which then applied Lubezki’s choice of looks. “Truelight let him apply printer lights like normal dailies and gave many more options than just a base look,” Eggleton continues. Chivo’s decisions were used in the render pipeline going to Avid editorial and to Framestore. The final step in Digilab’s color pipeline applied a virtual film stock, developed by Lubezki and a colorist brought in during prep, that provided a film-style rolloff curve; all data was stored on LTO tape.
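
That “virtual film stock” is, at heart, a tone curve with a gentle highlight shoulder rather than a hard digital clip. The exact curve Lubezki and the colorist built is proprietary; the snippet below only illustrates the general shape with a standard extended-Reinhard rolloff.

```python
# Illustrative only: a generic film-style highlight rolloff. The show's
# actual "virtual film stock" curve is proprietary; this extended-Reinhard
# shoulder just shows the shape -- near-linear toward the toe, compressing
# highlights smoothly instead of clipping them.
import numpy as np

def film_rolloff(x, white=8.0):
    """Map scene-linear values so 'white' lands at 1.0 with a soft shoulder."""
    x = np.asarray(x, dtype=float)
    return x * (1.0 + x / (white * white)) / (1.0 + x)

linear = np.array([0.0, 0.18, 1.0, 4.0, 8.0])
print(film_rolloff(linear))  # highlights ease toward 1.0 rather than hard-clipping
```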

Playback was another area requiring attention. “We had to ensure that the 10-bit color monitors used by Chivo and Alfonso were calibrated accurately,” Eggleton explains. “They were capable of near DCI-P3 color coverage, representing what is achievable in the cinema, so the displays showed a complete emulation of our color pipeline. We also took over an old screening room near the stage and installed playback facilities. We could eject the mag from the recorder, then take the camera data on a Codex Datapack over for quick ingest and playback on a Christie 2K projector.”

The screening-room viewings became an essential tool in Lubezki’s kit. “I wanted to grade these shots as they came out of the oven,” he reveals. “Between each setup in the cube, new packages from Framestore had to be installed with all the motion-control data, the robot arms had to be checked for safety, and the lighting inside the box had to be set for the next environment, so while that took place, I could run over to the screening room and time the shot. If you just send Raw, the possibilities for interpretation are wide, forming what I call ‘digital quicksand.’ You have to give the compositors very precise color and contrast so they know how it extends into their animation.”

Live-action shooting involved even more technological innovation. Webber characterizes the process as “having to invent many different boxes, combine them with other existing ones and get the lot to live together for the duration.” For example, to ensure accurate and repeatable camera moves, production brought in Bot & Dolly’s programmable robots to carry and move cameras and lights through elaborate arcs. The firm devised a Maya-based series of commands that allowed Framestore animators to direct the robots in ways that matched previs action in precise detail.
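
Conceptually, driving the robots from Maya means sampling the previs animation frame by frame and exporting it as motion-control waypoints. Bot & Dolly’s actual plugin and data formats are not public; the sketch below uses the stock maya.cmds API with an invented camera name and CSV layout, just to show the shape of such an export.

```python
# Hypothetical sketch: sample a previs camera per frame inside Maya and
# write out waypoints a motion-control system could consume. The camera
# name and CSV layout are invented; Bot & Dolly's real tooling is proprietary.
import maya.cmds as cmds

def export_camera_waypoints(camera="previs_cam", path="waypoints.csv"):
    start = int(cmds.playbackOptions(query=True, minTime=True))
    end = int(cmds.playbackOptions(query=True, maxTime=True))
    with open(path, "w") as fh:
        fh.write("frame,tx,ty,tz,rx,ry,rz\n")
        for frame in range(start, end + 1):
            cmds.currentTime(frame, edit=True)  # step the scene to this frame
            t = cmds.xform(camera, query=True, worldSpace=True, translation=True)
            r = cmds.xform(camera, query=True, worldSpace=True, rotation=True)
            values = [frame] + t + r
            fh.write(",".join(str(v) for v in values) + "\n")

export_camera_waypoints()
```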

Lubezki found the robots to be superbly useful tools in many instances. “But there were times when we needed to go faster than the robot could move,” he says, “and I also wanted to be able to change the color of light as we moved.”

So production constructed a cube, within which the actor could be harnessed, surrounded by an array of programmable LEDs. Lubezki had used LED panels in the past, but it wasn’t until he went to a Peter Gabriel concert featuring billboard-style LED lighting that he decided such an approach made sense for Gravity. “I called a meeting to explain the idea,” he recounts, “and there was resistance at first, due to issues of flicker and color variance; if you have the panel oriented north/south, there is one color, but inverted you get a different hue. All that had to be resolved, but it was worth the effort.”

Also within the cube was a large TV monitor, which displayed animation from the previs. “This was wonderful in a couple of different ways,” Lubezki continues. “The actor sees the environment and how objects are moving in that environment, and at the same time we can see the interaction of that light on the actor. We capture true reflections of the environment in the actor’s eyes, which makes the face sit that much better within the animation.”

Those reflections also had to match the actual lighting conditions of low Earth orbit. “I wanted to use the precise amount and size of bounce light you’d get from the Earth at that height,” he adds. “We had a consultant work out the exact size of Earth in relation to the astronaut, which let me know the size of the bounce light when positioned twelve feet away from Sandra. Then there were times when we needed very rapid lighting moves with harsh sunlight effects, which were not possible with the LEDs. So we’d pop off the side of the box and use Vari-Lites on cranes to get that intensity and allow for color changes, which were very exciting when we got into sunrises and sunsets.”
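
The consultant’s geometry can be sanity-checked with a little trigonometry. The figures below assume a rough ISS-like altitude of 400 kilometers (the production’s own numbers are not published); they show why an Earth bounce reads as an enormous source that wraps most of the way around the actor.

```python
# Back-of-the-envelope sketch of the geometry described above: how wide
# must a bounce source 12 feet from the actor be to subtend the same
# angle Earth does from orbit? Altitude and radius are rough assumptions.
import math

EARTH_RADIUS_KM = 6371.0
ALTITUDE_KM = 400.0          # roughly ISS altitude (assumed)
BOUNCE_DISTANCE_FT = 12.0    # distance quoted in the article

# Angular radius of Earth as seen from that altitude
angular_radius = math.asin(EARTH_RADIUS_KM / (EARTH_RADIUS_KM + ALTITUDE_KM))
angular_diameter_deg = math.degrees(2 * angular_radius)

# Width of a flat source at 12 ft that subtends the same angle
source_width_ft = 2 * BOUNCE_DISTANCE_FT * math.tan(angular_radius)

print(f"Earth subtends ~{angular_diameter_deg:.0f} degrees from orbit")
print(f"An equivalent bounce at 12 ft would be ~{source_width_ft:.0f} ft wide")
```

Run as-is, the sketch reports Earth subtending roughly 140 degrees from that altitude, which at 12 feet works out to a source nearly 67 feet across – a hint at why a surrounding box of panels is a more practical way to deliver that kind of wrap than any single bounce card.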

In keeping with Cuarón’s desire for an “IMAX in space” look, Lubezki labored to give the shoot a film-like dimension. “This all had to look like it was captured via lens, not algorithm,” he insists. “So we asked Tim to put some aberrations in during post to make it feel more like the photos we’ve seen from NASA.”

In fact, the massive data and rendering needs for the show required Framestore to utilize 15,000 processor cores, and that, coupled with new rendering techniques, allowed Webber’s crew to deliver elements – composited in Nuke – that were convincing in appearance and appropriately illuminated.

 

The combination of slowly changing perspectives in sustained shots and wide lenses paid dividends for Gravity’s 3D finish, which was overseen by Vision3 stereo supervisor Chris Parks. “We had a virtual camera at Framestore that let us control depth functions,” Parks says. “When Sandra floated off in space, we separated her slightly from the starfield, using 3D to make her feel very small. At another point we went very deep, when we see her POV as her hands reach out to those of another astronaut coming to camera. At the point when they make contact, we increased the interaxial to five times normal, then scaled it back down as they separate and drift apart.”
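
That interaxial move can be visualized as a simple ramp keyed to the frame of contact. The sketch below is illustrative only, with an assumed base separation and ramp length; the show’s actual values and tools are not public.

```python
# Illustrative sketch (not the show's actual tooling): ramp the stereo
# interaxial up to 5x a base value at the moment of contact, then ease
# it back down as the characters drift apart. Base and width are assumed.
import numpy as np

def interaxial_ramp(frames, contact_frame, base=0.065, peak_scale=5.0, width=24):
    """Return per-frame interaxial separation (base is an assumed value)."""
    t = np.clip(np.abs(frames - contact_frame) / float(width), 0.0, 1.0)
    ease = 1.0 - (3 * t**2 - 2 * t**3)      # smoothstep falloff from the peak
    return base * (1.0 + (peak_scale - 1.0) * ease)

frames = np.arange(0, 100)
ia = interaxial_ramp(frames, contact_frame=50)
print(ia[0], ia[50], ia[-1])   # base value far from contact, 5x at contact
```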

When Bullock’s character looks through a shuttle’s shattered windscreen, she is stunned to see a dead crewmember. “Alfonso asked how the 3D could make this more powerful, but without just throwing something out into the audience to make them duck,” recalls Parks. “We decided to float a little Marvin the Martian doll out into the audience, which is a kind of fun, lighthearted tension-breaker, and then we whip-pan right off that to the dead astronaut, which makes the emotional revelation more jarring.”

Using floating windows [see There’s No Place Like Home, ICG March 2011], Parks was able to subtly distort the frame for a six-minute take. “It’s a dream sequence that doesn’t reveal itself as such right away,” he elaborates. “To give a subconscious impression that something different was happening, we pushed the top right corner into the set while pulling the bottom left corner ahead, skewing the whole view. That sense of unease represents, to me, how 3D can be expanded beyond just giant VFX movies. It’s a tool that can be most effective in very small spaces, as this one shot reveals.”
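
The skew Parks describes amounts to animating per-corner offsets of the stereo frame edges. The sketch below is a purely illustrative stand-in with invented values, showing one corner pushed behind the screen plane while the opposite corner is pulled ahead over the course of the shot.

```python
# Illustrative floating-window sketch: offset the stereo frame edges per
# corner so one corner sits behind the screen plane and the opposite
# corner floats in front. Values and ramp are invented, not the shot's.

def floating_window_offsets(frame, duration, max_px=12.0):
    """Per-corner horizontal offsets (pixels) applied to the L/R frame edges.
    Positive pushes that corner behind the screen; negative pulls it ahead."""
    ramp = min(frame / float(duration), 1.0)      # ease in over the shot
    return {
        "top_right": +max_px * ramp,              # pushed into the set
        "bottom_left": -max_px * ramp,            # pulled ahead of the screen
        "top_left": 0.0,
        "bottom_right": 0.0,
    }

print(floating_window_offsets(frame=180, duration=360))
```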

Technicolor colorist Steve Scott handled the DI, interfacing with Webber, Cuarón and Lubezki in person and long-distance between the UK and Los Angeles. “I feel it is very much my responsibility not to ‘break’ the CG during the intermediate process,” Lubezki declares. “Tim and all of Framestore did so much to integrate our live astronauts with their CG suits, so whatever touches I made at the end were respectful of what they’d done to sustain the illusion while completing the environment.”

Lubezki calls it an “unfortunate necessity” that Gravity had to be completed at 2K resolution. “I would describe Alfonso as a true artist – an author of art rather than a manufacturer of product. But he still must make allowances for budget, and it would have been prohibitively expensive and time-consuming to finish in 4K. If it were possible,” the cinematographer smiles, “both of us would still be working on the movie to perfect every last shot.”

By Kevin H. Martin / Photos courtesy of Warner Bros. Pictures
