
A New Hope

On-set VFX makes a mighty return via LED wall imagery on The Mandalorian, the first Star Wars live-action series for television.

by Kevin H. Martin / Photos by Francois Duhamel, SMPSP & Melinda Sue Gordon, SMPSP

 


While working on the Star Wars film Rogue One, director of photography Greig Fraser, ACS, ASC, proposed shooting cockpit scenes using an LED screen that displayed the exterior space environments. When implemented, the process allowed the cinematographer to capture in-camera interactive light and reflections from background imagery created ahead of shooting by Industrial Light & Magic (ILM).

That pioneering work initiated by Fraser became a jumping-off point for the Disney+ series The Mandalorian. ILM, along with several other vendors, collaborated on the creation of a virtual, largely in-camera workflow, which employed a screen-based capture volume that successfully matched the look and feel of the original Star Wars features. Taking place after the events depicted in Return of the Jedi, the eight-episode first season chronicles the exploits of a bounty hunter (Pedro Pascal) as he roams the stellar backroads of a galaxy far, far away. However, before The Mandalorian could go to space, ILM VFX supervisor Richard Bluff, along with associate Kim Libreri and ILM creative director Rob Bredow, met with Fraser and Lucasfilm staffers to talk about a virtual-production approach. “And,” as Fraser explains, “it turned out that [ILM chief creative officer] John Knoll had been looking at building technology that would let us expand that Rogue approach to whole environments.”

Bluff, who was on board ten years earlier when George Lucas was exploring the possibility of a live-action Star Wars TV series, picks up the narrative.

“Jon [Mandalorian creator Jon Favreau] was adamant that due to the scope and scale of what would be expected from a live-action Star Wars TV show, we needed a game-changing approach to the existing TV production mold. It was the same conclusion George [Lucas] had arrived at more than 10 years ago during his TV explorations; however, at that time the technology wasn’t around to spur any visionary approaches.”

ILM Visual Effects Supervisor Richard Bluff says Mandalorian creator Jon Favreau (center) “knew that any breakthrough [in making the first Star Wars live-action series for television] was likely to involve real-time game technology.” On set here with series director of photography Baz Idoine (left at camera), IG-11 physical performer Rio Hackford (rear in grey) and writer Dave Filoni (right in yellow cap) / Photo by Francois Duhamel, SMPSP

“Jon already had extensive experience in immersive film and multimedia workflows directing The Jungle Book and The Lion King,” Bluff continues. “And he knew that any breakthrough was likely to involve real-time game technology. He challenged me and other key creatives to fully explore how we could crack the obvious production challenges of a sprawling live-action Star Wars TV show.”

Bluff says Libreri had been pursuing game-engine technology to support animated or live-action productions, and that became a component of what was pitched to Favreau. 

“ILM then partnered with Epic [Games] to make their Unreal Engine for gaming into a robust production tool,” Bluff adds, “allowing for real-time display on LED screen walls.” (In June 2018, a 35-foot-wide capture volume was built to test the screen’s potential.) 

LED panels with a 9-millimeter pixel pitch had been used on Rogue One, a limitation Fraser now sees as archaic compared to the 2.8-millimeter-pitch panels currently deployed.

Photo by Francois Duhamel, SMPSP

“The results on screen have a lot less moiré, which is the trickiest part of working out the shooting of LED screens,” Fraser reveals. “If the screen is in very sharp focus, the moiré can come through. That factored into my decision to shoot as large a format as possible, to ensure the lowest possible depth of field.” Fraser chose the ARRI ALEXA LF, adding that “Panavision was just building the Ultra Vista lenses, so we may have been the first to use them. They have a fast fall-off, so we duck the moiré issue. And the anamorphic aspect is very pleasing, keeping with the established softer analog look going back to 1977. I took them along while shooting Dune, but now they’re back for season two.”
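Fraser’s logic here can be put into rough numbers with the standard depth-of-field formula: at the same stop and subject distance, the longer focal lengths a large-format sensor calls for shrink the zone of acceptable focus, so a wall sitting well behind the actors goes soft before its pixel grid can resolve sharply and alias into moiré. The sketch below is a minimal illustration using placeholder values, not measurements from the production.

```python
# Minimal depth-of-field sketch with placeholder values (not production data).
# At the same stop and subject distance, a longer focal length yields a much
# shallower zone of focus -- which is why large format plus a fast anamorphic
# lets the LED wall fall softly out of focus and suppresses moire.

def depth_of_field(focal_mm, f_stop, subject_m, coc_mm=0.05):
    """Return (near, far) limits of acceptable focus, in meters."""
    s = subject_m * 1000.0                                   # subject distance, mm
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = s * (hyperfocal - focal_mm) / (hyperfocal + s - 2 * focal_mm)
    far = (s * (hyperfocal - focal_mm) / (hyperfocal - s)
           if s < hyperfocal else float("inf"))
    return near / 1000.0, far / 1000.0

for focal in (50, 100):                                      # focal lengths in mm
    near, far = depth_of_field(focal, f_stop=2.5, subject_m=3.0)
    print(f"{focal}mm @ T2.5, subject at 3m: sharp from {near:.2f}m to {far:.2f}m")
```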

Rogue One’s 2nd Unit director of photography, Barry “Baz” Idoine, was chosen to follow Fraser’s work on the pilot and complete the first season. “Prepping in May and June, we wanted to see what lens worked best for the volume and gave us the aesthetic,” Idoine elaborates. “Panavision’s Dan Sasaki gave us prototypes for the 75- and 100-millimeter. We asked for two full sets [T2.5 50-millimeter, 65-millimeter, 75-millimeter, 100-millimeter, 135-millimeter, 150-millimeter and 180-millimeter], based on our desired focal lengths. Combining the LF sensor with the 1.65 squeeze on the Ultra Vistas, you get a native 2.37 [aspect ratio]. Those lenses have a handmade feel in addition to being large format, and a great sense of character. We didn’t use any diffusion filtration at all.”
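Idoine’s 2.37 figure is straightforward arithmetic: the ALEXA LF’s open-gate sensor is roughly 1.44:1, and de-squeezing the Ultra Vistas’ 1.65x anamorphic compression widens that by the squeeze factor. A quick check, assuming ARRI’s published open-gate dimensions of roughly 36.7 x 25.5 millimeters:

```python
# Back-of-envelope check of the quoted 2.37:1 aspect ratio: the ALEXA LF
# open-gate sensor (approx. 36.70 x 25.54 mm) times the 1.65x anamorphic squeeze.
sensor_w_mm, sensor_h_mm, squeeze = 36.70, 25.54, 1.65
print(round(sensor_w_mm / sensor_h_mm * squeeze, 2))  # -> 2.37
```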

Pilot director of photography Greig Fraser, ASC, ACS, says the trickiest part of working with large LED background projections is the amount of moiré. “If the screen is in very sharp focus, the moiré can come through,” he explains. “That factored into my decision to shoot as large a format as possible, to ensure the lowest possible depth of field.” Above, B-camera operator Karina Silva shoots Kuiil holding “The Child,” aka Baby Yoda. / Photo by Melinda Sue Gordon, SMPSP

While details of the virtual production process were being ironed out, production designer Andrew Jones relied heavily on Lucasfilm’s brain trust in conceptualizing the look of the show. “There was a clear aesthetic coming from their art department in San Francisco, led by Doug Chiang,” Jones reports. “The team contributes massively to every Star Wars project and was central to our show, creating concept art for every set. We tried to interpret and reproduce their concepts faithfully, including as much production value within the scope of this series as we could on this crazy schedule. We tried not to let the process influence the concepts – initially, at least. Getting deeper into things, we realized what was possible with these [LED] screens within the capture volume. So, we began offering up more ideas for the environments through a concept artist of our own in Los Angeles. Doug was in contact with Jon throughout, providing weekly reviews.”

Conceptual designs were built out as 3D models. Jones says the virtual art department would put those into an Unreal Engine scene, texturing and lighting, and then bring in the director of photography. “That gave us an idea of how much should be built practically,” he shares. “Previs artists at The Third Floor – which provided previs for the whole series – started designing shots based on where we had placed the volume. If they blocked a scene and ran into a wall of the volume, that would flag a problem, causing us to rethink how the set was laid out or how to restage the action.”   

Halon virtual art department supervisor Kenny DiGiordano and his team became an extension of Jones’ production art department. “We worked with the creatives, not only from an aesthetic point of view but also with troubleshooting and testing tech,” DiGiordano recalls. “We set up lighting scenarios in Unreal that the DP wanted to see on screen. We would dial in lighting, then render out a 360 VR image prior to sending it to ILM. Up on screen, we could see how the light affected things on the stage, namely the reflections in Mando’s helmet. We added custom lightboxes and bounce lights, implementing virtual lighting as you would do [conventionally] on an actual set.”

“If some element in the scene needed to transition from being a physical presence on stage to part of the volume image, then ILM would have to scan it in advance to generate appropriate screen imagery,” notes Bluff. “Visual effects had to learn very quickly to play with all these other traditional production departments, some of which had jurisdiction, to a degree, [over] how VFX placed set decoration in their CG environments.”

By June 2018, getting the project greenlit was the principal concern. 

“Andrew Jones was producing material to try out in the volume,” recounts Bluff. “I thought about falling back on an old VFX technique for the projected backgrounds – photography. To test whether a projected landscape would look convincing, don’t begin with CG, just do a 360-degree wraparound photo of the desert and test subjects against that. I pulled in digital supervisor Enrico Damm, known at ILM as one of the most talented environment artists we’d ever had. I asked him to build me a fully 3D environment purely from location photography. He shot thousands of stills at a building on Angel Island, using the images to generate geometry through photogrammetry, and then mapped the pictures back onto the geometry. When we put this up in the volume, it looked amazing – photoreal. Greig Fraser went to Jon – I followed him over – and said, ‘I always believed this technology would work. But I never thought it would be this convincing!’”  

Halon’s virtual art department, led by Kenny DiGiordano, became an extension of Production Designer Andrew Jones’ team, “rendering out” a 360 VR image prior to sending it to ILM “to see how the light affected things on the stage, namely the reflections in Mando’s helmet,” DiGiordano shares. “We added custom lightboxes and bounce lights, implementing virtual lighting as you would do on an actual set.” / Framegrab Courtesy of Disney+

Blending the different processes for an in-camera solution required many vendors. Profile Studios provided camera tracking, lens (FIZ) data, and alignment of the physical and virtual worlds. “The camera tracking and lens data were streamed simultaneously to three Unreal Engine [UE4] workstations and a StageCraft workstation, each operated by the ILM stage [Brain Bar] team,” reports Profile president and creative director Matt Madden, who served as The Mandalorian’s virtual production supervisor. “UE machines handled all content rendering on the LED walls, including real-time lighting and effects, and projection mapping to orient the rendered content on each LED panel, deforming content to match the Alexa’s perspective. For each take, the StageCraft operator was responsible for recording the slate and associated metadata that would be leveraged by ILM in their postproduction pipeline.”
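The data flow Madden describes boils down to one tracking source fanning camera pose and focus/iris/zoom values out to several render machines every frame. The sketch below shows only that general shape; the addresses, port numbers, and packet layout are hypothetical stand-ins, not Profile’s or ILM’s actual protocol.

```python
# Hedged sketch of the fan-out Madden describes: one tracking source sending
# camera pose plus focus/iris/zoom (FIZ) values to several render nodes each
# frame. Addresses, ports, and packet layout are hypothetical stand-ins.
import socket
import struct
import time

# Stand-ins for three Unreal Engine render nodes and a StageCraft recorder.
RENDER_NODES = [("127.0.0.1", 9001), ("127.0.0.1", 9002),
                ("127.0.0.1", 9003), ("127.0.0.1", 9004)]

def pack_sample(frame, position, rotation, focus, iris, zoom):
    """Pack one tracking sample: frame counter, XYZ position, XYZ rotation, FIZ."""
    return struct.pack("<I9f", frame, *position, *rotation, focus, iris, zoom)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame in range(240):                       # ten seconds at 24 fps
    # Placeholder values; a real system would read the optical tracking
    # volume and the lens encoders here.
    packet = pack_sample(frame, (0.0, 1.8, -3.2), (0.0, 12.5, 0.0),
                         focus=3.1, iris=2.5, zoom=75.0)
    for node in RENDER_NODES:
        sock.sendto(packet, node)
    time.sleep(1 / 24)
```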

“ILM developed a low-res scanning tool for live set integration within StageCraft,” Madden continues, “taking advantage of Profile’s capture system to calculate the 3D location of a special iPhone rig, recording the live positions of a specific point on the rig to generate 3D points on live set pieces. The Razor Crest set, a partial set build, used this approach, with the rest of the ship a virtual extension into the LED wall. Controls on an iPad triggered background animation, giving the impression the physical Razor Crest set was moving. Once imagery had been rendered for each section of LED wall, the content was streamed to the Lux Machina team, which QC’d both the live video stream and the re-mapping of content onto walls, [which] Fuse Technical Group operated and maintained.”  
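Conceptually, the scanning tool Madden outlines just accumulates tracked probe positions into a point cloud that the virtual set build can reference. The sketch below illustrates that idea generically; the probe feed and the .ply output are placeholders, not ILM’s StageCraft implementation.

```python
# Generic illustration of the scanning idea: sample the tracked 3D position of
# a probe point as it is touched to a physical set piece, collect the samples
# into a point cloud, and write them out for the virtual set build. This is a
# placeholder sketch, not ILM's StageCraft tool.
import random

def record_point_cloud(sample_probe_position, num_samples):
    """Collect tracked probe positions (x, y, z) into a simple point list."""
    return [sample_probe_position() for _ in range(num_samples)]

def write_ply(points, path):
    """Write the collected points as an ASCII .ply point cloud."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Stand-in probe feed; in practice the positions would come from the optical
# tracking system following the rig.
fake_probe = lambda: (random.uniform(0, 2), random.uniform(0, 1), random.uniform(0, 3))
write_ply(record_point_cloud(fake_probe, 500), "set_piece_scan.ply")
```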

Madden says a typical shoot would involve the Brain Bar loading a 3D environment matching the physical set build into Unreal. “I used an iPad containing a UE4 interface that allowed us to remotely update the LED wall imagery. Under the guidance of the DP, I would start by dialing in the position and orientation of the virtual world for the first camera setup. After the virtual background location was approved, we would move on to any wall lighting adjustments for either virtual or physical sets. I would use the iPad for lighting changes requested by the DP to enhance the physical set and actors, and the Brain Bar team would make any final adjustments to the 3D scene itself based on input from the VFX super, such as color-correcting the dirt on the virtual ground to match the dirt on the physical set.”

The Brain Bar team took input from the shooting crew as well, with gaffer Jeff Webster using his iPad to direct the setting of virtual flags to light the set.

“The Brain Bar didn’t just drive the ship on the volume work,” Jones states. “In addition to lining up the virtual and physical, they triggered any animation.”

“The on-set ILM VFX team adjusted the wall of LEDs locally to match it with our adjacent physical build,” Jones continues. “When time allowed, we did some repainting to match the physical set piece to the virtual one. And we have other practical elements on set to help sell the illusion, like traditional wind-machine effects. If some of the more experimental screen loads didn’t work, we could always turn the screen green and shoot it that way, or even bring in a hard green surface. In those cases, we still got great lighting on the characters, because you only need to see the green right behind Mando to pull a matte.”

Idoine says that working with large LED screens was an immensely positive experience. “Seeing this 3D rear-projection of a dynamic real-time photoreal background through the viewfinder is tremendously empowering,” he declares. “It’s phenomenal because it gives so much power back to the cinematographer on set.” / Photo by Francois Duhamel, SMPSP

With the virtual workflow tied to so many partners and departments, it fell to Richard Bluff to keep everyone in the loop. “It was my responsibility to make sure [Production] saw all the changes during development, as often as possible,” he recounts. “Landis Fields is one of the best at modeling, painting and technical animation I’ve seen, a real Swiss Army Knife of a guy. I brought him in early as virtual production visualization supervisor, and he crafted elaborate breakdowns as you’d see in promo pieces. That was instrumental in demonstrating to people in various disciplines how the work of all these companies fit together. It was an informative and visual way to let everyone know how collaborative VFX would be and reassure them nobody here was going to try a land-grab.” (Virtual production supervisor Clint Spillers headed up the process for Lucasfilm.)

Fraser says the process was light-years from a typical feature workflow.

“I worked with guys who built 3D models, and to some degree, I had to teach them filmmaking,” he explains. “These images were not just created for a movie; they were going up on a screen that had to fully integrate with what I did on the live-action side. That meant showing them how I would light something and also why it was important to me that the light be controlled in that fashion, and how it was in service to the story.” 

Another part of the learning process involved understanding the limitations of the gaming engine for real-time applications. “There’s no point in doing [a] complicated 3D forest when the gaming engine – at present anyway – isn’t capable of properly rendering a forest,” Fraser adds. “That might result in a compromise to use a sparse background with a few trees, or to look in a different direction.”

Idoine says there was also the tricky issue of matching foreground contrast to the LED backgrounds. “We are talking about different qualities of light,” he notes. “If it isn’t done right, it can look ‘off’ in a very distracting way. When we resolved that, the differences between what we saw on set and our final in the DI were quite minor in most instances.”

A single-camera approach was considered for the volume, but Fraser and Idoine quickly learned the rules of the system and were able to add a second camera to the mix in most instances, and multiples for action scenes. 

“A-camera is tracked by a motion system, which knows the height and distance from the screen,” Idoine explains. “Within what is called the frustum – a plane that intersects that field of view – is the 6K image. Outside of that field, the image is low-res, basically just for ambient lighting to provide an accurate reflection in the lead character’s shiny metallic helmet. If the frustum intersected with the frameline of B-cam, we’d have to change that camera to find an angle that showed the 6K. If A-camera moved up or tracked back significantly during a take, that perspective change would also mean the frustum altered and would impact the ability to use a second camera.” 
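In rendering terms, the frustum is the pyramid of space the taking lens actually sees; only the patch of LED wall inside it needs the full-resolution, perspective-correct image, while everything outside can be served at low resolution purely for ambient light and reflections. The sketch below illustrates that inside/outside test in a simplified form (single-axis rotation, made-up coordinates and field-of-view values), not the production system.

```python
# Hedged sketch of the frustum idea described above: a point on the LED wall
# gets the full-resolution render only if it falls inside the tracked camera's
# field of view; everything else can run at low resolution for ambient light
# and reflections. Numbers and the single-axis camera model are made up.
import math

def inside_frustum(cam_pos, cam_yaw_deg, h_fov_deg, v_fov_deg, point):
    """True if `point` (x, y, z) lies inside the camera's angular field of view."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    # Rotate into camera space (yaw about the vertical axis only, for brevity).
    yaw = math.radians(cam_yaw_deg)
    fwd = dz * math.cos(yaw) + dx * math.sin(yaw)
    side = dx * math.cos(yaw) - dz * math.sin(yaw)
    if fwd <= 0:
        return False                      # behind the camera
    h_angle = math.degrees(math.atan2(abs(side), fwd))
    v_angle = math.degrees(math.atan2(abs(dy), fwd))
    return h_angle <= h_fov_deg / 2 and v_angle <= v_fov_deg / 2

# A-camera at the center of the volume, looking straight down +Z:
cam = (0.0, 1.5, 0.0)
print(inside_frustum(cam, 0, 40, 23, (0.0, 2.0, 6.0)))   # wall point ahead -> True
print(inside_frustum(cam, 0, 40, 23, (6.0, 2.0, 0.0)))   # wall point to the side -> False
```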

Both Fraser and Idoine are quick to note their partnership was unprecedented in their television experience. “In traditional series work, a DP might ‘take over’ a series, or ‘continue’ with alternating episodes,” Fraser explains. “But the workflow for [The Mandalorian] was basically created from scratch, and Baz was working beside me during preproduction to help develop an overall approach. That meant while I was ‘lighting’ or ‘building’ 3D backgrounds for multiple episodes across the series [including episodes that Idoine would shoot], Baz was working with the directors at pre-vis, or in physical preproduction, even for episodes I may ultimately DP. This blurring of the lines was very exciting and even extended to when I was starting to prep Dune in Budapest. With the longer lead times of building the backgrounds, the first time I may have seen the ‘load’ was over a secure video link from Baz in Manhattan Beach. I could suggest changes to Baz, or the episode’s director, if it was different from what we had originally planned.”

Fraser (above middle) says his partnership with Idoine (right, behind camera) was unprecedented in his previous television experience. “The workflow was basically created from scratch, and Baz was working beside me during preproduction to help develop an overall approach,” the Oscar nominee notes. “That meant while I was ‘lighting’ or ‘building’ 3D backgrounds for multiple episodes across the series [including episodes that Idoine would shoot], Baz was working with the directors at pre-vis, or in physical preproduction, even for episodes I may ultimately DP.” / Photo by Francois Duhamel, SMPSP

Idoine says shooting with LED screens was a challenging but immensely positive experience. “Seeing this 3D rear-projection of a dynamic real-time photoreal background through the viewfinder is tremendously empowering,” he declares. “It’s phenomenal because it gives so much power back to the cinematographer on set, as opposed to shooting in a green screen environment where things can get changed drastically in post.” 

A-camera operator Craig Cockerill, SOC, found the thought that went into the technology as impressive as the results seen through the viewfinder. 

“Everything always remained in the proper 3D perspective, relative to the camera,” he describes. “While operating, you see this scene in its entirety and can get lost in it very quickly. The shots have been designed to keep the limitations of the system from revealing themselves, which says a lot about the level of preparation. We did shoot sets outside the capture volume as well. In some of those cases they put up a green screen, so an extension looking outside or down a hallway could be added later. That was fairly conventional and in keeping with other projects I’d done.” 

Fraser and Idoine relied primarily on LEDs to easily vary the color and intensity of light. “Occasionally we used HMIs for a harder quality of light,” Idoine says, noting that “Digital Sputniks, which were Greig’s go-to on the pilot and other projects, were useful, especially as hard sources. During the series we had roughly 800 modules [one beast – 40 modules, four beams – 72 modules, 66 DS6 – 396 modules, 100 DS3 – 300 modules, six DS1 – six modules].”

Interactive lighting also helped link practical on-set effects with the post effort. 

“We created a lot of interactives on set for flames and blaster fire,” Idoine continues. “Jeff and his crew built a trigger incorporated into the guns so a red LED would flash when the weapon fired, cueing visual effects for their beam animation.” The trigger rigging did double duty, also firing interactive light gags around the set.

“Every day on [The Mandalorian] we were making decisions on how to go forward with this process, so it was like history evolving as we worked,” Fraser describes. “Each of the various departments works independently, but at the end of the day, they had to stand together.” / Photo by Francois Duhamel, SMPSP

Bluff, along with Jason Porter, was responsible for dividing up the 4,000 VFX shots among ILM and fourteen other vendors, including Hybride Technologies, Pixomondo, Base FX, El Ranchito, Important Looking Pirates, Creation Consultants, Ghost, MPC and Image Engine Design. “There has been an enormous number of practical elements shot for previous Star Wars films, so we leveraged as much as possible from ILM’s asset library,” Bluff explains. “For example, there’s a scene in Episode Five when Mando sees two Banthas off in the distance. I was adamant we shouldn’t build a fully animated and rigged furry Bantha for just two shots and suggested we pull out the plates from A New Hope’s dailies. I knew I could come up with a shot design to leverage the Banthas from that.”

Bluff says his team took a trip to the ILM archives and reshot some old matte paintings. “When Mando flies toward Tatooine, we are actually seeing the [Ralph McQuarrie] matte painting seen early in the original film,” he recounts. “We reused another painting of Mos Eisley for a fly-in; in that case, I sent a photographer out to the exact spot George shot his original plate, capturing high-res elements so we could up-res as necessary.” 

Jones sees further refinements to the shooting methodology in Season Two, increasing the potential for StageCraft. “The limitations of the gaming engine that are still with us relate to heavy animation and deep rows of characters,” Jones explains. “I don’t see us being able to pull those off now, although we continue looking at potential remedies. So it would be a mistake to think of [StageCraft] as a one-stop solution – but it is getting better all the time. Another consideration is that some directors might take issue with having to lock themselves into baked-in materials months in advance of shooting. That kind of lead time to create the screen content isn’t going to be a good fit for everyone.”

Fraser points out that “The Mandalorian is being viewed on small screens rather than in theaters, so we have to take into account how it will look on an iPhone or iPad.” (Company 3 supervising finishing artist Steven J. Scott handled DI duties on the series.)

Justifiably proud of the end results of this multi-team effort, Fraser seems cognizant of how StageCraft potentially marks an epoch in the annals of virtual production. 

“Technology serves us; we don’t serve it,” the Oscar nominee concludes. “So, it doesn’t make sense, to me, to embrace something just because it is new. But if it can help us do things as well as if we were doing it for real, but more economically, that makes good sense. Every day [on The Mandalorian] we were making decisions on how to go forward with this process, so it was like history evolving as we worked. Each of the various departments works independently, but at the end of the day, they had to stand together.”  

Production Designer Andrew Jones says, “…it would be a mistake to think of [StageCraft] as a one-stop solution – but it is getting better all the time.” / Framegrab Courtesy of Disney+

 

Local 600 Production Team

The Mandalorian – Season 1

Pilot Director of Photography: Greig Fraser, ASC, ACS

Series Director of Photography: Barry “Baz” Idoine

Additional Director of Photography: Ryley Brown

A-Camera Operator: Craig Cockerill, SOC

A-Camera 1st AC: Paul Metcalf

A-Camera 2nd AC: Amanda Levy

B-Camera Operators: Karina Silva, Chris Murphy, Simon Jayes, SOC

B-Camera 1st AC: Niranjan Martin

B-Camera 2nd AC: Jeremy Cannon

DIT: Eduardo Eguia

DUT: Robby Marino

Still Photographers: Francois Duhamel, SMPSP, Melinda Sue Gordon, SMPSP

Unit Publicist: Gregg Brilliant

2nd Unit and Additional Crew

Operator: Greg Daniels

1st ACs: David Parson, Bill Coe

2nd ACs: Peter Parson, Trevor Coe

DIT: Luis Hernandez

 
