Many are calling HPA 2015 the best ever, with even more growth on the horizon. By Debra Kaufman. All Photos Courtesy of Hollywood Post Alliance.
While NAB, CES, and CineGear draw more overall eyeballs each year, it’s the HPA (Hollywood Post Alliance) Tech Retreat that’s become the industry’s most anticipated event. The beauty of the HPA Tech Retreat is not just the pristine desert location (held again this year in Indian Wells, CA, from February 9 to 13, with no slot machines or cigarette smoke for miles) but the human chemistry – highly qualified professionals focused on one topic: the latest and greatest in entertainment industry technology. In other words, it’s geek heaven.
Even though this year’s event drew a record 500 broadcast, film, and TV industry professionals from all over the world, turning a once-intimate gathering into a larger convocation, the quality of both form and content was as high as ever, thanks to the Tech Retreat’s annual programmer, Mark Schubin, who assembles surprising and interesting speakers while adding levity and quizzes on often quite arcane topics.
Of course, all of the main technologies discussed at HPA (redubbed the Hollywood Professional Association) – high dynamic range, ACES, and UHD/4K – have been prominent on industry radars for many months. ICG President Steven Poster took a lead role in Tuesday’s HPA Super Session, which also included cinematographer Daryn Okada, ASC; filmmaker Howard Lukk; and DI artist Dave Cole. Moderated by journalist Carolyn Giardina, the discussion focused on how Hollywood creatives think new technologies have changed filmic storytelling.
All panelists were in agreement on the great potential of high dynamic range (HDR) for cinematographers and post professionals. “I haven’t been this excited since I shot my first 16-millimeter film,” Okada confessed. Cole dubbed HDR “a great storytelling mechanism that lifts the constraints we had previously.”
Anyone who’s seen HDR can attest that it offers an immersive, almost 3D-like quality – without the glasses or headaches. However, a question arose in my mind: is the tail wagging the dog? Television manufacturers have already put out “HDR” TV sets, even though there are no HDR standards. The likely result is another “Wild West” scenario of widely varying display devices and multiple ecosystems.
“Versioning is a huge problem,” Lukk noted, and Cole agreed. “At the moment, it means that when I grade an HDR show, it will be for a particular set or projector,” the colorist said. “What we need to arrive at is a standard so it’s interoperable on all the other formats/devices. Otherwise, we’ll have 50 versions of every show.”
“Give us one standard, at least for theatrical HDR,” Okada pleaded. “We can’t grade for every set. But there needs to be a way of grading now so that we don’t throw information away that we want later [for home displays].”
Cole did caution that HDR should be used judiciously, noting that “the bulk of the image should be in a comfortable range for viewing in a dark room without eye strain.”
“HDR is a wonderful creative concept,” Poster added. “It’s not about being able to see everything from one end of the spectrum and light scale to the other, it’s about being able to choose, to artistically use these values as the artist sees fit.”
Poster also urged attendees to “start thinking about a system” that controls the display, and the color correction within that display, depending on the material. “In the meantime,” he said, “if we can’t do that, let’s get the manufacturer to have one button that can determine how the program is viewed.” The open question, all agreed, is whether an HDR standard will be created in time to have an impact on the industry, especially on TV manufacturers, who have no reason to cooperate with their competition.
In another session, titled “Next Generation Cinema,” cinematographer Bill Bennett, ASC, addressed high frame rate (HFR), while ET Consultants’ Garrett J. Smith and cinematographer/author Dave Stump, ASC, presented new test materials – HFR being a technology that received an overall “thumbs down” in one of the Tech Retreat’s signature events.
At the end of each day, organizer Jerry Pierce asked the audience to vote, by a show of hands, on the topics discussed – and very few hands went up for HFR. The speakers reinforced that negative perception. Cole noted that “HFR actually pushes the viewers away from the story.” Poster agreed, stating that, with HFR, “we lose the ability to suspend disbelief.”
“Shooting in 24 frames per second has been the best possible way to make an audience suspend disbelief,” he added. “The minute you get off the 24-frames-per-second standard, there’s something temporal happening that changes the way you feel.”
Poster also expressed his unhappiness with what he called “the scourge” of motion interpolation in new TV sets. “If we wanted motion interpolation, we would have done it in post,” agreed Cole.
It wasn’t all gloom and doom for HFR; Bennett and Tessive’s Tony Davis demonstrated the Tessive Time Filter, software that resamples HFR footage down to 24 fps. Accepting ACES, ARRIRAW and other inputs, the software takes footage acquired at 120 fps or higher and resamples it in postproduction to 24 fps via “a weighted average which equals an effective shutter.”
“I believe this technique has the possibility of greatly improving what I do,” said Bennett, who shoots a lot of car commercials. While software like Tessive’s could potentially have an impact on HFR’s acceptance, it seems more likely the technology is destined for niche uses.
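For the technically curious, the general idea is straightforward to sketch: treat each group of high-frame-rate frames as samples of a single 24-fps exposure and combine them with a weighted average, the weights playing the role of a synthetic shutter. The Python/NumPy sketch below illustrates that concept only – the tapered window and frame grouping are my assumptions, not Tessive’s actual algorithm.

```python
import numpy as np

def resample_hfr_to_24fps(frames, src_fps=120, weights=None):
    """Conceptual sketch: collapse high-frame-rate footage to 24 fps by
    weighted-averaging each group of source frames (the "effective shutter").

    frames: ndarray (num_frames, height, width, channels), scene-linear.
    src_fps: source frame rate; assumed an integer multiple of 24 here.
    weights: per-frame weights within each group; tapered by default.
    """
    group = src_fps // 24                  # e.g., 5 source frames per output frame
    if weights is None:
        # A smooth, tapered window is one plausible "soft shutter" shape.
        weights = np.hanning(group + 2)[1:-1]
    weights = weights / weights.sum()      # normalize to preserve exposure

    usable = (len(frames) // group) * group
    grouped = frames[:usable].reshape(-1, group, *frames.shape[1:])
    # Weighted temporal average over each group -> one 24-fps output frame
    return np.tensordot(grouped, weights, axes=([1], [0]))

# Example: a 2-second 120-fps clip (tiny frames, for illustration)
clip = np.random.rand(240, 4, 4, 3).astype(np.float32)
print(resample_hfr_to_24fps(clip).shape)   # (48, 4, 4, 3) -> 2 seconds at 24 fps
```

A flat (box) window would mimic a conventional 360-degree shutter; shaping the weights is what makes the “effective shutter” adjustable after the fact.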
Without a doubt, the topic that took center stage in multiple sessions was the Academy Color Encoding System, known to all as ACES. ICG Magazine Executive Editor David Geffner led Poster and Michael Goi, ASC, in a discussion on “Suspending Disbelief: When and How to Use New Tools and Techniques.” Sony Pictures Studios’ SVP of technology Bill Baggelaar, DIT/color Bobby Maruvada and cinematographer Theo van de Sande, ASC, shared “The 101 on Implementing ACES 1.0” in a session moderated by Giardina. And Andy Maltz, managing director, Science and Technology Council, Academy of Motion Picture Arts and Sciences (AMPAS), talked about the implementation of ACES v. 1.0 with a presentation called “From Snowflakes to Standards: Maintaining Creative Intent in Evolutionary Times.”
For a group of engineers who are constantly creating standards to corral new technology, Maltz was preaching to the choir. I’m sure every CTO, post executive and engineer is praying that filmmakers adopt a workflow that will take a lot of the “gotchas” out of post. Will they? Time will tell. Maltz admitted it’s been an uphill battle, noting that “adopted infrastructures have a few things in common. Nothing works perfectly in the early days, the pioneers took some arrows, there were leaders and followers, open standards were developed and adopted, unforeseen opportunities presented themselves.”
Maltz described ACES’ ten-year development path. The early focus was color management and future-proof design, with field trials beginning in 2011 and iterative architecture refinements based on field experience. The first SMPTE standards came out in 2013 and 2014. Since then, numerous productions have used the ACES workflow, including Chasing Mavericks, 101 Dalmatians, Justified, and Alamo Bay. He pointed out that ACES offers major benefits not just to studios and networks but also to smaller productions and facilities. “It enables them to punch above their weight,” Maltz described, “by providing color science in a box.”
The Developers Kit, with finalized core encoding and transform specifications as well as supporting infrastructure components (metadata and implementation guidelines and procedures), was released to Product Partners in December 2014. Based on industry feedback and conversations with end users, new features in ACES 1.0 include working spaces for color correction and VFX; improved on-set support; metadata and containers for LUTs; metadata look modification transforms; HDR support; a versioning system; OpenColorIO configurations for VFX tools; and user experience guidelines.
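To make “color science in a box” a bit more concrete, here is a minimal sketch of an ACES viewing pipeline driven through OpenColorIO’s Python bindings. The config path and color space names are assumptions – they follow the naming used in Academy-supplied OCIO configs of this period and vary between config versions – and the applyRGB call reflects the OCIO v1-era API.

```python
# Hypothetical sketch: applying an ACES viewing transform via OpenColorIO.
# Assumes an ACES OCIO config on disk; color space names differ by config
# version, and this uses the OCIO v1-era Python API.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("/path/to/aces/config.ocio")

# Build a processor from the ACES interchange space (ACES2065-1) to a
# Rec.709 output transform - analogous to an RRT + ODT viewing chain.
processor = config.getProcessor("ACES - ACES2065-1", "Output - Rec.709")

# Transform one scene-linear 18% gray pixel into display code values.
print(processor.applyRGB([0.18, 0.18, 0.18]))
```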
“We need a comprehensive component naming and versioning system, and it turns out this isn’t easy – the devil is in the details,” Maltz admitted. Noting that open architecture is successful when social institutions grow to support it, Maltz enumerated the ways that the industry has organized around a single color and archiving standard.
“Training events and textbooks have been published with ACES chapters,” he said, “and ACES has been used in unexpected applications in the broadcast and medical industries. There are independent educational videos, and not just in English. ACES also enables vendor and end-user contributions, so the Academy isn’t providing everything you need,” added Maltz, who says all ACES-relevant documents are available for free at www.Oscars.org/ACES.
The great synergy that occurs at HPA is due to the co-mingling of ideas between the techno geeks and creatives in the field. For example, cinematographer van de Sande, who used ACES on Deliverance Creek (nominated for a 2014 ASC Award), said he “understood immediately [that ACES] could give us a much more efficient way of showing the image to everybody who is involved on the set. There’s a tremendous advantage in that confidence, which makes it more efficient and economical.”
Baggelaar said ACES provided a “common language” on Chappie, making for “a very flexible environment that allowed the post facility to help create the product that the cinematographer and director were after.” Chappie DI artist Maruvada added that, on set, ACES “gets us to the creative work a lot sooner because we’re not trying to match cameras. For me, the biggest advantage is the preservation of the intent of the DP and director all the way through the process.”
Does ACES save time? Van de Sande reported that the DI for Deliverance Creek, a 90-minute, complicated period film, took only two days because of the decisions made on set. Maruvada agreed. “Yes, I can get all my DI tools on set and put that in metadata that can be transmitted all the way down the pipeline,” he offered. “ACES is a workflow that lets us get to the cinematographer’s look faster.”
When asked if there are any improvements they’d like to see in ACES 2.0, van de Sande mentioned more control of the post process. “You can manipulate tremendous detail,” he said. “In the past, most of the time we did damage control from what happened on set. It’s important that you bring more color correction to set so you can do more work in the DI, rather than spend your time getting your color back.”
Baggelaar said Sony Pictures is pushing to adopt ACES for theatrical releases and TV shows, in part to know what’s archived in the vault. “ACES will give us that structure and the confidence that we can bring the look back to the original intent,” he noted. “We’re able to turn around ACES workflows we’ve done and flip them around with HDR versions. If the system supports the ability to flip a component and not worry about underlying data, that’s a huge advantage.”
Poster also made a strong statement in support of ACES.
“We cinematographers are very passionate about our work,” he announced, “to the point where we tend to push our way into postproduction so we can protect the vision of the director as we made it happen on the set. The tools we have to color on set and that flow through to postproduction are very important. ACES is incredibly important because it gives us a common playing field.”
Last but not least on the HPA tech menu was 4K/Ultra HD (4K/UHD), another topic I’ve personally covered a lot in the last few months. My impression from HPA was that, like it or not (and many do not), 4K/UHD is here to stay, from movie theaters to the Internet. But it’s also in its infancy as a standard.
“The clarity is nice,” said Cole. “But in terms of acquisition, what camera manufacturers call ‘4K’ is not necessarily the true resolution we’re talking about. We’re at this weird time in the industry when numbers are important, not the aesthetics of the image. We have to ask ourselves where we are going, why, and what the effects are.”
Poster talked about using a 4K-camera system (Canon C500) for a 2K result on his last project. “My choice was to go 2K 12-bit to get more color depth,” he explained. “What I’m excited about is the quality of the sensors that we’re getting in these cameras. I was able to shoot all my interior and exterior nights at 3200 ISO and get a great image with very little noise. At 12-bit, the roll-off of the colors was so gentle and exquisite that it reminded me more of film than anything I’ve shot in digital.”
Okada has also shot 4K for a 2K release. “I used it for oversampling,” he described. “I’ll always trade resolution for bit depth or dynamic range. If 2K gave me more color, more bit depth, then I would go that route.” Lukk used “up-resed” 4K for his short film Emma. “When we got into the close-ups, we started to see all the textures,” he recounted. “We backed down on the makeup because you could see the makeup in 4K, and HDR brought it out even more.”
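The oversampling math behind Poster’s and Okada’s choices is easy to demonstrate: averaging each 2×2 block of 4K pixels into a single 2K pixel cuts uncorrelated sensor noise by a factor of √4 = 2, worth roughly one extra bit of effective precision. A toy NumPy sketch, with illustrative noise numbers:

```python
import numpy as np

# Toy model: a flat 18% gray frame with uncorrelated Gaussian sensor noise.
rng = np.random.default_rng(0)
frame_4k = 0.18 + rng.normal(0.0, 0.01, size=(2160, 3840))

# 2x2 block average: one 2K pixel from four 4K pixels.
frame_2k = frame_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(frame_4k.std())   # ~0.010
print(frame_2k.std())   # ~0.005 -> half the noise after downsampling
```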
Poster praised Tiffen’s new filters as working “really well with 4K.”
“I’m now seeing more use of glass diffusion than I have in a decade,” he relayed. “What actor wants to see every pore on her face?” What the resolution gives is natural tonality, Okada added. “If you put a piece of glass in front of the lens, it’s not saying you don’t need the resolution,” he said. “It’s just orchestrating it for the story.”
New test materials are necessary for all of the new technologies bandied about at HPA. Smith and Stump are spearheading the AMPAS Science and Technology Council’s initiative to create test materials to investigate the parameters of HFR, HDR, high brightness and wider shutter angles.
At HPA the organizers screened The Affair, a scripted technical exploration that involved more than 100 volunteers from most branches of the Academy. The material was shot on a single ARRI ALEXA (2K at 24, 30, 48, 60, 72 and 120 fps) and at a variety of shutter angles as well as on two Sony F65 4K cameras, using a 3ality 3D rig with a custom IRND filter on one camera to achieve two different exposures six stops apart.
“We also created a rigorously detailed report, which encapsulates our intent and the test plan going forward,” the pair said. “The materials are currently being used to study wide gamut, HDR, high brightness and even wider shutter angles. We plan to create more test materials, but we want to foster community interaction.”
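Those six stops correspond to an exposure ratio of 2^6, or 64×. The sketch below shows one naive way two such captures can be merged into a single high-dynamic-range frame – a conceptual illustration only, not the Council’s actual pipeline; the clip threshold and rescaling are my assumptions.

```python
import numpy as np

def merge_two_exposures(bright, dark, stops=6.0, clip_level=0.95):
    """Merge two scene-linear exposures shot `stops` apart.
    bright, dark: images normalized to [0, 1] sensor range. Where the
    bright exposure nears clipping, fall back to the dark exposure,
    rescaled onto the same linear scale."""
    ratio = 2.0 ** stops                   # 64x for six stops
    dark_rescaled = dark * ratio           # put both on the bright scale
    return np.where(bright >= clip_level, dark_rescaled, bright)

# Toy example: a linear radiance ramp seen by both cameras
scene = np.linspace(0.0, 32.0, 8)          # true scene radiance
bright_cam = np.clip(scene, 0.0, 1.0)      # clips everything above 1.0
dark_cam = np.clip(scene / 64.0, 0.0, 1.0) # six stops less exposure
print(merge_two_exposures(bright_cam, dark_cam))  # ramp recovered past clip
```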
Once again, the HPA Tech Retreat succeeded in being many things at once: illuminating and tedious, enlightening and head-scratching. It’s always been a great place to catch up with old friends from across the globe; this year it became a better place for visiting with Guild members. One can’t help but have come away from the laid-back desert gathering with more optimism – the seriousness with which the ultra-techy members of our community approach our industry’s creative processes and standards is always quite impressive.