5 THINGS: on HDR
1. What is HDR?
I love me some acronyms, and today is no different. While yes, it does stand for High Dynamic Range, it takes on different meanings depending on where you lie in the production process. During acquisition, it’s more appropriately called HDRI, or High Dynamic Range Imaging. If it’s used to talk about monitoring, then we call it HDR… and that’s what I’ll be using through the rest of this episode.
Regardless of acronym semantics, at both points you’re dealing with a wider range of luminance and color tone than what you’ve traditionally dealt with – and the challenge is handling HDR with as much color and detail as possible as it makes its way through production and post.
The immediate rub is that most cameras can’t capture – and most monitors can’t display – this wide “spectrum” of visual depth.
Well, why is that, you ask? See, human eyes can perceive about 20 stops of dynamic range, and can see about 9 of those stops at any one time. Think of looking into the shadows of your bedroom – as your eyes adjust, you begin to see more and more detail within those shadows.
Most consumer cameras can’t capture this wide array of 20 stops and usually expose right in the middle. This often means you’ll lose a ton of visual detail in your shadows and bright spots during both acquisition and exhibition.
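To put those stop counts in perspective: each stop is a doubling of light, so a stop count converts straight into a contrast ratio between the brightest and darkest detail you can resolve. A quick back-of-the-envelope sketch in Python:

```python
# A "stop" is a doubling of light, so dynamic range in stops maps to
# a contrast ratio of 2**stops between brightest and darkest detail.

def stops_to_contrast_ratio(stops: float) -> float:
    """Contrast ratio (brightest:darkest) for a given number of stops."""
    return 2.0 ** stops

# ~20 stops for the human eye over time, ~9 at any one instant
print(f"{stops_to_contrast_ratio(20):,.0f}:1")   # 1,048,576:1
print(f"{stops_to_contrast_ratio(9):,.0f}:1")    # 512:1
```

Which is why a camera exposing “right in the middle” of a million-to-one scene has to throw so much away.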
As with most things digital, there’s a gap between what we want and what we get, and so you have to compromise on your shot.
So, in concept, HDR is fantastic. More visual data for a more immersive experience. More better pixels – not just more of them.
We’ve been robbed for years, and I think it’s high time we took back what we’ve been denied. Now, how do we go about that?
2. How do I shoot HDR?
The short answer is that you’re most likely gonna need a new camera. I know, you’re still making payments on that Betacam, but older sensors just can’t capture the rich goodness HDR allows for. Some popular, newer cameras – the C300, Alexa, Amira, Sony F3/5/7/55/65, RED Epic and Dragon, the Blackmagic Cinema Camera, etc. – are all good enough.
Now, normally this is accomplished by shooting in a format that supports the amount of dynamic range HDR plays with – such as Log-C – recorded in a codec that can retain that data. And while, yes, it’s technically possible to shoot HDR in Rec. 709, you’re better off shooting in a format with greater dynamic range.
This amount of “greater dynamic range”, however, is still in flux. Standards, people, standards. Many cameras tout “high dynamic range”, but the label “high” is somewhat arbitrary, and very much based on other factors, not all of which the camera controls.
Sony, RED, and ARRI have really been leading the higher dynamic range charge, and some HDR features are now trickling down into prosumer cameras as well. However, due to the lack of standardization, consistent image manipulation can be hit or miss. Not only does each camera’s dynamic range capture ability vary, but each shoots with a different look to retain the data that was acquired – which means accurately carrying what was captured in the field into post can vary from camera to camera.
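As a side note on why those log looks hold onto the range: log encoding spends code values roughly evenly per stop instead of lavishing them on the highlights. Here’s a toy curve to illustrate the idea – emphatically NOT any vendor’s actual Log-C or S-Log3 math, just the general shape:

```python
import math

def toy_log_encode(linear: float, stops: float = 14.0) -> float:
    """Toy log curve: map linear scene light (0..1, where 1.0 is the clip
    point) to a 0..1 code value, spending equal code range on each stop.
    NOT a real camera curve (Log-C, S-Log3, etc.) - just the concept."""
    if linear <= 0.0:
        return 0.0
    return max(0.0, (math.log2(linear) + stops) / stops)

# Mid-grey (18%) lands well up the curve, leaving plenty of headroom
print(round(toy_log_encode(0.18), 3))  # 0.823
print(round(toy_log_encode(1.0), 3))   # 1.0 - the clip point
```

The takeaway: every camera maker picks different constants for a curve like this, which is exactly why footage from different cameras behaves differently in post.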
3. How do I edit HDR?
Now you’ve got it, how do you work with it?
As we are still in HDR’s commercialization infancy, HDR support is not quite as robust as we’d like. However, Adobe did introduce its first iteration of HDR support inside Premiere’s Lumetri panel last September, which includes HDR-capable scopes and support for several of the HDR data retention standards.
Avid Media Composer, as of v8.5, supports HDR color spaces. When a user creates a new project, Dolby Vision, Sony S-Log3, BBC/NHK, and ACES profiles are available, and color transformations are handled automatically once an HDR color space is selected.
It’s still good practice to follow a more traditional offline/online model, bypassing the need for HDR during editorial. This lets you really exploit your HDR content during the color phase, using tools such as Resolve 12.2, which does have HDR support.
Many folks are shooting HDR at UHD or larger frame sizes, making the processing requirements for your editing system even greater – so an offline/online workflow helps on multiple levels.
HDR support is still the wild, wild west, and NLEs are no exception. For now, we simply have to wait for more updates and a bit more standardization.
Other techniques to achieve a faux HDR look involve locking off a video camera, recording the same shot at different exposure levels, and layering those exposures in post. There are also plugins and filters that approximate the HDR look. When looking into plugins, keep in mind that you need to SHOOT in an HDR format to get true HDR – some plugins and filters just fake it. But of course, how can you TELL it’s being faked without a monitor to view it on? Good question.
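That lock-off-and-layer trick is basically exposure fusion. Here’s a minimal grayscale sketch, assuming two already-aligned frames as flat lists of 0–1 pixel values; the “well-exposedness” weighting is a common textbook approach (Mertens-style fusion), not any particular plugin’s algorithm:

```python
import math

def well_exposedness(v: float) -> float:
    """Weight pixels near mid-grey highest; crushed or clipped pixels lowest."""
    return math.exp(-((v - 0.5) ** 2) / (2 * 0.2 ** 2))

def fuse(frames: list[list[float]]) -> list[float]:
    """Per-pixel weighted average of aligned exposures (grayscale, 0..1)."""
    fused = []
    for pixels in zip(*frames):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights) or 1.0
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

under = [0.02, 0.10, 0.45]   # underexposed frame keeps highlight detail
over  = [0.30, 0.60, 0.98]   # overexposed frame keeps shadow detail
print([round(v, 2) for v in fuse([under, over])])
```

Each fused pixel leans toward whichever exposure rendered it closest to mid-grey – which is the whole point of layering brackets in post.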
4. How do I watch HDR?
What good is HDR if we can’t see it?
Just like 4K, 3D, HD, and even color video, what good is this new format if you can’t consume it?
Sadly, unless you buy a new monitor, you’re not going to see the massive benefit of your HDR. While, yes, HDR gives you the creative latitude in post to craft prettier pictures (just like 4K and re-centering do for an HD release), the full extent of that HDR work won’t be noticeable without an HDR-capable monitor.
As I mentioned earlier, luminance is one of the largest differentiators between HDR and non-HDR content. With monitors, we measure luminance in “nits”; a single nit is about equal in brightness to a candle. Traditional television monitors can display up to several hundred nits, and it doesn’t help that the traditional HD color space – Rec. 709 – can only handle around 100 or so nits. To really consume HDR content, you’ll need several times that.
Remember the standardization issues I mentioned earlier? Yeah, they continue into exhibition as well. Dolby Vision, Philips, Technicolor, and the BBC – among others – are developing their own presentation technologies, causing the nit target for HDR consumption to vary: from 400–800 nits, up to the reported theoretical max of 10,000 nits that Dolby monitors can handle. Which, quite frankly, will blind you. We did see several TVs this year at CES capable of around 1,000 nits, so the future is NOW.
And while the current standards purport to take nit variations into consideration, it still means different viewing experiences on different technologies.
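For the curious, that 10,000-nit ceiling isn’t arbitrary – it’s baked into the PQ transfer function (SMPTE ST 2084) that Dolby Vision builds on. A sketch of the PQ EOTF, which maps a 0–1 signal value to nits:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> luminance in nits.
    Constants are the published ST 2084 values."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(1.0)))   # 10000 nits: the PQ ceiling
print(round(pq_eotf(0.5)))   # ~92 nits: half signal is nowhere near half brightness
```

Notice how steep the curve is: most of the signal range is spent down where our eyes are most sensitive, with the blinding stuff packed into the top.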
Lastly, just like 4K, no traditional broadcasters in the US are pumping out HDR on cable television. However, some VOD outlets are currently in the first stages of HDR distribution. This year will be huge for HDR, and I’m sure we’ll see many changes and more titles added to VOD outlets.
5. What is the future of HDR?
Fortunately, we’re already on our way there. As mentioned, cameras are beginning to be widely available with an increased amount of dynamic range and NLEs are beginning to handle this range. Most new televisions coming out this year support some form of HDR, and the nit count for viewing monitors is only growing, albeit incrementally.
However, just as acquiring UHD or larger still pays off for folks who are ultimately watching in SD and HD, HDR adds another tool to your creative toolbox to enhance your storytelling ability.
The good thing for you is that dealing with HDR in post – from an editing, grading, and workflow standpoint – will certainly require someone with your skills, so you best get learning.
I want to thank the otherworldly Juan Salvo at The Colour Space in New York for the metric ton of color knowledge he shared with me for this week’s episode. Mind that extra “u” in ColourSpace, kids… it stands for… Juan.
Have more HDR concerns other than just these 5 questions? Ask me in the Comments section or stalk me online. Also, please subscribe and share this tech goodness with the rest of your techy friends….especially at parties.
Until the next episode: learn more, do more – thanks for watching.