
“The Long Night” – And How To Translate the Filmmaker’s Creative Intent to Home Displays

by James Mathers
Cinematographer and Founder of the Digital Cinema Society
(Excerpted from the May 2019 Digital Cinema Society eNewsletter)

Like a lot of people, I’m a big fan of HBO, and in particular Game of Thrones.  This program ranks among the highest quality cinematic content being produced today, and I’m not just referring to home entertainment; in many ways, it rivals theatrical features.  As the slogan says, “It’s not TV…It’s HBO.”  This crown jewel in the HBO lineup is not only critically acclaimed, it draws a large audience and motivates consumers to cough up the monthly premium.

Some of the industry’s top creative and technical talent work diligently to deliver a quality entertainment experience.  Yet, also like a lot of people, I took issue with the images in Season 8, Episode 3, “The Long Night.”  Granted, it was a battle with The Night King, and as GoT characters often say, “the night is dark and full of terrors,” but many viewers saw nothing but dark.  The following is not a screen grab, but it might as well have been.  OK, I’m exaggerating a little, but I’m just saying…it was dark!

[Image: a TV displaying a completely black screen]

I’m not bragging, but I have a pretty sweet home theater set-up, with a high-end 2K projector on a 10’ screen, fed by cable.  If it was sometimes hard for me to tell what was going on in this epic battle, it must have been all the more confusing for those watching on suboptimal displays.

Who am I to criticize these creatives?  I’m not; but somewhere along the line there was a breakdown that kept many fans from enjoying the image quality that the filmmakers intended.  Let’s look at some of the potential pitfalls and see how they might be avoided in the future.  The colorist, Joe Finley, is among the best in the industry, working in a first-class facility at SIM Digital.  He is credited with coloring virtually all of Game of Thrones (61 episodes), as well as many other high-end features and television series.  The multi-award-winning Director of Photography, Fabian Wagner, ASC, BSC, also has many major credits, including the feature Justice League and the series Sherlock.  Guys like these don’t make mistakes, they make choices.  They obviously decided to go dark, and like it or not, that is their creative prerogative.  Yet something was lost in the translation to the home viewer.

I’m sure the images in the DI suite looked amazing, but that was an uncompressed signal on a perfectly calibrated display.  Their choice was obviously to go for optimal image quality, and I can’t blame them for aiming high.  Instead, they could have chosen to dumb down the image by brightening things up a bit and adding false contrast.  It might have played better in Peoria, or even on my home projection in Studio City, but there is really no way to deliver a master that plays well on every display in vastly varying viewing environments.  They could have made compromises to get it to be more acceptable to a broader audience, but apparently they chose not to try.  They went for the highest quality master, and I can’t blame them for that; I only wish I could see the images as they were seeing them.

So, let’s not blame the creatives, but look at the real demons in this scenario.  One is compression.  After sending the images through encoders and variable bandwidth networks to the home, any compression artifacts get magnified, especially in the dark regions.  The choices content delivery networks, such as cable operators, make in terms of bandwidth can have a huge impact on the final result.  I know my cable provider has at times experienced overload outages with very popular programs like the Super Bowl.  With a show like Game of Thrones, whose finale had the highest viewership in the history of HBO, I would not be surprised if they dialed up the compression and squeezed the bit rate for more reliability at the cost of reduced picture quality.  I’m guessing their biggest fear was an outage during one of the most watched programs of the year.

Also factor in that complex and faster-moving images, like in a battle scene, require higher bit rates to maintain quality, compared to less complex and slow-moving scenes, like talking heads. The frame rate, resolution, and dynamic range of the content also impact encoding bit rates and help determine the overall quality of the images.
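Why do dark regions suffer first?  When an encoder is starved for bits, it quantizes code values more coarsely, and in the shadows, where the steps between values are already perceptually large, a smooth gradient collapses into visible bands.  A minimal sketch of that idea (illustrative only; real codecs quantize transform coefficients, not pixel values directly):

```python
# Illustrative sketch, NOT HBO's actual pipeline: coarse quantization of a
# dark gradient shows why shadow detail is the first casualty of a
# squeezed bit rate.

def quantize(levels, step):
    """Round each code value to the nearest multiple of `step`,
    mimicking the coarser quantization a bit-starved encoder applies."""
    return [round(v / step) * step for v in levels]

# A smooth dark ramp: 8-bit video code values 16..31 (16 is video black)
dark_ramp = list(range(16, 32))

light = quantize(dark_ramp, 2)   # generous bit rate: most steps survive
heavy = quantize(dark_ramp, 8)   # starved bit rate: the ramp collapses

print(sorted(set(light)))  # -> [16, 18, 20, 22, 24, 26, 28, 30, 32]
print(sorted(set(heavy)))  # -> [16, 24, 32]: three flat bands where a
                           #    smooth gradient used to be
```

Sixteen distinct shadow values surviving as only three is exactly the blockiness and banding viewers complained about in the battle scenes.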

Then there’s the myriad of displays viewers might be watching the content on: anything from an iPhone to a 4K OLED with HDR.  Many consumer displays do a lot of their own processing, sometimes boosting the peak luminance and raising black levels, or adjusting for a super bright part of the image (for example, the flaming torches in those battle scenes), which can drive the rest of the picture down into complete blackness.  Colors, highlights, and shadows bleed together, so detail is even harder to see.

Part of the problem is that there is no good standard for display setup; it’s whatever looks good on the retail showroom floor, which is usually too bright with oversaturated colors.  Unfortunately, that is what sells TVs, and a large number of consumers leave their new sets in the factory default.  When you add in the various motion interpolation schemes created to reduce motion blur for content such as sports, you’ve got a real mess.  This has the very unfortunate side effect of making even the highest quality cinematic 24 fps narrative content look like a soap opera shot on a cheap video camera.  With these many factors varying from screen to screen, how can filmmakers ever hope to deliver consistent quality?

As technology evolves and features such as HDR, High Resolution, High Frame Rate, and Wide Color Gamut are added to displays, the margin for error will only increase.  Although consumers are not yet experiencing much HDR or 4K in the home, they soon will be, so filmmakers and distributors will have even more versions to consider.  Do we only serve the boutique market with the highest quality displays, or do we make compromises to please the largest possible audience with the best overall images?  A better option would be to make these choices downstream, so that each display device could choose the version and parameters that best suit its given hardware, bandwidth, and viewing environment.
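The “decide downstream” idea is essentially what adaptive streaming ladders already do for resolution and bit rate.  A minimal sketch, with an invented rendition ladder (the labels and numbers are illustrative, not any real service’s):

```python
# Hedged sketch of downstream selection: each device picks the best version
# its bandwidth and hardware can handle.  The ladder below is hypothetical.

RENDITIONS = [
    # (label, required_mbps, needs_hdr_display) -- ordered best-first
    ("2160p-hdr", 25.0, True),
    ("2160p-sdr", 18.0, False),
    ("1080p-sdr", 8.0, False),
    ("720p-sdr", 4.0, False),
]

def pick_rendition(available_mbps, hdr_capable):
    """Return the highest-quality rendition this device can sustain."""
    for label, mbps, needs_hdr in RENDITIONS:
        if mbps <= available_mbps and (hdr_capable or not needs_hdr):
            return label
    return RENDITIONS[-1][0]  # fall back to the lowest rung

print(pick_rendition(30.0, hdr_capable=True))   # -> 2160p-hdr
print(pick_rendition(10.0, hdr_capable=True))   # -> 1080p-sdr
print(pick_rendition(30.0, hdr_capable=False))  # -> 2160p-sdr
```

The same principle, extended with richer metadata about the display and viewing environment, is what the efforts described below are reaching for.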

The good news is that there is work afoot to harness the power of metadata delivered along with the entertainment content to offer these abilities and display the creative intent of the filmmakers in the best way possible given the individual consumer’s capabilities.  Dolby, for one, is developing a proprietary system that would require a decoder at every display, either stand-alone or built into new TVs.  They would charge the post houses a licensing fee to encode material for their system, and then rent or sell the decoders to consumers, making a profit on both sides.  This will naturally add to the cost of new TVs, but it might be worth it if they can bring everyone on board and make it all work.

Another effort, in development by companies including Panasonic, Samsung, and Sony, is working through a consortium on the more open HDR10+ set of protocols.  They are trying to get all the manufacturers on board to incorporate a broad range of standards to deliver consistently high quality HDR to the home.
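What both camps add over baseline HDR10 is dynamic metadata: instead of one peak-luminance value for the whole title, the grade can carry per-scene values, so a display tone-maps a dark scene on its own terms rather than against the brightest shot in the show.  A toy sketch of the difference, with invented numbers and a deliberately crude linear tone map:

```python
# Illustrative sketch of dynamic vs. static HDR metadata.  The scene values,
# display peak, and linear mapping below are all invented for illustration;
# real systems (HDR10+, Dolby Vision) carry richer per-scene curves.

DISPLAY_PEAK = 500.0  # nits this hypothetical TV can reach

scenes = [
    {"name": "night battle", "scene_peak": 100.0,  "midtone": 5.0},
    {"name": "dragon fire",  "scene_peak": 4000.0, "midtone": 200.0},
]

# Static metadata: one peak value describes the entire title
TITLE_PEAK = max(s["scene_peak"] for s in scenes)

def map_midtone(midtone, mastered_peak):
    """Crude linear tone map: scale by whatever peak the display is told."""
    return midtone * min(1.0, DISPLAY_PEAK / mastered_peak)

for s in scenes:
    static = map_midtone(s["midtone"], TITLE_PEAK)       # whole-title peak
    dynamic = map_midtone(s["midtone"], s["scene_peak"])  # per-scene peak
    print(f'{s["name"]}: static={static:.3f} nits, dynamic={dynamic:.3f} nits')
```

With static metadata, the night battle’s 5-nit midtones get scaled down to 0.625 nits because the display is protecting headroom for dragon fire that isn’t even in the scene; with per-scene metadata they pass through at 5 nits.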

According to Ron Martin, Director of the Panasonic Hollywood Lab, they are even working on what they call “Filmmaker Mode,” which will try to emulate as closely as possible what the filmmaker is seeing in the DI suite.  The display setup will be completely controlled by metadata, without the ability to be overridden by the consumer.  Of course, viewers can always turn this mode off if they don’t favor the creative intent of the filmmakers, but it may help avoid the whacked-out settings that sometimes result from the less technically minded among us trying to noodle with the picture on their TV.

There is also a new broadcast standard in development known as NextGenTV (NGTV), or ATSC 3.0.  It’s a hybrid television delivery system that integrates Internet and over-the-air broadcast to deliver a more robust transmission.

Once all these technologies get sorted out, it will be a boon to both creatives and viewers.  In the meantime, if you need a new TV, you might consider a set that can handle both Dolby Vision and HDR10, as one or both of these systems will likely be helping to control display setups in the home.  If you’re a content delivery network, please give us the bandwidth to enjoy our content as intended.  Build out fiber and 5G networks, and develop ATSC 3.0 to feed our insatiable media appetites.  And if you’re a filmmaker, please keep in mind the broad range of hardware and viewing environments on which your work will be seen.  The night can be long and full of terrors, but it doesn’t have to be too dark to follow the story.


Everyone has an opinion on Game of Thrones; share yours, and any feelings you have about using metadata to translate filmmakers’ creative intent to the home display, on the DCS Facebook page:  https://www.facebook.com/DigitalCinemaSociety