Tales from the Tech Retreat — “ACES” and “Camera-To-Cloud”

Excerpted from the Digital Cinema Society eNewsletter of February 2015:

As we enter the world of 4K and beyond, Digital Cinema cameras continue to produce an ever-increasing amount of data.  How to safely and efficiently store and process this material while preserving the filmmaker's creative intent is a growing area of concern.  Informed by my recent attendance at the annual HPA Tech Retreat, I decided to tackle the subject.  As HPA President Leon Silverman is often quoted as saying, "Workflows are like snowflakes. There are no two alike, and once they hit the ground, they simply disappear."

However, filmmaking is a collaborative endeavor, and with an infinite number of disparate methodologies comes a threat to our ability to work together effectively.  Efforts are underway on a number of fronts to deal with these issues in the areas of storage, archive, and look management, and a lot of smart people (many of them DCS members) are on the case.

“ACES”

First let's look at the Academy Color Encoding System, or "ACES" for short.  For several years the Academy of Motion Picture Arts and Sciences has sponsored a group of color scientists and post-production technologists to come up with a way to manage color throughout the life cycle of a motion picture or television project.  The Academy recognized the need for motion picture production and archiving standards. Just as in decades past with frame rates, aspect ratios, sound formats, and even the size and spacing of film sprocket holes, common standards were required to increase industry-wide adoption and move the technology forward.

From image capture through editing, VFX, mastering, public presentation, archiving, and future remastering, ACES is designed to ensure a consistent color experience that preserves the filmmaker's creative vision.  It is not a certain "look" to apply with the idea of making all images appear the same (although, as FilmLight's Peter Postma demonstrated in his Tech Retreat ACES demo, it can make the work of matching different cameras far easier if that is the intent).  Rather, it provides an image format and device-independent processing parameters for each piece of the system.  The idea is that this will help ensure those unique visions are properly reproduced throughout the pipeline.

Brandon Bussinger, founder of the tech consulting firm Working Order, uses the analogy of a computer operating system: it provides the core functionality necessary for multiple pieces of software to work on common data and hardware.  And as Modern VideoFilm's Dave Cole puts it, "ACES brings a unifying workflow to an industry that has seen many secret sauces from the various facilities providing post services.  It also provides an archive format that can be restored at any time in the future, with confidence that the original intent of the creators can be reproduced."

In addition to the creative benefits, ACES addresses and solves a number of significant production, post-production, and archiving problems that have arisen with the increasing variety of digital cameras and formats in use.  It also helps deal with the surge in the number of productions that rely on worldwide collaboration using shared digital image files.  For example, a camera manufacturer will supply its own "IDT" (Input Device Transform) to allow other systems to reproduce that camera's images as intended.  It's all about getting to a common Square One.
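
To make the IDT idea concrete, here is a minimal Python sketch.  The 3x3 matrix below is purely illustrative, not any manufacturer's published IDT; real IDTs are supplied by each camera maker and typically also include a step to linearize the camera's log encoding.

```python
import numpy as np

# ILLUSTRATIVE camera-to-ACES matrix -- not a real, published IDT.
# Each row is normalized to sum to 1.0 so neutral colors stay neutral.
CAMERA_TO_ACES = np.array([
    [ 0.6954, 0.1407, 0.1639],
    [ 0.0447, 0.8597, 0.0956],
    [-0.0055, 0.0040, 1.0015],
])

def apply_idt(camera_rgb):
    """Convert linear camera RGB pixels (N x 3) into ACES primaries."""
    return np.asarray(camera_rgb) @ CAMERA_TO_ACES.T

# An equal-energy "white" pixel passes through (nearly) unchanged,
# which is why different cameras land on the same Square One.
white = apply_idt([[1.0, 1.0, 1.0]])
```

Once every camera's footage has been through its own IDT, all downstream tools can operate on the same device-independent image data.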

ACES is free, open, and device-independent.  After 10 years of research, testing, and field trials, the Academy has recently released v1.0 into the wild, and a large number of stakeholders have implemented the standard in their systems, from digital cinema camera manufacturers to color correction systems providers.  DCS is proud to say that many of our supporting companies are among the developers and early adopters.  In alphabetical order, these include: Adobe • ARRI • Blackmagic Design • Canon • Codex • Dolby • FotoKem • Fujifilm • Modern VideoFilm • MTI Film • Panasonic • Sony.  (For more information, visit the AMPAS website: http://www.oscars.org/science-technology/sci-tech-projects/aces.)

“Camera-to-Cloud”

With ever-increasing web connection speeds, and with decentralized production trends as producers are drawn to distant locations to take advantage of incentives, a cloud-based post scenario is starting to attract attention.  However, there are concerns that could stand in the way of progress.  Another effort to bring order in the face of competing technologies deals with this area of collaboration via the cloud.  The Entertainment Technology Center at the University of Southern California (ETC) has been working on "Project Cloud," or as some have referred to it, "Camera-To-Cloud."

As with ACES, it brings together a core group of technology leaders in an effort to develop guidelines and accelerate innovation and adoption.  In this case, the mission is to come up with next-generation cloud-based content creation, production, and distribution tools and processes.  Support comes from much the same group of major studios that sponsored the work of Digital Cinema Initiatives, or "DCI," which greatly aided the implementation of Digital Cinema projection and distribution.  This time, they have joined with post-production service providers and cloud services companies in an attempt to guide a new process.  The project looks at the entire life cycle of media, from pre-production collaboration and production, to marketing and distribution, all the way through archiving.

The industry's recent technology shift from traditional film to digital media brings with it an urgency to identify common workflow solutions.  Coming up with these methodologies will hopefully avoid format wars and proprietary solutions.  ETC Executive Director Ken Williams warns, "When there is no effort to define guidelines for the underlying framework of an emerging technology, multiple formats evolve and battle for supremacy – like Blu-ray vs. HD-DVD, Beta vs. VHS – bringing uncertainty, inefficiency, and confusion to the marketplace, slowing product adoption and business growth for all players."  The idea of Project Cloud, then, is to avoid these potential pitfalls.

The ETC's Erik Weaver laid out the program as he moderated a panel at the Tech Retreat called "Cloud Demystified: Understanding the Coming Transformation from File- to Network-Based Workflow."  While the official definition of cloud computing comes from the National Institute of Standards and Technology (NIST), Weaver jokingly shared a simpler description that has been circulating: the cloud is just "someone else's computer."  In other words, it offers the ability to perform all manner of computation on systems that don't need to be physically present at your particular location, which opens up many possibilities.

DCS Supporter DigitalFilm Tree is a company helping to lead the charge toward the cloud, as it has with previous cutting-edge methodologies.  Company CEO Ramy Katrib, a Founding Member of DCS, was interviewed some 14 years ago for the original documentary from which the Digital Cinema Society was born.  At that time, his company was pioneering non-linear desktop editing.  More recently he has been joined by company CTO Guillaume Aubuchon.  Together they have sought to bring practical use of cloud technologies into active television and film productions.

Aubuchon, a panelist on the Tech Retreat session, described the company's approach to the cloud.  "What we've been trying to do is apply IP technology to serve production and post. We discovered OpenStack, and used it for storage and software development, then matured it into an analytics platform."  OpenStack is a free and open-source cloud computing software platform deployed primarily as an infrastructure-as-a-service solution for controlling processing, storage, and networking resources throughout a data center.  It began as a joint project of Rackspace and NASA, and is currently managed by the OpenStack Foundation, a non-profit group established in September 2012.

So what magic can Camera-to-Cloud offer to improve efficiencies and secure our data?  The big promise, according to Aubuchon, is accessibility.  "You want instant access to content everywhere it's demanded."  However, he explained, such access needs to be controlled; just as easily as it can be granted, it can also be instantly and automatically denied.  In addition, every piece of the file that is accessed is logged, with comprehensive version tracking.  And of course, he added, "all this access is encrypted, at rest and in transport."
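
A toy Python sketch of that access pattern, with hypothetical class and method names, might look like this: grants and revocations take effect immediately, and every read attempt, whether allowed or denied, lands in an audit log.

```python
from datetime import datetime, timezone

class CloudAsset:
    """Hypothetical sketch: per-asset access control with a full audit trail."""

    def __init__(self, name):
        self.name = name
        self.allowed = set()    # users currently granted access
        self.audit_log = []     # (timestamp, user, outcome) entries

    def grant(self, user):
        self.allowed.add(user)

    def revoke(self, user):
        # Revocation is instant: the very next read attempt is denied.
        self.allowed.discard(user)

    def read(self, user):
        ok = user in self.allowed
        self.audit_log.append(
            (datetime.now(timezone.utc), user, "read" if ok else "denied"))
        if not ok:
            raise PermissionError(f"{user} is not authorized for {self.name}")
        return f"<encrypted payload of {self.name}>"
```

Encryption at rest and in transit would sit below this layer; the sketch only shows the grant/revoke/log mechanics.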

So how might it work?  Several other exhibitors at the Tech Retreat displayed technology that could interface with such a workflow.

Several of JVC's new 4K cameras have a built-in HD streaming engine with Wi-Fi and 4G LTE connectivity, allowing direct live HD transmission to various hardware decoders.

Meanwhile, products from Teradek allow full HD to be uploaded to the web from any source, as long as it has an HDMI or HD-SDI output.  An on-set scenario might have a proxy coming right off the camera, or a Data Wrangler uploading to the web as they back up the media.  Larger and less compressed master footage files, which take longer to upload, can then follow; but in the meantime, the content is accessible for post-production.  Another Teradek product, known as "Core," allows editors to share the program output of an editing session over a LAN or WAN in full HD, giving members of the creative team the ability to view changes in real time anywhere in the world on their smartphones, tablets, and computers.
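
That proxy-first ordering can be sketched as a simple priority scheme in Python; the filenames and sizes below are hypothetical.

```python
import heapq

def upload_order(files):
    """files: iterable of (name, size_mb, kind), kind in {"proxy", "master"}.
    Proxies upload first (smallest first) so post can start immediately;
    the large camera masters follow in the background."""
    heap = [((0 if kind == "proxy" else 1, size), name)
            for name, size, kind in files]
    heapq.heapify(heap)
    return [name for _, name in (heapq.heappop(heap) for _ in range(len(heap)))]

# A hypothetical day's media: small proxies and large camera originals.
queue = upload_order([
    ("A001_master.mov", 42000, "master"),
    ("A001_proxy.mp4",    350, "proxy"),
    ("A002_proxy.mp4",    280, "proxy"),
    ("A002_master.mov", 39000, "master"),
])
# queue -> both proxies first, then the two masters
```

A real uploader would of course stream files concurrently and resume on failure; the point is only that prioritizing proxies gets editorial working long before the masters finish transferring.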

Such use of the cloud allows for simultaneous, location-independent collaborative workflows, where, for example, an Editor, Colorist, or VFX Artist might all be working with the data at the same time while spread out at various locations anywhere in the world.  Although the workstations may be spread out, the data is centralized, which allows a facility to scale and direct computational resources as needed for maximum efficiency.  Aubuchon explained that "All the CPUs, GPUs and storage can be tasked at a moment's notice with dailies processing or a VFX shot…And then, at the end of the process, to add value, we run analysis of productions."

As a test case, DigitalFilm Tree provided post services on the short film "Luna," and with that unqualified success behind them, they are currently using these techniques on a major feature for Disney.  As a follow-up, the ETC's Erik Weaver announced that a white paper on Project Cloud would be released in the next couple of months.

As an organization, we feel it is incumbent upon the Digital Cinema Society to vigorously support efforts such as ACES and Project Cloud.  Progress will be quicker and less painful for everyone if these cooperative efforts flourish and help provide a path forward.  We can help by reporting on their progress, trying to educate the community, and helping to spread the good word.  Count this essay as one such contribution.