As a DIT, DP, or Data Wrangler, have you ever noticed that camera metadata doesn’t play a huge role on set beyond simply being collected? So why is it important? The answer extends beyond the set and lies deep within post-production editing, VFX, and color correction workflows.
To get a better understanding of the importance, function, and benefits of metadata, we sat down with digital cinematography expert David Stump, author of the book Digital Cinematography.
In this interview, David shares his experiences with the transition from film to digital, how digital technology has impacted filmmaking, and the important role that metadata plays in post-production workflows.
But first, we asked David to give us some background on himself and his professional career in post-production roles!
David: I have filled a lot of different roles in film productions. I’m mainly a director of photography, but I’ve also held the roles of VFX supervisor, VFX cinematographer, and stereographer.
I’ve also held many roles within one film. Several times I’ve co-produced movies where I also shot the VFX, supervised them, and ran the project all the way through post-production. So, on a number of occasions, I have done the conform and the color correction, plugged in the VFX, and output the DCP. I know every step of the way in making a digital movie. Although I’ve held a variety of post-production roles, all of them have been more de facto than titled. I don’t usually take a credit in that realm.
But, I think the distinction that I should make is that I’m just a filmmaker.
Have you been on set recently? If so, what does your work look like during the shoot?
If I’ve prepped a job well, my work on set looks like I’m a lazy bugger. Because if I’ve prepped it well, it’s calm and everyone already has the answers they needed from the get-go. Sometimes things break and sometimes it gets exciting, but if you do enough prep, then your shoot has time either for troubleshooting when bad things happen or for when good things happen and you find a happy accident.
What type of productions are you involved in these days and in what capacity? How did that change over time?
Throughout the pandemic, I was involved in shooting a lot of commercials, as well as working on an Italian feature titled Comandante. But most of what I did last year was commercial work. I’m also entertaining a couple of offers for SAG waiver shows[1] as a cinematographer, and I’m hoping to get a couple of those because, at this point in my career, an independent feature or a good technical commercial is way more fun than a quarter-billion-dollar studio tentpole epic.
For example, on my last commercial, I worked with the director and production company on storyboarding. We did the animatic, as well as very technical testing. I also did a lot of engineering work because there were some serious physical effects. We dumped 1/4 million gallons of water onto a house to simulate a flood, and I engineered all of that.
I shot the commercial, supervised the VFX, and supervised the color correction. The series of commercials is airing right now. So, the pandemic put me back in the commercial business for a while.
Do you typically wear multiple hats on a production, or are you usually siloed to one role?
I think I’m too outgoing to be siloed. I take it as recognition of a good producer when I’m involved in the show and they become acclimated to asking my opinion and involving me in all of the weird aspects of a show.
I’ll give you an example: I did VFX supervision on Reese Witherspoon’s Little Fires Everywhere. So I was VFX supervising, but over the course of the show I also became the de facto special effects supervisor, since no one else had experience dealing with the problems of fire, fire comps, and fire on set. So, I ended up doing a lot of special effects supervision that I didn’t count on. But, to their credit, they recognized that I had done it before and knew the ins and outs of working with fire, so they pressed me into service. I’ve been blowing things up my entire career, as safely as I can.
Where do you draw your inspiration from?
One place I know I draw inspiration from is when I can do something that I haven’t done before, or especially when I can do something that nobody’s done before. If you put something on the screen that nobody’s seen before, or you put it on the screen better than anybody’s ever seen it, that’s very satisfying.
Transition from Film to Digital
When you started working in the industry, did you have the chance to work with analog film? What was the transition from film to digital like?
I started at a time when there was nothing but film. I watched digital technology slowly creep into film and filmmaking, and then I watched it become an avalanche as it inundated the film community. I witnessed firsthand how resistant filmmakers were to digital technology. And, it was a very strong resistance at first.
I saw editors who refused to use an AVID or a nonlinear editor. I saw VFX people who believed that digital would never be good enough to do proper visual effects. Then, I watched that resistance very slowly erode until we got to where we are today, which is that digital has pretty much taken over. While it hasn’t taken over all of acquisition, it has certainly taken over 99% of post.
So, I’ve watched the progression from film to digital however slow and painful it was. And ever since the beginning of digital cinematography, I have told everyone and anyone who would listen that digital is going to take over many of the tasks of filmmaking. Get with it, because you’re either on this bus or under its wheels.
You are a true expert on digital cinematography, but what surprised you the most about what became important in digital cinematography in comparison to shooting with film?
The resistance to digital was not surprising. The velocity with which it evolved is still surprising. In comparison to shooting on film, it amazed me how good shooting on digital got and how quickly it got good.
“I think one of the things that has changed since the digital revolution started is the speed at which filmmakers learn new technologies. We have learned how to learn much faster.” – David Stump
When cinematographers gave manufacturers feedback about their cameras, we started off facing a lot of resistance from them. Now, at least at the ASC level, manufacturers are constantly pounding on the door, coming in and asking us to give feedback. They want to know what they need to change in the cameras to make us happier. So that has completely flipped.
I think one of the things that has changed since the digital revolution started is the speed at which filmmakers learn new technologies. We have learned how to learn much faster, you know, the pace of change is exponential.
One thing that is not a surprise to me, something I knew in my heart of hearts and predicted, is that digital would evolve into something that, for a new generation, has simply always been there. There’s a generation of film students now for whom digital has always existed, all their lives.
But what’s really interesting about that is the nostalgia and the love for doing things the old way on film. I’m constantly amazed at how often I do a lecture at a film school and hear the same thing over and over again: “We want to shoot our projects on film.” It’s hard to understand how kids can be so nostalgic for something they never had when they were younger.
So you’ve been an advocate for digital since the beginning. Can you walk us through your typical digital workflow in more detail?
Every show has a different workflow and it depends on what the target output or market for the project is going to be. A lot of the parameters are the same from every workflow along the way, but in terms of color and data management, they can vary widely. I’ve done shows where I did color correction on set, but I’ve also done a number of shows where I just developed a few looks, and plugged them in.
“The workflow evolves based on the participants’ appetite and tolerance for the technology.” – David Stump
Early on in the era of digital, I worked on a show in New England called The Bronx is Burning. We shot that with Thomson Grass Valley Viper cameras, which were notoriously hard to wrangle color for. I had Luther LUT boxes all around the stage for every camera and a menu of about 30 different looks for different scenes that we would plug in per camera, per scene.
I’ve also done the workflows where we write one or two LUTs for the show and just work with vanilla dailies and let the colorist trim them.
The workflow evolves based on the participants’ appetite and tolerance for the technology. Some people are nerdier than others and some people are artists. Sometimes you have to hide the workflow from production and sometimes you have to feature it. It depends on the amount of buy-in and how much the other filmmakers and studios involved embrace the technology.
What is your suggested workflow when the production has to deliver in HDR? Do you suggest previewing HDR on set?
The first thing that I have to say about shooting HDR is that you have to work out a color-managed workflow with whoever your lab is going to be. And, you have to work everything out in advance including looks and lookup tables.
I don’t like everyone to see HDR on set. I like to look at the one or two HDR monitors on set and pipe SDR to everyone else. That way, fewer questions arise and there are fewer problems to solve. So I think in HDR it is really important to close the loop and round trip the images thoroughly before you even start. Then, you can do HDR monitoring on set. But it’s also important that you pay attention to what the SDR work looks like and that you show it to everyone on set.
Do you feel that workflows on set and in post-production have become less diverse over the last few years and people have begun to adhere more to conventions?
I don’t think it would be even close to fair to say that workflows have become less diverse because there are so many new things that you can do in a workflow that you couldn’t do before.
For example, the last show on which I took a credit as workflow consultant was an Italian movie called Comandante. We wrote a 30-page document laying out the workflow with the guiding principle of preserving every field of metadata that could be recorded by the camera all the way through the workflow.
In a number of instances, we used Pomfort [Silverstack] to repopulate some of the metadata fields that had been stripped by some post-production operations that we had no control over. It was difficult to get people on board, but at least we proved that it could be done on a show.
How do you recommend ingesting metadata from set into post-production systems?
The first thing I do is write a document and the gist of the document is thou shall not delete.
“It’s been too easy to delete metadata for too long, and people don’t appreciate that there are savings of time and money to be had in preserving and using metadata.” – David Stump
The last workflow document that I wrote mandated that all of the files derived from the camera original be 100% populated with every field that the camera or the sound system could generate, and that no metadata be deleted at any point along the way, even if it meant that the people setting up the post-production pipeline had to relearn their jobs.
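The “thou shall not delete” principle can be sketched in a few lines of code. The snippet below is a hypothetical illustration only (it does not represent Silverstack’s actual API, and the field names are invented for the example): it treats metadata as simple sidecar dictionaries and restores any fields that a downstream operation stripped from a derivative file, without overwriting values the derivative legitimately changed.

```python
def restore_stripped_metadata(camera_original: dict, derivative: dict) -> dict:
    """Return the derivative's metadata with any fields that were stripped
    downstream restored from the camera-original record. Existing derivative
    values are never overwritten; fields are only added back."""
    restored = dict(derivative)
    for field, value in camera_original.items():
        restored.setdefault(field, value)  # restore only what is missing
    return restored


# Example (hypothetical field names): a transcode dropped the lens metadata.
original = {"reel": "A001", "lens": "Cooke Anamorphic/i 50mm", "tilt_deg": 4.2}
after_transcode = {"reel": "A001", "codec": "ProRes 4444"}
full = restore_stripped_metadata(original, after_transcode)
# 'lens' and 'tilt_deg' are restored; 'codec' is untouched.
```

The asymmetry is deliberate: the camera original is treated as the authoritative source for anything missing, while fields the derivative added or changed on purpose are left alone.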
It’s been too easy to delete metadata for too long, and people don’t appreciate that there are savings of time and money to be had in preserving and using metadata. I did a PowerPoint presentation on Comandante where I explained the value of using lens metadata from Cooke anamorphic lenses. I made a slide with a Gantt-style calendar page for a VFX shot.
So, if this theoretical VFX shot was difficult and was allocated four weeks of VFX labor for delivery, the first two weeks would be spent reverse engineering it: match-moving it and undistorting it for the compositing process. The next two weeks would be spent on the comp work. If it’s an easier shot, it might be one week of reverse engineering and three weeks of comp work.
However, if you preserve accurate lens and motion metadata, the reverse-engineering work drops to one or two days. It’s easy to see when you quantify it on a calendar: you can either spend a week and a half on reverse-engineering labor or, by applying automated metadata, spend that week doing creative work instead.
“Automated metadata through the entire imaging pipeline will eventually save tons of money in film production.” – David Stump
I always juxtapose automated metadata against what was done in the past, something that I call “meta paper.” The data regarding the lens, camera height, and tilt angle was all sort of estimated, written down on paper, and filed away in a notebook on a shelf somewhere in the editor’s room. When VFX or anyone else needed that data, someone had to go get the notebook, rummage through it, and find the piece of data that was needed.
If it’s automated and built into the files, you don’t have to bother anyone to get it, and because it isn’t generated by a human being, it’s neither distrusted nor prone to error. I think automated metadata through the entire imaging pipeline will eventually save tons of money in film production.
In a time when producers think the only way to save money on a show is to beat up rental vendors and the crew for their day rate, that’s a lot of money.
Is there any disadvantage to keeping track of every single possibly recorded metadata field?
Aside from the small amount of effort it will take to get there, I see no disadvantage. It’s a tiny amount of data compared to the image and sound data, and if you’re just pushing files around anyway, what is an editing system but a gigantic database?
What roles do color grades made a) in pre-production, b) on set, and c) for dailies play in the final grading?
Color grades made on set and in pre-production give studios, directors, and filmmakers confidence in what they’re doing. And sometimes, they are very close to what the finished product looks like. Other times, they’re a starting point from which you can diverge wildly in post-production. But, they do give confidence and they do create something that for lack of a better expression I call “buy-in,” which means that everyone has subscribed. That buy-in lets you then move forward in the day-to-day operation of the set. But you can’t move forward without buy-in. If the participants are skeptical of any of the steps along the way, life becomes way more difficult.
When it comes to dailies feeding the final grading, sometimes you hit a home run and what you got on set just needs to be trimmed. Other times it can vary wildly.
One of my mentors back in the day was Doug Trumbull. I always liked his explanation of the filmmaking process. He said, “You start by writing a script. Then you tear that script into single-word giblets of paper, and you spend the next year to year and a half reassembling it into a finished project. And you have to know that no movie is ever finished. You just ran out of time and money.”
Kind of like that one quote from Leonardo da Vinci: “Art is never finished, only abandoned.”
Do you use dynamic metadata within your VFX Workflow?
By way of example, on Comandante, we used dynamic metadata from Cooke anamorphic lenses. Those lenses used what’s called the /i system to provide dynamic metadata. We used it in conjunction with their JSON files that give you an ST map of the lens distortion. We applied those ST maps dynamically in real-time while focusing and irising the lenses.
The reason we did that was because we were using game-engine technology to present visual effects slap comps in near real-time on set, and the backgrounds for those comps were being generated by the Unreal game engine. If you know how the Unreal game engine works, it outputs a pinhole-camera image that has no lens distortion built into it.
Therefore, when you try to superimpose Unreal game engine images over real photography from a camera on set with an anamorphic lens, the live-camera and CG image distortions don’t match up. So, what we were doing with the metadata was driving a dynamic ST map to distort the output of the game engine to match the distortion of the real-world lens. It was very successful, and it ran in real time.
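To make the mechanism concrete: an ST map is an image whose first two channels store, for every output pixel, the normalized (s, t) source coordinates to sample from, so “applying” it is just a coordinate lookup. The sketch below is a minimal nearest-neighbor version in NumPy, for illustration only; the on-set system David describes ran on the GPU in real time, and the actual Cooke /i JSON field layout is not shown here.

```python
import numpy as np

def apply_st_map(image: np.ndarray, st_map: np.ndarray) -> np.ndarray:
    """Warp `image` using an ST map. st_map[..., 0] holds the normalized
    horizontal source coordinate s, st_map[..., 1] the vertical coordinate t
    (conventionally measured from the bottom of the frame). Nearest-neighbor
    sampling keeps the sketch short; a production implementation would
    interpolate bilinearly on the GPU."""
    h, w = image.shape[:2]
    xs = np.clip(np.rint(st_map[..., 0] * (w - 1)).astype(int), 0, w - 1)
    ys = np.clip(np.rint((1.0 - st_map[..., 1]) * (h - 1)).astype(int), 0, h - 1)
    return image[ys, xs]


# Sanity check: an identity ST map leaves the image unchanged.
h, w = 4, 6
image = np.arange(h * w * 3).reshape(h, w, 3)
st = np.dstack(np.meshgrid(np.linspace(0, 1, w), np.linspace(1, 0, h)))
warped = apply_st_map(image, st)
```

Driving this from lens metadata then amounts to regenerating (or interpolating) the ST map per frame as focus and iris change, which is exactly what makes the distortion “dynamic.”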
What is the most important metadata for each post workflow: editing, VFX, and color correction?
I think it’s all important. Usually, the most important piece of metadata is the one you don’t have. There’s kind of a goofy de facto rule in film: the only way to be sure that you won’t need a piece of equipment is to rent it and put it on the truck.
The same applies to metadata. The piece that you always need the most is the one you didn’t record. So I think they’re all important. Metadata should survive through every iteration of a file all the way to archival. So, once you know that it’s possible, then you also know that the only reason that it doesn’t do that is because people don’t try hard enough.
Where do you see room for improvement as far as the collection and transfer of metadata from set to post?
There are three areas where there’s room for improvement in the collection and transfer of metadata:
- Software that assumes no one is going to use the data and therefore drops it from the file.
- Finding the common naming, syntax, and definitions for all of those fields of metadata.
- Getting people to understand that metadata has value. Just because you’re not using it doesn’t mean that no one else in the imaging chain is.
What advice can you give young DPs and DITs to stay on top of the creative process? How important is understanding (and constantly learning) technical details?
Learning technical details is urgent, a lot more urgent than many of the young DPs and DITs I meet understand. There are some who just know intuitively that they need to study, and there are others who underrate the difficulty of the job. There’s always been this tendency to underestimate it.
This is why I wrote the book Digital Cinematography. It’s required reading at AFI, USC, UCLA, and every film school that I know of in LA. There is also a Mandarin translation of the first edition that is required reading at the Beijing Film Institute.
What are the current developments that will shape the work on set in the next few years?
I think specifically the developments of metadata. What I was able to do this year with dynamic metadata from anamorphic lenses hasn’t been done before. It is groundbreaking work.
Another example that I like to use is the first Star Wars movie. Principal photography was shot with anamorphic lenses. The filmmakers started off trying to use anamorphic lenses for visual effects, pretty quickly gave up on that, and went to VistaVision, a spherical format with a very large negative.
All of the visual effects work was done in VistaVision, and once they had approved it all, they put it on the printer and reduced it to a 35mm anamorphic negative. That happened because anamorphic lenses are so complicated and so difficult to reverse engineer. When you talk about match-moving or undistorting anamorphic lenses, it is, or at least until now has been, enormously difficult. This year it got vastly better because of the work done with Cooke’s /i lens protocol on Comandante.
While there is little understanding of the importance of metadata on set, there is, conversely, a high demand for it in post-production workflows. This knowledge gap between set and post creates additional, unnecessary administrative effort to prepare movie footage for creative work in post-production.
Luckily, the solution to this problem can start with you, the DIT! By recording every camera metadata field that can be recorded and safely transferring it from set to post, you enable post-production teams to maximize their creative workflows and minimize the time spent on administrative tasks.
When in doubt, always refer back to David’s guiding principle: “Thou shall not delete.”
Previously you mentioned that effective preparation on set leaves time for happy accidents. Can you give an example of a happy accident from set as a bonus for our readers?
I was the VFX cinematographer on the first X-Men movie. There’s a scene in a train station where Halle Berry uses her powers to blast one of the bad guys all the way across the train station through the air. My camera was a high, wide, and stupid VistaVision shot that, from the side, captured the entire action from left to right in one still shot. One static frame. There were other operators whose job it was to follow the action.
On take one, all the operators who were following the action missed the shot. The director blew his stack. You’ve never seen anybody so angry. So, when we came back to do the shot again, after redressing the entire set, I set up my high, wide, and stupid camera. I just gave the button to my first assistant and said, “This is a nothing shot. Just turn the camera on and slate it; I’m going to go do something else.”
I went to the camera truck and dug out the crappiest little 2C you’ve ever seen in your life, the tiniest little film camera, and I put a 75-millimeter anamorphic lens on it. I put it on a pan-tilt head on a high hat, and I took what they call a clamshell video recorder and put it in my lap. I sat under the reader board in the train station, and I had the set decorator bury me in suitcases and blankets so that the only thing poking out was the lens.
I rehearsed a couple of times watching them drag a sandbag back and forth on the wires that they were going to fly the actor on to check the lines. Then, when they called action I followed the actor all the way across the train station. And, I absolutely nailed it! When I climbed out from under the blanket and took my little clamshell recorder over to the playback area on set, we learned that all of the same operators missed all the same shots again.
It kind of made matters worse, because my shot was an unplanned shot; I just made it up and did it without telling anybody. So, after we had reviewed all of the other material, all the other playbacks, the director was screaming again. I handed the video playback operator my clamshell, and you could have heard a pin drop as the director watched the playback. Then he turned to me and said one of the scariest things I’ve ever had a director say to me: “You saved my movie.” It was horrifying.
[1] The Screen Actors Guild is making deals with low-budget, independent productions: the producer only has to sign a promissory document saying they will adhere to whatever the eventual contract is, and they can then hire SAG-AFTRA actors, albeit at a low budget. SAG won’t sign these deals with the big studios, the majors, or the streamers; you have to be completely independent.
A huge ‘thank you’ to David for sharing his expert insights into metadata with us! Want more from this interview? Keep reading for some bonus content!
Update 23. Oct 2023: Removed a claim without a source about studios’ potential savings using metadata.
David Stump, Filmmaker, Digital Cinematography Expert, Author of 'Digital Cinematography'
David Stump ASC BVK has been working in motion picture and television production as a Director of Photography, Visual Effects Director of Photography, Visual Effects Supervisor, and Stereographer (including both live-action work and 2D-to-3D conversion work), earning an Emmy Award, an Academy Award for Scientific and Technical Achievement, an IMAGO Technical Achievement Award, and an International Cinematographers Guild Award. David authored the book “Digital Cinematography – Fundamentals, Techniques and Workflows,” first edition (available in both English and Mandarin Chinese) and second edition (English only), and was a contributing author to both the “VES Handbook” and the “ASC Manual 11th Edition.”