Have you ever felt swamped by the challenge of navigating a foreign country’s unfamiliar environment and language? Uneasy about not knowing what to expect, or whether your language skills and grasp of local customs will get you by? Things could spiral out of control. On the other hand, the prospect of a valuable experience and a first-rate outcome is very tempting. Stepping into the world of virtual production evokes similar feelings of discomfort in many production teams, triggered by a lack of knowledge and confidence. These feelings prevent many film productions and DPs from even considering it, or cause them to back away from it during pre-production. Can you relate?
Based on an interview with Emmy Award-Winning Lighting Designer and VP Specialist Peter Canning, this article provides his personal insight into the current state of virtual production, its advantages from a DP’s perspective, and how to approach it more confidently. It also gives you a better understanding of the topology on set, the related roles and responsibilities, and how the DIT and Livegrade Studio are integral to bringing DPs back into their comfort zone.
Before heading into the unknown, what would be your first step to building more confidence? Reach out to somebody who has been there and already gained tons of experience.
The Current State of Virtual Production
Peter Canning is Founder, Head Designer, and VP Specialist at High Res, an independent design practice in Dublin working across the virtual production, film, television, live event, and architectural sectors. His career began after he graduated from Ravensbourne College of Design and Communication, London, where he majored in television lighting. Peter went on to become a programmer and lighting designer. Winning the Emmy for Outstanding Lighting Design in 2014 (as a team with Al Gurdon, Michael Owen, and Ross Williams) for the Sochi Winter Olympic Games opening ceremony was an important milestone in his career. He founded High Res 25 years ago and gradually moved from realizing interactive lighting projects into virtual production. The latest virtual production projects High Res has been involved in are “The Pope’s Exorcist”, a Russell Crowe feature with an April US release date; “Retribution”, a Liam Neeson feature expected on screens in the summer of 2023; “Anansi Boys”, an upcoming British fantasy TV miniseries for Amazon’s Prime Video; and two episodes of the second series of “Modern Love”, also for Amazon’s Prime Video.
As a lighting design company, High Res stepped into the world of virtual production from a different angle to most other companies.
Peter:
“We got into virtual production, maybe 2015 or 2016, from doing interactive lighting. So a lot of the time, we were using LED screens to create lighting effects that were not in-camera but out-of-camera. We’ve also been using LED systems for broadcast TV for many years. For example, we’re currently designing “Dancing with the Stars” in Ireland, using projection systems, LED systems, and lighting. And for all of those systems to work together to create one properly balanced camera picture, they all have to interact in synchronisation with each other. So that’s the big thing, it doesn’t matter if we are in a broadcast world, or live events, or film using virtual production technology – we would always synchronise lighting and LED screens together. So from how we’ve emerged in virtual production, the important thing is that we moved in from the lighting and director of photography side.”
As a company, High Res has evolved into a turnkey services provider for productions. So the range of services is much more comprehensive than just providing an LED screen or a server. It’s an end-to-end service that helps shape the entire production process from a pre-production stage.
Peter:
“We’ve come into managing right from the plate capture all the way through to the playback, to getting that content to relate to the script, playing it back on set, and sending it to the edit. So a lot of the productions we’ve done recently, we would have had a team that was involved in very early storyboards or had to deliver particularly creative solutions. So it’s a much, much wider arc of what we provide.”
Two major workflows have emerged in the virtual production world: a workflow with pre-rendered content played back on the LED screen, and a real-time workflow.
The pre-rendered workflow achieves a high level of image quality and immersion, with the one drawback that the perspective is fixed and does not shift in relation to the camera’s movement.
In some cases, a mix of LED screen and green screen might be the option to go with. Within a pre-rendered workflow, a 2.5D approach can be used to create more movement and perspective; it requires compositing the plates into separate layers. However, the resulting parallax effect is limited compared to the real-time workflow.
The second major workflow involves a game engine (e.g., Unreal Engine) playing back 3D-generated content in real time on the LED screen. This is probably the workflow you think of first in a virtual production context, thanks to its stronger media presence. It offers a lot of flexibility with assets and perspectives, and it also allows camera tracking to be introduced, which creates a parallax effect synchronized with the camera’s movement.
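To make the difference concrete, here is a minimal, purely illustrative Python sketch (all numbers are assumptions, not values from any real production) of how a 2.5D composite fakes parallax by shifting flat plate layers, whereas a real-time engine actually re-renders the scene from the tracked camera position:

```python
# Minimal sketch: how a 2.5D workflow fakes parallax by shifting
# pre-rendered plate layers. A real-time engine instead re-renders the
# whole scene from the tracked camera position. Illustrative only.

def layer_shift_px(camera_offset_m: float, layer_depth_m: float,
                   focal_px: float) -> float:
    """Horizontal screen shift of a flat plate layer at a given depth
    when the tracked camera translates sideways by camera_offset_m."""
    return focal_px * camera_offset_m / layer_depth_m

focal_px = 2000.0        # assumed focal length expressed in pixels
camera_offset_m = 0.3    # sideways move reported by the tracking system

# Three plate layers composited at assumed depths (foreground to sky):
for depth_m in (5.0, 50.0, 1000.0):
    shift = layer_shift_px(camera_offset_m, depth_m, focal_px)
    print(f"layer at {depth_m:6.0f} m shifts by {shift:6.1f} px")
```

Near layers move a lot while the sky barely moves, which reads as parallax – but since each layer is still a flat plate, the composite can never reveal geometry hidden behind it. That is exactly the limitation the camera-tracked real-time workflow removes.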
Peter shares an interesting view on these options, pointing out that in his eyes the pre-rendered workflow is underrated and the industry is missing out on its benefits.
Peter:
“During and shortly after the COVID pandemic, it was like the industry skipped a step. Everybody wanted to go game engine. Everybody went into the huge LED volume “Mandalorian”-scale virtual production sets, which completely served the purpose because people couldn’t travel. But now that people can travel again, we see a reduction in that type of huge volume work. So, pop-ups, which is what we do, are where we see the shift of focus. People have underestimated plate workflows, the pre-rendered green screen replacement. I think we will see more of that happening over the next few years because of the benefits of it as a tool. It’s an easier-to-understand and less risky workflow. I’d say 80% of our work is still with plates-focused workflows because it’s a process that gives photorealistic imagery and real light to start with. So you get all the benefits of virtual production, but you don’t have the concern of building a world in Unreal Engine and all the associated complications.”
“And there’s still a way to go, and not all projects make a match with a game engine workflow – especially when it comes to timelines and budget. For example, with “Modern Love”, we were commissioned quite close to the shoot. There was no way you would build a (roughly) 200-kilometre photorealistic train journey in Unreal Engine in the timeframe we had. But there was also no need because pre-rendered plates worked perfectly in that scenario.”
For a production considering virtual production and choosing a workflow, Peter explains why consulting early in the pre-production phase is so important.
We treat virtual production as a tool; we’re not selling it as a solution to everything.
Peter:
“We look at storyboards and then discuss with directors, DPs, and the art department, what the creative vision is to be delivered. And then, we start looking at the technology needed for that. Our approach is always: number one, look at the creative intent first. It’s not technology. You can find the technology to deliver the creative, but the creative is the most important thing.
And then we decide, well, what is the solution? Is this a game engine workflow? Is it something that can be done with plates? Or is it something – which happens – where the conclusion is that virtual production isn’t the right way to go? We treat virtual production as a tool; we’re not selling it as a solution to everything.”
Unfortunately, in Peter’s experience, many productions seek support too late in the process to incorporate virtual production into their project properly. Reaching out early is therefore the first adjustment a production should make toward building confidence and making a well-thought-out decision for any project.
The Benefits and Challenges of Virtual Production
How deeply have you dived into virtual production technology so far? Are you already aware of the benefits and challenges? How can it enrich the work of the DP? Let’s take a closer look, especially compared to shooting with a green (or blue) screen for imagery replacement in post.
First, shooting in an LED volume gives productions more scheduling flexibility. They can act more independently of location and daylight availability, reducing the struggle with weather conditions and saving time by not moving the whole crew from one location to another. Repeating a scene with consistent lighting and a fast reset to starting positions (e.g., driving scenes) can speed up filming. It also supports the actors’ performances by giving them sight lines and a more natural atmosphere to respond to than a green screen. And it gives the art and costume departments much more flexibility in choosing props and wardrobe. Peter recalls his experiences while filming the “Modern Love” train episode.
So over the three days we filmed in the volume, we had a consistency of light, and the actors loved it because they could look out the windows and at the reflections in the glass.
Peter:
“The director was very uninterested in filming with virtual production initially. He just wanted to film on a real train, saying he was not interested in the technology. But the COVID restrictions at the time meant that it was not possible to film on a real train.
Over the course of the shoot, he realised that if he were filming on a real train, he wouldn’t have been able to stop it and reverse it back. To do a shot again, we just hit stop and rewind. So over the three days we filmed in the volume, we had a consistency of light, and the actors loved it because they could look out the windows and at the reflections in the glass. Photorealistic reflections came from the outside world on the LED screen and didn’t have to be put in in post. And also, we had the most important thing – which is again where we come from as a company – the light. So as the train moves through forests or urban areas, all of the light and shadows that relate to the imagery actually happen on the seats and tables, etc., around the actors. So the benefits of that are huge. And at the end of the shoot, the director, having seen the benefits, said he never wanted to film on a real train again!”
So using the ambient light of the LED screen as a lighting source, instead of struggling with green spill or fix-it-in-post trade-offs, is an enormous advantage for the DP’s work.
Peter:
“We’re finding that DPs are actually reducing the amount of lighting they’re using on a virtual production stage. None of this is replacing lighting. But they’re engaging and encompassing the interactive light that comes from an LED screen as their primary source and then augmenting it. So they’re capturing much more than they would with a green screen. It looks much, much more real. You can also pull focus, which we did numerous times in “Retribution” for example, between items in a car’s rearview mirror and something on the LED screen. Pulling focus back to the foreground is simple with photorealistic imagery on the LED wall, but is very difficult to do with a green screen background.”
How does shooting with a green screen compare to shooting with pre-rendered content on an LED wall? Actually, there isn’t much difference: the basic workflow is the same whether you shoot plates in pre- or post-production. Only shooting 360° footage with several cameras adds complexity, but it also gives more creative control for decision-making on set, because you can see all the little hidden details in the LED wall’s content before hitting the record button.
One of the main hurdles for productions is that planning and scheduling demand much more attention, as the content for the LED wall must be captured and ready on the shooting day. So it’s all about content management!
Peter:
“One of the major difficulties is content management, script association, how the assets are being delivered, the expectation of them, and pre-lighting them – which needs time. We had a scene, for example, one day on “Nightflyers”, where an actor was ill and couldn’t film that day. Production changed the shooting schedule around last minute so that they could film alternative scenes, but without having the corresponding assets lined up for the screen. So we ended up having to turn the screen blue, making it a very expensive blue-screen solution!”
Despite all the apparent advantages, there are, of course, still difficulties inherent in an ever-evolving technology. Every project comes with unique stumbling blocks; the question is how to deal with them.
Peter:
“Every project has technical challenges at the moment. A lot of the time, the hardware isn’t capable of delivering some of the outcomes you want. Whether it’s refresh rates, camera shutter angles, or even the content playback bit depth and the control that some of the larger visual effects houses want. Technology is constantly evolving. The only thing we’ve not had a problem with is LED processing (our preference is Brompton). It’s also important to say: We’re not dealers or distributors. We always try to develop an R&D style relationship with companies, and affiliate ourselves with partners in industry that strive to test the technology, because we want to put forward a workflow to a client that we’re confident in. And some of the technical challenges need to be workshopped out. We always do screen test days. A big topic at the minute is ACES because of the 16-bit pipeline. None of the playback servers are playing back at a 16-bit depth. We’ve never managed to playback content above a 12-bit depth. It’s a prime example of the limits of technology at the moment.”
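To put rough numbers on that bit-depth gap (treating the signal as integer code values for simplicity – ACES interchange is typically 16-bit float, but the quantization intuition is similar), here is a quick sketch:

```python
# Back-of-envelope sketch: the gap between a 16-bit pipeline and 12-bit
# playback. Bit depth sets the number of code values per channel; fewer
# values mean coarser quantization, which can show up as banding in
# smooth gradients on the LED wall. Illustrative simplification only.

for bits in (10, 12, 16):
    levels = 2 ** bits
    step_pct = 100.0 / (levels - 1)  # smallest step as % of full range
    print(f"{bits:2d}-bit: {levels:5d} code values, "
          f"minimum step = {step_pct:.4f}% of full range")
```

A 16-bit pipeline resolves steps roughly sixteen times finer than the 12-bit playback that is currently achievable – which is why the topic keeps coming up in screen tests.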
Ultimately, the advantages and disadvantages must be weighed against each other, and a decision must be made depending on the project’s requirements, budget, and creative intent.
The “Fix It in Pre!” Approach and the Topology of a Virtual Production Set
Working with an LED volume has increased the significance of pre-production. Therefore, Peter recommends getting key people like the DP on board as early as possible. Shaping the final outcome is a collaborative process.
Peter:
“We will develop an outline of what the process looks like, a tech list from our point of view; then it gets costed. In the next step, content has to be decided on – whether it will be bought, captured, or built. After that, we go into a pre-visualization stage. This is the stage where we start modelling a virtual representation of what will become the physical LED volume, including 3D models of props, etc. It’s also the point where you find where there might be issues, such as whether you are going to see the floor or any gaps in your design. It’s great if the director and DP are available for pre-viz. It’s not always possible, but if they are in, they might see something they want to change before they’re actually on set. Pre-viz is an insurance policy.”
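The kind of issue pre-viz catches – will the frame see past the edges of the wall? – boils down to simple camera geometry. Here is a rough, simplified sketch of such a check; the sensor size, lens, and wall dimensions are assumptions for illustration, not values from any real stage:

```python
import math

# Rough pre-viz-style check (illustrative): does the camera's framed
# area stay inside the LED wall, or will the shot see past its edges?

def frame_width_at_wall(sensor_width_mm: float, focal_mm: float,
                        distance_m: float) -> float:
    """Width of the framed area at the wall, simple pinhole model."""
    half_fov = math.atan((sensor_width_mm / 2.0) / focal_mm)
    return 2.0 * distance_m * math.tan(half_fov)

wall_width_m = 12.0  # assumed LED wall width
framed = frame_width_at_wall(sensor_width_mm=24.9,  # assumed S35 sensor
                             focal_mm=25.0, distance_m=6.0)
print(f"frame covers {framed:.1f} m of a {wall_width_m:.1f} m wall -> "
      f"{'OK' if framed <= wall_width_m else 'gap visible!'}")
```

A real pre-viz does this in 3D with the actual set model, of course, but the principle is the same: find the gaps before you are on set.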
Building an LED stage usually takes a few days and ends with a calibration day, on which the color pipeline is set up as agreed with the DP and the post house. Once this is done, the DP comes in for the pre-light, which should be considered another important step to allow time for.
Peter:
“Our suggestion on pre-light days on set with the DP is to turn the screen on and look at how the screen is actually affecting the world you’ve created before. And then see: Do you need to augment that world with lighting? How much of the ambient light can you capture for your scene? And how much do you need to add? So we’ve gradually had DPs that come in, light, and then dim their lights, dim and dim. In the end, you realise they’ve got two or three lights on, and they’re taking full advantage of the LED screen.”
On set, each production can vary hugely, but as an example, you will have a virtual production supervisor, a server programmer and operator, a networking and processing technician, a few technicians maintaining the screen, and, if it’s a game engine workflow, maybe another five or six people doing the integration with camera tracking, playback, and asset management. As a team, they are referred to as the Brain Bar. Take a look at the following graphics to better understand how everything and everybody is linked on stage.
How Livegrade Studio Catapults Directors of Photography Back Into Control in Virtual Production
It all comes down to efficiency on set. To operate efficiently, scheduling and strategic planning alone won’t do the job. Peter points out an aspect that, in his eyes, is crucial to succeeding with a virtual production project.
You can spend millions on a virtual production piece, but if the DP isn’t engaged and interested in using it, you won’t get the results.
Peter:
“It’s funny – talking to Epic Games, and a number of other people over the last year – they have started focusing on DPs, because, to be honest with you, they’re actually the key people to engage in virtual production. You can spend millions on a virtual production piece, but if the DP isn’t engaged and interested in using it, you won’t get the results.”
“I also feel quite strongly that it’s really important to involve people in the process that should be part of the process. You don’t take control away from them. Because firstly, they won’t engage in the process, I think, because they feel that they’ve been excluded from it. But also, again, just from experience, they are usually the people that have the experience to make these decisions. It’s important to give control to the right people.”
Looking specifically at the role of the DP, the end goal is to capture a balanced camera picture – including the LED screen and lighting – that will be sent to post. This is the point where Livegrade Studio can demonstrate its full power in supporting the DP’s creative intent. High Res uses Livegrade Studio in its virtual production pipelines to allow the color grade of the content going to the LED screen to be adjusted in real time.
Pomfort’s integration for Livegrade Studio with Brompton Tessera Processors has been an absolute game changer for us. Because it allows control to happen by the people who need to have control.
Peter:
“Pomfort’s integration for Livegrade Studio with Brompton Tessera Processors has been an absolute game changer for us. Because it allows control to happen by the people who need to have control. Via Livegrade, we can design a system whereby the heads of department retain full control of their elements, i.e., the DP has control of the processor via the DIT through Livegrade. In the same way, we give the lighting or intensity control to the lighting operator. So the lighting operator and gaffer can control the intensity of the screen, while the DIT can control the color grade. On a normal film set, those are the two people – gaffer and DIT – that are on either side of the DP. The DIT will look at the color grade, LUT, and exposure. If you give the DP control of those elements, then he or she is in full control of creating a balanced image. A balanced frame between the LED screen, the lighting, and all the elements working together. And it’s our job to design a system which facilitates that.”
Let’s shine some light on a few real-world experiences Peter shared with us.
Peter:
“The first time we used Livegrade Studio with the built-in integration for Brompton, was on “Retribution” with DP Flavio Martínez Labiano and DIT Peter Marsden. There’s a tunnel scene with a car chase through the Tiergarten tunnel in Berlin. The lights of the tunnel captured in the plates were on the right-hand side of the car. But in the LED volume, we had to flip them, so the lights were on the driver’s side, so that when the lights were going past, Flavio could get that effective lighting to hit the actor’s face. When we came to shoot, they felt the blacks were lifted. So Flavio was saying to Peter Marsden: “Can you crush the image to make it quite high contrast? So we get more of that effect of motion.” And they were able to do that with Livegrade immediately and live.”
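For the technically curious: a request like “crush the image” is typically expressed as an ASC CDL correction, the kind of adjustment a DIT can apply live from the grading system. The sketch below uses the standard CDL formula with made-up values – not the actual grade from the shoot – to show how a steeper slope and a negative offset crush lifted blacks while increasing contrast:

```python
# Minimal sketch of the standard ASC CDL transfer function, the kind of
# live correction a DIT can apply to the screen feed. Values are made
# up for illustration. "Crushing" lifted blacks is essentially a
# steeper slope plus a negative offset: near-black values clamp to 0
# while mids and highlights gain contrast.

def asc_cdl(value: float, slope: float, offset: float,
            power: float) -> float:
    """Apply ASC CDL to one normalized channel value (0.0 to 1.0)."""
    out = value * slope + offset
    out = min(max(out, 0.0), 1.0)  # clamp before the power stage
    return out ** power

slope, offset, power = 1.25, -0.06, 1.0  # assumed high-contrast grade

for v in (0.05, 0.18, 0.5, 0.9):  # lifted black, mid grey, highlights
    print(f"in {v:.2f} -> out {asc_cdl(v, slope, offset, power):.3f}")
```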
“On “Anansi Boys”, which we did last year with Ian Marrs as a DIT, they ended up running a second laptop with Livegrade specifically for the LED screen. They found it very beneficial from the point of view of being able to balance plates and balance the image for the T-stop they wanted to work on.”
So we said: ‘No problem! Here’s Livegrade!‘. So the DIT got involved immediately and graded the plates to exactly what the DP wanted. And the feedback was: ‘This is brilliant!’
“We had a really good example in a project in the latter half of last year. We did the “Pope’s Exorcist,” a Russell Crowe feature. They needed to use library footage of plates driving through the countryside, which they purchased, rather than shoot the plates. So they had content that worked, but had to be graded to suit the creative. The plates were graded ahead of the shoot, but when the DP arrived to do the pre-light, he looked at them and said: “This isn’t right, now that I see it. I want it to be darker, I need it to be bluer, I need it to …” So we said: ‘No problem! Here’s Livegrade!‘. So the DIT got involved immediately and graded the plates to exactly what the DP wanted. And the feedback was: ‘This is brilliant!'”
Another benefit of the DIT having Livegrade control over the content on the LED screen is that it can save work for other people on set – for example, the roles involved in operating the LED screen’s servers.
Peter:
“The user interface is a UI that you add a device to, and the device is the LED screen – it’s a process that is really straightforward for the DIT. Decisions can be made about how bright an image is, or the slight color grade of an image, or something like that, without involving the media server operators, who might have a whole load of other items to deal with. They might be programming for something else, they might be moving horizon lines, they might be prepping some asks from the DP to do with light cards or whatever. When you have Livegrade in place, the DP doesn’t have to go over to the guys running the LED screen. He just talks to his DIT and says: “Can you just put some pink in the screen? Or can you adjust the contrast? Can you crush the black levels?”. These are all requests the DIT can handle instantly without having to involve the other people that are running the LED screen. And we find from a communication point of view that it is so much more efficient, and it also empowers the DP to get their creative thinking across really quickly.
“When we did “Foundation” for Apple, the visual effects supervisor, Chris, said to me: “You guys need to do a translation on all of this processing stuff.” Because, you know, I don’t talk about “brightness”. I say “Can you take it down a stop?”. There was a language issue between the visual effects team and guys that might have come from broadcast.”
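The translation Peter mentions is, at its core, simple arithmetic: photographic stops are powers of two, so “take it down a stop” means halving the linear level. A tiny sketch of the conversion:

```python
# "Take it down a stop" translated into processing terms: photographic
# stops are powers of two, so each stop down halves the linear level.

for stops in (0.5, 1.0, 2.0):
    factor = 2.0 ** -stops
    print(f"down {stops} stop(s) -> multiply linear level by "
          f"{factor:.3f} ({100 * factor:.1f}% brightness)")
```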
Peter also shares his thoughts about the importance of the DIT role and having the DIT cart as a central decision-making spot close to the DP.
Peter:
“The DIT is, I find, an underestimated role in the virtual production process. It’s so important! Because with the DP’s instruction, they are balancing the picture. They are creating the T-stops with the DP. The DP is in charge, but the DIT’s role is so important to get what I mentioned earlier, which is the balanced picture. They can look at the color, they can look at the LUT. They have the tools to look at the brightness and the contrast of the image. So they are the best people for the DP to interact with to make decisions on how the picture is composed. They also normally have the best calibrated monitor on set!”
Summarizing his experiences on a diverse range of productions, Peter concludes:
Peter:
“We were really impressed with Livegrade as a solution, as well as with the UI. Anybody we’ve dealt with is completely engaged in the process and loves it. I’m still a believer that the point where we integrate Livegrade – which is on Brompton, at the very end of the pipe before it hits the screen – is the right place to do it. Because otherwise, you’re compressing the signal before it even gets into the processor. So I think if you are adjusting contrast or brightness or anything like that, it should be done on the processors and not on the servers.”
Daring a quick look into the future, what does Peter expect to happen on the technical horizon?
Peter:
“The technological advancements happening in virtual production with LIDAR, NeRF, photogrammetry and AI scanning are unbelievable even at these early stages. I’m really excited to see where this will go. Another big thing will be the discussion I’ve been having with post houses about whether you could just have the background CDL that can be sent to post. And metadata that’s usable. People who are in post sometimes were never on set. And if you just give them tons of metadata and do not explain what it is used for, it just gets dumped. We are also now moving to a stage where we would like to try playing back content in log. We are testing this in-house at the moment and actually putting the LUT on in the processing and allowing full LUT control from Livegrade, which would give the DP complete color control on set. That’s our next step!”
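Why playing back in log and applying the LUT at the very end of the pipe is attractive can be sketched with a generic log curve – an illustration, not any specific camera’s encoding: a log signal spends its code values evenly across stops, so the look can be changed late in the chain without re-quantizing the shadows.

```python
import math

# Illustrative sketch of a generic log encoding (not a real camera
# curve): linear scene light is mapped so that each stop of exposure
# gets a roughly equal share of the signal range. Grading on top of
# such a signal at the end of the pipe keeps shadow detail intact.

def to_generic_log(linear: float, black: float = 0.01,
                   white: float = 16.0) -> float:
    """Map linear scene light into a 0..1 log signal."""
    if linear <= black:
        return 0.0
    return math.log2(linear / black) / math.log2(white / black)

for stops in range(-4, 5, 2):  # -4 to +4 stops around 18% mid grey
    linear = 0.18 * (2.0 ** stops)
    print(f"{stops:+d} stops -> linear {linear:7.4f} "
          f"-> log signal {to_generic_log(linear):.3f}")
```

Each two-stop step adds the same amount of signal, which is exactly the property that gives a downstream LUT or CDL the latitude to reshape the image.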
Conclusion
Succeeding in virtual production might not be as hard as you thought. If you think your project would benefit from shooting in a virtual production environment, Peter’s recommendations and insights are a good starting point: get informed and involved as early as possible, fix it in pre, and put the most capable people in a position of control! This will affect the whole collaborative system on set positively, so that the production process becomes an immersive experience for everybody.
Peter:
“One of the things we as a company have strived to do is to simplify all of this. There is a lot of misconception about virtual production, such as new roles replacing people or eliminating certain techniques. It should be a collaboration and integration with all of the departments on a production. And really, virtual production, working in an LED volume, is just another tool. There is a lot of jargon, there’s a lot of technology talk, a lot of developers, etc. But you can actually simplify it and get beautiful, incredible results, and it should be simplified.”
Thank you so much for sharing these insights with us!
Peter Canning, VP Specialist at High Res
Peter Canning is Founder, Head Designer, and VP Specialist at High Res, an independent design practice in Dublin working across virtual production, film, television, live event, and architectural sectors. He has emerged from interactive lighting design into virtual production and has contributed to various productions worldwide with technical and creative proficiency.