Overview
On a film set, timing is everything. While many crew members stand close enough to the actors to hear their dialogue live, most depend on video monitors to see what the camera captures. But the image they see isn’t truly live — it’s delayed by the milliseconds it takes for the video signal to travel through the camera, transmission system, and video setup before reaching their monitoring devices.
This delay, technically known as latency, can quickly become noticeable and frustrating, especially when the image you’re watching isn’t in sync with the live sound around you. For departments that depend on real-time visual accuracy, such as focus pullers or directors, even a few frames of latency can make all the difference.
That’s why achieving low-latency live feeds is one of the most important goals for any video assist setup. In this article, we’ll explore which factors influence latency on set, outline how to measure it, and share best practices for keeping it to a minimum. Finally, we’ll show how Reeltime helps you deliver the fastest possible live video feeds from camera to monitor.
The problem with latency and why you want to reduce it
As visualized below, there’s always going to be some delay between the live action and its representation on the monitor.

Different crew roles on set have varying sensitivities to latency, but all benefit from fast, synchronized video feeds. For example:
- Directors want to see the live image in sync with the dialogue they hear on set. A mismatch between audio and picture can make it difficult to judge performance or pacing.
- Focus pullers rely on instant feedback from the camera image to adjust focus with precision. Hence, they often use a dedicated direct feed from the camera to minimize delay.
- Other crew members near the scene are easily distracted when the visual signal lags behind the real-world sound they hear.
In short, latency disrupts the natural flow of on-set collaboration. The goal of every video assist system is therefore simple: make the live video feed feel truly “live.”
Parameters to consider: What influences latency?
Let’s look at the phenomenon in a bit more detail so we can make informed decisions for our setup and workflow. Several components in the video chain contribute to total system latency. Understanding where the delay comes from helps you identify where improvements can be made.
1. Hardware and transmission delays
Each stage of signal processing introduces some delay.
- Camera SDI outputs: The time from sensor to the camera’s SDI output port varies between manufacturers and camera models. In Pomfort’s lab, the ARRI Alexa 35 shows about 3 frames of delay, while the RED Helium 8K camera shows about 5 frames. Digital cinema cameras from other manufacturers typically show a delay of around 4 frames (the short conversion sketch below this list puts such figures into milliseconds).
- Wireless transmitters: Convenient and flexible, but they can also introduce latency under certain circumstances. Delay times vary significantly between high-end systems and more affordable products. The distance between transmitter and receiver can also influence delay with some models. If you want to optimize your setup for the lowest latency possible, consider wired connections whenever you can.
- Video I/O devices and software: Capturing, processing, and outputting video feeds through interfaces and software naturally introduces additional latency.
Good-to-know: Video routers and LUT boxes usually do not cause additional delay.
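How much a given frame count matters depends on the project frame rate. As a rough aid, here’s a minimal Python sketch that converts frames of delay into milliseconds; the camera figures are the approximate values quoted above, and the frame rates are simply common examples:

```python
# Minimal sketch: convert frame-based delays into milliseconds.
# The camera figures are the approximate values quoted above; the frame rates are common examples.

def frames_to_ms(frames: float, fps: float) -> float:
    """Delay in milliseconds represented by a number of frames at a given frame rate."""
    return frames * 1000.0 / fps

camera_delays = {
    "ARRI Alexa 35": 3,
    "RED Helium 8K": 5,
    "typical digital cinema camera": 4,
}

for name, frames in camera_delays.items():
    for fps in (24.0, 25.0, 30.0):
        print(f"{name}: {frames} frames @ {fps:g} fps ≈ {frames_to_ms(frames, fps):.0f} ms")
```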
2. Display latency
Even after the signal reaches the monitor, the display’s internal processing can introduce additional delay. Therefore, always enable the “fast mode” or any equivalent low-latency setting on your SDI monitors to minimize this effect. Refer to your monitor manufacturer’s user manual to find out whether your monitor includes a fast mode or similar setting, or if it already operates with its fastest possible processing by default.
3. Syncing all sources
To ensure consistent timing, all audio and video sources must be properly synchronized. The sound department typically manages timecode distribution, while cameras should be timecode-synced or genlocked to maintain precise frame alignment.
Genlock itself doesn’t introduce latency; it simply ensures that all video devices start each frame at exactly the same moment. When a device locks to a genlock signal, it may briefly adjust its internal timing during synchronization, but this happens only once and doesn’t cause any ongoing delay.
While proper synchronization doesn’t directly reduce latency, it helps avoid inconsistent delays between signals, which can also be distracting on set.
4. Syncing audio and understanding audio delay
Just like video, audio signals can also experience delay — both in the digital chain and in the real world. In most setups, audio signals are processed and transmitted faster than video. Adjusting the audio delay in Reeltime compensates for this difference, ensuring that both signals stay perfectly synchronized.
Fun fact: even physical sound has latency. In the real world, sound travels at roughly 343 meters per second. That means if the director stands 10 meters away from the actors, the dialogue will reach them about 29 milliseconds later, which is almost one frame of delay at 30 fps.
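To double-check that figure, here’s the underlying arithmetic as a small Python sketch, using the same example distance and frame rate:

```python
# A quick check of the figure above: acoustic delay over distance,
# expressed in milliseconds and in frames at a chosen frame rate.

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at room temperature

def acoustic_delay(distance_m: float, fps: float) -> tuple[float, float]:
    """Return (delay in milliseconds, delay in frames) for sound travelling distance_m metres."""
    delay_s = distance_m / SPEED_OF_SOUND_M_PER_S
    return delay_s * 1000.0, delay_s * fps

ms, frames = acoustic_delay(distance_m=10.0, fps=30.0)
print(f"10 m at 30 fps: {ms:.0f} ms ≈ {frames:.2f} frames")  # roughly 29 ms, 0.87 frames
```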
While this natural delay is usually negligible on set, it’s a helpful reminder that sound and picture are always subject to timing differences, whether caused by distance, transmission, or signal processing.
Best practice for reducing latency
Once you understand where latency originates, you can take targeted steps to minimize it. The following best practices help you optimize your setup for the fastest possible live feeds on set.
- Prioritize wired connections whenever possible. When wireless transmission is necessary, choose high-quality systems that are optimized for minimal latency.
- Activate low-latency display modes on all monitors if available.
- Choose video I/O devices and software optimized for real-time playback.
- If you can live with unprocessed images, make use of dedicated features that allow bypassing video processing in your video I/O and software. We’ll cover Reeltime’s options for achieving the lowest possible latency, including live bypass modes, further below.
In addition, it’s helpful to know the latency of all components in your setup, including the camera. While the choice of camera model is often not in your control, understanding how much delay it introduces allows you to better pinpoint issues and avoid attributing latency to other parts of your signal chain.
To do that effectively, you first need to measure how much latency your setup actually introduces.
Measuring your setup’s latency
Knowing how much delay your system introduces is the first step toward improving it. Measuring latency helps determine whether performance is within an acceptable range or if optimization is needed.
Why measuring matters
For video assist teams, accurate measurement ensures that what’s seen on set reflects reality as closely as possible. Latency that exceeds a few frames can cause miscommunication between departments and reduce confidence in the live feed.
Best practice for measuring latency
A simple and reliable method is to use a timecode slate or another visual reference that includes a precise real-world cue. Here’s how we do it at Pomfort:

- Real-world reference: Place a timecode display (or a device showing a running clock) within the camera frame.
- Direct live feed (camera latency): Use the direct SDI output from the camera as your reference signal. This is the fastest way to get a signal on a monitor with your production camera and serves as the baseline for understanding how much latency your video I/O and software introduce.
- Input latency: Compare the reference feed to the signal displayed in your video assist software to determine input latency. In most setups, the input latency is very close to that of the direct feed and, depending on the display latency of your monitor, the software feed can even appear slightly faster.
- Output latency: Measure the delay from the software’s output to the external monitor to determine output latency.
To document your results, you can take photos or a slow-motion video with a smartphone. By analyzing all visible timecodes in the image, you can pinpoint which part of the signal chain contributes most to the total delay.
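If you want to turn such a photo into a concrete number, the evaluation boils down to a timecode subtraction. The following minimal Python sketch shows the idea; the timecode readings in it are hypothetical placeholders for the values you’d read off your own test image:

```python
# Minimal sketch for evaluating a test photo: the latency is the difference between the
# real-world timecode reference and the timecode visible on the monitor in the same photo.
# The timecode strings below are hypothetical; read the actual values off your own test image.

def tc_to_frames(tc: str, fps: int) -> int:
    """Convert a non-drop-frame timecode 'HH:MM:SS:FF' into an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def latency_in_frames(reference_tc: str, monitor_tc: str, fps: int) -> int:
    """Frames of delay between the live reference and what the monitor shows."""
    return tc_to_frames(reference_tc, fps) - tc_to_frames(monitor_tc, fps)

fps = 25
delay = latency_in_frames(reference_tc="10:14:22:18", monitor_tc="10:14:22:13", fps=fps)
print(f"Measured latency: {delay} frames ≈ {delay * 1000 / fps:.0f} ms")  # 5 frames ≈ 200 ms
```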
Typical video assist setup
When the measured latency values are higher than expected, it’s helpful to create a diagram of your setup to troubleshoot potential sources of delay. A simple representation of your signal chain, like the example depicted below, helps visualize the testing setup and the key measurement points. In our case, the signal chain consisted of: camera and audio sources → video router → video and audio input → Reeltime → video and audio output → video router → monitor.

4 Reeltime features that ensure ultra-low latency live feeds
Reeltime is designed with low-latency performance at its core. It provides several mechanisms to keep live video signals as fast and direct as possible.
For all supported video I/O manufacturers (AJA, Blackmagic, and DELTACAST), Reeltime uses the fastest methods each manufacturer’s SDK provides to ingest and output signals.
By leveraging the GPU performance of modern Apple silicon chips, Reeltime processes signals in real time without introducing additional delay. Using a video I/O device for input and output in Reeltime typically adds between 2 and 3 frames of delay. But there are four features that can lower your latency significantly:
1. Automations for instant video routing
You can automate video routing configurations using Reeltime’s Automation Manager. This allows routing changes to trigger automatically based on specific events. For example, you can apply different configurations depending on whether an output slot receives a live or playback signal. This ensures the director always receives a direct camera feed with minimal latency. This feature bypasses video processing, so latency is reduced to a minimum, though the image will be unprocessed. Take a look at how to set up automations for video routing in the video below:
2. Automatic pass-through for AJA video I/O devices
Beyond automated routing, Reeltime can also deliver direct live feeds without a dedicated video router. Its automatic pass-through modes for compatible AJA devices enable near-zero latency between camera input and monitor output. Like automated video routing, this feature bypasses processing to achieve the lowest latency possible but outputs an unprocessed image. For a quick overview, we’ve summarized Reeltime’s automatic pass-through mode in the video below:
When using two or more simultaneous pass-throughs on a single I/O device, it’s important to genlock the camera signals. For example, if you want to show camera A on the director’s monitor and camera B on the script supervisor’s monitor (both as pass-through signals), ensure both cameras are genlocked. This can be done by using an Ambient Lockit, for example.

3. Full-screen output via HDMI or Thunderbolt
But what if you still want to provide processed live feeds, for example, to deliver tailored monitor views for different roles on set?
Reeltime’s full-screen output integration enables direct signal distribution to HDMI, DisplayPort, and Thunderbolt monitors via your Mac’s built-in ports. This allows standard computer displays or TVs to serve as reliable, low-latency monitors on set. When operated at 60 Hz, you’ll experience only 1–2 frames of added delay compared to a direct camera-to-monitor feed.
The 60 Hz refresh rate usually doesn’t introduce any undesired effects on the monitors as the original frames are simply displayed multiple times, without interpolation or frame blending. While it’s possible to use the original frame rate (for example, 24 fps), doing so results in significantly higher latency. This behavior isn’t specific to Reeltime and can also be observed when using any external display in macOS at a refresh rate lower than 60 Hz.
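To illustrate why repeating frames is harmless, here’s a small Python sketch of how 24 source frames map onto 60 display refreshes. It’s a general illustration of refresh scheduling under the assumption that each refresh simply shows the latest available frame, not a description of Reeltime’s internals:

```python
# Illustrative sketch (general refresh scheduling, not Reeltime internals): how 24 fps source
# frames map onto a 60 Hz display when each refresh simply repeats the latest available frame.
from collections import Counter

SOURCE_FPS = 24
DISPLAY_HZ = 60

# For every display refresh within one second, pick the most recent source frame.
shown_frame_per_refresh = [int(refresh * SOURCE_FPS / DISPLAY_HZ) for refresh in range(DISPLAY_HZ)]

# Count how many refreshes each source frame stays on screen.
repeats = Counter(shown_frame_per_refresh)
pattern = [repeats[frame] for frame in sorted(repeats)]
print(pattern)  # [3, 2, 3, 2, ...]: each frame is held for 2 or 3 refreshes, never blended
```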
For a cost-effective multi-monitor setup, the quad view mode allows you to output a 4K signal via HDMI or Thunderbolt, which can then be split into four independent HD-SDI outputs using converters such as the AJA HA5-4K or similar devices.
4. Low-latency local streaming [Reeltime Pro only]
While some roles require high-quality SDI outputs, others might prefer real-time streams on their mobile devices. Reeltime Pro utilizes the on-set wireless network for local streaming to its dedicated companion apps, which include Reeltime Viewer (available for iOS and iPadOS) and Reeltime Cinema (visionOS). All apps feature industry-standard encryption and a custom protocol to ensure low-latency, reliable viewing with about one frame of delay compared to the direct live feed.
Overview of test results: Latency with different Reeltime video setups
The following overview summarizes the typical added latency introduced with different Reeltime video setup configurations. Data is based on extensive testing in the Pomfort Test Lab.

Conclusion
On-set latency isn’t just a technical metric. It directly affects collaboration, precision, and the creative flow of a production. By understanding where delay comes from, learning how to measure it, and applying best practices across your signal chain, you can dramatically improve the responsiveness of your live feeds.
With its optimized video processing pipeline and dedicated low-latency features, Reeltime helps you achieve the fastest possible live monitoring experience. That way, your video feeds stay in sync with the real world, frame by frame.
