Pomfort Basics

Timecode in Digital Cinematography – An Overview


The moment you hit record, digital film cameras automatically create timecode – without much interaction from the operator. Typical production and post-production workflows rely heavily on the timecodes assigned to every single frame that’s shot during a production. So let’s take a closer look at why and how timecode is created and how it’s used in later post-production activities, to better understand the importance of having a unique identifier per frame.

Labeling and Identifying Individual Video Frames

Timecode – in particular SMPTE timecode (defined in SMPTE 12M, a two-part standard) – is the common means for labeling and identifying individual video frames, e.g. on tape or in a video signal (such as SDI). The format defines one number each for hours, minutes, seconds, and frames, separated by colons (such as “01:15:47:11”). Besides the encoding of the numbers, the standard also defines special cases, such as drop-frame counting for fractional frame rates (like 29.97 Hz – or, to be precise, 30/1.001 Hz), which is conventionally written with a semicolon before the frames field.
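
To make the counting rules concrete, here is a minimal Python sketch (not taken from any product) that turns a zero-based frame index into a timecode label – once with plain non-drop counting, and once with the 29.97 fps drop-frame rule, which skips the labels ;00 and ;01 at the start of every minute except each tenth minute:

```python
def frames_to_tc(frame_number, fps):
    """Non-drop timecode: plain base-60/base-fps arithmetic."""
    ff = frame_number % fps
    total_seconds = frame_number // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = (total_seconds // 3600) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def frames_to_dropframe_tc(frame_number):
    """29.97 fps drop-frame: 17982 frames per 10 minutes, 1798 per
    'dropping' minute; 18 labels are skipped every 10 minutes."""
    ten_min_blocks, rest = divmod(frame_number, 17982)
    if rest > 2:
        frame_number += 18 * ten_min_blocks + 2 * ((rest - 2) // 1798)
    else:
        frame_number += 18 * ten_min_blocks
    # After re-adding the skipped labels, count as if it were plain 30 fps:
    tc = frames_to_tc(frame_number, 30)
    return tc[:8] + ";" + tc[9:]   # semicolon marks drop-frame

frames_to_tc(11, 24)            # "00:00:00:11"
frames_to_dropframe_tc(1800)    # "00:01:00;02" (labels ;00/;01 skipped)
```

This is why a drop-frame label stays within a few frames of wall-clock time, even though no actual image frames are ever dropped – only label values are skipped.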

Timecode in Video Files

An intuitive idea would be that each video frame simply gets its own timecode value. That’s the case, for example, with image sequences, where each file of the sequence includes a metadata field for the timecode of the file’s frame. But there are more advanced concepts as well: in QuickTime files, for example, timecode is stored not as an attachment to each frame, but as a separate track. A QuickTime file can therefore have a timecode track with a different frame rate than the video, or even multiple timecode tracks.
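
The track idea can be illustrated with a small sketch. The `TimecodeTrack` class below is a hypothetical stand-in for what such a track stores (one start value and a rate, not one label per frame); it is not an actual QuickTime parser:

```python
from dataclasses import dataclass

@dataclass
class TimecodeTrack:
    start_frames: int   # timecode of the first frame, as a frame count
    fps: int            # rate of the timecode track (may differ from the video)

    def tc_for_frame(self, index):
        """Derive the timecode label of video frame `index`."""
        total = self.start_frames + index
        ff = total % self.fps
        ss = (total // self.fps) % 60
        mm = (total // (self.fps * 60)) % 60
        hh = (total // (self.fps * 3600)) % 24
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A clip whose first frame is labeled 01:15:47:11 at 24 fps:
track = TimecodeTrack(start_frames=(1 * 3600 + 15 * 60 + 47) * 24 + 11, fps=24)
track.tc_for_frame(0)    # "01:15:47:11"
track.tc_for_frame(13)   # "01:15:48:00"
```

Storing only a start value keeps the file compact, and it is also why software has to do a little arithmetic before it can show a timecode for an arbitrary frame.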

Since timecode is part of the clip metadata, it needs to be extracted by software in order to be displayed to users. Every NLE (non-linear editing) system and most software media players can do that and offer a timecode display that is updated for every frame during playback. But if we want to make sure that we don’t lose the timecode – for example, when editing together different clips – the original timecode of each clip is sometimes “burnt” into the image as a text overlay at some stage in the process.

Using Timecode for Offline Editing and Conforming

In the simple case where a camera’s material is directly edited and then exported as the final delivery (e.g. when editing a smartphone’s recorded video directly on the smartphone), there is no explicit technical need to deal with timecode. That’s different when the material that’s used for “offline” editing (e.g. compressed, low resolution, and low bit-depth files) is not the same as the material that’s used for “online” finishing (e.g. original camera files, OCF). 

In order to recreate the edit during finishing (i.e. during the online process), the original camera material needs to be referenced on a frame-by-frame basis by the edit definition from the offline edit – often in the form of an “EDL” (edit decision list). This “conform” process has its origin in workflows where a video tape holds the “online” material and compressed media files are the “offline” material[1].

In such workflows, the transfer of the edit from the offline material (compressed files) to the online material (original tape) is made possible by assigning the timecode of the online material to the offline material – and then using the corresponding timecodes in both materials to find the right frames for onlining.
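
As an illustration, here is a sketch of reading one CMX3600-style EDL event; the tape name and timecode values are made up for the example:

```python
import re

# One event: number, tape, track (V), cut (C), source in/out, record in/out.
EVENT = re.compile(
    r"(\d+)\s+(\S+)\s+V\s+C\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def tc_to_frames(tc, fps=24):
    """Convert an HH:MM:SS:FF label to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

line = "001  A001R2EK  V  C  01:15:47:11 01:15:49:11 00:00:10:00 00:00:12:00"
num, tape, src_in, src_out, rec_in, rec_out = EVENT.match(line).groups()
# The event asks for two seconds from tape "A001R2EK", starting at 01:15:47:11:
tc_to_frames(src_out) - tc_to_frames(src_in)   # 48 frames at 24 fps
```

The conforming system repeats this lookup for every event: find the source material labeled with that tape name, then seek to the source-in timecode.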

An extension of that process is used in the digital intermediate (DI) workflow. This process uses additional “KeyKode” numbers for identifying individual frames on film (i.e. the “online” material in that case). When digitizing the film material for editing (e.g. with a telecine), the video files get a timecode that is stored together with the associated KeyKode. When the edit is finished, the timecode sequence is exported from the editing system as an EDL.

With the stored KeyKodes, the EDL can be translated into a KeyKode sequence (the “pull list”). The frames of film with these KeyKodes are then scanned in high quality for “onlining” (i.e. finishing in a color correction system). For that, the corresponding timecode for each film frame is embedded into the files during scanning. The “online system” (i.e. the color correction software) can then assemble the timeline imported from the EDL with the scanned film files.

Today’s digital cinematography workflows basically use a simplified version of the DI process: instead of digitizing film for editing, the OCF files are transcoded to compressed, low-resolution “dailies” files for editing. Since digital film cameras automatically embed timecode in the OCF, this timecode is also used for the dailies, and no translation of timecodes is necessary. When bringing the OCF material back into a color grading system for finishing, either EDLs or the more capable Advanced Authoring Format (AAF) files are used, and the finishing system finds the correct OCF clips by the timecodes from these metadata files.

Synchronizing Timecode on Set

On professional film sets, image and audio are captured independently, and timecode is used to synchronize video and audio for editing. The goal is that all devices that record something on the film set (the audio recorder, and possibly multiple cameras) use the same, synchronized timecode during recording. This keeps all recording devices in sync with frame accuracy for the whole day, so that the dailies creation process is as smooth as possible.

For that, timecode generator devices are used, which provide highly accurate and reliable timecode over a shooting day. The crew either “jam syncs” the devices once or several times per day, re-setting the possibly drifted internal timecode of each device from the timecode generator, or the generators are attached permanently to the recording devices – which is the most accurate variant.

Audio recorders sometimes have their own high-quality timecode generators and are sometimes considered the “master” on set concerning timecode. But not all cameras offer a timecode (TC) input, especially more consumer-grade devices that might be used as secondary recording devices. These devices can still be included in a timecode-synced system by using LTC (linear timecode), a continuous timecode track encoded into an audio signal.

Timecode generators can usually also be configured to create an LTC signal that can be plugged directly into the microphone input of the camera. The timecode is then recorded as a standard audio track inside the camera. With the appropriate software, that LTC track can be converted back into timecode metadata, so that clips from consumer-grade cameras also end up in editing with synced timecode.
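
A sketch of that conversion step, assuming the audio signal has already been demodulated into the 80 bits of one LTC frame (a real decoder also has to handle the biphase-mark modulation, the sync word, and the user-bit and flag fields, which are skipped here):

```python
def decode_ltc_frame(bits):
    """Extract the timecode from an 80-bit LTC frame (SMPTE 12M layout).
    `bits` is a list of 80 ints (0/1); BCD fields are stored LSB first."""
    def field(start, width):
        return sum(bits[start + i] << i for i in range(width))
    ff = field(0, 4) + 10 * field(8, 2)    # frame units / frame tens
    ss = field(16, 4) + 10 * field(24, 3)  # seconds
    mm = field(32, 4) + 10 * field(40, 3)  # minutes
    hh = field(48, 4) + 10 * field(56, 2)  # hours
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def encode_ltc_fields(hh, mm, ss, ff):
    """Helper for the demo: fills only the timecode fields of a frame."""
    bits = [0] * 80
    for start, width, value in (
        (0, 4, ff % 10), (8, 2, ff // 10),
        (16, 4, ss % 10), (24, 3, ss // 10),
        (32, 4, mm % 10), (40, 3, mm // 10),
        (48, 4, hh % 10), (56, 2, hh // 10),
    ):
        for i in range(width):
            bits[start + i] = (value >> i) & 1
    return bits

decode_ltc_frame(encode_ltc_fields(1, 15, 47, 11))   # "01:15:47:11"
```

One such 80-bit frame is transmitted per video frame, which is what makes LTC usable as a continuous audio-rate sync signal.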

Tape Name

We skipped one important limitation so far: the finite number of digits in a timecode limits the number of individual frames that timecode can distinguish. Depending on the timecode format, the highest timecode value is either one frame before 24 hours or one frame before 100 hours. Taking the former as the “safe” limit, you can only distinguish 24 hours of material. That might not sound too bad at first, but in practice you very quickly run into situations where you end up with duplicate timecodes within one production:

Cameras can assign timecodes to recorded material in “record run” (only counting up timecode while recording) or “free run” (constantly counting up timecode, like a clock, even when the camera is not running). The only method that works well when syncing multiple devices with the aforementioned timecode generators is using “free run” on all devices. So although you are not recording a full 24 hours of material each day, after just a few days of shooting you might run into clips from different shooting days sharing the same timecode.
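
A small sketch of the wrap-around, assuming 24 fps and a continuously running free-run timecode:

```python
fps = 24
labels_per_day = 24 * 60 * 60 * fps   # 2,073,600 distinct labels at 24 fps

# A frame shot at free-run timecode 14:30:00:00 on day 1 and one shot at
# 14:30:00:00 on day 2 carry exactly the same label:
frame_day1 = (14 * 3600 + 30 * 60) * fps
frame_day2 = frame_day1 + labels_per_day   # one full day later
assert frame_day1 % labels_per_day == frame_day2 % labels_per_day
```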

And if you want to distinguish, in editing, the frames of different cameras that shot multiple angles of the same take at the same time (purposely with the same timecode), you need an additional distinguishing property anyway to identify the right camera.

Long story short: timecode is not at all capable of distinguishing all the captured frames of a production. Nor was it designed to do that – it was only required to distinguish the frames on one videotape. So for identifying a frame in a stack of tapes, you actually needed two things: the tape name (a unique label per tape) and the timecode of the chosen frame on that tape.

This duo of timecode and tape name as a means of identifying frames is still present in most of today’s conform workflows. The convention is to use either the clip name or the name of the camera card as the identifying “tape” name. For both options it is of course indispensable that these names are unique, in order to enable unambiguous distinction of clips and frames. Usually, this tape name is written as a metadata tag into the dailies clip file along with the timecode, to be carried throughout the entire editing process. Something similar happens with audio, where instead of a “tape name” the “sound roll” is used for additional identification. The sound roll is often named after the recording medium that holds the audio files.
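
The composite key can be sketched like this (clip names, tape names, and timecodes are hypothetical):

```python
def tc_to_frames(tc, fps=24):
    """Convert an HH:MM:SS:FF label to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Two clips from different shooting days share the same timecode; only the
# pair of (tape name, timecode) identifies a frame unambiguously:
clips = {
    ("A001R2EK", tc_to_frames("14:30:00:00")): "A001C003_DAY1.mov",
    ("A003R2EK", tc_to_frames("14:30:00:00")): "A003C007_DAY2.mov",
}
clips[("A003R2EK", tc_to_frames("14:30:00:00"))]   # "A003C007_DAY2.mov"
```

A conforming system effectively performs this kind of two-part lookup for every event in the EDL or AAF, which is why unique tape names are non-negotiable.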

The chosen convention for the tape name (sometimes also called “reel name”) is an important thing to agree on explicitly and in detail when creating dailies, because it needs to be supported by both the editing system and the conforming system. The only way to fix a conform process gone wrong is to hand-pick, one by one, the right OCF clips with a given timecode (of which there can be many) in the mastering system and check them visually against the rendered timeline from the editing system – which can be a huge amount of work.


👀 This overview of the creation and use of timecode in film productions just scratches the surface of all there is to know about timecode. There are many details to learn and consider in the different corners of a workflow – such as using timecode generators, setting up timecode and frame rates in cameras, syncing audio, configuring the timecode and tape name settings of dailies software, handling drop-frame rates, exporting metadata for dailies (for example, via ALE), and performing a conform process.

Knowing how these activities interlock and depend on each other is good background knowledge to have when dealing with these topics on set. Once again we see how the “post-production process” in some way starts with the settings of cameras and audio recorders and the naming of files and folders on set. Getting that right saves a lot of headaches after shooting.

[1] That was a typical process for tape-based workflows: offline editing was done in early file-based editing systems, and online editing was done by assembling the edit via tape-to-tape copies.

About the Author
Patrick is head of products for Pomfort’s on-set applications. He combines a technical background in software engineering with practical experience in digital film productions – and a preference for things that actually work.
Pomfort Basics

Articles with this tag cover basic information on Pomfort’s software applications, related workflows and technical contexts - perfect if you’re just starting out!