In every film production, thousands of media files are created while shooting. These assets and their respective metadata (like slate information, camera metadata, look metadata, etc.) need to be managed and processed along the production pipeline. Besides the physical data handling, it’s also a challenge to make the collected clip and look data accessible in a well-structured way.
That being said, it’s important to acknowledge that people have very different ideas about what “well-structured” means – depending on their role, workflow requirements, or personal preferences. Hence, it’s almost impossible to create one structure that fits all.
So, what are your options here? Typically, clip and look data generated on set gets grouped chronologically, as in one folder per shooting day. Obviously, this approach is of limited use to other involved parties, for example, in post-production. That’s why ShotHub offers you alternatives: Grouping clip and look data into smart groups allows you to create custom and meaningful overviews (instead of simply inheriting the folder structure from set). That way, you make it easier for others to retrieve the information they need.
Different navigation approaches with smart groups
As mentioned, the preferred way to navigate a clip or look library varies widely depending on the context and a person’s role. So, let’s look at a few approaches people can take with smart groups:
- Scene-Based: Every movie or episode already has a structure derived from the script, so it seems natural to group libraries by scenes, shots, and takes. With smart groups, clips are grouped by scene, taking prefixes and suffixes into account so that the original order of the script is preserved. This sorting allows, for example, the editing team to browse a clip library “naturally” by scene and in the sequential order of the script.
- Organizational: Others benefit from a chronological structure based on shooting days, then grouped by location, crew unit, or camera. Such an outline is especially valuable for organizing and processing daily tasks such as creating dailies or generating reports per shooting day, location, or crew unit.
- Marked Items/Quick Find: When labels, ratings, or flags are used consistently while collecting data on set, tagged clips or looks in the library can be grouped by these markers to single them out for further processing. For example, in a clip library, all clips that require VFX work can be marked with a “VFX” label. Or a flag can mark shots with look information as a reference for a scene or location.
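To illustrate the scene-based ordering described above, here is a minimal Python sketch, not ShotHub’s actual algorithm, of sorting scene names by their numeric core so that prefixed and suffixed variants (like “X101” or “101A”) stay grouped with their base scene instead of being scattered by plain alphanumeric sorting:

```python
import re

def scene_sort_key(name):
    """Split a scene name into (prefix, number, suffix) so variants
    sort next to their base scene. A hypothetical sketch of
    prefix/suffix-aware ordering, not ShotHub's implementation."""
    match = re.match(r"([A-Za-z]*)(\d+)([A-Za-z]*)", name)
    if not match:
        # Names without a numeric core sort last, alphabetically.
        return (float("inf"), "", name)
    prefix, number, suffix = match.groups()
    # Sort primarily by the numeric core, then suffix, then prefix,
    # so related scenes stay together.
    return (int(number), suffix, prefix)

scenes = ["101A", "X101", "99", "101", "101B"]
print(sorted(scenes, key=scene_sort_key))
# → ['99', '101', 'X101', '101A', '101B']
```

Plain alphanumeric sorting would have pushed “X101” far away from “101”; the numeric-core key keeps all related scenes adjacent.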
Alright, now that we’ve seen what smart groups help you achieve, let’s take a look at how they actually work.
How do smart groups work?
Smart groups in ShotHub take advantage of the metadata that “naturally” comes with elements such as clip and look libraries. This metadata is used to auto-generate new groups by building a new outline, e.g., sorted by “Scene, Shot, Take”. The grouping works hierarchically, which means each group can have sub-groups, so a new outline evolves independently of the original structure of the project. There are several predefined smart group layouts to choose from, and every layout allows navigating through different levels of the smart group hierarchy. For example, using “Scene” as a grouping criterion is helpful for a feature film project, but for a series with multiple episodes (where the same scene names appear in several episodes), sorting the library by “Season”, “Episode”, and “Scene” makes a difference when navigating through the library.
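The hierarchical grouping idea can be sketched in a few lines of Python. This is a hypothetical illustration only; the field names and data shapes are assumptions, not ShotHub’s actual data model:

```python
from collections import defaultdict

def smart_group(clips, keys):
    """Recursively group clip records by a list of metadata keys,
    returning a nested dict whose leaves are lists of clips.
    Illustrative sketch, not ShotHub's implementation."""
    if not keys:
        return clips
    head, *rest = keys
    groups = defaultdict(list)
    for clip in clips:
        groups[clip.get(head, "unknown")].append(clip)
    # Each group is grouped again by the remaining keys (sub-groups).
    return {value: smart_group(members, rest) for value, members in groups.items()}

clips = [
    {"scene": "12", "shot": "A", "take": 1},
    {"scene": "12", "shot": "A", "take": 2},
    {"scene": "12", "shot": "B", "take": 1},
    {"scene": "7",  "shot": "A", "take": 1},
]
outline = smart_group(clips, ["scene", "shot"])
# outline["12"]["A"] now holds both takes of scene 12, shot A.
```

Swapping the key list (e.g., `["season", "episode", "scene"]` for a series) yields a completely different outline from the same underlying metadata, which is the core of the smart group idea.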
The smart group view in ShotHub is available for all projects, whether it’s a clip library synced from Silverstack, Silverstack XT or Silverstack Lab or a shot library with looks uploaded from Livegrade Pro or Studio. Customizing the project view by smart groups can replace the on-set folder structure or even be used as a secondary filter to group shots from one folder or bin. Now that we have some technical understanding of how smart groups work, let’s take a look at how to implement them in your workflows.
How to use smart groups?
Browsing through a cloud project using a smart group layout is helpful to answer a variety of daily questions like “Which scene was shot in a specific location?”, “Which camera was used on that day?”, or “Are there more takes for that scene than the ones I got transcoded files for?”. Additionally, smart groups can streamline workflows and make communication more efficient, for example, in the following scenarios:
In order to provide a list of flagged (circled) takes from a clip library to the editing department, the smart group layout “Flagged” lets you navigate through the scenes (as sub-groups) and lists all flagged shots in the clips table. These lists of flagged clips can then be exported as PDFs per scene (a so-called Clips Report) from ShotHub and sent to the editor or the assistant editor. And if such a smart group view is combined with a custom table layout, the Clips Report (PDF) will contain a customized set of metadata (e.g., labels, tags, comments, or cue points).
Another even more flexible way to share information is to invite somebody from the editing team to the cloud project to look up the data and create reports with selected metadata themselves. An invitation to the cloud project in ShotHub may also contain a smart group layout and a custom table layout so that the invitees start with a meaningful view which they can refine anytime by choosing another smart group or column layout.
Since retrieving marked clips via smart groups is simple, handing over camera and lens metadata to the VFX department becomes easy as well. If, for example, all clips intended for VFX work are labeled, the smart group layout “Label” makes it easy to find all VFX shots per scene. Like the PDF reports mentioned above, ShotHub can generate CSV files according to that list of shots. The CSV export also considers custom table view layouts so that everybody can select the relevant camera and lens metadata. In addition, dynamic lens metadata can be exported in a CSV file per clip so that all metadata collected on set is preserved for the VFX department.
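The combination of label filtering and a custom column selection can be sketched as follows. This is a hedged illustration of the general idea; the field names, label values, and columns are made up for the example, not ShotHub’s schema:

```python
import csv
import io

# Hypothetical clip records; fields are illustrative only.
clips = [
    {"name": "A001C003", "scene": "12", "label": "VFX", "lens": "32mm"},
    {"name": "A001C004", "scene": "12", "label": "",    "lens": "50mm"},
    {"name": "A002C001", "scene": "14", "label": "VFX", "lens": "85mm"},
]

# Keep only labeled VFX shots and a custom column selection,
# mirroring a label-based smart group combined with a table layout.
columns = ["name", "scene", "lens"]
vfx_shots = [clip for clip in clips if clip["label"] == "VFX"]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=columns, extrasaction="ignore")
writer.writeheader()
writer.writerows(vfx_shots)
print(buffer.getvalue())  # header row plus one row per labeled VFX shot
```

Changing `columns` corresponds to choosing a different table layout, and changing the filter corresponds to picking a different marker or smart group criterion.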
A similar use case is sharing a set of CDLs with the dailies lab. Here, the smart group layout “Location” might be handy, as the look library is hierarchically organized by location, shooting day, and scene. As a result, a list of looks per location, shooting day, or scene is displayed, and the respective ASC-CDLs can be generated. Alternatively, those reference shots could be labeled in the Livegrade project so that the smart group layout “Label” enables others to retrieve the reference shots in the look library in ShotHub. Finally, a smart group layout is sharable with the project invitation for look libraries in the same way as for clip libraries so that the invitees immediately find what they are looking for.
Finally, there are many more options to structure the library with one of the different smart group layouts. If used in combination with a custom metadata column selection and sorting or refined by a visual search for thumbnails or the metadata search option, every production and team can create a library view according to their requirements and preferences.
In this article, we showed how the preferred way of navigating a clip and shot library mainly depends on the respective context. Different users in different situations require differently structured overviews (e.g., scene-based, chronological grouping, or customized markers). And the more people are involved in a production, the more ways to access a project are needed. With the smart group layouts, you can create custom structures for your clip and shot libraries, ignoring the original folder structure from set and considering personal preferences or workflow requirements instead. Smart group views can even be shared with others via invitations, and the smart group filtering is also applied to exports generated from ShotHub. That way, using smart groups opens up new possibilities to make valuable data collected on set available to different departments across the production process.
Here’s a production example of a collection of scenes that have strong relations but wouldn’t sort well with other scene names in regular alphanumeric sorting: “101”, “X101”, “101A”, “101B”.
The feature for exporting dynamic metadata per clip as a CSV file is not released yet but will be coming soon.