
Feature Turnover Guide – VFX


Managing VFX is a daunting subject and a big task, even for smaller films. It’s so complicated that films that can afford it will hire a separate VFX Editor just to keep track of the film’s VFX and to create temp comps as placeholders until the VFX come in. I’ve been putting off writing this article for years now because the thought of trying to encapsulate it all in a generically useful way was a bit overwhelming, but here goes…

The Job of A VFX Editor

On bigger films, there is at least one VFX Editor, and often there are two or three. On smaller films, the kind with one editor and one assistant, that assistant editor handles all the usual AE duties plus all the VFX Editor tasks. If you’ve never done it before, it’s a steep learning curve. VFX Editors are responsible for:

  • Creating temp comps for the editor to use while cutting, before any VFX vendors start working
  • Tracking every shot in the timeline that needs any kind of VFX, and giving it a shot ID
  • Tracking changes to the cut that change what VFX are needed. This includes knowing when a shot is cut out, when a new shot is added, when a shot is slipped far enough that your handles no longer cover it, or when sync is changed within a shot in a way that affects the sound or the VFX work that needs to be done.
  • Creating “count sheets” (sometimes called lineup sheets) that detail each shot, what work is needed, what elements the VFX vendor needs to complete the shot, and the timing of each element in relation to the overall shot.
  • Creating EDLs or Pull Lists for a Post facility to scan/render DPX frames of your plates and elements to hand over to the VFX vendor. If you don’t have a Post facility, you might be in charge of rendering out DPX frames yourself using the raw camera footage and software like DaVinci Resolve.
  • Receiving versions of all the shots from your VFX vendor, cutting them into the timeline, reviewing each version with the editor and director, then taking the notes from those review sessions and communicating them back to the vendor.
  • As shots are being finaled, assuring that the finished shots are delivered in the appropriate format to the DI facility, and that the right version of the shot was delivered.
  • Constantly checking and re-checking all of the above, because things always fall through the cracks.

The VFX Database

One essential tool that VFX Editors need in order to do their work is an effective way to manage all of this information. This usually comes in the form of a VFX Database, and most VFX Editors bring their own to each new job they start. Most of the time the database is made with FileMaker Pro, but some VFX Editors have custom solutions, and if your needs are minimal you can get away with using a simple spreadsheet. There is no standard VFX database out there, though I’ve seen at least one that you can pay for if you don’t want to get into designing a FileMaker database on your own. Many VFX Editors are protective of their databases, which is understandable given the hours of customization they’ve put into creating them, so if you need a database and find someone with a good one who’s willing to share, consider yourself lucky.

Marty Kloner's VFX Database for Star Trek Into Darkness


If you do borrow someone else’s database, one thing to consider is whether you like their workflow and are willing to emulate it. One of the reasons that these databases are so customized is that everyone has their own ideas on how to do each part. Are you someone who likes to enter all the shot information manually, or do you create subclips in a bin and then export a tab-delimited file to import into your database? Do you need a thumbnail for all your elements in addition to the shot itself, and if so do you need just one thumbnail or a heads & tails set to confirm start and end frames? And how do you name your shots? With a two letter sequence prefix and a padded four digit number, or do you break it up by scene first?

The answers to all of those questions help determine the needs of your database, so if you inherit someone else’s be prepared to do what they do, because if you want to go a different way you’ll find yourself very frustrated.

Also, it’s important to note that many VFX vendors will now give you access to their internal tracking systems. This is great, and can be a useful way to communicate, but never rely on the vendor’s database in lieu of your own. That’s a guaranteed way to have things fall through the cracks. You must always keep your own list of shots and what their statuses are.

The Basics

So assuming you’ve got an idea of how you’re going to track your shots, let’s go over the details of what information you need to include.

VFX Shots

A record of every shot is the basis for everything you’re going to do from here on out. The most important information to track is:

  • the Shot ID, which you are responsible for creating. A common naming system is to come up with two-letter abbreviations for all the VFX sequences in the film and then, starting from 0010, number your shots in increments of 10. So if your sequence is called “Things Explode”, you would start out with shots IDed as TE0010, TE0020, TE0030, etc. This pattern is easy to communicate verbally, easy to type (no need, for example, to hit Shift for an underscore separator), and allows you to maintain a rough chronological order if you need to insert a new shot after an existing one. It is also not dependent on the scene number where a VFX shot is located, which some people like to include as part of the Shot ID, but which I think is an irrelevant piece of information for VFX purposes.
  • the duration of the shot. This can be either the duration in cut or the total duration turned over for work, or both. Whatever’s more useful for you.
  • the shot’s handles. Handles refers to extra frames you’re asking the vendor to include beyond just what’s currently in the cut. It’s common that you’ll receive a shot back and want to add a few frames to the head or tail. If you only turned over the footage that was in your cut at the time, you wouldn’t be able to trim the shot. But if you have 8-frame handles, for example, that’s 16 extra frames you’ll get back that you can use in the cut if you need to.
  • the description of the shot. This is where you tell your VFX vendor what exactly you want them to do (and hope that they read it). Even if it’s really obvious. Do they need to key out the greenscreen and add laser beams coming from a cat’s eyes from frames 39-47? If so, write it down in the description.
  • the status of the shot. Keep your own list of what shots are In Progress, On Hold, Omitted, Final, and CBB (meaning “could be better”). Don’t rely on your vendor’s list, but do crosscheck your list with your vendor’s at regular intervals to be sure you’re on the same page with what work is left to do.
  • the vendor. You might have more than one vendor working for you. Make sure you track which shots go to which vendor.
  • the turnover date. It’s useful to know what date you turned a shot over to be worked on. If you name your turnover batches, note that down too.
  • the final version and date. When you’re nearing the end of your film, you will want to check that the vendor delivered the right version of each shot. Keeping a record in your database of what version was finaled and when will allow you to make sure you’ve got the right files in your DI. If you find that your vendor has delivered a newer version of a shot than what you noted down, be sure to ask them about it. It might just be a tech fix (something small they noticed and fixed without needing client review), but best to be sure.
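If it helps to see the numbering scheme in code, here’s a quick sketch of it in Python. The function names are my own, for illustration only, not any standard tool:

```python
# A quick sketch of the two-letter-prefix, increments-of-10 scheme described
# above. Function names are my own, for illustration only.

def make_shot_ids(prefix, count, start=10, step=10):
    """Generate shot IDs like TE0010, TE0020, ... for one sequence."""
    return [f"{prefix}{n:04d}" for n in range(start, start + count * step, step)]

def insert_between(a, b):
    """Pick an ID for a new shot added between two existing shots.
    Returns None if there are no unused numbers left between them."""
    prefix, lo, hi = a[:2], int(a[2:]), int(b[2:])
    mid = (lo + hi) // 2
    return f"{prefix}{mid:04d}" if lo < mid < hi else None

print(make_shot_ids("TE", 3))              # ['TE0010', 'TE0020', 'TE0030']
print(insert_between("TE0010", "TE0020"))  # TE0015
```

The increments of 10 are exactly what buys you that `insert_between` move: a new shot added after TE0010 can become TE0015 without renumbering anything else.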

Screenshot of Marty Kloner’s VFX database, showing a shot list for Star Trek Into Darkness


Every VFX Shot requires at least one Element. An element is a piece of footage required to complete the shot. If your VFX needs are not complicated, many of your shots will have only one element. For example, if you’re removing a scar from an actor’s face, you only need to hand over the shot that’s in the cut. If you’ve got complicated shots, then you might have a background plate and multiple foreground elements. For example, a screen replacement is a 2-element shot. You have the shot in the cut that has a TV in it, and you have the content you want to be inserted into the TV. Both of those elements would need to be handed over to your VFX vendor, along with information on how the TV content should be lined up with the background plate.

Important information to track for Elements is:

  • the element name and version. This can be as simple as taking your shot ID and adding a suffix to it. So if the shot is TE0010, your element might be called TE0010_bg1_v1. And another could be TE0010_fg_smoke1_v1. Have a conversation with your vendor to determine if they have a particular preference for element naming. The version number is useful in case you extend a shot beyond its handles. Then you would have to deliver a new element at the extended length, and you would increment your element version to v2.
  • the tape and timecode of the element. This should be pretty obvious. You and your vendor both need to know which parts of each element you’re actually using. You need this so you can generate DPX files, and they need it so they can line up the elements correctly. If you have a post facility making DPX files for you, you might not get a chance to check that the DPX elements are right before they go off to the vendor, but if you have a record of what material was supposed to be turned over, you can start to troubleshoot.
  • the handles you’re including. Element handles often mirror the shot’s overall handles, but sometimes you might need to customize them.
  • turnover dates and scan orders. It is common, especially if you have to scan film or go through a post house for DPX files, to turn over the elements for a lot of shots at once in one lump EDL or Pull List. It helps to keep track of when these batches were sent and what the name of the batch was.
  • speed information. If there are any speed effects on your elements, note that in the element description. If it’s a fancier timewarp effect, you might also include a screenshot of the graph and note which frames have keyframes and what their speed % is.
  • You should be prepared to locate lens and focal distance data for a particular piece of footage if requested. This can usually be found on the original camera reports, or sometimes in the notes of an on-set vfx supervisor if there was one.
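To make the handles math concrete, here’s a hedged Python sketch of working out an element’s turnover range from its in/out in the cut. The 24fps default and the helper names are my own assumptions, not any standard:

```python
# Hedged sketch of the element bookkeeping above: given a shot's in/out in
# the cut and its handles, compute the range to request. The 24fps default
# and the helper names are my assumptions, not any standard.

FPS = 24

def tc_to_frames(tc, fps=FPS):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_tc(frames, fps=FPS):
    f, s = frames % fps, frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

def element_range(cut_in_tc, cut_out_tc, handles=8, fps=FPS):
    """Extend the cut range by `handles` frames on each side."""
    start = tc_to_frames(cut_in_tc, fps) - handles
    end = tc_to_frames(cut_out_tc, fps) + handles
    return frames_to_tc(start, fps), frames_to_tc(end, fps), end - start

# A 2-second shot with 8-frame handles turns over 64 frames total:
print(element_range("01:00:10:00", "01:00:12:00"))
```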

Received Versions

Keep track of every Quicktime you get back from your vendors, the date you received it, and any notes from the director or editor on fixes that need to be made. As mentioned above, also note when a version becomes final.

Avid timeline from John Wick 2

Final John Wick: Chapter 2 timeline with dailies on V1 and final VFX on V2. VFX Editor: Kim Huston

What Your Editor Needs From You

Every film is a fight against entropy, but there are some steps you can take to make life easier for yourself and the Editor.

  • You need a fast way to navigate to every shot in your timeline, so use timeline clip notes (as of Media Composer 8.8) or put locators in the center of every VFX shot on the timeline. Preferably, put the locator on the plate/dailies. Put the Shot’s ID as the locator text. When I cut, I like to keep my dailies on V1. When I get a version of the shot back from the vendor, that goes on V2. Adjust as necessary if you need to use multiple tracks for a temp comp. When I get a new version of the shot, unless there’s a compelling reason to stack them, I’ll overwrite the old version on V2 with the new one. So in this way your locator always stays in the timeline even as you get newer and newer versions of shots above it.
  • Check the cut every so often for changes, and do it more frequently the closer you get to the end of your schedule. Make sure every clip note or locator is still there. Since you can’t rely on the editor to always tell you when things have changed, reviewing the cut yourself will help you find shots that may have been cut or trimmed without your knowledge.
  • If a shot has been extended beyond the frames that were initially turned over to VFX, confirm with the editor before proceeding. If it’s only a frame or two beyond the handles, the editor might opt to cut those two frames in order to stay within the boundaries of the shot. If it’s been extended more than that, you’ll need to revise your element’s timing and resubmit a new version of it to the vendor.
  • Give clip colors to your shots. I like having one color for versions of shots in-progress, and another color for versions I’ve finaled. This makes it easy to see at a glance if there are any missing shots that have not yet been finaled.
  • Always check with your editor how they want to handle putting new versions of shots in the timeline. Do they want to cut them in themselves? Do they want you to cut them in on a new track and then leave it to them to drag down to a lower track, or do they want you to just cut it in normally and tell them which shots to look at? Any way is fine, as long as you are able to show or tell them what’s changed and needs to be reviewed. Never make a change to an editor’s timeline without their knowledge.
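Since checking the cut is a repetitive chore, this kind of audit is easy to script. Here’s a rough Python sketch that diffs the shot IDs in a tab-delimited locator export against your database; the column order here (user, timecode, track, color, comment) is an assumption, so check what your version of Media Composer actually exports:

```python
# A rough sketch of that audit: read a tab-delimited locator export from
# Media Composer, collect the shot IDs, and diff them against the database.
# The column order (user, timecode, track, color, comment) is an assumption
# -- check what your version of Avid actually exports.
import csv

def shot_ids_from_markers(path):
    ids = []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            if len(row) >= 5 and row[4].strip():
                ids.append(row[4].strip())  # the comment column holds the shot ID
    return ids

def missing_shots(db_ids, timeline_ids):
    """Shots in the database that no longer have a locator in the cut."""
    return sorted(set(db_ids) - set(timeline_ids))
```

Run the diff whenever you get a new cut; anything in `missing_shots` is a shot that may have been cut or trimmed without your knowledge.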


Turnovers

A turnover is the name for the package of information that you generate and give to your VFX vendors and DI facility so that the VFX vendors can get to work.

In the most basic format, a turnover involves:

  • Generating count sheets (example) and reference Quicktimes to give to your vendors
  • Generating a pull list that you give to the facility managing your raw footage so that they can render your elements into DPX files to be delivered to the vendor
  • Determining how to get those DPX files to the vendor. Bigger Post facilities will have their own file transfer software (Aspera, GlobalData, etc.), but on an indie level you may need to provide a solution like MASV Rush.
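For illustration, here’s a rough Python sketch of what generating one pull-list event looks like in the CMX3600 style. Treat it as a starting point only; real pull lists carry more information, and your post facility may want a specific flavor of EDL:

```python
# Generating one CMX3600-style pull-list event from an element record.
# Real pull lists carry more information (and comment lines); this is just
# the shape of a single event, with frame counts converted to timecode.

FPS = 24

def tc(total, fps=FPS):
    f, s = total % fps, total // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

def edl_event(num, tape, rec_start, src_in, src_out):
    dur = src_out - src_in
    return (f"{num:03d}  {tape:<8} V     C        "
            f"{tc(src_in)} {tc(src_out)} {tc(rec_start)} {tc(rec_start + dur)}")

print("TITLE: TE_TURNOVER_01")
print("FCM: NON-DROP FRAME")
print(edl_event(1, "A001C003", 0, 86632, 86696))
```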

What Your VFX Vendor Needs From You

  • The cut. Every shot needs to be viewed in the context of its surrounding shots, and it will help your VFX vendor tremendously to have a copy of the scene where the shots they’re working on will go. With it, they can check their own work and timing before wasting your time with a version that looks good in isolation but has an obvious problem in context. So when you’re first turning over shots for a scene, send them a Quicktime with the shot names burned in (along with your usual “Property Of…” security titles). I’ve written up a workflow for quickly creating these burn-ins using the locators on the timeline and the Avid SubCap tool. Check with your editor and your studio or post supervisor for any security requirements specific to your show before sending a cut sequence out.
  • Quicktime reference files. In addition to giving your vendor the full scene, you should send a reference for each individual vfx shot, including handles. If you’ve done a temp version of a shot then you should send that to your vendor as well. And some vendors will also ask for Quicktimes of every element you’re sending them to their full scan length.
  • Count Sheets (example count sheet for a timewarp from Hellboy 2). These PDFs (or occasionally CSVs) tell the vendor about every shot you’re requesting from them, what materials they will need to complete the shot, and where to find them. They detail any bit of information relevant to the artists working on the shot, such as speed effects, resizes, extensions, and elements that will come from secondary vendors.
  • Dailies LUTs may be requested so that the vendor can send Quicktime versions to you for approval that match the dailies color you’ve been editing with.
  • Communication. You should be in constant contact with your vendor about the status of all your shots, what versions of shots you should expect to receive week after week, and what notes you have to relay back to them so they can move on to the next version.

What You Need From Your VFX Vendor

  • On a regular basis you should receive Quicktimes of each shot, in the spec and codec of your offline edit (e.g. DNxHD 115), to put into your cut. These usually come any time there is a new version of a shot that you need to review, and should have their filename and a running frame count burned in on every frame, plus usually a 1-frame slate at the beginning with a few more details like the vendor, date, etc. This is all standard; your vendor will likely do it automatically.
  • When you’re ready to begin your DI, you should establish a workflow to get finished shots from your vendor to your DI facility. Sometimes the vendors will send them directly, and sometimes they’ll send the finished DPXs to you to check and relay to the DI.

Count sheet page from my Hellboy 2 Opticals database. VFX were handled separately by Ian Differ, but I handled the workflow for the hundreds of timewarps we used.


The DI

When you get to the finishing/DI part of the process, things can start getting lost easily. Your DI facility is often receiving shots from multiple vendors that have to match up exactly to the filenames listed in the EDLs that you’re providing them, and with so much data coming in all the time it’s common for mistakes to be made. Catch those mistakes as early as you can, but you should also get in the habit of asking for a VFX EDL from the DI timeline whenever they provide confidence check Quicktimes to Editorial. When you receive those, go through and make sure that each VFX version listed in the DI EDL matches up to the expected final version in your editor’s timeline. Use the confidence check Quicktimes as another means of visually making sure that all the shots look right and are correctly cut in. You may be duplicating some of this error checking work with the 1st Assistant Editor, but that’s okay. In this part of the process, you cannot be too careful. Errors that go unnoticed at this point can easily make it into the final deliverable, and obviously you don’t want to catch an error when you’re delivering the final DCP.
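That EDL cross-check is also easy to script. A minimal Python sketch, assuming clip names appear on “* FROM CLIP NAME:” comment lines, which is common but not universal:

```python
# Sketch of the DI cross-check: pull clip names out of the DI facility's EDL
# and diff them against the final versions recorded in your database. Assumes
# clip names appear on "* FROM CLIP NAME:" comment lines, which is common but
# not universal -- adjust to the flavor of EDL you actually receive.

def clips_in_edl(edl_text):
    names = []
    for line in edl_text.splitlines():
        if line.upper().startswith("* FROM CLIP NAME:"):
            names.append(line.split(":", 1)[1].strip())
    return names

def version_mismatches(db_finals, edl_clips):
    """db_finals maps shot ID -> expected final clip name. Returns the shots
    whose expected final never appears in the DI EDL."""
    present = set(edl_clips)
    return sorted(s for s, clip in db_finals.items() if clip not in present)
```

Anything `version_mismatches` returns is a shot to flag to the DI facility before the next confidence check.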

In this phase it is highly likely that a sound mix will be going on concurrently to the delivery of the last remaining VFX shots. It’s very helpful to the sound team if you keep an eye on any changes to the VFX that would affect what they’re doing. Like if the editor slips a shot that has a muzzle flash in it, your sound team will want to know that so they can adjust the sfx of the gunshot. It’s hard to keep track of everything that might affect sound, but just keep that in the back of your mind as you’re going through your normal duties.


I have not gotten very specific on a step-by-step workflow in this post because it is honestly different for everyone. Create a workflow that works for you, your team, and the specifics of your project. As long as the right information is getting relayed in a timely manner to your vendor, DI facility, VFX Producer & Post Supervisor, then you’re doing fine. Good luck!


Quick and Easy Dialogue Cleanup with RTAS


On Star Trek Into Darkness I had the opportunity to break out of my usual Assistant Editor responsibilities and tackle a new experiment in temp sound editing. Will Files, Matt Evans, Robby Stambler and I formed a new mini-department within Editorial that was tasked with temping out the Editors’ sequences and mixing them in 5.1. There’s a lot to the process that is new and interesting, and I hope to get another post up soon that more fully fleshes it all out, but for the moment all I want to talk about is a method for basic, global dialogue cleanup that is probably old hat to some (and par for the course for professional sound mixers), but was new and amazing to me.

This tip comes courtesy of Will Files, who as a loan-out from Skywalker Sound was the guy who guided this whole process on Trek and helped teach me, Matt, and Robby the ropes of the sound world.

RTAS Is Your Friend

Before this show, I didn’t really know what RTAS was useful for, much less how awesome it really is. It allows you to use many of the AudioSuite plugins that you would normally apply to a clip, and apply them to an entire track instead, without rendering (thus the RT in Real-Time AudioSuite). Up to five RTAS plugins can be chained together per track. When applied to dialogue tracks, you can chain together 3 RTAS plugins that will make your dialogue much more understandable and leave more room in other frequencies for your sound effects and music.

So, to get started, you have to show the expanded audio controls in your timeline, and make your track size big enough that you see the little RTAS boxes:


You can see that I have an EQ, a Compressor, and a De-Esser, in that order, on my dialogue tracks.  Let’s go through them:

1) EQ

The EQ you add here is basically a band pass with a little customization. Everything below 60 Hz is gradually stripped away, as well as everything above 12 kHz. This is because your typical dialogue won’t produce any audio in those frequencies that you want to keep, but by throwing it away you can start to address issues of boominess, high-frequency hiss, and other technical problems with your production audio that get in the way of understanding the dialogue.

Aside from the band pass, this EQ also lowers frequencies around 120 Hz by 2 dB, and raises frequencies around 4 kHz by 2 dB. Again, this helps with boosting the frequencies of your dialogue that are most useful for comprehension, and removing frequencies that tend to get in the way, but without being as blunt as the band pass since these are frequencies you do want to hear.


2) Compressor

Now that you’ve removed unwanted frequencies, it’s time to normalize the volume. For that you use a Compressor, which will actively limit how loud your dialogue can get. If it gets too loud and crosses our set threshold, the Compressor will bring it back in line. The more the volume goes past the threshold, the more it will be reined in. This helps make sure there are no loud surprises in your dialogue, and will save you some of the hassle of mixing loud clips down to a more comfortable listening level.

In this case, we’ve modified three of the settings from their default states:

  1. Knee = 6.0 dB.   This adds a little curve right at the threshold point, so that the ratio of a loud input level to its compressed output level is approached more smoothly. Without it, the compression would switch on at full force when the volume crosses the threshold. For a better explanation of this, read this article.
  2. Threshold = -20 dB.  By moving this up 4 dB from the default -24 dB, we’ve allowed our audio to be a bit louder before it activates the compressor.
  3. Gain = 4 dB.  This knob controls the output level of all audio passing through the Compressor, even audio that is below the threshold line. Since compression only reduces volume and can leave your dialogue levels feeling too low, adding a bit of make-up gain can help keep it at a good baseline.
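For the curious, those settings map onto the standard soft-knee compressor transfer curve. Here’s a Python sketch of the math, with levels in dBFS; note that the 3:1 ratio is an assumption on my part, since the ratio setting isn’t listed above:

```python
# The Threshold/Knee/Gain settings above, plugged into the standard
# soft-knee compressor transfer curve (levels in dBFS). The 3:1 ratio is an
# assumption -- the ratio setting isn't listed above.

def compress_db(level, threshold=-20.0, ratio=3.0, knee=6.0, makeup=4.0):
    if level <= threshold - knee / 2:        # well below threshold: untouched
        out = level
    elif level >= threshold + knee / 2:      # well above threshold: full ratio
        out = threshold + (level - threshold) / ratio
    else:                                    # inside the knee: ease in smoothly
        x = level - threshold + knee / 2
        out = level + (1 / ratio - 1) * x * x / (2 * knee)
    return out + makeup                      # make-up gain on everything

print(compress_db(-30.0))  # quiet dialogue just gets the +4 dB make-up: -26.0
print(compress_db(-8.0))   # hot dialogue is pulled back toward threshold: -12.0
```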


3) De-Esser

This one does exactly what its name implies, and helps with any S sounds in your dialogue that can be particularly piercing to listen to. It’s basically another type of compressor that handles high frequencies instead of high decibel levels.  On this we’ve set:

  1. Frequency = 5.4 kHz.  This means the De-Esser will be triggered by frequencies above 5.4 kHz.
  2. Range = -3.0 dB.   When the De-Esser is triggered, it will reduce the gain of the signal by up to 3 dB, which should help reduce the effects of any piercing audio.


As Quick as A-B-C

For those short on time, I’ve attached an Avid bin called Dialogue RTAS Effects.avb to this article which contains these three presets. They are labeled A, B, and C and should be applied to your RTAS chain in that order.

Tip: To quickly copy RTAS effects from one track to another, hold down Option and drag the effects you want to copy from one track’s RTAS chain to another.




Thoughts from NAB 2012


So after spending a day and a half on the floor of NAB 2012 (and a fun night at Media Motion Ball!), here are some of the thoughts I had and things I’m excited about after talking to various companies on the exhibition floor.

ATTO Technology Thunderbolt-to-10Gb Ethernet (link)

I asked the ATTO guys whether anyone had used one of these to connect a laptop or iMac to a Unity, and they said that they were so new there weren’t enough units available to send out for certification to companies like Avid. In theory it should work, and Avid is at the top of the list to receive a test unit, so hopefully we’ll see some results on that either from them or someone else who just gives it a go to see what happens. These would be great in scenarios where you just quickly want to connect a temporary system to your Unity, like for giving access to a trailer editor so they can pull selects from your dailies without taking the time or system off an assistant editor.

Amazon S3 Secure File Delivery (with or without Aspera)

So I wandered into the Aspera booth since independent-level secure file delivery is something I’ve long been interested in solving in a cheap way. Aspera is not cheap, and all of my experience with it has been with big studios or facilities that can afford it, but I saw something about Aspera linked to Amazon S3 and wanted to learn more. Having “Freelance” on my NAB badge made sure that no one from Aspera was interested in talking to me, but the representative from Amazon there was very nice and I chatted him up for a minute.

The rub is this: Aspera is offering so-called “On Demand” service, whereby you use their FASP transfer protocol to get your files quickly up to S3. You then get charged by Amazon for the bandwidth of whoever downloads that file, as well as for the use of the Aspera software. I was hopeful that something called On Demand would be more affordable for indies and people who still need to send very large, secure deliveries but don’t have the money or server infrastructure to have an enterprise-level solution at their disposal. Predictably, this is not the case. In fact, I’m not even sure why they’re calling it On Demand, since they want to charge you a monthly subscription fee of $750.

The upside, though, is that by Aspera effectively ignoring me and Amazon giving me all the time I wanted, I learned a lot about S3 that I didn’t know before. Most importantly, the Amazon guy pointed me in the direction of access control, which eventually led me to a page in the Amazon S3 docs titled Signing and Authenticating S3 REST Requests. It’s a mouthful, but down near the bottom what it says is that you can use what they call Query String Authentication to send an expiring link to a private file on S3. With some work in PHP one could pretty easily create an app to send links to private files that expire. It doesn’t seem like Amazon offers the ability to expire a link after it’s been clicked on once, or to provide logging, but for basic large file delivery this should work well enough to start.
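To show how simple the signing is, here’s a standard-library-only Python sketch of that Query String Authentication. This is Signature Version 2, which is what that doc describes; these days you’d let an SDK like boto3 generate a SigV4 presigned URL instead. The bucket, key, and credentials below are placeholders:

```python
# Stdlib-only sketch of S3 Query String Authentication (Signature Version 2,
# which is what that doc describes; modern buckets should use SigV4 via an
# SDK like boto3). Bucket, key, and credentials below are placeholders.
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def presigned_url(bucket, key, access_key, secret_key, expires_in=3600):
    expires = int(time.time()) + expires_in
    # V2 string-to-sign: VERB, Content-MD5, Content-Type, Expires, resource
    string_to_sign = f"GET\n\n\n{expires}\n/{bucket}/{key}"
    sig = base64.b64encode(
        hmac.new(secret_key.encode(), string_to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    return (f"https://{bucket}.s3.amazonaws.com/{quote(key)}"
            f"?AWSAccessKeyId={access_key}&Expires={expires}"
            f"&Signature={quote(sig, safe='')}")

url = presigned_url("my-dailies", "reel1.mov", "AKIDEXAMPLE", "not-a-real-secret")
```

Anyone with the link can download the file until the Expires timestamp passes, after which S3 rejects the request.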

Wacom Intuos5 (link)

I’m a huge fan of Wacom and I insist on having an Intuos tablet at my desk wherever I edit. I was a little skeptical of the Intuos5 just because I didn’t see the need for adding touch capability to the tablet when I already use the pen 100% of the time. Having played around with it, it is very nice, though I still don’t think I would use the touchpad much (I’m sure I’ll eat my words later). One problem it would solve, though, is that other people who jump on my system wouldn’t have to fumble around with the pen in order to do a quick task. I will miss the little displays that the Intuos4 had, though. The HUD that pops up on screen when you touch one of the side buttons is annoying and takes too long to display. So I’ll definitely be keeping the Intuos4 I have at home.

DCP Creation

I spent part of my second day on the floor talking to the EasyDCP and Doremi people. I have conflicting desires when it comes to DCP generation. On the one hand, free and open source tools already exist to allow you to roll your own DCP, and I oppose paying thousands for something I can do for free. On the other hand, even with the free tools it’s still a pain in the ass to actually do it properly, so paying for the cheaper end of DCP creation software like EasyDCP is possibly worth the time it would save me to figure out all the color space, multiple reel, and subtitling issues that are only the beginning of my problems when testing the open source tools. Additionally, EasyDCP offers a version that allows KDM generation, and they are coming out with a KDM database app to keep track of your server certificates and issued KDMs, and encryption plus KDM management is something the open source tools haven’t gotten to yet.

On the fancier, more expensive end of things is Doremi. They sell a software package to allow you to make your own DCPs (not sure the cost), in addition to selling hardware that can take an HD-SDI input and encode your DCP in real-time. I asked the rep what it would do with a 1080p signal, and he verified that it can upres the 1080p signal to 2k with either a flat or scope preset. This, for me, presents interesting possibilities, since on my last show we made quite a few temp DCPs with cuts played out from Avid, and in order to get the DCPs made we had to record our cut to HDCAM-SR, hand that off to a post house, and wait 24 hours to get a DCP back. If we could’ve rolled our own DCP, not only would we maybe have saved money in the long run, but the turnaround time for the DCP could be much shorter. We would only have to watch the layback of and then QC the DCP, instead of watching the layback of and then QC-ing the tape, running the tape across town, waiting a day, and then QC-ing the DCP. This is definitely something I want to pursue further for production companies or post houses where I anticipate regular work.

File-Based Camera Dailies Prep

At the Sony booth there was a guy demoing YoYo, which seemed very well thought out and potentially very very useful. It handles all the usual backup and transcoding of the master files from the camera, in addition to allowing LUTs and basic color correction, sound syncing including multichannel mixdowns or inclusion of only selected channels, and to top it off it will take advantage of your connected broadcast monitor if you have one. It’s definitely the most full-featured “DIT” app I’ve seen so far. The two things I wish it did that I don’t think it currently does are maintaining a database of all the clips it’s processed (useful as a codebook), and marking the audio it sends to Avid as coming from a film project (even if it isn’t) so that when you get into Avid you can slip sync by perf. The YoYo rep did say that if you have time the software can sync by processing the audio and finding the clap rather than by timecode alone, and that when it does that it would nudge the audio as much as needed on a sub-frame level.

The Arri booth also had the Codex Vault, which I didn’t get much time to check out but could be an interesting alternative, albeit one that doesn’t seem to allow quite so much customization beyond its presets as the YoYo software does. I definitely want to check this one out more before the next show where I need to worry about this.

Streaming Dailies to iPads

I checked out the G-Technology G-Connect, and I’ve also previously looked at the Western Digital version of the same thing. The goal for something like this would be to put your dailies on one of these devices, which then act as a Wi-Fi hotspot and can stream the video stored on them to any connected iPhones or iPads. This would be useful for allowing people on set to view dailies without having to load up each iPad with a copy of the dailies, but since they’re intended more for consumer level use the encryption involved (or lack thereof) becomes a sticking point. There is password protection on the user interface, but the Wi-Fi transmission itself is unencrypted, and no studio would allow dailies to be transmitted over the air like that.

Field Recorders

I’m cutting a show shot on Alexa right now, and we initially considered using a field recorder to dual-record DNxHD media while the Alexa was shooting ProRes 444, but we had to abandon it because the field recorder couldn’t grab the filename the Alexa was using for the ProRes file over the SDI connection it was using as its input. Since then, the Alexa has introduced exactly what we needed as a native option, but it still surprised me that this was ever a problem: what use is a field recorder for editorial if its filenames differ from the master files I’ll want to relink to later?

The Sound Devices PIX 240 and 260 do now grab the R3D filename off a RED camera and can name their proxy files accordingly, but don’t yet work with any other cameras. Hopefully this becomes standard across all camera brands, as it would make an editor or DIT’s life a bit easier.

New Cameras!

There were a few new cameras to check out at NAB this year. I played around with the Blackmagic one, and I saw the new offerings from Canon and Sony. I don’t really know enough about camera tech to comment knowledgeably, but I do like Blackmagic’s consideration of not creating another proprietary file format, and that’s about all I have to say on that.

And I Still REALLY Want One Of These

Flanders Scientific LM-2340W


Well, that’s it for my initial thoughts from my first ever trip to NAB! Next year I’ll have to try to get more time off!

Automate VFX Sequence Titles

Automate VFX Sequence Titles

This tip comes by way of George McCarthy, who was our VFX Editor extraordinaire on Mission: Impossible 4. I also created an online EDL to SubCap Converter you can use in lieu of the more manual way described below.

If you’re on a show that has to turn over a sequence to a VFX house, you’ll likely need to export a Quicktime of that sequence with titles over every shot that will be a VFX shot. This reduces confusion between Editorial and the VFX vendor: not only does it label each shot with its shot ID, but it also flags shots where the work isn’t obvious. Makeup fixes, for example, wouldn’t be immediately apparent when scrubbing through a Quicktime, but if you title the shot then it’s easy for the VFX vendor to match your count sheets to a visual reference.

SubCap Example

Before Avid added Generator clips to the effect palette, you had two options for titling your sequence. One was to manually type in shot names, durations, etc., and save each title individually in a bin. The other was to attempt to use the Autotitler function that existed in Marquee, though almost immediately after Marquee came out, Avid broke the Autotitler in a software update and left it that way for years. In either case, you’d still be left with the task of manually cutting in titles over each shot, which is a tedious and error-prone task.

There are now multiple methods for doing this in a more automated fashion, including one I just learned about that uses the Timecode generator plugin over a subclip, but in this article I’ll look at using the SubCap effect and feeding it a text file converted automatically from an EDL.

Prepping an EDL

There are a couple reasons why an EDL is handy for generating a subtitle file. The first is that you can include locators in the EDL, so you can reuse the locators you’ve already created in your sequence that list each shot’s ID. The second is that by using an EDL instead of a straight locator export, you can get timecode ins and outs for the shot, so that the subtitle is the appropriate length. From this you can also calculate the duration if you’d like to include that in your title.
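That duration calculation is simple to script. Here’s a minimal Python sketch, assuming non-drop-frame timecode at 24 fps (the function names are just for illustration):

```python
# Sketch: turn "HH:MM:SS:FF" timecodes into frame counts so you can
# compute a shot's duration. Assumes non-drop-frame TC at 24 fps.
def tc_to_frames(tc, fps=24):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def duration_frames(rec_in, rec_out, fps=24):
    return tc_to_frames(rec_out, fps) - tc_to_frames(rec_in, fps)

print(duration_frames("04:00:00:00", "04:00:08:00"))  # 8 seconds at 24 fps: 192
```

For a 23.976 or drop-frame show, the math changes, so adjust accordingly before trusting the numbers in your titles.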

So the first step is to make sure your sequence is ready, you’ve run the Commit Multicam Edits command, and all of your VFX shots have a locator somewhere on them that lists the correct shot ID. Save your bin and open the sequence in EDL Manager. You do not have to lift out non-VFX shots from your sequence, but you do need to make sure that Locators are turned on in the EDL settings, and that your EDL type is CMX3600.

Make sure Locators are enabled in the EDL settings


Export an EDL from the video layer where your VFX locators exist, and then either use the converter I created to do it for you, or bring that EDL into a text editor that allows regular expression Find & Replace (such as TextMate or jEdit). This is the regular expression I use to grab all the right bits from the EDL:

\d{3}[^\n]*([0-9:]{11})\s([0-9:]{11})\s?\n(?!\d{3})(?:.*\r?\n(?!\d{3}))*?\* LOC: [\d:]{11}\s(\w+)[^\S\n]+([^\r\n]+)\r?\n?
Looks scary, I know. But that one line of gibberish looks for any series of lines in an EDL that includes an EDL event next to a Locator comment. When it finds one, it saves the timecode in and out for the sequence, as well as the color and text from the locator. With all that information saved, you could choose to handle a certain color of locator differently than another, or to calculate a duration from the timecodes. Using backreferences, you could fill in your Replace field with $1 $2\n$4\n\n, for example, and that would give you the format you need for a SubCap file. This RegEx won’t get rid of all of the non-VFX EDL events that you’d want to ignore, so you’d have to go through and manually remove those lines, or write a RegEx that negates the one above. Don’t forget to add the opening and closing tags, too. A small sample of the final product of a SubCap file looks like this:
<begin subtitles>

04:00:00:00 04:00:08:00
CS0010 (FORMERLY CS1000)

04:00:42:22 04:00:51:00

<end subtitles>
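If you’d rather script the conversion than do the Find & Replace by hand, here’s a minimal Python sketch using the same regular expression. The EDL in it is a made-up example, but it follows the CMX3600 shape:

```python
import re

# Hypothetical CMX3600 EDL fragment: event lines followed by LOC comments.
edl = """TITLE: VFX_TURNOVER
001  TAPE01 V  C  01:00:00:00 01:00:08:00 04:00:00:00 04:00:08:00
* LOC: 04:00:01:00 RED      CS0010 (FORMERLY CS1000)
002  TAPE02 V  C  02:00:00:00 02:00:08:02 04:00:42:22 04:00:51:00
* LOC: 04:00:43:00 RED      CS0020
"""

# Same pattern as above: grabs record in/out from the event line,
# plus the color and text from the locator comment that follows it.
pattern = re.compile(
    r"\d{3}[^\n]*([0-9:]{11})\s([0-9:]{11})\s?\n"
    r"(?!\d{3})(?:.*\r?\n(?!\d{3}))*?"
    r"\* LOC: [\d:]{11}\s(\w+)[^\S\n]+([^\r\n]+)\r?\n?"
)

out = ["<begin subtitles>", ""]
for rec_in, rec_out, _color, text in pattern.findall(edl):
    out += ["%s %s" % (rec_in, rec_out), text, ""]
out.append("<end subtitles>")
print("\n".join(out))
```

You could use the unused color capture to filter locators, or bolt on a duration calculation, but even this bare version saves the manual cleanup pass.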

Importing into the SubCap effect

Once you’ve got your SubCap text file, throw a SubCap effect on an empty video layer and go to Import Caption Data to bring your titles in. Make your adjustments for appearance (make sure to check out the Global Properties pane as well), and optionally you can save a stylesheet for the future so you only have to make those adjustments once.

SubCap Effect Panel


Check Your Work!

This is the last step, and it’s very important. Just because the process is automated doesn’t mean that there wasn’t an error, or that your source EDL was perfect. Check your sequence to make sure it has everything it’s supposed to and nothing extraneous. Even on small shows there can be a lot of hands in the locator jar, and you might find an errant locator buried in a nested clip, or a missed two-cut shot that got separated from its locator. If you need to add a title, it’s easy to do so from the SubCap effect editor.

Timeline with SubCap-Imported Titles

EDL to SubCap Converter

I've released a new version of this tool on a new site. Please start using that, and let me know if you run into any problems. This site will remain online, but all future development will happen on the new site. Thanks!

This tool takes an EDL as input (paste it in), and converts it to Avid DS Subtitle format, which is one of the two formats you can supply to Avid's SubCap effect....

Keeping Mobile Avid Media Updated

On most of the films that I’ve worked on, I’ve needed to keep at least two Avid systems up to date. This could be because an editor wants to be able to cut from home, or the Director wants the ability to cut on set. To handle this I wrote a Terminal (bash) script that searches for any new MXF or OMF media on specified volumes since the last time the script was run. On first run, the script does nothing but set the date to compare against the next time it’s run; if you need to change that, instructions are below.
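The core of the idea can be sketched in a few lines of bash. This is a hypothetical reconstruction of the approach, not the original script; the function name and file layout are placeholders:

```shell
#!/bin/bash
# Sketch: copy any MXF/OMF media modified since the last run,
# tracked via the modification time of a stamp file.
sync_new_media() {
    local src="$1" dest="$2" stamp="$3"
    if [ -f "$stamp" ]; then
        # Only files newer than the stamp (i.e., new since last run) are copied
        find "$src" \( -iname '*.mxf' -o -iname '*.omf' \) -newer "$stamp" \
            -exec cp -p {} "$dest" \;
    fi
    # First run (and every run): record the comparison point for next time
    touch "$stamp"
}
```

In practice you’d point this at the Avid MediaFiles folders on your drives and run it before heading home or to set, so only the new media gets carried over.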

Mission: Impossible 4 Formats and Aspect Ratios

Mission: Impossible 4 Formats and Aspect Ratios

In terms of workflow, planning for MI4 was certainly a challenge. If it were just one film format or just one release format there would be nothing special to write here, but from the outset we knew we would be shooting multiple film formats and releasing in three different aspect ratios, and keeping on top of all that takes a little bit of effort. The final film contains imagery shot on six different formats, which means that there are a lot of different native aspect ratios and keycode/timecode systems at work. We also coerced all six formats into the constraints of just two aspect ratios, so in the end the anamorphic RED footage was slightly resized to fit into 2.35, and everything else was cropped to fit into the 8-perf aspect ratio, since that format had the most footage in the IMAX parts of the movie.

Experimenting with Streaming on a CDN

Experimenting with Streaming on a CDN

Recently I was asked to post a copy of the film I just finished cutting online for a producer in Europe to view. This meant I would need to encode a Flash Video file (FLV) or an H.264 MP4, and I would need to find someplace secure to host it online. On this and previous projects, I’ve often uploaded Quicktimes directly to my website and sent a link and login credentials to whoever needs them. If the recipients are in the United States, my hosting plan at MediaTemple is more than adequate to serve up a Quicktime or two. For slightly longer cuts I’ll usually go the extra step of encoding a FLV and directing the recipient to an HTML page with a flash player set up to play the FLV. I have also uploaded a 90-minute cut in FLV format to my server, but a couple people I sent it to had problems playing it all the way through, and when they had to reload the page they were forced to start downloading the 1.2GB file from the beginning.

In this case, since the producer is in Germany, I wanted to be able to provide him a quick, localized download, as well as the ability to jump to any part of the film instantly. This is a feature you find on most major video sites, and if your connection gets disrupted or you need to reload the page for whatever reason, you can pick up right where you left off with minimal downtime. Unfortunately I don’t have enough control of my MediaTemple server to install the appropriate software for this kind of streaming. Also, my server is located in Los Angeles, and having lived in Europe I’m well aware of how slow it can be to transfer files between continents. What I needed was an RTMP server in Europe, and without doing very much research I opted to try out Amazon Cloudfront.

The video below is an example hosted from my Cloudfront account. I didn’t have much I could put online publicly, so please enjoy the credits from a short film I cut called The Happiest Man Alive.

[jwplayer file=”HMA_Credits2.mp4″ streamer=”rtmp://″ provider=”rtmp”]

Amazon Cloudfront

Amazon Cloudfront is an extension to their popular S3 file hosting service. The way S3 works is that you upload whatever files your website requires to Amazon S3’s servers, and then they take care of making sure that when someone requests your website, the appropriate files are transmitted from a server closest to the requestor. So for example, a user in Japan would download images on your website from an Amazon server in Asia, while in Los Angeles you’d download the same files from a server on the west coast. You pay a few cents per gigabyte of data transferred, but your website loads much faster for everyone worldwide since they’re not waiting to load it from a server half a world away. This is what a Content Delivery Network (CDN) is designed to do, and though most CDNs are designed for big corporations like Apple and Facebook, Amazon happens to run one designed and priced for individuals as well.

What Cloudfront does is add a few more localization controls to S3, as well as a streaming capability. There are other streaming services you can pay for, but for me what gave Cloudfront the edge was the combination of a CDN and streaming service. This way, not only can the producer stream my film in the same way he would watch a clip on Youtube, but it will stream to him from an Amazon server in Frankfurt, Germany, even though I uploaded the file to an Amazon server in the US.


As usual, a few caveats.

1) Cloudfront and S3 are not replacements for a host server. They serve up files but they don’t serve up websites. You will still need your own server or a Google Site for your viewers to point their browser at, and you’ll need to configure a Flash player with the appropriate Cloudfront RTMP URL where your file is located.

2) If you don’t want random people seeing your content, make sure to protect the folder containing your HTML page in some way (.htaccess, server-side authentication, etc.). Your hosting provider likely provides at least basic folder password-protection.

3) Cloudfront does offer a “Private Content” option, which allows you to specify a security policy and expiration date in the link you send out to people; however, I was not able to get this to work with an RTMP link in either Flowplayer or JW Player. I think once Amazon creates better controls for private content on their web front-end, this might get easier. I was trying to set it up using a 3rd-party Windows app to configure Cloudfront, and I couldn’t get it to work at all. I was also skeptical of how secure this “secure URL” would be, considering that anyone with a little know-how who somehow got access to that part of my website could copy the RTMP secure URL the same way they could copy an unsecured one. You can get a little fancy in the secure URL by embedding a policy that restricts downloading the file to certain IP addresses or until a certain date, but considering that I couldn’t get even the basic template policies to work, I couldn’t explore the security of the fancier policies.

4) On Cloudfront, the first time someone in each worldwide region requests your content, there will be a slight delay as that content is transferred to a server local to that user. From then on, it will remain on that local server for 24 hours after the last request. So if I upload the file to the LA server and then my German producer requests it, there may be a very slight delay before it starts to stream for him. However, if he goes back to watch it again 12 hours later, it will already be cached on a local Amazon server and will start streaming immediately.

Use a Droplet to Prep Audio for FCP

Use a Droplet to Prep Audio for FCP

So I’ve had a chance to work for a while on FCP, and I have a few things in mind to post on it. First, though, a very simple trick to help with importing properly configured audio into FCP.

The Issue

FCP, like Avid, prefers uncompressed audio over MP3s, M4As, etc. It’s not that it doesn’t work when you import an MP3 into FCP, it just doesn’t work well. You’ll hear all sorts of clicks and dropouts as you play through the track, so to prevent this you need to convert your audio to WAV or AIF before you import. There is a program called Loader that will convert your audio for you, copy it to a specified folder, and then import it, but it costs $79. The other way, which covers two of those three actions (converting and copying to a specified folder), is to make a Compressor droplet. You still have to import the files yourself, but to save $79 I think that’s a pretty good trade-off.

Compressor Droplets

Any compression setting in Compressor can be converted into a standalone application called a Droplet. As you might infer, you can drop things like files onto the Droplet, and it will then process that file into whatever setting the Droplet is programmed with. You can also specify a destination folder for that Droplet and tell it to run silently. Each file you drop onto it will queue up and encode immediately, with the resulting file being placed in your predetermined destination folder.

I have two droplets, one for Mono and one for Stereo (both 24-bit/48k AIF):

Stereo Compression Setting

The image above is for the stereo setting, from which I then make a Droplet in the Settings window:

Make a Droplet

Once you click the Make Droplet button, a dialog will come up asking you where you want to save the Droplet itself, as well as which Destination (defined in the Destinations tab) you’d like that Droplet to send its files to:

Saving a Droplet

Once you click Save, your Droplet will appear wherever you saved it (on my Desktop in my case), and you can proceed to drop files onto it. The first time you run the Droplet a dialog will come up confirming your settings, and you should uncheck the “Show at Launch” checkbox in the lower left corner so that the Droplet runs silently in the future.

Once the compression is done (keeping Batch Monitor handy to check progress is a good idea, though most audio encodes in very little time), you can import your new files into FCP and cut away.