
Friday, September 12, 2025

Variable Frame Rates From Phone Footage Creating Problems In Premiere - No External Playback

 We have to deal with a lot of phone footage nowadays, and phones are prone to creating clips with variable frame rates, with hit-and-miss results in post, particularly in Premiere.

 Have a look at this phone video clip's specs:

Variable frame rate clip from iPhone, seen as such in Media Conch
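
 If you don't have MediaConch handy, a quick ffprobe check is another way to spot VFR before the clips ever hit Premiere. Here is a minimal Python sketch, assuming ffprobe is installed and on the PATH; the clip name and the nominal-vs-average heuristic are mine, not taken from the clip above:

    import json, subprocess

    def probe_rates(path):
        # Ask ffprobe for the nominal (r_frame_rate) and measured average
        # (avg_frame_rate) rates of the first video stream.
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=r_frame_rate,avg_frame_rate",
             "-of", "json", path],
            capture_output=True, text=True, check=True).stdout
        s = json.loads(out)["streams"][0]
        return s["r_frame_rate"], s["avg_frame_rate"]

    nominal, average = probe_rates("IMG_0001.MOV")  # hypothetical clip name
    # If the two disagree, the clip is almost certainly variable frame rate.
    print(nominal, average, "-> likely VFR" if nominal != average else "-> constant FR")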

 If, like me, you use a BMD 3G Monitor with Desktop Video, a non-legit frame rate in the Premiere Source window gives you no output. Even though Premiere is content playing the clip back internally, external monitoring is a no-go.

 Now if you create a Timeline at a legit frame rate and drop the variable FR clip into it, all is well, because Premiere conforms the non-legit frame rate clip to the (legit) Timeline frame rate.

 When bringing such variable FR clips into Premiere, the FR stamping gets somewhat fantastical. In this example I have 5 clips, all from the same phone. They are all recognized in the metadata as 29.98 fps:

All VFR clips stamped as 29.98fps in Premiere

Same thing in the Media File Properties: 29.98. It should be 30fps, but because of variable FR, Premiere stamps them at 29.98.
Same thing in the Media File Properties window.

 For some reason the first three clips play back just fine on the external monitor via Desktop Video/3G Monitor, while the last two clips do not play back at all. Again, I'm talking about Source window playback, not Timeline playback.

 On closer inspection, these clips show fantastical frame rates when invoking the Modify/Interpret Footage function. The Frame Rate is all over the place:
 Clip 1 = 29.9775
 Clip 2 = 29.9782
 Clip 3 = 29.9793
 Clip 4 = 29.9869
 Clip 5 = 29.9845

 Remember, all five clips' frame rates are rounded and stamped as 29.98 by Premiere.
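
 My guess is that these oddball numbers are simply the measured average, total frames captured divided by clip duration, which on a VFR clip lands on a slightly different fraction for every file. With made-up numbers:

    # Hypothetical figures: a 60-second clip in which the phone captured 1799 frames.
    frames = 1799
    duration_s = 60.0
    avg_fps = frames / duration_s
    print(round(avg_fps, 4))  # 29.9833, which Premiere would display rounded as 29.98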

 For some reason, only the first three clips will play through external playback in Premiere, not the last two. Maybe 29.97xx is an acceptable frame rate to BMD, but not 29.98xx? Who knows.

 A workaround is to manually stamp all these VFR iPhone clips at the proper 30 fps (or 29.98 fps) by selecting "Assume this frame rate", entering the value, and clicking OK. Batch processing is possible.

 Now all clips play back through the Source window / external 3G Monitor device just fine. But this changes the source frame rate, so it might not be desirable beyond permitting Source Monitor external playback via the Blackmagic Design DV/3G Monitor. You be the judge of that.
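
 Another route, if you would rather not touch the interpretation in Premiere at all, is to conform the files to a true constant frame rate before import. Unlike "Assume this frame rate", this re-encodes the media, so it is a different trade-off. A rough sketch using ffmpeg from Python (newer ffmpeg builds spell -vsync cfr as -fps_mode cfr); the folder name, target rate, and ProRes choice are all assumptions:

    import pathlib, subprocess

    TARGET_FPS = "30"  # the rate the footage was meant to be

    for src in pathlib.Path("phone_footage").glob("*.MOV"):
        dst = src.with_name(src.stem + "_cfr.mov")
        subprocess.run(
            ["ffmpeg", "-i", str(src),
             "-vsync", "cfr", "-r", TARGET_FPS,       # duplicate/drop frames to a constant rate
             "-c:v", "prores_ks", "-profile:v", "3",  # ProRes HQ keeps it edit-friendly
             "-c:a", "copy",                          # leave the audio untouched
             str(dst)],
            check=True)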


PP 25.5.0
macOS Sequoia 15.6.1


Tuesday, October 17, 2023

The Peculiar Way DaVinci Resolve Handles Interlaced Footage

 DaVinci Resolve has a weird way of dealing with interlaced footage.

First, with Resolve you had better pay attention to what kind of media you are bringing in and how it is recognized. In particular, with legacy footage (NTSC, PAL), the Pixel Aspect Ratio is sometimes unrecognized, or recognized as Square. You will have to manually switch it to the NTSC or PAL Pixel Aspect Ratio. Same with anamorphic or 4:3 footage. Check your footage for a squashed-looking image.

Second, the Field dominance is usually recognized properly via Auto, but it is best to double-check against your actual footage anyway and make sure it matches. If not, again modify it by hand and use the correct setting for your footage, Upper Field or Lower Field. Using the incorrect field order will produce unwanted artifacts.
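
Both of these attributes, pixel aspect ratio and field order, can also be checked against the source files outside of Resolve before you start fixing clip attributes by hand. A small sketch, again assuming ffprobe is available; the file name is hypothetical:

    import subprocess

    def probe_legacy_attrs(path):
        # field_order typically comes back as tt (top field first), bb (bottom field
        # first), or progressive; sample_aspect_ratio is the stored pixel aspect ratio.
        subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries",
             "stream=sample_aspect_ratio,display_aspect_ratio,field_order",
             "-of", "default=noprint_wrappers=1", path],
            check=True)

    probe_legacy_attrs("archive_tape_042.mov")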

Then, when creating a new Timeline for exporting screeners, make sure the frame rate matches the footage, e.g. 29.97 for NTSC or 25 for PAL. It's counterintuitive, but do not check the "Enable Interlace Processing" box. The effect would be to double the frame rate, resulting in a 59.94fps Timeline for NTSC or a 50fps Timeline for PAL. In most cases you do not want to enable interlaced processing.

Resolve User Manual p131:

  • Enable interlace processing: Interlaced media is supported throughout DaVinci Resolve. The “Enable interlace processing” checkbox forces DaVinci Resolve to process all operations internally using separated fields, in order to properly maintain the field integrity of interlaced clips in your program. In addition, each clip in the Media Pool has a Field Dominance drop-down menu in the Video panel of the Clip Attributes window that lets you specify whether clips are upper- or lower- field dominant; an Auto setting makes this choice by default.

    There is also a corresponding checkbox in the Render Settings panel of the Deliver page, named “Field rendering,” that lets you enable and disable field rendering when you’re rendering file-based output.

    There are two instances where you want to leave this setting turned off:

    • —  If you’re working with progressive-frame media, it is not necessary to turn this checkbox on. Doing so will unnecessarily increase processing time.

    • —  If you’re using interlaced clips in a progressive-frame project and you’re intending to deinterlace those clips using the Enable Deinterlacing checkbox in the Clip Attributes window, then you must keep “Enable video field processing” off. Otherwise, the Enable Deinterlacing checkbox will be disabled for all clips. For more information about deinterlacing clips,
      see Chapter 22, “Modifying Clips and Clip Attributes.”

      If you’re working on a project with interlaced media that you intend to keep interlaced, then whether or not it’s necessary to turn field processing on depends on what types of corrections you’re applying to your clips. If you’re mastering your program to an interlaced format, and you’re applying any adjustments that would cause pixels from one field to move or bleed into adjacent fields, then field processing should be enabled; effects requiring field processing include filtering operations such as blur, sharpen, and OpenFX operations, as well as sizing transforms that include pan, tilt, zoom, rotate, pitch, and yaw.

      On the other hand, regardless of whether you’re outputting interlaced or progressive-frame media, if you’re not filtering or resizing your clips, and you’re only applying adjustments to color and contrast, it’s not necessary to turn on field processing for interlaced material, and in fact, leaving it off may somewhat shorten your project’s rendering time.

Once you choose to Enable Interlace Processing, you will not be able to go back to the original frame rate: all outputs will be at double the frame rate. Even if you are applying resizing or other pixel-shifting-prone processing, you might still want to leave it off in order to maintain the original frame rate.

The other problem with Resolve is that not all formats/codecs allow for interlaced output. For example, QuickTime h264 is always progressive: there is no checkbox for interlaced rendering when selecting h264 or h265. Apple ProRes, on the other hand, has a checkbox and allows for interlaced rendering.

h264/h265 are mostly for delivering files to web/computers. But they are also widely used for making screeners, and when the original footage is interlaced (legacy NTSC, PAL), it is sometimes a problem to be forced to output either a progressive file or, worse, a 59.94/50fps file.

Let's say the editor uses your files as temp clips in his/her cut; it will be a problem down the line when it's time to online with the original clips. The timing won't match, or the clip will be wrongly stamped as progressive.

Having to export first to ProRes out of Resolve, then convert the ProRes interlaced files to interlaced h264 using Compressor or AME (both allow for interlaced h264 outputs), is not an acceptable workaround: it is too time consuming and adds opportunities for errors.

Granted, legacy formats are not much in use these days, except in documentary work as archive footage. And there are tons of archives recorded as NTSC or PAL interlaced. If you are moving your cut between NLEs, and/or when it's time to online, problems will arise. A good online editor will be able to flag and deal with these mismatches properly, but still.

Now if you are using the original interlaced footage in a progressive Timeline and finalizing in Resolve, then, as described in the manual, just de-interlace the footage and you are in business. Just make sure all the clip settings are correct, as discussed earlier.

It's odd to me that a proper interlaced workflow for screeners / temp outputs is so lacking in Resolve. Again, Apple Compressor and Adobe Media Encoder are perfectly fine with interlaced h264 outputs.
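
For what it's worth, interlaced h264 is perfectly encodable outside of Resolve as well. A hedged sketch of what a pass over the interlaced ProRes master might look like with ffmpeg and x264; the file names and audio bitrate are my own placeholders, and this is obviously not a Resolve feature:

    import subprocess

    # Encode an interlaced (top field first) h264 screener from an interlaced ProRes master.
    subprocess.run(
        ["ffmpeg", "-i", "master_prores_interlaced.mov",
         "-c:v", "libx264", "-flags", "+ildct+ilme", "-top", "1",  # interlaced DCT/ME, TFF
         "-c:a", "aac", "-b:a", "192k",
         "screener_interlaced.mp4"],
        check=True)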

Why is interlaced only possible with some codecs and not others in Resolve? To me it feels like a random decision on the part of BMD.