Oaktown wrote: ↑
Thu Mar 16, 2017 14:53
Ladin's suggestion is probably the safest approach but here are two other thoughts:
- When trying to sync multiple files, I would make sure all the clips are rendered at 30 fps instead of 29.97 fps. The difference is minimal (about 1 frame in every 1,000), but it becomes roughly 81 frames over a 45-minute file, which would be noticeable if the three files don't behave the same way, since that's about 2.7 seconds.
Assuming the audio is run separately, you could also sync your three clips using SMPTE timecode. Make sure you use the same frame rate as your clips.
I am a longtime video engineer in Hollywood – raised primarily in Broadcast and Theatrical Film production and post production.
With respect and appreciation to the author and his post, the response is somewhat incorrect. This is a common misconception about 29.97 frames per second and "DFTC: Drop-Frame Time Code" – in actuality, NO FRAMES ARE "DROPPED"; only the number LABELING (the "ID," or identification) of each video frame is changed.
An example: if I asked you to count 10 pennies on a table but to pretend that the number "9" does not exist, then after number 8 you would say "10" for what would have been the ninth coin; and for the tenth penny, you might call it "dime." You still have 10 pennies on the table – dropping the ID name "9" doesn't remove a coin.
Likewise, you are NOT really "dropping frames," so nothing here causes any kind of drift over time.
However, depending on how this software uses frames per second as a time base, there could be a 0.1% speed difference if, for example, all layers and the final output composition are not identical in frame rate. If this software bases its non-time-code mode on the host computer's CPU/crystal clock, the "drift" should then be the same for everything; to wit, if Layer 1 has drifted by 5 frames in an hour, the software (if properly written) would drift all sync'd layers by the same amount, keeping everything identical.
All clocks, time code or not, eventually "drift." But in TV production, if you base all your "slave" devices on the "master device" or "house timecode clock," then they are all in step with each other, so they all drift TOGETHER. People who use SMPTE (Society of Motion Picture & Television Engineers) time code on cameras think that just because their cameras have SMPTE time code, all cameras are in sync if triggered together. No – they will eventually drift ... UNLESS you are slaving all cameras from one MASTER camera's time code generator ("Jam-Sync"); then they stay identical, because even if they drift, they drift together (the slaves self-correct to follow the master).
Read below if you want more detail and the history of Drop-Frame Time Code.
Time Code was invented for NASA by EECO (Electronic Engineering Company of California). When color TV was being developed in the US in the 1960s, broadcasters asked EECO to make a similar system for counting video frames: unlike film, which could be held up to the eye and examined frame by frame, videotape showed nothing to the naked eye. There had to be a way to give a unique ID number to each of the 30 frames per second on videotape (25 fps in Europe).
Black-and-White TV ran at an even 30.00 frames per second, because the TV sets were timed to the cycles of power in the common household (60 Hz / 120 V). Since Europe had 50 Hz power, TVs there were timed for 25.00 frames per second.
When Color TV was being developed, we didn't want to transmit 2 signals from the TV station: one for the common Black-and-White and another for Color; so the color was "piggy-backed" on top of the black-and-white signal; and black-and-white TVs were supposed to ignore the color subcarrier. But this caused problems with the picture and sometimes even messed with the audio.
Engineers found a brilliant solution: they slowed everything down to 29.97 frames per second. There were still nominally 30 frames counted per second, but the signal traveled 0.1% slower, which eliminated the picture/sound interference problem. There was no accumulating "drift" from the 0.03-frame-per-second difference in a live broadcast, because the signal was continuous, refreshing 60 fields per second.
When 1956-era videotape was being retrofitted for color and time code was invented to ID-label and count the frames, engineers discovered that each exact hour of videotape ran 3.6 seconds (108 frames) longer on the time code clock than on their stopwatch and the clock on the wall. How could this be? The clue was that the US color subcarrier is 3.58 MHz, and they remembered that they had slowed the frame rate down by 0.03 frames per second. They were 108 frames off because time code counted each frame normally but was clocked to the color signal (29.97), not the black-and-white signal (30.00).
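For anyone who wants to check that 108-frame / 3.6-second figure, here's a quick back-of-the-envelope sketch in Python. I'm assuming the exact NTSC color rate of 30/1.001 fps (the ".03 frames" above is the rounded difference):

```python
# A back-of-the-envelope check of the 108-frame / 3.6-second overcount per hour.
label_rate = 30.0            # NDF time code assigns 30 frame labels per second
actual_rate = 30.0 / 1.001   # true color frame rate, ~29.97003 fps

# Frames over-counted by the time code clock in one real-time hour:
overcount_per_hour = (label_rate - actual_rate) * 3600
print(round(overcount_per_hour))               # -> 108 frames
print(round(overcount_per_hour) / label_rate)  # -> 3.6 seconds on the counter
```

So the counter runs 108 frames (3.6 seconds) ahead of the wall clock every hour, which is exactly the discrepancy the engineers measured.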
So a counting rule that would be easy to remember was set: drop the first 2 frame labels (frames 00 and 01) at the TOP of every minute. But 2 frames dropped per minute, with 60 minutes in an hour, would mean 120 frames dropped – not the 108-frame overcount that needed to be compensated for.
120 minus 108 = 12. How convenient: how many numbers are on a clock face? 12. If we drop 2 frames per minute, we need to NOT do the drop 6 times in the hour. How convenient again: how many 10-minute blocks are in an hour? SIX!
So the rule became: for Drop-Frame Time Code (DFTC), we drop (2) frame labels at the top of every minute, never calling the frames :00 and :01 – we jump straight to calling the next frame :02. But we do *NOT* do this drop at the top of each 10th minute, since we need (6) non-drop incidents to give back the 12 frames. (Again, 2 frames every minute would drop 120 frames when we need only to drop 108. The difference between 120 and 108 is 12; since we drop 2 frames at a time, we skip the drop six times per hour, at the top of every 10th minute, because six 10-minute blocks = 60 minutes ... a perfect hour.) The 10th minutes where we don't drop the 2 frame labels are 10:00, 20:00, 30:00, 40:00, 50:00, and 00:00 (the top of the hour).
AGAIN, WE ARE NOT DROPPING ANY FRAMES OF VIDEO! Doing so would cause a 'jump' in the video that would be noticeable. We are just dropping WHAT WE CALL those frames.
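If you want to see the labeling rule in action, here's a rough Python sketch of the standard SMPTE drop-frame counting described above (the function name is my own, not from any particular product):

```python
def frames_to_df_timecode(frame_number):
    """Label a 29.97 fps frame count in Drop-Frame Time Code (HH:MM:SS;FF).

    Every frame still exists; only the labels ;00 and ;01 are skipped at
    the top of each minute, except every 10th minute.
    """
    drop = 2                                    # labels skipped per drop minute
    frames_per_min = 60 * 30 - drop             # 1798 labels in a dropped minute
    frames_per_10min = 10 * 60 * 30 - 9 * drop  # 17982: only 9 of 10 minutes drop
    d, m = divmod(frame_number, frames_per_10min)
    skipped = drop * 9 * d
    if m > drop:
        skipped += drop * ((m - drop) // frames_per_min)
    n = frame_number + skipped  # renumber as if the skipped labels had been used
    return "%02d:%02d:%02d;%02d" % (n // 108000, (n // 1800) % 60, (n // 30) % 60, n % 30)

print(frames_to_df_timecode(1799))    # 00:00:59;29  (last label of minute 0)
print(frames_to_df_timecode(1800))    # 00:01:00;02  (labels ;00 and ;01 are skipped)
print(frames_to_df_timecode(107892))  # 01:00:00;00  (one real-time hour: 108 labels skipped)
```

Note that frame 1800 immediately follows frame 1799 on the tape – the video itself is continuous; only its name jumps from ;29 to ;02.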
Once the DFTC: Drop-Frame Time Code rules were set, the regular time code that ran 3.6 seconds long per hour was named NDF: Non-Drop Frame – it never matched real human time and the clock on the wall. Multiply 3.6 seconds per hour by 24 hours in a broadcast day, and the miscount comes to 86.4 seconds per day. Remember, ALL video back in the early days was based on BROADCAST TELEVISION. No one had VCRs or DVRs or video cameras in the home (unless you were wealthy and could afford a $120,000 2-inch Quadruplex VTR and $100,000 cameras in the 1960s). Only broadcasters could spend that kind of money, so THEY defined the standards via SMPTE, much as they later defined HD as primarily 1080i or 720p.
Non-Drop Frame was labeled with colons, and the new Drop-Frame time code – which matched the real-time clock on the wall – would use a period or semicolon before the frame number (or, at times, between all the numbers) to catch your attention that it was DF TC.
Let's look at a frame ID in Non-Drop Frame (NDF):
HH:MM:SS:FF (Hours / Minutes / Seconds / Frame)
Notice COLONS (:) for NDF:
01:09:59:27 (This is Frame #27 of 1 HR, 9 MIN, and 59 SECONDS)
THE EXACT SAME VIDEO as above, but now, let's ID label the frames with Drop-Frame Time Code (DFTC)
(note the semi-colon or period):
01:09:59;27 or 01;09;59;27
BUT WAIT – Both sets of timecode labels above are identical and no frames were dropped! Why? Look at the minute: We are at the TOP of a 10th minute, where Drop-Frames are EXCLUDED (10:00; 20:00; 30:00; 40:00; 50:00 or 00:00). Remember, 6x per hour, we don't drop the 2 frames.
So if the timecode were going from the 8th minute to the 9th minute (instead of the 9th to the 10th), we would now see the dropped labels:
01:08:59;29 or 01.08.59.29
01:09:00;02 (the TWO frame labels ;00 and ;01 do not exist at the top of every minute, except at 10th minutes)
So now you may ask, "Why even use NDF: Non-Drop Frame Time Code?" Well, before non-linear editing on computers (Avid Media Composer, Adobe Premiere Pro, Apple Final Cut Pro), all editing was done with linear edit computers (CMX, EditWare SuperEdit, Axial, Sony). A producer or director would ask the editor, "Trim that IN point 10 frames earlier." Imagine the editor working in DFTC trying to calculate in his head: if his IN point is presently at DFTC 01;09;01;02 and he takes away 10 frames, he has to check whether he crosses a minute boundary where labels were dropped (and whether it's a 10th minute), and then figure out what number to actually enter into the computer ... confusing! Later software would do the math transparently, but in the beginning the human editor had to figure it out, along with a thousand other technical things for each machine.
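To make that mental math concrete, here's a minimal Python sketch (the function name is my own, not from any edit controller) going the other direction, from a drop-frame label back to an absolute frame count. I've picked a starting label near the post's example, but chosen so that the 10-frame trim actually crosses the skipped labels:

```python
def df_timecode_to_frames(hh, mm, ss, ff):
    """Convert a Drop-Frame label (HH;MM;SS;FF) back to an absolute frame count.

    A sketch of the arithmetic a linear editor had to do in his head.
    """
    total_minutes = 60 * hh + mm
    # Two labels were skipped in every elapsed minute except the 10th ones.
    skipped = 2 * (total_minutes - total_minutes // 10)
    return (total_minutes * 60 + ss) * 30 + ff - skipped

# "Trim that IN point 10 frames earlier," starting from DFTC 01;09;00;05.
# Naive 30-frame borrowing suggests 01;08;59;25, but labels ;00 and ;01
# don't exist at the top of minute 9, so the correct label is 01;08;59;23:
print(df_timecode_to_frames(1, 9, 0, 5) - 10 ==
      df_timecode_to_frames(1, 8, 59, 23))  # True
```

Two frames "earlier" than the naive answer, because two of the labels the editor would have counted backward through simply don't exist.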
[If you would like to see how we did Linear Editing, both Off-Line (practice edit) and On-Line (final edit to deliver to broadcast), see the first 9 minutes of my example of a kids show: STUDIO SEE, PBS]:
So back in the linear days, most editors liked to edit their tapes in NDF and ignore the rules of DF time code. They would use a time code calculator to manually ensure the running time matched real clock time. Then, when the edit was done, they copied their EDITED MASTER to a new tape (the AIR MASTER or DISTRIBUTION MASTER) and used DFTC on that copy, so that it always matched real clock time.
# # #