Resolume is first and foremost VJ software. Our biggest hurdle to cross is not that we have to play a single file with a single framerate at a single speed; we have to mix and match many different video files, with many different framerates, at many different speeds.
Imagine you are playing a single 4 minute, 30 fps video. Everything is fine, and Resolume is rendering nicely at 60 fps. Halfway through, you add a whole bunch of videos and effects, bringing the total fps down to 15. A normal player would now curl up and die. For Resolume, this sort of thing happens all the time.
So Resolume tries to make sure that your original video will still finish after 4 minutes, displaying the original video the best it possibly can at that framerate. We can't all of a sudden have a video play twice as long, just because your computer couldn't keep up with the rendering you're asking it to do.
This all means that when playing on the timeline, each video is not tied to a master clock, but to its own internal timing. We calculate which frame of the video should currently be shown, based on the original clip framerate, the current clip speed, the current rendering framerate, the frame it was previously showing and a bunch of other variables.
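As a rough sketch of that idea (all names and details here are mine, not Resolume's actual code): each clip advances its own playhead by however much wall-clock time passed since the last render, scaled by its speed, so the clip still finishes on schedule even when rendering slows down.

```python
# Minimal sketch of per-clip internal timing. Hypothetical names,
# not Resolume's implementation.

class Clip:
    def __init__(self, fps, duration, speed=1.0):
        self.fps = fps              # original clip framerate
        self.duration = duration    # clip length in seconds
        self.speed = speed          # current playback speed
        self.playhead = 0.0         # the clip's own internal time

    def tick(self, render_dt):
        """Advance by the wall-clock time since the last render, so the
        total duration stays correct even when rendering is slow."""
        self.playhead = (self.playhead + render_dt * self.speed) % self.duration
        # snap to the nearest frame the source file actually has
        return int(self.playhead * self.fps)

clip = Clip(fps=30, duration=240)   # a 4 minute, 30 fps video
# rendering drops from 60 fps to 15 fps partway through:
frames = [clip.tick(1 / 60) for _ in range(10)] + \
         [clip.tick(1 / 15) for _ in range(10)]
```

At 15 fps the playhead simply takes bigger steps, so frames get skipped on screen, but the clip still ends after 4 minutes. The floating-point accumulation in `playhead` is exactly where the tiny errors described below creep in.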
This means that, invariably, tiny errors start creeping in. We're talking microseconds here, but they add up. A single 4 minute video will always play back in 4 minutes. It could be 4 minutes and 2 milliseconds, but you won't notice that difference. You do start noticing it when looping a clip several times, because the timing of when the loop is triggered and the timing of when the loop restarts are both affected by the clip's own internal timing.
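A quick back-of-envelope calculation shows how fast this bites (the 2 ms per loop figure is just an assumption for illustration):

```python
# How many loops until a tiny per-loop error becomes a full frame?
error_per_loop = 0.002           # assumed drift per loop, in seconds
frame_time = 1 / 30              # one frame of a 30 fps clip ~= 33.3 ms

loops_until_one_frame_off = frame_time / error_per_loop
print(loops_until_one_frame_off)   # roughly 17 loops
```

So even a barely measurable error puts two copies of the same clip a visible frame apart within a couple of minutes of looping.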
Btw, a really good way to see whether two frames are matching or offset from each other is to have a copy of the clip in a second layer with the blend mode set to Difference I. If the clips are in sync, the output will be black. The bigger the difference between the frames they are playing, the more white the output will be.
So the only way to ensure that all clips are on the same timing is to use an external timing source to drive the playhead. BPM Sync is one way to do it, because then the playhead of the clip is synced to the phase of the current beat. Another could be SMPTE or OSC, which can set the playhead of a clip to a certain time directly.
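The key property of this approach can be sketched as follows (hypothetical names, just illustrating the principle): every clip derives its playhead from the same master beat counter, so per-clip errors have nowhere to accumulate.

```python
# Sketch of BPM-style sync: one master clock drives every playhead.
def playhead_from_beats(master_beats, clip_duration, beats_per_loop=4):
    """Map the shared beat counter to a playhead position. Every clip
    reading the same master_beats lands on the same phase, so timing
    errors cannot build up between clips."""
    phase = (master_beats % beats_per_loop) / beats_per_loop   # 0.0 .. 1.0
    return phase * clip_duration

# Two clips asking at the same moment always agree:
# beat 17.5 is 1.5 beats into a 4-beat loop, i.e. 3/8 through the clip.
print(playhead_from_beats(17.5, clip_duration=8.0))   # -> 3.0 seconds
```

This is why BPM Sync stays locked where free-running clips drift: the playhead is recomputed from the clock every frame instead of being accumulated step by step.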
I'm not saying exact frame-synced playback of multiple videos will never be possible. I'm just explaining why it's currently problematic, and which problems we have to resolve before we can make that happen.