Hi everyone,

I'm working on a DirectShow application that plays back live video from external network sources (we have our own source filter that receives video from them). In the past, when my application played live video, we simply disabled the clock and rendered whatever was received as soon as it came in. This works great on our internal development network and on most production ones; however, any latency in the network translates directly into jerky, jittery playback, and on some production networks with moderate load, even 30 fps video looks pretty bad.

I was thinking a solution to this problem could be to enable the reference clock and then add 500 ms to the timestamps of all received samples. That way there is 0.5 s of buffering that should hopefully even out any jitter in the transmission. At first this appeared to work great, but then I noticed other glitches, and apparently they are related to the fact that the rate of the DirectShow graph's clock may not exactly match that of the video source, so if video is played long enough (hours), the live video can accumulate a longer and longer delay.

I found an MSDN page, http://msdn.microsoft.com/en-us/library/dd390645(VS.85).aspx, describing exactly this condition. However, I couldn't find any other information anywhere, and the part that is a little confusing is this: I know my video source has its own clock rate, but since it's a network source, I clearly can't just query it for a reference clock interface. Does that mean I have to write my own reference clock COM object and somehow make it estimate the clock rate of the source? Does anyone know if there's a standard solution, since this seems like it would be a standard problem?