VideoHelp Forum




  1. Member
    Join Date
    Dec 2007
    Location
    United Kingdom
    Hi guys,

    I am working on a college project. I played a movie through RealPlayer (at 23.976 fps) and ran a stopwatch against RealPlayer's elapsed-time display. I cannot understand why, after an hour, there is a difference between the two times (in this case the AVI was 8 minutes behind the stopwatch). Understanding this disparity is integral to my project, so if anyone can help I would really appreciate it. My questions are below:

    1. Is there a difference in the time displayed by RealPlayer between playing PAL and NTSC material? If so, what is the percentage difference of each format against a 'proper' second?

    2. Does RealPlayer play media with different frame rates at the same speed? That is, would the elapsed time displayed by RealPlayer be the same for media played at 29.97 fps and at 25 fps? Again, what is the percentage difference of each format against a 'proper' second?

    3. AVI media typically runs at 23.976 fps, but doesn't NTSC actually run at 29.97 fps? Why would I slow the frame rate from 25 to 23.976 to get NTSC when I want the speed to be 29.97?

    4. What would I need to do to convert media from 23.976 fps to 25 fps?

    I have more questions but if you can help with these that would be great!

    Thanks,

    Rosie



  2. Member Krispy Kritter's Avatar
    Join Date
    Jul 2003
    Location
    St Louis, MO USA
    NTSC and PAL are irrelevant on a PC. A 5 minute clip in NTSC or PAL will still be 5 minutes on a PC. Time displayed when playing video files can be affected by the file index. You can often also see differences in playback time between various video playback software.
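    A quick sketch of the arithmetic behind this (my own illustration, not from the thread; `duration_seconds` is a hypothetical helper): a clip's wall-clock running time is simply frame count divided by frame rate, so the same frames played at the PAL rate of 25 fps instead of the film rate of 23.976 fps finish about 4.27% sooner.

    ```python
    def duration_seconds(frame_count: int, fps: float) -> float:
        """Wall-clock running time of a clip: frames divided by frame rate."""
        return frame_count / fps

    # Roughly one hour's worth of film-rate (23.976 fps) frames.
    frames = round(3600 * 23.976)  # 86314 frames

    film = duration_seconds(frames, 23.976)  # ~3600 s at the film rate
    pal = duration_seconds(frames, 25.0)     # same frames sped up to the PAL rate

    # Percentage speedup when film-rate material is played at 25 fps.
    speedup_pct = (25.0 - 23.976) / 23.976 * 100  # ~4.27%

    print(f"film rate: {film:.1f} s, PAL rate: {pal:.1f} s, "
          f"speedup: {speedup_pct:.2f}%")
    ```

    The same reasoning explains why the displayed time can drift from a stopwatch: if the player derives elapsed time from frames decoded rather than from a real clock, any mismatch between the file's stated frame rate and actual playback shows up as a growing time difference.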
    Google is your Friend