VideoHelp Forum
Thread
  1. Hi,

    I joined the forum because I couldn't find this answer duck-duck-going the web, partly because English is not my first language and I may not have used the right words.
    Anyway, my question is: why does Blu-ray video look more fluid on an HD TV than any other video? Even if I download a ripped version of a Blu-ray, it doesn't flow as smoothly as the real Blu-ray. Note that I'm not talking about the resolution; I'm talking about the motion. Can someone explain to me why Blu-ray looks better than ripped Blu-ray movies? In fact, can someone explain why Blu-ray motion is smoother than even normal television?

    Best,
    Amid
  2. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Ripped movies are rarely just "ripped". They're usually "ripped+converted" to a lower quality format. That conversion could be making it less fluid.

    Also, normal TV is very compromised WRT bitrate compared to a blu-ray. Example comparison -

    Common BD bitrate: 18-35 Mbps (up to 40Mbps max for 2D and up to 48/60 for 3D).
    Common ATSC video channel bitrate (which actually often contains MULTIPLE sub-channels of HD & SD): ~19Mbps
    Common ripped+converted MP4/MKV file bitrate: 5-12Mbps
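    To make those bitrates concrete, here is a rough back-of-the-envelope sketch (not from this thread; it assumes a 2-hour video-only stream and ignores audio and container overhead) of how bitrate translates into file size:

```python
# Rough file size implied by a video bitrate, ignoring audio and
# container overhead. Mbps = megabits per second.
def size_gb(mbps, hours=2):
    seconds = hours * 3600
    megabits = mbps * seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes (decimal)

for label, mbps in [("Blu-ray (avg)", 25), ("ATSC channel", 19), ("Typical rip", 8)]:
    print(f"{label}: ~{size_gb(mbps):.1f} GB")
```

    A 25 Mbps Blu-ray works out to roughly 22.5 GB for two hours, while an 8 Mbps rip is closer to 7 GB; that discarded data is where fine motion detail can get lost.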

    Can't go further into examples without knowing specific format info from you...

    Scott
    Last edited by Cornucopia; 28th Jul 2014 at 21:47.
    "When will the rhetorical questions end?!" - George Carlin
  3. Member
    Join Date: Sep 2012
    Location: Australia
    A Blu-ray player can output movies to a TV at exactly 23.976fps through an HDMI connection, allowing the TV to make the most of its actual refresh rate. Something like a video card has to output at a fixed rate, generally 50 or 60 hertz, and if its frame-rate conversion techniques aren't as good as the display's, or the video card's refresh rate differs from the display's physical refresh rate, the motion will never be as smooth.

    PAL TV is 50 fields per second and NTSC is about 60. Movies are about 24fps, which doesn't divide evenly into either TV standard, so movies have to be either sped up (PAL) or have frames repeated in a 2:3 cadence (NTSC) to reach the correct frame rate. If the frame rate has been pre-converted before reaching the TV, it's difficult for the TV to work out how to reorder the frames to produce smooth movement.
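    The 2:3 cadence can be sketched in a few lines (an illustration, not anything a TV literally runs): alternating film frames are held for 2 fields and then 3 fields, so 4 film frames fill 10 fields, turning 24fps film into ~60 fields per second.

```python
# 2:3 (NTSC) pulldown: hold film frames for 2, 3, 2, 3, ... fields.
# 4 film frames -> 10 fields -> 5 interlaced video frames.
def pulldown_23(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_23(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

    The uneven hold times (2 fields vs 3) are exactly the source of the judder being discussed.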

    I believe the point is that a Blu-ray player outputs the correct temporal information, giving the display everything it needs to optimise motion.

    Yesterday I tried watching a 25fps 1080p video through my WDTV Live and the motion was terrible. HDMI was set to AUTO in the WDTV's settings, so I switched it to 1080p at 50Hz, which fixed the problem. I'm not sure what mode AUTO selected, but I have this nasty feeling it was interlaced (which is another problem with TV broadcasts).
    Last edited by ndjamena; 30th Jul 2014 at 12:48.
  4. Member johns0's Avatar
    Join Date: Jun 2002
    Location: canada
    Some ripped Blu-rays are encoded incorrectly, so the motion is jerky along with other issues. When I rip my own Blu-ray discs and re-encode them to smaller sizes, they have no motion issues at all. It comes down to whether the person doing the re-encode knows what they're doing well enough to get a decent video without crappy results.
    I think,therefore i am a hamster.
  5. Member
    Join Date: Sep 2012
    Location: Australia
    There are a lot of factors, really. Maybe the Blu-rays are being frame-blended to make the motion smoother, or are getting other types of interpolated frames.

    If you're playing an MKV Blu-ray rip: MKVs have no actual frame rate, so passing 24p on to the display isn't really an option. Which becomes a problem with 120Hz TVs or similar, since HDMI devices don't support 120Hz.

    This isn't really a basic Blu-ray info thread: Blu-rays are no different from any other video. It comes down to frame rates, encoding techniques and the details of the OP's particular set-up.

    So we need to know more about the OP's setup, especially what country s/he's in and how each video is being played.
  6. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Wtf? "Mkvs have no actual framerate". Where did you come up with that doozy?
    Mkvs have the same capacity to contain video (which DOES have an actual framerate) and/or audio (which DOES contain an actual samplerate), that all multimedia containers do.

    That has very little to do with a video card's ability to adjust the display rate to match a display's refresh rate.

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  7. Technically, MKVs don't have a frame rate, but in the context of this thread, it seems like semantics.
    https://trac.bunkus.org/wiki/FAQ%3AWrongFrameRateDisplayed

    In my case my Blu-ray player seems to default to connecting to the TV at 60Hz. It's quite capable of switching the TV to 50Hz when I play a PAL DVD disc, and of switching to film mode (I'm not really sure how that works, as my TV doesn't support it) when playing a Blu-ray disc whose frame rate is 24fps, but when playing video via USB I'm pretty sure it's always 60Hz regardless of the frame rate. So maybe in the OP's case it's a frame rate/refresh rate mismatch, although he hasn't specified how he's playing the "ripped" version.
  8. Member
    Join Date: Sep 2012
    Location: Australia
    Originally Posted by Cornucopia View Post
    Wtf? "Mkvs have no actual framerate". Where did you come up with that doozy?
    Mkvs have the same capacity to contain video (which DOES have an actual framerate) and/or audio (which DOES contain an actual samplerate), that all multimedia containers do.

    That has very little to do with a video card's ability to adjust the display rate to match a display's refresh rate.

    Scott
    MKVs are variable frame-rate by default and, unless someone wants to come up with a tag to say otherwise, players have no way of determining the difference between CFR and VFR MKVs. Setting the output frame-rate to 24 while playing an MKV would be problematic since it could suddenly switch to 60p partway through.

    Most of what MediaInfo says about the contents of an MKV is simple guesswork.
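    For what it's worth, the kind of check a player or tool would have to do is easy to sketch (hypothetical code, not from any real player): scan the container timestamps and see whether the frame interval is effectively constant, within some tolerance for timecodes rounded to the millisecond.

```python
# Hypothetical CFR check: are the gaps between frame timestamps constant
# (within a small tolerance for millisecond-rounded timecodes)?
def is_cfr(timestamps_ms, tolerance_ms=1):
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return max(deltas) - min(deltas) <= tolerance_ms

cfr = [round(i * 1001 / 24) for i in range(10)]   # ~23.976 fps, rounded to ms
vfr = [0, 42, 83, 100, 117, 133]                  # cadence changes mid-stream
print(is_cfr(cfr), is_cfr(vfr))                   # True False
```

    Of course this only works after reading through the stream's timestamps, which is exactly why a player can't safely commit to a 24Hz output mode up front.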
  9. Member
    Join Date: Feb 2004
    Location: Australia
    Not quite.

    MediaInfo gets the information directly from the internal streams where it is not provided by the container, as in this case ... so a frame rate that hardware players can determine does exist ... software players can be flaky and inaccurate.

    Without a known frame rate, a video of X frames would play in a split second, or in some cases not at all.

    Then the question about the hardware being used ... with that unanswered, everyone is left trying to match up oranges with apples ... it never adds up to anything ... this gets asked far too often.

    A sub-sample uploaded and shared for review might yield some opinion on overall quality, but that would be all.
  10. Member
    Join Date: Sep 2012
    Location: Australia
    In the vast majority of cases simply reading the stream headers will be enough. But since Matroska timecodes will happily override whatever the header says, and you can append one file onto the end of another, it's still a guess.

    The last version of MakeMKV was writing some of its b-frame timecodes 1ms off, which was enough for MediaInfo to decide the whole video was VFR.

    But this is pointless. I'm wondering how someone could mess up a Blu-ray rip without it originally being interlaced.
  11. Member fritzi93's Avatar
    Join Date: Nov 2003
    Location: U.S.
    Originally Posted by ndjamena View Post
    A Blu-ray player can output movies to a TV at exactly 23.976fps through a HDMI connection, allowing TVs to make the most of their actual refresh rates.
    Like 24 fps on a 120 Hz TV, versus telecined 30 fps for 60 Hz GPU output, though the OP didn't say anything about the TV or whether the MKVs were being played from computer. I confess I immediately thought 2:3 pulldown judder when reading the first post.
    Pull! Bang! Darn!
  12. Guys, thank you for all the answers. I'm still a bit confused by some of the technical jargon, but I will get it in time. Meanwhile, let me give you an example of what I'm talking about.
    When I go to a store like Best Buy and watch Blu-ray playing on Full HD TVs, the motion seems very fluid (one extreme example would be Avatar). However, I could never get the same "smoothness" watching a movie on my computer using an HDMI cable to the TV, whether I play a ripped version using VLC or play directly from the Blu-ray drive. Yesterday, I went to the Amazon Prime store and started watching an old movie (Amadeus) that was made well before HDMI was available, and it still played relatively smoothly. I have a UN48H6350, which supposedly supports 120Hz.
    That's my question: why can't I get the same fluidity/smoothness using my computer, when even streaming from Amazon looks better?
  13. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Originally Posted by ndjamena View Post
    Originally Posted by Cornucopia View Post
    Wtf? "Mkvs have no actual framerate". Where did you come up with that doozy?
    Mkvs have the same capacity to contain video (which DOES have an actual framerate) and/or audio (which DOES contain an actual samplerate), that all multimedia containers do.

    That has very little to do with a video card's ability to adjust the display rate to match a display's refresh rate.

    Scott
    MKVs are variable frame-rate by default and, unless someone wants to come up with a tag to say otherwise, players have no way of determining the difference between CFR and VFR MKVs. Setting the output frame-rate to 24 while playing an MKV would be problematic since it could suddenly switch to 60p partway through.

    Most of what MediaInfo says about the contents of an MKV is simple guesswork.
    MKVs are accommodating of all framerates by default (using timecodes & timecode scale #s), and treat CFR as a special case of VFR, which is fine. In a general sense, that's what it is. That doesn't mean it doesn't HAVE a framerate. Even VFR titles have a framerate, it is just different at different points in the title - determined by those timecodes & timecode scales. So CFR, being a special case where the scale does not change throughout, DOES have an easily-determined framerate. It is just derived from multiple fields of information vs. a single field of information.
    A player that attempts to properly support MKV should also properly support its derivation/determination of the framerate, and also any necessary framerate conversions needed during the course of title playback. That doesn't mean it can't do it, just that its job is somewhat harder to accomplish.

    In the case of stream FR designation vs. container designation + appending, if the MKV format has SPECIFIED that its FR implementation is meant to override stream designations, then that is what an app should be paying attention to if it's going to follow the spec. Otherwise, it's a buggy/broken app.

    If it follows the spec, and a previously-24p title (which was needing pulldown during playback to match the 60Hz refresh of the display) subsequently changed mid-stream to a 60p title (which doesn't need pulldown), the app ought to just appropriately turn off pulldown. End of story.

    <edit>Also, since Blu-rays - the original topic - ARE only CFR (not counting badly-mastered/edited titles), it's a moot point.</edit>

    Scott
    Last edited by Cornucopia; 29th Jul 2014 at 13:11.
    "When will the rhetorical questions end?!" - George Carlin
  14. Can you guys give me some explanation regarding the example I gave you above?
  15. Member fritzi93's Avatar
    Join Date: Nov 2003
    Location: U.S.
    Originally Posted by amidsal View Post
    ... I could never get the same "smoothness" watching a movie on my computer using an HDMI cable to the TV, whether I play a ripped version using VLC or play directly from the Blu-ray drive.
    I presume when you say "Blu-ray drive" you mean the one in your computer? I'm still thinking it might be pulldown judder. If your video card is set to 60 Hz, then the fps (frames per second) must be adjusted to some divisor of 60. In other words, telecined. Look up pulldown and telecine in the Glossary under "What is" above.

    When you play a Blu-Ray from a standalone player set to 24 fps (or auto, which should auto-detect the display), notice that will divide into 120 Hz 5 times. No need to telecine if the TV can do 120 Hz. Each frame will either be repeated 5 times or perhaps you have motion interpolation turned on, which creates intermediate frames. That will certainly smooth out motion, though often at the cost of some minor artifacting.
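    That divisibility argument is easy to sketch (illustrative Python, using the nominal 24 rather than 23.976):

```python
# Does the display refresh divide evenly by the source frame rate?
# If yes, every frame is shown the same number of times (smooth);
# if not, frames must repeat unevenly (judder).
def repeat_pattern(fps, refresh_hz):
    ratio = refresh_hz / fps
    if ratio == int(ratio):
        return f"each frame shown {int(ratio)}x (smooth)"
    return f"uneven repeats, ratio {ratio:.2f} (judder)"

print(repeat_pattern(24, 120))  # each frame shown 5x (smooth)
print(repeat_pattern(24, 60))   # uneven repeats, ratio 2.50 (judder)
```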

    If your TV has an onboard media player that can play your file(s), just put one on a thumb drive and test. Or perhaps you have a standalone Blu-ray player with a USB port to which you can hook up an external or thumb drive to play the file through the player.

    I wonder though if your TV has a cinema or film mode that can identify film content and do reverse telecine? Look in your TV's picture settings and see if there's a film mode. Maybe that would make a difference.

    Anyway, if that's the problem, you have a sharp eye. Look for medium fast panning shots.

    Oh, AFAIK, no TV will accept an actual 120 Hz signal.
  16. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Your TV is fairly capable, however you have WAY TOO MANY variables to easily troubleshoot here.

    1. What is the source footage framerate? How was this encoded?
    2. Is the source footage interlaced or progressive?
    3. Are you using a hardware (BD player, TV, gen media player, cable/sat box) or a software (HTPC, DLNA Beaming/Mirroring) player? Which one(s)?
    4. What is your outputting card, if software?
    5. What drivers?
    6. WRT motion, resolution, framerate, pulldown, what are the settings for the app/player? the drivers? the TV?
    7. What kind of motion effects are you seeing?

    Once you've isolated those individual variables and have eliminated those links, you will be able to find the "culprit".

    Example using your previous example: Play "Avatar" BD on your TV via a settop BD player. Looks smooth - GREAT. Ok, now you play the same BD in a PC's BD drive, using BD player app (PowerDVD, WinDVD, or TMT) via an HTPC. Same source, same destination, same pipeline, just different player. If your HTPC setup is adjusted correctly (and up to par), it should give an equivalent or identical experience. If not, you know either that your HTPC isn't up to par, or that your setup needs adjusting.
    Once that is figured out, then you move on to different sources with differing motion needs.

    Or, you work the other way around, keeping the player the same and using different footage...

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  17. Watch the 24v30v60 video in this post full screen:

    http://forum.videohelp.com/threads/307004-Best-framerate-conversion-%28eg-23-97-to-30-...=1#post1888926

    How smooth are the three rows of circles? Which is most similar to what you see when watching Blu-ray movies? Which is most similar to watching video files?
  18. Okay, after reading the last post I realize I may have been using the word "smooth" wrongly. The image is perfect. The motion, however, is what is not as fluid as when I play from a Blu-ray or Amazon streaming. One good example for comparison is playing 3D games at more than 30fps: the transition between frames looks more fluid than in any video I have ever watched through my computer connected to the TV. I didn't mean smooth in the sense of anti-aliasing or image quality; just image movement.
    Let me see if I understood so far:

    The motion "fluidity" is related to the number of frames per second that are transferred to the TV. If I transfer something other than 24fps (23.9-something), the TV will have to convert it, and that may be the reason the motion looks relatively less smooth.

    Now another question comes up:
    My TV is 120Hz. The reason Blu-ray and streaming look so good on my TV is that 120 is a multiple of 24, so the TV can perfectly convert the input to the displayed frequency. If this is the case, how can I make my notebook (using the HDMI cable) send videos to the TV at 24×n fps? That would make it look fluid, right? However, the video doesn't seem fluid even on my own monitor, which means the FPS at which the software is displaying the video is not a divisor of what my monitor is capable of. Is this correct?

    Thanks,
    Amid
  19. Member
    Join Date: Sep 2012
    Location: Australia
    Unless you're made of money, your monitor is most likely 60Hz; converting 24fps to 60fps requires a 2:3 cadence, which means every second frame is displayed half again as long as the first.

    24fps isn't that great a frame rate in the first place. The Hobbit was shot at 48fps, and there's a lot of pressure to get movies filmed at higher frame rates. Frame-rate conversion is the bane of video everywhere.


    -Edit- http://en.wikipedia.org/wiki/Three-two_pull_down


    ---------------------------

    Code:
    num_units_in_tick is the number of time units of a clock operating at the frequency time_scale Hz that corresponds to
    one increment (called a clock tick) of a clock tick counter. num_units_in_tick shall be greater than 0. A clock tick is the
    minimum interval of time that can be represented in the coded data. For example, when the clock frequency of a video
    signal is 30 000 ÷ 1001 Hz, time_scale may be 30 000 and num_units_in_tick may be 1001. See Equation C-1.
    time_scale is the number of time units that pass in one second. For example, a time coordinate system that measures
    time using a 27 MHz clock has a time_scale of 27 000 000. time_scale shall be greater than 0.
    fixed_frame_rate_flag equal to 1 indicates that the temporal distance between the HRD output times of any two
    consecutive pictures in output order is constrained as follows. fixed_frame_rate_flag equal to 0 indicates that no such
    constraints apply to the temporal distance between the HRD output times of any two consecutive pictures in output order.
    When fixed_frame_rate_flag is equal to 1, for all n where n indicates the n-th picture in output order and picture n is not
    the last picture in the bitstream in output order, the value of Δt_fi,dpb( n ) is specified by
    So apparently MKVMerge goes back and rewrites the stream header after it's finished muxing. That would explain why remuxing the MakeMKV VFR files converted them to CFR: MKVMerge didn't think 1ms was enough to warrant the VFR flag.

    Frame rate mode : Variable
    Original frame rate : 50.000 fps

    Frame rate mode would correspond to the fixed_frame_rate_flag, whereas original frame rate comes from time_scale + num_units_in_tick (or Default Duration, as the case may be).
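    For reference, the frame rate implied by those two H.264 VUI fields can be computed directly (a sketch; the factor of 2 applies to the common case where each coded frame spans two clock ticks):

```python
# Frame rate from H.264 VUI timing info. In the usual case one frame
# covers two clock ticks, so fps = time_scale / (2 * num_units_in_tick).
def h264_fps(time_scale, num_units_in_tick):
    return time_scale / (2 * num_units_in_tick)

print(round(h264_fps(48000, 1001), 3))  # 23.976 (film on Blu-ray)
print(round(h264_fps(60000, 1001), 3))  # 29.97  (NTSC-rate video)
```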
  20. Member fritzi93's Avatar
    Join Date: Nov 2003
    Location: U.S.
    Have you looked at your TV's picture settings to see if there's a "Film" or "Cinema" mode?

    Here's what the manual says Film Mode does on my TV (Sharp 70LE640U):

    Film Mode (3:2 Pulldown)

    Automatically detects a film-based source (originally encoded at 24 frames/second), analyzes it then recreates each still film frame for high definition quality.

    Advanced: Adjust effect to reduce judder from film contents. Select a desired level of judder reduction from 0 to +10.
    Standard: Detects, analyzes and converts film source.
    Off: Normal viewing mode.

    Note
    - "Film Mode" does not function depending on input signal type.
    - "Film Mode" does not function when you set AV MODE to "Game" or "PC".
    - "Standard" does not function when channel display shows an input resolution of 480p, 720p, or 1080p.

    I really think you should at least look for it in your TV's picture settings. Then test your juddery video with it on, then off. I have mine set to +10 and it does a good job eliminating judder on 1080p MKVs played from my HTPC (GPU at 60 Hz refresh).

    If your TV is capable of this and Film mode doesn't help, then poof!, there goes my hypothesis.
  21. Originally Posted by fritzi93 View Post
    - "Film Mode" does not function when you set AV MODE to "Game" or "PC".
    I bet the TV is set to PC mode to eliminate overscan. So he's getting 3:2 judder at 60 fps.
  22. Member fritzi93's Avatar
    Join Date: Nov 2003
    Location: U.S.
    Originally Posted by jagabo View Post
    Originally Posted by fritzi93 View Post
    - "Film Mode" does not function when you set AV MODE to "Game" or "PC".
    I bet the TV is set to PC mode to eliminate overscan. So he's getting 3:2 judder at 60 fps.
    Ah yes, that could be. My HTPC is connected via HDMI; overscan is compensated for in the GPU settings. AV Mode is custom (calibrated).

    Or film mode is not on by default? I wish the OP would give it a go and report back. I'm curious.

    [EDIT] The OP mentions gaming. Setting the TV's AV Mode to "Game" would also disable film mode (reverse telecine). And of course the standard film mode won't do for a 1080p source. It must be set to advanced (or whatever it's called in the TV's picture settings), with de-judder on.
    Last edited by fritzi93; 30th Jul 2014 at 21:03.