VideoHelp Forum




  1. Banned
    Join Date
    Nov 2005
    Location
    United States
    ok people, here's a simple question: assume you have an extremely high quality source that you want to encode to a smaller size, and assume you want to limit yourself to 20 Mb/s. for the highest quality final product, would it make more sense to encode to 1920x1080, 1280x720 or 720x480? assume that the codec will be h264 with all the settings maxed out, with ac3 audio.

    my thinking is this: on the one hand, 1920x1080 has more pixels and thus can display more detail, but on the other hand, 720x480 would have more bits per pixel, and thus each pixel would be of higher quality.

    in the end i'm probably going to just try both and see which one comes out with higher quality but i'm wondering what your predictions are.
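    for what it's worth, here's the back-of-the-envelope math behind that bits-per-pixel thinking (just a sketch i put together, assuming 29.97 fps and the 20 Mb/s budget; the real answer depends heavily on the content and the codec):

    [code]
    # rough bits-per-pixel comparison at a fixed 20 Mb/s budget
    # (assumes 29.97 fps; content complexity and codec efficiency matter far more)
    BITRATE = 20_000_000    # bits per second
    FPS = 30000 / 1001      # 29.97 frames per second

    for width, height in [(1920, 1080), (1280, 720), (720, 480)]:
        bpp = BITRATE / (width * height * FPS)
        print(f"{width}x{height}: {bpp:.2f} bits per pixel")

    # 1920x1080: ~0.32 bpp, 1280x720: ~0.72 bpp, 720x480: ~1.93 bpp
    [/code]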
    It depends on the particular video. A 1920x1080 still shot would require hardly any bitrate at all. A 1920x1080 video shot with a noisy head-mounted camcorder while whitewater rafting would look poor at 20,000 kbps.
  3. jagabo is spot on - it depends on the source

    From my experience with average Hollywood-style Blu-rays, I've found 6,000-10,000 kbps for 720p and 12,000-18,000 kbps for 1080p to be rough guidelines that worked out ok.

    My crystal ball says use 1920x1080 if you are using 20Mb/s
  4. This post has a sample that doesn't compress well:

    https://forum.videohelp.com/topic359692.html#1908806
  5. Banned
    Join Date
    Nov 2005
    Location
    United States
    Originally Posted by poisondeathray
    My crystal ball says use 1920x1080 if you are using 20Mb/s
    up until 2 days ago i would have thought the same thing, then i ran into this 720x576 16:9 encode using h264 at a very high bitrate (i think it was at 15 Mb/s, i have to dig it up and look at it again) that was just absolutely incredible in terms of quality.

    i'm in the middle of doing the first of the 2 encodes, one at 1920x1080 and one 720x480, both at 20 Mb/s, i can't wait to see the results.
  6. Resolution is a part of quality.
  7. Member PuzZLeR
    Join Date
    Oct 2006
    Location
    Toronto Canada
    A lot does depend on the source, as was mentioned.

    For one thing, the 1080 would win if the source isn't overly complex, because there would be no justification for the higher BPP of the 720, and even more so for the overkill of the 480. With that amount of available data you may as well "spread it out".

    With higher complexity sources we start to level the playing field somewhat where the 720, and in turn, the 480, start to gain some efficiency advantage at the same bitrate, at least on the smaller screens.

    It works kind of like those graphs and curves you learned in economics or calculus class...

    I still look forward to your findings at any rate. But, keep in mind, you can still test your theory on a smaller scale (for quicker tests at least) when comparing similar logic to, for example, 480 vs 240.
    I hate VHS. I always did.
  8. Member edDV
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by deadrats
    Originally Posted by poisondeathray
    My crystal ball says use 1920x1080 if you are using 20Mb/s
    up until 2 days ago i would have thought the same thing, then i ran into this 720x576 16:9 encode using h264 at a very high bitrate (i think it was at 15 Mb/s, i have to dig it up and look at it again) that was just absolutely incredible in terms of quality.

    i'm in the middle of doing the first of the 2 encodes, one at 1920x1080 and one 720x480, both at 20 Mb/s, i can't wait to see the results.
    And you were watching this on a large HDTV? Don't use a typical computer monitor for quality assessment. That will get you fired. Agree on an evaluation monitor first.

    What was your 1080p source?

    Rules of thumb assuming 20Mb/s h.264:
    - 24p movies and low motion, steady beauty shots will look better at 1080i/1080p resolution*.
    - High action sports or hand held reality will look better at 1280x720p/59.94fps
    - 480i Digital Betacam is 90 Mb/s raw (~3x DCT compressed). It will look nearly as good encoded to h.264 @ 40 Mb/s. At 20 Mb/s it still looks good but it's not high def. You see upscaled 16x9 480i DigiBeta routinely on PBS, History and Discovery HD networks. I can see the difference when they change to HDV or DVCProHD source. High budget shows will shoot HDCAM or film.

    *telecined 1080i/29.97 @ 20Mb/s should produce near identical result to 1080p/23.976 @ 20Mb/s since excess fields are repeated.
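    To put a number on that footnote (a quick illustration of my own, not from any particular source): in 2:3 pulldown each group of 4 film frames is spread across 10 fields, so 23.976 fps film maps exactly onto 29.97 fps interlaced video, and the repeated fields carry no new picture information for the encoder to spend bits on.

    [code]
    # 2:3 pulldown arithmetic: 4 film frames -> 10 video fields -> 5 interlaced frames
    film_fps = 24000 / 1001                 # 23.976 film frames per second
    fields_per_cycle = 2 + 3 + 2 + 3        # 10 fields per 4 film frames
    video_fps = film_fps * fields_per_cycle / (4 * 2)   # 2 fields per interlaced frame
    print(video_fps)                        # ~29.97 -> same unique pictures as 1080p/23.976
    [/code]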
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  9. Banned
    Join Date
    Nov 2005
    Location
    United States
    Originally Posted by edDV
    And you were watching this on a large HDTV? Don't use a typical computer monitor for quality assessment. That will get you fired. Agree on an evaluation monitor first.
    i viewed it on both my LCD TV 32" 1080i/720p set to 1080i and my 19"LCD computer monitor

    Originally Posted by edDV
    What was your 1080p source?
    usually hdtv captures from satellite, mpeg-2 video, ac3, the captures rival and at times surpass some of the commercial blu-rays i have.

    the 720x576 16:9 i mentioned was something i found on the net, i don't know what the source was but i would guess it was a commercial blu-ray, be that as it may, it's still one of the finest transcodes i have ever come across, especially considering the final resolution.

    part of me wonders if it was done using a high end hardware encoder...
  10. Banned
    Join Date
    Nov 2005
    Location
    United States
    Originally Posted by PuzZLeR
    I still look forward to your findings at any rate. But, keep in mind, you can still test your theory on a smaller scale (for quicker tests at least) when comparing similar logic to, for example, 480 vs 240.
    i just got done, tried 4 different encodes, 1920x1080 and 720x480, both using mpeg-2 and h264, all at 20Mb/s, the 1920x1080 encodes blew the lower rez encodes away and quite frankly at those bitrates i could see no difference between the h264 and mpeg-2 encodes.
  11. Originally Posted by deadrats
    i just got done, tried 4 different encodes, 1920x1080 and 720x480, both using mpeg-2 and h264, all at 20Mb/s... i could see no difference between the h264 and mpeg-2 encodes.
    That's why you do constant quality encodes. You get exactly the right bitrate for whatever video you're encoding.
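    For example (just a sketch of the general idea, using ffmpeg's libx264 CRF mode rather than any particular GUI encoder; the filenames are placeholders): you pick a quality level once and the encoder uses whatever bitrate each scene needs to hit it.

    [code]
    import subprocess

    # Constant quality (CRF) encode: quality is fixed, bitrate floats with the content.
    # Lower CRF = higher quality and bigger files; around 18 is often visually transparent.
    subprocess.run([
        "ffmpeg", "-i", "source.mkv",
        "-c:v", "libx264", "-preset", "slow", "-crf", "18",
        "-c:a", "copy",
        "encoded.mkv",
    ], check=True)
    [/code]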
  12. Originally Posted by deadrats
    i just got done, tried 4 different encodes, 1920x1080 and 720x480, both using mpeg-2 and h264, all at 20Mb/s, the 1920x1080 encodes blew the lower rez encodes away and quite frankly at those bitrates i could see no difference between the h264 and mpeg-2 encodes.
    I bet you are "wasting" bitrate on the h264 encode. If transparency can be reached at a lower bitrate, why use all that extra space? This is one reason why CRF mode is so useful, the other being much shorter encoding times.

    @edDV - Where do the "red" cameras (www.red.com) stand in the hierarchy of professional cameras? or are they just "toys" at this point? Their specs on paper are amazing, at least to my amateur eyes





  13. I think most of red's catalog is there to impress venture capitalists.
  14. Member edDV
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by deadrats
    Originally Posted by edDV
    And you were watching this on a large HDTV? Don't use a typical computer monitor for quality assessment. That will get you fired. Agree on an evaluation monitor first.
    i viewed it on both my LCD TV 32" 1080i/720p set to 1080i and my 19"LCD computer monitor
    Probably not good enough to evaluate high quality source.

    "LCD TV 32" 1080i/720p" is most likely 1366x768 native with low end deinterlace.

    "19"LCD computer monitor" would likely be 1280x1080 native with processing by display card.


    Originally Posted by deadrats
    Originally Posted by edDV
    What was your 1080p source?
    usually hdtv captures from satellite, mpeg-2 video, ac3, the captures rival and at times surpass some of the commercial blu-rays i have.

    the 720x576 16:9 i mentioned was something i found on the net, i don't know what the source was but i would guess it was a commercial blu-ray, be that as it may, it's still one of the finest transcodes i have ever come across, especially considering the final resolution.

    part of me wonders if it was done using a high end hardware encoder...

    DirecTV and Dish MPEG-2 HD are highly compressed (8-16 Mb/s) compared to production source (144-880 Mb/s).

    Your monitors probably don't allow full quality monitoring of BluRay.

    Downloads come without history. Who knows what it is?
  15. Member edDV
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by poisondeathray

    @edDV - Where do the "red" cameras (www.red.com) stand in the hierarchy of professional cameras? or are they just "toys" at this point? Their specs on paper are amazing, at least to my amateur eyes
    They have great potential. Recording the output economically is the problem.

    We will know when there are comparison tests of Red vs. film or HDCAM-SR/Viper.
  16. Member edDV
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by jagabo
    I think most of red's catalog is there to impress venture capitalists.
    Need impartial tests.
  17. Originally Posted by edDV
    Originally Posted by jagabo
    I think most of red's catalog is there to impress venture capitalists.
    Need impartial tests.
    I suspect many of the listed formats, like 28,000x9334, only exist in theory. Not even a prototype.
    28,000 x 9334 isn't even listed in this graphic, it probably wouldn't fit LOL. I found this on Wikipedia. There were some UHDV monitors being demoed at a trade show in Japan(?) earlier this year.

  19. Member edDV
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Assume hype until tested. 8Kx4K is the next step for film production. Today we have 2Kx1K or 4Kx2K.
  20. Banned
    Join Date
    Nov 2005
    Location
    United States
    Originally Posted by jagabo
    Originally Posted by deadrats
    i just got done, tried 4 different encodes, 1920x1080 and 720x480, both using mpeg-2 and h264, all at 20Mb/s... i could see no difference between the h264 and mpeg-2 encodes.
    That's why you do constant quality encodes. You get exactly the right bitrate for whatever video you're encoding.
    when i read this something clicked and suddenly i knew how that file i had found on the net was done. to test it out i redid a couple of quick encodes using tmpgenc: one with mpeg-2 with DC set to 10, motion search set to high, and constant quality set to max with a maximum bitrate of 20 Mb/s, the other using divx with constant quality and insane quality settings, both at a resolution of 720x576 16:9. the results were, for all practical purposes (at least as far as i can tell on my monitors), an exact match with the 1920x1080 test encodes i did earlier.

    i played around a bit seeing how much less bitrate i could use and still maintain quality. a max of 10 Mb/s still gave great results, though the mpeg-2 encode at that setting started to suffer just a bit; to my surprise the divx encode was indistinguishable from the 20 Mb/s encode.

    i also did 2 test encodes, one using apple's h264 and one using nero's digital avc, and much to my surprise the results of both sucked: the apple codec resulted in colors that were "pale", almost washed out, and the nero avc codec produced some very noticeable blocking.

    i was really surprised by these 2, as i have had some really good results with both of those codecs before, but here they were disappointing. vc-1 likewise left much to be desired.

    i wonder if maybe the fact that i use ffdshow to decode all video and audio files may be screwing things up, like the codecs themselves may be producing fine quality encodes but ffdshow may be decoding the streams improperly.

    i'm going to have to look into that, maybe upgrade to the newest version and try again...
  21. Originally Posted by deadrats
    i redid a couple of quick encodes, using tmpgenc, one with mpeg-2 with DC set to 10, motion search set to high, and constant quality set to max with a maximum bitrate of 20 Mb/s
    Even at a constant quality setting of 85, TMPGEnc's encodes are nearly identical to the source and they will be significantly smaller (lower average bitrate). The bitrate cap only affects segments where the bitrate would have gone over the cap -- instead of going to a higher bitrate they are limited to the cap. The rest of the video is encoded the same.
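    A toy illustration of what the cap does (the numbers are made up for the example): the quality setting decides what each segment "wants", and the cap only clips the segments that would have gone over it, so the average still ends up well below the maximum.

    [code]
    # Hypothetical per-segment bitrates (Mb/s) that a constant quality encode "wants"
    wanted = [4, 6, 25, 8, 30, 5, 12]
    CAP = 20  # maximum bitrate in Mb/s

    actual = [min(b, CAP) for b in wanted]
    print(actual)                      # [4, 6, 20, 8, 20, 5, 12] -- only the peaks are clipped
    print(sum(actual) / len(actual))   # average ends up well under the cap
    [/code]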
  22. Member vhelp
    Join Date
    Mar 2001
    Location
    New York
    evening everyone.

    - High action sports or hand held reality will look better at 1280x720p/59.94fps
    Some reality shows are progressive. Take that weight loss reality show.. it is progressive, except when they go into the person's personal life (whatever that segment is referred to as); that part will be either film (telecined) or interlaced. This applies to other "reality" type shows as well; "Beauty and the Geek" is progressive, like the above mentioned show. I was testing an ATI TV Wonder card while that show was airing the other day, and that's how I came to these observations.. anyhow.

    Still.. the problem with all these shows, all programs/movies/etc, is that they are starved for bitrate, and the perceived quality is subjective from person to person. How does the old saying go? .. oh yeah, "the bigger they are, the harder they fall" .. something like that. IOW, on a larger TV screen you will see the artifacts much more pronounced, except on those TV sets that have a *better* image processor that *better* hides them; on a smaller TV set they can also be hidden by the viewer's perceived distance.

    480 vs. 720 vs. 1080? Well, in terms of bitrate (and codec format) I haven't done the math to arrive at a compromise encoding setup.

    The problem with this (the OP's) idea has to do with the source, and what is considered official, in terms of many factors, such as:

    codec; format; bitrate; resolution; content (i.e., film vs. interlace vs. a mixture); the amount of video artifacts already present in the source; for example, H264 vs. MPEG-2 and the difference in their bitrate requirements vs. the amount of compression each of these formats entails, and so on and so forth.

    The other factor to consider is whether a given video source (such as the one the OP found) was a hand-made version for demonstration purposes. Some houses will incorporate specialized encoding of the video, and/or the bitrate could have just been jacked sky-high for that demo. I mean, I've seen some demos that I wish I could get my hands on (to see how they were put together, or to peek at the bitrate in case they were cheating--they do that sometimes), but ask a salesman for one and they give you some lame story or something.

    ...

    "19"LCD computer monitor" would likely be 1280x1080 native with processing by display card.
    Well, at least I know my 19" monitor has 1680x1050 native pixels.



    * thanks to jagabo (snipped from another thread topic)
    * nice small utility: http://majorgeeks.com/download.php?det=960

    You should see alternating thin horizontal black and white lines on the left and vertical lines on the right. You may or may not be able to distinguish the checkerboard pattern in the middle. It might appear as gray. But you should not see any moire or flickering in that image if the system is set up properly.

    The quality of the analog section of monitors varies. Some models skimp on the VGA section assuming you'll be using DVI. You will see a larger difference between the two in that case.

    Run the monitor test program from this site:


    http://majorgeeks.com/download.php?det=960
    On my monitor, set at 1680x1050, the pixel layout is crystal clear. No distortions whatsoever. However,
    when set to any other resolution, e.g. 1280x1024, the image was distorted--lots of moiré-type patterns of pixels. So, that is a good (cheap) basic way of testing your monitor's best resolution setting, and I keep mine at its highest--manufacturer recommended anyway.
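    If you'd rather roll your own version of that test image instead of downloading the utility, a pattern like the one jagabo describes is easy to generate.. this is just my own quick sketch using Pillow and NumPy (not the linked program), with the size set to whatever your monitor's native resolution is:

    [code]
    import numpy as np
    from PIL import Image

    W, H = 1680, 1050            # set to your monitor's native resolution
    img = np.zeros((H, W), dtype=np.uint8)

    x = np.arange(W)
    y = np.arange(H)[:, None]
    third = W // 3

    img[:, :third] = (y % 2) * 255                                   # left: 1-pixel horizontal lines
    img[:, third:2 * third] = ((x[third:2 * third] + y) % 2) * 255   # middle: 1-pixel checkerboard
    img[:, 2 * third:] = (x[2 * third:] % 2) * 255                   # right: 1-pixel vertical lines

    Image.fromarray(img).save("monitor_test.png")
    # View at 100% at the native resolution: any moiré or flicker means the image is being rescaled.
    [/code]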

    ...

    Encoding scenarios vary from source to source. The bitrate (how much is required) depends on the scene and how complex it is to encode at the given bitrate, or on the rate-control strategy for that matter. While you might get away with a nice low bitrate in one scene, you might not in another. Also, it's strange how some encoders, and codecs for that matter, handle bitrate over long runs. What I mean is: while one scene encodes fairly well under a given bitrate strategy, continuing on (in the video encoding) the next few scenes could upset the final balance of the video's bitrate distribution. I used to agonize over this phenomenon in my early days of MPEG encoding. So, I would guess that it's no different with other codec formats, such as H264 / AVC / VC-1, or whatever you call these incarnations. And for encoding, the x264 cli is probably the single best encoder out there.. always evolving. So don't miss out. It requires command string preparation, etc., etc.
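    And if you do need to hit a fixed average bitrate over a long run (rather than use a constant quality mode), the usual answer to that distribution problem is a two-pass encode: the first pass measures how complex each scene is, and the second pass spends the bitrate budget accordingly. Just a hypothetical sketch of the command strings with the x264 cli (the filenames and the 8000 kbps figure are placeholders):

    [code]
    import subprocess

    # Two-pass ABR with the x264 CLI: pass 1 gathers complexity stats,
    # pass 2 distributes the fixed bitrate budget according to those stats.
    common = ["x264", "--bitrate", "8000", "--stats", "x264_2pass.log"]

    # (use NUL instead of /dev/null on Windows)
    subprocess.run(common + ["--pass", "1", "-o", "/dev/null", "input.y4m"], check=True)
    subprocess.run(common + ["--pass", "2", "-o", "output.264", "input.y4m"], check=True)
    [/code]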

    ...

    i wonder if maybe the fact that i use ffdshow to decode all video and audio files may be screwing things up, like the codecs themselves may be producing fine quality encodes but ffdshow may be decoding the streams improperly.
    Yes, decoders play a big part in the presentation experience. I've seen variable quality from PowerDVD; WinDVD; ATI player; VLC player; Windows Media Player; and more.. they all vary, some by a lot, some a little. But be watchful of your graphics card's image-processing engine. My new ATI HD-3450 can be a beast when playing some MPEG that I encode. Even some of my Hauppauge-encoded MPEGs come out pretty badly, with tons of over-filtering--like temporal chaos. But there is a way to turn it OFF, I think. I'm looking into it because I think it might help us all out. I feel that many of us are suffering from this default kludginess that these card makers put us through. It just makes no sense why all our videos have to go through the filtering process, every video.. what were they thinking? Oh well.

    -vhelp 4948