ok people, here's a simple question: assume you had an extremely high quality source, you wanted to encode it to a smaller size, and you limited yourself to 20Mb/s. for the highest quality final product, would it make more sense to encode to 1920x1080, 1280x720 or 720x480? assume the codec will be H264 with all the settings maxed out, with ac3 audio.
my thinking is this: on the one hand, 1920x1080 has more pixels and thus can display more detail, but on the other hand, 720x480 would have more bits per pixel, and thus each pixel would be of higher quality.
in the end i'm probably going to just try both and see which one comes out with higher quality, but i'm wondering what your predictions are.
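For a rough sense of the tradeoff in the post above, the nominal bits-per-pixel each resolution gets at 20 Mb/s can be computed. A quick sketch; the 29.97 fps frame rate is my assumption, not stated in the post:

```python
# Bits available per pixel per frame at a fixed bitrate.
def bits_per_pixel(width, height, fps=29.97, bitrate=20_000_000):
    """Nominal bits/pixel/frame at the given bitrate and frame rate."""
    return bitrate / (width * height * fps)

for w, h in [(1920, 1080), (1280, 720), (720, 480)]:
    print(f"{w}x{h}: {bits_per_pixel(w, h):.2f} bits/pixel")
# 1920x1080: 0.32 bits/pixel
# 1280x720:  0.72 bits/pixel
# 720x480:   1.93 bits/pixel
```

So 720x480 gets roughly 6x the bits per pixel of 1080p, which only pays off if the source is complex enough to need them; otherwise the extra pixels of 1080p win.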
It depends on the particular video. A 1920x1080 still shot would require hardly any bitrate at all. A 1920x1080 video shot with a noisy headmounted camcorder while whitewater rafting would look poor at 20,000 kbps.
-
jagabo is spot on - it depends on the source
From my experience with average Hollywood-style Blu-rays, I've found 6,000-10,000kbps for 720p and 12,000-18,000kbps for 1080p were rough guidelines that worked out ok.
My crystal ball says use 1920x1080 if you are using 20Mb/s -
This post has a sample that doesn't compress well:
https://forum.videohelp.com/topic359692.html#1908806 -
Originally Posted by poisondeathray
i'm in the middle of doing the first of the 2 encodes, one at 1920x1080 and one at 720x480, both at 20 Mb/s; i can't wait to see the results. -
A lot does depend on the source, as was mentioned.
For one thing, the 1080 would win if the source wasn't overly complex, because there would be no justification for the higher BPP of the 720, and even more so for the overkill of the 480. With that much available data you may as well "spread it out".
With higher complexity sources we start to level the playing field somewhat, where the 720, and in turn the 480, start to gain some efficiency advantage at the same bitrate, at least on smaller screens.
It works kind of like those graphs and curves you learned in economics or calculus class...
I still look forward to your findings at any rate. But keep in mind, you can still test your theory on a smaller scale (for quicker tests at least) by comparing similar logic to, for example, 480 vs 240.
I hate VHS. I always did. -
Originally Posted by deadrats
What was your 1080p source?
Rules of thumb assuming 20Mb/s h.264:
- 24p movies and low motion, steady beauty shots will look better at 1080i/1080p resolution*.
- High action sports or hand held reality will look better at 1280x720p/59.94fps
- 480i Digital Betacam is 90Mb/s raw (~3x DCT compressed). It will look nearly as good encoded h.264 @ 40 Mb/s. At 20Mb/s it still looks good but it's not high def. You see upscaled 16x9 480i DigiBeta routinely on PBS, History and Discovery HD networks. I can see the difference when they change to HDV or DVCProHD source. High budget shows will shoot HDCAM or film.
*telecined 1080i/29.97 @ 20Mb/s should produce a near identical result to 1080p/23.976 @ 20Mb/s since excess fields are repeated.
Recommends: Kiva.org - Loans that change lives.
http://www.kiva.org/about -
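edDV's figure of 480i Digital Betacam as roughly 3x DCT compressed lines up with the uncompressed SD serial digital interface rate (270 Mb/s per SMPTE 259M). A quick back-of-the-envelope check; the 27 Msamples/s and 10-bit figures are from the CCIR-601/SMPTE 259M specs, the 90 Mb/s video payload from the post above:

```python
# CCIR-601 / SMPTE 259M SD-SDI: 27 million samples/s (13.5 MHz luma
# plus two 6.75 MHz chroma channels) at 10 bits per sample.
sdi_rate = 27_000_000 * 10       # 270 Mb/s uncompressed 4:2:2 SD
digibeta_video = 90_000_000      # ~90 Mb/s video payload (from the post)
print(f"compression ratio: {sdi_rate / digibeta_video:.1f}x")  # 3.0x
```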
Originally Posted by edDV
Originally Posted by edDV
the 720x576 16:9 i mentioned was something i found on the net; i don't know what the source was, but i would guess it was a commercial blu-ray. be that as it may, it's still one of the finest transcodes i have ever come across, especially considering the final resolution.
part of me wonders if it was done using a high end hardware encoder... -
Originally Posted by PuzZLeR
-
Originally Posted by deadrats
-
Originally Posted by deadrats
@edDV - Where do the "red" cameras (www.red.com) stand in the hierarchy of professional cameras? Or are they just "toys" at this point? Their specs on paper are amazing, at least to my amateur eyes.
-
Originally Posted by deadrats
"LCD TV 32" 1080i/720p" is most likely 1366x768 native with low end deinterlace.
"19"LCD computer monitor" would likely be 1280x1080 native with processing by display card.
Originally Posted by deadrats
DirecTV and Dish MPEG-2 HD are highly compressed (8-16 Mb/s) compared to production source (144-880 Mb/s).
Your monitors probably don't allow full quality monitoring of BluRay.
Downloads come without history. Who knows what it is? -
Originally Posted by poisondeathray
We will know when they can compare/test Red vs. Film or HDCAM-SR/Viper. -
Originally Posted by jagabo -
Originally Posted by edDV
-
28,000 x 9334 isn't even listed in this graphic; it probably wouldn't fit, LOL. I found this on Wikipedia. There were some UHDV monitors demoing at a trade show in Japan(?) earlier this year.
-
Assume hype until tested. 8Kx4K is the next step for film production. Today we have 2Kx1K or 4Kx2K.
Originally Posted by jagabo
i played around a bit seeing how much less bitrate i could use and still maintain quality; a max of 10 Mb/s still gave great results, though the mpeg-2 encode at that setting started to suffer just a bit, and to my surprise the divx encode was indistinguishable from the 20 Mb/s encode.
i also did 2 test encodes, one using apple's h264 and one using nero's digital avc, and much to my surprise the results of both sucked: the apple codec resulted in colors that were "pale", almost washed out, and the nero avc codec produced some very noticeable blocking.
really surprised by these 2; i have had some really good results with both of those codecs, so this was disappointing. vc-1 likewise left much to be desired.
i wonder if maybe the fact that i use ffdshow to decode all video and audio files may be screwing things up, like the codecs themselves may be producing fine quality encodes but ffdshow may be decoding the streams improperly.
i'm going to have to look into that, maybe upgrade to the newest version and try again... -
Originally Posted by deadrats
-
evening everyone.
- High action sports or hand held reality will look better at 1280x720p/59.94fps
Still, the problem with all these shows, all program stories/movies/etc., is that they starve for bitrate, and the perceived quality is subjective from person to person. How does the old saying go? ..oh yeah, "the bigger they are, the harder they fall", something like that. IOW, on a larger tv screen you will see the artifacts much more pronounced, except on those tv sets that have the *better* image processor that *better* hides them; or, on a smaller tv set, they can be hidden by the viewer's perceived distance.
480 vs. 720 vs. 1080? Well, in terms of bitrate (and codec format) I haven't done the math to arrive at a compromise encoding setup.
The problem with this (OP's) idea has to do with the source, and what is considered official, in terms of many factors, such as:
codec; format; bitrate; resolution; content (ie, movie vs. interlace vs. a mixture); the amount of video artifacts already present in the source and factored in; for example: H264 vs. MPEG-2 and the difference in their bitrate requirements vs. the amount of compression each of these formats entails, and so on and so forth.
The other factor to consider is whether a given video source (as seen by the OP for instance) was a hand-made version for demonstration purposes. Some houses will incorporate specialized encoding of the video, and/or the bitrate could have just been jacked sky-high for that demo. I mean, I've seen some demos that I wish I could get my hands on (to see how it was put together, or to peek at the bitrate in case they were cheating; they do that sometimes), but ask a salesman for it and they give you some lame story or something.
...
"19"LCD computer monitor" would likely be 1280x1080 native with processing by display card.
* thanks to jagabo (snipped from another thread topic)
* nice small utility: http://majorgeeks.com/download.php?det=960
You should see alternating thin horizontal black and white lines on the left and vertical lines on the right. You may or may not be able to distinguish the checkerboard pattern in the middle; it might appear as gray. But you should not see any moiré or flickering in that image if the system is set up properly.
The quality of the analog section of monitors varies. Some models skimp on the VGA section assuming you'll be using DVI. You will see a larger difference between the two in that case.
Run the monitor test program from this site:
http://majorgeeks.com/download.php?det=960
When set to any other resolution, ie 1280x1024, that image was distorted: lots of moiré-type patterns of pixels. So, that was a good (cheap) basic way of testing your monitor's best resolution setting, and I keep mine at its highest (manufacturer recommended) anyway.
...
Encoding scenarios vary from source to source, and the bitrate (how much is required) depends on the scene and how complex it is to encode at the given bitrate, or its strategy for that matter. While you might get away with a nice low bitrate in one scene, you might not in another. Also, it's strange how some encoders (and codecs for that matter) handle bitrate over long runs. What I mean is: while one scene encodes fairly well with a given bitrate strategy, continuing on in the video, the next few scenes could upset the final balance of the video's bitrate distribution pattern. I used to agonize over this phenomenon in my early days of MPEG encoding, so I would guess that it's no different in other codec formats, such as H264 / AVC / VC-1, or whatever you call these incarnations. And, for encoding, x264 cli is probably the single best encoder out there..always evolving. So don't miss out. Requires command string preparation, etc., etc.
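The point about one scene upsetting the bitrate distribution of later scenes is essentially what two-pass encoding addresses: measure every scene's complexity first, then divide the fixed bit budget in proportion. A toy sketch of that allocation step only (the complexity scores and budget are made-up illustration values, not anything a real encoder produced):

```python
# Toy two-pass bit allocation: split a total budget across scenes in
# proportion to their measured complexity, so an easy opening scene
# doesn't starve a hard scene later in the file.
def allocate_bits(complexities, total_bits):
    """Return per-scene bit allocations proportional to complexity."""
    total = sum(complexities)
    return [total_bits * c / total for c in complexities]

scene_complexity = [1.0, 4.0, 2.0, 8.0]   # hypothetical first-pass scores
budget = 15_000_000                        # e.g. 15 Mb for the whole clip
for i, bits in enumerate(allocate_bits(scene_complexity, budget)):
    print(f"scene {i}: {bits / 1e6:.1f} Mb")
```

A single-pass encoder has to guess at this distribution as it goes, which is why a long run of complex scenes can blow the balance; the first pass removes the guesswork.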
...
i wonder if maybe the fact that i use ffdshow to decode all video and audio files may be screwing things up, like the codecs themselves may be producing fine quality encodes but ffdshow may be decoding the streams improperly.
-vhelp 4948