I have some familiarity with 480i30 bitrates vs quality. But how do these compare to HD?
If, for example, you have a 480i30 video encoded in real time (such as from a camera), roughly how many times that bitrate is generally needed to get the same quality, in the same codec/format also recorded in real time, for 720p60, 1080i30, and 1080p60 respectively? (Assuming the content is the same.) What would the general guidelines be?
If there is a previous post, faq, etc. that would give comparison guidelines, then please point me to it.
Do some test encodes using CRF (same source, same codec, same CRF value at each target resolution/frame rate) and make note of the resulting file sizes/bit rates. The ratios between them will tell you what your content actually needs.
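A minimal sketch of that test, assuming ffmpeg/ffprobe are on the PATH and x264 is the codec under test; the file names, CRF value, and target list are illustrative, not from the thread. It encodes one source at a fixed CRF for several progressive targets (true interlaced 1080i30 output would need extra x264 interlacing flags and is left out here) and prints each result's average bit rate relative to the 480p30 encode:

```python
# Hedged sketch: CRF test encodes at several resolutions/frame rates,
# then compare the resulting average bit rates.
import subprocess
import os

SOURCE = "source.mpg"   # hypothetical input file
CRF = "20"              # fixed quality level for every encode
TARGETS = {
    # label: (scale filter, frame rate) -- progressive stand-ins
    "480p30":  ("scale=720:480",   "30"),
    "720p60":  ("scale=1280:720",  "60"),
    "1080p30": ("scale=1920:1080", "30"),
    "1080p60": ("scale=1920:1080", "60"),
}

def duration_seconds(path):
    """Ask ffprobe for the container duration in seconds."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

results = {}
for label, (scale, fps) in TARGETS.items():
    out_file = f"test_{label}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-vf", f"{scale},fps={fps}",
         "-c:v", "libx264", "-preset", "medium", "-crf", CRF,
         "-an", out_file],
        check=True)
    # average bit rate in kbit/s = 8 * bytes / seconds / 1000
    kbps = 8 * os.path.getsize(out_file) / duration_seconds(out_file) / 1000
    results[label] = kbps

base = results["480p30"]
for label, kbps in results.items():
    print(f"{label}: {kbps:8.0f} kbit/s  ({kbps / base:.2f}x the 480p30 rate)")
```

Because CRF holds quality roughly constant, the printed ratios are a direct answer to "how many times the bitrate" for your particular content; expect different ratios for different material (grain, motion, detail), which is why a test on your own footage beats any fixed rule of thumb.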