Hi,
I've just got into ripping my DVDs to make CD copies in the last week or so and I'm not sure which standard to use, VCD or SVCD (my DVD player will happily take either). All of the articles I have read indicate that SVCD has higher picture quality, but presumably that comes from a higher bitrate, at the expense of more CDs per movie. For convenience's sake I'd like to use only 2 CDs per movie (except for really long movies).
My specific questions are:
1. Is there an appreciable difference between VCD and SVCD at the same average bitrate? Is VCD better for low bitrates (say between 650 and 1150 kbit/s)?
2. What is the relationship between resolution and bitrate? Does a higher resolution at a given bitrate mean less information for each frame displayed? Does this then make VCD (at lower resolutions) more attractive for cramming more onto a CD at a lower bitrate?
Thanks in advance for your help.
Richard.
-
A VCD has a fixed video bitrate of 1150 kbit/s, no more and no less. An SVCD has a maximum video bitrate of 2520 kbit/s.
How do you define video quality? Do you define it by picture resolution? Do you define it by compression artifacts? Motion artifacts?
A 704x480 image is twice as sharp and has twice the detail of a 352x240 image. Since the image is that much larger, it needs about twice the data rate to maintain the same image quality.
Videos at 352x240 and 704x480 with the same bitrate will have different amounts of compression artifacts. The 352x240 video will have about half as many compression artifacts as the 704x480 video. However, the 704x480 video will be twice as sharp, but with far more artifacts. So pick what is important to you: high image detail with lots of artifacts, or lower resolution with fewer artifacts. -
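To put rough numbers on that trade-off, here is a back-of-the-envelope bits-per-pixel comparison at the VCD bitrate. It's only a sketch: the 29.97 fps NTSC frame rate and the resolutions are illustrative assumptions, not a quality model.
Code:
FPS = 29.97  # NTSC frame rate

def bits_per_pixel(bitrate_kbps, width, height, fps=FPS):
    # Average bits available per pixel per frame, before any compression smarts.
    return (bitrate_kbps * 1000.0) / (width * height * fps)

for width, height, label in [(352, 240, "VCD"), (480, 480, "SVCD"), (704, 480, "full D1")]:
    bpp = bits_per_pixel(1150, width, height)
    print(f"{label:8s} {width}x{height} at 1150 kbit/s: {bpp:.3f} bits per pixel")
The 704x480 frame has four times the pixels of 352x240, so at the same bitrate each pixel gets only a quarter of the bits, which is where the extra artifacts come from.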
Don't forget the VCD 2.0 standard uses only CBR allocation, which gives the same bitrate to every frame (horrible).
SVCD allows VBR, which allocates more bitrate to scenes that need it (e.g. action) and less to scenes that don't (e.g. slow scenes or credits).
So, basically, VBR is far superior because it takes bitrate from slow scenes and gives it to fast ones. -
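For illustration, here is a toy sketch of how a VBR-style allocation shifts a fixed bit budget toward complex scenes while CBR splits it evenly. The scene names and complexity scores are invented purely for the example; a real two-pass encoder measures complexity in its first pass.
Code:
# Toy comparison of CBR vs VBR bit allocation over three equally long scenes.
scenes = {"credits": 1.0, "dialogue": 2.0, "car chase": 6.0}  # made-up complexity scores
total_budget_kbit = 2000 * 60  # e.g. a 2000 kbit/s average over a 60-second clip

cbr_share = total_budget_kbit / len(scenes)   # CBR: every scene gets an equal share
complexity_sum = sum(scenes.values())

for name, complexity in scenes.items():
    vbr_share = total_budget_kbit * complexity / complexity_sum  # VBR: weighted by need
    print(f"{name:9s}  CBR: {cbr_share:7.0f} kbit   VBR: {vbr_share:7.0f} kbit")
The total is the same either way; VBR just spends it where the picture needs it.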
A 704x480 image is twice as sharp and has twice the detail of a 352x240 image. Since the image is that much larger, it needs about twice the data rate to maintain the same image quality.
/Mats -
Originally Posted by mats.hogberg
Given the same bitrate, lower resolution gives each pixel more bits, but lower resolution = softer (fuzzier) picture.
Higher resolution gives each pixel fewer bits = blockier, but higher resolution = sharper. -
My advice to you is to use SVCD. I tried VCD a couple of times and every time I was very disappointed by the softness of the picture. It looks like my TV is about 20 years old. With SVCD you can approach DVD sharpness; you have to use at least 2 discs, but they cost nothing these days.
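If the disc count is what decides it for you, a quick capacity estimate helps. A minimal sketch, assuming an 80-minute CD holds roughly 800 MB of MPEG data in VCD/SVCD mode-2 recording and a nominal 224 kbit/s audio track (both figures are rough assumptions; real overhead varies a little):
Code:
import math

CD_CAPACITY_BITS = 800 * 1024 * 1024 * 8   # ~800 MB usable on an 80-minute CD (mode 2)
AUDIO_KBPS = 224                           # typical MPEG audio rate for VCD/SVCD

def discs_needed(video_kbps, movie_minutes):
    # Total stream size in bits divided by per-disc capacity, rounded up.
    total_bits = (video_kbps + AUDIO_KBPS) * 1000 * movie_minutes * 60
    return math.ceil(total_bits / CD_CAPACITY_BITS)

for video_kbps in (1150, 1600, 2000, 2520):
    print(f"{video_kbps} kbit/s video -> {discs_needed(video_kbps, 120)} disc(s) for a 120-minute movie")
By that rough estimate, a two-hour movie fits on two discs at VCD's 1150 kbit/s, and an SVCD averaging around 1600 kbit/s video or less still squeezes onto two 80-minute discs.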