VideoHelp Forum
  1. Member (Join Date: Dec 2002, Location: New Zealand)
    Hi,

    I've just got into ripping my DVDs to make CD copies in the last week or so, and I'm not sure which standard to use, VCD or SVCD (my DVD player will happily take either). All of the articles I've read indicate that SVCD has higher picture quality, but presumably that comes from a higher bitrate, at the expense of more CDs per movie. For convenience's sake I'd like to use only 2 CDs per movie (except for really long movies).

    My specific questions are:

    1. Is there an appreciable difference between VCD and SVCD at the same average bitrate? Is VCD better at low bitrates (say between 650 and 1150 kbit/s)?

    2. What is the relationship between resolution and bitrate? Does a higher resolution at a given bitrate mean less information for each frame displayed? Does this then make VCD (at lower resolutions) more attractive for cramming more onto a CD at a lower bitrate?

    Thanks in advance for your help.


    Richard.
  2. A VCD has a fixed video bitrate of 1150 kbit/s, no more and no less. An SVCD has a maximum video bitrate of 2520 kbit/s.

    How do you define video quality? Do you define it by picture resolution? Do you define it by compression artifacts? Motion artifacts?

    A 704x480 image is twice as sharp and has twice the amount of detail as a 352x240 image. Since the image is twice as large, it needs about twice the data rate to maintain image quality.

    Two videos, one at 352x240 and one at 704x480, encoded at the same bitrate will show different amounts of compression artifacts: the 352x240 video will have about half the compression artifacts, while the 704x480 video will be twice as sharp but much blockier. So pick what is important to you: high image detail with lots of artifacts, or lower resolution with fewer artifacts.
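
    To put rough numbers on that, here is a quick back-of-the-envelope sketch (Python, purely illustrative): it divides the video bitrate by the frame rate to get a per-frame bit budget. The 29.97 fps NTSC frame rate is an assumption, and audio and MPEG's uneven spending across I/P/B frames are ignored.
    [code]
    # Back-of-the-envelope per-frame bit budget (ignores audio and the
    # fact that MPEG spends different amounts on I/P/B frames).
    FPS = 29.97  # NTSC frame rate (assumption)

    for label, video_kbps in [("VCD  @ 1150 kbit/s", 1150),
                              ("SVCD @ 2520 kbit/s", 2520)]:
        print(f"{label}: {video_kbps / FPS:.1f} kbit per frame")
    # VCD  @ 1150 kbit/s: 38.4 kbit per frame
    # SVCD @ 2520 kbit/s: 84.1 kbit per frame
    [/code]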
  3. Don't forget that the VCD 2.0 standard allows only CBR encoding, which allocates the same bitrate to every frame (horrible).

    SVCD allows VBR, which allocates more bitrate to scenes that need it (e.g. action) and less to scenes that don't (e.g. slow scenes or credits).

    So basically, VBR is far superior because it takes bitrate from slow scenes and gives it to fast ones.
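
    As a toy illustration of the difference (not how a real encoder's rate control works), here is a sketch that splits one fixed bit budget across scenes: CBR gives every scene an equal share, VBR weights the shares by complexity. The scene names, weights, and budget are all made up for the example.
    [code]
    # Toy CBR vs VBR split of a fixed bit budget across scenes.
    # Scene weights and the total budget are invented for illustration.
    scenes = {"credits": 1.0, "dialogue": 2.0, "car chase": 5.0}
    total_kbit = 8000
    cbr_share = total_kbit / len(scenes)  # CBR: equal share, need it or not
    weight_sum = sum(scenes.values())

    for name, weight in scenes.items():
        vbr_share = total_kbit * weight / weight_sum  # VBR: share follows complexity
        print(f"{name:9s}  CBR {cbr_share:5.0f} kbit | VBR {vbr_share:5.0f} kbit")
    # credits    CBR  2667 kbit | VBR  1000 kbit
    # dialogue   CBR  2667 kbit | VBR  2000 kbit
    # car chase  CBR  2667 kbit | VBR  5000 kbit
    [/code]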
  4. Member mats.hogberg (Join Date: Jul 2002, Location: Sweden (PAL))
    Originally Posted by the poster above
    A 704x480 image is twice as sharp and has twice the amount of detail as a 352x240 image. Since the image is twice as large, it needs about twice the data rate to maintain image quality.
    I'll agree that it's twice as sharp and has twice the amount of detail, but since it's made up of 337920 pixels, compared to 84480 for 352x240, it ought to take 4 times the data rate to retain image quality, right?

    /Mats
    Quote Quote  
  5. Originally Posted by mats.hogberg
    I'll agree that it's twice as sharp and has twice the amount of detail, but since it's made up of 337920 pixels, compared to 84480 for 352x240, it ought to take 4 times the data rate to retain image quality, right?
    It's always a battle between blockiness and softness (fuzziness).

    Given the same bitrate, a lower resolution gives each pixel more bits, but lower resolution = softer (fuzzier).

    A higher resolution gives each pixel fewer bits = blockier, but higher resolution = sharper.
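
    To put numbers on that tradeoff, this sketch computes the per-pixel bit budget at one fixed bitrate for both resolutions (again assuming 29.97 fps and ignoring audio); it also shows the 4x pixel count from the previous post.
    [code]
    # Bits per pixel per frame at the same video bitrate.
    FPS = 29.97        # NTSC frame rate (assumption)
    video_kbps = 1150  # same budget for both resolutions

    for w, h in [(352, 240), (704, 480)]:
        pixels = w * h
        bpp = video_kbps * 1000 / FPS / pixels
        print(f"{w}x{h}: {pixels:6d} pixels, {bpp:.3f} bit/pixel")
    # 352x240:  84480 pixels, 0.454 bit/pixel
    # 704x480: 337920 pixels, 0.114 bit/pixel  (4x the pixels, 1/4 the budget)
    [/code]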
  6. My advice to you is to use SVCD. I tried VCD a couple of times and every time I was very disappointed by the softness of the picture; it looks like my TV is about 20 years old. With SVCD you can approach DVD sharpness. You do have to use at least 2 discs, but they cost nothing these days.
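
    For the 2-disc plan, the achievable average bitrate falls out of simple arithmetic. This sketch assumes roughly 800 MB of usable space per 80-minute disc in VCD/SVCD (Mode 2) format and a 224 kbit/s MP2 audio track; the 2-hour runtime is just an example, and both figures are ballpark assumptions, not exact specs.
    [code]
    # Average video bitrate for a movie spread over two (S)VCDs.
    runtime_s = 120 * 60      # example: 2-hour movie
    discs = 2
    disc_bytes = 800 * 10**6  # ~800 MB usable per 80-min disc (assumption)
    audio_kbps = 224          # typical MP2 audio bitrate (assumption)

    total_kbit = discs * disc_bytes * 8 / 1000
    avg_video_kbps = total_kbit / runtime_s - audio_kbps
    print(f"average video bitrate: {avg_video_kbps:.0f} kbit/s")
    # average video bitrate: 1554 kbit/s -- comfortably within SVCD's limit
    [/code]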