VideoHelp Forum
  1. Not sure if I'm doing something wrong here, but I burned two discs of an old movie, one a standard SVCD and one an XVCD, and the XVCD had a much clearer picture even though its bitrate was lower than the SVCD's. The XVCD had all the normal settings of a VCD (352x240) except that the bitrate was 2200 kbps. The SVCD bitrate was 2520 kbps, which is much higher, so I expected a cleaner image, but what I got was closer to a low-bitrate file in image quality. Did the 480x480 resolution of the SVCD cause this, and how can it be countered? Thanks.
  2. Three things:

    1. MPEG-2 (SVCD) is optimized for higher bitrates; MPEG-1 (VCD) gives better results at low resolutions and bitrates.

    2. Were the settings otherwise identical (e.g. CBR or VBR encoding, how many passes)?

    3. What do you mean by "clearer picture"? At those bitrates the VCD would have fewer artifacts but would be blurrier overall. The SVCD would be sharper, but perhaps show more artifacts.
  3. ZippyP.:
    Depends on the source. If it was low-res to begin with, then blowing it up to 480x480 would make it look worse. The point is that you need to match the output resolution to the source, since you can't just "create" resolution that isn't there. So what was the resolution of the source?
    "Art is making something out of nothing and selling it." - Frank Zappa
    Quote Quote  
  4. Naturally, if you have 2200 kbps available to encode 352x240, it should always result in a better quality picture than if you have 2520 kbps available to cover 480x480.

    The other factor in this is your TV's ability to resize up on the fly. I have found that most TVs do a pretty fair job of resizing up, better than any encoder can.
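
    To put rough numbers on that (assuming NTSC at 29.97 fps): 2200 kbps spread over 352x240 works out to roughly 2,200,000 / (352 x 240 x 29.97) ≈ 0.87 bits per pixel, while 2520 kbps over 480x480 is only about 2,520,000 / (480 x 480 x 29.97) ≈ 0.37 bits per pixel. By that back-of-the-envelope measure, the XVCD has well over twice the bits available per pixel.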
  5. Originally Posted by scottie78
    The other factor in this is your TV's ability to resize up on the fly. I have found that most TVs do a pretty fair job of resizing up, better than any encoder can.
    Just to be accurate, the DVD player does the resizing, not the TV.
  6. Originally Posted by junkmalle
    Originally Posted by scottie78
    The other factor in this is your TV's ability to resize up on the fly. I have found that most TVs do a pretty fair job of resizing up, better than any encoder can.
    Just to be accurate, the DVD player does the resizing, not the TV.
    Thanks, junkmalle

    I wouldn't want to be inaccurate.
  7. ZippyP.:
    Originally Posted by scottie78
    Naturally, if you have 2200 kbps available to encode 352x240, it should always result in a better quality picture than if you have 2520 kbps available to cover 480x480.
    Sorry, but I have to disagree with the above statement, especially the "always".

    If you have a good quality source, then 2520 kbps is more than enough bitrate for good quality at 480x480; after all, it is the SVCD maximum. The VCD (352x240/288) maximum is 1150 kbps; going higher (like 2200) makes it look better up to a point, beyond which it makes no difference.

    Assuming a good quality, relatively noise-free, high-resolution source, the SVCD at 2520 kbps will always look better than any XVCD at any bitrate.

    However, noisy low-res video cannot be improved by using SVCD resolution. Noise and lots of motion (like camcorder footage) eat up bitrate, leaving the video bit-starved and requiring a much higher bitrate to look good...or a lower resolution if bitrate is limited.

    So...it all depends on the source.
    "Art is making something out of nothing and selling it." - Frank Zappa
    Quote Quote  
  8. Let's see. The source was a VHS tape that I captured to an AVI file at a resolution of 480x480. When converted to SVCD, it has more artifacts than the XVCD, although it does look sharper. The AVI source looks very good, so my guess is the higher resolution is to blame, but if so, then every SVCD would have artifacts that make it look less clear than even VHS. Am I missing something?
  9. ZippyP.:
    Originally Posted by perdomot
    The source was a VHS tape...
    Noisy source. If you were able to raise the bitrate, I'm sure it would improve the quality. You can try a noise filter; TMPGEnc has one, but it will increase encoding times. You could also try 352x480 resolution (called CVD or 1/2 DVD) and you'll probably see an improvement as well.

    2520 kbps is a high enough bitrate for SVCD when the source is clean, from a DVD for instance, but in your case it apparently is too low.
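
    Running the same back-of-the-envelope math as earlier in the thread, CVD's 352x480 frame at 2520 kbps gives roughly 2,520,000 / (352 x 480 x 29.97) ≈ 0.50 bits per pixel, versus about 0.37 at 480x480, a meaningful bump without dropping all the way down to VCD resolution.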
    "Art is making something out of nothing and selling it." - Frank Zappa
    Quote Quote  
  10. Zippy, are all VHS sources considered noisy? The AVI looks crystal clear and pretty sharp. The XVCD also looks excellent, if not quite as sharp.
  11. ZippyP.:
    VHS capture is going to be noisy, and some of that noise isn't detectable by eye. A filter will smooth it out and let the bitrate go a lot further. Give it a try.
    "Art is making something out of nothing and selling it." - Frank Zappa
    Quote Quote  
  12. Which filter would you recommend, Zippy? I mostly use AviSynth for editing.
  13. ZippyP.:
    Originally Posted by perdomot
    Which filter would you recommend Zippy?
    I'll pass this off to others more experienced in this area; try this thread.
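
    As a rough starting point, though, one simple AviSynth chain that's often suggested for noisy VHS captures looks something like this (the filename and filter settings are placeholders; tune them for your own capture):

        AviSource("capture.avi")        # your 480x480 VHS capture (placeholder name)
        ConvertToYV12()                 # convert to YV12 so the filters run in YUV
        TemporalSoften(4, 4, 8, 15, 2)  # built-in temporal smoother to tame VHS noise
        BilinearResize(352, 480)        # optional: the CVD / 1/2 DVD frame size mentioned earlier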

    Good luck.
    "Art is making something out of nothing and selling it." - Frank Zappa
    Quote Quote  


