VideoHelp Forum
  1. I've been in the process of ripping and converting a few of my Blu-rays and DVDs, and have only just now noticed on my 40" Sony LCD TV that there are some fairly noticeable artifacts, especially on fading scenes. For example, at the very beginning of "Real Steel", where it fades from the Touchstone Pictures logo to the DreamWorks logo, there are a lot of artifacts. I can't see them on either my laptop's 15" 1080p screen or my desktop's 24" 1920x1200 monitor.

    I don't know if it's an issue with the connection between my PC and TV, but I've noticed that even when viewing the ripped m2ts file there's a tiny bit of noticeable artifacting, while watching the Blu-ray directly on my BD player there's none to be seen.

    I've been encoding the videos with VidCoder using the High profile for x264, with a CRF of 20. I'm playing the videos on my laptop through an HDMI cable to my TV, and using Windows 7's Media Center as my primary method of viewing videos.

    Are there any optimal settings I could try that would keep the end file at a sane size?

    As an afterthought, I wonder if it might be an issue with levels on my TV display? I read a thread just now where somebody was complaining about videos being "washed out", but I can't seem to find anything I'm allowed to modify on my TV. A lot of settings are locked out, as it says they're not available on a PC input, and I can't seem to fool the TV into thinking it's anything BUT a PC input. FWIW, I have an ATI video card in my laptop.
    Last edited by agent154; 2nd Apr 2012 at 22:06.
  2. Baldrick (MEGA Super Moderator, Sweden)
    Have you tried a different video player?

    And you can try adjusting the ATI color/level settings in the Catalyst Control Center, under the video tab.
  3. Banned member (Freedonia)
    Try Handbrake instead of VidCoder and see if you like the results better. The artifacts you see on your rips may be because your laptop is simply not powerful enough. The act of ripping does not introduce flaws into videos.

    I know that some HDTVs deliberately operate in a kind of degraded mode when being used as monitors and it could be that Sony models do this.
  4. wulf109 (United States)
    You can set the output size to anything you want in VidCoder: 1 GB, 4 GB, etc. Maybe you're just using too low a quality encode. Set your output size, check 2-pass, and let VidCoder decide the encode method.
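    If you want to sanity-check what a given size target works out to, the arithmetic behind a 2-pass size target is just the total bits spread over the running time, minus whatever the audio takes. A rough sketch in Python (the 4 GB / 2 hour / 640 kbps figures are made-up example values, not anything VidCoder reports):
    Code:
    def target_video_bitrate_kbps(target_size_gb, duration_min, audio_kbps=640):
        """Rough video bitrate a 2-pass encode must average to hit a size target."""
        total_kbits = target_size_gb * 8 * 1024 * 1024   # GB -> kilobits
        total_kbps = total_kbits / (duration_min * 60)    # spread over the running time
        return total_kbps - audio_kbps                    # leave room for the audio track

    # Example: a 2-hour movie squeezed into 4 GB alongside 640 kbps audio
    print(round(target_video_bitrate_kbps(4, 120)))       # ~4020 kbps left for the video

    CRF mode, which the OP is using, skips this calculation entirely: it holds quality roughly constant and lets the file size land wherever it lands.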
  5. Just to be sure we're clear: you *don't* see the artifacting when playing from your standalone player to the TV? Is that right? I think you should look at other factors besides your encoding.

    Personally, I use a CRF setting of 20 to make MKVs for viewing on a 65" 1080p set and see no artifacting. You gotta go pretty low on the bitrate before you see blatant artifacting with x264, and I mean artifacting like you'd see with low-bitrate MPEG-2. Of course there's a quality hit with any re-encode, and my MKVs are *not* perceptually identical to the originals.
    Pull! Bang! Darn!
  6. edDV (Northern California, USA)
    I see noticeable degradation at CQ 20 with x264. That's adequate for TV show renders, but IMO not for movies*. Try CQ 17 or 18.

    Artifacting points to inadequate bitrate or inadequate CPU/GPU decode power. Play the file on your Blu-ray player to separate the two issues.

    Levels issues point to your display card or HDTV calibration. A simple x264 encode should not affect levels.


    * Movies tend to have lower overall luma levels, with crucial detail in the dark grays; these conditions expose artifacts. Movies are intended for viewing in dark cinema settings, while TV shows are exposed with the expectation that they will be viewed in bright rooms on TV sets.
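    If you want to compare CQ values without re-encoding a whole movie, encoding a short sample from the command line is the quickest way to judge. A rough sketch driving ffmpeg's libx264 from Python (the file names and the 60-second window are placeholders; point it at a fade that shows the problem):
    Code:
    import subprocess

    # Encode a one-minute sample at CRF 18 so it can be compared on the TV
    # against the CRF 20 version.  "source.m2ts" and "sample_crf18.mkv" are
    # placeholder names.
    subprocess.run([
        "ffmpeg",
        "-ss", "0", "-t", "60",      # take the first 60 seconds
        "-i", "source.m2ts",
        "-c:v", "libx264",
        "-preset", "slow",
        "-crf", "18",                # lower CRF = higher quality, bigger file
        "-c:a", "copy",              # leave the audio untouched
        "sample_crf18.mkv",
    ], check=True)

    As far as I know, VidCoder's CQ value maps onto the same x264 CRF scale, so a sample that looks clean at CRF 18 here should look about the same at CQ 18 there.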
    Last edited by edDV; 4th Apr 2012 at 11:29.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  7. Originally Posted by agent154
    I don't know if it's an issue with the connection between my PC and TV, but I've noticed that even when viewing the ripped m2ts file there's a tiny bit of noticeable artifacting, while watching the Blu-ray directly on my BD player there's none to be seen.
    If the encodes don't look noticeably different from the original files when played on your TV, i.e. you can see artifacts either way, then yes, it's likely to be a levels issue. 90% of the threads I've been involved in where encoding artifacts were an issue have turned out to be caused by sending the TV the wrong levels. And as you say, you can't see the artifacts on a PC monitor...

    Encoding the video can indeed seem to produce unnecessary artifacts (blocking etc.) if you're playing the video using the wrong levels, as to the best of my knowledge the x264 encoder compresses darker areas more than brighter areas. I've encoded a lot of Blu-ray discs at 720p using a CRF value of 20 and they look fine on my 51" plasma.

    If you're connected via HDMI, you may be able to change the expected input level. My TV has a dedicated PC/HDMI input and it's the only one which lets me change the levels. My choices are "normal" or "low", but on other sets they could be labelled differently, such as limited range or full range. In my case it's under the advanced picture settings.
    Alternatively, your video card's control panel may have a setting for automatically expanding the levels of the video if you're using an input on your TV which expects PC (full range) levels rather than TV levels. I'm using a desktop PC, but my Nvidia card lets me specify the video levels for each monitor/TV connected to it.

    Keep in mind it's also possible to send the wrong levels to the TV and make the picture look too dark rather than washed out. One way to determine whether the levels are correct is to play a video containing black bars, as they should be, errr.... black (don't use a video for testing where the black bars have been cropped and are being added back on playback). If the black bars look dark grey, then you need to change the levels so they're black. If you change the levels and the black bars don't get any blacker while the picture gets darker, then the levels were no doubt already correct. If you decide the levels were incorrect and change them, you might have to re-adjust your TV's brightness and contrast etc. There's a rough sketch of the numbers behind all this at the end of this post.

    It's very easy to send a TV the wrong levels and not notice, as no two videos have the same brightness/contrast (I definitely wouldn't be making any movie/TV distinction in that respect). At one stage my video card decided to revert to TV levels after a reboot (I'd set the TV to expect PC levels), so while in theory that could have caused the video to look "washed out", it was probably days or even weeks before I noticed.
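    For what it's worth, the numbers behind all this levels talk: video is normally stored in limited/TV range, where black is 16 and white is 235, while a PC (full range) input expects black at 0 and white at 255. If nothing in the chain expands the range, video black gets displayed at 16/255 brightness, which is exactly the dark grey bars described above. A rough sketch of the 8-bit luma conversion (a simplification that ignores chroma and gamma):
    Code:
    def tv_to_pc(y):
        """Expand limited/TV-range luma (16-235) to full/PC range (0-255)."""
        return max(0, min(255, round((y - 16) * 255 / 219)))

    print(tv_to_pc(16))    # 0    -> video black becomes true black once expanded
    print(tv_to_pc(235))   # 255  -> video white becomes true white
    print(16 / 255)        # ~0.063 -> unexpanded video black sits at ~6% grey: the "washed out" look
    print(tv_to_pc(10))    # 0    -> expanding video that's already full range crushes shadow detail: the "too dark" look

    So if you grab a frame and the letterbox pixels read around 16 rather than 0, something in the chain is passing TV levels straight through to an input that expects PC levels.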
    Last edited by hello_hello; 5th Apr 2012 at 18:28.


