VideoHelp Forum




  1. Hi guys!

    As you can see in the screenshot below, my x265 encodes have color issues like this.

    Can anyone figure out what is causing them?



    Source : DVD9
    Filters I used in this encode : asharp, aWarpSharp, MSharpen, MSmooth, Tweak


    Please let me know if any more details are needed.


    Regards,

    Evad3R
  2. Originally Posted by anana View Post
    bitrate?
    Complete media info
    Run time : 2h 50mn
    Resolution : 704*304
    Frame rate : 25.000 fps
    Size : 1.44 GiB
    Aspect Ratio : 2.35:1
    Video Codec : x265/HEVC
    Video Bit rate : 967 Kbps
    Audio Info : AC-3 [6CH]
    Audio Language : Tamil
    Subtitles : English
    Chapters : Yes
    Container : MKV
    The video bitrate is the problem.
    Try re-encoding the source (DVD) with a bitrate of around 3000 Kbps.
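    For example, a two-pass encode at that bitrate with ffmpeg's libx265 looks roughly like this (just a sketch; the file names are placeholders, so check the flags against your own build):

      # pass 1 (analysis only, no output file)
      ffmpeg -i source.vob -c:v libx265 -b:v 3000k -x265-params pass=1 -an -f null /dev/null
      # pass 2 (final encode, audio copied as-is)
      ffmpeg -i source.vob -c:v libx265 -b:v 3000k -x265-params pass=2 -c:a copy output.mkv

    (On Windows, use NUL instead of /dev/null for the first pass.)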
  4. Originally Posted by anana View Post
    The video bitrate is the problem.
    Try re-encoding the source (DVD) with a bitrate of around 3000 Kbps.
    Does that mean x265 doesn't support bitrates below 3000 Kbps?
    x265 absolutely supports bitrates below 3000 Kbps...
    but a video bitrate of 967 Kbps is too low to get a nice-looking result.
  6. This was encoded by one of my friends.



    Media Info
    Movie Length ..............: 02:40:34
    File size .................: 699 MB
    File Container ............: MKV x265
    Frame Rate ................: 23.97 fps
    Resolution ................: 688*288
    Video Bitrate .............: 523 Kbps
    Audio Language ............: Tamil
    Audio Format ..............: AAC 6CH
    Audio Bitrate .............: 96 Kbps
    Subtitle ..................: English [SRT]

    This video's bitrate is much lower than mine, yet it doesn't have the color issue I'm facing.

    Any ideas?
  7. Some dark areas don't look good when the bitrate is too low.
    If you want a lower bitrate, try to find the best bitrate for your video...
    Last edited by anana; 5th Mar 2015 at 04:25.
  8. Originally Posted by anana View Post
    Some dark areas don't look good when the bitrate is too low.
    If you want a lower bitrate, try to find the best bitrate for your video...
    How do I find that out?
  9. try HandBrake's Preview
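    If you would rather test from the command line, HandBrakeCLI can encode a short sample at a few trial bitrates (a rough sketch; the title number, start point, and file names are placeholders, so check the options against HandBrakeCLI --help):

      # 60-second sample starting 10 minutes in, at two trial bitrates
      HandBrakeCLI -i VIDEO_TS -t 1 --start-at duration:600 --stop-at duration:60 -e x265 -b 1500 -o sample_1500.mkv
      HandBrakeCLI -i VIDEO_TS -t 1 --start-at duration:600 --stop-at duration:60 -e x265 -b 2500 -o sample_2500.mkv

    Then compare the samples and pick the lowest bitrate that still looks acceptable to you.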
    Last edited by anana; 5th Mar 2015 at 04:49.
  10. Originally Posted by anana View Post
    try HandBrake's Preview
    Thanks for that.

    I am on it!
  11. Greetings to the fan of Indian cinema.
    Why do you so stubbornly ignore 10-bit encoding?! 10-bit is about 15% slower than 8-bit, but the image quality is many times better.
  12. Originally Posted by Gravitator View Post
    Greetings to the fan of Indian cinema.
    Why do you so stubbornly ignore 10-bit encoding?! 10-bit is about 15% slower than 8-bit, but the image quality is many times better.
    Hi

    I'm not really ignoring it, but I don't know anything about it.
  13. 10-bit is very beneficial for encoding dark video. Just try it first and see for yourself.
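    For example, with ffmpeg it is just a matter of asking libx265 for a 10-bit pixel format (only a sketch; it assumes your ffmpeg has a 10-bit-capable libx265 build, and the CRF value and file names are placeholders):

      # 10-bit HEVC encode, audio copied as-is
      ffmpeg -i source.vob -c:v libx265 -pix_fmt yuv420p10le -crf 22 -preset medium -c:a copy out_10bit.mkv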
  14. Originally Posted by Gravitator View Post
    10-bit is very beneficial for encoding dark video. Just try it first and see for yourself.
    Thanks for the suggestion.

    I have a question. Movies contain both dark scenes and bright scenes. If I use 10-bit, will the bright scenes become brighter?
  15. Originally Posted by Evad3R View Post
    I have a question. Movies contain both dark scenes and bright scenes. If I use 10-bit, will the bright scenes become brighter?
    No. Besides, the new Ultra HD 4K Blu-ray discs will use 10-12 bit.
  16. Originally Posted by Evad3R View Post
    Hi guys!

    As you can see in the screenshot below, my x265 encodes have color issues like this.

    Can anyone figure out what is causing them?



    Source : DVD9
    Filters I used in this encode : asharp, aWarpSharp, MSharpen, MSmooth, Tweak


    Please let me know if any more details are needed.
    Obviously we need more details; one image is simply not enough information.
    Please tell us exactly what you think the problem is.

    The picture looks dark; is that the issue? Or the purple tint on the highlights on her face? Or the banding? It would also help to show some before/after comparisons so we can see how much your lavish filtering improved or degraded things.

    You only have 704*304 pixels per frame, and the color information is even less than that. So at some point you reach a limit: you can apply a million filters, but you only have so many pixels to work with. One alternative for getting rid of the banding is upscaling the image. Obviously it won't actually get any better, but after that you have more pixels to work with. Increasing the chroma sampling to 4:4:4 could also help, or even converting to PC-level RGB. Again, these steps won't improve anything by themselves, but they give you more to work with.
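    A rough ffmpeg sketch of that idea (the resolution, bitrate, and file names are placeholders, and your x265 build has to support the 4:4:4 profile):

      # upscale 2x and convert to 4:4:4 before encoding
      ffmpeg -i source.vob -vf "scale=1408:608:flags=spline,format=yuv444p" -c:v libx265 -b:v 3000k -c:a copy upscaled_444.mkv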

    Of course, if you are stuck on the "got to stay NTSC/PAL like the DVD" mantra, then your options are very limited indeed!

    Last edited by newpball; 5th Mar 2015 at 09:53.
  17. Originally Posted by Gravitator View Post
    Why do you so stubbornly ignore 10-bit encoding?! 10-bit is about 15% slower than 8-bit, but the image quality is many times better.
    Huh, that would only help if you can actually view the video with 10 bits. That is, to put it nicely, problematic for most people.
  18. Originally Posted by newpball View Post
    Originally Posted by Gravitator View Post
    Why do you so stubbornly ignore 10-bit encoding?! 10-bit is about 15% slower than 8-bit, but the image quality is many times better.
    Huh, that would only help if you can actually view the video with 10 bits. That is, to put it nicely, problematic for most people.
    The only problem is the lack of 10-bit hardware decoders. So far only the GTX 960 is available.
  19. You can follow this link: X265 vs DivX265 (see the benefit of 10-bit for dark video in the samples).
  20. No, you also need 10bit-capable displays. Or realtime 10-to-8 dithering of playback.

    Scott
  21. Originally Posted by Cornucopia View Post
    No, you also need 10bit-capable displays. Or realtime 10-to-8 dithering of playback.
    You also need a 10-bit capable and enabled video card, and software that supports it.
  22. Originally Posted by Evad3R View Post
    If I use 10-bit, will the bright scenes become brighter?
    No, a common misconception.

    The dynamic range of the display device and the color depth are two entirely separate things.
  23. Why are you bringing the display into this? We are talking about trying to improve the final video (from a DVD source) when encoding to HEVC.
  24. Originally Posted by Gravitator View Post
    Why are you bringing the display into this? We are talking about trying to improve the final video (from a DVD source) when encoding to HEVC.
    So basically you think that the fine dithering added when going from 8 to 10 bit is still there when the video is finally converted back to 8 bits in the GPU, or even before that?

    How?
  25. You are not taking into account that HEVC was designed from the start to smooth textures better in an attempt to surpass AVC (even the developers have written about this).

    We are not trying to improve the color by turning 8-bit into 10-bit; we are trying to preserve the 8-bit source inside the wider 10-bit format.

    Why won't the OP post a sample so the settings can be worked out? It is possible that the original itself is of poor quality (which would not allow the final video to be improved).
  26. Originally Posted by Gravitator View Post
    Why won't the OP post a sample so the settings can be worked out? It is possible that the original itself is of poor quality (which would not allow the final video to be improved).
    Frankly, I would not be surprised if the original looks better.

    I mean: asharp, aWarpSharp, MSharpen, MSmooth, and Tweak, all applied to a frame of only 704*304 pixels.
  27. Originally Posted by newpball View Post
    Originally Posted by Gravitator View Post
    Why do you so stubbornly ignore 10-bit encoding?! 10-bit is about 15% slower than 8-bit, but the image quality is many times better.
    Huh, that would only help if you can actually view the video with 10 bits. That is, to put it nicely, problematic for most people.
    That's a common misconception. It's beneficial even on a cheap 6-bit TN display.

    The main reason for using 10-bit encoding from an 8-bit source is not the dithering or the bit depth conversion from the 8-bit source to 10-bit. 10-bit is better for encoding in terms of compression efficiency: the intermediate calculations have a higher level of precision. I was dubious at first too when 10-bit AVC became more common a few years ago, but it's pretty much proven at this point. The compression efficiency benefit varies a lot by content. On some sources it might only be 5-10% better in a given bitrate range; others can be as high as 50-60%. Early testing shows the relationship holds for 10-bit HEVC as well. I can point you to some articles if you want, and there are real-world tests in various forums as well.

    The negatives are compatibility (devices, etc.) and slower encoding.
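    If you want to try it yourself, the basic comparison is just two encodes at the same bitrate, one 8-bit and one 10-bit (only a sketch; it assumes a 10-bit-capable libx265 build, and the file names and bitrate are placeholders):

      # same average bitrate, only the output bit depth differs
      ffmpeg -i source.mkv -c:v libx265 -b:v 967k -pix_fmt yuv420p -an 8bit.mkv
      ffmpeg -i source.mkv -c:v libx265 -b:v 967k -pix_fmt yuv420p10le -an 10bit.mkv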
  28. Originally Posted by poisondeathray View Post
    The main reason for using 10-bit encoding from an 8-bit source is not the dithering or the bit depth conversion from the 8-bit source to 10-bit. 10-bit is better for encoding in terms of compression efficiency: the intermediate calculations have a higher level of precision. I was dubious at first too when 10-bit AVC became more common a few years ago, but it's pretty much proven at this point. The compression efficiency benefit varies a lot by content. On some sources it might only be 5-10% better in a given bitrate range; others can be as high as 50-60%. Early testing shows the relationship holds for 10-bit HEVC as well. I can point you to some articles if you want, and there are real-world tests in various forums as well.
    If you can demonstrate it I will certainly believe it!

    If I understand you correctly, you are saying that the encoder's internal processing of intermediate calculations is limited to the target bit depth; I would question the validity of that.

    I would think that a well-designed encoder uses an internal bit depth for intermediate calculations far greater than 8 or 10 bits to prevent rounding errors. But I must admit, what is obvious to most people may not be so to stubborn engineers!

    At any rate, can you demonstrate the difference with images?


  29. Originally Posted by newpball View Post
    Originally Posted by poisondeathray View Post
    The main reason for using 10-bit encoding from an 8-bit source is not the dithering or the bit depth conversion from the 8-bit source to 10-bit. 10-bit is better for encoding in terms of compression efficiency: the intermediate calculations have a higher level of precision. I was dubious at first too when 10-bit AVC became more common a few years ago, but it's pretty much proven at this point. The compression efficiency benefit varies a lot by content. On some sources it might only be 5-10% better in a given bitrate range; others can be as high as 50-60%. Early testing shows the relationship holds for 10-bit HEVC as well. I can point you to some articles if you want, and there are real-world tests in various forums as well.
    If you can demonstrate it I will certainly believe it!

    If I understand you correctly, you are saying that the encoder's internal processing of intermediate calculations is limited to the target bit depth; I would question the validity of that.

    I would think that a good encoder uses an internal bit depth for intermediate calculations far greater than 8 or 10 bits to prevent rounding errors. But I must admit, what is obvious to most people may not be so to stubborn engineers!

    At any rate, can you demonstrate the difference with images?




    But again, the proof is in the pudding: can you demonstrate it?

    Absolutely. Like I said, I was very dubious at first as well, but I've done quite a few tests on this subject because I didn't believe it either. There were a bunch of articles on 10-bit AVC encoding by Ateme, but I don't really believe something unless I test it myself, and they "sell" their products, so there might be some bias. It takes a lot of evidence to convince me. You don't see much testing on this now because it's pretty much an established fact. (I guess you missed that bus or were sleeping the last few years.) Simply put, there are mountains of evidence by now (at least with x264) and almost nothing contradictory. Very few things are as clear cut as this. It's early days for HEVC and x265, so I would hold off on making a firm conclusion just yet for 10-bit HEVC.

    I posted some tests a few years ago about 10-bit AVC; I'll try to dig them up. There are a bunch of tests at doom9 as well. Basically, like all good tests, you do a series of encodes at set bitrates across various types of content and compare, both visually (subjective) and by objective measures (PSNR, SSIM). It "wins" hands down: it basically never scores lower than 8-bit encoding at the same bitrate when measured against the 8-bit source. The difference in the compression curve is larger at lower bitrates (the benefit is greater there). Sources like simple animation tend to show greater percentage compression gains than live action.
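    For the objective part, ffmpeg's psnr and ssim filters against the source are enough (a rough sketch; the file names are placeholders, and the 10-bit encode is converted back to 8-bit first so the pixel formats match):

      # 8-bit encode vs. source
      ffmpeg -i 8bit.mkv -i source.mkv -lavfi psnr -f null -
      ffmpeg -i 8bit.mkv -i source.mkv -lavfi ssim -f null -
      # 10-bit encode vs. source (converted to 8-bit for the comparison)
      ffmpeg -i 10bit.mkv -i source.mkv -lavfi "[0:v]format=yuv420p[d];[d][1:v]psnr" -f null -
      ffmpeg -i 10bit.mkv -i source.mkv -lavfi "[0:v]format=yuv420p[d];[d][1:v]ssim" -f null -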

    The biggest visual benefit is retaining dither, fine detail, and noise, which are more commonly dropped by 8-bit AVC at lower bitrates and commonly used settings. So the usual complaints of banding and blocking, especially in lower-bitrate scenarios on things like dark scenes, blue skies, gradients, and walls, are vastly improved.
    Last edited by poisondeathray; 5th Mar 2015 at 12:38.