VideoHelp Forum




  1. Originally Posted by jagabo View Post
    I guess I'd forgotten that. I think because some encoder GUIs default to 20, I assumed it was also the default for x264.

    Originally Posted by jagabo View Post
    There are some types of material where CRF=18 yields obvious artifacts. Dark, grainy video for example:
    I'd not argue there, but I think it's more of an exception than a rule.
    I've been involved in/read a few threads in the past relating to encoding "difficult" video and I'm still sceptical regarding the need to "tweak" advanced options. I'm not saying it doesn't work... i.e. tweaking setting "x" might allow the video to encode perfectly, but generally the file size will increase accordingly, as it would if you simply lowered the CRF value.
    I remember one thread where I'm pretty sure someone offered a few tweaks which encoded the "problem" video perfectly at CRF18. Lowering the CRF value to 16 encoded it 90% as well without increasing the bitrate anywhere near as much, and I think CRF15 did it just as well at the same bitrate.

    Originally Posted by jagabo View Post
    If you are encoding at low resolution and viewing at full 1080p (a standard definition source watched on a 1080p HDTV, for example) all the small artifacts in the small frame are enlarged on the big screen, so they are more obvious. Whereas small artifacts in a 1080p encoding remain small when viewed on that same HDTV.
    Well I'd not really thought about it that way, but now you've mentioned it, it's fairly obvious.
    Plus I guess the resizing adds resizing artefacts to the compression artefacts which wouldn't happen when the video doesn't need resizing.
  2. Originally Posted by hello_hello View Post
    Originally Posted by jagabo View Post
    There are some types of material where CRF=18 yields obvious artifacts. Dark, grainy video for example:
    I'd not argue there, but I think it's more of an exception than a rule.
    Yes. I normally use CRF=18 myself. For particularly grainy sources I add --aq-strength=1.8 or thereabouts.
  3. Thank you all for providing so much good and valid information in a 'language' I can understand. I'm going away skiing for the weekend now, but I'll have tons of stuff to dig into when I get back, thanks to all of you! I'm sure there will be more posts in this thread soon after I get back. Thanks again!
  4. Originally Posted by jagabo View Post
    Yes. I normally use CRF=18 myself. For particularly grainy sources I add --aq-strength=1.8 or thereabouts.
    The default aq strength seems to be 1.0, but if you use --tune grain, the aq strength changes to 0.5, so for grainy sources the x264 tuning does the opposite of what you're doing. I've never really got my head around the reason for that, though. I guess that's why I don't play with the x264 advanced settings much.
  5. Originally Posted by hello_hello View Post
    Originally Posted by jagabo View Post
    Yes. I normally use CRF=18 myself. For particularly grainy sources I add --aq-strength=1.8 or thereabouts.
    The default aq strength seems to be 1.0, but if you use --tune grain, the aq strength changes to 0.5, so for grainy sources the x264 tuning does the opposite of what you're doing. I've never really got my head around the reason for that, though. I guess that's why I don't play with the x264 advanced settings much.
    The reason the value is lower with --tune grain is that AQ takes bits away from edges. On grainy content at low to moderate bitrates, high AQ values give you a "halo" effect: a grainless, splotchy transition zone both around objects and at the frame edges.

    Of course, high bitrates (relative to content complexity) make all encoding issues disappear... Even MainConcept/Rovi AVC or Sony AVC looks fine when you throw enough bitrate at it.
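
    For reference, this is roughly how those options look on the x264 command line. Just a sketch, with placeholder file names, and assuming a Windows x264 build that accepts AviSynth input:

    Code:
    REM jagabo's approach: raise AQ strength for a grainy source (the exact values are examples)
    x264 --crf 18 --preset slow --aq-strength 1.8 -o out_highaq.264 input.avs

    REM what --tune grain does instead: among other changes, it drops aq-strength to 0.5
    x264 --crf 18 --preset slow --tune grain -o out_tunegrain.264 input.avs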
  6. Ok, well I've come to understand that I don't have many options with the video I have right now other than a pretty high bitrate.
    With all this discussion on here about encoding and bitrates etc. I think I now understand my use for the "ProTune" mode on the GoPro Hero 3 camera: you get high bitrate (~40Mbps) files instead of more heavily compressed video.

    See my thoughts below:

    Source video 40Mbps -> Edit -> Encode to 10Mbps = Great quality, small file size

    Source video 10-12Mbps -> Edit -> Encode to same bitrate = Poor quality, small file size

    Source video 10-12Mbps -> Edit -> Encode to 30-40Mbps / lossless format = Close to great quality, huuuuge files


    Am I correct?
  7. Delete this post. It was a reply to another post that was deleted.
    Last edited by chrisofsweden; 3rd Apr 2013 at 03:12. Reason: Delete this post. It was a reply to another post that was deleted.
  8. Originally Posted by chrisofsweden View Post
    Source video 40Mbps -> Edit -> Encode to 10Mbps = Great quality, small file size
    Not necessarily. It depends on the nature of the source. But if that's what you're seeing...

    Originally Posted by chrisofsweden View Post
    Source video 10-12Mbps -> Edit -> Encode to same bitrate = Poor quality, small file size
    It wouldn't necessarily give poor quality. Though it should deliver lower quality than the 10-12 Mb/s source, and lower quality than your first option.

    Originally Posted by chrisofsweden View Post
    Source video 10-12Mbps -> Edit -> Encode to 30-40Mbps / lossless format = Close to great quality, huuuuge files
    Only if your 10-12 Mb/s source is "great quality". You can't increase the quality of a source by encoding at a higher bitrate; you can only reduce the loss of quality.

    In short, every time you reencode with a lossy codec you will lose some quality. If you use sufficient bitrate those losses will be minimal. Exactly what bitrate you need to prevent big losses will depend on the nature of the source.
  9. I know that there are more variables to take into consideration, but GENERALLY are my assumptions correct?

    Let me put it clearer:

    A source video with great quality, bitrate 40Mbps, reencoded to ~10Mbps will produce a much better result (quality) than a source video with great quality, bitrate 10-12Mbps reencoded to ~10Mbps.

    A source video with great quality, bitrate 10-12Mbps reencoded to 30-40Mbps / lossless format = Close to the same quality as source but huuuge delivery files.

    This is generally true right?
    Last edited by chrisofsweden; 3rd Apr 2013 at 08:43.
  10. You seem to really, really want it to work that way, but it doesn't. Please re-read jagabo's post #38.

    FWIW everything you say CAN be true in certain cases, but you cannot draw general conclusions.
    Last edited by smrpix; 3rd Apr 2013 at 09:23.
  11. Originally Posted by chrisofsweden View Post
    A source video with great quality, bitrate 40Mbps, reencoded to ~10Mbps will produce a much better result (quality) than a source video with great quality, bitrate 10-12Mbps reencoded to ~10Mbps.
    Not necessarily. The former will usually be better but I don't think you can go so far as to say "much" better. Again, it will vary depending on the content. And the encoder.

    Originally Posted by chrisofsweden View Post
    A source video with great quality, bitrate 10-12Mbps reencoded to 30-40Mbps / lossless format = Close to the same quality as source but huuuge delivery files.
    Yes. In the case of 30-40 Mb/s you'll get 3 to 4 times the size (as the size is proportional to the bitrate: size = bitrate * running time). Lossless will usually be much larger. But it could be smaller, depending on the content. Something like a simple bar chart (large areas of flat color, no noise) can compress very well with lossless codecs.
  12. Member Cornucopia
    Let me put it this way:

    You start with a "pure" 100% quality LIVE image (subject).
    If you encode in the camera to 30-40Mbps, let's say you retain 95% of the source. 100% * 95% = 95% quality.
    If you were to instead encode in cam to 10-12Mbps, say you retain 80% of the source. Still good enough to watch, but noticeably worse than the other version. 100% * 80% = 80% quality.

    If you re-encode your 30-40Mbps stuff to 10-12Mbps, let's say you're retaining 85% (it's a little higher because some of the picture variability has already been removed by the previous encode). So NOW you see 95% * 85% = 80.75%.
    If you re-encoded your 10-12Mbps again to 10-12Mbps, let's say you're retaining 90% (not much change from the lower bitrate source). Now, you see 80% * 90% = 72%.
    If you re-encoded your 10-12Mbps to 30-40Mbps, let's say it retains 97% of the quality. That's now 80% * 97% = 77.6%.

    Does that help?
    BTW, those numbers are for example purposes only. I made them up. Please don't try to use them as exact figures in any calculation.

    Scott
  13. Originally Posted by Cornucopia View Post

    Does that help?

    Scott

    Yes, thanks. Seems I had it quite right in my head after all. For the sake of experimenting I've tried encoding 40 Mbit footage and 12 Mbit footage to 10-20 Mbit VBR, and I can't say there's as big a difference as I thought there would be.

    I just can't understand how 'they' can make a Blu-ray (about 40-ish Mbit, right?) into an x264 MKV at 8 Mbit that looks so good and retains all (well, mostly all) the quality.

    Then again, Vegas isn't encoding with x264 but with MainConcept H.264, and some of you guys have said it's not at all as good an encoder as x264... That must be it, otherwise I still don't see the difference.
  14. Member Cornucopia
    "Quite right"? NO. Close? Yes. What I wrote just backed up what jagabo had already said. Plus, as he said, there are MANY variables where one cannot just make a blanket statement like that. "Rules of thumb" maybe.

    When you say you don't see the difference, that tells me you have some more experience to gain... (not to downplay your existing experience).

    There is a difference. A more definitive, comprehensive and objective way to tell would be to do a "differencing" script in AVISynth, such as jagabo has recently described. Basically, what you do is:
    1. Encode 1st way.
    2. Encode 2nd way.
    3. Load both 1st & 2nd in AVISynth.
    4. Use the Subtract(1st, 2nd) function in AVISynth to get the difference (or you could use Overlay(1st, 2nd, mode="difference")).

    This will make it clear to you if you do one difference between the source & the good encode and one between the source & the not-so-good encode. The not-so-good one will show MORE/WORSE differences. It takes some of your "eye's compensation" out of the equation. Then it's just a matter of getting more and more used to being able to visibly SPOT those differences on your own when they occur.
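
    For example, a minimal sketch of such a script (the file names are placeholders, and both clips need the same resolution, colorspace and frame count):

    Code:
    v1 = AviSource("source.avi")
    v2 = AviSource("encode.avi")
    # identical areas come out as flat grey; encoding artifacts show up as deviations from grey
    Subtract(v1, v2)
    # or, alternatively: Overlay(v1, v2, mode="difference")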

    Scott
  15. A Blu-ray or DVD might be encoded from a 2K or maybe 4K master that was shot digitally or digitized from film. It was shot properly. And it's certainly not encoded with some light version of the MainConcept encoder like the one in Vegas. So a Blu-ray is much easier to encode than home video footage.

    Home videos are different. You can have noise in the video, sometimes even in daylight, and artifacts.

    Not sure why, even today, MainConcept has no CRF encoding in the light versions in Vegas or Premiere. Using 2-pass VBR is quite obsolete for a home user; CRF modes are much better: you set the quality (just a number that represents a quality level, related to the quantizer) and the encoder chooses the bitrate on its own to keep the quality you want. So there is no bitrate lottery involved at all. x264 offers CRF (or VBR).

    The Vegas encoder might be good enough for you; in theory you just give it a bit more bitrate and you should be fine (but you don't know how much bitrate, right? that's the insane part), if you accept that bitrate-guessing game.

    I found the Vegas encoders (MainConcept and Sony) not good enough when you intend to encode video for streaming or other low bitrate encodes (1 Mbit/s etc.); that's the case where it's better to go with x264.
    A second reason to go with an external encoder is when you downsize interlaced footage in Vegas. It's definitely better to do that outside of Vegas with AviSynth resizers (involving a bob deinterlace and resize). Note that making a DVD from interlaced HD footage falls into this category.
  16. Originally Posted by chrisofsweden View Post
    I just can't understand how 'they' can make a Blu-ray (about 40-ish Mbit, right?) into an x264 MKV at 8 Mbit that looks so good and retains all (well, mostly all) the quality.
    But they don't retain all the quality. If you had access to the original and the encoded video you'd see the differences. Also keep in mind that professionally shot movies are cleaner than your cheap shaky camcorder video. So they compress much better.

    One of the best ways to compare two videos is to interleave them:

    video 1 frame 1
    video 2 frame 1
    video 1 frame 2
    video 2 frame 2
    video 1 frame 3
    video 2 frame 3...

    Then flip back and forth between pairs of frames. The differences become much clearer, especially if you use a screen magnifier while doing so. This is much better than side by side comparisons. Your eyes are especially sensitive to changes in a still scene.

    You can go through the two videos and save out corresponding BMP or PNG images, then view the images with Windows' built-in picture viewer. But that becomes tedious if you want to compare many frames. Instead, you can use a simple AviSynth script to accomplish the interleaving:

    Code:
    v1 = AviSource("video1.avi")
    v2 = AviSource("video2.avi")
    Interleave(v1, v2)
    Then open that script with an editor that lets you step back and forth frame by frame. I usually use VirtualDub where the left and right arrows step forward and backward by one frame.
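
    If the frames are small, you could also magnify the interleaved comparison with a point resize so the pixels stay sharp. A sketch along the lines of the script above (the file names and the 4x factor are only examples):

    Code:
    v1 = AviSource("video1.avi")
    v2 = AviSource("video2.avi")
    # interleave, then enlarge 4x with a point (nearest neighbour) resize so artifacts stay visible
    Interleave(v1, v2)
    PointResize(Width()*4, Height()*4)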

    Here's an example crop from two videos (both recorded on a DVD recorder, one at ~8000 kbps, the other at ~2000 kbps) side by side at their original size:

    [Attached image: sbs.png]

    By and large, gross details are retained, though the low bitrate image looks "rough". Now here are the same two crops interleaved and enlarged 4x (point resize):

    [Attached animation: inter.gif]

    The damage done by the low bitrate encoding is much more obvious.
    Last edited by jagabo; 3rd Apr 2013 at 20:23.
  17. Thanks all of you for sticking in there for me.

    Good tip about using AVISynth Script with VirtualDub for comparing video frames and thanks for the in depth explanations and examples!



    It bothers me that this isn't as simple and straightforward as I was hoping it would be. I kinda hoped it would stretch as far as me having to learn video editing software and then I'd be in a good place. But encoding theory and general digital video understanding is taking a lot of time, pestering you with endless newbie questions... And I know it's hard to explain something advanced in a simpleton way. Sometimes it's just not even a possibility. You have to get in there and get dirty, **** shit up and do it all over again in a different way and see if the result is better or worse.

    I'll just have to stick to it and hopefully my knowledge will grow with time and experience.


    Actually, I think this evening I'm gonna take an original high bitrate straight-from-camera clip and put it in Vegas, not do anything with it, just re-encode it to MainConcept MP4 @ 10Mbit CBR, and then take the same original clip and encode it with x264 @ 10Mbit CRF and see how much of a difference there is.
    Last edited by chrisofsweden; 4th Apr 2013 at 03:07.
  18. Member Cornucopia
    One thing to consider, when either learning terms/techniques or when troubleshooting, is to change only ONE variable at a time.

    So doing those A/B comparisons won't really start making internal sense until you try all the variables independently.

    CBR vs. VBR vs. CQ/CRF
    Mainconcept vs. x264
    10 vs. 20 vs. 40 Mbps
    etc

    The beginning of wisdom is once you realize how unwise you really are. (My bad paraphrase)

    Scott
  19. Originally Posted by Cornucopia View Post
    So doing those A/B comparisons won't really start making internal sense until you try all the variables independently.
    This I know. Otherwise I'd never know what variable made a change.


    Originally Posted by Cornucopia View Post
    The beginning of wisdom is once you realize how unwise you really are. (My bad paraphrase)
    I'm starting to see this...
  20. Originally Posted by chrisofsweden View Post
    take the same original clip and encode it with x264 @ 10Mbit CRF
    When you use CRF encoding you can't specify a bitrate. You specify the quality and the encoder uses whatever bitrate is necessary to deliver that quality. The only way to get 10 Mb/s with CRF encoding is to encode over and over again with different CRF values until you get one that delivers 10 Mb/s. Use 2-pass VBR encoding to get a specific bitrate.
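
    On the command line the difference looks roughly like this. Just a sketch with placeholder names, assuming an x264 build that accepts AviSynth input (x264 takes the bitrate in kbit/s):

    Code:
    REM CRF: you pick the quality, the bitrate falls wherever it falls
    x264 --crf 18 --preset slow -o crf18.264 input.avs

    REM 2-pass VBR: you pick the average bitrate (here 10 Mb/s = 10000 kbit/s)
    x264 --pass 1 --bitrate 10000 --preset slow -o NUL input.avs
    x264 --pass 2 --bitrate 10000 --preset slow -o 10mbit.264 input.avs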
  21. Originally Posted by jagabo View Post
    When you use CRF encoding you can't specify a bitrate. You specify the quality and the encoder uses whatever bitrate is necessary to deliver that quality. The only way to get 10 Mb/s with CRF encoding is to encode over and over again with different CRF values until you get one that delivers 10 Mb/s. Use 2-pass VBR encoding to get a specific bitrate.
    Ah, right thanks. I knew this. Hard to keep everything apart. I'll try the 2-pass VBR and compare results.

    Do you think there will be a visually noticeable quality improvement encoding to x264 2-pass VBR 10Mbit over MainConcept 2-pass VBR 10Mbit using the same source file which is at ~40Mbit?
  22. Originally Posted by chrisofsweden View Post
    Do you think there will be a visually noticeable quality improvement encoding to x264 2-pass VBR 10Mbit over MainConcept 2-pass VBR 10Mbit using the same source file which is at ~40Mbit?
    I've never used Vegas so I can't say for sure. But from everything I've read x264 is superior to MainConcept's encoder.

    Beware of 10 bit h.264 encoding. There are no consumer players at this time that support 10 bit h.264. So, for example, you won't be able to play those files on an HDTV that plays MP4 or MKV files.
  23. Originally Posted by jagabo View Post
    Beware of 10 bit h.264 encoding. There are no consumer players at this time that support 10 bit h.264. So, for example, you won't be able to play those files on an HDTV that plays MP4 or MKV files.
    By this do you mean MainConcept's h.264? Or is it the same with x264? Should I stay away from 10 bit color (I suppose that's what's meant by 10 bit encoding?) altogether?

    Mostly my end material is going to be played back on computers.
    If need be, I'll do another encode from source to fit iPad specs.
  24. Even on computers 10 bit h.264 (MC or x264) requires an h.264 decoder that supports 10 bit. Many people won't have that. Stick with 8 bit if you want good compatibility.
  25. Originally Posted by jagabo View Post
    Even on computers 10 bit h.264 (MC or x264) requires an h.264 decoder that supports 10 bit. Many people won't have that. Stick with 8 bit if you want good compatibility.
    Ok thanks, will do.
  26. The only quick way to compare 1-pass CRF in x264 and MainConcept's 2-pass VBR is to encode with x264 first:
    - you choose a CRF value (quality), say CRF=18, encode, and you get a file of a certain size,
    - then you calculate the average bitrate from that size and use this value in Vegas as the average, plus a comfortable maximum bitrate.

    For example: x264 produces an 800MB video file at CRF=18, which is 6400 Mbit - http://www.matisse.net/bitcalc/?input_amount=800&input_units=megabytes&notation=ieee
    The length of the video is 10 minutes = 600 seconds, so the bitrate is 6400/600 = 10.7 Mbit/s, or 10,700,000 in Vegas terms, and this is your 2-pass VBR average value.
  27. You can use Bitrate Viewer to get accurate average bitrate reports from most files. MediaInfo often gives inaccurate results because it just reports what the file header says.
    Last edited by jagabo; 4th Apr 2013 at 11:08. Reason: typo
  28. That's actually a good tip, that Bitrate Viewer. Vegas' encoder is less far off when I input the average bitrate from Bitrate Viewer. For example, it showed an average of 3125 kbps, so I input 3,125,000 into Vegas for the average and the file size was much closer to the x264 original.
  29. Originally Posted by hello_hello View Post
    ...but banding tends to be more of a problem than blocking.
    Now I've tried encoding with x264 using HandBrake. The result is definitely better quality at even smaller file sizes, and all blocking is gone at even pretty low bitrates. However, NOW this banding you're talking about is introduced. Increasing the CRF value or choosing a really high bitrate and doing a 2-pass VBR DOES NOT eliminate the banding.

    This of course happens on footage where the sky fills a lot of the frame. (The stuff I'm putting together is from a 4-day ski trip with clear skies, so there's a lot of sky, sadly.)

    Anyone know the best method for preventing banding?
    Last edited by chrisofsweden; 5th Apr 2013 at 01:25. Reason: Added a DOES NOT (typo)
  30. Deblocking is a normal part of h.264 decoding. That's why you don't get much blocking. But instead you get banding and posterization. About the only way to prevent that is to keep (or add) noise and use sufficient bitrate. Or use 10 bit h.264 encoding. But if you do the latter your playback options will be limited.