VideoHelp Forum

  1. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by JasonCA View Post
    For whatever reason, my ConvertToYV24() doesn't seem to work.
    You need version 2.6 of AviSynth to use ConvertToYV24() -- YUV 4:4:4.

    Thank you jagabo! That is my problem! I'm using AviSynth 2.5. I'll upgrade to that here soon.

    Originally Posted by jagabo View Post
    Yes. But only "illegal" after conversion to RGB. If you're planning on adjusting colors later, and will be working in YUV, you don't need to worry about that during capture. As long as the colors aren't clipping at the hard boundaries (0, 255) you can still correct them. If you don't plan on adjusting colors later, or are using software that only works in RGB, you want them to be legal during your cap.
    It seems like the goal is to balance the Luma levels with the Chroma levels. For NTSC VHS, this means Luma is captured from 7.5 IRE to 100 IRE by the capture card. Right? In the same way, the Chroma levels, at least to me, need to match the Luma levels. This is done by the Chroma levels being at 75% saturation on the VectorScope. But I'm not 100% sure, which is why I am asking and discussing so that I better understand how to measure Chroma levels.

    You said, "If you're planning on adjusting colors later, and will be working in YUV, you don't need to worry about that during capture." But working in YUV, how do I know whether the UV portion of the signal is clipping at my capture card or not?

    "As long as the colors aren't clipping at the hard boundaries (0, 255) you can still correct them." By this, you mean after conversion to RGB. It seems to me that if the Chroma were also captured at the proper levels relative to the Luma, then on conversion to RGB they should not clip. I believe I remember reading on the Doom forum that one of the ways to ensure this is to make sure your Chroma levels are just slightly lower than the Luma levels. I'll try to go back and look for that.
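    To make the conversion clipping concrete, here is a minimal sketch in plain Python (full-range BT.601 coefficients assumed; the pixel values are hypothetical, not from any capture):

```python
# Full-range BT.601 YCbCr -> RGB, to show that whether a chroma value
# "clips" in RGB depends on the luma it rides on.
def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

def in_gamut(y, cb, cr):
    return all(0 <= c <= 255 for c in ycbcr_to_rgb(y, cb, cr))

# The same chroma (Cr=200) is legal at mid grey but out of gamut near
# black, because G = Y - 0.714136*(Cr-128) goes negative when Y is small.
print(in_gamut(128, 128, 200))  # True
print(in_gamut(20, 128, 200))   # False
```

    The same chroma can be fine on a bright pixel and illegal on a dark one, which is why U/V limits can't be judged in isolation.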

    Originally Posted by jagabo View Post
    In your color bars example you use Tweak(sat=1.205) to make two of the bars "illegal". You could restore them by using Tweak(sat=0.8) or thereabouts. So even though they were illegal, they can be made legal again, without any loss (other than rounding errors).
    There were two scripts I posted in that post. One was with Tweak(sat=1.0). That is normal saturation, meaning I am not affecting the color bars at all. In the other one, I slightly increased the saturation, and you can see two of the color bands clipped. It was just to make the point that when looking at U and V independently (not the UV vectorscope) you CANNOT determine from the independent U and V scopes that your color levels would clip. I suppose that's because of the relationship the U and V have to the Luma on conversion to RGB.

    However, disregarding Luma, perhaps all I need to look at is the independent U and V scopes to see that their signal is not being clipped from top to bottom? If U and V are truly the Chrominance signal and they are not being clipped in the independent U and V scopes, then perhaps, as far as what the Capture Card is seeing, the signal is fully captured. The UV may not match the Luma levels...but at least the U & V signal is not clipped. In which case, I can just lower the saturation via software and end up with the same result.

    Luminance and Chrominance are separate signals. Perhaps I'm confusing things a bit in trying to ensure both are balanced together. My main concern is ensuring that each signal (Luminance and Chrominance) is being properly captured by the capture card.

    Originally Posted by jagabo View Post
    Not entirely. Because where the YUV->RGB clipping occurs varies with Y too.
    Right...there seems to be some relationship between the Luma levels and the Chroma levels.

    Originally Posted by jagabo View Post
    Yes. If your YUV is rec.709 (high definition) you would change the ConvertTo's in the HighlightBadRGB() to include matrix="rec709". It's the same for PC.601 and PC.709.
    Ok great!
  2. Originally Posted by JasonCA View Post
    You said, "If you're planning on adjusting colors later, and will be working in YUV, you don't need to worry about that during capture." But working in YUV, how do I know whether the UV portion of the signal is clipping at my capture card or not?
    Look at U and V in VideoScope(). As long as there are no flat peaks at 255, or flat valleys at 0, they are not clipping*.

    Originally Posted by JasonCA View Post
    There were two scripts I posted in that post. One was where Tweak(sat=1.0). This is normal saturation meaning I am not at all affecting the color bars. On the other ones, I slightly increased the saturation, and you can see two of the color bands clipped.
    No, they did not clip. U and V were still within the 0-255 range. They won't be clipped until you convert to RGB. So long as the video is in YUV you can fix it by reducing the saturation.

    * In practice, a capture device or driver may clip somewhere above 0 or below 255.
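    The reduce-the-saturation fix can be sketched numerically (plain Python, hypothetical 8-bit chroma samples; real filters like Tweak() do this with more care):

```python
# Boost U/V saturation, then undo it. As long as the boosted values stay
# inside 0-255 (no hard clipping), scaling back recovers the originals
# up to rounding error.
def scale_chroma(c, sat):
    # chroma is stored offset-binary around 128
    return max(0, min(255, round(128 + (c - 128) * sat)))

original = [90, 110, 128, 150, 180]
boosted  = [scale_chroma(c, 1.4) for c in original]     # all still within 0-255
restored = [scale_chroma(c, 1 / 1.4) for c in boosted]
print(all(abs(a - b) <= 1 for a, b in zip(original, restored)))  # True
```

    Had any boosted value hit 0 or 255 and been clamped there, the original could no longer be recovered.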
  3. Maybe an example will help. Here's a random video that happens to be on my computer right now. Using this script:

    Code:
    v1=ffVideoSource("D:\Downloads\TEST1.mkv", fpsnum=30000, fpsden=1001).BilinearResize(320,240)
    v2=ColorYUV(v1, cont_u=500, cont_v=500)
    v3=ColorYUV(v2, cont_u=-170, cont_v=-170)
    
    Interleave(v1.Subtitle("original"), v2.Subtitle("bad"), v3.Subtitle("corrected"))
    
    ConvertToYUY2().VideoScope("both", true, "Y", "UV", "UV")
    v1 is the original capture:

    Image
    [Attachment 22232 - Click to enlarge]

    v2 is a simulation of a capture where the colors were extremely over saturated. But the individual U and V values were still within the 0 to 255 range:

    Image
    [Attachment 22233 - Click to enlarge]

    When you run HighlightBadRGB() on v2 you get an image that is almost entirely red:

    Image
    [Attachment 22235 - Click to enlarge]

    But the original colors can be restored from v2 because U and V were not clipped. v3 is v2 corrected to restore the original colors:

    Image
    [Attachment 22234 - Click to enlarge]
    Last edited by jagabo; 20th Dec 2013 at 15:54.
  4. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by poisondeathray View Post
    Typically what is used is a YCbCr parade. It's a waveform tracing of Y, Cb, Cr.

    In avisynth you can use Histogram("levels") to see Y, Cb, Cr displayed. It goes from 0-255. If the ends are "bunched up" then you have clipping.
    http://avisynth.nl/index.php/Histogram#Levels_mode
    Fantastic! No really, this is very helpful!

    Originally Posted by poisondeathray View Post
    To visualize the content, to detect "hot" areas in a frame, along the same lines as FCP and other various NLE plugins, you can use avisynth limiter() with show=something

    The default values are set to Y' 16-235 , CbCr 16-240 (but you can set different limits)

    http://avisynth.nl/index.php/Limiter

    show="luma_grey" will make it greyscale with values above & below max_luma and min_luma will be colored

    show="chroma_grey" will do it similarly for chroma
    I found something interesting when playing with this and I am wondering if it's a bug with the Limiter function:

    Code:
    ColorBars()
    ConvertToYV12()
    Crop(last,0,0,640,300)
    BilinearResize(640,480)
    ConvertToYV12()
    Tweak(sat=1.55)
    Limiter(last,16, 235, 16,239,"chroma") #luma (16 to 235), Chroma (16 to 239)
    Histogram("levels")
    In the above code, if you set sat=1.56 you will get clipping.

    However, if you change max_chroma=240, you can boost the saturation to ANYTHING you like and it won't clip.

    For example:

    Code:
    ColorBars()
    ConvertToYV12()
    Crop(last,0,0,640,300)
    BilinearResize(640,480)
    ConvertToYV12()
    Tweak(sat=8.0) # heavy saturation
    Limiter(last,16, 235, 16,240,"chroma") #luma (16 to 235), Chroma (16 to 240)
    Histogram("levels")
    Notice that I've set the saturation to 8.0 and set max_chroma to 240 instead of 239. Although the Histogram "levels" scope indicates the Chroma is slammed, the Limiter doesn't show that it's clamping the Chroma with any YELLOW. Limiter's default is max_chroma=240, not 239. So why is this?

    Too bad the Limiter function doesn't let you specify an alternative color for representing clamped Chroma (instead of it having to be yellow). As the AviSynth documentation says (NOTE: the wiki documentation seems to be lacking the clipping color):
    "show can be "luma" (shows out of bounds luma in red/green), "luma_grey" (shows out of bounds luma and makes the remaining pixels greyscale), "chroma" (shows out of bounds chroma in yellow), "chroma_grey" (shows out of bounds chroma and makes the remaining pixels greyscale). The coloring is done as follows"
    Carrying on ...

    Originally Posted by poisondeathray View Post
    As long as you have values within YCbCr 0-255, you can salvage those values. As soon as you touch RGB all bets are off
    This is a very good point! And that's what I am looking for to help answer my question. So heavily saturated U and V is OK as long as it's not clipped. If the stored U and V are too heavily saturated, then they may NOT translate nicely to RGB. However, as you are saying, the data is not lost...the capture card did capture the full U and V signal as long as the U and V didn't get clipped. So in post-processing for RGB, even though the U and V are heavily saturated, I can simply reduce the saturation and end up with the same results as if I had captured at that saturation to begin with. Bingo! That answers my question! I'm sooooo thrilled!

    This makes me wonder:

    Is it better to have more heavily saturated U and V (as long as they don't clamp) to better represent the color when the capture card is sampling the Chroma? If my Chroma were near greyscale, trying to expand it would lead to banding...right? So, to get good sampling on my Chroma, is it good that my Chroma is evenly distributed between 0 and 255? Hmmmmmm


    Originally Posted by poisondeathray View Post
    All this badRGB discussion is more academic than anything else. It's not used in practice. In practice, "bad RGB" exists everywhere, even in broadcast safe values. Even "full range" PC matrix 601/709 RGB conversion is full of out of gamut values. The reason is the 8bit RGB color cube model is tiny compared to the 8bit Y'CbCr color model. Graphically you can look at the color cube model - all values of 8bit RGB fit within the 8bit YCbCr model, but the reverse isn't true.

    http://software.intel.com/sites/products/documentation/hpc/ipp/ippi/ippi_ch6/ch6_color...ls.html#fig6-4

    In order to express some of those negative and out of gamut values, different matrices and different RGB models are used. For example ITU Rec1361 is used for wide gamut displays. Some of those previous "out of gamut" negative values can be expressed and be seen. In the distant future, Rec2020 - which covers an even larger space - will be the new standard for 4K displays.
    For VHS, I'm capturing data in 8 bit Y'CbCr (or YUV), so does this mean that in the future, I'd be able to see better dynamic range with a larger RGB color space such as maybe Rec2020? 8bit Y'CbCr may not fully translate to 8bit RGB. But, would 8bit Y'CbCr translate better to, say, 10bit RGB with a larger color gamut? Or how is that related? I am thinking, wow, maybe in the future I'll see VHS tapes in a way that I have never seen them before in terms of the color (realizing the VHS resolution will always remain low).
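    poisondeathray's point about the RGB cube being tiny inside the YCbCr cube can be sketched numerically (plain Python, full-range BT.601 coefficients assumed; the grid step is arbitrary):

```python
# Sample full-range 8-bit YCbCr on a coarse grid and count how many points
# convert to in-range RGB. The determinant of the RGB->YCbCr matrix
# predicts a valid fraction of roughly 0.24.
def in_gamut(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return all(0 <= c <= 255 for c in (r, g, b))

pts = [(y, cb, cr) for y in range(0, 256, 8)
                   for cb in range(0, 256, 8)
                   for cr in range(0, 256, 8)]
frac = sum(in_gamut(*p) for p in pts) / len(pts)
print(round(frac, 2))
```

    So roughly three quarters of 8-bit YCbCr triplets have no 8-bit RGB equivalent, which is why "bad RGB" turns up everywhere.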
  5. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by jagabo View Post
    Look at U and V in VideoScope(). As long as there are no flat peaks at 255, or flat valleys at 0, they are not clipping*.
    Very clear jagabo! That's what I was looking for someone to tell me. So how U and V translate to RGB is a WHOLE other thing.

    In terms of what I am capturing, I can use VideoScope while looking at U and V and ensure that there are no flat peaks at 255 (depending on my capture card's limits).

    Originally Posted by jagabo View Post
    No, they did not clip. U and V were still within the 0-255 range. They won't be clipped until you convert to RGB. So long as the video is in YUV you can fix it by reducing the saturation.
    I got it now! It's making better sense.

    The relationship that I was trying to get is stated in the Limiter documentation. It says:
    "The standard known as CCIR-601 defines the range of pixel values considered legal for presenting on a TV. These ranges are 16-235 for the luma component and 16-240 for the chroma component. "
    So, I was seeing a relationship between Luma and Chroma but didn't know how it ties together. For VHS and Rec 601, it would seem that converting to RGB is best when Luma is within 16-235 and Chroma is stored within 16-240. I'd probably get less RGB clipping since the Chroma would be within the conversion range of RGB. However, as said, conversion to RGB is its own beast and can be kept separate from the capture process!

    My main interest, as you are aware, is ensuring that I am not clipping the Chroma. So now I see that as long as U and V are not clipping between 0 and 255 (roughly, and depending on the capture card's max Chroma limits), then I am good! And now I see this can be detected via the Limiter function, the Histogram("levels") function, and the VideoScope U and V scopes.

    It would therefore seem the VectorScope (no, not the VideoScope) is better suited for seeing how the U and V get translated to actual RGB colors. As said, the Chroma must be at 75% saturation, which I guess is from 16 to 240 in Chroma space. But, I'll have to experiment a bit with this to see if that's really true.

    Originally Posted by jagabo View Post
    * In practice, a capture device or driver may clip somewhere above 0 or below 255.
    This is something I will experiment with on my capture card. For example, one of my capture cards is the ATI 600 USB. So, I wonder what the max Chroma clipping range would be. I'll experiment with that soon! This is very exciting!
  6. Originally Posted by JasonCA View Post
    However, if you change max_chroma=240, you can boost the saturation to ANYTHING you like and it won't clip.
    Because Tweak() clamps chroma to the range 16-240 by default. Add "coring=false" and you'll see Limiter() is working properly.

    Originally Posted by JasonCA View Post
    It would therefore seem the VectorScope (no not the VideoScope) is better suited for how the U and V get translated to actual RGB colors. As said, the Chroma must be at 75% saturation which I guess is from 16 to 240 in Chroma space.
    The exact limit on chroma values (for legal RGB) is dependent on the Y value.
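    That Y dependence is easy to tabulate. A quick sketch in plain Python (full-range BT.601 coefficients assumed, Cr held at its neutral value of 128; values are illustrative only):

```python
# For a fixed Cr, find the largest Cb that still converts to legal RGB
# at a given luma. The ceiling moves with Y.
def max_cb(y, cr=128):
    best = None
    for cb in range(256):
        r = y + 1.402 * (cr - 128)
        g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
        b = y + 1.772 * (cb - 128)
        if all(0 <= c <= 255 for c in (r, g, b)):
            best = cb   # keep the highest Cb that stays in gamut
    return best

for y in (32, 128, 220):
    print(y, max_cb(y))  # the limit is different at each luma
```

    At low Y the green channel (Y - 0.344*(Cb-128)) goes negative first; at high Y the blue channel (Y + 1.772*(Cb-128)) overflows first.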
    Last edited by jagabo; 20th Dec 2013 at 17:59.
  7. Originally Posted by JasonCA View Post


    However, if you change max_chroma=240, you can boost the saturation to ANYTHING you like and it won't clip.

    For example:

    Code:
    ColorBars()
    ConvertToYV12()
    Crop(last,0,0,640,300)
    BilinearResize(640,480)
    ConvertToYV12()
    Tweak(sat=8.0) # heavy saturation
    Limiter(last,16, 235, 16,240,"chroma") #luma (16 to 235), Chroma (16 to 240)
    Histogram("levels")
    Notice that I've set the saturation to 8.0 and set max_chroma to 240 instead of 239. Although the Histogram "levels" scope indicates the Chroma is slammed, the Limiter doesn't show that it's clamping the Chroma with any YELLOW. Limiter's default is max_chroma=240, not 239. So why is this?
    For Tweak, you need to set coring=false, otherwise it clips Y' 16-235, CbCr 16-240

    http://avisynth.nl/index.php/Tweak
    coring = true/false (optional; true by default, which reflects the behaviour in older versions). When set to true, the luma (Y) is clipped to [16,235] and the chroma (U, V) is clipped to [16,240]; when set to false, the luma and chroma are unconstrained. [Added in v2.53].
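    In other words, coring is just a clamp applied after the adjustment. A minimal sketch (plain Python, hypothetical sample values; Tweak's real implementation operates on whole frames):

```python
# What coring=true amounts to: after the adjustment, luma is clamped to
# 16-235 and chroma to 16-240, so over-driven values are silently pulled
# back inside the "TV" range before Limiter ever sees them.
def core(y, c):
    return max(16, min(235, y)), max(16, min(240, c))

print(core(250, 245))  # (235, 240)
print(core(100, 128))  # (100, 128)
```

    That's why the earlier ColorBars test never showed yellow with max_chroma=240: Tweak had already clamped chroma to exactly that range.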
    Originally Posted by poisondeathray View Post
    As long as you have values within YCbCr 0-255, you can salvage those values. As soon as you touch RGB all bets are off
    This is a very good point! And that's what I am looking for to help answer my question. So heavily saturated U and V is OK as long as it's not clipped. If the stored U and V are too heavily saturated, then they may NOT translate nicely to RGB. However, as you are saying, the data is not lost...the capture card did capture the full U and V signal as long as the U and V didn't get clipped. So in post-processing for RGB, even though the U and V are heavily saturated, I can simply reduce the saturation and end up with the same results as if I had captured at that saturation to begin with. Bingo! That answers my question! I'm sooooo thrilled!

    This makes me wonder:

    Is it better to have more heavily saturated U and V (as long as they don't clamp) to better represent the color when the capture card is sampling the Chroma? If my Chroma were near greyscale, trying to expand it would lead to banding...right? So, to get good sampling on my Chroma, is it good that my Chroma is evenly distributed between 0 and 255? Hmmmmmm
    There are 2 schools of thought. One is to make adjustments in post, adjusting the capture toward some flexible goal. The other is to get the capture as close as possible to the desired result. My thought is to get as close as possible with 8bit captures, because of the posterization induced by large shifts. With higher bitdepth captures (10bit and more), and higher quality content, it's easier to make changes and do grading without turning the picture to mush.


    For VHS, I'm capturing data in 8 bit Y'CbCr (or YUV), so does this mean that in the future, I'd be able to see better dynamic range with a larger RGB color space such as maybe Rec2020? 8bit Y'CbCr may not fully translate to 8bit RGB. But, would 8bit Y'CbCr translate better to, say, 10bit RGB with a larger color gamut? Or how is that related? I am thinking, wow, maybe in the future I'll see VHS tapes in a way that I have never seen them before in terms of the color (realizing the VHS resolution will always remain low).
    Yes, it means you have the ability to "see" more colors on a display, and a wider gamut - but you probably won't see much difference from a VHS source.

    10bit wide gamut displays are already available, and have been for years.

    There are new 4K models already out as well (expensive)
  8. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by jagabo View Post
    Maybe an example will help. Here's a random video that happens to be on my computer right now. Using this script:

    Code:
    v1=ffVideoSource("D:\Downloads\TEST1.mkv", fpsnum=30000, fpsden=1001).BilinearResize(320,240)
    v2=ColorYUV(v1, cont_u=500, cont_v=500)
    v3=ColorYUV(v2, cont_u=-170, cont_v=-170)
    
    Interleave(v1.Subtitle("original"), v2.Subtitle("bad"), v3.Subtitle("corrected"))
    
    ConvertToYUY2().VideoScope("both", true, "Y", "UV", "UV")
    v1 is the original capture:

    Image
    [Attachment 22232 - Click to enlarge]


    v2 is a simulation of a capture where the colors were extremely over saturated. But the individual U and V values were still within the 0 to 255 range:

    Image
    [Attachment 22233 - Click to enlarge]


    When you run HighlightBadRGB() on v2 you get an image that is almost entirely red:

    Image
    [Attachment 22235 - Click to enlarge]


    But the original colors can be restored from v2 because U and V were not clipped. v3 is v2 corrected to restore the original colors:

    Image
    [Attachment 22234 - Click to enlarge]

    WOW!!!! This really helps summarize and put it all together. From this you get a feel for how Luma, U, and V are stored, and how that relationship affects the conversion to RGB. Great example! I'm already playing around with this right now.

    Going from Original to Corrected, you can see that if I stored the Chroma out of the Rec 601 range (Chroma between 16 and 240), the Chroma data was NOT lost...and that it's just a matter of reducing the Chroma to get it back within the same limits as Y (luma).

    What that shows me is that if, on capture, my Chroma Saturation is not exactly perfect, I don't have to have a panic attack as long as that Chroma Saturation is not clipping in U and V!

    You guys have really nailed what I was trying to get at! I'll continue to play with this.
  9. Originally Posted by JasonCA View Post
    What that shows me is that if, on capture, my Chroma Saturation is not exactly perfect, I don't have to have a panic attack as long as that Chroma Saturation is not clipping in U and V!
    Exactly.

    Here's an animation that shows you valid U and V values for all values of Y:

    Code:
    function AnimateY(clip vid, int offset)
    {
      ColorYUV(vid, off_y=offset)
    #  Subtitle("Y="+String(offset))
    }
    
    
    BlankClip(length=256, width=256, height=256)
    
    #build a greyscale gradient
    black=Crop(0,0,-0,1)
    white=black.Invert()
    StackVertical(black,white)
    BilinearResize(256,256)
    Crop(0,63,-0,-63)
    grad=BilinearResize(256,256)
    
    
    U=grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    V=grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
    YtoUV(U, V, Y)
    
    Animate(last, 0, 255, "AnimateY", 0, 255)
    HighlightBadRGB(0)
    ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
    HighlightBadRGB() gives some artifacts (lines, blocks) at the edges (I might look into that later) but you get the idea.
  10. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by jagabo View Post
    Originally Posted by JasonCA View Post
    What that shows me is that if, on capture, my Chroma Saturation is not exactly perfect, I don't have to have a panic attack as long as that Chroma Saturation is not clipping in U and V!
    Exactly.

    Here's an animation that shows you valid U and V values for all values of Y (maybe it's my older version of AviSynth, since I'm using 2.5):

    Code:
    function AnimateY(clip vid, int offset)
    {
      ColorYUV(vid, off_y=offset)
    #  Subtitle("Y="+String(offset))
    }
    
    BlankClip(length=256, width=256, height=256)
    
    #build a greyscale gradient
    black=Crop(0,0,-0,1)
    white=black.Invert()
    StackVertical(black,white)
    BilinearResize(256,256)
    Crop(0,63,-0,-63)
    grad=BilinearResize(256,256)
    
    
    U=grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    V=grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
    YtoUV(U, V, Y)
    
    Animate(last, 0, 255, "AnimateY", 0, 255)
    HighlightBadRGB(0)
    ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
    HighlightBadRGB() gives some artifacts (lines, blocks) at the edges (I might look into that later) but you get the idea.

    Hmm, I got the following error msg. Must be due to the crop. See the error message below:

    Code:
    function AnimateY(clip vid, int offset)
    {
      ColorYUV(vid, off_y=offset)
    #  Subtitle("Y="+String(offset))
    }
    
    BlankClip(length=256, width=256, height=256)
    
    #build a greyscale gradient
    black=Crop(0,0,-0,1)
    white=black.Invert()
    StackVertical(black,white)
    BilinearResize(256,256) #ERROR MSG HERE: "Resize: Source image too small for this resize method. Width=1, Support=1"
    Crop(0,63,-0,-63)
    grad=BilinearResize(256,256)
    
    
    U=grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    V=grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
    YtoUV(U, V, Y)
    
    Animate(last, 0, 255, "AnimateY", 0, 255)
    HighlightBadRGB(0)
    ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
    To fix above I made the following change:

    Code:
    black=Crop(0,0,2,2)
    Amazing how little of the YUV colorspace maps to valid RGB...there's a lot of clipping going on. I noticed the artifacts too. Yeah, I wonder why that happens with HighlightBadRGB(). You can see how much color is available at each Y on the vectorscope too. So much fun!

    I'll continue to look at this.
    Your fix doesn't work quite right; the cube is too small. Try this:

    Code:
    function HighlightBadRGB(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(vid, ConvertToRGB(vid).ConvertToYV12())
      Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add") # Y = abs(Y-126)
      ColorYUV(gain_y=65000)
      Overlay(vid,badcolor,0,0,last)
    }
    
    function AnimateY(clip vid, int offset)
    {
      ColorYUV(vid, off_y=offset)
    #  Subtitle("Y="+String(offset))
    }
    
    
    BlankClip(length=256, width=256, height=256)
    
    #build a greyscale gradient
    black=Crop(0,0,-0,1)
    white=black.Invert()
    StackVertical(black,black,white,white)
    BilinearResize(256,1024)
    Crop(0,383,-0,-384)  # adjust here if necessary, you want black at the top, white at the bottom
    grad=BilinearResize(256,256)
    
    U=grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    V=grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
    Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
    YtoUV(U, V, Y)
    
    Animate(last, 0, 255, "AnimateY", 0, 255)
    HighlightBadRGB(0)
    #ShowBadRGB()
    ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
    If you look at grad, it should be a 256x256 block with a gradient from full black at the top (R=G=B=0) to full white at the bottom (R=G=B=255). If AviSynth 2.5's BilinearResize works a little differently you may have to adjust the Crop() between the two BilinearResize() calls. It doesn't have to be perfect to get the gist of what's going on. MKV attached.

    You can use Gavino's ShowBadRGB() instead of my HighlightBadRGB() to get cleaner results. Though, as you know, his function gives you a grey box where the valid colors are, and passes through the bad colors.
  12. I updated my HighlightBadRGB() function to be more accurate. Previously it was only looking at differences in the Y channel after ConvertToRGB().ConvertToYV24(). Now it looks for differences in the Y, U, and V channels.

    Code:
    function HighlightBadRGB(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(ConvertToYV24(vid), ConvertToYV24(vid).ConvertToRGB().ConvertToYV24())
      absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add")
      absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add")
      absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add")
      Overlay(absU,absV, mode="add")
      Overlay(last,absY, mode="add")
      ColorYUV(gain_y=65000)
      Overlay(vid,badcolor,0,0,last)
    }
    Attached is my "valid YUV" animation made with the updated HighlightBadRGB() and with Gavino's ShowBadRGB()
    Last edited by jagabo; 20th Dec 2013 at 23:13.
  13. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by jagabo View Post
    I updated my HighlightBadRGB() function to be more accurate. Previously it was only looking at differences in the Y channel after ConvertToRGB().ConvertToYV24(). Now it looks for differences in the Y, U, and V channels.

    Code:
    function HighlightBadRGB(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(ConvertToYV24(vid), ConvertToYV24(vid).ConvertToRGB().ConvertToYV24())
      absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add")
      absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add")
      absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add")
      Overlay(absU,absV, mode="add")
      Overlay(last,absY, mode="add")
      ColorYUV(gain_y=65000)
      Overlay(vid,badcolor,0,0,last)
    }
    Attached is my "valid YUV" animation made with the updated HighlightBadRGB() and with Gavino's ShowBadRGB()
    This is great Jagabo! However, I just tried it and the Overlays are coming back at half the size.

    I get the following AviSynth error msg: "Overlay: Mask and overlay must have the same image size! (Width is not the same)"

    To put a patch in to make it work, I did the following:

    Code:
    function HighlightBadRGB(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(ConvertToYV12(vid), ConvertToYV12(vid).ConvertToRGB().ConvertToYV12())
      
      absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add")
      absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add")
      absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add")
      Overlay(absU,absV, mode="add")
      Overlay(last,absY, mode="add")
      ColorYUV(gain_y=65000)
     
      #NOTE: absU and absV for me are returning an image half the size of my input.
      #I added BilinearResize to resize the overlay to match my input video (I know, this is not a FIX)
      BilinearResize(720,480)
    
      Overlay(vid,badcolor,0,0,last)
    }
    After making it work, I see the BAD RGB everywhere in my video despite even lowering the Chroma.

    Have you tried running this through some actual video? Aside from that, I only have ConvertToYV12() so I reverted back to that for AviSynth 2.5. I'll upgrade to the new one soon!

    With AviSynth 2.6 working in YV24, the U and V planes are the same size as the Y plane. Since you are working with YV12, the U and V planes are half the size (in each dimension) of the Y plane. The chroma supersampling going from YUV 4:2:0 to RGB 4:4:4, and chroma subsampling going back to YUV 4:2:0, causes many small rounding errors.
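    The subsampling round trip can be sketched in a few lines (plain Python, hypothetical chroma samples; AviSynth's real resamplers are more sophisticated than this nearest-neighbour sketch):

```python
# 4:2:0-style chroma round trip: subsampling averages neighbouring chroma
# samples, and upsampling cannot recover the originals exactly.
def subsample(row):   # 2:1 horizontal average, rounded
    return [round((a + b) / 2) for a, b in zip(row[::2], row[1::2])]

def upsample(row):    # nearest-neighbour duplication
    return [c for c in row for _ in (0, 1)]

chroma   = [128, 130, 131, 129, 140, 141]
restored = upsample(subsample(chroma))
print(restored != chroma)  # True: small rounding/averaging errors remain
```

    Those leftover one-or-two-code errors are exactly what HighlightBadRGB() was flagging as red dots everywhere.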

    Try this for AviSynth 2.5:

    Code:
    function HighlightBadRGB2dot5(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(ConvertToYUY2(vid), ConvertToYUY2(vid).ConvertToRGB().ConvertToYUY2())
      absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add").ColorYUV(off_y=-1)
      absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
      absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
      Overlay(absU,absV, mode="add").PointResize(vid.width, vid.height)
      Overlay(last,absY, mode="add")
      ColorYUV(gain_y=65000)
      Overlay(vid,badcolor,0,0,last)
    }
    Using YUY2 instead of YV12 gives fewer up/down sampling errors. Adding ColorYUV(off_y=-1) ignores small rounding errors (of course, it ignores small true errors too). But it gets the output close to what you get with YV24 in AviSynth 2.6.

    HighlightBadRGB() and HighlightBadRGB2dot5():
    Image
    [Attachment 22291 - Click to enlarge]

    That same video using YV12, and without ignoring small errors, gave many red dots everywhere.
  15. Member
    Join Date
    Dec 2005
    Location
    United States
    Originally Posted by jagabo View Post
    With AviSynth 2.6 working in YV24, the U and V planes are the same size as the Y plane. Since you are working with YV12, the U and V planes are half the size (in each dimension) of the Y plane. The chroma supersampling going from YUV 4:2:0 to RGB 4:4:4, and chroma subsampling going back to YUV 4:2:0, causes many small rounding errors.

    Try this for AviSynth 2.5:

    Code:
    function HighlightBadRGB2dot5(clip vid, int "color")
    {
      color = default(color, $ff0000)
    
      badcolor = BlankClip(vid, color=color)
      Subtract(ConvertToYUY2(vid), ConvertToYUY2(vid).ConvertToRGB().ConvertToYUY2())
      absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add").ColorYUV(off_y=-1)
      absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
      absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
      Overlay(absU,absV, mode="add").PointResize(vid.width, vid.height)
      Overlay(last,absY, mode="add")
      ColorYUV(gain_y=65000)
      Overlay(vid,badcolor,0,0,last)
    }
    Using YUY2 instead of YV12 gives fewer up/down sampling errors. Adding ColorYUV(off_y=-1) ignores small rounding errors (of course, it ignores small true errors too). But it gets the output close to what you get with YV24 in AviSynth 2.6.

    HighlightBadRGB() and HighlightBadRGB2dot5():
    Image
    [Attachment 22291 - Click to enlarge]


    That same video using YV12, and without ignoring small errors, gave many red dots everywhere.
    I'm playing around with it right now. It seems to work so far! Excellent!

    AviSynth V2.6 official is not out yet. The preview is available. So where did you get your AviSynth 2.6? Or, which version do you recommend as a replacement for my AviSynth 2.5? Seems like there is an 'unofficial' multithreaded MT version.

    Apparently there is also a %avisynthdir% search path somewhere. I'm not sure where that is settable. I don't see it in my environment variables.

    Once I update, I'll try the version that works for 2.5 and the new one that should work under 2.6.
  16. Member 2Bdecided
    Join Date
    Nov 2007
    Location
    United Kingdom
    Originally Posted by JasonCA View Post
    Originally Posted by poisondeathray View Post
    Typically what is used is a YCbCr parade. It's a waveform tracing of Y, Cb, Cr.

    In avisynth you can use Histogram("levels") to see Y, Cb, Cr displayed. It goes from 0-255. If the ends are "bunched up" then you have clipping.
    http://avisynth.nl/index.php/Histogram#Levels_mode
    Fantastic! No really, this is very helpful!
    Yes, it is, and if you ever get around to actually capturing some VHS and using histogram(mode="levels") to analyse it, you'll see that U and V are so far away from clipping that you're wasting your time even thinking about it.

    Cheers,
    David.


