You need version 2.6 of AviSynth to use ConvertToYV24() -- YUV 4:4:4.
Thank you jagabo! That is my problem! I'm using AviSynth 2.5. I'll upgrade to that here soon.
It seems like the goal is to balance the Luma levels with the Chroma levels. For NTSC VHS, this means Luma is captured from 7.5 IRE to 100 IRE at the capture card. Right? In the same way, the Chroma levels, at least to me, need to match the Luma levels, which would put the Chroma at 75% saturation on the VectorScope. But I'm not 100% sure, which is why I am asking and discussing so that I better understand how to measure Chroma levels.
You said, "If you're planning on adjusting colors later, and will be working in YUV, you don't need to worry about that during capture." But working in YUV, how do I know the UV portion of the signals are clipping to my capture card or not?
"As long as the colors aren't clipping at the hard boudaries (0, 255) you can still correct them." By this, you mean after conversion to RGB. It seems to me, on the conversion to RGB that if the level of Chroma where also captured at the proper levels of the Luma, then in conversion to RGB they should not clip. I believe I remember reading on the Doom forum that one of the ways to ensure this is to make sure your Chroma levels are just slightly lower then the Luma levels. I'll try to go back and look for that.
There were two scripts I posted in that post. One was with Tweak(sat=1.0), i.e. normal saturation, meaning I am not affecting the color bars at all. In the other one, I slightly increased the saturation, and you can see two of the color bands clipped. It was just to make the point that when looking at U and V independently (not the UV vectorscope), you CANNOT determine from the independent U and V scopes that your color levels would clip. I suppose that's because of the relationship that U and V have to the Luma on conversion to RGB.
However, disregarding Luma, perhaps all I need to look at is the independent U and V scopes, to see that their signal is not being clipped from top to bottom? If U and V are truly the Chrominance signal and they are not being clipped in the independent U and V scopes, then perhaps, as far as what the capture card is seeing, the signal is fully captured. The UV may not match the Luma levels, but at least the U and V signal is not clipped. In which case, I can just lower the saturation via software and end up with the same result.
Luminance and Chrominance are separate signals. Perhaps I'm confusing things a bit in trying to ensure both are balanced together. My main concern is ensuring that each signal, Luminance and Chrominance, is being properly captured by the capture card.
Right...there seems to be some relationship between the Luma levels and the Chroma levels.
Ok great!!
-
Look at U and V in VideoScope(). As long as there are no flat peaks at 255, or flat valleys at 0, they are not clipping*.
No, they did not clip. U and V were still within the 0-255 range. They won't be clipped until you convert to RGB. So long as the video is in YUV you can fix it by reducing the saturation.
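For example, a minimal sketch of that check (the AviSource filename is hypothetical; VideoScope() is the same plugin used elsewhere in this thread):
Code:
# Inspect the U and V waveforms for clipping; flat lines pinned at 0 or 255 indicate clipped chroma
AviSource("capture.avi") # hypothetical capture file
ConvertToYUY2()
VideoScope("both", true, "U", "V", "UV")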
* In practice, a capture device or driver may clip somewhere above 0 or below 255.
-
Maybe an example will help. Here's a random video that happens to be on my computer right now. Using this script:
Code:
v1 = ffVideoSource("D:\Downloads\TEST1.mkv", fpsnum=30000, fpsden=1001).BilinearResize(320,240)
v2 = ColorYUV(v1, cont_u=500, cont_v=500)
v3 = ColorYUV(v2, cont_u=-170, cont_v=-170)
Interleave(v1.Subtitle("original"), v2.Subtitle("bad"), v3.Subtitle("corrected"))
ConvertToYUY2().VideoScope("both", true, "Y", "UV", "UV")
v2 is a simulation of a capture where the colors were extremely over saturated. But the individual U and V values were still within the 0 to 255 range:
When you run HighlightBadRGB() on v2 you get an image that is almost entirely red:
But the original colors can be restored from v2 because U and V were not clipped. v3 is v2 corrected to restore the original colors:
-
Fantastic! No really, this is very helpful!
I found something interesting when playing with this and I am wondering if it's a bug with the Limiter function:
Code:
ColorBars()
ConvertToYV12()
Crop(last, 0, 0, 640, 300)
BilinearResize(640,480)
ConvertToYV12()
Tweak(sat=1.55)
Limiter(last, 16, 235, 16, 239, "chroma") # luma (16 to 235), chroma (16 to 239)
Histogram("levels")
However, if you change max_chroma to 240, you can boost the saturation to ANYTHING you like and Limiter won't flag any clipping.
For example:
Code:
ColorBars()
ConvertToYV12()
Crop(last, 0, 0, 640, 300)
BilinearResize(640,480)
ConvertToYV12()
Tweak(sat=8.0) # heavy saturation
Limiter(last, 16, 235, 16, 240, "chroma") # luma (16 to 235), chroma (16 to 240)
Histogram("levels")
Too bad the Limiter function does not let you specify an alternative color for representing clamped Chroma (instead of it having to be yellow). As the AviSynth documentation says (NOTE: the wiki documentation seems to be lacking the clipping color): "show can be "luma" (shows out of bounds luma in red/green), "luma_grey" (shows out of bounds luma and makes the remaining pixels greyscale), "chroma" (shows out of bounds chroma in yellow), "chroma_grey" (shows out of bounds chroma and makes the remaining pixels greyscale). The coloring is done as follows"
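As an aside, a quick sketch of the show modes in action (the saturation value here is arbitrary; Tweak's coring parameter assumes v2.53 or later):
Code:
# Push color bars out of range, then let Limiter paint the out-of-bounds chroma yellow
ColorBars()
ConvertToYV12()
Tweak(sat=2.0, coring=false) # oversaturate without Tweak clamping the result
Limiter(16, 235, 16, 240, show="chroma")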
Carrying on...
This is a very good point! And that's what I am looking for to help answer my question. So heavily saturated U and V is OK as long as it's not clipped. If the stored U and V are too heavily saturated, then they may NOT translate nicely to RGB. However, as you are saying, the data is not lost; the capture card did capture the full U and V signal as long as U and V didn't get clipped. So in post-processing for RGB, even though the U and V are heavily saturated, I can simply reduce the saturation and end up with the same results as if I had captured at that saturation to begin with. Bingo! That answers my question! I'm sooooo thrilled!
This makes me wonder:
Is it better to have more heavily saturated U and V (as long as they don't clamp) to better represent the color when the capture card is sampling the Chroma? If my Chroma were near greyscale, trying to expand it would lead to banding, right? So, to get good sampling on my Chroma, is it good that my Chroma is evenly distributed between 0 and 255? Hmmmmmm
For VHS, I'm capturing data in 8 bit Y'CbCr (or YUV), so does this mean that in the future, I'd be able to see better dynamic range with a larger RGB color space such as maybe Rec 2020? 8 bit Y'CbCr may not fully translate to 8 bit RGB. But would 8 bit Y'CbCr translate better to, say, 10 bit RGB with a larger color gamut? Or how is that related? I am thinking, wow, maybe in the future I'll see VHS tapes in a way that I have never seen them before in terms of color (realizing the VHS resolution will always remain low).
-
Very clear jagabo! That's what I was looking for someone to tell me. So how U and V translate to RGB is a WHOLE other thing.
In terms of what I am capturing, I can use VideoScope while looking at U and V and ensure that there are no flat peaks at 255 or flat valleys at 0 (depending on my capture card's limits).
I got it now! It's making better sense.
The relationship that I was trying to get at is stated in the Limiter documentation. It says: "The standard known as CCIR-601 defines the range of pixel values considered legal for presenting on a TV. These ranges are 16-235 for the luma component and 16-240 for the chroma component."
So, I was seeing a relationship between Luma and Chroma but didn't know how it ties together. For VHS and Rec 601, it would seem that converting to RGB works best when Luma is from 16-235 and Chroma is stored as 16-240. I'd probably get less RGB clipping since the Chroma would be within the conversion range of RGB. However, as said, conversion to RGB is its own beast and can be kept separate from the capture process!
My main interest, as you are aware, is ensuring that I am not clipping the Chroma. So now I see that as long as U and V are not clipping at 0 or 255 (roughly, and depending on the capture card's max Chroma limits), then I am good! And now I see this can be detected via the Limiter function, the Histogram("levels") function, and the VideoScope (U and V) scopes.
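Something like this sketch could combine all three checks on a capture (the source filename is hypothetical; VideoScope() is the plugin used earlier in the thread):
Code:
# Three ways to spot chroma clipping in one script
AviSource("capture.avi") # hypothetical capture file
ConvertToYV12()
Limiter(16, 235, 16, 240, show="chroma") # paints out-of-bounds chroma yellow
Histogram("levels") # Y/U/V level histograms on the right
ConvertToYUY2()
VideoScope("both", true, "U", "V", "UV") # U and V waveforms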
It would therefore seem the VectorScope (no, not the VideoScope) is better suited for seeing how the U and V get translated to actual RGB colors. As said, the Chroma must be at 75% saturation, which I guess corresponds to 16 to 240 in Chroma space. But I'll have to experiment a bit with this to see if that's really true.
This is something I will experiment with on my capture card. For example, one of my capture cards is the ATI 600 USB. So, I wonder what its max Chroma clipping range would be. I'll experiment with that soon! This is very exciting!
-
For Tweak, you need to set coring=false, otherwise it clips Y' to 16-235 and CbCr to 16-240.
http://avisynth.nl/index.php/Tweak
coring = true/false (optional; true by default, which reflects the behaviour in older versions). When set to true, the luma (Y) is clipped to [16,235] and the chroma (U, V) is clipped to [16,240]; when set to false, the luma and chroma are unconstrained. [Added in v2.53]
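A minimal sketch of the difference (the saturation value is arbitrary):
Code:
# With the default coring=true, Tweak itself clamps Y to 16-235 and CbCr to 16-240,
# hiding the out-of-range chroma you are trying to examine; coring=false leaves it alone
ColorBars()
ConvertToYV12()
Tweak(sat=8.0, coring=false) # heavy saturation, unclamped
Histogram("levels")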
Quoting the earlier question: "would 8bit Y'CbCr translate better to say 10bit RGB with a larger color gamut?"
10bit wide gamut displays are already available, and have been for years.
There are new 4K models already out as well (expensive).
-
WOW!!!! This really helps summarize and put it all together. From this you get a feel for how Luma, U, and V are stored, and the relationship on how it affects the conversion to RGB. Great example! I'm already playing around with this right now.
Going from Original to Corrected, you can see that if I stored the Chroma outside the Rec 601 range (Chroma between 16 and 240), the Chroma data was NOT lost, and that it's a matter of reducing the Chroma to get it back within the same limits as the Y (luma).
What that shows me is that if, on capture, my Chroma saturation is not exactly perfect, I don't have to have a panic attack, as long as that Chroma saturation is not clipping in U and V!
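For example, a sketch of that kind of after-the-fact correction (the filename is hypothetical, and the sat value is arbitrary; you'd tune it while watching the scopes):
Code:
# An oversaturated but unclipped capture can be pulled back in software
AviSource("capture.avi") # hypothetical capture file
ConvertToYV12()
Tweak(sat=0.8, coring=false) # reduce saturation until U/V fit within 16-240
Limiter(16, 235, 16, 240, show="chroma") # verify nothing still flags yellow
Histogram("levels")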
You guys have really nailed what I was trying to get at! I'll continue to play with this.
-
Exactly.
Here's an animation that shows you valid U and V values for all values of Y:
Code:
function AnimateY(clip vid, int offset)
{
    ColorYUV(vid, off_y=offset)
    # Subtitle("Y="+String(offset))
}

BlankClip(length=256, width=256, height=256)

# build a greyscale gradient
black = Crop(0,0,-0,1)
white = black.Invert()
StackVertical(black, white)
BilinearResize(256,256)
Crop(0,63,-0,-63)
grad = BilinearResize(256,256)

U = grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
V = grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
YtoUV(U, V, Y)

Animate(last, 0, 255, "AnimateY", 0, 255)
HighlightBadRGB(0)
ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
-
Hmm, I got the following error msg. Must be due to the crop. See the error message below:
Code:
function AnimateY(clip vid, int offset)
{
    ColorYUV(vid, off_y=offset)
    # Subtitle("Y="+String(offset))
}

BlankClip(length=256, width=256, height=256)

# build a greyscale gradient
black = Crop(0,0,-0,1)
white = black.Invert()
StackVertical(black, white)
BilinearResize(256,256) # ERROR MSG HERE: "Resize: Source image too small for this resize method. Width=1, Support=1"
Crop(0,63,-0,-63)
grad = BilinearResize(256,256)

U = grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
V = grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
YtoUV(U, V, Y)

Animate(last, 0, 255, "AnimateY", 0, 255)
HighlightBadRGB(0)
ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
As a patch, I got it to run by changing the crop:
Code:
black=Crop(0,0,2,2)
I'll continue to look at this.
-
Your fix doesn't work quite right, the cube is too small. Try this:
Code:
function HighlightBadRGB(clip vid, int "color")
{
    color = default(color, $ff0000)
    badcolor = BlankClip(vid, color=color)
    Subtract(vid, ConvertToRGB(vid).ConvertToYV12())
    Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add") # Y = abs(Y-126)
    ColorYUV(gain_y=65000)
    Overlay(vid, badcolor, 0, 0, last)
}

function AnimateY(clip vid, int offset)
{
    ColorYUV(vid, off_y=offset)
    # Subtitle("Y="+String(offset))
}

BlankClip(length=256, width=256, height=256)

# build a greyscale gradient
black = Crop(0,0,-0,1)
white = black.Invert()
StackVertical(black, black, white, white)
BilinearResize(256,1024)
Crop(0,383,-0,-384) # adjust here if necessary, you want black at the top, white at the bottom
grad = BilinearResize(256,256)

U = grad.FlipVertical().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
V = grad.TurnLeft().ConvertToYV12(matrix="PC.601").BilinearResize(128,128)
Y = grad.ConvertToYV12(matrix="PC.601").ColorYUV(off_y=-256)
YtoUV(U, V, Y)

Animate(last, 0, 255, "AnimateY", 0, 255)
HighlightBadRGB(0)
#ShowBadRGB()
ConvertToYUY2().VideoScope("both", true, "U", "V", "UV")
You can use Gavino's ShowBadRGB() instead of my HighlightBadRGB() to get cleaner results. Though, as you know, his function gives you a grey box where the valid colors are, and passes through the bad colors.
-
I updated my HighlightBadRGB() function to be more accurate. Previously it was only looking at differences in the Y channel after ConvertToRGB().ConvertToYV24(). Now it looks for differences in the Y, U, and V channels.
Code:
function HighlightBadRGB(clip vid, int "color")
{
    color = default(color, $ff0000)
    badcolor = BlankClip(vid, color=color)
    Subtract(ConvertToYV24(vid), ConvertToYV24(vid).ConvertToRGB().ConvertToYV24())
    absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add")
    absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add")
    absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add")
    Overlay(absU, absV, mode="add")
    Overlay(last, absY, mode="add")
    ColorYUV(gain_y=65000)
    Overlay(vid, badcolor, 0, 0, last)
}
-
This is great jagabo! However, I just tried it and the Overlay calls are coming back at half the size.
I get the following AviSynth error msg: "Overlay: Mask and overlay must have the same image size! (Width is not the same)"
To put a patch in to make it work, I did the following:
Code:
function HighlightBadRGB(clip vid, int "color")
{
    color = default(color, $ff0000)
    badcolor = BlankClip(vid, color=color)
    Subtract(ConvertToYV12(vid), ConvertToYV12(vid).ConvertToRGB().ConvertToYV12())
    absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add")
    absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add")
    absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add")
    Overlay(absU, absV, mode="add")
    Overlay(last, absY, mode="add")
    ColorYUV(gain_y=65000)
    # NOTE: absU and absV for me are returning an image half the size of my input.
    # I added BilinearResize to resize the overlay to match my input video (I know, this is not a FIX)
    BilinearResize(720,480)
    Overlay(vid, badcolor, 0, 0, last)
}
Have you tried running this through some actual video? Aside from that, I only have ConvertToYV12() so I reverted back to that for AviSynth 2.5. I'll upgrade to the new one soon!
-
With AviSynth 2.6 working in YV24, the U and V planes are the same size as the Y plane. Since you are working with YV12, the U and V planes are half the size (in each dimension) of the Y plane. The chroma supersampling going from YUV 4:2:0 to RGB 4:4:4, and the chroma subsampling going back to YUV 4:2:0, cause many small rounding errors.
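A quick way to see that plane-size difference for yourself (a small sketch using ColorBars, separate from the function fix below):
Code:
# In YV12 the U and V planes are half the Y plane's size in each dimension;
# UtoY() promotes the U plane to luma so its dimensions can be read directly
a = ColorBars().ConvertToYV12()
Subtitle(a, "YV12 U plane: " + String(a.UtoY().width) + "x" + String(a.UtoY().height))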
Try this for AviSynth 2.5:
Code:
function HighlightBadRGB2dot5(clip vid, int "color")
{
    color = default(color, $ff0000)
    badcolor = BlankClip(vid, color=color)
    Subtract(ConvertToYUY2(vid), ConvertToYUY2(vid).ConvertToRGB().ConvertToYUY2())
    absY = Overlay(ColorYUV(off_y=-126), Invert().ColorYUV(off_y=-130), mode="add").ColorYUV(off_y=-1)
    absU = Overlay(UtoY().ColorYUV(off_y=-128), UtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
    absV = Overlay(VtoY().ColorYUV(off_y=-128), VtoY().Invert().ColorYUV(off_y=-128), mode="add").ColorYUV(off_y=-1)
    Overlay(absU, absV, mode="add").PointResize(vid.width, vid.height)
    Overlay(last, absY, mode="add")
    ColorYUV(gain_y=65000)
    Overlay(vid, badcolor, 0, 0, last)
}
HighlightBadRGB() and HighlightBadRGB2dot5():
That same video using YV12, and without ignoring small errors, gave many red dots everywhere.
-
I'm playing around with it right now. It seems to work so far! Excellent!
AviSynth 2.6 official is not out yet; the preview is available. So where did you get your AviSynth 2.6? Or, which version do you recommend as a replacement for my AviSynth 2.5? It seems like there is also an 'unofficial' multithreaded MT version.
Apparently there is also a %avisynthdir% search path somewhere. I'm not sure where that is settable. I don't see it in my environment variables.
Once I update, I'll try the version that works for 2.5 and the new one that should work under 2.6.
-