First time post, long time lurker. Hopefully someone here can help me with a little snag I have run into with my latest GNURadio project.
I would like to take an image in the YUV colorspace and split the Y, U, and V components into separate streams to feed an NTSC modulator. I have the RF portion built and, from what I can tell, working, but of course I need some valid information to feed into it first.
The video stream would come from gstreamer, and the output would be a set of named pipes through which the individual streams would pass to the radio.
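For concreteness, here is roughly the plumbing I have in mind on the receiving end; the pipe names are just placeholders, and I haven't worked out what writes into them yet:
Code:
# one named pipe per plane (names are placeholders)
mkfifo /tmp/y_plane /tmp/u_plane /tmp/v_plane
# whatever splits the video would write raw Y, U, and V bytes into these,
# and the flowgraph would read each one with a File Source block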
My experience in programming is limited to Python at this point. Not sure if that matters here, but figured I'd note that.
Thanks in advance.
-
Cb and Cr are usually extracted as a greyscale representation of each respective chroma plane; would that work for you?
There is an ffmpeg plugin for gstreamer, and you should be able to use -vf extractplanes to get Y, Cb, Cr:
http://gstreamer.freedesktop.org/modules/gst-ffmpeg.html
https://www.ffmpeg.org/ffmpeg-filters.html#extractplanes
Alternatively, with AviSynth you can use Greyscale(), UToY(), and VToY() to show the Y, U, V planes in greyscale.
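For example, with ffmpeg something along these lines should dump all three planes to separate raw files in one pass (untested sketch; input and output names are just examples):
Code:
ffmpeg -i input.avi -filter_complex "extractplanes=y+u+v[y][u][v]" \
  -map "[y]" -f rawvideo y.raw \
  -map "[u]" -f rawvideo u.raw \
  -map "[v]" -f rawvideo v.raw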
-
Use -c:v rawvideo to decode your source video into raw YUV; the filter -vf extractplanes will split it into Y, U, and V. You can use ffmpeg to pipe to other applications.
-
If it helps, here is a stupid example of a single pipe: ffmpeg's "U" plane output goes over stdout into ffplay's stdin. "Stupid" because you could just as easily use ffplay with -vf extractplanes directly, but the point is that you can pipe ffmpeg to other applications.
The "1.avi" source video is 640x360, xvid, YUV 4:2:0, so Y' is 640x360, U is 320x180, and V is 320x180.
Code:
ffmpeg -i 1.avi -vf extractplanes=u -c:v rawvideo -f rawvideo - | ffplay -f rawvideo -s 320x180 -pix_fmt gray -
I'm not familiar with gstreamer, or with how to split the output across multiple named pipes in ffmpeg, but it should be possible.
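Untested sketch of what the named-pipe version might look like (paths and source file are just examples); note that writes to a fifo block until something opens it for reading, so the readers need to be started as well:
Code:
# create one named pipe per plane
mkfifo /tmp/y_plane /tmp/u_plane /tmp/v_plane
# split once and write each plane to its own fifo
# (-y so ffmpeg doesn't stop to ask about the already-existing fifo paths)
ffmpeg -y -i 1.avi -filter_complex "extractplanes=y+u+v[y][u][v]" \
  -map "[y]" -f rawvideo /tmp/y_plane \
  -map "[u]" -f rawvideo /tmp/u_plane \
  -map "[v]" -f rawvideo /tmp/v_plane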