I want to understand BT.601/BT.709 through a simple example.
1. Generate a 64x64 blue PNG file with Python:
#!/usr/bin/env python3
import cv2
import numpy as np
# Create a blank 64x64 black image
image = np.zeros((64, 64, 3), np.uint8)
# Fill the image with blue (OpenCV uses BGR channel order)
image[:] = (255, 0, 0)  # BGR for blue
cv2.imshow('single color', image)
cv2.imwrite("blue.png",image)
cv2.waitKey(0)
cv2.destroyWindow('single color')
2. Convert blue.png to yuv420p with ffmpeg:
ffmpeg -y -i blue.png -s 64x64 -pix_fmt yuv420p yuv420p.yuv
3. Check the output yuv420p.yuv: every pixel is (41, 240, 110).
How does an input of (0,0,255) produce an output of (41,240,110)?
Is this the BT.601 color matrix?
How can I generate BT.709 output for this blue color?
And if I convert it back with BT.601 or BT.709, should I get exactly (0,0,255), or something like (0,0,235)?
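For what it's worth, the limited-range matrix math can be checked directly in Python. This is a sketch of the standard BT.601/BT.709 equations in plain floats (not the exact integer code swscale uses):

```python
def rgb_to_ycbcr(r, g, b, kr, kb):
    """Limited-range (Y 16..235, Cb/Cr 16..240) RGB -> YCbCr for given luma weights."""
    kg = 1.0 - kr - kb
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = kr * r + kg * g + kb * b     # luma, 0..1
    cb = (b - y) / (2 * (1 - kb))     # blue-difference chroma, -0.5..0.5
    cr = (r - y) / (2 * (1 - kr))     # red-difference chroma, -0.5..0.5
    return (round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr))

print(rgb_to_ycbcr(0, 0, 255, kr=0.299,  kb=0.114))   # BT.601 -> (41, 240, 110)
print(rgb_to_ycbcr(0, 0, 255, kr=0.2126, kb=0.0722))  # BT.709 -> (32, 240, 118)
```

So (41, 240, 110) is exactly what the BT.601 matrix gives for pure blue; BT.709 gives (32, 240, 118) instead.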
Can you clarify what you're trying to do?
Is this the BT.601 color matrix?
How can I generate BT.709 output for this blue color?
You should use the flags full_chroma_int+accurate_rnd; otherwise swscale uses its less accurate method (more rounding errors). You might not see it on a single-color test, but there is a difference on more complicated tests:
ffmpeg -i blue.png -vf scale=out_color_matrix=bt709:flags=full_chroma_int +accurate_rnd,format=yuv420p yuv420p_709.yuv
If I convert it back with BT.601 or BT.709, should I get exactly (0,0,255), or something like (0,0,235)?
That holds for the single-color test. On real material you will get 8-bit rounding errors, and 4:2:0 subsampling will make the conversion irreversible for a variety of reasons.
Thanks.
I want to use a color pattern to verify which color space the Android camera's captured YUV uses, so I'm starting with a simple sample and need to know whether the output will be BT.601 or BT.709.
But when I try to run your command to convert the PNG to BT.709, I get an error on macOS:
$ ffmpeg -i blue.png -vf scale=out_color_matrix=bt709:flags=full_chroma_int +accurate_rnd,format=yuv420p yuv420p_709.yuv
ffmpeg version 4.1.4 Copyright (c) 2000-2019 the FFmpeg developers
built with Apple LLVM version 10.0.1 (clang-1001.0.46.4)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.1.4 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/adoptopenjdk-12.0.1.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/adoptopenjdk-12.0.1.jdk/Contents/Home/include/darwin' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-videotoolbox --disable-libjack --disable-indev=jack --enable-libaom --enable-libsoxr
libavutil 56. 22.100 / 56. 22.100
libavcodec 58. 35.100 / 58. 35.100
libavformat 58. 20.100 / 58. 20.100
libavdevice 58. 5.100 / 58. 5.100
libavfilter 7. 40.101 / 7. 40.101
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 3.100 / 5. 3.100
libswresample 3. 3.100 / 3. 3.100
libpostproc 55. 3.100 / 55. 3.100
Input #0, png_pipe, from 'blue.png':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: png, rgb24(pc), 64x64, 25 tbr, 25 tbn, 25 tbc
[NULL @ 0x7f8aef818e00] Unable to find a suitable output format for '+accurate_rnd,format=yuv420p'
+accurate_rnd,format=yuv420p: Invalid argument
There is a space where there shouldn't be before the "+"
Code:
ffmpeg -i blue.png -vf scale=out_color_matrix=bt709:flags=full_chroma_int+accurate_rnd,format=yuv420p yuv420p_709.yuv
Did you mean you wanted to generate a synthetic known color pattern (known values) with ffmpeg, instead of using Python or a PNG?
Great, now I have the table below:
| RGB | YCbCr(BT.601) | YCbCr(BT.709) |
|-----------+---------------+---------------|
| (0,0,255) | (41,240,110) | (32,240,118) |
What I want to do is:
1. Use Python to generate a PNG, and capture it with an Android device.
2. Call the camera API to get the YUV, and identify it as BT.601 if it's near (41,240,110) or BT.709 if it's near (32,240,118).
Is there a smarter way to do the same thing?
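If it helps, the identification step could be as simple as a nearest-reference check. A hypothetical helper, assuming you have already averaged the captured blue patch down to a single YCbCr triple:

```python
def guess_matrix(ycbcr):
    """Classify a captured YCbCr triple for the pure-blue patch as BT.601
    or BT.709 by nearest squared distance to the reference values."""
    refs = {"BT.601": (41, 240, 110), "BT.709": (32, 240, 118)}
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(refs, key=lambda name: dist2(ycbcr, refs[name]))

print(guess_matrix((43, 238, 111)))  # -> BT.601
print(guess_matrix((33, 241, 117)))  # -> BT.709
```

A multi-color pattern would make the same check more robust: sum the distances over all patches before picking the winner.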
What's more, I want to use ffmpeg to convert the YUV back to RGB in the cases below:
1. YUV BT.601 -> RGB with BT.601 matrix, expected output (0,0,255).
2. YUV BT.709 -> RGB with BT.709 matrix, expected output (0,0,255).
3. YUV BT.601 -> RGB with BT.709 matrix, expected output (?,?,?) (overflow?)
4. YUV BT.709 -> RGB with BT.601 matrix, expected output (?,?,?) (overflow?)
How can I do these conversions with ffmpeg?
You can check out Vapoursynth, which runs in Python and handles loading videos and images and colorspace conversions; since it runs in Python you can do the comparisons right there.
It needs a proper install, though, not pip alone.
You can generate known YUV or RGB values directly in ffmpeg, would that help?
What does the "Android to capture it" process entail? Is a camera physically pointed at something, or is it done completely in software? How is input fed into the Android camera? Lighting/exposure is a factor that can affect your results.
What's more, I want to use ffmpeg to convert the YUV back to RGB in the cases below:
1. YUV BT.601 -> RGB with BT.601 matrix, expected output (0,0,255).
2. YUV BT.709 -> RGB with BT.709 matrix, expected output (0,0,255).
3. YUV BT.601 -> RGB with BT.709 matrix, expected output (?,?,?) (overflow?)
4. YUV BT.709 -> RGB with BT.601 matrix, expected output (?,?,?) (overflow?)
How can I do these conversions with ffmpeg?
Expected values for scenario 3,4 are
3) RGB 0,15,255
4) RGB 3,0,245
Some colors will give you exact results, but the expected range of rounding errors for an 8-bit RGB => YUV => RGB round trip is +/-3 (i.e. some colors may be off by up to 3) when you test all colors. (10-bit YUV should give you perfect values from 8-bit RGB.)
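Those four scenarios can be reproduced with a plain-float sketch of the limited-range inverse matrices, clipping to 0..255 as a real converter would. This is illustrative math, not swscale's actual integer path:

```python
def ycbcr_to_rgb(Y, Cb, Cr, kr, kb):
    """Limited-range YCbCr -> 8-bit RGB with the given luma weights;
    out-of-range results are clipped to 0..255."""
    kg = 1.0 - kr - kb
    y  = (Y - 16) / 219.0
    cb = (Cb - 128) / 224.0
    cr = (Cr - 128) / 224.0
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    clip = lambda v: min(255, max(0, round(255 * v)))
    return tuple(clip(v) for v in (r, g, b))

BT601 = (0.299, 0.114)
BT709 = (0.2126, 0.0722)
print(ycbcr_to_rgb(41, 240, 110, *BT601))  # case 1, matched:    (0, 0, 255)
print(ycbcr_to_rgb(41, 240, 110, *BT709))  # case 3, mismatched: (0, 15, 255)
print(ycbcr_to_rgb(32, 240, 118, *BT601))  # case 4, mismatched: (3, 0, 245)
```

The mismatched cases push red or green slightly negative and blue slightly past 255, which is where the clipping (the "overflow" in the question) comes in.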
You are right, I plan to use a camera to capture the picture for analysis, so the result will be affected by the environment.
BTW, I tried the commands below:
for #2
ffmpeg -f rawvideo -vcodec rawvideo -s 64x64 -r 25 -pix_fmt yuv420p -i yuv420p_709.yuv -vf scale=out_color_matrix=bt709:flags=full_chroma_int+accurate_rnd,format=rgb24 out.rgb
ffmpeg -f rawvideo -vcodec rawvideo -s 64x64 -r 25 -pix_fmt yuv420p -i yuv420p_709.yuv -vf scale=out_color_matrix=bt601:flags=full_chroma_int+accurate_rnd,format=rgb24 out.rgb
What's wrong with my commands?
Sorry, I should have been clearer: it's supposed to be in_color_matrix when going from YUV to RGB. The matrix is always specified for the YUV side of the conversion.
If you're testing in a real environment, you should use color bars or a known test pattern, i.e. more than one color. That helps because ambient light is rarely a perfect "white". If you have known bars, you can calculate what to expect for scenarios 1-4.
Thanks.
I found an API in Android that does the YCbCr to RGB conversion. Since the application will use this API to convert the captured YCbCr to RGB, it makes sense to use the same API to identify the color space of the camera's YCbCr output.
Below are my test results:
API: rsYuvToRGBA_uchar4
| Input(YCbCr) | Output(RGB) |
| (41,240,110) | (0,0,254) |
| (32,240,118) | (2,0,244) |
When I feed in BT.709 (32,240,118), it outputs (2,0,244), but we expected (3,0,245).
I don't think this is just rounding error. Maybe Android uses yet another version of the color matrix?
Close enough. +/-3 is typical for 8-bit; there are slight differences in rounding and in the way the conversion is calculated, even within ffmpeg.
When you compare all colors, swscale with no flags is the least accurate. That looks like what Android is using; browsers currently use the same. It's the "fast", lower-accuracy method:
| Converter     | Flags    | Output  |
|---------------+----------+---------|
| swscale rgb24 | no flags | 0,0,254 |
| swscale rgb24 | flags    | 1,0,255 |
| swscale gbrp  | no flags | 1,0,255 |
| swscale gbrp  | flags    | 1,0,255 |
| zimg gbrp     | n/a      | 1,0,255 |
8-bit RGB => 10-bit YUV => 8-bit RGB will give you (0,0,255) if it's done properly.
And the difference between 601 and 709 is much larger than +/-3, so it should be easy to detect, especially with color bars.
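The 10-bit claim is easy to sanity-check with plain float math (4:4:4, no subsampling; BT.709 weights assumed here). A 10-bit quantization error is at most about 0.5/876 in luma and 0.5/896 in chroma, so each reconstructed 8-bit channel lands within 0.5 of the original and rounds back exactly:

```python
import random

KR, KB = 0.2126, 0.0722        # BT.709 luma weights (assumed)
KG = 1.0 - KR - KB

def rgb8_to_yuv10(r, g, b):
    # 10-bit limited range: Y in 64..940, Cb/Cr centered on 512 (4x the 8-bit scale)
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return (round(64 + 876 * y), round(512 + 896 * cb), round(512 + 896 * cr))

def yuv10_to_rgb8(Y, Cb, Cr):
    y, cb, cr = (Y - 64) / 876.0, (Cb - 512) / 896.0, (Cr - 512) / 896.0
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return tuple(round(255 * v) for v in (r, g, b))

# Every 8-bit RGB triple should survive the 10-bit round trip exactly.
for _ in range(10000):
    rgb = tuple(random.randrange(256) for _ in range(3))
    assert yuv10_to_rgb8(*rgb8_to_yuv10(*rgb)) == rgb
print("10-bit round trip is exact")
```

With 8-bit YUV the quantization error triples (219/896 vs 876/896 scale), which is where the +/-3 round-trip errors come from.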
8-bit limited-range YUV (YCbCr) has only about 1/6 as many valid colors as 8-bit RGB.
https://forum.videohelp.com/threads/381298-RGB-to-YUV-to-RGB#post2467087
So you can expect roughly 6 different RGB values to map to the same YUV value.
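That ratio can be confirmed by brute force: convert every 8-bit RGB triple with the standard BT.601 limited-range equations and count the distinct YCbCr results. The exact count depends on rounding details, but the ratio should come out near 6. A sketch (needs numpy; takes a few seconds and a couple hundred MB):

```python
import numpy as np

# 8-bit limited-range BT.601 forward equations (65.481 = 219*0.299, etc.,
# with the /255 normalization folded in)
def to_ycbcr(r, g, b):
    y  = np.round(16  + ( 65.481 * r + 128.553 * g +  24.966 * b) / 255)
    cb = np.round(128 + (-37.797 * r -  74.203 * g + 112.000 * b) / 255)
    cr = np.round(128 + (112.000 * r -  93.786 * g -  18.214 * b) / 255)
    return y.astype(np.uint32), cb.astype(np.uint32), cr.astype(np.uint32)

g, b = np.meshgrid(np.arange(256.0), np.arange(256.0), indexing="ij")
g, b = g.ravel(), b.ravel()
codes = np.empty(256 * 65536, dtype=np.uint32)
for r in range(256):                      # one red slice of the RGB cube at a time
    y, cb, cr = to_ycbcr(float(r), g, b)
    codes[r * 65536:(r + 1) * 65536] = (y << 16) | (cb << 8) | cr
distinct = np.unique(codes).size
print(f"{256**3} RGB triples map to {distinct} distinct YCbCr triples "
      f"(about {256**3 / distinct:.1f} RGB values per YCbCr value)")
```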
BT.601 is obsolete for all intents and purposes. I don't know why anyone would want to mess with it.
If you are doing real-world camera testing, the X-Rite ColorChecker is probably the reference that pros use most often.
Color bars are used more for display and record/playback testing.
Scott