I was wondering if anyone could tell me whether the HDMI spec/signal includes error correction? I've been told it does not, and that you can get "bad pixels" in the image from lost bits. I was under the impression that HDMI either worked or didn't, since it's a digital signal (or an analog signal that carries digital information). I've also heard of white dots appearing on the screen when long HDMI runs are used (can't remember the term). The point being, some people elsewhere online are making it sound like you need to spend a fortune on a high-quality cable to prevent "bad pixels" and therefore gain superior picture quality over cheaper cables.
I'm just trying to get this confirmed, to educate myself really.
thanks in advance
Yes, a small amount of error correction data (Reed-Solomon) is transmitted. It's not full duplex though: the sink can't ask for the frame to be sent again on errors. If there are too many errors they won't be corrected and the picture will be corrupted.
thanks for the quick reply jagabo...
When you say corrupted, do you mean you can outright spot there is an issue? Or would you get "bad pixels" where you may not be able to tell, but your picture IS degraded?
It's not like analog where you may get small changes in brightness, color, sharpness, ghosting, etc. The sparkles aren't subtle. They're very obvious.
With longer runs the cable quality becomes more important but the correlation between cost and video quality is not good.
Thanks jagabo, you have confirmed exactly what I believed to be true: you will see the sparkles and know there is an issue, rather than having a slightly degraded (but unnoticeable and very watchable) signal such that a better cable would produce a better picture (not true). The only thing I was wrong about was error correction. I thought some form of FEC was applied to the whole signal, as I'm sure I read it somewhere, but if I did it was obviously incorrect. Confusing my VDSL knowledge with HDMI, lol...
As I say, someone elsewhere online made claims about "bad pixels", which would have led the person asking for advice on cables to believe a better cable might get him a better picture. A regular "John Smith" would probably conclude that anyway, especially if they were asking for advice on HDMI 2.1 cables, which they were.
HDMI does use Reed-Solomon FEC, but it's a one-way connection. The source just sends and sends; the sink can't ask for a block or frame to be retransmitted.
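The one-way idea can be sketched in code. As a stand-in for illustration only, the snippet below uses a simple Hamming(7,4) code (far simpler than anything HDMI actually uses) to show how forward error correction lets a receiver repair a single flipped bit on its own, with no back-channel and no retransmission:

```python
# Toy forward-error-correction demo using Hamming(7,4).
# NOTE: this is an illustrative stand-in, NOT the coding HDMI uses; it only
# shows the principle that a one-way sink can fix a single flipped bit
# itself, without ever asking the source to resend anything.

def hamming74_encode(nibble):
    """Encode a 4-bit value into a 7-bit codeword [p1 p2 d1 p3 d2 d3 d4]."""
    d = [(nibble >> i) & 1 for i in (3, 2, 1, 0)]   # d1..d4, MSB first
    p1 = d[0] ^ d[1] ^ d[3]                         # covers positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                         # covers positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]                         # covers positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(bits):
    """Decode 7 bits, correcting up to one flipped bit; return the nibble."""
    b = bits[:]
    s1 = b[0] ^ b[2] ^ b[4] ^ b[6]
    s2 = b[1] ^ b[2] ^ b[5] ^ b[6]
    s3 = b[3] ^ b[4] ^ b[5] ^ b[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 1-based position of the bad bit
    if syndrome:
        b[syndrome - 1] ^= 1           # repair in place -- no retransmission
    d = [b[2], b[4], b[5], b[6]]
    return (d[0] << 3) | (d[1] << 2) | (d[2] << 1) | d[3]

# The source "just sends and sends"; the sink corrects each single-bit error.
for value in range(16):
    assert hamming74_decode(hamming74_encode(value)) == value
    for bad_bit in range(7):
        word = hamming74_encode(value)
        word[bad_bit] ^= 1             # simulate a transmission error
        assert hamming74_decode(word) == value
```

Flip two bits in one codeword, though, and the correction fails; that is the "too many errors" case where the picture visibly breaks up.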
But that's just on the audio and data/control packets, is that correct? The video has no FEC?
I had always assumed FEC was used for the video too. But you've exhausted my knowledge of the subject. If you find out otherwise please let me know.
HDMI, the same as DVI, uses 8b/10b encoding. If HDCP is in use, a single-pixel error will initially show as colour noise across the whole screen. FEC is not required, because the signal coding and the relatively lossless data channel provide a sufficient BER margin.
Simple DVI/HDMI Verilog code https://www.fpga4fun.com/HDMI.html
Once again, HDCP scrambles the video: if you miss a single bit, the whole image will look like colour noise. On top of this, modern HDMI transmitters/receivers are pretty robust, and immune even to an unbalanced (non-differential) link.
So stop worrying and enjoy video with audio over HDMI.
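For the curious, the 8b/10b (TMDS) video coding can be modelled in software. This is a simplified Python sketch of the two-stage scheme described in the DVI/HDMI documentation (transition minimisation, then DC balancing with a running disparity); the Verilog on the fpga4fun page linked above is the better hardware reference:

```python
# Simplified software model of TMDS 8b -> 10b video encoding:
# stage 1 minimises transitions (XOR or XNOR chain over the byte),
# stage 2 keeps the serial stream DC-balanced by conditionally inverting
# the symbol while tracking a running disparity counter.
# Bit ordering within the returned integer is a modelling choice here.

def tmds_encode(byte, cnt):
    """Encode one byte; return (10-bit symbol, updated disparity cnt)."""
    d = [(byte >> i) & 1 for i in range(8)]
    ones = sum(d)
    # Stage 1: use XNOR when the byte is ones-heavy, else XOR.
    use_xnor = ones > 4 or (ones == 4 and d[0] == 0)
    q = [d[0]]
    for i in range(1, 8):
        bit = q[i - 1] ^ d[i]
        q.append(bit ^ 1 if use_xnor else bit)
    q.append(0 if use_xnor else 1)           # q[8] records which was used
    n1 = sum(q[:8])
    n0 = 8 - n1
    # Stage 2: conditional inversion to bound the running disparity.
    if cnt == 0 or n1 == n0:
        q9 = 1 - q[8]
        data = q[:8] if q[8] else [b ^ 1 for b in q[:8]]
        cnt += (n1 - n0) if q[8] else (n0 - n1)
    elif (cnt > 0 and n1 > n0) or (cnt < 0 and n0 > n1):
        q9 = 1
        data = [b ^ 1 for b in q[:8]]
        cnt += 2 * q[8] + (n0 - n1)
    else:
        q9 = 0
        data = q[:8]
        cnt += -2 * (1 - q[8]) + (n1 - n0)
    bits = data + [q[8], q9]
    return sum(b << i for i, b in enumerate(bits)), cnt

def tmds_decode(symbol):
    """Invert the encoding: recover the original byte from a 10-bit symbol."""
    q = [(symbol >> i) & 1 for i in range(10)]
    if q[9]:                                  # undo stage-2 inversion
        q[:8] = [b ^ 1 for b in q[:8]]
    d = [q[0]]
    for i in range(1, 8):                     # undo the XOR/XNOR chain
        bit = q[i] ^ q[i - 1]
        d.append(bit if q[8] else bit ^ 1)
    return sum(b << i for i, b in enumerate(d))

# Round-trip every byte value through a running encoder.
cnt = 0
for byte in range(256):
    symbol, cnt = tmds_encode(byte, cnt)
    assert tmds_decode(symbol) == byte
```

Note that the decoder needs no disparity state at all: each 10-bit symbol carries enough information (the two flag bits) to be decoded on its own, which is part of why a corrupted bit affects only the pixels it lands on.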
HDMI uses TMDS, and TMDS uses a more robust encoding on its auxiliary data transmissions, called "TERC4", IIRC.
Regardless, since the payload data (not counting EDID & HDCP) is one-way, other than this minor EC (or what jagabo & pandy mentioned), that's it. And since it is realtime data with no capacity for retransmission (a capability which would require massive buffering and consequent lag/delay), it doesn't make sense to use any further EC.
Active Aux/Audio Data. Auxiliary data (InfoFrame) and audio data are encoded as 10-bit TERC4 symbols. The TERC4 encoding scheme consists of 16 unique 10-bit characters converted from 4-bit auxiliary or audio data. Transmitted on the green and red channels only.
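The scheme quoted above is just a fixed 16-entry lookup, which is easy to sketch. The codeword values below are illustrative placeholders chosen for this sketch, NOT the real table from the HDMI specification; the structure (16 unique 10-bit characters, one per 4-bit value, inverted for decoding) is the point:

```python
# Sketch of a TERC4-style 4b -> 10b lookup, per the spec text quoted above:
# 16 unique 10-bit characters, one per 4-bit aux/audio value.
# The codeword values here are illustrative placeholders for this sketch,
# NOT the actual table from the HDMI specification.

TERC4_LIKE_TABLE = [
    0b1010011100, 0b1001100011, 0b1011100100, 0b1011100010,
    0b0101110001, 0b0100011110, 0b0110001110, 0b0100111100,
    0b1011001100, 0b0100111001, 0b0110011100, 0b1011000110,
    0b1010001110, 0b1001110001, 0b0101100011, 0b1011000011,
]
DECODE = {code: value for value, code in enumerate(TERC4_LIKE_TABLE)}

def terc4_like_encode(nibble):
    """Map a 4-bit aux/audio value to its 10-bit character."""
    return TERC4_LIKE_TABLE[nibble & 0xF]

def terc4_like_decode(symbol):
    """Inverse lookup; raises KeyError for a corrupted (invalid) character."""
    return DECODE[symbol]

# All 16 characters must be unique for the inverse lookup to work.
assert len(DECODE) == 16
assert all(terc4_like_decode(terc4_like_encode(v)) == v for v in range(16))
```

Because only 16 of the 1024 possible 10-bit patterns are valid, many single-bit errors land on an invalid character: the receiver can detect that something went wrong, even though (unlike FEC) it can't correct it.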
Last edited by pandy; 4th Sep 2019 at 14:14.
Those broadcast formats use much better error correction coding, but at the expense of increased bandwidth (for realtime applications) and/or buffer delay (for near-realtime applications), which is what I was getting at above. I think you would agree with me on that. And yes, the primary justification for the minimal EC in HDMI was a shorter, stable, noise-free medium.
With HDMI the delay is always at least one video frame: displays are memory-based, so data is collected in the display's internal memory, then processed, then displayed. If other delay buffers are added on top of this, the usual delay over the whole path (from broadcast camera to customer display) can somehow even be a few seconds (the delay from a satellite link in geostationary orbit alone, i.e. 2 × 36,000 km, is around 250 ms).
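The geostationary figure above is easy to verify with a back-of-the-envelope calculation: the signal travels up roughly 36,000 km and back down the same distance at the speed of light:

```python
# Back-of-the-envelope check of the geostationary satellite delay quoted
# above: the signal goes up ~36,000 km and back down ~36,000 km at c.

C_KM_PER_S = 299_792.458    # speed of light in vacuum, km/s
GEO_ALTITUDE_KM = 35_786    # geostationary orbit altitude above the equator

delay_s = 2 * GEO_ALTITUDE_KM / C_KM_PER_S
print(f"one-hop geostationary delay ~ {delay_s * 1000:.0f} ms")  # ~239 ms
```

That's the propagation time alone; ground-segment processing pushes the quoted figure toward 250 ms and beyond.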
All this is nicely covered by:
And to conclude, my point (when I tried to disagree with you) was to emphasise the fact that HDMI uses a relatively comfortable way of transmitting data, and that losing data is not a severe problem (poor data integrity will be obviously visible, but rarely is more than a single person affected, so overall system criticality is very low). I consider this thread just a nice curiosity, similar to gold-plated power wall sockets connected through poor copper wires to noisy distribution lines: a few µm of gold will not magically improve "current quality", but it may significantly drain the wallet and boost the ego.