I don't know if this is a good place to ask this question; if not, maybe someone can point me somewhere.
I have Comcast cable in Northern California, a "new" DVR box, and an SD 36" TV. (This occurred on my old box and 25" TV last week, too.) Comcast support is pretty much worthless these days.
The weirdest thing is occurring. During live or DVR playback, when one scene/shot cuts to another, the image blurs or defocuses for a split second and then goes back into focus. It doesn't happen all the time (I don't think), but it happens a lot. It's happening on ABC, and I need to check whether it happens on other channels, too.
I know a little about TV and signals and equipment and such, and I can't even imagine how this would be occurring, or what would cause it.
It's probably not the TV, because my other TV did it too. I suppose it could be the box, but the old box was already acting tweaky: one of the recording receivers worked, the other would drop sound for a few seconds every 15 seconds or so, and it got so hot you could hardly touch it (which has eventually happened to every box I've gotten from Comcast). When it started doing that weird blur thing on top of that, I exchanged it for a "new" box.
The really weird thing is, how would or could "it" (any electronic circuit or whatever) "know" when the video is switching from one shot to another? (Basically, a video edit cut.) I don't know if I'm explaining this well enough. Say you're watching a show and the camera is on one person talking, then it cuts to the other person talking. That's the cut I'm talking about. When it cuts to the second person, the image defocuses and then refocuses, and it stays good until it cuts back to the first person, when it happens again, and so on through every single cut in the show.
Anyone ever heard of anything like this, and/or know what it is or might be, or know how to fix it?
Thanks.
-
They are overcompressing the video in order to get more channels in their limited digital bandwidth.
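For a rough sense of the tradeoff, here is some back-of-the-envelope arithmetic (a sketch only: 38.8 Mbps is the usual payload of one 6 MHz QAM-256 slot on US digital cable, but the per-channel bitrates are illustrative guesses, not Comcast's actual numbers):

```python
# Back-of-the-envelope: how many SD channels fit in one cable slot.
# 38.8 Mbps is the standard QAM-256 payload; per-channel rates are guesses.
QAM256_SLOT_MBPS = 38.8

for sd_mbps in (3.75, 3.0, 2.0):
    channels = int(QAM256_SLOT_MBPS // sd_mbps)
    print(f"{sd_mbps:.2f} Mbps per SD channel -> {channels} channels per slot")
```

Every extra channel squeezed into the slot comes straight out of the bitrate of the others, and the encoder hides the shortfall by throwing away detail.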
-
I see a similar situation on our local cable system. They downconvert even the local HD stations to SD, and the picture is fuzzy and sometimes has static, even though it was originally HDTV.
They seem to be trying to maximize their number of channels by compressing the video and minimizing the bandwidth for each channel. It's so bad I can only watch it on a small-screen TV. This same company got raided by the FBI last year for buying used hotel satellite systems and rebroadcasting HBO and other premium channels to their cable customers without paying the satellite company.
-
Digital video compression basically works because most video frames differ only slightly from the preceding and following ones. So the encoder stores complete key frames (I-frames) and then just a small difference from each preceding frame.
Obviously it has to start a new sequence at every scene cut, when the next frame is completely different. That frame needs a lot more bits, and thus more time to transmit.
And that is obviously what is screwing up here. Maybe they're compressing an input in real time for broadcast and not keeping up.
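To make that concrete, here's a toy sketch in Python (purely hypothetical, not anything a real encoder or the cable box runs) of why the first frame after a cut costs far more bits than the frames inside a shot:

```python
import numpy as np

def residual_cost(prev, curr):
    """Mean absolute per-pixel difference: a stand-in for the bits needed
    to code this frame as a difference from the previous one."""
    return np.abs(curr.astype(int) - prev.astype(int)).mean()

CUT_THRESHOLD = 40  # assumed tuning value, purely illustrative

def classify(frames):
    yield "I"  # the first frame is always a complete key frame
    for prev, curr in zip(frames, frames[1:]):
        # cheap difference frame inside a shot; full I-frame at a cut
        yield "I" if residual_cost(prev, curr) > CUT_THRESHOLD else "P"

# Simulate a static talking-head shot, a hard cut, then another static shot.
rng = np.random.default_rng(0)
shot_a = rng.integers(0, 50, (240, 320))     # dark scene
shot_b = rng.integers(200, 255, (240, 320))  # bright scene
frames = [shot_a + rng.integers(0, 3, shot_a.shape) for _ in range(4)]
frames += [shot_b + rng.integers(0, 3, shot_b.shape) for _ in range(4)]
print("".join(classify(frames)))  # -> IPPPIPPP: the cut forces a new I-frame
```

Inside a shot the differences are tiny and cheap; at the cut nearly every pixel changes at once, so the encoder suddenly needs a full frame's worth of bits. A real-time encoder on a starved bitrate budget can't afford that, so it ships a soft, low-detail version of the new frame and refines it over the next few frames, which would look exactly like the momentary defocus described above.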