I have a bunch of tricky questions, can anyone help?
Resolution
1) When I record a live digital broadcast signal, is there any way I can get a 100% pixel-perfect copy of what has been broadcast? I'm not talking about picture quality (that's bound to degrade during capture), I'm talking about matching the resolution at which I capture, to exactly the same resolution that the video is broadcast at, so that no "scaling" takes place. That way I will essentially have a near perfect copy of the original video. I don't really understand how capture cards work, but am I right in saying that they should somehow be able to detect the resolution of the incoming signal? In which case, is it possible to exactly match the capture resolution to the broadcast resolution?
2) I intend to capture both 16:9 and 4:3 presentations, from both live broadcasts (from a digital satellite signal) and also from video tape. My question is: does the aspect ratio make any difference to the resolution of the video? I know that all DVD videos are 720x576, regardless of their aspect ratio (16:9 just stretches the image out anamorphically). But what about broadcast television and video tape? Do they use two different resolutions for the two different aspect ratios, or just the one?
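On the anamorphic point, here's how I understand the arithmetic (a quick Python sketch of my own, so treat the numbers as my assumptions, not gospel): both 4:3 and 16:9 PAL material is stored at the same 720x576 frame size, and only the pixel aspect ratio changes, so the frame maps to a different square-pixel width on screen depending on the flag.

```python
# Sketch: 4:3 and 16:9 PAL DVD frames are both stored 720x576;
# only the pixel aspect ratio (and hence the on-screen shape) differs.

def display_size(stored_w, stored_h, display_aspect):
    """Return the square-pixel size a stored frame maps to on screen."""
    disp_w = round(stored_h * display_aspect)
    return disp_w, stored_h

print(display_size(720, 576, 4 / 3))    # (768, 576)  4:3 presentation
print(display_size(720, 576, 16 / 9))   # (1024, 576) 16:9 anamorphic
```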
Interlacing
3) How is interlaced video "stored" in avi and mpeg2 formats? Are the two fields compressed separately or together? I notice my video card captures PAL video at 25fps, so each pair of fields MUST be merged together to form one high-resolution frame. Surely if both fields are compressed together as a single frame, the two fields will "blur together" slightly due to compression, and will hence contaminate each other. So then when the fields are separated again (for viewing on an interlaced TV), each image will be "dirtied" slightly because colour from the other field will have bled into it. Am I right about this? If so, is there any way to capture both fields independently to stop them from being dirtied?
4) The resolution that TV programmes are broadcast at is bound to be different from the resolution that the VCR records onto the tape (I believe TV is broadcast at about 330 lines whereas S-VHS is about 410 lines). So whenever you record to tape, you're changing the resolution of the broadcast image (sometimes scaling up, sometimes scaling down). My question is: what happens to the interlacing during this change of resolution? How can interlacing possibly survive when the number of lines of resolution has changed? The only way I can think of for the VCR to preserve interlacing and still scale the image would be to scale each field independently. But the trouble with this is that there's no way you can scale an image (up or down) if half the picture information is missing! I am led to conclude that when you record to a video tape, the two independent fields are lost, and the whole thing is just merged into a progressive scan. Can anyone shed any light on this?
5) Another little thing that's been bugging me - when I capture at 720x576 and play back the captured video, it's always 720x540! What happened to the extra 36 pixels?
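My own guess on this one (purely an assumption until someone confirms it): the player is rescaling the non-square-pixel 720x576 frame to a square-pixel 4:3 shape, keeping the width and shrinking the height, and 720 at 4:3 works out to exactly 540.

```python
# Guess at the 720x576 -> 720x540 playback resize: keep the width,
# scale the height so the frame is 4:3 in square pixels.

def square_pixel_height(width, display_aspect):
    return round(width / display_aspect)

print(square_pixel_height(720, 4 / 3))  # 540
```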
Thanks to anyone who can help.
It looks like you've already done some good reading, but if you are really interested in learning more there are all kinds of other sites where you can get additional information. Try the following site and read all you can.
http://geocities.com/lukesvideo/
Here is another site, run by Silicon Graphics, one of the top developers of video equipment for high-end professionals. It just happens to offer information for beginners and others.
http://toolbox.sgi.com/TasteOfDT/DTtxt.html
Also, this site has many articles and other information that could help anyone understand video, TV, DVD, etc.
http://www.videoguys.com/
Hope this helps. Bud
"Technology",... it's what keeps us all moving forward. -
1: The broadcast signal is not really bound to a particular resolution in the way you might think from a TV. Capture cards will give you the choice of what resolution to capture at, up to and including 'full screen', the total capture you're looking for. They don't really 'detect' the resolution; it's up to you to choose the resolution you want to capture in.
2: TV/video tape will normally use a resolution of 720x480 (NTSC) when viewed in 'full screen'. The only different aspect ratios (for TV) I have seen have been the difference between PAL and NTSC (and other broadcast formats).
I'm not big on interlacing, but I'll try and answer a bit...
When you are encoding interlaced video to MPEG2, you can sometimes get your encoder to apply a deinterlace filter to the video before the encode takes place. That will improve the video signal in most cases. While I prefer to capture to AVI first and then encode to MPEG2, you can encode straight to MPEG2; I just find that for TV/video tapes the quality is not as good at lower bitrates. If you have a pure digital signal, that's a different story.
Keep in mind there is an 'overscan' area on the TV set which has about 30 pixels on each side that you never see on regular TVs. If you are going to be doing a lot of capturing from TV/Video tape you'll find that interlace will be something that you'll need to work on to get MPEG2 you're happy with.
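To put a rough number on the overscan point (using the ~30 pixels per side quoted above as the assumption; real sets vary), the visible picture after the border is cropped works out like this:

```python
# Rough overscan crop, assuming ~30 pixels hidden on each side of the
# frame (the figure quoted above; actual TV sets vary).

def crop_overscan(width, height, margin=30):
    """Return the visible picture size after removing the overscan border."""
    return width - 2 * margin, height - 2 * margin

print(crop_overscan(720, 576))  # (660, 516) visible out of a 720x576 PAL frame
```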
Regards,
Savant -
Thanks for the info, guys!
Bstansbury>> I'm working my way through all the info on the three sites you gave, thanks very much.
I have now managed to completely answer my question 4. I was wondering what happened to the interlacing when you record a TV programme onto tape, since the resolution is obviously changed. The answer is that although everything uses a different resolution (TV, VHS, DVD etc.), only the horizontal resolution ever changes; they all use exactly the same vertical resolution. I never understood this before, but the term "lines of resolution" does not literally mean the number of (horizontal) lines that make up the picture. It actually refers to the horizontal resolution, or more accurately, the number of vertical lines that can be resolved across a width equal to the screen height. (If anyone needs me to explain this better, just ask.) But horizontal resolution aside, everything uses exactly the same vertical resolution. TV, VHS, S-VHS, satellite, DVD: they all have the same vertical resolution, which is 625 lines for PAL and 525 for NTSC, although only 576 of these are visible for PAL and 480 for NTSC, which is what DVD uses. So the answer to my question is that interlacing always remains perfectly intact, because the vertical resolution never, ever changes!
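To check my own understanding of the "lines of resolution" arithmetic (this is just my sketch of the definition, so correct me if I've got it backwards): TV-lines count resolvable vertical lines over a width equal to the picture height, so converting to lines across the full screen width means multiplying by the display aspect ratio, 4/3 for a standard set.

```python
# "Lines of resolution" (TVL) are measured over a width equal to the
# picture height; multiply by the display aspect to get the count
# across the full screen width.

def tvl_to_full_width(tvl, aspect=4 / 3):
    return round(tvl * aspect)

print(tvl_to_full_width(240))  # 320 full-width lines for a ~240 TVL source
print(tvl_to_full_width(330))  # 440 full-width lines for ~330 TVL broadcast
```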
Savant>> Phew! Your answers were even more confusing than my original questions! Uh. I think I understand....
>The broadcast signal is not really bound to a particular resolusiton in the same way you think from a TV
Are you saying that TV stations don't broadcast at any set resolution, they just broadcast each programme at whatever resolution it comes at?
>The only different aspect ratios (for TV) I have seen have been the difference between PAL and NTSC
But what about the difference between 4:3 broadcasts and 16:9 broadcasts? Do they use a different resolution? All TV shows in the past 3 or 4 years have been broadcast in widescreen, but before that they were always 4:3. I don't fully understand how this widescreen signal is broadcast without it affecting a 4:3 display. Is it anamorphic? Is it letterboxed? Does it use a different resolution? The whole thing confuses me!
>you can sometimes get your encoder to apply a deinterlace filter to the video before the encode takes place
This deinterlace filter... what exactly does it do? Does it separate the avi into two separate fields before making it into an mpeg, so the two fields are compressed individually and therefore cannot contaminate each other? (That would be ideal!) -
Are you saying that TV stations don't broadcast at any set resolution, they just broadcast each programme at whatever resolution it comes at?
But what about the difference between 4:3 broadcasts and 16:9 broadcasts?
This deinterlace filter... what exactly does it do?
I think you are 'overthinking' the whole thing. You need to get a few captures under your belt, and that will help you to understand how it works better than the explanations.
Regards,
Savant -
You are asking the question in the wrong way. Analog television signals do not have a horizontal resolution. It's a continuous waveform that varies over time. It has a bandwidth, not a resolution.
That bandwidth happens to be high enough, on VHS tape, to depict about 320 vertical lines (the white space between two black lines is also a line) along the width of the TV screen, or about 240 lines along a width equal to the height of the screen. (I think I got that right, feel free to correct me, experts) -
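Rough sketch of how that bandwidth-to-lines figure falls out (the ~3 MHz VHS luma bandwidth and ~52 microsecond active line time are my assumed figures, not from this thread): each cycle of the luma waveform can draw one black and one white line, so the resolvable line count is about twice the bandwidth times the active line time.

```python
# Back-of-envelope bandwidth -> resolution: one signal cycle draws a
# black line plus a white line, so lines ~= 2 * bandwidth * active time.
# Assumed figures: ~3 MHz VHS luma bandwidth, ~52 us active line (PAL).

def resolvable_lines(bandwidth_hz, active_line_s=52e-6):
    return round(2 * bandwidth_hz * active_line_s)

def tvl(full_width_lines, aspect=4 / 3):
    """Lines counted over a width equal to the picture height."""
    return round(full_width_lines / aspect)

lines = resolvable_lines(3.0e6)
print(lines, tvl(lines))  # roughly 312 full-width lines, ~234 TVL (~240)
```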
Thanks Lava, that's sorta what I was trying to say, but I wasn't in the right train of thought. However, talking bandwidth is probably better than talking resolution. Mind you, he may be even more confused now.
Regards,
Savant