VideoHelp Forum




  1. I use a Canon GL1 to videotape services at our church on Mini DV. I use Premiere 6.5 to capture and edit the video. On my PC, the video looks a tad dark so I tend to want to lighten it some before exporting as MPEG files using the standard DVD setting in Premiere's built-in MPEG exporter. I then use Sonic MyDVD to create a pretty basic DVD with a small number of separate clips.

    BUT! When I view the DVD on a player, it is so bright and washed out it is trash! My first thought is "OK, don't brighten it." But when I view a commercial DVD on my PC and then on my home DVD player, there is no discernible difference. The ones I burn are night and day different on the PC and the DVD player. What is the problem?

    If this is not the best forum (Newbie/General), will the admin please relocate it to the correct one?

    TIA.

    BTW, this is the BEST site on the net for this stuff. Every time I do a search for anything DVD related I end up back here!
  2. Mod Neophyte redwudz
    I had read threads on this subject before, but I couldn't find them. Really, if they come out OK on the TV, that's what you want. There may be settings in your software to correct the monitor image, but unless it causes an editing problem I wouldn't worry about it.
  3. Agree with redwudz.

    My capture apps' preview screens show up a bit dark on my PC, but the final product on a set-top player/TV is fine...
  4. Member adam
    TVs and PCs use different color ranges, with TVs being more limited. PCs use a range of 0-255, and TVs typically support anywhere from 8-16 on the low end up to 235 on the high end.

    When you create footage for TV playback you are supposed to compress it to the 16-235 range to avoid clipping when it is viewed on the television. Basically, anything above or below that range will not be displayed, which can lead to horrible artifacts.

    The DV format takes this into consideration and records in 16-235 to begin with. So when viewed on your PC you are not getting the darkest of the darks or the brightest of the brights. The result can appear either too bright or too dark, but usually it's the latter. This is perfectly normal, but it does lead to problems if you want to color correct your footage, because what you see on the PC is not what you will get on the TV. Do NOT simply increase the gamma until it looks right. You need to expand the luminance range to 0-255, do your editing, and then compress to 16-235 again during encoding. The more practical way is to use a true NTSC or PAL monitor to do your editing on. But as others have said, if you are happy with your output on your TV now, then you shouldn't worry about it.

    Another thing to consider is that a few DV codecs auto-expand the footage to 0-255 (even though it is filmed at 16-235), so you have to know what's going on behind the scenes and make adjustments accordingly. Off the top of my head, the Canopus DV codec is the only one I can think of that does this. There is a DV FAQ in the DV forum of www.doom9.net which explains which codecs do what to the source. My suggestion is to pick one that does not expand the luminance range to 0-255 and to just ignore this issue entirely unless you have to do color correction. Then you can rest assured that your footage is properly set up for TV viewing. It appears you are already doing this.

    Now as far as DVDs are concerned, they also have the luminance values compressed to ~16-235 (also called CCIR601). But most if not all software DVD players auto-expand footage to the 0-255 range for obvious reasons, so that's why it appears correct. If you play one of your DV sources authored as a DVD in a software DVD player, you should get the same result.
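
    To put rough numbers on the two remappings (a quick Python sketch of the usual CCIR601 scale factors; real codecs and players do this internally):

    Code:
    def compress_to_tv(v):
        # PC range (0-255) -> TV/CCIR601 range (16-235)
        return round(16 + v * 219 / 255)

    def expand_to_pc(y):
        # TV range (16-235) -> PC range (0-255), clamping illegal values
        return min(255, max(0, round((y - 16) * 255 / 219)))

    print(compress_to_tv(0), compress_to_tv(255))   # 16 235
    print(expand_to_pc(16), expand_to_pc(235))      # 0 255
    print(expand_to_pc(8), expand_to_pc(240))       # 0 255 -- superblack/superwhite clip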
  5. Adam,

    If I am understanding your advice correctly, it runs counter to my understanding of black levels. From what I have read (if I understand correctly) and from my observations, you should not compress your video footage to the 16 level, because when it is played on a standalone DVD player, the player will boost the black level, and if you start at 16 you will get muddy blacks.

    Have a look at this site; it certainly seems to explain what I have experienced with my DVDs. I would be interested in your comments. It is a commercial site; I hope it is okay to link to it.

    http://www.signvideo.com/dv-black-level-dvd-7.5-ire-0-ntsc-part-2.htm
  6. Member adam
    That article is very simplistic in its analysis, but I think you are just confusing what it is saying. It's saying that when played from the DV camcorder the luminance ranges will be wrong, specifically because you will have values that extend below 16 and above 235. When encoding to any format suitable for TV playback you ALWAYS want to compress your values to 16-235 if they are not like that already.

    DVD players never boost the levels. A luminance level of 16 simply translates to an IRE of 7.5 on an NTSC display. Similarly, a luminance level of 16 translates to an IRE of 0 on PAL displays or in Japan's NTSC standard. Nothing is being changed; the first is simply the digital value and the second is the equivalent analog value. Values below 16 and above 235 are reserved for headroom to accommodate ringing and overshoot. It's a buffer of sorts, because different hardware may or may not be able to display those ranges. To guarantee that you don't experience hard clipping of the luminance values by either your TV or your DVD player, you must always compress (remap) your luminance range to 16-235. This applies to all regions. Yes, you are losing some dynamic range in your luminance values, but you are not losing any actual data. If you don't do this, the illegal values just get hard clipped during playback, which is much worse. This is what is occurring when you view your DV footage on the TV directly off the camcorder, and this is the main point of the article you linked to. (Though it's not nearly as big a problem as they make out.)

    DV camcorders are not "wrong"; they simply have different priorities. Most people don't watch DV from the camcorder; they transfer it to a digital medium, and the luminance levels are corrected when "captured" using the DV codec. Almost all DV codecs remap the luminance levels to 16-235, thus giving you perfect output for TV viewing.

    Read that page again.

    if you watch - on the same TV/monitor - your DV tape and the DVD you made from it, they will look different just because the DV analog output is 0 IRE and the DVD player is 7.5 IRE.
    Translation: The DV camcorder has recorded "superwhite" and "superblack" levels (which are corrected during capture to the DV codec, or during encoding if you so choose) and thus has not taken into account the headroom required. So when viewed on the TV, the luminance range is being hard clipped, as opposed to the DVD, which does have its luminance values remapped and so displays correctly. Remember, 16 has an IRE equivalent of 7.5 on NTSC displays. This is the darkest color the TV can display. The DV has values less than 16, and thus less than an IRE of 7.5... which means they are illegal. That's why it's darker. The DVD, on the other hand, has its darkest value stored at 16, which translates to the darkest value the TV supports, 7.5 IRE. Perfect.

    The DVD player is most likely showing you the correct black levels and for many DV users, the first time they ever see their video with correct levels is when they encode it to MPEG2, burn it to DVD and play it on a standalone player plugged into an NTSC TV or monitor.
    Translation: The footage has been remapped to 16-235 at some point before authoring, most likely during capture to the DV codec. Thus its luminance values display correctly on the NTSC display. That page is specifically telling you that you SHOULD compress your luminance values to 16-235; the whole point of the document is just that DV footage, when viewed unaltered, may have incorrect luminance values.

    The point is that the DV codecs will remap the values for you (with a few exceptions, e.g. Canopus). This is why you can pretty much ignore this issue, as long as you are sure that your codec stores the footage in the 16-235 range and that you don't further compress your ranges during encoding.
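
    Here is a toy Python sketch of the clip-vs-remap distinction (made-up values; no specific codec's math is implied):

    Code:
    ramp = [4, 8, 12, 16, 20, 24]      # a run of "superblack" shades from a DV source

    clipped  = [max(16, min(235, y)) for y in ramp]       # hard clip to 16-235
    remapped = [round(16 + y * 219 / 255) for y in ramp]  # squeeze 0-255 into 16-235

    print(clipped)    # [16, 16, 16, 16, 20, 24] -- four shades collapse into one black
    print(remapped)   # [19, 23, 26, 30, 33, 37] -- the gradations survive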

    Check out the section titled Codec Problems on this site for a more thorough explanation of black levels.
    http://www.adamwilt.com/DV-FAQ-editing.html
  7. I had a similar problem, until I turned up the brightness in my video overlay.
  8. Originally Posted by adam
    When you create footage for TV playback you are supposed to compress it to the 16-235 range to avoid clipping when it is viewed on the television. Basically, anything above or below that range will not be displayed, which can lead to horrible artifacts.
    OK. So I am using Premiere. Are you referring to the Levels setting under Video | Adjust in the Video Effects rollout? I applied the Levels setting and adjusted the Output Levels to 16 and 235. This actually seemed to make the video darker. I understand now that what I see is NOT what I get, so that probably doesn't matter. However, if I understood you correctly, I should only need this adjustment during editing, and it can be removed before burning since the DVD player will make the adjustment automatically. Is that correct?

    How come I don't see this effect when viewing commercial DVDs on my PC and then on a DVD player?
  9. Member adam
    No. Basically your footage needs to be compressed to 16-235 at some point in your process. I assume you are capturing in Premiere to a DV codec, in which case the levels are probably compressed at this point automatically, but of course it depends on the DV codec you use. This is why I say to just ignore this whole issue as long as you make sure you are using a conventional DV codec. It's taken care of for you. Of course, you need to make sure that the encoder does not further compress the luminance range unnecessarily. You do this by setting the encoder to 0-255. (I know it sounds backwards, but that's how it works.)

    This of course leads to the problem of editing, because you are viewing the footage in a more limited range than what your monitor supports. I don't suggest expanding your luminance range for editing and then compressing it again before or during encoding; you are really still going to be flying blind anyway. If you are serious about color correcting then you should get a true NTSC monitor. Other than that, you should be able to eyeball it well enough.

    You don't see this issue with DVDs played in a software DVD player because the decoder expands the luminance range to 0-255 to correspond with the monitor's display capabilities. The exact same thing will happen to your DV footage encoded to DVD if you handle your levels properly.

    Here's a test. Rip a VOB to your hard drive from a commercial DVD and load it in DVD2AVI. Set the video output to RGB (YUV, which is what DVDs use, doesn't even support the full range of 0-255) and then toggle between TV scale and PC scale. When set to TV scale, the video is displayed as it is stored, in the 16-235 range. You will see that black looks more like grey, and some scenes appear too bright and some too dark. This is what you are seeing when viewing your DV footage on your monitor. It's formatted for TV, not PCs. Now set DVD2AVI to PC scale. It expands the footage to 0-255 just like a software DVD player would. Its luminance range now corresponds to that of the monitor and it looks perfect, but if you were to then play this on your TV it would look terrible.
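
    In rough Python terms, that TV scale / PC scale toggle amounts to this for the luma (chroma ignored for simplicity; just an illustration):

    Code:
    def tv_scale(y):
        # show stored values as-is: black (16) displays as dark grey on a PC
        return y

    def pc_scale(y):
        # expand 16-235 to the monitor's 0-255
        return max(0, min(255, round((y - 16) * 255 / 219)))

    for y in (16, 126, 235):
        print(y, "->", tv_scale(y), "(tv) /", pc_scale(y), "(pc)")
    # 16 -> 16 (tv) / 0 (pc), 126 -> 126 (tv) / 128 (pc), 235 -> 235 (tv) / 255 (pc)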
    Adam, many thanks for the detailed reply; however, I am going to respectfully disagree with you on at least one item. I used analog camcorders for many years and only recently acquired a Digital8 and a DVD writer, and I have been trying to work through the differences. One thing troubled me tremendously, and when I read the info on the website I linked above, a lightbulb came on in my brain; it seemed to be just what I was experiencing.


    Where did the levels go?
    Then you went through the sometimes long and tedious process of authoring the video to DVD, encoding, creating menus chapters and the DVD navigation structure. After the DVD authoring was completed you burned it to DVD, popped it into your DVD player (connected to the same monitor on which the DV tape had looked so good) and...... "What happened to my video, what happened to the black levels, how come it looks washed out?"
    I did just that. I looked at some recorded Hi8 tape on my NTSC monitor from my Digital8 and it looked good: a little darker than I remembered from playing the video from my analog camcorder, but still quite nice. I adjusted my monitor a bit to lighten it just a little. I transferred to DV on my computer, authored, burned, popped it into my DVD player, and what happened? Washed out and muddy blacks.

    In trying to sort all this out, I have used color bars (SMPTE, NTSC, DV, whatever), as long as I am consistent within the tests. I adjust my NTSC monitor (I don't have a TV) to an NTSC colorbar pattern, either from my analog camcorder or by outputting the NTSC color bars from the timeline in my NLE. Then I load the same colorbar pattern, burned to a DVD, into my DVD player. Bam! Black levels are washed out! Because my DVD player has built-in setup: it moves the "pedestal" to 7.5 IRE, but if your video pedestal is already at 7.5 IRE, the DVD player will still move it up, washing out the blacks.

    So it appears to me that one should not try to compress the luma to 16. Depending on the source, the DV codec and the MPEG codec, it might be possible to just leave the levels alone, if you have well exposed and illuminated video; often I don't. However, if your black level is at 16 or above, it will get washed out when played in a DVD player.
  11. Member adam
    Once again, DVD players do not "setup" your video, arbitrarily compressing the luminance range regardless of how it is stored in the source. Setup simply refers to the fact that in Region 1, NTSC's black level bottoms out at 7.5 IRE. Since anything below this is illegal, the DVD player simply does not display any values below it; if you do have values below this, they are hard clipped. An NTSC TV does not display anything below 7.5 IRE. This is an analog measurement. It corresponds to 16 on the digital luminance scale. This really seems very self-explanatory to me: if 7.5 IRE = 16, and 7.5 IRE is the lowest a TV can go, then it stands to reason that you will not gain anything by using a value less than 16 in your source.

    SOME NTSC TVs can display superwhite and superblack levels OK, and SOME DVD players have the option of outputting the full 0-255 luminance range, an option called enhanced black. If this is the case, then yes, you will get a more dynamic range of luminance levels.

    It is a simple fact that values below 16 (8 in some regions) and above 235 are reserved for illegal luminance values which most hardware will not support. Video for TV playback must be prepared by compressing the luminance values to avoid clipping. Please read the site I linked to; it explains all this. There is an entire broadcast standard which specifically requires this remapping of luminance values to 16-235. It is an industry and a hardware standard. Do a search on CCIR601.

    I think it is worth noting again that DVDs are stored in the YUV colorspace. This colorspace does not even support values outside of 16-235. Whether you realize it or not, by encoding to DVD you are always compressing your luminance values to CCIR601 (16-235). But what you are probably doing unintentionally is either over-compressing your ranges or under-compressing them, resulting in clipping. Either of these could result in the muddy colors you are experiencing.

    If you read the site I linked to, it explains how some DV codecs hard clip superblack and superwhite values. This is bad. Others do not compress the values to 16-235 at all, and you could easily clip them during editing or encoding. The point is that you can do all the testing you want, but if your codec is doing the opposite of mine, then your results don't mean anything to me. If you realize that 16-235 is the set range that a TV supports and 0-255 is the range that is only intended to be used on PC monitors, then you can begin to understand where you are going wrong in your process.

    If you tell me what your methods are including the DV codec you use maybe we can figure this out.

    You are saying you get better results by just leaving your sources alone and not messing with the luminance levels. What I am saying is that you ARE compressing your luminance levels simply by capturing to a DV codec. That's the whole point. DV codecs are supposed to take care of this for you; you just have to make sure you don't screw it up further down the line, in your encoder for example.
  12. Adam, once again thanks for the reply. I think I am probably not explaining myself clearly. I have previously spent considerable time perusing the site you suggested (and others). I understand the NTSC 16-235 range. I understand the 16 RGB/7.5 IRE limitation of NTSC video. I understand my NTSC monitor is not displaying values lower (or higher) than the legal range.

    What I am trying to express is that my DVD player (I have 2) increases the brightness level when a DVD is inserted. This is what I understood the aforementioned website was also saying. If I have a black level below legal values, the DVD player will bring it up, lighten it, brighten it, whatever, to 7.5 IRE/16 RGB. If the brightness range of my video is already a legal 7.5, the DVD player will raise the brightness above 7.5 to yield washed out blacks. There is simply no question that my DVD player does this (my eyes are not that poor), and in this regard it doesn't really matter what DV codec or MPEG encoder was used; no matter what the DV codec and/or MPEG encoder has done, the bottom line is that the DVD player is going to raise the brightness level.

    I understand the DV codec and the MPEG encoder can impact what happens to the luma. I have done some tests, which I have reported in another thread, trying to determine what my Sony DV codec and TMPGEnc (and MainConcept) are doing to the video. Again I have used colorbar patterns, both with NTSC and DV levels in RGB. I have converted these RGB images into uncompressed AVIs, Sony DV compressed AVIs, and HuffYUV compressed AVIs, all run through TMPGEnc and MC. I have extracted stills from each and examined them back in Photoshop, to see what is happening at each stage.

    My observation from all of this is that I need to ensure, depending on my source video and its original luma, that my final MPEG has black levels below 16/7.5, unless I want muddy blacks. The DV spec is, I believe, 0 IRE, and if you have true DV video then, depending on your codec and the encoder, you probably won't have to alter the black level; the DVD player will adjust the DV black level to NTSC spec. But if, and this is my case and I know many others', you are starting with NTSC analog Hi8 tapes, the Sony DV codec seems to preserve whatever levels it is provided, so I am starting with a 7.5 IRE, analog, NTSC-legal source. The Sony codec doesn't alter this range and neither does TMPGEnc. Thus I have DV AVIs that have only 7.5 IRE, which yield MPEGs with 7.5 IRE, and when played in my DVD standalone, this 7.5 is elevated to yield washed out blacks.
  13. Member adam
    OK, to me this is starting to sound like a simple limitation of your DVD player. There are plenty of DVD players which output too bright or too dark. But one thing I need to make clear is that ALL, and I mean ALL, DVDs are compressed to 16-235. It's required by CCIR601, and the YUV standard which they use does not even support 0-255 ranges. So if you say 16-235 sources come out too bright on your NTSC monitor, then this will also happen with commercial DVDs as well as your homemade ones from DV footage. If this is not the case, then you are simply doing something wrong in your conversions.

    Once again, the DVDs you are making from your DV footage are NOT 0-255, ever. The YUV standard only supports 16-235. The only question is how you get there. You want to compress just enough and not too much, and you want to have your values remapped rather than clipped. Whether you realize it or not, your DV encoded as DVD is 16-235. I think you are just hard clipping your luma values rather than compressing them.
  14. Member adam
    BTW: By default, yes, TMPGEnc does alter the luminance values. It compresses them unless you enable the "output YUV as blah blah" option. If all you are saying is that you have to feed TMPGEnc 0-255 sources in order to get proper output, then I wholeheartedly agree. That 0-255 source is being remapped to 16-235 during encoding. If you feed TMPGEnc 16-235 sources, then it further compresses these values to who knows what, which will certainly give you improper luminance values and muddy colors when displayed on your TV. That is what the "output YUV as blah blah" setting is for. It's meant to be used on 16-235 sources to preserve their luminance range, and the tooltip specifically says this is a good thing to use on DV sources for this reason.
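
    The arithmetic behind that double compression, as a quick sketch (assuming the encoder applies the standard 0-255 to 16-235 squeeze; TMPGEnc's exact internals may differ):

    Code:
    def squeeze(v):
        # what an encoder does to input it assumes is 0-255
        return round(16 + v * 219 / 255)

    print(squeeze(0), squeeze(255))    # 16 235 -- correct for a true 0-255 source
    print(squeeze(16), squeeze(235))   # 30 218 -- a 16-235 source squeezed a second
                                       # time: blacks turn grey, whites go dim ("muddy")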
  15. Adam,
    This is beginning to feel a bit like trying to nail Jello to the wall; just when you think you have it, it starts sliding away. With all the terminology and numbers being bandied about, perhaps I am confusing apples with oranges.

    I know there are differences and errors introduced going to and from YUV and RGB, and maybe my method of looking at the luma values is not valid. I made 720x480 RGB colorbar patterns in Photoshop and converted these to AVIs and MPEGs. Then I took these into VDub (I believe VDub converts back to RGB), copied a frame of the color bars and pasted it into Photoshop. All looks fine: if I started with 16 to 235, I still have 16 to 235 (or VERY close to those values), and if I had started with 0 to 235, those values remain intact.

    Is this a legitimate way to test? I am now questioning this method for two reasons. First, because you stated that DVDs have a luma range of 16 to 235; however, when I examined a commercial DVD in this manner (Finding Nemo), I found the levels to be from 0 to 255.

    Secondly, I discovered that if I took my DV AVI, encoded using the Sony codec, directly into TMPGEnc, the resulting MPEG had the luma completely expanded to 0 to 255 (whether the "output YUV ..." box was checked or not). However, when I frameserved from VDub into TMPGEnc (I assume TMPGEnc was then receiving RGB), the original luma was retained. Likewise, when I took the uncompressed AVI directly into TMPGEnc, the luma values were retained.

    So now I am thinking that perhaps I have to manually adjust (expand) the luma if frameserving RGB into TMPGEnc, perhaps by checking "output YUV...", but when this option is used, white levels get expanded as well.

    And the fact remains that my DVD players increase the brightness level as compared to directly viewing from the DV camcorder. This may be a limitation (or feature) of my player.

    My brain is tired ...
  16. Adam,

    I want to thank you for your help and patience. I think I finally have sorted things out. My issue is with my camcorder, not my DVD player; I was looking at it backwards. My DVD player has "setup" and outputs legal NTSC video, whereas the camera does not have "setup" built in. Correct? (Please don't tell me I am wrong.) There is a very noticeable difference in the brightness between the camera and the DVD player, but the player has it right.
  17. Member Gritz
    Hmmm... a while back I captured a movie from VHS in VirtualDub (to AVI), then encoded with TMPGEnc, and dragged the resulting m2v (mpg & wav) into MyDVD 4.0 to author a DVD movie. The results were terrible (it looked like enlarged newspaper print), so I tried just the AVI files (huge) in MyDVD and the results were very good. I was later told that if you bring MPEG into Sonic MyDVD it re-encodes, so you get less quality. The later versions of MyDVD may not; I don't know. I later took the original m2v and authored in SpruceUp with nice results. It seems to me that you are encoding with Premiere and then running that through MyDVD... just as I was doing? Just a thought.
    "No freeman shall be debarred the use of arms." - THOMAS JEFFERSON .. 1776
  18. andie41,

    Can I try?


    Sooo many things mixed together.

    Viewing your source
    Starting with your source. DV uses the YCbCr color space. This is only used in digital. It has a normal Y (luma) range of 16-235, but values can go out to 2-254. 16 is black and 235 is white. MPEG also uses this color space.

    If you look at this footage on a PC, it will be converted to RGB. Depending on how it is converted, the range may be changed so that black is 0 and white is 255; this would clip anything above or below those ranges. Or the range may not be changed. Whatever decodes the file decides whether to change the range or not.

    The standard for your PC is to use 0 as black and 255 as white. So if the decoder does not change the range, the footage will look washed out. If it appears too dark, it is probably because the footage is dark, or because the gamma of your monitor is higher than your TV's. (This is another story.)

    Recap
    DV and MPEG are YCbCr using a range of 16-235. Viewing requires RGB. Changing the range clips but makes it look correct. Not changing the range does not clip but makes it look washed out on a PC.

    Encoding
    An MPEG encoder may take YCbCr or RGB as input. If it takes RGB, it is important to tell it what range the RGB is in. The default is generally to assume 0-255.

    If the encoder takes YCbCr, it doesn't have to worry, because there is only one standard range. Since MPEG uses YCbCr, the encoder must either convert RGB input or just use YCbCr input directly. Some encoders only take RGB. In this case, the codec converts the YCbCr into RGB and then the encoder converts back to YCbCr.

    Recap
    MPEG encoders make YCbCr output, but may need RGB input. Because the range on the RGB may not be standard, you can tell them what they are getting.

    Playing
    You may have noticed that setup has not yet come up. That is because it only matters when the DVD player outputs analog. Setup (7.5 IRE) is an analog thing. The DVD player must convert the YCbCr MPEG to YIQ (NTSC) or YUV (PAL). This output is a voltage range, not a number like 16 or 0. IRE is a generic scale because, depending on the plug, the voltages are different. For US DVD players and TVs, black should be set to 7.5 IRE in the voltage signal. For Japanese NTSC, black is set to 0. By the way, the signal can range beyond 0 and 100 IRE.

    Recap
    IRE and setup are analog things. If your digital YCbCr is off because you told your encoder the wrong thing, your DVD player is not trying to fix it for you.
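
    As a rough sketch of that digital-to-analog step (the numbers follow the usual convention; no particular player's behavior is implied):

    Code:
    def to_ire(y, setup=True):
        # stored luma (16 = black, 235 = white) -> analog level in IRE
        frac = (y - 16) / 219
        return round(7.5 + frac * 92.5, 1) if setup else round(frac * 100.0, 1)

    print(to_ire(16),  to_ire(16,  setup=False))   # 7.5 0.0 -- same bits, two voltages
    print(to_ire(235), to_ire(235, setup=False))   # 100.0 100.0
    print(to_ire(8),   to_ire(8,   setup=False))   # 4.1 -3.7 -- superblack dips below black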

    Conclusions
    Know your source and know what you are doing to it every step of the way. It is fairly simple to look at a histogram and see what you have. Almost every frame should have some black or near-black. So if you are looking at RGB histograms, something converted your source to RGB. If the lowest levels are around 16, the range was not changed. If the lowest levels are near 0, the range was changed.
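
    For example, you could run that check with numpy and Pillow (frame.png here is a hypothetical still grabbed from your pipeline):

    Code:
    import numpy as np
    from PIL import Image

    # load an RGB still and look at the extremes of its levels
    rgb = np.asarray(Image.open("frame.png").convert("RGB"))
    print("min =", rgb.min(), " max =", rgb.max())

    if rgb.min() >= 14:      # lowest levels around 16: the range was not changed
        print("still studio range (16-235)")
    elif rgb.min() <= 2:     # lowest levels near 0: something expanded the range
        print("expanded to full range (0-255) somewhere along the way")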

    Your only real chances to mess this up are 1) if you are doing color correction, and 2) when you input to your encoder. If you are doing #1 you probably know all of this. Just before #2, check what you have and tell the encoder the right thing.


    Hope this was not too confusing.....


    PS
    If you test against or compare to a commercial DVD, be aware of how you are playing and measuring. MPEG is YCbCr 16-235. Viewing it requires a conversion somewhere to RGB. When this is done, the proper levels are set. The process followed by a PC DVD player may not be the same as a PC editor's.
  19. Originally Posted by andie41
    Adam,

    I want to thank you for your help and patience. I think I finally have sorted things out. My issue is with my camcorder, not my DVD player; I was looking at it backwards. My DVD player has "setup" and outputs legal NTSC video, whereas the camera does not have "setup" built in. Correct? (Please don't tell me I am wrong.) There is a very noticeable difference in the brightness between the camera and the DVD player, but the player has it right.
    If you transfer via firewire, there is no concept of setup in digital. If you play back footage via analog outputs from your camcorder to your TV and the footage is too dark, it could be due to what the analog outs do, or because your footage is dark.

    A better test is to transfer via FireWire and encode to MPEG with the 2 range options. If neither looks good, increase the light when you shoot. Take some sunny outdoor shots to compare.
  20. trevlac,

    Thanks for your comments. I was concerned that I was taking too much of Adam's time and taking over the thread, but I still have a number of outstanding questions. I have had a lovely dinner and a couple of glasses of wine, so I will work on this tomorrow - thanks.
  21. trevlac,

    I have many older 8mm and Hi8 tapes, recorded with different cameras and under many lighting conditions, some good, some bad. I am trying to standardize my NTSC monitor so I can see whether my video needs, and can be improved by, color/contrast/brightness correction. I am not concerned with how it looks on my PC; that is unimportant, although it would be nice to use the PC monitor to judge, but I know that isn't going to work. I want to be able to give a DVD to another person and know that the chroma and luma are close to spec; if their TV is out of adjustment, that is their problem.

    So how to judge? I can't use the computer monitor; I have been tempted to dig out my old ISA-card vectorscope/waveform monitor, but I have enough stuff cluttering my workspace. My problem is adjusting the NTSC monitor. If I place a colorbar pattern on the timeline of my NLE, output the digital signal to my Sony Digital8 camera and then the analog out to my NTSC monitor, and adjust the NTSC monitor re the PLUGE and color, and then burn this very same colorbar pattern to DVD, play it in my standalone DVD player and send this analog signal to my NTSC monitor, I get a 7.5% elevated brightness level. So my question is really which level is correct: is this an anomaly of my two DVD players, or does this indicate that the camera is putting out 0 IRE? Must I turn up my NTSC monitor's brightness level when editing from the timeline and then reset the monitor when viewing from the DVD, assuming the DVD player is outputting legal NTSC? Unfortunately I do not have cable at the moment, or I could use the cable company's color bars to verify.

    My second issue is (and certainly part and parcel of the above) how to adjust, and to what extent, the levels of my MPEG. Again, I use colorbar patterns because differences are easily observed, and if I feed TMPGEnc a DV file, it expands the luma to 0-255. RGB values of 24-24-24 and below are changed to 0-0-0, and the 235-235-235 of the color bars is now 255-255-255. I don't want this, do I? You're telling me, Adam's telling me, that the legal range is 16-235.

    If I run my DV video through VDub and frameserve (RGB, I assume) to TMPGEnc (NOT checking the "output YUV ..." box), TMPGEnc appears to not alter the luma values at all: if I feed it 16-16-16, the resulting MPEG is still 16-16-16; if I give it 0-0-0, it retains the 0-0-0; and 235-235-235 is still 235-235-235. There are filters in TMPGEnc to restrict the luma to 16-235, and apparently this should be done if I am sending it a DV file. Why would one ever want to expand the range beyond legal NTSC values?

    Of course, both of these issues are interrelated, because I need to get my NTSC monitor set properly. Adjusting the monitor through the camera from the timeline and then producing my DVD at 16-235 will result in washed out blacks and colors.

    Sending messages back and forth makes this more difficult than it really is; I have no doubt that 5 minutes one on one would clear everything up, but I do appreciate your comments.
  22. Hi,

    I think I was mixed up on your problems, but you are very clear here.

    Originally Posted by andie41
    If I place a colorbar pattern on the timeline of my NLE, output the digital signal to my Sony Digital8 camera and then the analog out to my NTSC monitor, and adjust the NTSC monitor re the PLUGE and color, and then burn this very same colorbar pattern to DVD, play it in my standalone DVD player and send this analog signal to my NTSC monitor, I get a 7.5% elevated brightness level. So my question is really which level is correct: is this an anomaly of my two DVD players, or does this indicate that the camera is putting out 0 IRE?
    Well, the problem is either the analog from the cam or the analog from the DVD player. You'd have to measure the analog to know. I'd say the cam outputs black as IRE 0, because the DVD player outputting black as IRE 15 does not make sense.

    My second issue is (and certainly part and parcel of the above) how to adjust, and to what extent, the levels of my MPEG. Again, I use colorbar patterns because differences are easily observed, and if I feed TMPGEnc a DV file, it expands the luma to 0-255. RGB values of 24-24-24 and below are changed to 0-0-0, and the 235-235-235 of the color bars is now 255-255-255. I don't want this, do I? You're telling me, Adam's telling me, that the legal range is 16-235.
    DV is YCbCr. The levels for YCbCr are 16-235. TMPGenc requires RGB. It asks the DV codec on your machine for RGB. Your codec expands the range from 16-235 to 0-255. TMPGEnc then contracts the range back to 16-235 and makes YCbCr.

    16 should be made 0 and 235 should be made 255 by your codec, and then TMPGEnc takes it back again. Out-of-range values would be clipped. You can avoid clipping, but for source in the normal range this is not a problem.

    I'm not clear how you feed the color bars into TMPGenc. DV avi?

    If I run my DV video through VDub and frameserve (RGB, I assume) to TMPGEnc (NOT checking the "output YUV ..." box), TMPGEnc appears to not alter the luma values at all: if I feed it 16-16-16, the resulting MPEG is still 16-16-16; if I give it 0-0-0, it retains the 0-0-0; and 235-235-235 is still 235-235-235.
    VDub does not convert YCbCr to RGB or change the range. Something feeds VDub RGB and it just passes it on.
    How do you know the resulting ranges? You probably have to convert from MPEG YCbCr to RGB to check. This step will usually convert 16 black to 0 black.


    I hope this helps.

    In summary....

    Conversion from/to DV or MPEG (YCbCr) to/from PC RGB usually changes the range from 16-235 to 0-255. This is because ITU-601 specifies that there is headroom/footroom in YCbCr, and traditionally PCs don't use headroom/footroom. If something is in question, it is your DV codec. If it acts 'normal', use all the defaults and you are set. Normal does clip out-of-range input values, but this should not be a big deal for digital input.

    If your cam outputs color bars, you should be able to just flow them through your process to see what happens. If they are 75% bars, they should show as 0/191 in RGB. If they show as 16/180, then your DV codec is not changing the range when it converts from YCbCr to RGB.
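
    Those 0/191 and 16/180 figures fall straight out of the scale math; a quick Python check:

    Code:
    y_black, y_white75 = 16, round(16 + 0.75 * 219)   # stored luma: 16 and 180

    def expand(y):
        # a codec that converts to PC-range RGB
        return round((y - 16) * 255 / 219)

    print(y_black, y_white75)                   # 16 180 -- codec left the range alone
    print(expand(y_black), expand(y_white75))   # 0 191  -- codec expanded to 0-255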

    Let me know....
  23. Member FulciLives
    I figured an easy way out of all this. Stop using anything that converts to RGB.

    This means use AviSynth instead of VirtualDub and use CCE as your encoder instead of TMPGEnc.

    Makes your LIFE soooooooooooooo much easier.

    - John "FulciLives" Coleman
    "The eyes are the first thing that you have to destroy ... because they have seen too many bad things" - Lucio Fulci
    EXPLORE THE FILMS OF LUCIO FULCI - THE MAESTRO OF GORE
  24. trevlac

    I think I was mixed up on your problems
    Probably because I was mixed up, as well.

    I am going to go back and re-examine all my tests, but I am quite confident of the following:

    I have become concerned about the validity of converting my DV and/or MPEG video back to RGB to examine; however, if I encode my DV video with TMPGEnc, load this MPEG in VDub, and take a still into Photoshop, levels from 24-24-24 down now show as 0-0-0: there are no gradations, just one black bar at 0-0-0. Is this an accurate way to examine? I don't know. However, if I burn this MPEG to DVD, play it in my standalone player, and output to my NTSC monitor, I can pretty much see the same thing: there are no gradations in the PLUGE pattern; all is one superblack bar. I can crank my monitor's brightness all the way up, and there is only superblack. So it seems that the RGB method, from VDub to Photoshop, is providing reasonably accurate information.

    On the other hand, if I take the same DV video into VDub and frameserve to TMPGEnc (without checking the "output YUV .." box), then load the resulting MPEG into VDub, capture a still, and take it into Photoshop, the PLUGE is still there; values are as per the original. And if I then burn this MPEG to DVD and play it from my DVD player, the PLUGE is there. Now, depending on how I have adjusted the monitor, I may have to turn the brightness level up or down, but the PLUGE pattern is there.

    Perhaps I should wait to post this until I have redone the tests, but I have already done them twice. Perhaps I am over-thinking this, but NTSC video, and particularly small formats such as Hi8, is not very good to start with, so I would like to maintain as much quality as possible.
  25. Originally Posted by FulciLives
    I figured an easy way out of all this. Stop using anything that converts to RGB.

    This means use AviSynth instead of VirtualDub and use CCE as your encoder instead of TMPGEnc.

    Makes your LIFE soooooooooooooo much easier.

    - John "FulciLives" Coleman
    John,

    I know there are a bunch of people who would say stick with YCbCr, but actually RGB is a much better colorspace. If you do a simple capture with not much filtering, the cap --> CCE way is probably a good process. However, if you do filtering, YCbCr 4:2:2 has much less color info than RGB, only about 25%. This can kill your gradation/tonal range. Think of it this way: filtering will cause you to keep your intermediate results in YUY2 (8-bit YCbCr). 8-bit YCbCr has only about 25% of the colors of 8-bit RGB. In addition, it keeps only half the chroma resolution of RGB (YV12 is worse). So basically, as you filter, you are losing info every step of the way. Don't get me wrong, this is not the end of the world. But keeping and fixing the colors is the next step past a good capture.
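
    To put rough per-frame numbers on that (this counts stored samples for a 720x480 frame, which is one way to see the information loss):

    Code:
    w, h = 720, 480
    rgb  = 3 * w * h                         # R, G, B at every pixel
    yuy2 = w * h + 2 * (w // 2) * h          # 4:2:2 -- chroma at half horizontal res
    yv12 = w * h + 2 * (w // 2) * (h // 2)   # 4:2:0 -- chroma halved both ways

    print(rgb, yuy2, yv12)                   # 1036800 691200 518400
    print(yuy2 / rgb, yv12 / rgb)            # ~0.67 and 0.5 of the RGB sample count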

    Andie,

    I assume when you say Hi8 you are not doing analog capture. Also, how do you know you start with 24-24-24 levels? In DV and MPEG there is no RGB; there are luma and color difference values.

    Trev
  26. I figured an easy way out of all this. Stop using anything that converts to RGB.

    This means use AviSynth instead of VirtualDub and use CCE as your encoder instead of TMPGEnc.
    Reasonable suggestion, and I have considered it; however, I don't want to learn 2 new programs, I don't want to buy another program, and, as I use an NLE fairly often for titles and transitions, I am in RGB space anyway. Apparently Premiere 7 can, with some qualifications, operate in YUV, but I don't have Premiere and am not going to purchase it. And TMPGEnc is a good encoder; like any tool, one has to learn how to use it, and I am getting there.

    trevlac
    I assume when you say Hi8 you are not doing analog capture. Also, how do you know you start with 24-24-24 levels? In DV and MPEG there is no RGB; there are luma and color difference values.
    I am using my Sony Digital8 to convert my Hi8 tapes, but before I get to that point, I want to get my process standardized, hence the use of color bars.

    how do you know you start with 24-24-24 levels?
    I have begun with RGB 720x480 colorbar images in Photoshop, with a variety of values, but as an example: black at 16-16-16, dark grey at 24-24-24, superblack at 7-7-7, white at 235-235-235. I take this PNG image file to MS7 and make AVIs, uncompressed and compressed with the Sony DV codec. Then, loading these AVIs into VDub, saving a still and loading it into Photoshop, I see I still have the same RGB values I started with. Then I make MPEGs of these AVIs, both directly in TMPGEnc and frameserved from VDub to TMPGEnc. Then I take each MPEG back to VDub, capture a still, load it into Photoshop and examine the RGB values. Valid test procedure? The results are completely consistent with what I get on my NTSC monitor, although there I can't read the levels.

    So what I get when I take these captures into Photoshop, and what it looks like on my NTSC monitor, is the following:

    Uncompressed AVI encoded in TMPGEnc: all values as per the original; 16-16-16 is still 16-16-16, 24-24-24 is still 24-24-24, etc.

    Uncompressed AVI frameserved through VDub to TMPGEnc: same as above. Perfect.

    Sony DV AVI direct to TMPGEnc: 16-16-16 is now 0-0-0, 7-7-7 is now 0-0-0, 24-24-24 is now 9-9-9, 180-180-180 is now 191-191-191, 235-235-235 is now 255-255-255.

    Sony DV AVI frameserved from VDub to TMPGEnc: all values are right on.

    When "output YUV ..." is selected, blacks go to 0-0-0 and white goes to 255-255-255.

    As I said, I am not certain that converting back to RGB is totally accurate, but the output on my NTSC monitor, when I burned these MPEGs to a DVD, is totally visually consistent with what I am seeing in Photoshop.

    I just about have this nailed down. I think I am going to have to hack a computer together with an ISA motherboard and get my waveform monitor set up.
  27. I guess I am a little confused:

    As Adam stated:
    Once again, your DVDs you are making from your DV footage are NOT 0-255 ever. The YUV standard only supports 16-235
    But one thing I need to make clear is that ALL, and I mean ALL, DVDs are compressed to 16-235. It's required by CCIR601, and the YUV standard which they use does not even support 0-255
    And a quote from Joe Kane http://www.videoessentials.com/ve_d_faqplayer.htm
    The majority of DVD's are mastered using 0 Volts DC for black. In reality all DVD's should be mastered using 0 Volts DC for black
    Perhaps I am just too thick-headed to get it, but this appears to be contradictory. Interesting, but frustrating, at the same time. I realize that 0-255 is an RGB designation, but doesn't 0 volts DC = 0 IRE = an equivalent of 0 RGB?
  28. Member adam
    Originally Posted by andie41
    Perhaps I am just too thick-headed to get it, but this appears to be contradictory. Interesting, but frustrating, at the same time. I realize that 0-255 is an RGB designation, but doesn't 0 volts DC = 0 IRE = an equivalent of 0 RGB?
    Sorry I bowed out of this discussion but I haven't had much time to post lately.

    Yes, 0 volts does = 0 IRE, but neither is equal to 0 RGB. Read what I posted above. A digital luminance measure of 16 is equal to 0 IRE. This is defined in CCIR601.

    MPEG uses YCbCr. In this standard there is an accommodation for footroom and headroom in the signal. Thus the blackest black is mapped to 16 and the whitest white to 235. This is a result of the CCIR601 standard. Just do a search for CCIR601 and YCbCr; there are technical documents galore on these. At this point our blackest black (16) is still equal to an IRE of 0, or 0 volts DC if you will. In the US we use setup, which means that these ranges are further compressed. This is done to provide room for blanking. But setup is entirely an analog function. It's impossible to do digitally. So our blackest black is still stored as 16 digitally, but when displayed on an analog set it is "setup" to an IRE of 7.5, and when played elsewhere in the world it is output at 0 IRE.

    You are confusing setup and luminance levels in general. One is an analog concept and the other is a digital one. There is no inconsistency between the statements of mine you quoted and the one by that other author. Our luminance ranges are always compressed regardless of where we are in the world. In the US our ranges are even further compressed due to setup, but only during the conversion to analog. Regardless of where we play DVDs, their luminance ranges still measure the same on the digital scale, i.e. 16-235. Neither CCIR601 nor YCbCr even allows for DIGITAL luminance ranges of 0-255. That's really all I'm saying.

    The best explanation I have seen still comes from that same website I keep referring to.

    All 601-conforming digital formats record nominal black at a luma level of 16, and nominal white at a luma level of 235 (in a 0-255 range, using 8 bits: there are 10-bit versions, too, like D-5 and DigiBeta, where the range is 64-940, but the DV formats are all 8-bit formats so we'll stick with 16-235 for this discussion).

    When played over FireWire or over SDI or over SDTI, that's what you get: blacks at 16, and whites at 235. When you interchange files digitally, whether as DV-format stream files or QuickTime or AVI with the appropriate DV codec, black is 16 and white is 235. The same numbers hold, by the way, if your computer file holds 601-format uncompressed data, or DVCPRO50 data, or HDCAM, or DVCPROHD, as long as it's stored as "YUV" (really YCrCb) and not transcoded to RGB (wherein a whole range of gain/offset problems can occur, and even a gamma change).

    Now, when you play back to analog, what happens? Digital levels get converted to analog levels, and that's where setup enters the picture (or not).

    In Europe and Asia, analog video's blacks are at zero voltage: 0mV PAL, 0 IRE NTSC. In North America we add a slight offset, the infamous 7.5 IRE of setup, for historical reasons (the DC regulation of early sets was poor, and electron beam retrace suppression didn't exist, thus the designers provided a safety margin between "black" and "blanking" levels so that retrace wouldn't be visible even if the viewer's TV set was slightly misadjusted).

    Going from analog to digital and back again, if you're following the spec, analog black goes to digital black and vice versa. Whether digitizing from NHK in Tokyo (0 IRE setup) or NBC in New York (7.5 IRE setup), the black levels in the digital data should be the same: 16. Likewise, the blackest black in a picture coming out of the camera section of a camcorder should always be laid to tape with a luma level of 16, regardless of what part of the world the camera is designed for. And that same tape should play back in Japan with 0 IRE setup, and in the USA with 7.5 IRE setup.
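
    (A footnote on the 8-bit vs 10-bit ranges in that quote: the 10-bit studio range is just the 8-bit one scaled by four.)

    Code:
    for y8 in (16, 235):
        print(y8, "->", y8 * 4)   # 16 -> 64, 235 -> 940: the 10-bit studio range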
    Adam, you are still here! I was afraid I had run you off; thanks for your persistence. With all the cryptic terminology and numbers flying around, I had lost sight of the fact that 0 IRE is not 0 RGB.

    My efforts have been directed at getting an accurate method to adjust my NTSC monitor. In the end it is mostly academic anyway; I am not doing this commercially, and it is a simple matter to adjust the brightness level on my NTSC monitor or a TV. If some values get clipped, it will probably be hardly noticeable anyway, unless two examples are side by side. I am probably being too picky for NTSC and YCbCr, etc.
  30. @andie41

    Sorry, I did not have a chance to post over the weekend.

    Sony DV AVI direct to TMPGEnc: 16-16-16 is now 0-0-0, 7-7-7 is now 0-0-0, 24-24-24 is now 9-9-9, 180-180-180 is now 191-191-191, 235-235-235 is now 255-255-255.

    Sony DV AVI frameserved from VDub to TMPGEnc: all values are right on.
    Thanks for the details on your test. It looks like a sound method.

    One would expect the above 2 cases to be the same. In the first, the DV codec is converting from YCbCr to RGB. It appears that it delivers 16 as 16. I cannot see how it would deliver a different value to different apps.

    So I'd guess TMPGEnc knows the difference between getting RGB from VDub and from your DV codec. There is a setting "Set equation for color space (Canopus DV Codec)". This seems to be specific to Canopus, but it is worth a try to see if there is a difference.

    Regardless, if you frame serve from VDub, you get the correct results.

    PS:
    "Output YUV data as basic YCbCr, not CCIR601" You dont want to use this. Keep it unchecked.


    As far as the WFM goes: break it out and test your DVD player and cam. I'd bet the cam outputs 0 IRE and the DVD player does 7.5. The only way to know is in the analog world.


