VideoHelp Forum




  1. Member (Join Date: Jan 2004, Location: United States)
    I still consider myself a NEWBIE to video editing and everything that goes along with it. I am shooting these instructional DVDs and am having fits trying to figure out how to make the image I see on the computer monitor and the image I see on the TV screen look at least somewhat close in terms of brightness, etc.



    Here is a frame capture from last night's shoot. I don't know if it helps, but I read that a television's color temperature is 6500K, and I found I was able to adjust my computer monitor to that setting with its controls. Yet once again, the image on the computer monitor is so much darker and drearier compared to the same picture I am viewing on the TV... Isn't there a way to successfully calibrate the TV and monitor to approximately the same setting and then try to find a happy balance between the two?

    DVDs that I rent in the video store don't seem to have this problem THAT much...
  2. Every TV is different. We have TVs in our studio that cost over $1000, and their primary purpose is color correction only. I would recommend getting your screen calibrated and then going from there.
  3. Member (Join Date: Jan 2004, Location: United States)
    "I would reccomend getting your screen calibrated and then going from there. " how do I do that correctly? and how do I know when it's right? IF the tv and computer monitor are both correctly calibrated, will the picture look somewhat close?
  4. Member (Join Date: Aug 2002, Location: Cleveland, OH)
    A qualified video service technician can properly calibrate your monitors.
    However, be advised that CRTs require periodic re-calibration due to aging.
    Look in the yellow pages for a local tech.
    CRT calibration falls into four major areas: color temperature, purity, convergence, and linearity.
    Once calibrated, the CRT should not be moved.
  5. Computer monitors and televisions have very different gamma values. With both calibrated for their stated purpose, you will get much darker colors on the computer.

    http://kb.indiana.edu/data/ador.html?cust=134210.49393.30

    http://www.bberger.net/rwb/gamma.html

    http://www.poynton.com/notes/colour_and_gamma/GammaFAQ.html

    http://broadcastengineering.com/mag/broadcasting_gamma_correction/
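
    To put rough numbers on that gamma difference, here is a small sketch. The 2.2 and 2.5 figures are just typical illustrative gammas, not measurements of any particular monitor or TV; the point is only that whichever display has the higher effective gamma renders the same midtone noticeably darker.

    Code:
    # Rough gamma comparison: how the same 8-bit pixel value is rendered
    # by two displays with different gammas. The gamma figures (2.2 and
    # 2.5) are assumed, typical values, not measured specs.

    def displayed_luminance(code_8bit, gamma):
        """Relative luminance (0.0-1.0) a display with the given gamma
        produces for an 8-bit pixel value."""
        return (code_8bit / 255.0) ** gamma

    for value in (64, 128, 192):
        lum_22 = displayed_luminance(value, 2.2)   # display with gamma 2.2
        lum_25 = displayed_luminance(value, 2.5)   # display with gamma 2.5
        print(f"code {value:3d}: gamma 2.2 -> {lum_22:.3f}, "
              f"gamma 2.5 -> {lum_25:.3f} "
              f"({100 * (1 - lum_25 / lum_22):.0f}% darker at gamma 2.5)")

    At mid-gray (code 128) the difference works out to roughly 20%, which is more than enough to notice.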
  6. Member glockjs (Join Date: Feb 2004, Location: the freakin desert)
    Originally Posted by sdsumike619
    "I would reccomend getting your screen calibrated and then going from there. " how do I do that correctly? and how do I know when it's right? IF the tv and computer monitor are both correctly calibrated, will the picture look somewhat close?
    nope. 2 different types of displays. i recommend the ol' tried and true "trial and error" method

    make a few quick samples of the vid and burn it on a dvd. try it in every tv in the house. and keep the one that looks best on average. because it's gonna look different on every single one of your customers tv's
  7. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    For best results, you should be making quality judgments on the video (TV) monitor. You didn't mention your editing software.

    Products like Sony Vegas and Adobe Premiere allow real-time monitoring on a video monitor. Usually the monitor (TV) is connected through a camcorder, so the hardware DV codec in the camera can be used to generate the video output in real time at full quality.

    Cheaper programs often don't have this capability. ULead Video Studio 8 will allow DV playback from the timeline to the DV camcorder. You can also use free apps like WinDV to play sample DV streams back to the camcorder.

    The monitor (TV) must first be calibrated to a DV reference color bar. Most editing apps provide these. In order to properly set up a TV as a monitor, all automatic color correction features should be switched off.

    http://www.indianapolisfilm.net/article.php?story=20040117004721902
    http://www.videouniversity.com/tvbars2.htm
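
    If your editing app doesn't include bars, you can generate a basic pattern yourself. Below is a rough standard-library Python sketch that writes a plain 75% color bar frame as a PPM file; the 191 values are the usual 75%-of-255 figures, and the simple full-height layout is only an approximation of the split-field SMPTE pattern a real editor would give you.

    Code:
    # Minimal 75% color bar generator (full-height bars only, no pluge row).
    # Writes a binary PPM file that most viewers and converters can open.
    # The 75% RGB values (191) are the usual textbook figures; a real SMPTE
    # pattern from your editing app is still the better reference.

    WIDTH, HEIGHT = 720, 480          # NTSC DV frame size
    BARS = [                          # 75% amplitude bars, left to right
        (191, 191, 191),              # gray
        (191, 191, 0),                # yellow
        (0, 191, 191),                # cyan
        (0, 191, 0),                  # green
        (191, 0, 191),                # magenta
        (191, 0, 0),                  # red
        (0, 0, 191),                  # blue
    ]

    with open("bars75.ppm", "wb") as f:
        f.write(b"P6 %d %d 255\n" % (WIDTH, HEIGHT))
        row = bytearray()
        for x in range(WIDTH):
            r, g, b = BARS[min(x * len(BARS) // WIDTH, len(BARS) - 1)]
            row += bytes((r, g, b))
        f.write(bytes(row) * HEIGHT)

    Convert the PPM to whatever still-image format your editor imports, drop it on the timeline, and adjust the TV against it.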
  8. I think the important point in the original post, that no one has addressed yet, is that the DVDs he is creating show much more variability than commercial DVDs. Everyone is talking about calibration, but I don't see how that would cause his DVDs to look different than commercial ones.
  9. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    Originally Posted by sync
    I think the important point in the original post, that no one has addressed yet, is that the DVDs he is creating show much more variability than commercial DVDs. Everyone is talking about calibration, but I don't see how that would cause his DVDs to look different than commercial ones.
    Another related issue for NTSC DVD authoring is management of 7.5 IRE setup.

    The DV format places black at 0 IRE. The DVD format calls for black to be authored at 0 IRE. An NTSC DVD player will lift black to 7.5 IRE at the output. This is where a commercial DVD will display black.

    The TV monitor needs a different calibration for authoring (at 0 IRE black) and DVD playback (at 7.5 IRE black). This is one reason why it's a good idea to encode a color bar when authoring a DVD. The color bar can be used later for proper monitor adjustment.

    Another problem related to NTSC capture is double setup. If you use a DV camcorder to capture NTSC, it will record the setup directly, placing black at 7.5 IRE*. If this isn't corrected during capture or authoring, the DVD will be encoded with 7.5 IRE black, and the DVD player will lift black another 7.5 IRE for a total of 15 IRE (i.e., 7.5 IRE too high). This will make the image appear overly bright and washed out (no blacks).

    * Some capture cards do this as well. Others map analog 7.5 IRE to 0 IRE digital.
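
    A quick way to sanity-check this on your own footage is to translate the IRE levels into 8-bit code values. The sketch below assumes the usual Rec. 601 mapping (digital 16 = 0 IRE black, 235 = 100 IRE white); the numbers are approximations for illustration, not authoring specs.

    Code:
    # Quick 8-bit arithmetic for NTSC 7.5 IRE setup, assuming the usual
    # Rec. 601 mapping: code 16 = 0 IRE black, code 235 = 100 IRE white.

    BLACK_CODE, WHITE_CODE = 16, 235
    CODES_PER_IRE = (WHITE_CODE - BLACK_CODE) / 100.0   # about 2.19 codes per IRE

    def ire_to_code(ire):
        """Approximate 8-bit code value for a given IRE level."""
        return round(BLACK_CODE + ire * CODES_PER_IRE)

    print("correct authored black (0 IRE):   code", ire_to_code(0))
    print("single setup black (7.5 IRE):     code", ire_to_code(7.5))
    print("double setup black (15 IRE):      code", ire_to_code(15))

    So if a histogram of your captured DV shows black sitting up around code 32 instead of 16, setup has probably been recorded into the file and will get lifted again on playback.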
  10. Member rhegedus (Join Date: Sep 2002, Location: on the jazz)
    Get a graphics card with TV-out and pump out your signal to a portable TV - should be good enough.
    Regards,

    Rob
  11. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    Originally Posted by rhegedus
    Get a graphics card with TV-out and pump out your signal to a portable TV - should be good enough.
    Absolutely not. Graphics card NTSC out is notorious for poor levels and non-linearity. While it is possible to calibrate a monitor to a color bar on this output, it will often produce misleading results on the actual DVD. Best practice is to use the DV out for monitoring.

    A graphics card will only pass what is on the monitor display. The video shown on the monitor will have 0 IRE blacks and, if used for reference, will produce a DVD that is too bright and washed out.

    PS: I see rhegedus is in the UK. Setup management is an NTSC issue. In a PAL environment, blacks will be maintained at 0 IRE during capture, authoring, and display. The graphics card video out will still be non-linear, but at least it will display the correct black.
  12. Member rhegedus (Join Date: Sep 2002, Location: on the jazz)
    Originally Posted by edDV
    PS: I see rhegedus is in the UK. Setup management is an NTSC issue. In a PAL environment, blacks will be maintained at 0 IRE during capture, authoring, and display. The graphics card video out will still be non-linear, but at least it will display the correct black.


    Worked for me for countless projects.
    Regards,

    Rob
  13. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    It's a tolerance issue: 5% or more for a graphics card versus a couple of percent or better for broadcast gear.

    Camcorder DV codecs (hardware) meet broadcast-quality specs. Graphics cards use very cheap D/A converters and NTSC/PAL encoders with wide tolerances.

    Also, the NTSC/PAL signal that appears on the graphics card's S-Video out is often a conversion from the RGB display buffer, several conversions removed from the raw video being monitored.

    The DV output monitors the actual signal.
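
    To see what those percentages mean in picture terms (the exact figures for any given card or codec are just the ballpark numbers above, not measured specs), a short sketch:

    Code:
    # What a level-error tolerance means in IRE and 8-bit code values,
    # using the 16-235 video range. The 2% and 5% figures are the
    # ballpark numbers from this thread, not measured specifications.

    VIDEO_RANGE_CODES = 235 - 16      # 219 codes span 0 to 100 IRE

    for tolerance in (0.02, 0.05):
        ire_error = tolerance * 100
        code_error = tolerance * VIDEO_RANGE_CODES
        print(f"{tolerance:.0%} tolerance: up to {ire_error:.1f} IRE "
              f"(about {code_error:.0f} 8-bit code values) of level error")

    An error of that size on the black level is easily visible, which is why a graphics card output can be a misleading reference for setting levels.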


