VideoHelp Forum
  1. Member (Join Date: Apr 2015, Location: The United States Of America)
    This is what I want to do. I want to hook up a TV tuner with 1080i output to a 1080p monitor input so I can watch TV on my monitor. Will the 1080i output convert to 1080p at the input? As long as I'm on the subject, if 1080p is better than 1080i, why have the two standards? Having just one would sure eliminate questions about the two.
  2. 1080 -> The resolution (usually 1920x1080) is the same, but "i" stands for interlaced and "p" for progressive.
    Some monitors can "only" handle 1080p, because flat screens of any kind can "only" display progressive pictures, so you need a deinterlacer built into the monitor/TV. Every TV has one, but 99% of monitors don't. If you want that, you need a TV monitor like my Samsung T27A950, which also does 3D.

    There are also other resolutions in use, like 1440x1080 or 1280x1080 (anamorphic -> the pixels are not square). My VU+ shows these on some channels, and the recordings transferred to my PC of course have the same resolution. I checked it with MediaInfo (Lite).

    Movies are shot in 24p or in 23.976p, depending on the equipment; for Crank 2, for example, they used "cheap" HDV camcorders.
    Progressive supports more frame rates; interlaced supports only 25fps (PAL) / 29.97fps (NTSC).
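    To see what those anamorphic resolutions work out to on screen, multiply the stored width by the pixel aspect ratio. A minimal Python sketch (the PAR values below are the usual ones for these 16:9 broadcast/HDV formats, so treat them as assumptions rather than gospel):

Code:
from fractions import Fraction

# Stored (anamorphic) resolutions and their typical pixel aspect ratios.
formats = {
    (1440, 1080): Fraction(4, 3),  # HDV and many broadcast channels
    (1280, 1080): Fraction(3, 2),  # some broadcast/camera formats
    (1920, 1080): Fraction(1, 1),  # "full" HD, square pixels
}

for (width, height), par in formats.items():
    display_width = int(width * par)       # width after stretching the pixels
    dar = Fraction(display_width, height)  # display aspect ratio
    print(f"{width}x{height} stored -> {display_width}x{height} displayed (DAR {dar})")

    All three come out as 1920x1080 with a 16:9 display aspect ratio, which is why MediaInfo reports a non-square pixel aspect ratio for the first two.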
  3. Member Cornucopia (Join Date: Oct 2001, Location: Deep in the Heart of Texas)
    Also, part of the reason for the 2 standards (3 actually - don't forget 720p) is that the progressive version uses up twice as much bitrate/resources as the interlaced version. So, for example, when the ATSC and Blu-ray specs were being worked out, the burden of using 1080p was still too high for both broadcast and consumer equipment (and even, to some extent, for PCs). Things have vastly improved since then, but we're not "out of the woods" yet. Adoption of upcoming standards such as ATSC 3.0, Rec. 2020, HDMI 2.0, Ultra HD Blu-ray (4K), etc. will enable further acceptance of progressive-priority or progressive-only material.
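    To put rough numbers on "twice as much": at the same field/frame cadence, 1080i sends two 540-line fields where full-rate 1080p sends two complete 1080-line frames. A back-of-the-envelope sketch (raw pixel rates only - real compressed bitrates depend on the codec and the content):

Code:
def pixels_per_second(width, lines_per_picture, pictures_per_second):
    """Raw (uncompressed) pixel throughput, ignoring blanking and compression."""
    return width * lines_per_picture * pictures_per_second

# 1080i29.97: 59.94 fields per second, each field is 540 lines
i1080 = pixels_per_second(1920, 540, 59.94)
# 1080p59.94: 59.94 full frames per second, each frame is 1080 lines
p1080 = pixels_per_second(1920, 1080, 59.94)

print(f"1080i29.97: {i1080 / 1e6:.0f} Mpixel/s")
print(f"1080p59.94: {p1080 / 1e6:.0f} Mpixel/s")
print(f"ratio: {p1080 / i1080:.1f}x")  # -> 2.0x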

    As for the difference between using a TV and using a monitor: TVs have, in addition to broadcast reception circuitry, much more processing of the signal than monitors do (rescaling, deinterlacing, color & levels dynamics, overscanning...). Also, it seems that in general, TVs follow CE-centric requirements of a few acceptable/common framerates and resolutions, while monitors follow IT-centric expectations of more widely varying framerates and resolutions.

    Scott
  4. Originally Posted by Cornucopia View Post
    ... monitors follow IT-centric expectations of more widely varying framerates and resolutions.
    Most monitors (I tested about 20 different "average" monitors) only support 60 Hz as a refresh rate. That's no good for me ^^

    Many users, maybe most, are happy playing 24p video with VLC media player, for example, while the refresh rate is still set to 60 Hz. That's not real 24p.
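    What "not real 24p" means in practice: on a 60 Hz display, 24 fps frames cannot all get the same number of refreshes, so playback falls into a 3:2 cadence with alternating frame durations. A small sketch of the arithmetic (nothing player-specific):

Code:
REFRESH_HZ = 60.0
FRAME_RATE = 24.0

# 60 / 24 = 2.5 refreshes per frame on average - not an integer - so frames
# are shown for 3 refreshes, then 2, then 3, ... (the 3:2 pulldown cadence).
for refreshes in (3, 2):
    duration_ms = refreshes * 1000.0 / REFRESH_HZ
    print(f"{refreshes} refreshes -> frame on screen for {duration_ms:.1f} ms")

print(f"true 24p frame duration: {1000.0 / FRAME_RATE:.1f} ms")

    The frames alternate between 50 ms and 33.3 ms on screen instead of a steady 41.7 ms, which is the judder being discussed.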
    Every 1080p TV knows how to convert a 1080i source to 1080p for display. Some computer monitors can do this, but others can't. If your monitor doesn't support it, you have to use a scan converter.
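    If the 1080i material is a recording rather than a live signal, another option is to deinterlace it yourself before playback. A minimal sketch driving ffmpeg's yadif filter from Python (assumes ffmpeg is installed; "input.ts" is just a hypothetical 1080i capture):

Code:
import subprocess

# Bob-deinterlace a 1080i29.97 recording to 1080p59.94:
# yadif mode 1 outputs one frame per field, so 59.94 fields/s
# become 59.94 progressive frames/s.
subprocess.run([
    "ffmpeg",
    "-i", "input.ts",        # hypothetical 1080i capture
    "-vf", "yadif=1",        # mode 1 = one output frame per field
    "-c:v", "libx264",
    "-crf", "18",
    "-c:a", "copy",
    "output_1080p.mp4",
], check=True)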
  6. Banned (Join Date: Oct 2014, Location: Northern California)
    Talking about 1080i and 1080p without mentioning the frame rate is not very useful in this discussion; it only confuses things.

    1080i and 1080p at 30 frames per second use exactly the same amount of bandwidth. One is allowed, the other is not. Why? Beats me; it does not make any sense! However, some people use the argument that 30p is too stuttery to defend this decision.
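    The equal-bandwidth claim is easy to check in raw pixel terms (compression efficiency differs between interlaced and progressive encoding, so treat this only as a rough sketch):

Code:
from fractions import Fraction

FIELD_RATE = Fraction(60000, 1001)  # 59.94 fields per second (NTSC timing)
FRAME_RATE = FIELD_RATE / 2         # 29.97 frames per second

interlaced  = 1920 * 540  * FIELD_RATE   # two 540-line fields per frame
progressive = 1920 * 1080 * FRAME_RATE   # one full 1080-line frame

print(interlaced == progressive)  # True: identical raw pixel rate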

    The best way to play a 1080i source at 30 frames per second on a progressive monitor is to render it at 60 frames per second, either by straightforward frame doubling or by deinterlacing in real time. Do not use overscan; it's an archaic solution to a problem that no longer exists. Unfortunately, it is still in TVs and won't go away as long as broadcasters refuse to clean up their sources.
  7. Member (Join Date: Aug 2006, Location: United States)
    Originally Posted by Robert Bradford View Post
    This is what I want to do. I want to hook up a TV tuner with 1080i output to a 1080p monitor input so I can watch TV on my monitor. Will the 1080i output convert to 1080p at the input? As long as I'm on the subject, if 1080p is better than 1080i, why have the two standards? Having just one would sure eliminate questions about the two.
    As others have indicated, this won't work in most cases. For most LCD monitors, the TV tuner would need to output video at 1080p60/59.94 instead of 1080i.

    I found one TV tuner which claims to offer 1080p, the HomeWorx HW180STB, but I can't find a reference saying that it actually converts 1080i29.97 input to 1080p59.94 for output. It may only output 1080p when a 1080p file is played using the built-in media player. You might be able to ask about that at avsforums.com http://www.avsforum.com/forum/42-hdtv-recorders/1880337-homeworx-hw-180stb.html

    newpball only posts to troll, complain, disrupt, and misdirect. If 1080p29.97 had been adopted instead of 1080i29.97, most LCD monitors still wouldn't display 1080p29.97 input correctly, as 29.97 Hz isn't anywhere close to 60 Hz.
    Last edited by usually_quiet; 25th Apr 2015 at 12:55. Reason: clarity
  8. Member Cornucopia (Join Date: Oct 2001, Location: Deep in the Heart of Texas)
    Originally Posted by flashandpan007 View Post
    Originally Posted by Cornucopia View Post
    ... monitors follow IT-centric expectations of more widely varying framerates and resolutions.
    Most monitors (I tested about 20 different "average" monitors) only support 60 Hz as a refresh rate. That's no good for me ^^

    Many users, maybe most, are happy playing 24p video with VLC media player, for example, while the refresh rate is still set to 60 Hz. That's not real 24p.
    Well, then your average is a lot cheaper than what I'm used to. From what I've seen, even the bottom of the barrel LCD monitors have supported 60, 72 and 75Hz.

    24p footage played at a 72Hz refresh rate is exactly like how triple-flashing works in the cinema. Each frame is shown three times before moving on to the next frame. That maintains the refresh rate but gives the equivalent of true 24p frame duration, without any variation, jerkiness or stuttering (assuming the video card + driver can do this - most good ones can).
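    A quick way to see which refresh rates give an even cadence is to check how many refreshes each frame would get; only a whole number avoids judder. A small sketch (plain arithmetic, nothing driver-specific), which also shows why 23.976 content is a different story from true 24p:

Code:
from fractions import Fraction

frame_rates = {
    "24p":     Fraction(24),
    "23.976p": Fraction(24000, 1001),
    "25p":     Fraction(25),
    "29.97p":  Fraction(30000, 1001),
}
refresh_rates = [Fraction(60), Fraction(72), Fraction(75)]

for name, fps in frame_rates.items():
    for hz in refresh_rates:
        per_frame = hz / fps  # refreshes per source frame
        cadence = "even" if per_frame.denominator == 1 else "uneven"
        print(f"{name} on {float(hz):.0f} Hz: "
              f"{float(per_frame):.3f} refreshes/frame ({cadence})")

    True 24p gets exactly 3 refreshes per frame at 72 Hz, but 23.976p does not (3.003), which is the objection raised a few posts further down.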

    Scott
  9. Banned (Join Date: Oct 2014, Location: Northern California)
    Originally Posted by Cornucopia View Post
    Well, then your average is a lot cheaper than what I'm used to. From what I've seen, even the bottom of the barrel LCD monitors have supported 60, 72 and 75Hz.
    Right, there is absolutely no technical reason whatsoever why monitors could not support those refresh rates for HD.

    Of course, expecting that of HD televisions would be considered a blasphemous outrage by some. Thankfully the standards people disallowed that idiotic idea for HD. Besides, even if they had allowed it, their idea of the duration of a standard™ handshake borders on the time it takes a signal to reach the Moon.
  10. Originally Posted by newpball View Post
    Originally Posted by Cornucopia View Post
    Well, then your average is a lot cheaper than what I'm used to. From what I've seen, even the bottom of the barrel LCD monitors have supported 60, 72 and 75Hz.
    Right, there is absolutely no technical reason whatsoever why monitors could not support those refresh rates for HD.

    Of course, expecting that of HD televisions would be considered a blasphemous outrage by some. Thankfully the standards people disallowed that idiotic idea for HD. Besides, even if they had allowed it, their idea of the duration of a standard™ handshake borders on the time it takes a signal to reach the Moon.
    Download MPC-HC, enable Exact VSync, and press CTRL+J while playing a file. The red and green lines should be parallel; otherwise the refresh rate does not match the frame rate and the video stutters.
    The tested monitors were not mine. I looked at my friends' computers to see which refresh rates are supported in the Windows settings.
    72 Hz is not preferable, because 99.9% of movies are 23.976 and not true 24p like in the past. But I know that Transporter 3 is 24p - my one and only true 24p Blu-ray.
    And lots of files are 25 fps or 29.97 fps. Every graphics card has its own profiles for 23/24/25/29/30/50/59/60 Hz. Just look it up.

    What I have in mind is matching the frame rate of the video file itself, like every TV does.
    I do the same as any player connected to a TV, like a PS3, PS4 or whatever: it also switches to 24 Hz or 23.976 Hz, which Samsung also calls 24 Hz.
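    The logic behind that kind of automatic switching is simple to sketch. This is only an illustration of the idea - not how MPC-HC, Kodi or a VU+ actually implements it - and the list of supported modes is a made-up example:

Code:
# Illustrative only: pick the display refresh rate that best matches the
# video frame rate, preferring exact or whole-number-multiple matches.
SUPPORTED_HZ = [23.976, 24.0, 25.0, 29.97, 30.0, 50.0, 59.94, 60.0]  # hypothetical monitor

def best_refresh_rate(video_fps, supported=SUPPORTED_HZ):
    def mismatch(hz):
        # How far hz is from being an exact whole-number multiple of the
        # frame rate, with a tie-break toward the rate closest to the fps.
        multiple = hz / video_fps
        return (abs(multiple - round(multiple)), abs(hz - video_fps))
    return min(supported, key=mismatch)

for fps in (23.976, 24.0, 25.0, 29.97, 50.0):
    print(f"{fps} fps -> switch the display to {best_refresh_rate(fps)} Hz")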

    I use MPC-HC with automatic refresh rate change and D3D mode. Works perfectly. I am the one who begged for the delay timer before playback starts, because monitors need some time to switch to the matching refresh rate.

    My three VU+ receivers can also do that.

    If you want to check refresh rate stability, press CTRL+J while playing a video file in MPC-HC. Be sure to enable VSync and Exact VSync.

    Klaus from Bavaria
  12. Member Cornucopia (Join Date: Oct 2001, Location: Deep in the Heart of Texas)
    Yes, it IS an idiotic idea for HDTV, because then the MILLIONS of HDTVs people already own wouldn't work for some signals.
    And it's much more "blasphemous" (or rather ARROGANT) to think your own favorite way of doing things is THE ONLY WAY than to trust the way that teams of knowledgeable people spent months & years testing & determining what would actually work economically for the great majority of users, given then-current levels of technology.

    Stop conflating computer HD (IT) and HD TV (CE) in your arguments, because you clearly don't understand the difference.

    Scott
  13. Banned (Join Date: Oct 2014, Location: Northern California)
    Originally Posted by Cornucopia View Post
    Yes, it IS an idiotic idea for HDTV, because then the MILLIONS of HDTVs people already own wouldn't work for some signals.
    If they had been built with the right standards in place, there would not have been a problem.

    The standards came first; manufacturers simply complied with them (and sometimes were even compelled to)!

    And I know, some people try really hard to keep TV and PC video standards as separate as possible.

    Wonder why?
    Last edited by newpball; 25th Apr 2015 at 13:29.
  14. ?? Sorry I didn't get #12 and #13. Have I said something wrong?
  15. Member (Join Date: Aug 2006, Location: United States)
    Originally Posted by flashandpan007 View Post
    ?? Sorry I didn't get #12 and #13. Have I said something wrong?
    No. Scott was replying to newpball's post, #9.
    But you need a monitor or a TV which is capable of displaying 23/24/25/29/30/50/59/60 Hz. XBMC - oh sorry, Kodi - can also do it, but it is in the expert settings.

    Please try MPC-HC, otherwise you won't get my arguments. If you don't try it, this discussion is useless.

    The needed settings:
    [Attached image: settings.png - the MPC-HC settings referred to above]
  17. Member (Join Date: Aug 2006, Location: United States)
    All the PC discussions are not much use to the OP, Robert Bradford. As I read his first post, it appears he wants to connect a set-top box of some kind directly to the monitor.

    If he wanted to connect the monitor to a PC with a PC TV tuner, he wouldn't have a problem doing that.
    Here is what it looks like playing a 23.976p video file with an Nvidia graphics card - it only supports 23.972 Hz, so every 4 to 5 minutes there is a small jump; but if you use the 24p setting for 23.976 files, there is a jump every 40 seconds. That's the reason why there is a 23 Hz setting in the graphics drivers from every company: Nvidia, AMD (ATI), Intel.
    Intel HD 4600 can do it exactly! I have tried it. But older Intel HD graphics had the 24p bug - only 24p is available. Google it.
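    Those jump intervals follow directly from the rate mismatch: one whole frame of drift accumulates every 1/|refresh rate - frame rate| seconds. A quick check of the numbers in this post:

Code:
def seconds_until_one_frame_of_drift(refresh_hz, frame_rate):
    """Time until the accumulated timing error equals one whole frame."""
    return 1.0 / abs(refresh_hz - frame_rate)

content_fps = 24000 / 1001  # 23.976... fps

for refresh in (23.972, 24.0):
    t = seconds_until_one_frame_of_drift(refresh, content_fps)
    print(f"{refresh} Hz display: a frame is repeated or dropped "
          f"every {t:.0f} s (~{t / 60:.1f} min)")

    That gives roughly 250 seconds (about 4 minutes) for 23.972 Hz and about 42 seconds for 24 Hz, matching the jumps described above.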

    https://drive.google.com/file/d/0B4y5u0gBKYqbZ0VQSktHU1NWV28/view?usp=sharing


    Here is a 50fps file played at 50Hz:

    https://drive.google.com/file/d/0B4y5u0gBKYqba0J5czloUHJWVlk/view?usp=sharing

    Note that the loud keyboard is a mechanical Cherry G80. I switched the PGS subtitles of the AVCHD .mts file off and on; when switching, you can see a jump in the display stats (the red and green lines).

    You see, 72 Hz is not what you should use: 50 fps, 25 fps and almost all other frame rates cannot be displayed correctly at 72 Hz. This is not voodoo, these are refresh rates!

    If you want to do it right, you have to switch to the matching refresh rate every time -> MPC-HC/MPC-BE/Kodi can do it automatically for you.

    Most people using an average player like VLC media player think they are watching it correctly, but that's not the case: the refresh rate of the monitor must match the frame rate of the video.
    For VU+ receivers there is an addon called autoresolution.

    Klaus from Bavaria
  19. Originally Posted by usually_quiet View Post
    All the PC discussions are not much use to the OP, Robert Bradford. As I read his first post, it appears he wants to connect a set-top box of some kind directly to the monitor.

    If he wanted to connect the monitor to a PC with a PC TV tuner, he wouldn't have a problem doing that.
    OK, I totally agree with you

    @ Robert Bradford: What equipment do you want to connect together? Maybe the "tuner" is a receiver like my VU+; some of these, like mine, also support 1080p.
    Please tell us what devices you have so we can help you better.

    Klaus from Bavaria


