Hello!
I will describe my situation briefly.
1) I have a few .TS files that are 1080i.
2) My TV is a 32" HD-Ready Philips (a recent model).
3) My desktop monitor is a 19" LCD (1280x1024).
Now, I converted one of these .TS files to .MP4 AVC/AAC. I did it twice: one file is 1280x720 (5 Mbps) and the other is 1920x1080 (12 Mbps).
The first question is which file to keep. I watch HD content via my PC (DVI-to-HDMI). When I play these two files on my PC I can't see any difference; will it be the same on my TV? And will it be the same on a Full HD TV?
The second question is about the terms interlaced and progressive. As I said, the source was 1080i, yet the two output MP4 files are progressive. Are they truly progressive files? In other words, can we get true progressive output when the source was interlaced (1080i -> 1080p and 1080i -> 720p)?
Thanks
Originally Posted by kouras
First, your TV is probably native 1366x768 or lower, so you won't see much difference from one file to the other, but you will see a difference from the original during motion. It also matters whether the original is live (e.g. sports) or film based.
1080i has 59.94 motion increments per second (59.94 fields per second), as does original 720p (59.94 frames per second). Your conversion lowered both to 29.97 progressive frames per second (somewhat jerky). For 1080i, the method of deinterlace determines the level of damage.
A premium 1920x1080p HDTV would do a fine job with the original 1080i source, but can only pass through your degraded, deinterlaced 12 Mb/s 1080p conversion. The 720p/29.97 5 Mb/s conversion would look even worse.
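The rate arithmetic behind this can be checked with a quick sketch (Python used purely for illustration; the exact NTSC-family field/frame rates are fractions over 1001):

```python
# Motion samples per second for the formats discussed above (exact
# NTSC-family rates).
FIELD_RATE = 60000 / 1001  # 59.94 fields/sec (1080i) or frames/sec (720p)

formats = {
    "1080i (59.94 fields/s)": FIELD_RATE,      # one motion sample per field
    "720p (59.94 frames/s)":  FIELD_RATE,      # one motion sample per frame
    "29.97p conversion":      FIELD_RATE / 2,  # half the temporal samples
}

for name, rate in formats.items():
    print(f"{name}: {rate:.2f} motion samples/sec")
```

Halving the temporal sample rate is exactly the "somewhat jerky" loss described above.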
Originally Posted by kouras
Likewise, 1920x1080i film source can be converted to 1280x720p/23.976 under ideal conditions. Live 1080i source will need a special deinterlace technique for 720p/59.94 conversion.
Recommends: Kiva.org - Loans that change lives.
http://www.kiva.org/about
Originally Posted by edDV
I know that the source is the best quality in any case, but these .TS files aren't really meant for playback as-is: in every player I opened them in, the picture had these annoying lines during motion. The MP4 doesn't have them, and I can't spot any difference in quality vs the .TS, which is 16 Mbps.
Originally Posted by edDV
Video
ID : 2048 (0x800)
Menu ID : 1 (0x1)
Format : MPEG Video
Format version : Version 2
Format profile : Main@High
Format settings, Matrix : Default
Duration : 3mn 20s
Bit rate mode : Variable
Bit rate : 16.2 Mbps
Nominal bit rate : 38.8 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16/9
Frame rate : 29.970 fps <--
Standard : Component
Colorimetry : 4:2:0
Scan type : Interlaced
Scan order : Top Field First
Bits/(Pixel*Frame) : 0.261
So it's not 59? Would it say 59 if it were?
I just connected to the TV and did some testing; apparently those lines in the TS file were in VLC only. When I opened the TS file in Zoom Player the lines weren't noticeable. However, I did not notice any difference in quality between the 3 files: 720p (5 Mbps), 1080p (12 Mbps) & TS (16.7 Mbps).
I guess I should stick to the MP4 AVC/AAC @ 1280x720, since it is roughly 3.3x smaller than the TS.
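For a rough sense of the size difference: file size scales linearly with bit rate. A quick sketch (Python purely for illustration; the one-hour duration is an assumption, and audio/container overhead is ignored):

```python
# Approximate file size from video bit rate: bytes = bitrate * seconds / 8.

def size_gb(bitrate_mbps: float, seconds: float) -> float:
    """Approximate size in GB, ignoring audio and container overhead."""
    return bitrate_mbps * 1e6 * seconds / 8 / 1e9

HOUR = 3600
for label, mbps in [("720p MP4", 5.0), ("1080p MP4", 12.0), ("source TS", 16.7)]:
    print(f"{label} @ {mbps} Mbps = about {size_gb(mbps, HOUR):.2f} GB/hour")

print(f"TS vs 720p size ratio: {16.7 / 5.0:.1f}x")
```

So the 5 Mbps encode is closer to 3.3x smaller than the 16.7 Mbps TS, not 4x.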
Shame I can't test them on a Full HD TV to see if there would be any difference; that would be interesting. I think it also depends on the TV's size? Probably we'd need 50" or more to notice something?
Thanks for replying to my questions.
Originally Posted by kouras
The lines you were seeing represent two interlace fields, displaced 1/59.94th of a second in time, displayed together as one 29.97 fps progressive frame. If your HDTV is any good, it will hardware-deinterlace 1080i to 59.94 fps progressive for better motion realism. 720p is broadcast at 59.94 progressive frames per second.
If you software deinterlace the 1080i file, the HDTV won't be able to use its superior hardware deinterlacer. And if you reduce the resolution to 1280x720p, you will notice the resolution hit when viewing on a 1080p TV.
Originally Posted by kouras
I'm just pointing out the compromise implicit in software-deinterlacing the file. Better to let the software player or the TV deinterlace for progressive display.
edDV, thanks for your helpful answers.
So, in other words, you would not recommend converting the .TS files, in order to preserve the quality?
What about HD-Ready sets specifically, which are 1366x768? On some forums I've seen people saying that the best result comes when the video is at the TV's native resolution. Is this true?
"720p is upscaled to 1366x768p, and 1080i is deinterlaced and downscaled to 1366x768p"
^ This is the quote. Which setting would you recommend: leave the .TS as it is at 1080i, or convert to 720p for my setup?
Edit: I guess this is the answer? http://www.gamespot.com/pages/forums/show_msgs.php?topic_id=26583897
P.S.: this page is already 3rd in the Google results. The G is too fast )))
edDV is providing you with information that very few normal folks will be able to digest. He is a perfectionist. That said, it appears you are mainly concerned about file size.
However, I did not notice any difference in quality between the 3 files: 720p (5 Mbps), 1080p (12 Mbps) & TS (16.7 Mbps).
Originally Posted by kouras
So many people spent man-years transferring their DVD collections to low-bit-rate 720x405 DivX/XviD files that looked good on a computer monitor or SD TV set, only to find they look like crap on their new HDTV. Now they need to go back to the original source. We are talking about something similar here. If you reduce your 1920x1080i to 1280x720p @ 29.97 progressive frames per second, it is going to look soft and jerky on your future HDTV.
Originally Posted by kouras
The point is that the ATSC broadcast system and Blu-ray are designed for best performance with future TV sets. Today you see only the level of processing in your current TV.
Another generalization is the hardware processors in mid to upper level HDTV sets are far better than anything you can do in consumer available software. This is particularly true for deinterlace and inverse telecine. Those TV processors rely on interlace to perform well. They depend on 59.94 fields per second in order to create the best quality 59.94 or 119.88 progressive frame per second display.
If you are sending a computer desktop or game display to an LCD TV, you are asking the TV to act as a computer monitor. In that case you can bypass double scaling (computer and TV) if you output at the native panel resolution of the TV. This assumes the TV supports pixel for pixel mapping. Most do on the VGA port and some do on the HDMI port.
If you are importing from a cable/sat tuner, you have two choices:
a. Set the tuner for 1080i and let the TV deinterlace, IVTC and rescale, or
b. Set the tuner for 720p and let the cable/sat box deinterlace, IVTC and scale 1080i to 1280x720p. Then the TV needs to upscale 1280x720p to 1366x768.
If you think the cable box has a higher quality image processor (i.e. you have an old or crap low end TV) then set 720p*. If your TV is reasonable quality (mid to high end) set the box to 1080i and let the TV do the work.
* Exception: if 720p sports are a priority (ABC/FOX/ESPN HD), switch the box to 720p during the big game for better motion performance.
If your goal is to reduce file size, then the idea is to do as little damage as necessary. To maintain more quality as you reduce bit rate, you would first inverse telecine all film source before recoding. If possible you would retain interlace for non-film source, or smart-bob it to 59.94 fps. The big step down in quality comes when everything is deinterlaced and rate-reduced to 30p, but that is necessary to use the highest-compression codecs.
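The frame-rate relationships behind inverse telecine and smart bob can be sketched as follows (Python purely for illustration; exact NTSC-family rates as fractions over 1001):

```python
# Film is telecined 23.976p -> 29.97i for broadcast; inverse telecine (IVTC)
# recovers the original film frames, while a smart bob makes one progressive
# frame per field.
from fractions import Fraction

interlaced = Fraction(30000, 1001)        # 29.97 interlaced frames/sec
film = interlaced * Fraction(4, 5)        # IVTC: 4 film frames per 5 video frames
bobbed = interlaced * 2                   # smart bob: one frame per field

print(f"IVTC output:      {float(film):.3f} fps")     # 23.976
print(f"Smart bob output: {float(bobbed):.3f} fps")   # 59.940
```

Dropping instead to plain 29.97p throws away half of those motion samples, which is the quality hit described above.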
Originally Posted by edDV
http://www.pixmania.co.uk/uk/uk/1232624/art/philips/32pfl7623d-lcd-television.html
"Pixel Plus 3 HD, 3:2 - 2:2 movement compensation" < - is this the option to look?
So if Im watching a 1080i .TS file via VLC on my TV connected via HDMI to graphics card its getting scaled twice? First in VLC, then on my TV, is that what you are saying?
And if I watch a 720p file, it will be upscaled once on my TV? Also, why do ppl use 1280x720, when the native resolution of TV's is 1366x768? These numbers arent divided by 16 & 9, but if we do the math the closest resolution which is divided would be 1344x756, is there any reason to not make videos at 1344x756 and make them at 1280x720 standard?
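The divisibility question has a concrete answer: MPEG-2 and H.264 encoders work on 16x16 macroblocks, so resolutions where both dimensions are multiples of 16 encode most cleanly, and 1280x720 is the standard size that is both exactly 16:9 and mod-16. A quick check (Python used purely for illustration):

```python
# Which candidate resolutions are exactly 16:9, and which align with the
# 16x16 macroblock grid used by MPEG-2/H.264 encoders?
from fractions import Fraction

def check(w: int, h: int):
    """Return (exact 16:9?, both dimensions multiples of 16?)."""
    return Fraction(w, h) == Fraction(16, 9), w % 16 == 0 and h % 16 == 0

for w, h in [(1280, 720), (1344, 756), (1366, 768), (1360, 768)]:
    exact_169, mod16 = check(w, h)
    print(f"{w}x{h}: exact 16:9 = {exact_169}, multiple of 16 = {mod16}")
```

Only 1280x720 passes both tests: 1344x756 is exactly 16:9 but 756 is not a multiple of 16, and neither 1366x768 nor 1360x768 is exactly 16:9.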
Thanks.
It could be scaling three times depending on settings.
1. Software player
2. Display card output to TV
3. TV
I'll try to find the manual for your TV. They usually list the accepted input resolutions for the VGA and YPbPr/HDMI ports.
OK, I found the manual for your HDTV.
http://www.p4c.philips.com/files/3/32pfl7623d_10/32pfl7623d_10_dfu_eng.pdf
To eliminate scaling in the TV, set as follows. It is important to realize that this turns off all picture processing in the TV other than simple brightness and contrast. You are now at the mercy of your software players and display card hardware for picture processing.
If you were to switch TV inputs from the PC to a direct tuner (1080i/720p/576i) you would need to change from unscaled to one of the other TV display modes so that the TV can scale input to panel resolution 1366x768.
P-38 connecting a Personal Computer...
P-14 "Widescreen picture format" Set to unscaled.
P-41 "Supported display resolutions" Set display card output (VGA or DVI-D to HDMI) to 1360x768.
Great stuff. I had already connected it via HDMI, but I didn't know about the unscaled function; I will check it out, thanks.
I've got another problem now with my ATI HD4850 card and dual-display mode: it just doesn't like that mode and Vista crashes all the time with BSODs. I already posted on the AMD forums, and it seems it's not only me having this crash problem. I will update the drivers to the latest 9.1 tomorrow and see if that helps.
One simple question: would it be better to encode videos at 1360x768 rather than 1280x720, if we are talking only about my TV?
Originally Posted by kouras
Your computer monitor is probably some resolution other than 1360x768. You will be setting that separately.
1920x1080i/29.97 and 1280x720p/59.94 have similar bit rates. 1280x720p/29.97 has half the bit rate but is potentially jerky. So much for quality.
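The "similar bit rates" claim follows from pixel throughput, i.e. how many pixels per second the encoder must describe. A quick check (Python used purely for illustration, with exact NTSC-family rates):

```python
# Pixel throughput for the three formats under discussion.

def pixels_per_sec(w: int, h: int, fps: float) -> float:
    return w * h * fps

r1080i = pixels_per_sec(1920, 1080, 30000 / 1001)     # interlaced frames/sec
r720p = pixels_per_sec(1280, 720, 60000 / 1001)
r720p_half = pixels_per_sec(1280, 720, 30000 / 1001)

print(f"1080i/29.97: {r1080i / 1e6:.1f} Mpx/s")       # about 62 Mpx/s
print(f"720p/59.94:  {r720p / 1e6:.1f} Mpx/s")        # about 55 Mpx/s, similar
print(f"720p/29.97:  {r720p_half / 1e6:.1f} Mpx/s")   # half of 720p/59.94
```

1080i/29.97 and 720p/59.94 land within about 10% of each other, while 720p/29.97 is exactly half the 720p/59.94 throughput.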
So if the goal is to reduce file size, follow the guides for H.264, DivX, XviD.
I see, thanks for the help.
By the way, the guys at ATI did a great job. I updated the drivers to 9.1 and dual-display mode now works fine. It's nice when people actually fix something, unlike the people at Microsoft.