My PC has a VIA Unichrome S3 graphics card which gives very good interlaced video output from Xvid and Divx files, provided that I use the Xvid codec for playback. If I switch to ffdshow the output is very nasty, as though it has been badly de-interlaced.
Anybody have an idea why? (De-interlacing is not checked in the ffdshow setup!)
Thanks in advance...
-
"Very good interlaced video output" where? On an s-video or composite output port? On the computer's monitor? Is your source file interlaced (Xvid and Divx files usually aren't interlaced)? You mean it's properly deinterlaced during playback? What player(s) are you using?
-
Composite video output to TV. Divx and Xvid files support interlacing, and if the source material is interlaced then it should be kept that way. It is not de-interlaced at all: the PC output is interlaced and the TV displays it as such. If I use ffdshow, though, it is a horrible mess (it looks a bit like video that has been resized before being de-interlaced).
-
I used to use s-video output to a TV. Matrox cards had the best interlaced output. ATI and Nvidia cards would work after a lot of fiddling, but I found they often "forgot" the correct settings and required resetting.
Anyway, your reply narrows it down a bit, though there are still many places where things could be going wrong. I suspect ffdshow or the output module is adjusting the height of the frame (without using interlace-aware resizing) to compensate for non-square pixels, causing the two fields to co-mingle.
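To illustrate what I mean by interlace-aware resizing: the frame has to be split into its two fields, each field resized on its own, and the fields woven back together. Here is a rough numpy sketch of my own (assuming a 2-D grayscale frame; this is just the idea, not how ffdshow's resizer is actually implemented):

```python
import numpy as np

def resize_vertical(img, new_h):
    # Plain linear vertical resize. Each output line blends two
    # neighbouring input lines, so on an interlaced frame it mixes
    # lines that belong to opposite fields, producing comb artifacts.
    old_h = img.shape[0]
    ys = np.linspace(0, old_h - 1, new_h)
    lo = np.floor(ys).astype(int)
    hi = np.minimum(lo + 1, old_h - 1)
    frac = (ys - lo)[:, None]
    return img[lo] * (1 - frac) + img[hi] * frac

def resize_interlaced(img, new_h):
    # Interlace-aware version: separate the fields, resize each one
    # on its own, then weave them back together so they never mix.
    out = np.empty((new_h, img.shape[1]))
    out[0::2] = resize_vertical(img[0::2], (new_h + 1) // 2)
    out[1::2] = resize_vertical(img[1::2], new_h // 2)
    return out
```

If a decoder or renderer scales the frame with the first kind of resize to fix the aspect ratio, an interlaced picture will come out looking exactly like the mess you describe.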
Start by looking at ffdshow's output controls (Start -> All Programs -> ffdshow -> Video Decoder Configuration, then Output in the left pane). There are several settings there that could cause problems: Set Interlace Flag In Output Media Type and Set Pixel Aspect Ratio in particular.
Other filter settings could be causing the problem, especially the Deinterlace and Resize & Aspect settings.
I'm assuming you're using the same player with both Xvid and ffdshow decoding. If not, the player's settings can have an effect on this too: the chosen output module (i.e., Video Overlay vs. VMR7/9, etc.), any deinterlacing settings, scaling, and so on.
By the way, how does the video look on the computer monitor? Do you see comb artifacts with both decoders?
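If you do grab a frame and want more than an eyeball check, here is a crude comb metric (my own sketch, assuming numpy and an 8-bit grayscale frame; the threshold of 100 is arbitrary):

```python
import numpy as np

def comb_fraction(frame, thresh=100):
    # Fraction of pixels that show "combing": a line that differs
    # from both of its vertical neighbours in the same direction is
    # the classic signature of interlaced fields being displayed
    # (or resized) as if the frame were progressive.
    f = frame.astype(np.int32)
    above, cur, below = f[:-2], f[1:-1], f[2:]
    comb = (above - cur) * (below - cur) > thresh
    return float(comb.mean())
```

A badly combed frame will score far higher than a clean progressive one, so comparing the two decoders' output this way would show where the fields are getting mixed.
-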
Ooh, lots to think about there, thanks.
To answer a couple of questions... Yes, I always use the same player regardless of codec.
I don't know if combing appears on the PC monitor as it is set up as two separate outputs: PC desktop on the monitor and full-frame video on the TV out. I don't like to fiddle with the setup as it takes ages to get right again!
I am away from home for a few days now so I'll have a closer look at ffdshow when I get back.
Thanks again for the pointers; no doubt I'll be back with more questions in a couple of days!
-
Originally Posted by floatingshed
-
Yes, I am using the equivalent of "theater mode". It means that when a video stops playing the TV goes black (providing of course that I have no desktop image). Nice and tidy!
I'm going home in a couple of hours so will begin an in-depth investigation.