In Windows XP, MPC-HC uses VMR7 by default. In Windows 7, it uses EVR.
I noticed that with EVR, users can switch between two resize algorithms: bilinear and bicubic. So with EVR, I know that when I switch to fullscreen mode, the frames will be resized with either bilinear or bicubic. But if I use Windows XP and VMR7, what algorithm is used for resizing? With EVR, I know the resized frames look better with bicubic than with bilinear, but is VMR7's resize quality the same as EVR bilinear? Lower than EVR bilinear? In general, how would you compare the quality of resized frames between VMR7 and EVR?
Last edited by codemaster; 16th Feb 2013 at 22:13.
VMR7? Ouch. I don't advise using that has-been renderer; at least use VMR9, which supports DirectX 9.
If you want sharp output (and you're working with a good-quality source, otherwise forget it), use a spline resizer (Spline36); second best would be bicubic, then bilinear.
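If you want to compare the kernels for yourself, here's a rough sketch in Python (my own illustration, not from this thread). It assumes Pillow 9.1 or newer and a test frame saved as frame.png; Pillow has no Spline36, so Lanczos stands in for the sharper spline-class kernel:

Code:
# Rough comparison of resampling kernels. Assumptions: Pillow >= 9.1 is
# installed and a test frame exists as frame.png. Pillow has no Spline36,
# so LANCZOS stands in for the sharper kernel class.
from PIL import Image

frame = Image.open("frame.png")
target = (1920, 1080)  # example fullscreen size

for name, kernel in [("bilinear", Image.Resampling.BILINEAR),
                     ("bicubic", Image.Resampling.BICUBIC),
                     ("lanczos", Image.Resampling.LANCZOS)]:
    frame.resize(target, resample=kernel).save(f"resized_{name}.png")

Open the three outputs side by side: Lanczos will be the sharpest (with some ringing on edges), bilinear the softest, bicubic in between.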
Just to inform the uninformed: you *can* use EVR on Windows XP, and you don't even have to install the .NET bloatware. All you have to do is copy the files evr.dll, evrprop.dll and dxva2.dll to the system32 folder, then run regsvr32 on evr.dll and evrprop.dll.
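If you'd rather script those two steps, here's a minimal sketch (my own, not from this post). The source folder C:\evr_files is a made-up example for wherever you've put the DLLs, and it needs to run from an elevated prompt:

Code:
# Minimal sketch of the copy-and-register steps described above.
# Assumptions: the three DLLs sit in C:\evr_files (made-up path) and
# this runs from an elevated prompt on Windows.
import shutil
import subprocess
from pathlib import Path

src = Path(r"C:\evr_files")
system32 = Path(r"C:\Windows\system32")

for dll in ("evr.dll", "evrprop.dll", "dxva2.dll"):
    shutil.copy2(src / dll, system32 / dll)

for dll in ("evr.dll", "evrprop.dll"):
    subprocess.run(["regsvr32", "/s", str(system32 / dll)], check=True)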
You'd probably be better off using VMR9 rather than VMR7 if you don't have the EVR renderer installed. I have it installed, and I still use the VMR9 renderer because (for XP at least) DXVA decoding doesn't work with EVR and MPC-HC. VMR9 should give you the same resizing choices as EVR.
Personally, I use bilinear resizing. It's not as sharp as bicubic, but it's also less prone to resizing artefacts, and because it's not as sharp it doesn't emphasise compression artefacts as much, which I much prefer when resizing standard definition Xvid/AVIs to full screen on my TV. When it comes to resizing high quality HD video (i.e. 720p) there's not as much visible difference between the two methods, so it doesn't matter too much. I guess a lot of it depends on the size of the monitor you're using and what you can see at normal viewing distance.
My biggest disappointment is that nothing seems to have changed with EVR in terms of the colorimetry choice it makes when displaying video. There are basically two standards for converting the video to RGB on playback: Rec.601 for standard definition and Rec.709 for high definition. Most renderers choose between them according to the resolution of the video, and Microsoft does it differently to everyone else.
If I remember correctly from when I tested it a long time ago, the Windows renderers (well, I've only tested using XP) display video using standard definition colorimetry if the width is less than 1200 or the height is less than 578. Or to put it another way: in order for video to display using high definition colorimetry, the width must be 1200 or greater AND the height must be 578 or greater.
The upshot is that if you resize high definition video to 720p while cropping the black bars, as most people do, you're very likely to be left with a resolution such as 1280x544, which the Windows renderers will display using the wrong colorimetry. The difference isn't huge, but it's enough to be a bad thing if it's not correct.
The MadVR renderer seems to make its colorimetry choice using a more standard formula (i.e. if the width is greater than 1024 OR the height is greater than 600 it'll use high definition colorimetry), and it also obeys any colorimetry info written to the h264 video stream, so it's more likely to get it right. MadVR also gives you more resizing choices, but it uses the CPU for decoding rather than the GPU, and unfortunately it doesn't seem to play well with Reclock when using multiple monitors with different refresh rates..... sigh.
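To make those two heuristics concrete, here's a small sketch of the selection rules exactly as described above (the thresholds are this post's observations, not anything from Microsoft or madVR documentation):

Code:
# Colorimetry selection heuristics as described in this post. The exact
# thresholds are the poster's observations, not documented behaviour.

def windows_renderer_matrix(width, height):
    # Rec.709 only when BOTH dimensions reach the threshold, so a
    # cropped 1280x544 encode falls back to Rec.601.
    return "BT.709" if width >= 1200 and height >= 578 else "BT.601"

def madvr_matrix(width, height):
    # Rec.709 when EITHER dimension is big enough.
    return "BT.709" if width > 1024 or height > 600 else "BT.601"

print(windows_renderer_matrix(1280, 544))  # BT.601 - wrong for an HD source
print(madvr_matrix(1280, 544))             # BT.709 - what you'd want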
For me though, resizing is a secondary concern compared to displaying video with the wrong colorimetry. That drives me nuts.
I tested the above using MPC-HC and ffdshow. I recently switched to LAV Filters and haven't got around to testing whether a different decoder changes any of that. While I'm rambling on about renderers...
My solution for fixing the colors on playback has been to simply activate the MPC-HC "BT.601->BT.709" pixel shader for those encodes which would otherwise display using the wrong colorimetry, but I reinstalled MPC-HC from scratch recently, only to discover that the new version of the pixel shader only works with standard definition video. I managed to work out how to replace it with the older version, which works regardless of resolution, but does anyone know the logic behind having the "BT.601->BT.709" pixel shader only work when displaying standard definition video? I don't get it.
Last edited by hello_hello; 17th Feb 2013 at 08:34.
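For anyone curious what that shader actually has to do, here's a rough sketch of the underlying maths, not MPC-HC's actual shader code. It assumes normalized Y'CbCr with Cb/Cr centred on zero: the idea is to pre-correct Rec.601-encoded samples so that a renderer decoding them with the Rec.709 matrix lands on the intended colors.

Code:
# Sketch of the BT.601 -> BT.709 correction a pixel shader would apply.
# Assumes normalized Y'CbCr with Cb/Cr centred on zero; this illustrates
# the maths, it is not MPC-HC's actual shader.
import numpy as np

def ycbcr_matrix(kr, kb):
    """R'G'B' -> Y'CbCr matrix for the given luma coefficients."""
    kg = 1.0 - kr - kb
    return np.array([
        [kr,                   kg,                   kb],
        [-kr / (2 * (1 - kb)), -kg / (2 * (1 - kb)), 0.5],
        [0.5,                  -kg / (2 * (1 - kr)), -kb / (2 * (1 - kr))],
    ])

K601 = ycbcr_matrix(0.299, 0.114)    # Rec.601 luma coefficients
K709 = ycbcr_matrix(0.2126, 0.0722)  # Rec.709 luma coefficients

# Samples were encoded with the 601 matrix but will be decoded with 709,
# so re-express them in 709 terms before the renderer gets to them.
correction = K709 @ np.linalg.inv(K601)
print(np.round(correction, 4))

Applied per pixel, that 3x3 matrix is all the correction amounts to, which is why it seems odd that the shader would care whether the source is SD or HD.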