http://www.bbc.co.uk/rd/blog/2016/01/h-dot-265-slash-hevc-vs-h-dot-264-slash-avc-50-pe...vings-verified

The results and an extensive analysis of the formal subjective verification tests of the H.265/HEVC video compression standard are published in the IEEE Transactions on Circuits and Systems for Video Technology (TCSVT), January 2016. The BBC R&D video coding research team focused on evaluating UHD content and defining the analysis methodology as part of the standardisation process, and presented this work in the paper.
http://ieeexplore.ieee.org/stamp/stamp.jsp?reload=true&tp=&arnumber=7254155

The High Efficiency Video Coding (HEVC) standard (ITU-T H.265 and ISO/IEC 23008-2) has been developed with the main goal of providing significantly improved video compression compared with its predecessors. In order to evaluate this goal, verification tests were conducted by the Joint Collaborative Team on Video Coding of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29. This paper presents the subjective and objective results of a verification test in which the performance of the new standard is compared with its highly successful predecessor, the Advanced Video Coding (AVC) video compression standard (ITU-T H.264 and ISO/IEC 14496-10). The test used video sequences with resolutions ranging from 480p up to ultra-high definition, encoded at various quality levels using the HEVC Main profile and the AVC High profile. In order to provide a clear evaluation, this paper also discusses various aspects for the analysis of the test results. The tests showed that bit rate savings of 59% on average can be achieved by HEVC for the same perceived video quality, which is higher than a bit rate saving of 44% demonstrated with the PSNR objective quality metric. However, it has been shown that the bit rates required to achieve good quality of compressed content, as well as the bit rate savings relative to AVC, are highly dependent on the characteristics of the tested content.
What do you guys say that have tried HEVC?
This is the important bit:
However, it has been shown that the bit rates required to achieve good quality of compressed content, as well as the bit rate savings relative to AVC, are highly dependent on the characteristics of the tested content
It depends on what you're encoding, and the settings you're encoding them at. The number 59% is actually completely meaningless.
The output codec is useless without device support for playback.
When Handbrake first really integrated H.264 encoding, I waited a bit and kept using MPEG-4 Visual until I had mastered H.264 and x264 settings.
As for content mattering when it comes to bit rate, well, that is a given. Encoding limited-motion anime allows some pretty low bit rates, while encoding a 4K heavily grained film will need a lot more bit rate loving and thus be less efficient, but we've seen that with H.264 too. I've encoded literally hundreds if not thousands of videos of various content, and I learned early on to test, test, test, because a preset that looks good for animation will create a horrible encode for a movie with heavy grain.
Yes, h265 is better. We'd all be using it already if all our playback devices supported it.
Yes, h265 is better. But trying to put a number on how much better it is is a waste of time, and has most likely been done solely as a marketing gimmick to give them something specific to say and to get people excited. Anyone who's been in the game long enough will just scoff at it, even as it's accomplishing its goals.
In the end if I can't figure out how to deinterlace and deblend some of this crap properly, it doesn't matter which codec I encode it with.
In that PDF with the detailed info, they said they compared the HM 12.1 reference software to the JM 18.5 reference software using fixed QP. Why not use something more practical like x264 and the other commercial encoders that are used in the real world? Is it because they need this test to be strictly scientific, with their own software?
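For what it's worth, the gap between the test methodology and real-world practice is easy to see on the command line. A rough sketch (file names are made up, QP/CRF values are only examples) of fixed-QP encoding, as used in the verification tests, versus the rate-adaptive CRF mode most people actually use with x264:

```sh
# Fixed QP: every frame gets the same quantiser, as in the
# HM/JM verification tests (scientifically clean, but not how
# real-world encodes are usually done).
x264 --qp 32 --preset slow -o fixed_qp.264 input.y4m

# CRF: x264 varies the quantiser per frame and scene to hold
# perceived quality roughly constant; this is what most
# practical encodes use instead.
x264 --crf 22 --preset slow -o crf.264 input.y4m
```

Fixed QP removes the rate-control variable, which is presumably why the standards committees use it, but it means the published savings don't directly predict what x264-vs-x265 users will see.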
I don't know, man. For sure HEVC as a technology is better, but I think it's much more usable for 2160p. For lower resolutions, I still have my doubts it will reduce bitrate that much. Hell, I see that MPEG-2 is still used for standard-definition TV.
Look at the bottom. "Improvements in compression technology".
MPEG2 Set Top Box decoders are cheap pieces of crap, that's why they exist in the first place. SD only decoders are for people stuck with SD televisions, which is why SD channels tend to be broadcast in MPEG2. It's all about the price of purchase and nothing else. If they upgraded all the SD broadcasts to h264, all these people who couldn't afford to buy (or be bothered buying) a HD TV would be forced to spend precious money on a more expensive Set Top Box in order to watch TV.
In fact, until very recently ALL Australian TV, SD or HD has been broadcast in MPEG-2, and cost was the only concern towards that decision. A lot of people were annoyed about that, but for the government at the time cost was everything.
They've neutered the broadband rollout too. Again the only concern was cost.
It doesn't pay to be interested in technology in Australia.
And this reminds me of something I wrote almost a decade ago (2007), when H.264 was rather obscure among devices. Even those that supported H.264 (e.g. the iPhone, Apple TV, etc.) had problems with many of its features back then.
"However, hardware will eventually catch up with MRFs, B-pyramids, etc in due time, even though they make several decoders choke today."
"In 5 years time my phone will be playing all my encoded clips of today, with all the gravy, and it doesn't even have to be a phone from Apple either..."
I don't post there anymore, but I'll point out that it did come true. Yes, in 2012, five years later, my non-Apple phone easily played the stuff I'd encoded back then.
I'll say the same thing about HEVC today, for five years from now.
I'm telling Future-Me to link back to this.
Besides that, the most expensive thing nowadays is air: if you look at how much an operating licence and the rights to use part of the spectrum cost, it's frequently billions.
I would blame other factors: namely corruption as one of them, and laziness as a second. Australia is a large island country with no neighbours, so almost the whole VHF/UHF range is available for local operators. I can assume Australia has no motivation to go for new technology, but this also means MPEG-2 should be OK at reasonably high bitrates; I would say 20-30 Mbps for HD and 6-8 Mbps for SD should be capable of delivering HQ content. Also, the error robustness of MPEG-2 may be better than that of more complex codecs.
I just think 4K is going to be bumpy as heck (or should I say buffery as heck) if we don't magically kick up bandwidth for everyone. In the US, rural broadband is pretty bad in a lot of parts: you can drive a couple of miles from a big town or city and not be able to get anything but satellite or cellular, which is always overpriced.
Why have we not moved on to H.264 even though it is blatantly better?
1) Just about all ATSC Digital TVs can only decode MPEG2 over the air - so everyone will need a new TV or an external H.264 decoder box to watch it
2) Broadcast encoders will need to be changed (not a big deal)
3) Average person is not very savvy on such issues so no one is demanding it, all they know is that their TV and Tuner are digital
4) Probably slower to click through channels and fringe area people will have more problems with error sensitive H.264
5) Most parts of the US still have lots of free spectrum
Uh, in regards to number 5, we are running out of spectrum left and right. You should see the spoiled ends of the spectrum Dish is using and paid big for. If we had divided up 700 MHz and 2500-2700 MHz like the rest of the world this time around for LTE, we could squeeze more channels and carriers in. US spectrum is so mangled it's not even funny; it's like the FCC engineers just got completely blasted one night and drew up the 700 MHz band plan, it's horrendous. To make things worse, we screwed over Canada with it: since they border us, they have to use our mangled Band 12, 13, 17, 29 plan instead of the simpler, more spacious Band 28 plan. Even Mexico shied away from the US 700 MHz plan, at the cost of having to coordinate with the US FCC on the border to avoid interference, as Mexico is going Band 28. In the 2.5 GHz range we let Clearwire and Sprint gobble up the entire band for WiMAX, which died way too slowly, and now we have TD-LTE, which is actually a smart idea since the transition from WiMAX to TD-LTE is smoother than to FDD-LTE, but it doesn't align with Europe's band plan. I suppose I am getting off topic here, but make no mistake: the US is running out of spectrum, and the way it is managed is pretty much a joke.
There are currently 48 channels to pick from with ATSC, and I live between 2 big cities and a small city, with about 200 miles between the big cities. I can only pick up ~7 channels, but if I were in a better spot (on a hill) I would probably still only be able to pick up 20-25 channels in total from these 3 population centers. There is still room for more stations as far as I can tell, even though we lost 18 channels due to the digital transition and the FCC giving that spectrum to cell phone companies.
I live in the Midwest, so California and the North East US may be out of spectrum, but in the middle it's ok still.
Which is exactly why the codec as seen by broadcasters is of much less relevance to the people re-encoding DVDs and Blu-rays.
The better codecs all work best at lower bitrates; as the bitrate (and therefore quality) increases, the new coding tools don't work as efficiently, and the actual benefit relative to bitrate becomes less and less. Even though at lossless the better codecs should always win, there's no "this codec offers exactly THIS percent better compression than this OTHER codec".
I've been annoyed that "Fantastic 4" has MPEG-2 video ever since I learned what that meant. Yet despite it having a bitrate of only 16.5 Mbps, while its sequel uses AVC at 29.2 Mbps, I'll be blown if I can actually tell the difference. Maybe I'm just not experienced enough.
I am hopeful that H.265 is going to be a great thing, because it might alleviate some bottlenecks when it comes to streaming video. 4K and 8K video is nice for those who want to get used to that superb quality. However, I watch a lot of stuff on a Blu-ray player hooked to a 20-inch 480p television, and I am happy with the way it looks and sounds.
One of the reasons I am happy about H.265 is that in rural areas, particularly here in Canada, the big phone companies are not upgrading the phone lines for faster internet in certain areas. So an advance such as H.265 may mean that we could get HD-quality streaming video that requires fewer Mbps, and streaming video is becoming a bigger part of TV viewing as time goes on. One of the claims for H.265 is that it can give you the same picture quality as H.264 at half the size.
Of course it will be terrific for those who can afford, and/or desire, the new HD-and-beyond quality picture and sound on huge screens.
Some want DNxHD and then they do the encoding themselves. I delivered a DNxHD video and slideshow once, only to see it stretched, overcompressed and all blocky on some presentation projector... LOL. So much for taking the time to make sure the content looked good; at least that one time I didn't put my name on the content or get paid!

There are so many factors to consider in video compression. When I go to encode a video, what codec I use, what settings, what bit rate, etc. are all based on the content. With Blu-ray there is less of an excuse to have a crummy encode unless you are trying to fit a 6-hour movie on a BD25 or even a BD50 disc. For a 2-hour movie you can pretty much set the bit rate settings for whatever codec to max and get good results. I think it's funny when people get excited because a Blu-ray lists an average bit rate of 28 Mbps over 22 Mbps for something like AVC on the same movie; after xx Mbps you start to get to a point where more isn't always better for basic presentation and delivery. Although with film sources with heavy grain I tend to give a generous bit rate, since grain can be hard for encoders to compress without ruining the look. For example, the BD of the TV show The Wire has each episode at 17.5 Mbps, and a few at 19.5 Mbps, using AVC, and it looks great, while I've seen movies with bit rates of 38 Mbps AVC look bad.
The analysis and encoding settings do seem to provide better results at lower bit rates with H.264, but that doesn't mean they can't really, really help at moderate to high bit rates. 16-22 Mbps H.264 for 1080p24 tends to be pretty sufficient for most sane content. I'm talking average bit rate, though, because for example in Apple Compressor you can set the average bit rate anywhere from 5-30 and the max bit rate anywhere from 6-35 (although max and min must be 5 Mbps apart in Compressor); with two-pass encoding you can give regular scenes just the right dose of bits while complex scenes get the peak bit rates. That's not including audio, but since BD-ROM video has a max AV transfer rate of 48 Mbps and a max data rate of around 54 Mbps, there's plenty of room unless you have 8 different languages in 7.1 DTS-HD MA, lol.
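The disc-budget arithmetic behind bit rates like these is simple enough to sanity-check. A quick sketch (the function name and the 22 Mbps / 2-hour figures are just illustrative, picked from the numbers discussed above):

```python
# Rough disc-budget arithmetic for average bit rates vs disc capacity.
# Uses decimal gigabytes (1 GB = 1e9 bytes), as disc capacities are marketed.

def video_size_gb(avg_mbps: float, minutes: float) -> float:
    """Approximate stream size in GB for a given average bit rate and runtime."""
    bits = avg_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

# A 2-hour movie at a 22 Mbps average video bit rate:
print(f"{video_size_gb(22, 120):.1f} GB")  # prints 19.8 GB

# The same movie at 28 Mbps would be 25.2 GB of video alone,
# already over a BD25's nominal 25 GB before any audio tracks.
```

So on a BD50, a 2-hour movie really can be given near-maximum bit rates with room to spare, which matches the point above that cramming is only an issue for very long content or small discs.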
I just wasted my time encoding an episode of classic Doctor Who using both x264 and x265 on placebo settings at CRF 18.
The x265 encode turned out to be a little over 60% of the size of the x264 version, but when comparing them both to the Ut Video lossless version, x265 looks to have lost quite a lot of the minor details that x264 kept.
It's a meaningless comparison, but I was bored.
Also, x265 seems to apply pretty strong denoising, so I use "--tune grain" on the x265 command line, which seems to stop that denoising. This will raise the file size.
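For anyone wanting to repeat a comparison like this, a rough sketch via ffmpeg (input/output names are made up; assumes a build with libx264 and libx265 enabled):

```sh
# Same CRF number and preset for both encoders. Note that CRF values
# are NOT calibrated across codecs, so this is only a rough test,
# not an apples-to-apples quality match.
ffmpeg -i source.mkv -c:v libx264 -preset placebo -crf 18 -an out_x264.mkv
ffmpeg -i source.mkv -c:v libx265 -preset placebo -crf 18 -tune grain -an out_x265.mkv
```

Dropping audio (-an) keeps the file-size comparison about the video stream only; the grain tune, as noted above, trades file size for keeping fine detail.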
Handbrake does some funny stuff with film grain. I was running some short test encodes of several Blu-ray rips, including Patton (shot on 65mm), and x265 seemed to want to denoise harder. With x264 I set no-dct-decimate to 1 and it does wonders; I use it for all film sources with prominent grain. Anyway, the difference in bit rates was not 60% in my short tests, more like 30-40% on average. With digital sources that were much cleaner, or scenes with less motion, the compression was indeed better quality, but it was not like going from MPEG-2 to H.264, or even MPEG-4 Visual to H.264, way back when those transitions happened. I remember cutting bit rates in half using H.264 over MPEG-4 Visual and MPEG-2 with ease.
What are your suggestions for h265ing all of my home movies ?
They are all 1080/50p (mostly AVCHD 2.0 @ 28 Mbps from Panasonic camcorders) and I'm going to use Handbrake to reduce their file size by at least half!
Apart from using the slowest settings available, are there any internal commands that should be used for this specific content?
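Not an authoritative recipe, but as a starting point, something like this HandBrakeCLI invocation maps to what the GUI exposes (the file names are made up, and the -q value is only a guess to test against your own footage):

```sh
# 1080/50p AVCHD clip -> HEVC via x265, constant quality.
# -q 22 is a starting point only; encode a short test clip first
# and adjust up/down until the size/quality trade-off suits you.
HandBrakeCLI -i 00001.MTS -o clip_x265.mkv \
  -e x265 -q 22 \
  --encoder-preset slow \
  --cfr -r 50 \
  -E copy
```

Forcing constant 50 fps (--cfr -r 50) keeps the camcorder's smooth motion, and -E copy passes the original audio through untouched so only the video is re-encoded.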