I partly agree and partly don't. Look at audio.
There was a part of history where uncompressed audio was crap, then it was barely acceptable, then it was good, then it was great. Now it can be MORE than great, but to most folks (who may not even be able to distinguish good from great, or more likely don't CARE), it has reached that point of diminishing returns, so few need it to be much better anymore, and even low-bitrate Opus-compressed versions are considered decent enough now. So there is very little incentive to push the envelope further.
Video is just further back in the efficiency "hysteresis" curve (because it is much more complex and there is so much more data to begin with).
But...
Already for many consumers, DVD is "good enough", and for the great majority of consumers, BD is already "great enough". These same attitudes have been parroted on this site many times. I myself believe that 4k + HDR + HFR + WCG + MVD + no Subsampling actually could be "revolutionary" in many ways and not a "yawn" (if fully utilized by gifted artists), but it won't be for everybody, for the same reason that 32bitFloat + 196kHz + 22.2ch will never be for everybody. Some are just not able to tell the difference, some can but don't care or have other priorities. But I think the only folks "complaining" about BD quality are/will be technophiliac zealots.
Scott
I'm still waiting for newpball to offer an example of 1080p video that'll suffer if it's downscaled at all. And waiting, and waiting.....
A picture documentary of today's workflow. Checking how much quality would drop according to bitrate and applying filtering etc. Uploaded as jpgs so as not to hog too much bandwidth. This time, instead of using the player's File/SaveImage menu, I used the printscreen button on the keyboard to show exactly what I'm seeing on my TV when the encodes are running fullscreen (don't compare the thumbnails because the forum software sharpens them). So, as seen on TV (mine).......
1. Original Bluray video 30Mbps:
2. Re-encode, no filtering, CRF18, 11Mbps:
3. Re-encode, no filtering, CRF20, 8000Kbps:
4. Filtering to fix the jaggies as much as possible. I'd still like to know how they were created. See the top of the bongos.
1080p CRF20, 7000Kbps:
5. With Filtering, 720p, CRF 18, MPC-HC with Bicubic upscaling, 4700kbps:
6. With Filtering, 720p, CRF 18, MPC-HC with Bilinear upscaling, 4700kbps:
Given that #5 hardly looks any worse than #4, and definitely better than the source, it looks like it'll be 720p again for a 25.5Mbps bitrate reduction. Although I might go down to CRF16 for this one.
As is often typical, the 720p version doesn't take a noticeable quality hit unless a fairly soft resizer is used (i.e. #6), but most players/TVs have much better upscalers.
Last edited by hello_hello; 18th Jul 2015 at 20:24.
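Tallying the savings from the encodes above is straightforward. A quick sketch using the rounded bitrates quoted in this post (so totals may differ slightly from the exact file sizes):

```python
# Bitrates quoted for each test encode above, in kbps.
encodes = {
    "1. Original Blu-ray 1080p": 30000,
    "2. CRF18, no filtering, 1080p": 11000,
    "3. CRF20, no filtering, 1080p": 8000,
    "4. CRF20, filtered, 1080p": 7000,
    "5. CRF18, filtered, 720p (bicubic upscale)": 4700,
    "6. CRF18, filtered, 720p (bilinear upscale)": 4700,
}

source = encodes["1. Original Blu-ray 1080p"]
for name, kbps in encodes.items():
    saved = source - kbps
    print(f"{name}: {kbps} kbps, saves {saved / 1000:.1f} Mbps ({saved / source:.0%})")
```

The 720p encodes come in at under a sixth of the original bitrate, which is why the "it'll be 720p again" conclusion follows.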
-
Audio has Dolby Atmos and DTS:X. They're only partially object-oriented at the moment, so there's still room to grow.
They'll always come up with something new that will chew through more bitrate (e.g. PiP, multiple angles), and the ripping community will always be several steps behind.
-
Didn't say there wasn't room to grow. In fact, I completely agree with that. But the growth will be appreciated by a smaller and smaller group (diminishing returns), as audio is already over the shoulder of the efficiency hysteresis curve.
Personally, I like the idea of object-oriented audio, but they could just as (or more) easily have implemented 4-channel B-format (W, X, Y, Z), UHJ, G-format, or higher-order Ambisonic/Soundfield encoding.
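For anyone curious about the B-format mentioned here: first-order Ambisonic panning of a mono source is just four gain equations. A minimal sketch using the traditional convention (W carries the omnidirectional component at -3 dB; the function name and angle parameters are my own, not from any particular library):

```python
import math

def encode_b_format(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order Ambisonic B-format (W, X, Y, Z).

    Traditional convention: W is the omni component scaled by 1/sqrt(2);
    X, Y, Z are figure-of-eight components along front, left and up.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample * (1.0 / math.sqrt(2.0))
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)
    return w, x, y, z

# A source dead ahead (azimuth 0, elevation 0) lands entirely in W and X:
print(encode_b_format(1.0, 0.0, 0.0))  # ~(0.707, 1.0, 0.0, 0.0)
```

The appeal is exactly what the post suggests: four channels describe a full sphere of directions, and higher orders just add more spherical-harmonic channels.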
Scott
-
Yes, much like CPU clock speeds, there'll come a point where growth becomes impractical... and then they'll have to start a new era of simply refining things for best effect.
Joy.
-
-
I think his goal in #4-6 was to fix the jaggies. They are reduced, but the fine details are reduced as well. You can't have your cake and eat it too.
Whether or not it looks "better" is subjective. You can argue it either way, but I think ideally those jaggies shouldn't have been there in the original (i.e. not great source quality).
-
And as usual, when he is even remotely challenged to respond to an argument he originally started, you never see him crowing again; he just flies off to nest somewhere else, trying to bait other members into a debate which, in the end, he knows he will lose anyway.
Just look at some posts he made recently in the Camcorder section: he creates the argument, then runs away when he is beaten.
I am with you on this 1080p-to-720p downsizing: if you have reasonable 1080p source video, the 720p will look great at the right bitrate.
For example, my 1080/50p MTS camera files average 24Mbps; when dropped to 720/50p they are about 11.5Mbps, around 48% of the source bitrate. Encoding with VRD Pro using the x264 encoder, they look beautiful, even if upscaled again on a 1080p HD TV, and what can I say about when they are played on a 768p HD TV.
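Those numbers are easy to sanity-check (a quick sketch; strictly, 11.5/24 works out to about 48% of the source bitrate, and per-hour file sizes follow directly from bitrate):

```python
def gigabytes_per_hour(mbps):
    """Approximate file size in GB for one hour of video at a given Mbps."""
    return mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/hour -> GB/hour

source_mbps = 24.0    # 1080/50p camera originals
encoded_mbps = 11.5   # 720/50p x264 re-encodes

print(f"{encoded_mbps / source_mbps:.0%} of the source bitrate")
print(f"{gigabytes_per_hour(source_mbps):.2f} GB/h -> "
      f"{gigabytes_per_hour(encoded_mbps):.2f} GB/h")
```

So roughly 10.8 GB per hour of footage shrinks to about 5.2 GB per hour, which is where the storage saving comes from.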
As far as re-encoding videos with x265 to save a bit of storage goes, I agree with many that it is just a waste of time for the small storage gain, but then again, I don't know enough about x265 to be sure how it differs from x264.
For me, I paid an additional fee to VRD to have them unlock the x264 encoder in my Pro version, and to my eyes I cannot see any difference, with or without my glasses, even when the x264 and the default MainConcept H.264 encoder outputs are at the same bitrate.
A non-issue for me.
Last edited by glenpinn; 28th Jul 2015 at 10:11.
-
Just as I'm not surprised that when I posted another couple of screenshots demonstrating 720p downscaling in your ridiculous thread on creating video for future generations, you chose to ignore them, as you always do when someone shows you something that contradicts your blinkered view.
Just as I'm not surprised that "frankly I'm not surprised" is the most intelligent comment you could manage here.
And I'm still waiting for newpball to offer an example of 1080p video that'll suffer if it's downscaled at all. And waiting, and waiting..... but that's silly of me because I should know all he'll manage is talk, talk, talk, talk, talk.....
Last edited by hello_hello; 28th Jul 2015 at 11:30.
-
Why do you think that is a ridiculous thread?
The question of what format to use, HD or UHD, is in my opinion a contemporary one. Do you disagree with that? So a person who comes to you and asks 'should I record my wedding in UHD or HD?' is asking something ridiculous? Would you perhaps show him your 1080p/720p comparisons?
Are you perhaps saying that members on this forum are not interested in questions about conservation of memorable events in video and what formats to use?
If not, then why, pray tell, do you think a poll about HD or UHD for important events is a ridiculous one?
-
Clearly everyone else does not agree with you, and I thought that topic was somewhat strange as well. Just because you have this strange habit of creating new threads on just about any topic you can think of does not mean that everyone will reply, because even if they do, you're sitting in your armchair with a gun ready to shoot them down if you don't agree with them.
Little wonder you have lost all respect from me, simply by your actions.
Edit: just ordered my Flycam Nano HD Stabilizer, yippeeeee!
-
I think that's because you're enough of a nutter to claim it doesn't even if it does, if agreeing meant you'd have to acknowledge you're not always right.
Yes, #5 looks better than the original Bluray. No question. The jaggies are gone and the edges of objects are cleaner. FFS, just look at the strings on the guitar, or the light reflecting off the edge near her hand. Do they really look more natural in the original video?
In case you've never encoded a video before, or posted without thinking: 90% of the difference you can see in fine picture detail between #5 and the original is a result of the filtering to remove the jaggies, not the resizing. So if you thought you'd found an opportunity to make a derogatory comment about downscaling, you're out of luck, and you only managed to help newpball make himself look silly again.
For the record, I used nnedi3 to throw away every second scanline and replace it with an interpolated version, then I downscaled to 720p, and finally I ran QTGMC in de-interlacing mode on the progressive video to stabilise any remaining jaggies (after a fair bit of experimenting it seemed to be more effective than running it in progressive mode). Naturally that process affected some fine detail in places, but if you look carefully at the background you'll see it also cleaned up a bit of noise, and you might see it also reduced some encoder blocking..... because yes, despite the 30Mbps bitrate there was some fine blocking in places, especially where the lighting was changing quickly.
Does #4 look better than the original? Of course it does, but any "which looks best" nitpicking while comparing screenshots becomes completely irrelevant when the pictures are moving, aside from the encoded version looking a little "cleaner" and the artefacts from the bad de-interlacing (or however the jaggies were created) not jumping out at you. In the original they move as objects or the camera moves. In the encoded version, even where I didn't entirely eliminate them, they're far less obvious.
Original:
Encode:
(Some info for newpball to ignore)
If you want to post for the purpose of making derogatory comments about downscaling, you probably need to keep the comparison confined to screenshots #4 and #5, as they were both filtered and re-encoded, but #4 is 1080p while #5 is 720p. Are there differences in the screenshots? Of course there are, but they're probably more encoding differences than anything else, and in the end I encoded it using a lower CRF value than I used for the test encodes. Would the differences be anything you'd see while the pictures are moving? I very, very much doubt it. If you can, you probably need your brain checked for tumours.
Last edited by hello_hello; 28th Jul 2015 at 11:39.
-
I guess if newpball doesn't want to be taken seriously, that's up to him. I've said several times his selective arguing makes it impossible to discuss a topic like an adult when he continually ignores anything that doesn't suit him, and it's also a fair indication his argument doesn't have much substance. Once again newpball's picked a side issue.... a single word from my post.... in order to argue about it as a distraction from the fact he ignored the topic completely.
-
Why didn't you say so earlier, when sophisticles made a medical comment about my needing an eye doctor? Were you not in need of a distraction from the topic at that stage? In case it's not incredibly obvious to you yet, my comment was a comeback for his and nothing else. What's sad is that I had to explain it.
-
Last edited by hello_hello; 29th Jul 2015 at 09:19. Reason: spelling
-
Several posts have warned about the cost of the extra electricity used during transcoding (or any other compute-intensive process). During the winter heating season, however, the marginal cost is approximately ZERO for many users, because the energy used by a computer and most appliances is entirely converted to useful heat, which correspondingly reduces the amount of heat that would otherwise need to be generated by the building's heating system to heat the rooms containing the appliances.
An example may help to clarify. Suppose my appliances (not counting the home's heating system) collectively use 50 watts of electricity while on. If left on 24 hours per day, they use 1200 watt-hours (24 hours x 50 watts) of electricity per day, and generate approximately 1200 watt-hours of heat per day since the conversion of electricity to heat is near complete. (We can neglect the small amount of energy that escapes out of the house as wifi or cellular radio, and the visible light that escapes through the windows.) If instead I turn off my appliances 20 hours per day, they generate approximately 200 watt-hrs (4 hours x 50 watts) of heat per day. Suppose that if the appliances are turned off 20 hours per day, the home heating system uses 5000 watt-hrs per day of electricity during winter to maintain the temperature of the home at 72 degrees Fahrenheit. If the appliances are left on 24 hours per day, then the heating system will use approximately 4000 watt-hrs of electricity per day to maintain 72 degrees F. In other words, in winter the total electricity consumption per day would be about 5200 watt-hrs regardless of whether the appliances are left on or turned off: either 1200+4000 or 200+5000.
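The arithmetic in the example above can be checked in a few lines (a sketch that assumes, as the post does, that every extra watt-hour of appliance heat displaces a watt-hour of heating):

```python
APPLIANCE_WATTS = 50       # combined draw of the appliances while on
HEATING_ALONE_WH = 5000    # daily heating electricity with appliances on 4 h/day

def daily_total_wh(hours_on):
    """Total daily electricity (Wh): appliances plus heating, assuming the
    heating system makes up whatever heat the appliances don't supply."""
    appliance_wh = APPLIANCE_WATTS * hours_on
    baseline_appliance_wh = APPLIANCE_WATTS * 4  # the 200 Wh baseline case
    heating_wh = HEATING_ALONE_WH - (appliance_wh - baseline_appliance_wh)
    return appliance_wh + heating_wh

print(daily_total_wh(4))   # 200 + 5000 = 5200 Wh
print(daily_total_wh(24))  # 1200 + 4000 = 5200 Wh
```

The total comes out the same either way, which is the point: in winter, the appliances' electricity is not wasted, it is simply heating delivered by a different route.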
If the building's heating system also runs on electricity (rather than some cheaper form of energy), and the warm air from the heating system passes through ducts to reach the rooms, then the heating system is actually less efficient at heating those rooms than the appliances are, because the appliances release their heat directly into the rooms while some of the heating system's heat is wasted warming the ducts. In this case, running the appliances more should actually save a small amount of electricity. (An exception is the clothes dryer, since most of its heat is exhausted to the exterior and wasted.)
So, for many people, encoding & transcoding during the winter are energy-efficient processes.
The opposite is true the rest of the year. Worst is the summer, when the electricity used by appliances costs approximately double, because the heat produced by the appliances forces the home's air conditioning system to do more work to cool the home.
-
Does anybody know where to find a 32-bit x265 1.9x build anywhere? I can't find one.