VideoHelp Forum
  1. It's 2016, this is a pile of nonsense.

    I tried Adobe Premiere Pro CC some time ago; I don't remember exactly why I stopped, but I think it was because it may not work on AMD at all.


    I've mainly used Sony Vegas, and now I've got the new Vegas 14 from Magix. The MainConcept AVC encoder is the only one with VBR, but it has no GPU acceleration, first big hit, so I'm forced to use the CBR-only Sony AVC. Oh, and there is no GPU acceleration for Sony AVC in Vegas 14 either, so I had to open up the old version 13, where I got it to work, but that's just not a long-term solution. I mean, this whole system is just so weird to me: a codec existing as a specification, and then anyone who wants to can "go at it" with an implementation. It's just so stupid to me. The whole idea is fundamentally broken: making up hype around "look what the future has for us", announcing the specification, oh look, it's complete, the new HEVC codec is out, oh, but it's still not implemented, so you have to wait 10 years before it'll actually work in practice.

    But come on, even with GPU acceleration working now it's laughable: the GPU is only being used 10-30% half the time. I'm still going to wait an hour of encoding time for 1000 kbps output quality at lower than 720p resolution, like 900x504. It's a total joke, what a POS state this industry is in.


    FFmpeg is obviously totally useless in this area. How many years has this been "experimental"?

    What's the point of all these new modern codecs if their implementations are a pile of horse manure? All these codecs and features are overhyped, and I think it could amount to fraud and deceptive practices. I'm not an authoritarian, so I don't like how this sounds even to myself, but I hate this so much that if I were a dictator I would ban any kind of hyping and announcing of nonexistent crap that doesn't work and is only at the blueprint stage in some stupid bureaucrat's office.

    "Oh, our new codec can do magnificent gpu acceleration". Yeah the codec is a specification how to implement it, it's nonexistent, it's pointless, stop trolling, stop announcing, stop hyping nonexistent nonsense.

    And GPU manufacturers are responsible as well, on both sides. I have had absolutely zero benefit from any of these "features"; for the last 10 years I keep hearing about this CUDA thing and the "computing" buzzword. It's an annoying buzzword, nothing practical uses it, it's a dud, and the word itself is annoying: CUDA, like some name for a cheese.

    Thankfully I mostly modernized my DVDs a while ago; nowadays it's just some screen recordings I do occasionally when they fill up.

    Seriously, the commercial codecs are deceptive, and I think an investigation should be made into if and how the industry as a whole has used immoral marketing and propaganda practices, and it should be determined what percentage of the blame each player carries. This has been hyped up for years and it's going nowhere.

    This is exactly the same as Nvidia showing off that fake mock-up GPU held together with wood screws some years back, I think it was Fermi. It should not be legal; it's fundamental LIES.

    I'm so serious that if I were wealthy I would get a team of lawyers, hire people from the industry, and form a non-profit investigation group. The group would spend a year or so researching the evidence, identify those responsible, and then notify the EFF and other consumer organizations for collaboration and further advice, and we could launch a joint lawsuit against those responsible in the industry for deceptive marketing and nonexistent promises. Yes, this would include suing the MPEG organization itself. This whole half-baked idea of making blueprints and waiting for some bloke to "have a go at it" is just not optimal. The specification doesn't even specify minimal requirements; it has so many of these profiles that anyone can say they "support X265" while under the hood picking the lowest profile and supporting some minimal set of features, and even those don't go through any real-world practical tests to figure out what acceptable real-world encoding times actually are.

    That's like saying "This computer supports Windows 10" ... but it only has 256 MB of RAM and a 300 MHz CPU, so it runs 50 times slower than a normal PC. That shouldn't be legal, at least from a moral standpoint. And then their lawyers can argue it's all about the fine print, that the term "supports" doesn't indicate how fast it runs. Yeah, but these are LOOPHOLES, and that's how they get away with the super-specific contract language. From a practical standpoint, it's just people who like to screw with everyone else.
    Last edited by Wader8; 17th Oct 2016 at 20:21.
  2. This video https://www.youtube.com/watch?v=qOf4dkwdQ14

    Is there any truth to it? My performance increase was a measly 50%, and even that isn't certain: I was using 2-pass, 1 slice with VBR, and with GPU Accel CBR I was using 4 slices and 1 pass.
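
    For reference, here is roughly how such a comparison could be reproduced outside the NLE. This is only a minimal sketch using ffmpeg's libx264 software encoder, where input.mp4, output.mp4 and the 1500k bitrate are placeholder values; changing only one setting at a time (passes or slices, not both) would make a speed figure like that 50% easier to interpret.

    Code:
    :: Pass 1: analysis only; video output is discarded (use /dev/null instead of NUL on Linux/macOS)
    ffmpeg -y -i input.mp4 -c:v libx264 -b:v 1500k -pass 1 -an -f null NUL
    :: Pass 2: the actual VBR encode, reusing the stats file written by pass 1
    ffmpeg -i input.mp4 -c:v libx264 -b:v 1500k -pass 2 -c:a aac output.mp4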
  3. No surprise you are confused if you put all "buzzwords" into the same dough and expect the cake to taste great after baking it way too hot.

    Why is massively parallelized computation, as excellently executed by CUDA, not useful for HEVC? ... Because having hundreds of people forming optimal rolls does not help bake one wedding cake. You don't need an array of separate cubicle offices; you need a team of specialists with division of work and a lot of communication between all members of the team. And on top of that, you can't wait for your material to be delivered by container ship (transporting video frames between main RAM and video RAM takes effort). HEVC does not need more "breadth"; it needs more "depth" and more exchange between each part.

    Last side note: x265 is a specific implementation of the HEVC standard, like the MainConcept HEVC encoder. And its 'x' is lower-case.
    _

    P.S.: More slices are less efficient. Each slice is processed separately and cannot see the content of the other slices, so the encoder may miss chances to find similar areas it could reference to save bitrate.
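
    If anyone wants to see that effect in numbers, here is a minimal sketch using ffmpeg's libx264 encoder (file names and the bitrate are placeholders): encode the same clip with 1 and 4 slices at the same bitrate, then measure each output against the source with the ssim filter. The 4-slice encode should score slightly worse for the same file size.

    Code:
    :: Same source, same bitrate; only the slice count differs
    ffmpeg -i input.mp4 -c:v libx264 -b:v 1500k -slices 1 -an out_1slice.mp4
    ffmpeg -i input.mp4 -c:v libx264 -b:v 1500k -slices 4 -an out_4slices.mp4
    :: Compare each encode against the source; the SSIM score is printed at the end of the log
    ffmpeg -i out_1slice.mp4 -i input.mp4 -lavfi ssim -f null -
    ffmpeg -i out_4slices.mp4 -i input.mp4 -lavfi ssim -f null -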
    Last edited by LigH.de; 17th Oct 2016 at 11:53.
  4. Originally Posted by Wader8 View Post
    But come on, even with GPU acceleration working now it's laughable: the GPU is only being used 10-30% half the time. I'm still going to wait an hour of encoding time for 1000 kbps output quality at lower than 720p resolution, like 900x504. It's a total joke, what a POS state this industry is in.
    In NLEs, not all operations are GPU-accelerated, and you sometimes have other bottlenecks.

    But I agree the implementation of GPU acceleration could be improved immensely. And the encoding quality is terrible.

    FFmpeg is obviously totally useless in this area. How many years has this been "experimental"?
    If you want just speed, it's fantastic. Both NVEnc and QSVEnc are supported by ffmpeg. But the quality is lower, and you have to build it yourself (the distributed builds aren't compiled with --enable-nonfree, so they don't contain these encoders, but it's fairly easy to build with the autobuild scripts).
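
    As a sketch of what that looks like once you have such a build (file names and bitrate are placeholders; h264_nvenc is the encoder name in recent ffmpeg builds and needs an Nvidia card with the NVENC block):

    Code:
    :: Check whether this build contains the hardware encoders at all
    ffmpeg -encoders | findstr nvenc
    :: Hardware H.264 encode; -c:a copy passes the audio through untouched
    ffmpeg -i input.mp4 -c:v h264_nvenc -b:v 1500k -c:a copy output.mp4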


    And GPU manufacturers are responsible as well, on both sides. I have had absolutely zero benefit from any of these "features"; for the last 10 years I keep hearing about this CUDA thing and the "computing" buzzword. It's an annoying buzzword, nothing practical uses it, it's a dud, and the word itself is annoying: CUDA, like some name for a cheese.
    Video encoding is really a tiny fraction, an afterthought of CUDA's intended use. It's fantastic for other things like physics simulations, ray tracing, and 3D work & rendering.

    Besides, on newer Nvidia cards CUDA hasn't even been used for video encoding for a few years. There is dedicated silicon, separate from the "CUDA cores", that is responsible for video decoding and encoding. CUDA, as used in NLEs, mostly does things like scaling and accelerating some filters.
  5. Well, the expectations were way through the roof; that's why I had an emergency steam release. Looking at it, if it saved me about 1 hour... okay, 30 minutes were already done when it was showing 1:36, so I guess the time saved was...

    I'm usually not as time-pressed as I am right now; when I was modernizing DVDs I would leave a batch process running 12 hours overnight, but other, non-PC circumstances get in the way right now.
    But when I have quick, rough screen captures, I just want to put them together in Vegas quickly, stitch them, remove the irrelevant stuff, and make a standard watchable HD video out of it, so the text is clear enough.


    So, would we actually need a separate physical PCIe encoding card designed for this kind of work?
  6. A separate card won't help the current Vegas implementation. (I don't know about Vegas 14 for certain, but I assume it's the same.)

    Nvidia cards perform poorly in Vegas; it's more optimized for OpenCL ATI/AMD cards. The reverse is true for Adobe.
  7. The Nvidia video decoder area on the die is called "PureVideo", its different generations being named "feature sets" (starting with the GeForce 6).

    And the name of the encoder area on the die is "NVENC" (if present; it depends on the chip generation, the first being "Kepler").
  8. Originally Posted by Wader8 View Post
    It's 2016, this is a pile of nonsense. [...]
    They're doing the best they can. These are companies that put out a product; you choose whether to buy it, develop your own, or do without. This technology is harder to execute than you are imagining.
  9. I am open to compromises, but the idea that pushing proprietary CUDA is them doing the "best they can" is just not it. Mostly, though, I just had to let off some steam; it's probably all more complex than that.
  10. There are ways to use NVEnc indirectly in Vegas. (Adobe has a 3rd-party plugin, made by a user, not officially sanctioned by Adobe; unfortunately nobody has made one for Vegas.) Again, this is the actual newer Nvidia hardware encoder, NVEnc, not CUDA. It is faster, maybe 3-5x, than the CPU, at least on a Maxwell. Newer cards actually won't be faster in Vegas, because the limiting factor (bottleneck) is the colorspace conversion: Vegas works in RGB, but the destination encoding format is usually YUV 4:2:0. If you could do that faster, you could encode faster (the newer Pascal cards would probably be 10-15x faster if there weren't a bottleneck).

    To do it from Vegas you would use the DebugMode FrameServer and Avisynth, then use NVEncC or ffmpeg to access NVEnc, as sketched below. This is only worth it if speed is of the utmost importance, because the compression efficiency and quality are quite a bit lower. But if some end user can code this, and even a plugin for Adobe, why not Vegas or the Vegas creative team? I don't know. Even if the quality is lower, it's nice to have other options.
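
    For the curious, here is a minimal sketch of that frameserving chain. It assumes DebugMode FrameServer is rendering from Vegas to its signpost AVI, and an ffmpeg build with Avisynth and NVENC support compiled in; every path and file name below is a placeholder.

    Code:
    :: serve.avs contains (an Avisynth script; '#' starts a comment there):
    ::   AviSource("C:\frameserver\signpost.avi")
    ::   ConvertToYV12()   # NVENC expects 4:2:0; Vegas serves RGB
    :: Feed the script straight into the hardware encoder:
    ffmpeg -i serve.avs -c:v h264_nvenc -b:v 1500k -c:a aac output.mp4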
  11. Originally Posted by Wader8 View Post
    So, would we actually need a separate physical PCIe encoding card designed for this kind of work?
    Yes, but you are too poor for it...
    Anyway, this is the cheapest solution on the market for such things:
    http://www.intelserveredge.com/intelvca/
    And as mentioned, CUDA or GPGPU is useful for certain classes of software problems, and classical video compression doesn't fit nicely into the GPGPU area.
  12. Thanks for these inputs.

    Even if I wanted a good, slow encode, I can't set the Intel HEVC rendering option below 2000 kbps, which is such a dealbreaker that the whole codec is totally useless for my type of stuff. Most of the stuff I target is 720p at 1500 kbps, and with HEVC I would have to drop that down to about 700 kbps average and 1500 maximum.
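
    For what it's worth, outside Vegas those targets are easy to hit. A minimal sketch with ffmpeg's libx265 wrapper, where the file names are placeholders and -maxrate/-bufsize map to x265's VBV rate capping:

    Code:
    :: ~700 kbps average, peaks capped at 1500 kbps by the VBV buffer
    ffmpeg -i input.mp4 -c:v libx265 -b:v 700k -maxrate 1500k -bufsize 3000k -c:a aac -b:a 96k output.mkv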

    On the other side, I figured out that GPU acceleration for Sony AVC does indeed work in Vegas 14; they just hid all the GPU info. You don't get a selection box to choose whether to use CPU only, and the "Check GPU" button is missing from the System tab, but I see now it's actually working better, humbly at a steady 20%. It's still only 20% GPU activity, but at least it's actually calculating more, since it's not jumping up and down and is busy all the time. I can see that it has an effect, and I will only have to wait 30 minutes for this hour-and-a-half clip.

    It's just SUCH a bummer that the template supports only CBR.
    Last edited by Wader8; 17th Oct 2016 at 18:34.
  13. Originally Posted by Wader8 View Post
    Even if I wanted a good, slow encode, I can't set the Intel HEVC rendering option below 2000 kbps, which is such a dealbreaker that the whole codec is totally useless for my type of stuff. Most of the stuff I target is 720p at 1500 kbps, and with HEVC I would have to drop that down to about 700 kbps average and 1500 maximum.
    You need 4K to see the 50% difference between HEVC and AVC (and to wait a few years for improvements in the encoding software).
  14. The main focus of Sony Vegas GPU acceleration is on features used for editing, like effects, transitions, compositing, etc., and for those, GPU acceleration does its job nicely. GPU encoding apparently was only an experiment from MainConcept for old cards up to Fermi/VLIW, which they just left behind and don't even waste time updating anymore. Adobe Premiere never implemented anything like GPU encoding; their GPU acceleration is for editing effects, like Vegas' OpenCL engine.

    Apparently Sony AVC only does motion estimation on the GPU and nothing more, so the speedup there is not as big as MainConcept's CUDA/OpenCL on the old Fermi/VLIW cards. MainConcept GPU encoding did a lot of work on the GPU, so the gains were great compared to MainConcept CPU. But a point to consider: MainConcept CPU-only has quality approaching x264 slow/medium, while the CUDA encoding quality is closer to x264 superfast.

    CUDA H.264 encoding gives worse quality at low bitrates compared to x264 veryfast; at higher bitrates the quality difference shrinks. I guess they dropped GPU encoding because the quality was not good enough.