VideoHelp Forum

  1. Despite its title, this thread doesn't intend to provoke a fanboy war. It's just that I'm really curious about the common claim that Intel CPUs (let's say Haswell i7s) are so vastly superior to AMD's FX eight-core CPUs. This question is asked many times from an economic point of view (since AMD is cheaper) and here it's no different. I'm currently considering assembling a new PC for editing (basically AVCHD video shot with a Panasonic camcorder, editing in Premiere CS6, authoring DVDs from it and maybe, later, Blu-ray discs). I'm aware that the AMD CPUs use more power, run hotter and wind up a few notches behind equivalent Intel CPUs in most benchmarks, but I don't see those as insurmountable obstacles: I don't edit 24/7, far from it, so power consumption is not that important; I also plan to install a better/larger CPU fan, so high temperatures will be dealt with; and as for performance, I can certainly live with the benchmark runner-up. My question is: is there an actual, technical fact that keeps an FX-83xx from being a reliable and reasonably fast/powerful foundation for an editing PC? (Remember, I'm not aiming at the top of the list, not looking for a high-end, top-notch system, no bells and whistles.)
  2. Member racer-x
    That's a good question, and it would be great if we could get honest answers. I actually just bought a new PC (i7 4790) because my wife's PC kicked the bucket, so I upgraded mine and gave her my old one (a 6-year-old Q9300). I was contemplating getting an AMD, but I ended up getting this one for $500. It's a low-end machine, but I bought it for the CPU anyway.

    The problem with editing tests is that everyone uses a different editor and has different needs. I can tell you that editing my Panasonic 1080 60p AVCHD was a breeze with my old Q9300. I'm really interested in possibly editing 4K HEVC in the near future. My old Q9300 could do it, but it was a little sluggish. We'll see if this i7 fares any better when I get some more RAM.
    Got my retirement plans all set. Looks like I only have to work another 5 years after I die........
  3. To get a fair comparison, get an Intel and an AMD CPU with the same number of cores and run them at around the same clock speed, then run some encoding benchmarks (x264/x265).
    A comparison using a specific editor will be difficult, since some editing processes can benefit from the GPU, and how much work can be offloaded to the GPU varies greatly from editor to editor.
    Stopping development until someone save me from poverty or get me out of Hong Kong...
    Twitter @MaverickTse
  4. jman98 (Banned)
    Is there an actual reason that an AMD system wouldn't meet your needs? No.

    Most of the time AMD has had to compete on price rather than speed. "More bang for the buck" as we say. I work in IT for a living and I only use AMD PCs at home for Windows. Somebody has to support them, so I do. I don't want to live in a world where there is no AMD. A lot of people buy Intel simply because of the name or because they think it's faster whether it is or not. As long as people keep finding excuses to buy Intel, AMD is going to continue to barely scrape by. Maybe one day they won't be there any more. Geez, it's not like Intel systems are 5 times faster than their equivalent AMD counterparts.
  5. As I stated in the other thread, I had an overclocked 3570k (@ 4 GHz) and after it died I settled on an FX-8320. Here's what I have noticed:

    1) In scenarios that are poorly threaded, lightly threaded or limited by a single-threaded process in the pipeline, such as using certain filters during encoding, the Intel CPU walks all over the FX, and easily at that. In scenarios that don't stress all the cores, such as using x264 with the ultrafast preset, the Intel CPU kills the AMD; in my tests with ultrafast, the 3570k was easily twice as fast as the 8320.

    2) In scenarios that are heavily threaded, the AMD turns the tables on the Intel CPU in shocking fashion. In my tests with 1080p sources, the 3570k would hit close to 100fps when encoding to 1080p with x264 + ultrafast, but would slow down to maybe 5-6fps with placebo. The 8320, on the other hand, hits about 42-45fps with x264 + ultrafast BUT encodes at 10fps with placebo! More importantly, with Divx HEVC the 3570k was hitting 18fps while the 8320 hits 25fps. With x265 + ultrafast the Intel was barely hitting 5fps; the 8320 actually hits 15fps! With this CPU, x265 is now a viable encoding option for me.

    The above would change with a Haswell-based CPU, since those processors support AVX2 and I'm assuming x265 probably benefits quite a bit from that instruction set, but when you take price into account, the AMDs are an unbelievable value.

    In fact, I am wondering what happens with 4K sources and encoding. Maybe this would be the perfect thread to find out: maybe we can get various members of this forum, in this thread, to download 4K sources from the link below, run some encoding tests, and report their speeds and the CPU they use.

    Let's settle on VidCoder as the app and use the ProRes samples from the source linked below:

    http://www.elementaltechnologies.com/resources/4k-test-sequences
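    For anyone who would rather run the proposed test from the command line instead of clicking through VidCoder, here is a minimal timing-harness sketch. It assumes ffmpeg was built with libx264/libx265 and is on the PATH, and that one of the downloaded ProRes clips has been renamed to sample.mov; the filename and the preset/CRF choices are illustrative assumptions, not part of the original proposal.

```python
#!/usr/bin/env python3
# Rough timing-harness sketch (not from the thread): times a few x264/x265
# encodes of a local clip with ffmpeg so different machines can report
# comparable numbers. Assumes ffmpeg with libx264/libx265 is on PATH and that
# "sample.mov" is one of the downloaded ProRes clips; both are assumptions,
# as are the preset/CRF choices.
import subprocess
import time

SOURCE = "sample.mov"  # hypothetical filename for a downloaded 4K sample

TESTS = [
    ("x264 ultrafast", ["-c:v", "libx264", "-preset", "ultrafast", "-crf", "18"]),
    ("x264 placebo",   ["-c:v", "libx264", "-preset", "placebo",   "-crf", "18"]),
    ("x265 ultrafast", ["-c:v", "libx265", "-preset", "ultrafast", "-crf", "20"]),
]

for name, codec_args in TESTS:
    cmd = ["ffmpeg", "-hide_banner", "-loglevel", "error", "-nostdin",
           "-i", SOURCE, *codec_args, "-an", "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)        # encode, discard the output
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed:.1f} s")      # clip frame count / seconds = fps
```

    Reporting the elapsed seconds (or dividing the clip's frame count by them for fps) alongside your CPU model would make the results easy to compare.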
  6. newpball (Banned)
    Originally Posted by MaverickTse:
    To get a fair comparison, get an Intel and an AMD CPU with the same number of cores and run them at around the same clock speed, then run some encoding benchmarks (x264/x265).
    Does not give a fair comparison at all. AMD and Intel cores have differences; they cannot be compared one to one.

    I think AMD definitely gives you better price/performance, but the performance is limited. For video editing I prefer Intel over AMD. In my experience, overclocking on Intel is more stable and there is generally more headroom. For Intel you should always take the K version if you care at all about price/performance.

    This price/performance curve is not just an Intel vs. AMD thing. For instance, within Intel's line-up the price/performance of socket 1150 is much better than socket 2011, but sometimes you just need the extra juice. Price/performance goes totally down the drain with Xeon processors; for the high end you have to pay excessive money.
  7. I *guess* the threading difference (if it is really a threading issue) is due to compiler/library/coding issues. Yet this is an interesting discovery that deserves further investigation.

    Don't expect too much from AVX/AVX2. There will be some performance gain, but it won't be as large as the jumps between SSE revisions.
    Stopping development until someone save me from poverty or get me out of Hong Kong...
    Twitter @MaverickTse
  8. This comparison table should interest you. The score is the number of seconds needed to encode 261 seconds of video to H.264 Blu-ray 720p in CS6 (surprising that, even for a 2013 chart, current CPUs are listed). At half the price, the AMD is only 17% slower (32 sec.) than the Haswell i7. It gets worse when you include the cost of the motherboard.
  9. Originally Posted by newpball:
    Originally Posted by MaverickTse:
    To get a fair comparison, get an Intel and an AMD CPU with the same number of cores and run them at around the same clock speed, then run some encoding benchmarks (x264/x265).
    Does not give a fair comparison at all. AMD and Intel cores have differences; they cannot be compared one to one.
    Then are you rejecting ALL CPU benchmarks? Can you propose a way that the CPUs from the two companies can be compared?

    Personally, I will not buy any AMD products for the desktop again due to traumatic reliability experiences over the past 15 years... AMD/ATI products have always lasted a shorter time for me than Intel and Nvidia products. Unless AMD gives me a new high-end rig that proves reliable and durable over five years of use, I cannot trust their products.
    Stopping development until someone save me from poverty or get me out of Hong Kong...
    Twitter @MaverickTse
  10. Originally Posted by MaverickTse:
    Originally Posted by newpball:
    Originally Posted by MaverickTse:
    To get a fair comparison, get an Intel and an AMD CPU with the same number of cores and run them at around the same clock speed, then run some encoding benchmarks (x264/x265).
    Does not give a fair comparison at all. AMD and Intel cores have differences; they cannot be compared one to one.
    Then are you rejecting ALL CPU benchmarks? Can you propose a way that the CPUs from the two companies can be compared?

    I think he's saying that the proposed comparison method is flawed for what the OP is interested in. The OP is more interested in comparing performance within a certain price range.

    The analogy would be testing a V6 car against a 4-cylinder in the same price category. Do you deactivate 2 cylinders in the V6 just to make it "fair"?

    I think you should just take whatever you can buy in that price range and test it with the applications he's going to be using. That more accurately reflects his usage scenario. He's not going to be disabling cores (I hope).
  11. Yeah, in terms of bang for the buck, AMD wins over Intel.
    I fell for this argument when I was a student and really lacking $$$, only to learn the hard way that the AMD/ATI failure rate is higher.

    I still use AMD for some machines, such as an HTPC (the APUs are pretty good for that) and a netbook, but I'm not going to use it for my main production machine.
    Stopping development until someone save me from poverty or get me out of Hong Kong...
    Twitter @MaverickTse
  12. Originally Posted by MaverickTse:
    Yeah, in terms of bang for the buck, AMD wins over Intel.
    For initial expenditures, yes, for sure. Total cost of ownership is usually less for Intel if you own it for more than a few years. Of course it's going to vary wildly with usage and local electricity rates, so he'd have to do a little calculation. But the power consumption difference is significant when idle, and just enormous under load. Some people don't even pay for electricity (maybe it's included in their condo fees or rent), while others pay a lot.
  13. newpball (Banned)
    Originally Posted by MaverickTse:
    Originally Posted by newpball:
    Originally Posted by MaverickTse:
    To get a fair comparison, get an Intel and an AMD CPU with the same number of cores and run them at around the same clock speed, then run some encoding benchmarks (x264/x265).
    Does not give a fair comparison at all. AMD and Intel cores have differences; they cannot be compared one to one.
    Then are you rejecting ALL CPU benchmarks? Can you propose a way that the CPUs from the two companies can be compared?
    Compare them by price/performance and ignore the number of cores.

    For instance, an 8-core AMD has only 4 floating-point units available. Intel has Hyper-Threading; AMD does not. Cache memory is often larger on AMD, while overall memory bandwidth is usually better with Intel.
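    To make the price/performance suggestion concrete, here is a tiny sketch of the metric being described: encoding throughput divided by street price, with core counts ignored. The fps figures are the ones sophisticles reported earlier for the two workloads; the $230 price for the 3570K is a placeholder assumption, while the $120 for the 8320 is the figure mentioned later in the thread.

```python
# Toy illustration of fps-per-dollar; prices and fps as noted in the lead-in.
def fps_per_dollar(fps: float, price_usd: float) -> float:
    """Encoding throughput per dollar of CPU; higher is better."""
    return fps / price_usd

prices = {"i5-3570K": 230.0, "FX-8320": 120.0}   # USD; 3570K price assumed
workloads = {
    "x264 ultrafast (lightly threaded)": {"i5-3570K": 100.0, "FX-8320": 45.0},
    "x265 ultrafast (heavily threaded)": {"i5-3570K": 5.0,   "FX-8320": 15.0},
}

for workload, results in workloads.items():
    for cpu, fps in results.items():
        print(f"{workload}: {cpu} = {fps_per_dollar(fps, prices[cpu]):.3f} fps/$")
```

    The point of the metric is that the "winner" flips depending on how well the workload is threaded, which is exactly the pattern reported above.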
  14. Originally Posted by MaverickTse:
    Personally, I will not buy any AMD products for the desktop again due to traumatic reliability experiences over the past 15 years... AMD/ATI products have always lasted a shorter time for me than Intel and Nvidia products. Unless AMD gives me a new high-end rig that proves reliable and durable over five years of use, I cannot trust their products.
    Uh?! I fix these things at the lowest level, replace ICs, reflow BGA (I would love to have the tools to reball)... I can tell you I see just as many Intel systems (especially big-name ones) as AMDs. One major difference between the two is the availability of low-cost third-party chipsets; since the first Core series, only Intel chipsets have been available.

    Their higher cost makes them less likely to be used in cheap motherboards, and manufacturers tend to follow Intel's reference design and guidelines (I wouldn't be surprised if there is some selection by Intel involved).

    OTOH, it's easy to find sub-$50 Biostar or ECS AMD boards; you gotta wonder how they make them so cheap...

    What I'm saying is that if a system is put together based on price, the AMD rig is more likely to crap out sooner.
  15. newpball (Banned)
    Originally Posted by nic2k4:
    What I'm saying is if a system is put together based on price the AMD rig is more likely to crap out sooner.
    No, I would not agree with that at all.

    Your suggestion that motherboard manufacturers use inferior electronics for AMD is based on what facts?
  16. I have had both AMD- and Intel-based systems die on me, and it was always due to the motherboard I used, the video card I used, the RAM I used, or the power supply.

    In general, using a cheap motherboard, like some of the early ECS or Biostar or ASRock boards, was just asking for trouble, as was using a no-name power supply or some of that "value RAM", like the Kingston garbage. The only time the CPU itself has been the culprit was when I was overclocking and then running the CPU full tilt for long encodes.

    I'm pretty sure I can fry either AMD or Intel CPUs equally easily by being ham-fisted with overclocking, using too much voltage, a cheap motherboard, poor cooling, etc.

    As for the electricity debate, I don't think you will see that much of a difference in electricity costs using an 84 W CPU versus a 125 W CPU, primarily because those are maximum values under 100% load and no one runs their CPU at 100% load 24/7; unless, of course, you're somehow making money with your PC, in which case the cost of electricity is part of doing business.

    If the AMD "8 cores" cost as much as their Intel competition, then obviously the Intel CPU would be the better value, but at the current price structure AMD looks mighty good.
  17. @jman98
    As usual, you've made a very interesting point, and I can certainly relate to it (my feeling is about the same, sticking with a PC while many friends and colleagues around me are running towards Macs because "they are better for anything creative", or buying DSLRs because with them you become an instant Steven Spielberg), but it's more of a political or ideological point; I'd like to read more technical reasons.
    @sophisticles
    Your experience is very interesting and doesn't even have to do with side-by-side comparisons (I mean, benchmark-like).
    @MaverickTse
    You mean AMD CPUs are unreliable? Do they fail a lot? Are they short-lived? Would overclocking shorten their lives or make them more failure-prone, or does that apply even to stock ones?
    @nic2k4
    If I choose to build either an Intel or an AMD computer, I'd use a reasonably good motherboard (I've had good experiences with Gigabyte, so that would be the brand); would an AMD still fail more or crap out sooner?
    Thanks all for your replies, this is getting interesting (still not easier to decide).
    @poisondeathray
    While it's certainly an important consideration, I mentioned already that long-term energy savings weren't that much of an issue (unless higher voltages and/or temperatures have an adverse impact on the chip's lifespan or performance).
  18. Originally Posted by newpball:
    Your suggestion that motherboard manufacturers use inferior electronics for AMD is based on what facts?
    I said there are different levels of board quality and that some manufacturers can put out AMD boards at an incredibly low price. To do that they must save costs somewhere. Some sourced lower-cost equivalent parts; an example is the capacitor plague, when at one point using Japanese capacitors was a selling feature.

    More recently it's the RoHS solder BS that caused most of the problems. There were many different solder formulations; everyone was looking to bring the price closer to 60/40 lead solder, but some formulations had bad side effects and the solder connections would crack sooner.
  19. @julitomg
    My unlucky experience tells me that AMD/ATI's desktop products rarely last 5 years, and often fail just after the warranty expires.
    With Intel, ALL my CPUs have worked until (and past) the point where I made a major PC upgrade. I have also seen OLD Intel rigs in university labs that have been running for 10+ years.

    I never use motherboards as cheap as Biostar; I always choose from ASUS, Gigabyte, Intel and ASRock.
    I seldom overclock, and even less so with AMD/ATI products, as they're famously HOT (many, many years ago AMD made a record-breakingly hot CPU that you could practically fry an egg on; I once owned one of those). That ridiculous situation has improved, but they're generally still hotter than Intel's chips.

    I have to add that my machines run in a relatively hot, humid and dusty place, so other users in a more machine-friendly environment probably have a different experience.
    Stopping development until someone save me from poverty or get me out of Hong Kong...
    Twitter @MaverickTse
  20. Originally Posted by julitomg:
    @poisondeathray
    While it's certainly an important consideration, I mentioned already that long-term energy savings weren't that much of an issue (unless higher voltages and/or temperatures have an adverse impact on the chip's lifespan or performance).
    Yes, voltage especially, and high temperatures, kill chips. But if you're not overclocking it's nothing to worry about, because you will be running at stock volts (some people even undervolt). And if you have "adequate" cooling, temps are not a problem by definition.
  21. newpball (Banned)
    Originally Posted by nic2k4:
    Originally Posted by newpball:
    Your suggestion that motherboard manufacturers use inferior electronics for AMD is based on what facts?
    I said there are different levels of board quality and some manufacturers can put out AMD boards at an incredibly low price.
    That is not exclusive to boards for AMD CPUs.

    You can find sub-$50 1150 boards for Intel CPUs as well.
  22. Originally Posted by julitomg:
    ... would an AMD still fail more or crap out sooner?
    I have never seen a CPU die other than from some boneheaded action like mounting a board to the back plate without spacers (everything got fried), or installing the heatsink backwards with the base riding on the socket, not in contact with the CPU. Usually it's the motherboard chipset that goes, and sometimes the BIOS chip (bad flash, user error).

    Sophisticles makes a good point about the power supply; it's often overlooked and can cause many problems, even damage. You have to remember that it powers not only the board and CPU, but also the memory, drives and video card. Find out the current requirement of all the components, add it up for each voltage rail, and get a PSU with breathing room on each rail.
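    As a rough illustration of that PSU advice, here is a small sketch that sums the current draw per rail and adds headroom; all the component draws are made-up example values, so substitute the figures from your own parts' spec sheets.

```python
# Rough PSU sizing sketch: sum the current draw on each rail and leave
# breathing room. Every figure below is an invented example, not a spec.
HEADROOM = 1.3  # ~30% margin per rail (arbitrary but common rule of thumb)

# amps drawn per rail, one entry per component (example values only)
draws = {
    "+12V":  [9.0, 12.0, 1.5, 2.0],   # CPU, video card, fans, drives
    "+5V":   [2.0, 1.5],              # drives, USB devices
    "+3.3V": [2.5],                   # board logic
}

for rail, amps in draws.items():
    required = sum(amps) * HEADROOM
    print(f"{rail}: look for at least {required:.1f} A on this rail")
```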
  23. Originally Posted by sophisticles:
    As for the electricity debate, I don't think you will see that much of a difference in electricity costs using an 84 W CPU versus a 125 W CPU, primarily because those are maximum values under 100% load and no one runs their CPU at 100% load 24/7; unless, of course, you're somehow making money with your PC, in which case the cost of electricity is part of doing business.
    Those are actually manufacturer TDP values, not actual power consumption.

    39.8 W delta at idle, 82 W delta under load:

    Haswell i7-4770K: 113.2 W load, 34.4 W idle
    http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/2

    FX-8350: 195.2 W load, 74.2 W idle
    http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6

    If you just have it on 12 hours a day at idle, it ends up costing about $33-34 more per year, calculated with Mexico's average electricity price. Twelve hours a day under load would be about $69-70 more per year (in USD).
    http://en.wikipedia.org/wiki/Electricity_pricing#Global_electricity_price_comparison
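    As a sanity check on those dollar figures, here is a tiny sketch of the arithmetic, assuming a flat price of about $0.19/kWh (roughly what the quoted yearly amounts imply for the Mexican average; plug in your own rate):

```python
# Back-of-the-envelope check of the figures above.
PRICE_PER_KWH = 0.19   # USD per kWh, assumed
HOURS_PER_DAY = 12
DAYS_PER_YEAR = 365

def extra_cost_per_year(delta_watts: float) -> float:
    """Yearly cost of the extra power draw, in USD."""
    kwh = delta_watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_PER_KWH

print(f"idle delta 39.8 W: about ${extra_cost_per_year(39.8):.0f}/year extra")  # ~$33
print(f"load delta 82.0 W: about ${extra_cost_per_year(82.0):.0f}/year extra")  # ~$68, close to the $69-70 quoted
```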
  24. jman98 (Banned)
    One thing I forgot to mention that is important to remember about AMD CPUs. I built my most recent PC almost 3 years ago, so it's possible that this is no longer true, but I suspect it still is. At the time I built my PC, AMD did NOT allow 3rd-party cooling fans to be used with their CPUs: it voided their warranty to use anything but the fan that comes with your CPU. This may not be a problem for some people, but it's worth considering.

    For the record, I do NOT ever overclock my systems. Those who insist on doing this should understand that this does have an effect on longevity and I can't offer any personal experience on this, but I've never had AMD systems just die on me. They've gotten old and I've replaced them, but I've never had them just die or fry themselves or anything like that.
  25. Originally Posted by poisondeathray:
    39.8w delta idle, 82w delta under load

    Haswell i7-4770k Load 113.2 Idle 34.4
    http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/2

    FX8350 Load 195.2 Idle 74.2
    http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6

    If you just have it on 12hrs a day at idle, it ends up about $33-34 more /year if you calculate with Mexico average electricity price. 12hrs a day under load would be about $69-70 more /year. (in USD)
    http://en.wikipedia.org/wiki/Electricity_pricing#Global_electricity_price_comparison
    First of all, I don't keep my CPU under 100% load for 12 hours a day; I usually work a 12-14 hour day and my PC is off while I'm gone. It's only under maximum load for the few encodes I do once in a while.

    Be that as it may, even if it were on max load for 12 hours a day, 7 days a week, I don't mean to come off as snarky, but the $70-a-year difference is not going to break the bank. More importantly, in my case, since I only paid $120 for the CPU and $5 for the motherboard, I have already saved enough to cover 2 years' worth of electricity.

    In about a year, when the new Intels and AMDs come out, chances are I will be upgrading to those new architectures, so for me electricity costs are a non-issue.

    Now, maybe in a business setting where you have thousands of employees with thousands of computers, then yes, it would be a consideration. But even then we start getting into all sorts of economic realities, such as the fact that a company that size would probably be making hundreds of millions of dollars per year, so initial acquisition costs may be more important than electricity costs.
  26. Originally Posted by jman98:
    One thing I forgot to mention that is important to remember about AMD CPUs. I built my most recent PC almost 3 years ago, so it's possible that this is no longer true, but I suspect it still is. At the time I built my PC, AMD did NOT allow 3rd-party cooling fans to be used with their CPUs: it voided their warranty to use anything but the fan that comes with your CPU. This may not be a problem for some people, but it's worth considering.
    I'm pretty sure the same holds true for Intel processors as well, at least the retail ones that come with a cooler. Haswell-E does not come with a cooler, nor do the tray/OEM processors, so obviously they expect you to buy a third-party cooler.
  27. Member n8tvm
    I have used AMD for years, and a few months ago I went all out and got an ASRock X99 Extreme4 and an i7-5960X. Encoding is a lot faster with this than with the FX-8350 I had before. As for quality and longevity, my experience has been that most parts become obsolete long before they go bad.
  28. Originally Posted by sophisticles:

    First of all I don't keep my cpu under 100% load for 12 hours a day, I usually work a 12-14 hour day and my pc is off when I'm gone, it's only under maximum load for the few encodes I do once in a while.

    Be that as it may, even if it was on max load for 12 hours a day, 7 days a week, I don't mean to come off as snarky but the $70 a year difference is not going to break the bank and more importantly in my case since I only paid $120 for the cpu and $5 for the motherboard, I have already saved enough to cover 2 years worth of electricity.

    In about a year when the new Intel's and AMD's come out chances are I will be upgrading to those new architectures so for me electricity costs are a non issue.

    Now maybe in a business setting where you have thousands of employees, with thousands of computers, then yes, it would be a consideration but even then we start getting into all sorts of economic realities, such as the fact that a company that size would probably be making hundreds of millions of dollars per year, so initial acquisition costs may be more important than electricity costs.

    Yes, for many people it's a non-issue. As mentioned earlier, some people don't even pay for electricity.

    Many people who cite how much less AMD costs fail to realize what TDP is or what it really costs to run a computer. Even idling or surfing, it adds up. Depending on your situation, it can end up being more expensive over a few years than your initial cost savings from choosing AMD, even if you factor in the time value of money (interest).

    Since you brought up the business side: you're not going to be very successful in business if you don't factor in these things, i.e. total cost of ownership. AMD's "bread and butter" server business with the classic Opteron that they milked in the past is basically gone. Many operations and tasks are done significantly faster by Intel CPUs, so they spend a higher percentage of time in lower-power idle states. Not to mention the additional cooling and AC costs for the hotter, power-hungry processors; cooling adds 30-50% additional electricity in most server settings. The cost savings are enormous with Intel. Unless you intend to run the business for less than a year, it's basically not feasible to run a server farm with AMD anymore.
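    To make the total-cost-of-ownership argument concrete, here is a purely illustrative sketch (every number is a made-up placeholder, not data from any real deployment): it compares a cheaper, hotter box against a pricier, cooler one over a few years, applying a cooling-overhead factor to the electricity as described above.

```python
# Illustrative TCO sketch only; all figures are invented placeholders.
PRICE_PER_KWH = 0.12      # USD, assumed commercial rate
COOLING_OVERHEAD = 1.4    # 30-50% extra electricity for cooling, per the post
HOURS_PER_YEAR = 24 * 365

def tco(purchase_usd: float, avg_watts: float, years: int) -> float:
    """Purchase price plus electricity (including cooling) over the period."""
    kwh = avg_watts / 1000 * HOURS_PER_YEAR * years * COOLING_OVERHEAD
    return purchase_usd + kwh * PRICE_PER_KWH

for years in (1, 3, 5):
    cheap_hot   = tco(purchase_usd=600.0, avg_watts=180.0, years=years)
    cool_pricey = tco(purchase_usd=900.0, avg_watts=110.0, years=years)
    print(f"{years} yr: cheaper/hotter ${cheap_hot:.0f} vs pricier/cooler ${cool_pricey:.0f}")
```

    With these particular placeholder numbers the lower purchase price wins for the first couple of years and the lower power draw wins after that, which is the crossover the TCO argument is about.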
  29. Originally Posted by nic2k4:
    I fix these things at the lowest level, replace ICs, reflow BGA (I would love to have the tools to reball)... I can tell you I see just as many Intel systems (especially big-name ones) as AMDs.
    Since Intel sells several times as many CPUs, seeing "just as many" Intel failures may indicate a several-times-higher failure rate for AMD.
  30. Member bendixG15
    Originally Posted by julitomg:
    Despite its title, this thread doesn't intend to provoke a fanboy war. ...................
    Need I write what it did provoke????