VideoHelp Forum




  1. Member yoda313's Avatar
    Join Date: Jun 2004 | Location: The Animus
    So why did Intel and AMD stop advertising their CPU speeds along with their chip names? They used to be "the Intel Pentium 4 2GHz processor"; now it's always the damned "Intel Pentium D342". I hate having to dig through tons of specs just to find the SPEED of the CPU!

    I hated it when AMD started using the 2300+ type designations for their processors. It's like they're making it harder for consumers to do direct comparisons.

    It's just one of those annoyances in the ad world, like those smaller PC game boxes. They're too damn small! I liked it when they had stylized boxes too, some shaped like pyramids or whatever. I know it was a waste of cardboard, but the artwork was more interesting when there was a larger canvas to use. Now they're so small you can't easily pick out which game is which just strolling by the shelves...

    Anyway, does anyone else dislike this change of terminology?
    Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
  2. Member BJ_M's Avatar
    Join Date: Jul 2002 | Location: Canada
    speed didn't equate to performance
    "Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)
  3. It's because, as BJ_M says, performance wasn't directly linked to the frequency of the chip. People would be deciding between a 2GHz P4 chip and a 2GHz Celeron, and go for the much cheaper one - the Celeron - believing that it had equivalent performance.

    The numerical system of ranking CPUs was brought in to counteract that confusion, but I still don't understand it fully.
  4. Member yoda313's Avatar
    Join Date: Jun 2004 | Location: The Animus
    But it's still annoying. I mean, raw power is still important when purchasing a computer. I know RAM and peripherals play a major part, but the guts of the computer are where it all starts.
    Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
  5. Member BJ_M's Avatar
    Join Date: Jul 2002 | Location: Canada
    hmm, not really -- 400MHz and 800MHz SGI 128-CPU systems have plenty of power (even going back to Alpha chips at 200MHz) -- even 4-CPU systems...

    but when you read "400MHz" you think: crap
    "Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)
  6. If it ticks over at 400MHz but does enough more work per cycle to make it quicker than a 2400MHz CPU, then it's faster. Maybe that's the idea the companies need to help their customers understand.
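    To put that point in concrete terms: effective throughput is roughly clock frequency times instructions retired per cycle (IPC). A minimal sketch of that naive model; the IPC figures here are invented purely for illustration, not taken from any real chip:

```python
# Rough model: throughput ~ clock frequency x instructions per cycle (IPC).
# The IPC numbers below are made up for illustration; real chips vary
# enormously with workload, stalls and memory behaviour.
def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Millions of instructions retired per second under the naive model."""
    return clock_mhz * ipc

wide_slow = throughput_mips(400, 8)     # hypothetical wide 400 MHz design
narrow_fast = throughput_mips(2400, 1)  # hypothetical narrow 2.4 GHz design
print(wide_slow, narrow_fast)           # the 400 MHz design comes out ahead
```

    By this crude measure the "slow" 400MHz design beats the 2400MHz one, which is exactly why a raw MHz number can mislead.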
  7. Member BJ_M's Avatar
    Join Date: Jul 2002 | Location: Canada
    i think also, between dual core and 64-bit, the speed factor mattered even less...

    HT (I think you mentioned this before) seems to be a bust...
    "Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)
  8. Member edDV's Avatar
    Join Date: Mar 2004 | Location: Northern California, USA
    It comes down to what your application is, and then benchmarking the hardware contenders for your chosen software.

    Games are one narrow world; other applications require analysis.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  9. contrarian rallynavvie's Avatar
    Join Date: Sep 2002 | Location: Minnesotan in Texas
    On the consumer end I think it was the Opterons and the Athlon FX chips that first did away with the clock-speed naming convention. However, my chips are still named by their clock speed. Mmm, 3.6 Irwindales.
    FB-DIMM are the real cause of global warming
  10. Member
    Join Date: Dec 2005 | Location: United States
    Originally Posted by yoda313
    So why did Intel and AMD stop advertising their CPU speeds along with their chip names? They used to be "the Intel Pentium 4 2GHz processor"; now it's always the damned "Intel Pentium D342". I hate having to dig through tons of specs just to find the SPEED of the CPU!

    I hated it when AMD started using the 2300+ type designations for their processors. It's like they're making it harder for consumers to do direct comparisons.

    It's just one of those annoyances in the ad world, like those smaller PC game boxes. They're too damn small! I liked it when they had stylized boxes too, some shaped like pyramids or whatever. I know it was a waste of cardboard, but the artwork was more interesting when there was a larger canvas to use. Now they're so small you can't easily pick out which game is which just strolling by the shelves...

    Anyway, does anyone else dislike this change of terminology?
    Hi,
    this is my opinion, but it's related to various articles I have read over the years...
    Since AMD went to processing two commands per cycle (Intel processes one command per cycle), AMD couldn't keep up with the speed increases Intel was getting, so they emphasize "equivalent" speed.
    Now Intel has lately been pushing their dual-core chips. Plus, it's getting harder for them to increase the raw MHz, so nowadays they're emphasizing dual core, especially with Vista coming out!

    You're not going to see big increases in raw speed; there are limiting factors involved: heat and physics! Myself, I think they will eventually have to go to an entirely different technology for making chips to really boost speed.
    With present technology I think the max is about 5-7GHz!
    Myself, I generally like to know the raw speed; many of my programs depend on raw power, not "equivalent".
  11. Originally Posted by yoda313
    But it's still annoying. I mean, raw power is still important when purchasing a computer. I know RAM and peripherals play a major part, but the guts of the computer are where it all starts.
    CPU speed is not the only measuring stick when it comes to computers. The companies fell into a marketing trap by quoting processor speed; this is why the change came about.
    Believing yourself to be secure only takes one cracker to dispel your belief.
  12. Like what was mentioned, raw processing speed is only one of the factors. Other things to be considered include: is it single core or dual core, what's the FSB speed, is it a 32- or 64-bit processor, and how overclockable is it (if you intend to do that). And those are only a few of the considerations needed to determine which processor is more "powerful". For example, I just invested a few bucks in an Asus motherboard and an AMD 3700+ 64-bit processor. Just looking at raw GHz it looks comparable to a P4 that's three or four years old, but once other things come into play you can tell it's as good as, if not better than, most MODERN P4s around these days. As was also mentioned, there are limits to the current standards: they will either have to start making the processors and heatsinks bigger, or invent a new form of processor altogether. At the moment it seems the computer companies are at a "plateau" of sorts; the current technology doesn't seem like it can go much farther than where it is.
  13. Member pchan's Avatar
    Join Date: Mar 2003 | Location: Singapore
    Both Intel and AMD are hitting a wall in bumping up the clock; 4GHz is probably the max they can go. Clock is the key determinant of PC speed - overclocking is one example: OC freaks will bump the clock speed by putting the CPU into a deep freeze just to get a little higher performance.

    Apple used to argue clock speed doesn't matter when they were in the PowerPC camp. Now they are using Intel.

    However, a fast CPU must be supported by a good motherboard, RAM and video card.
  14. Member Super Warrior's Avatar
    Join Date: Jun 2002 | Location: United States
    Originally Posted by yoda313
    So why did Intel and AMD stop advertising their CPU speeds along with their chip names? They used to be "the Intel Pentium 4 2GHz processor"; now it's always the damned "Intel Pentium D342". I hate having to dig through tons of specs just to find the SPEED of the CPU!

    I hated it when AMD started using the 2300+ type designations for their processors. It's like they're making it harder for consumers to do direct comparisons.

    It's just one of those annoyances in the ad world, like those smaller PC game boxes. They're too damn small! I liked it when they had stylized boxes too, some shaped like pyramids or whatever. I know it was a waste of cardboard, but the artwork was more interesting when there was a larger canvas to use. Now they're so small you can't easily pick out which game is which just strolling by the shelves...

    Anyway, does anyone else dislike this change of terminology?
    Does it really matter? Anyone who knows a bit about CPUs and is seriously looking to buy or upgrade to a new processor will take the extra time to look up its actual speed before getting it.
  15. Member mats.hogberg's Avatar
    Join Date: Jul 2002 | Location: Sweden (PAL)
    I always suspect the marketing department in issues like this. My last computer, bought 5 years ago, was a PIV 1.7GHz; the fastest there was at that time was 2GHz. Three years later I upgraded to a 3.2GHz, almost twice the clock frequency (but still not the fastest at the time). Now, yet two more years later, we've crawled up to 3.4GHz as "second fastest", with 3.6 as the speed demon.
    I'd suspect that if I upgrade in one year again (following my usual 3-year computer life cycle) it'll be a 3.6GHz CPU. That's going from an 88% increase in years 1-3 to 12% in years 4-6. Of course marketing panics! 12% used to be the quarterly CPU frequency increase, and people started to feel outdated within a year. Now, suddenly, you can keep your CPU 7 times longer based on clock frequency alone. No surprise the MHz has vanished from the list...

    /Mats
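    Mats's growth figures are easy to sanity-check; a quick sketch using the GHz numbers from his post:

```python
# Percentage clock-speed increase between the upgrade points quoted above.
def pct_increase(old_ghz: float, new_ghz: float) -> float:
    return (new_ghz - old_ghz) / old_ghz * 100

years_1_to_3 = pct_increase(1.7, 3.2)  # roughly 88%
years_4_to_6 = pct_increase(3.2, 3.6)  # 12.5%, i.e. the "12%" in the post
print(round(years_1_to_3), round(years_4_to_6))
```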
  16. Member FulciLives's Avatar
    Join Date: May 2003 | Location: Pittsburgh, PA in the USA
    Originally Posted by mats.hogberg
    Going from an 88% increase in years 1-3 to 12% in years 4-6. Of course marketing panics! 12% used to be the quarterly CPU frequency increase, and people started to feel outdated within a year. Now, suddenly, you can keep your CPU 7 times longer based on clock frequency alone. No surprise the MHz has vanished from the list...

    /Mats
    I get what you are saying. In the gaming world my P4 540 3.2GHz is more than fast enough, and I've had it for 15 months now.

    The real bottleneck these days is the video card.

    It's sad when you have to upgrade your video card more often than your CPU.

    - John "FulciLives" Coleman
    "The eyes are the first thing that you have to destroy ... because they have seen too many bad things" - Lucio Fulci
    EXPLORE THE FILMS OF LUCIO FULCI - THE MAESTRO OF GORE
  17. Member mats.hogberg's Avatar
    Join Date: Jul 2002 | Location: Sweden (PAL)
    True. Now you want a new GFX card each year instead of a new CPU. That is, if you use your computer for gaming, which fortunately I don't! Hint to Intel, AMD et al.: start making serious graphics processors!

    /Mats
  18. contrarian rallynavvie's Avatar
    Join Date: Sep 2002 | Location: Minnesotan in Texas
    Originally Posted by mats.hogberg
    True. Now you want a new GFX card each year instead of a new CPU.
    Not just one: two or more video cards each year, because of SLI.
    FB-DIMM are the real cause of global warming
  19. Member Treebeard's Avatar
    Join Date: Aug 2002 | Location: 127.0.0.1
    Originally Posted by rallynavvie
    Originally Posted by mats.hogberg
    True. Now you want a new GFX card each year instead of a new CPU.
    Not just one: two or more video cards each year, because of SLI.
    Let's not forget the quad SLIs now.
  20. Member FulciLives's Avatar
    Join Date: May 2003 | Location: Pittsburgh, PA in the USA
    I can't imagine ever doing this again, but I have to admit that back in the day I did have two of those Voodoo2 cards in SLI.

    I mostly did that so I could play QUAKE with optimal settings.

    - John "FulciLives" Coleman
    "The eyes are the first thing that you have to destroy ... because they have seen too many bad things" - Lucio Fulci
    EXPLORE THE FILMS OF LUCIO FULCI - THE MAESTRO OF GORE
  21. Member rkr1958's Avatar
    Join Date: Feb 2002 | Location: Huntsville, AL, USA
    Originally Posted by mats.hogberg
    Now, suddenly you can keep your CPU 7 times longer based on clock fq alone. No surprise the MHz has vanished from the list...
    /Mats
    So true. I'm running a 2.6GHz P4 in a computer that I built 2 1/2 years ago; at the time the top of the line was 3GHz. Psychologically, I still feel that this CPU is cutting edge based on clock speed. But 2 1/2 years prior to when I built my computer (5 years ago), the upper edge was around 1.3GHz.
  22. Member nebula8's Avatar
    Join Date: Feb 2005 | Location: United States
    I recently did a hardware upgrade.
    My old system was a P4 2GHz from the time when 2.4GHz was the top.
    I now have a 4400+ dual core running at 2.2GHz.
    My new AMD system blows the old P4 out of the water. Absolutely no comparison, even when using one core.

    Heat is definitely a factor limiting faster clock speeds. Since the upgrade our power bill has gone up, and if my wife knew why, she would probably unplug my computer when I'm not near it!
  23. contrarian rallynavvie's Avatar
    Join Date: Sep 2002 | Location: Minnesotan in Texas
    Originally Posted by nebula8
    My old system was a P4 2GHz from the time when 2.4GHz was the top.
    I now have a 4400+ dual core running at 2.2GHz.
    My new AMD system blows the old P4 out of the water. Absolutely no comparison, even when using one core.
    I should hope so; those two chips really aren't in the same class. You can't really use their physical clock speeds for comparison.
    FB-DIMM are the real cause of global warming
  24. Heh, I actually just completed my computer upgrades a few days ago, and I'm pretty sure my energy bill will be going DOWN (I used to run an AMD Athlon 2400; check my current computer details to see exactly what I'm running now). It has the Cool'n'Quiet technology built in, so it's not chewing up massive amounts of power when it's idle. And it's a LOT faster too, even though I only jumped from 2GHz to 2.4GHz.
  25. Banned
    Join Date: Feb 2005 | Location: USA
    Your speed increase is more than likely caused by an increase in efficiency in the way the chip handles information. Your chip's interaction with external parts (i.e. the FSB) has probably also received a significant boost.
  26. Originally Posted by whitejremiah
    Heh, I actually just completed my computer upgrades a few days ago, and I'm pretty sure my energy bill will be going DOWN...
    1. What type of MB have you got?

    2. What was the cost of your MB/CPU?

    3. What is the performance difference between your Hitachi HD and your WD HD?
  27. Member Faustus's Avatar
    Join Date: Apr 2002 | Location: Dallas, TX
    Originally Posted by pchan
    Apple used to argue clock speed doesn't matter when they were in the PowerPC camp. Now they are using Intel.
    Which isn't clocked that much faster, really. I mean it's faster, yeah, but who cares? Apple realized a long while back, because their platform used an entirely different type of CPU, that pure MHz was a horrible way to measure.

    Then the AMD vs. Intel battle heated up, and AMD managed to shift the argument by naming chips by performance instead of speed. 2200+, anyone? Intel, who still make fine CPUs, began to realize the problems with just pushing clocks faster and faster, and developed some great CPUs of their own that use less power and are good and fast. The downside? Fewer megahertz. So the final step was taken: just as Apple said long ago, MHz is almost meaningless now, so they all ignore it.
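    For the curious, the Athlon XP "2200+"-style model numbers mentioned here reportedly followed a rough rule of thumb. This is an approximation that varied by core revision, shown purely as an illustration, not AMD's official formula:

```python
# Widely reported rule of thumb for Athlon XP "PR" model numbers
# (approximate only; it varied by core revision):
#   rating ~= 1.5 * clock_MHz - 500
def xp_rating(clock_mhz: float) -> int:
    return int(1.5 * clock_mhz - 500)

print(xp_rating(1800))  # a 1.8 GHz part carried the "2200+" label
```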

    So really... Apple was right and ahead of the curve on that one. As usual :P

    Of course this will make system requirements for software INSANE in a few years: "You must have any of the following series of CPUs..." and then, 12 pages later, "...and 512MB of RAM".
  28. Member mikesbytes's Avatar
    Join Date: Jun 2003 | Location: Sydney, Australia
    Another shift is that the slowest currently available desktop CPUs aren't a hell of a lot slower than the fastest CPUs, whereas in years gone by the slow CPUs were half the speed of the fast ones.

    And despite all the non-speed enhancements, video encoding performance is still mainly determined by CPU speed.

    We are heading to a point where there aren't huge differences between desktops, if your primary performance need is video encoding.
    Have a nice Day


