So why did Intel and AMD stop advertising their CPU speeds along with their chip names? I mean, they used to be "the Intel Pentium 4 2GHz processor". Now they're always doing the damned "Intel Pentium D342". I hate having to dig through tons of specs just to find the SPEED of the CPU!
I hated it when AMD started doing the 2300+ type designations for their processors. It's like they're making it harder for consumers to do direct comparisons.
It's just one of those annoyances I don't like in the ad world. Like those smaller PC game boxes. They're too damn small!!! I liked it when they had stylized boxes too, like some shaped like pyramids or whatever. I know it was a waste of cardboard, but the artwork was more interesting when you had a larger canvas to use. Now they're so small you can't easily pick out which game is which just strolling by the shelves...
Anyway, anyone else dislike this change of terminology??
-
Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
-
Speed didn't equate to performance.
"Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650) -
It's because, as BJ_M says, speed wasn't directly linked to the frequency of the chip. People would be deciding between a 2GHz P4 chip or a 2GHz Celeron, and go for the much cheaper one - the Celeron - believing that it had equivalent performance.
The numerical system of ranking CPUs was brought in to counteract any confusion, but I still don't understand it fully. -
But it's still annoying. I mean, raw power is still important in purchasing a computer. I know RAM and peripherals play a major part, but the guts of the computer are where it all starts.
-
A decent motherboard, too - that's vital.
-
Hmm, not really -- 400MHz and 800MHz SGI 128-CPU systems have plenty of power (even going back to Alpha chips at 200MHz) -- even 4-CPU systems...
But when you read 400MHz, you think: crap. -
If it ticks over at 400MHz but does enough more work per cycle to make it quicker than a 2400MHz CPU, then it's faster. Maybe that's the idea the companies need to help their customers understand.
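The point above can be sketched in a few lines: throughput is roughly clock times work-per-cycle, so a low-clock chip can beat a high-clock one. The IPC figures here are invented for illustration, not real chip data:

```python
# Toy comparison: effective throughput = clock (MHz) x instructions per cycle (IPC).
# The IPC numbers below are made up purely for illustration.
def throughput_mips(clock_mhz, ipc):
    """Rough millions-of-instructions-per-second estimate."""
    return clock_mhz * ipc

wide_chip = throughput_mips(400, 8.0)     # low clock, lots of work per cycle
narrow_chip = throughput_mips(2400, 1.0)  # high clock, little work per cycle

print(wide_chip, narrow_chip)  # 3200.0 2400.0 -- the 400MHz part comes out ahead
```

Which is exactly why a raw MHz number on the box tells you so little on its own.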
-
I think also, between dual core and 64-bit, the speed factor mattered even less...
HT (I think you mentioned this before) seems to be a bust... -
It comes down to what your application is, and then benchmarking the hardware contenders for your chosen software.
Games are one narrow world; other applications require analysis.
Recommends: Kiva.org - Loans that change lives.
http://www.kiva.org/about -
On the consumer end, I think it was the Opterons and the Athlon FX chips that first did away with the clock speed naming convention. However, my chips are still named by their clock speed. Mmm, 3.6 Irwindales
FB-DIMM are the real cause of global warming -
Originally Posted by yoda313
This is my opinion, but it's related to various articles I have read over the years...
AMD, since they went to processing two commands per cycle (Intel processes one command per cycle), couldn't keep up with the speed increases Intel was getting, so they emphasize "equivalent"... smile... speed.
Now Intel lately has been pushing their dual core chips, plus it's getting harder for them to increase the raw MHz speed!!!! So nowadays they're emphasizing dual core, especially with Vista coming out!
You're not going to see too many big increases in raw speed; there are limiting factors involved: heat and physics!! Myself, I think they will eventually have to go to an entirely different technology for making chips to really boost speed.
With present technology I think the max is about 5-7GHz!!
Myself, I generally like to know the raw speed; many of my programs depend on raw power, not "equivalent"... smile... speed. -
Originally Posted by yoda313
Believing yourself to be secure only takes one cracker to dispel your belief.
-
Like what was mentioned, raw processing speed is only one of the factors. Other things to be considered are the following: is it single core or dual core, what's the FSB speed, is it a 32 or 64-bit processor, how overclockable is it (if you intend to do that)... and those are only a few of the considerations needed to determine which processor is more "powerful".
For example, I just invested a few bucks into an Asus motherboard and an AMD 3700+ 64-bit processor. Just looking at raw GHz speed, it looks comparable to a P4 that's like 3 or 4 years old. Once other things come into play, you can tell that it's as good, if not better, than most MODERN P4's that are around these days.
As was also mentioned, there are some limits to the current standards. They will either have to start making the processors and heatsinks bigger, or have to invent a new form of processor altogether. At the moment, it seems the computer companies are at a "plateau" of sorts; the current technology doesn't seem like it can go very much farther than where it is.
-
Both Intel and AMD are hitting the wall in bumping up the clock. 4GHz is probably the max they can go. Clock is the key determinant of PC speed; overclocking is one example. OC freaks will bump the clock speed by putting the CPU into a deep freeze just to get a little higher performance.
Apple used to argue clock speed doesn't matter when they were in the PowerPC camp. Now they're using Intel.
However, a fast CPU must be supported with a good motherboard, RAM and video card. -
Originally Posted by yoda313
-
I always suspect the marketing department in issues like this. My last computer, bought 5 years ago, was a PIV 1.7 GHz. The fastest there was at that time was 2 GHz. Three years later, I upgraded to a 3.2 GHz - almost twice the clock frequency (but still not the fastest at the time). So now, yet 2 more years later, we've crawled up to 3.4 GHz "second fastest", with 3.6 as the speed demon.
I'd suspect that if I upgrade in one year again (following my usual 3-year computer life cycle), it'll be a 3.6 GHz CPU. Going from an 88% increase in years 1-3 to 12% in years 4-6. Of course marketing panics! 12% used to be the quarterly CPU frequency increase, and people started to feel outdated in a year. Now, suddenly, you can keep your CPU 7 times longer based on clock frequency alone. No surprise the MHz has vanished from the list...
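The growth figures above check out; a quick sketch using the clock speeds from the post:

```python
# Sanity-checking the clock-speed growth figures quoted above.
def pct_increase(old_ghz, new_ghz):
    """Percentage increase from the old clock frequency to the new one."""
    return (new_ghz - old_ghz) / old_ghz * 100

print(round(pct_increase(1.7, 3.2)))  # 88 -- years 1-3 of the upgrade cycle
print(round(pct_increase(3.2, 3.6)))  # 12 -- years 4-6
```

An 88% jump versus a 12% jump over the same-length window is the slowdown that made the raw MHz number stop being a selling point.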
/Mats -
Originally Posted by mats.hogberg
The real bottleneck these days is the video card.
It's sad when you have to upgrade your video card faster than your CPU.
- John "FulciLives" Coleman
"The eyes are the first thing that you have to destroy ... because they have seen too many bad things" - Lucio Fulci
EXPLORE THE FILMS OF LUCIO FULCI - THE MAESTRO OF GORE
-
True. Now you want a new GFX card each year instead of a new CPU - if you use your computer for gaming, which fortunately I don't!
Hint to Intel, AMD et al: Start making serious graphics processors!
/Mats -
Originally Posted by mats.hogberg
-
I can't imagine ever doing this again, but I have to admit that back in the day I did have two of those Voodoo2 cards in SLI.
I mostly did that so I could play QUAKE with optimal settings.
-
Originally Posted by mats.hogberg
-
I recently did a hardware upgrade.
My old system was a P4 2GHz from the time when 2.4GHz was the top.
I now have a 4400+ dual core running at 2.2GHz.
My new AMD system blows the old P4 out of the water. Absolutely no comparison, even when using one core.
Heat is definitely a factor limiting faster clock speeds. Since the upgrade, our power bill went up, and if my wife knew why, she would probably unplug my computer when I'm not near it! -
Originally Posted by nebula8
-
Heh, I actually just completed my computer upgrades a few days ago, and I'm pretty sure my energy bill will be going DOWN (I used to run an AMD Athlon 2400... check my current computer details to see exactly what I'm running now). It has the Cool'n'Quiet technology built in, so it's not chewing up massive amounts of power when it's idle... and it's a LOT faster too, even though I only jumped from 2GHz to 2.4GHz.
-
Your speed increase is more than likely caused by an increase in efficiency in the way the chip handles information. Your chip's interaction with external parts (i.e. FSB) has probably also received a significant boost.
-
Originally Posted by whitejremiah
2. What is the cost of your MB/CPU ?
3. What is the performance difference between your Hitachi HD vs your WD HD? -
Originally Posted by pchan
Then the AMD vs. Intel battle heated up, and AMD managed to shift the argument by naming by performance instead of speed. 2200+ anyone? Intel, who still makes fine CPUs, began to realize the problems with just pushing faster and faster, and themselves developed some great CPUs that use less power and are good and fast. The bad side? Fewer megahertz. So the final step was taken, and just as Apple said long ago, MHz is almost meaningless now, so they all ignore it.
So really... Apple was right and ahead of the curve on that one. As usual :P
Of course this will make Sys Requirements for software INSANE in a few years. "You must have any of the following series of CPUs..." 12 pages later: "and 512MB of RAM" -
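The rating scheme the posts complain about can be sketched as a simple lookup. The clock figures below are quoted from memory and only illustrative, so treat them as approximate:

```python
# Illustrative: AMD's "performance rating" model names vs. the actual clock speed.
# Clock figures are from memory -- approximate, not authoritative.
athlon_xp = {
    "2200+": 1.8,  # marketed as comparable to a 2.2GHz Pentium 4
    "2400+": 2.0,
}

for rating, actual_ghz in athlon_xp.items():
    implied_ghz = int(rating.rstrip("+")) / 1000  # the GHz figure the name suggests
    print(f"Athlon XP {rating}: runs at {actual_ghz}GHz, name implies {implied_ghz}GHz")
```

The gap between the implied number and the real clock is exactly the "equivalent performance" argument AMD was making.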
Another shift is that the speed of the slowest currently available desktop CPUs isn't a hell of a lot slower than the fastest CPUs, where in years gone by the slow CPUs were half the speed of the fast ones.
And despite all of the non speed enhancements, video encoding performance is mainly determined by CPU speed.
We are heading to a point where there aren't huge differences between desktops, if your primary performance need is video encoding.
Have a nice Day