VideoHelp Forum
  1. aedipuss (aBigMeanie) · Joined Oct 2005 · Location: 666th portal
    the next gen intel chips will not have pins. they will need to be soldered to the motherboard. about the suckiest news i've heard in the computer world in years.

    http://semiaccurate.com/2012/11/26/intel-kills-off-the-desktop-pcs-go-with-it/
    --
    "a lot of people are better dead" - prisoner KSC2-303
  2. I wonder why there was no mention of AMD.
    Aren't there a lot of enthusiasts who build their own PCs with an AMD chip?

    I agree that this soldered intel cpu thing is gonna suck, and I find their decision to go down that path rather baffling.

    On the bright side, this will give AMD a great opportunity to gain market share and step up their game, provided they are able to stick to a single socket type for a long time.
  3. I can see why Intel is doing this: desktop sales are down considerably because most people are buying laptops and tablets. In fact, tablet sales have surpassed laptops.

    http://news.cnet.com/8301-1035_3-57553868-94/tablet-display-shipments-jump-top-laptops-in-october/
  4. aedipuss (aBigMeanie) · Joined Oct 2005 · Location: 666th portal
    amd is free to stay with the pin/socket type cpu.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  5. So Gigabyte et al. begin selling motherboards with the CPU already in place. I can't see them taking this without a fight. And as Joe The Dude mentioned, there's always AMD.
  6. jman98 (Banned) · Joined Oct 2004 · Location: Freedonia
    Originally Posted by MOVIEGEEK:
    I can see why Intel is doing this: desktop sales are down considerably because most people are buying laptops and tablets. In fact, tablet sales have surpassed laptops.

    http://news.cnet.com/8301-1035_3-57553868-94/tablet-display-shipments-jump-top-laptops-in-october/
    I'm not disputing what you said, but tablets are hardly replacements for PCs of any kind unless you have nothing more serious to do than send email and visit websites.

    Honestly, if this was the stupidest decision ever by Intel, it would still hardly affect their market share. They would just shrug off the losses and start making CPUs with pins again. Unfortunately, AMD doesn't have the luxury of doing whatever the heck they want without any regard for the consequences. And the enthusiast market isn't all that big to begin with, so Intel can probably afford to lose share in it to AMD if the move ends up having some kind of cost benefit. The article suggests that Intel is going to use this to squeeze out motherboard manufacturers, whom they view as having too much power.
  7. I didn't say tablets are replacements for desktops; I was merely pointing out that the enthusiast is a small minority today. The average buyer just needs a device to listen to music, check in to Facebook and watch Netflix.
  8. Member · Joined Aug 2006 · Location: United States
    Originally Posted by jman98:
    Originally Posted by MOVIEGEEK:
    I can see why Intel is doing this: desktop sales are down considerably because most people are buying laptops and tablets. In fact, tablet sales have surpassed laptops.

    http://news.cnet.com/8301-1035_3-57553868-94/tablet-display-shipments-jump-top-laptops-in-october/
    I'm not disputing what you said, but tablets are hardly replacements for PCs of any kind unless you have nothing more serious to do than send email and visit websites.
    The people you are describing represent the bulk of the home computer market. Those who do something more at home are in the minority.

    [Edit]Beaten again...
  9. The motherboard makers could easily collaborate and invent a standardized socket. They could also come out with socketed versions of the CPU by soldering the BGA part to a small PCB that mates with said standardized socket. Where there's a will, there's a way. Intel might not like that, though, since it adds a profit-seeking middleman between them and the end user. This decision will affect OEMs too: it's cheaper to have separate CPUs and motherboards for inventory reasons, since you only have to stock one board for multiple configurations.
  10. deadrats (Banned) · Joined Nov 2005 · Location: United States
    personally i think this is a good thing (just hear me out before ripping me a new one): if the cpu is built into the motherboard, it will result in faster communication between the cpu and the other chips built into the motherboard (such as lan, audio, sata) by reducing latencies, and it will also allow for faster memory access. it should also lower the overall cost of building a computer, as you won't have to buy a cpu and a motherboard separately; you will just buy one motherboard with the cpu built in.

    overall i think once this new way of doing things has matured for a year or two we'll look back on the days of having to buy the cpu separate from the motherboard much in the same way we look back on the days when the fpu was a separate chip called a co-processor and we had to spend an arm and a leg if we wanted a pc that could do floating point math.
  11. deadrats (Banned) · Joined Nov 2005 · Location: United States
    Originally Posted by Joe The Dude:
    I wonder why there was no mention of AMD.
    Aren't there a lot of enthusiasts who build their own PCs with an AMD chip?
    because amd is not long for this industry. i'm sure you've heard the rumors that amd is up for sale and that various groups are looking at buying it up. amd has also announced the discontinuation of any post-piledriver cpu; steamroller and excavator have been cancelled. i don't think amd will still be making mainstream cpus by the time this comes around.
  12. Noahtuck (Banned) · Joined Jun 2004 · Location: ®Inside My Avatar™© U.S.
    Originally Posted by aedipuss:
    the next gen intel chips will not have pins. they will need to be soldered to the motherboard. about the suckiest news i've heard in the computer world in years.

    http://semiaccurate.com/2012/11/26/intel-kills-off-the-desktop-pcs-go-with-it/
    A lot of older Intel CPU chips do NOT have pins; they just have contacts on the bottom of the CPU that rest against the little bent/spring-type contacts on the motherboard.

    Nothing new to not have pins on Intel chips.

    Now if they stopped making chips you could use at home, in desktops, on whatever mobo you want, they would be shooting themselves in the foot!!

    Yeah, even with the hordes of today's SHEEPLE who think they are cool because they have an iPad, tablet, or whatever you want to call the POS people buy, desktops and the people who want to build or order what they want, even if just a % of the consumer market, will still add up to millions at least worldwide, and that will be no chump change!!

    And that does not just include "DIY" people; that includes many, many more people!!

    I also like how the title of the article is, "Intel Kills Off the Desktop"
    LMFAO!!
    Like Intel is the supreme controller and what they do defines what people want, have or make!!

    Oh, Intel says it's so!!
    So it shall be!!!
    LOL!!

    Are we talking about Apple here ??
    LOL!!

    Did anyone see the last episode of the Simpsons where Homer won the latest "Mapple MyPad"?
    And "Steve Mobbs", where he says "The product you hold is like a giant expensive smart phone that can't call anyone, it's that incredible"? And it has the "SUBMIT" icon!!
    "Now press the SUBMIT icon and agree to buy all our future products, and we are going to be making a lot of stuff",
    "submit, submit, submit!!"
    LMAO!!!
    They so dogged the idiots that follow and are members of the Apple cult!!

    And most people who write articles, especially on the internet, are usually no better than the lemmings who write online reviews of everything from movies to pc hardware and software.
    Last edited by Noahtuck; 26th Nov 2012 at 21:50.
  13. intracube (Member) · Joined Oct 2010 · Location: England
    On the face of it, this doesn't look like a problem for a lot of users.

    I haven't ever upgraded a CPU while retaining the motherboard and RAM. By the time I think about doing so, the market has moved on. Socket standards change - RAM standards also.

    But this *is* going to be a problem when a motherboard needs replacing. Failure rates seem much higher in the last 10 years compared to the 1990s, and having to chuck out expensive CPUs along with the motherboard is a terrible idea.
  14. Originally Posted by intracube:
    having to chuck out expensive CPUs along with the motherboard is a terrible idea.
    Not for Intel!
  15. aedipuss (aBigMeanie) · Joined Oct 2005 · Location: 666th portal
    noahtuck - if you had read at least the first couple paragraphs of the article you would have seen that the new chips will HAVE to be soldered onto the motherboard. no more socket at all. only motherboard manufacturers will be allowed to buy a cpu from intel. there will be no retail cpu market at all; you will have to buy whatever cpu/motherboard combo the gigabyte/asus/intel/msi people decide to offer.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  16. Will this bring Intel great fortune, or a bullet hole in the foot? I suspect it is just one of those events, like the move from through-hole to surface-mount components, that hobbyists must endure.

    A quick survey of the local computers reveals two with soldered-in Via chips, while the rest are mated for life by cpu-socket obsolescence.
  17. jman98 (Banned) · Joined Oct 2004 · Location: Freedonia
    Originally Posted by Constant Gardener:
    Will this bring Intel great fortune, or a bullet hole in the foot?
    As with Microsoft, it really doesn't matter: NOTHING will convince the masses to abandon them, no matter how idiotic their decisions are. Most of my friends and family are not techies, and I know plenty of non-IT people who believe that if Intel didn't make it, it's cheap crap and they won't buy it.

    Based on personal observation, almost everybody wants a laptop now. I work in IT and I am one of the very few people where I work who still has a desktop PC. I have a laptop too, but I rarely use it; I don't like laptops, and given a choice I strongly prefer desktops. What amazes me in the posts I see here is how many people have basically done the PC equivalent of buying a Smart car and are trying to get Porsche performance out of it, or trying to make it haul as much stuff as you could fit in a truck. When you tell them that a Smart car is not a Porsche or a truck, they just get upset.
  18. intracube (Member) · Joined Oct 2010 · Location: England
    Originally Posted by jman98:
    What amazes me in the posts I see here is how many people have basically done the PC equivalent of buying a Smart car and are trying to get Porsche performance out of it, or trying to make it haul as much stuff as you could fit in a truck. When you tell them that a Smart car is not a Porsche or a truck, they just get upset.
    Another analogy: "Sorry sir, the oil filter has failed. To fix your car, we'll need to replace the engine, gearbox, driveshaft and front suspension."

    This seems the most ill-conceived, batsh*t crazy proposal I've heard in quite a while. Take the most expensive components of a computer and solder them together, so when a 5 cent/pence component on the mobo fails...

    What next? The RAM modules?
  19. A bad economy is likely affecting large companies also, and these companies often place idiots in control.

    This idea intel has come up with seems to fit the windows 8 OS mentality.
  20. jman98 (Banned) · Joined Oct 2004 · Location: Freedonia
    Originally Posted by intracube:
    What next? The RAM modules?
    My prediction is that if this works and Intel uses it to become the major motherboard manufacturer, they will come up with a design that makes it almost impossible to add in 3rd party video cards. Of course, on the laptop side, if Intel is the manufacturer of the motherboard there is no way to put another video card in, so Intel can use this as a backdoor method (assuming laptop sales overtake desktop sales) to sneakily become the number one video chip manufacturer too.
  21. As transistor sizes continue to shrink, Intel (and whoever else is left making CPUs) will have to find something to do with the real estate. I don't think there's any need for more and more cores in personal computing (obviously technical computing can use more cores). It only makes sense to put DRAM (or SRAM, or MRAM, or whatever they come up with) onto the CPU die.

    I think this issue is being overblown. Intel is moving to a modular (for them, not for end users) architecture. They want to be able to pick cores (slow and low power, fast and high power), GPU, memory controller, etc., and put them together into mix-and-match CPUs. This is an attempt to further their x86-everywhere mantra. I believe Broadwell is meant as a low power CPU to compete with ARM in tablets, cell phones, etc. So it makes sense for it to come only in a BGA package soldered to a motherboard. The article goes on to say that at least some future CPUs will come in socketed packages. Intel is probably hedging their bets over where the market is going. If desktops disappear (I don't think they will, though they may drop to 1/3 of the recent highs) they won't need to produce socketed CPUs any more. If desktops continue, so will socketed CPUs.
    Last edited by jagabo; 28th Nov 2012 at 07:44.
  22. deadrats (Banned) · Joined Nov 2005 · Location: United States
    there is also another side to this story that i just thought of: cpus have become physically smaller as the manufacturing processes have shrunk. back in the slotted cpu days a P3 667mhz was a big processor, and the old athlons and durons were physically big as well. as intel and amd transitioned to better manufacturing processes, the cpu itself shrunk to the point that the ivy bridges are about the size of a quarter. haswell is slated to be a 22nm part, and broadwell after that is slated to be manufactured using a 14nm process. a 14nm cpu will likely be about the size of a dime, and at that small size it will become difficult for a do-it-yourselfer to install a cpu into the motherboard by hand. as process size and cpu size shrink, it will be impossible to install a cpu without a specialized tool of some sort without damaging the motherboard.

    the reality is that intel's decision may also have a very practical and necessary component to it: we may need to have the cpu installed at the factory, as it may be difficult if not impossible from a dexterity standpoint to do it ourselves.
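
    As a rough sketch of the scaling arithmetic behind this (an idealized full-node shrink, ignoring pads, packaging, and any added functionality):

\[
\left(\frac{14\ \mathrm{nm}}{22\ \mathrm{nm}}\right)^{2} \approx 0.40
\]

    so an ideal 22nm-to-14nm shrink scales linear dimensions by about 0.64 and die area to roughly 40% of what it was. In practice, pad rings, heat dissipation, and added functionality usually keep actual dies from shrinking that much.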
  23. aedipuss (aBigMeanie) · Joined Oct 2005 · Location: 666th portal
    even a pin-sized cpu could be put in a reasonable-sized package with pins or contact points. the extra inch of space wouldn't slow it down, as signals move at close to the speed of light on the board. it may be that the chips they are talking about are for future phones and tablets, not desktops at all. the worst chips i recall were the ones with bendable legs that had to be pushed into zif sockets. maybe the 8086, and those god-awful 64kb memory chips you had to add 9 at a time.
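
    For scale, a quick estimate of what that extra inch costs, assuming board signals propagate at roughly half the speed of light in vacuum:

\[
t \approx \frac{0.0254\ \mathrm{m}}{0.5 \times 3 \times 10^{8}\ \mathrm{m/s}} \approx 170\ \mathrm{ps},
\qquad \frac{1}{3\ \mathrm{GHz}} \approx 333\ \mathrm{ps}
\]

    so an extra inch of trace is on the order of half a clock cycle at 3 GHz: negligible for slow chipset links, but one reason caches and memory controllers keep moving closer to (and onto) the cpu.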
    --
    "a lot of people are better dead" - prisoner KSC2-303
  24. deadrats (Banned) · Joined Nov 2005 · Location: United States
    Originally Posted by aedipuss:
    the worst chips i recall were the ones with bendable legs that had to be pushed into zif sockets.
    do you know why this statement is funny? because "zif" stands for "zero insertion force": you weren't supposed to push the chips into the socket, and of course you would bend the "legs" if you did. you were supposed to gently place them into the socket.

    having said that, i remember the time i bought one of the at-the-time just-released athlon 64 3000+ chips for i don't remember how much. the package that held the chip was sealed tight, and as i tried to pull it apart it burst open and the chip went flying into the air. i tried to grab it before it fell, but it slammed pin-side down into the ground. i picked it up and a bunch of the pins were bent, and i remember letting out a stream of obscenities as i sat there and tried to straighten the pins out as best i could. when i tried to put it into the motherboard it wouldn't sit right because i couldn't get the pins perfectly straight, so i pushed the damn thing in and locked it into place in the zif socket.

    i remember praying "please work, please work" as i hit the power button on the case, and jumping for joy when the pc actually started right up. i had that computer for a couple of years and then gave it to someone, and it kept on running: never blue-screened, never any problems, never ran hot. i still chuckle when i think about it.
  25. budwzr (Member) · Joined Apr 2007 · Location: City Of Angels
    Remember when "64" meant 64 colors? As in "Commodore 64".
  26. intracube (Member) · Joined Oct 2010 · Location: England
    I guess there's no chance that motherboard manufacturers would put these CPUs onto a standard base/socket before selling on to the consumer?

    If that was ever close to happening, Intel would put a stop to it no doubt.
  27. johns0 (Super Moderator) · Joined Jun 2002 · Location: canada
    Originally Posted by budwzr:
    Remember when "64" meant 64 colors? As in "Commodore 64".

    Commodore 64 had 8 colors and 64k of ram.
    I think, therefore i am a hamster.
  28. Originally Posted by johns0:
    Originally Posted by budwzr:
    Remember when "64" meant 64 colors? As in "Commodore 64".

    Commodore 64 had 8 colors and 64k of ram.
    16 colors...

    https://en.wikipedia.org/wiki/List_of_color_palettes#Terminals_and_8-bit_machines
    https://en.wikipedia.org/wiki/List_of_8-bit_computer_hardware_palettes#C-64

    [Attachment: Commodore64_palette.png - the C64 palette]
    Last edited by pandy; 28th Nov 2012 at 05:17.
  29. Member · Joined Mar 2011 · Location: Nova Scotia, Canada
    I do not agree that amd is better than intel at supporting the 'enthusiast' market. At all.

    Many linux users will never buy an amd-powered computer again. They've almost totally dissolved their linux development team.
  30. Originally Posted by deadrats:
    cpus have become physically smaller as the manufacturing processes have shrunk. back in the slotted cpu days a P3 667mhz was a big processor,
    There is a limit to how small the die can be, dictated largely by the number of pads required to get signals on and off the die, by heat dissipation, and by physical handling. Economics limit how large a die can be: the bigger the die, the fewer you can fit on a wafer and the lower the yield (the cost of processing a wafer is relatively fixed). So as the process gets smaller, more transistors and functionality are added to the die. The die size remains relatively constant.

    Intel's slotted processors came in large cartridges because the L2 cache was moved off the motherboard and into the cartridge to get it closer to the CPU (lower latency). As the process shrunk, the L2 cache was moved onto the CPU die. As the process continued to shrink they had to find more and more stuff to put on the die: memory controller, multiple cores, L3 cache, graphics...
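
    To make the die-size economics concrete, here is a minimal sketch using the classic dies-per-wafer approximation and a simple Poisson yield model. The wafer size, die areas, and defect density below are hypothetical numbers chosen purely for illustration:

    Code:
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic approximation: gross dies = pi*r^2/A - pi*d/sqrt(2*A).
    The second term accounts for partial dies lost at the wafer edge."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Poisson yield model: probability that a die has zero fatal defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical numbers, purely for illustration: a 300 mm wafer,
# 0.002 fatal defects per mm^2, and two die sizes to compare.
for area in (100, 400):
    gross = dies_per_wafer(300, area)
    y = poisson_yield(area, 0.002)
    print(f"{area} mm^2 die: ~{gross} gross dies, "
          f"~{y:.0%} yield, ~{gross * y:.0f} good dies per wafer")

    With these made-up numbers, quadrupling the die area cuts the good dies per wafer by roughly a factor of eight (about 524 vs 64), which is exactly the squeeze described above.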


