On Friday, Intel engineers are detailing the inner workings of the company's first graphics chip in over a decade at the Game Developers Conference in San Francisco--sending a signal to the game industry that the world's largest chipmaker intends to be a player.
During a conference call that served as a preview to the GDC sessions, Tom Forsyth, a software and hardware architect at Intel working on the Larrabee graphics chip project, discussed the design of Larrabee, a chip aimed squarely at Nvidia and Advanced Micro Devices' ATI unit.
And Nvidia and AMD will no doubt be watching the progress intently. Intel's extensive and deep relationships with computer makers could give it an inside track with customers and upset the "discrete" graphics duopoly now enjoyed by Nvidia and AMD. In the last decade Intel has not competed in the standalone, or so-called discrete, graphics chip market where Nvidia and AMD dominate. Rather, it has been a supplier of integrated graphics, a low-performance technology built into its chipsets that offers only a minimal gaming experience.
So do you think Intel can pull it off?
I bought an Intel board that had onboard video because they promised a bunch. They couldn't muster much even after several driver updates. I think they're promising the sky again. Sure, *eventually* we'll have a combined CPU/GPU chip; I just think it will never be good enough for the heaviest of workloads.
Larrabee on hold:
http://arstechnica.com/hardware/news/2009/12/intels-larrabee-gpu-put-on-ice-more-news-...me-in-2010.ars
It's funny that it took Intel so long to realize they weren't going to catch up with Nvidia and ATI after being behind them by three or four generations for so many years. -
Originally Posted by stiltman
http://www.firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/page2.asp
The ultra-threading dispatch processor acts as a traffic cop: it's a central dispatch unit that is responsible for tracking and distributing thousands of threads simultaneously across the Radeon HD 2900’s shader processors.
http://www.anandtech.com/video/showdoc.aspx?i=3651&p=4
The GT200 Nvidia GPU can keep almost 31 thousand threads in flight at the same time; x86 CPUs can only keep one thread per core in flight, or two threads per core if they have Hyper-Threading. ATI's latest GPUs can do over 1 teraflop in 32-bit mode and over 500 gigaflops with 64-bit operations. The biggest advantage is that GPUs don't require the programmer to worry about threading his/her code: there is hardware present on the GPU that handles the threading and the associated maintenance. With x86 CPUs the programmer needs to create a new data type (a DWORD) for each thread he wants to launch, and then he has to worry about memory allocation, de-allocation, locks, and scheduling. It's a nightmare.
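To put that contrast in concrete terms, here is a minimal CUDA sketch (a hypothetical illustration, not taken from the article or this thread): a single kernel launch requests about a million threads, and the GPU's hardware scheduler keeps as many in flight as the chip allows and rotates in the rest, with no locks or manual scheduling anywhere in the programmer's code.

// Hypothetical CUDA sketch: one launch requests ~1 million GPU threads.
// The hardware scheduler tracks and dispatches them; the programmer never
// creates thread handles, locks, or a scheduler of his own.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // unique index per thread
    if (i < n)
        data[i] *= factor;                           // each thread handles one element
}

int main()
{
    const int n = 1 << 20;                           // ~1 million elements
    float *d_data = nullptr;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    scale<<<blocks, threadsPerBlock>>>(d_data, 2.0f, n);  // ~1 million threads requested in one call
    cudaDeviceSynchronize();                              // wait for the GPU to finish

    cudaFree(d_data);
    printf("done\n");
    return 0;
}

The CPU-side equivalent of that one launch line is creating and joining a handful of worker threads yourself (on Windows, CreateThread hands back a HANDLE and fills in a DWORD thread ID per thread) and dividing the data among them by hand, which is exactly the bookkeeping being complained about above.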
In all honesty, if someone would take just a single x86 core and integrate it with a modern GPU, we would see a hundredfold performance increase.
But that's where the business people come in. If you have ever worked in any big company and dealt with any of the people with business degrees who run it, you will discover that no matter what the industry, no matter what the service or target market, they all preach one credo above all else: residuals. And there is one study they always cite above any other: that it's 10 times cheaper to keep an existing customer than it is to acquire a new one.
All business plans are built around those two fundamentals, namely squeezing every penny they can from each customer AND doing it again with each new product cycle.
AMD recently "clarified" Bulldozer's architecture. Unlike previous statements that the integrated GPU would handle most of the floating-point operations, they are now saying that Bulldozer is just laying the groundwork for a CPU 3-5 years in the future that will be their first offering to offload "some" of the floating-point operations to the integrated GPU. From a self-preservation standpoint that makes perfect sense: why build and sell a CPU/GPU hybrid so powerful that it effectively eliminates demand for a discrete GPU product, and possibly offers enough performance for most target markets, thus killing their constant upgrade cycle?
The reality is that AMD and/or Intel could easily make and sell a CPU/GPU hybrid that offered so much performance that it would effectively negate the need to upgrade until the parts actually died and needed to be replaced, but then they wouldn't be able to treat the general public like their own personal cash cows.
It's a sad reality of business. I remember working at a Honda and Chevy dealership, and one day a Honda executive came for a visit. I overheard one of the dealership salesmen ask the Honda exec, "With all the technology we have, can't they just build a car that doesn't break down or fold up like an accordion in the slightest accident?" To which the Honda exec responded, "What are we, f*cking stupid? Of course we can, but if we did then no one would ever buy a new car again."
That pretty much sums up any big business's view of its customers... -
Originally Posted by deadrats
This is an Intel thread and I was talking about Intel.
Run along. -
Intel wants to get into making gaming video cards to get more business.
I think, therefore I am a hamster. -
Originally Posted by stiltman
I bought an Intel board that had onboard video because they promised a bunch. They couldn't muster much even after several driver updates. I think they're promising the sky again. Sure, *eventually* we'll have a combined CPU/GPU chip; I just think it will never be good enough for the heaviest of workloads.
Now I will run along. I suggest you skip along, maybe pick some daisies, and maybe bake a fruitcake, whatever turns your festive little self on... -
Originally Posted by stiltman
Today's operating systems (Windows, Mac, Linux) all use GPU-accelerated desktops. Current Intel integrated chips barely get the job done. Any $25-$30 card offers far more performance than Intel's offerings.
Gaming is only semi-important. I'd wager Intel has a much greater installed user base with its integrated chips than Nvidia and AMD/ATI gaming cards combined. I see the future as content decoding (H.264) from the likes of Hulu, Netflix, and other future content delivery methods. Most mainstream games are playable on old and weak graphics cards. Not talking Crysis, but titles like The Sims run fairly well on a - by today's standards - weak Nvidia 7600GT. Of the many average-Joe computer users I know, most are content playing online Flash-based games or time wasters from Reflexive and Big Fish, which do run on Intel integrated chipsets.
Linux _is_ user-friendly. It is not ignorant-friendly and idiot-friendly.