VideoHelp Forum




  1. On Friday, Intel engineers are detailing the inner workings of the company's first graphics chip in over a decade at the Game Developers Conference in San Francisco, sending a signal to the game industry that the world's largest chipmaker intends to be a player.

    During a conference call that served as a preview to the GDC sessions, Tom Forsyth, a software and hardware architect at Intel working on the Larrabee graphics chip project, discussed the design of Larrabee, a chip aimed squarely at Nvidia and Advanced Micro Devices' ATI unit.

    And Nvidia and AMD will no doubt be watching the progress intently. Intel's extensive and deep relationships with computer makers could give it an inside track with customers and upset the "discrete" graphics duopoly now enjoyed by Nvidia and AMD. In the last decade Intel has not competed in the standalone, so-called discrete, graphics chip market where Nvidia and AMD dominate. Rather, it has been a supplier of integrated graphics, a low-performance technology built into its chipsets that offers only a minimal gaming experience.
    continued here

    So do you think Intel can pull it off?
    I bought an Intel board that had onboard video because they promised a bunch. They couldn't muster much even after several driver updates. I think they're promising the sky again. Sure, *eventually* we'll have a combined CPU/GPU chip; I just think it will never be good enough for the heaviest of workloads.
    tgpo famous MAC commercial, You be the judge?
    Originally Posted by jagabo
    I use the FixEverythingThat'sWrongWithThisVideo() filter. Works perfectly every time.
  2. Larrabee on hold:

    http://arstechnica.com/hardware/news/2009/12/intels-larrabee-gpu-put-on-ice-more-news-...me-in-2010.ars

    It's funny that it took Intel so long to realize they weren't going to catch up with Nvidia and ATI after trailing them by three or four generations for years.
  3. deadrats (Banned)
    Originally Posted by stiltman
    I bought an Intel board that had onboard video because they promised a bunch. They couldn't muster much even after several driver updates. I think they're promising the sky again. Sure, *eventually* we'll have a combined CPU/GPU chip; I just think it will never be good enough for the heaviest of workloads.
    Actually, you are wrong: GPUs by themselves are more than good enough for the heaviest of workloads. An ATI Radeon HD 2900 XT is capable of handling thousands of threads simultaneously:

    http://www.firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/page2.asp

    The ultra-threading dispatch processor acts as a traffic cop: it's a central dispatch unit responsible for tracking and distributing thousands of threads simultaneously across the Radeon HD 2900's shader processors.
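    To make that concrete, here is a minimal sketch of what that dispatch model buys the programmer. CUDA syntax is used purely for illustration (the HD 2900 itself was programmed through ATI's own stream interfaces, not CUDA); the point is that a single launch hands the hardware scheduler a million threads to track, distribute, and retire:

    #include <cuda_runtime.h>

    // Each GPU thread handles one element; the on-chip dispatch hardware,
    // not the programmer, decides when and where each thread runs.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    int main()
    {
        const int n = 1 << 20;                 // ~1 million elements
        float *d_data;
        cudaMalloc(&d_data, n * sizeof(float));

        // One launch creates n threads (4096 blocks of 256 threads here);
        // tracking, scheduling, and retiring them is the hardware's job.
        scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
        cudaDeviceSynchronize();

        cudaFree(d_data);
        return 0;
    }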
    And newer GPUs are more powerful still:

    http://www.anandtech.com/video/showdoc.aspx?i=3651&p=4

    The GT200 Nvidia GPU can keep almost 31,000 threads in flight at the same time; x86 CPUs can only keep one thread per core in flight, or two per core with Hyper-Threading. ATI's latest GPUs can do over 1 teraflop in 32-bit mode and over 500 gigaflops with 64-bit operations. And the biggest advantage is that GPUs don't require the programmer to worry about threading his or her code: there is hardware on the GPU that handles the threading and the associated maintenance. With x86 CPUs the programmer has to create each thread he wants to launch by hand (a Win32 CreateThread call handing back a DWORD thread ID, for example) and then worry about memory allocation, deallocation, locks, and scheduling. It's a nightmare.
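    For contrast, here is a rough sketch of what that bookkeeping looks like on the CPU side, in plain C++ (an illustrative toy, not anyone's production code): the programmer picks the thread count, slices the work by hand, guards anything shared, and joins every thread explicitly.

    #include <cstdio>
    #include <mutex>
    #include <thread>
    #include <vector>

    int main()
    {
        const int n = 1 << 20;
        std::vector<float> data(n, 1.0f);

        // The programmer picks the thread count and owns all the bookkeeping.
        unsigned nthreads = std::thread::hardware_concurrency();
        if (nthreads == 0) nthreads = 4;    // fallback if the count is unknown
        std::vector<std::thread> pool;
        std::mutex io_mutex;                // guards shared stdout

        for (unsigned t = 0; t < nthreads; ++t) {
            pool.emplace_back([&, t] {
                // Work slice computed by hand for this thread.
                std::size_t begin = std::size_t(n) * t / nthreads;
                std::size_t end   = std::size_t(n) * (t + 1) / nthreads;
                for (std::size_t i = begin; i < end; ++i)
                    data[i] *= 2.0f;

                std::lock_guard<std::mutex> lock(io_mutex);
                std::printf("thread %u handled %zu elements\n", t, end - begin);
            });
        }
        for (auto &th : pool)
            th.join();                      // explicit teardown, per thread
        return 0;
    }

    Even in this toy version the slicing, the locking, and the join loop all land on the programmer; in the GPU sketch above, that entire layer is the dispatch hardware's problem.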

    In all honesty, if someone were to take just a single x86 core and integrate it with a modern GPU, we would see a hundredfold performance increase on parallel workloads.

    But that's where the business people come in. If you have ever worked at any big company and dealt with the people who run it, the ones with business degrees, you will discover that no matter what the industry, and no matter what the service or target market, they all preach one credo above all else: residuals. And there is one study they always cite above any other: that it's ten times cheaper to keep an existing customer than it is to acquire a new one.

    All business plans are built around those two fundamentals: squeezing every penny they can from each customer, and then doing it again with the next product cycle.

    AMD recently "clarified" Bulldozer's architecture. Unlike previous statements that the integrated GPU would handle most of the floating-point operations, they are now saying that Bulldozer just lays the groundwork for a CPU three to five years in the future that will be their first offering to offload "some" floating-point operations to the integrated GPU. From a self-preservation standpoint that makes perfect sense: why build and sell a CPU/GPU hybrid so powerful that it effectively eliminates demand for a discrete GPU product, and possibly offers enough performance for most target markets, thus killing the constant upgrade cycle?

    The reality is that AMD and/or Intel could easily make and sell a CPU/GPU hybrid with so much performance that it would effectively negate the need to upgrade until the parts actually died and needed to be replaced, but then they wouldn't be able to treat the general public like their own personal cash cows.

    It's a sad reality of business. I remember working at a Honda and Chevy dealership, and one day a Honda executive came for a visit. I overheard one of the dealership salesmen ask him, "With all the technology we have, can't they just build a car that doesn't break down or fold up like an accordion in the slightest accident?" To which the Honda exec responded, "What are we, f*cking stupid? Of course we can, but if we did, no one would ever buy a new car again."

    That pretty much sums up any big business's view of its customers...
  4. Originally Posted by deadrats
    Actually, you are wrong: GPUs by themselves are more than good enough for the heaviest of workloads. An ATI Radeon HD 2900 XT is capable of handling thousands of threads simultaneously...
    No, YOU are.

    This is an Intel thread, and I was talking about Intel.

    Run along.
    tgpo famous MAC commercial, You be the judge?
    Originally Posted by jagabo
    I use the FixEverythingThat'sWrongWithThisVideo() filter. Works perfectly every time.
  5. johns0 (Super Moderator)
    Intel wants to get into making gaming video cards to win more business.
    I think, therefore I am a hamster.
  6. deadrats (Banned)
    Originally Posted by stiltman
    Originally Posted by deadrats
    Actually, you are wrong: GPUs by themselves are more than good enough for the heaviest of workloads. An ATI Radeon HD 2900 XT is capable of handling thousands of threads simultaneously...
    No, YOU are.

    This is an Intel thread, and I was talking about Intel.

    Run along.
    I guess you are suffering from memory loss combined with an inability to go back two posts and reread what you wrote, so allow me. You said:

    I bought an Intel board that had onboard video because they promised a bunch. They couldn't muster much even after several driver updates. I think they're promising the sky again. Sure, *eventually* we'll have a combined CPU/GPU chip; I just think it will never be good enough for the heaviest of workloads.
    To which I posted the reply I did.

    Now I will run along. I suggest you skip along, maybe pick some daisies, maybe bake a fruitcake, whatever turns your festive little self on...
  7. disturbed1 (Get Slack)
    Originally Posted by stiltman
    So do you think Intel can pull it off?
    I give them 50/50 odds of actually making something that competes with last-gen Nvidia and AMD/ATI chips at roughly the same price, and 100% odds that they create something better than their current integrated chips.
    Today's operating systems (Windows, Mac, Linux) all use GPU-accelerated desktops. Current Intel integrated chips barely get the job done. Any $25-$30 card offers far more performance than Intel's offerings.

    Gaming is only semi-important. I'd wager Intel has a much greater installed user base with its integrated chips than Nvidia and AMD/ATI gaming cards combined. I see the future as content decoding (H.264) from the likes of Hulu, Netflix, and other future content delivery methods. Most mainstream games are playable on old and weak graphics cards. I'm not talking about Crysis, but titles like The Sims run fairly well on a, by today's standards, weak Nvidia 7600GT. Of the many average-Joe computer users I know, most are content playing online Flash-based games or time wasters from Reflexive and Big Fish, which do run on Intel integrated chipsets.
    Linux _is_ user-friendly. It is not ignorant-friendly and idiot-friendly.


