VideoHelp Forum
  1. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    I know how ridiculous the title sounds, but check out this article on DirectX 11:

    http://legitreviews.com/article/1001/1/

    and in particular this paragraph:

    One of the key new features of DirectX 11 is support for DirectX Compute, which enables developers to utilize the massive parallel processing power of modern GPUs to accelerate a much wider range of applications that were previously only executable on CPUs. Accessed via programs called Compute Shaders that are executed on the GPU, they can be used to enable new graphical techniques (such as order independent transparency, ray tracing, and advanced post-processing effects), or to accelerate a wide variety of non-graphics applications (such as video transcoding, video upscaling, game physics simulation, and artificial intelligence).
    Between DirectX Compute and OpenCL:

    http://en.wikipedia.org/wiki/OpenCL

    I don't see the forced CPU upgrade cycle surviving much longer: no more buying new RAM and a new motherboard because the new generation of CPUs uses a different socket and a different memory controller, no need to reinstall the OS after a motherboard/CPU upgrade, no need to overclock a CPU to squeeze the maximum performance out of your configuration. Just buy a faster video card, install the new drivers and bam, you're done.

    I wonder how long before someone creates a custom Linux distro built around CUDA?
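
    Just to make the idea of a "compute shader" concrete, here is a rough CUDA sketch (a hypothetical toy example, not something from the article) that brightens one frame of video on the GPU. Every pixel gets its own thread, which is the kind of massively parallel workload the article is talking about:

    [code]
    // Toy CUDA example: brighten one 8-bit grayscale video frame on the GPU.
    // Each pixel gets its own thread, which is what "massively parallel" means in practice.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void brighten(unsigned char* frame, int numPixels, int offset)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < numPixels) {
            int v = frame[i] + offset;
            frame[i] = (v > 255) ? 255 : v;   // clamp to the 8-bit range
        }
    }

    int main()
    {
        const int numPixels = 1920 * 1080;                // one full-HD frame
        unsigned char* host = new unsigned char[numPixels];
        for (int i = 0; i < numPixels; ++i) host[i] = 100;

        unsigned char* dev = nullptr;
        cudaMalloc((void**)&dev, numPixels);
        cudaMemcpy(dev, host, numPixels, cudaMemcpyHostToDevice);

        const int threads = 256;
        const int blocks = (numPixels + threads - 1) / threads;
        brighten<<<blocks, threads>>>(dev, numPixels, 40);

        cudaMemcpy(host, dev, numPixels, cudaMemcpyDeviceToHost);
        printf("first pixel after the GPU pass: %d\n", host[0]);  // expect 140

        cudaFree(dev);
        delete[] host;
        return 0;
    }
    [/code]

    The per-pixel math is trivial; the point is the shape of the program - thousands of tiny threads doing the same small operation - which is what DirectX Compute, OpenCL and CUDA all expose in slightly different clothing.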
  2. And the graphics card manufacturers have a stellar record regarding bug-free drivers.
  3. Not in our lifetime. GPUs are limited to a very narrow set of tasks, and I don't see programmers changing entire code bases to adapt. (How many years has it taken even to get limited multithreading implemented on CPUs... LOL...)
  4. redwudz (Mod Neophyte, Super Moderator) | Join Date: Sep 2002 | Location: USA
    Not Latest Video News. Moving to Computer Forum.
  5. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by jagabo
    And the graphics card manufacturers have a stellar record regarding bug-free drivers.
    I never said they did. ATI in particular has at times been horrible, but the same can be said for audio drivers, chipset drivers, motherboard BIOSes, CPUs with errata, buggy motherboards; the list goes on.

    On the bright side, video card manufacturers do seem to have gotten their act together, and I can't remember the last time NVIDIA released a buggy driver.
  6. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by poisondeathray
    Not in our lifetime. GPUs are limited to a very narrow set of tasks, and I don't see programmers changing entire code bases to adapt. (How many years has it taken even to get limited multithreading implemented on CPUs... LOL...)
    GPUs have been fully programmable since the GeForce 3; it's just that programmers weren't taking advantage of them. Furthermore, all modern GPUs have GPGPU functionality built into them.

    As for changing entire code bases, evidently it doesn't require that much of a change, seeing how scientific and business applications have been reworked to run on GPUs, distributed computing apps such as F@H and SETI@home have been recoded to run on GPUs, and even LAME has a GPU-accelerated encoder.

    I think it's obvious that the CPU, at least as we know it, won't be around for much longer. More than likely, by 2011-2012 we will be using hybrid GPU/CPU chips, with the GPU handling the brunt of the heavy lifting and the CPU portion just being there to pass the data to the GPU.
  7. Originally Posted by deadrats
    GPUs have been fully programmable since the GeForce 3; it's just that programmers weren't taking advantage of them. Furthermore, all modern GPUs have GPGPU functionality built into them.
    Programmable, yes, but integration with current applications would mean a complete re-write from the ground up. Very little code could be salvaged.

    As for changing entire code bases, evidently it doesn't require that much of a change, seeing how scientific and business applications have been reworked to run on GPUs, distributed computing apps such as F@H and SETI@home have been recoded to run on GPUs, and even LAME has a GPU-accelerated encoder.
    Again, very limited. Read Vijay Pande's blog: the GPU has only single precision and cannot do 99% of the required calculations or functions. It is only capable of a small subset. At best, a GPU might be able to assist in certain functions, or perform specific calculations, but it will never be able to do what a CPU can, and it certainly cannot run without a CPU.

    I think it's obvious that the CPU, at least as we know it, won't be around for much longer. More than likely, by 2011-2012 we will be using hybrid GPU/CPU chips, with the GPU handling the brunt of the heavy lifting and the CPU portion just being there to pass the data to the GPU.
    It would seem that the manufacturers are pushing for hybrids (Intel's Larrabee, AMD's Fusion), but I don't see the CPU disappearing anytime soon. In order for GPUs to "handle the brunt", everything as we know it would have to be re-written. And there is no need for the "CPU portion just being there to pass the data"; that's what interconnects are for.

    But I do like seeing newer applications being developed for GPUs; it's just that most need major improvement.
  8. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by poisondeathray
    In order for GPUs to "handle the brunt", everything as we know it would have to be re-written. And there is no need for the "CPU portion just being there to pass the data"; that's what interconnects are for.
    The interconnects are nothing more than the pathways through which data and instructions pass; something still needs to do the actual passing, and that's what I meant by the "CPU portion just being there to pass the data".
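
    For what it's worth, here is roughly what that "passing" looks like in host code - a hypothetical CUDA sketch, not anyone's production code. Notice that every line the CPU executes is either bookkeeping or a copy across the interconnect; the only arithmetic happens inside the kernel:

    [code]
    // Hypothetical sketch of the CPU's role in a GPU offload: allocate, copy in,
    // launch, copy out. The GPU does the arithmetic; the CPU just moves data and
    // issues commands over the interconnect (PCIe here).
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void add(const float* a, const float* b, float* c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];            // the only "real work" in the program
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);
        float *ha = new float[n], *hb = new float[n], *hc = new float[n];
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        float *da, *db, *dc;
        cudaMalloc((void**)&da, bytes);                      // CPU: bookkeeping
        cudaMalloc((void**)&db, bytes);
        cudaMalloc((void**)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);   // CPU: pass the data in
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        add<<<(n + 255) / 256, 256>>>(da, db, dc, n);        // CPU: issue the command

        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);   // CPU: pass the result back
        printf("c[0] = %f\n", hc[0]);                        // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        delete[] ha; delete[] hb; delete[] hc;
        return 0;
    }
    [/code]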
  9. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by poisondeathray
    Again, very limited. Read Vijay Pande's blog: the GPU has only single precision and cannot do 99% of the required calculations or functions. It is only capable of a small subset. At best, a GPU might be able to assist in certain functions, or perform specific calculations, but it will never be able to do what a CPU can, and it certainly cannot run without a CPU.
    Tell Vijay to get with the times:

    http://ati.amd.com/products/streamprocessor/specs.html

    http://ati.amd.com/technology/streamcomputing/product_firestream_9250.html

    http://insidehpc.com/2008/06/18/under-the-hood-of-nvidias-latest-gpu/

    "A very important new addition to the GeForce GTX 200 GPU architecture is double-precision, 64-bit floating point computation support. This benefits various high-end scientific, engineering, and financial computing applications or any computational task requiring very high accuracy of results. Each SM incorporates a double-precision 64-bit floating math unit, for a total of 30 double-precision 64-bit processing cores.

    The double-precision unit performs a fused MAD, which is a high-precision implementation of a MAD instruction that is also fully IEEE 754R floating-point specification compliant. The overall double-precision performance of all 10 TPCs of a GeForce GTX 200 GPU is roughly equivalent to an eight-core Xeon CPU, yielding up to 90 gigaflops."
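
    As a quick illustration of why those double-precision units matter, here is a hypothetical CUDA sketch (a toy example, not from the article) that accumulates the same value ten million times in single and in double precision on the GPU and compares the drift:

    [code]
    // Hypothetical sketch: the same running sum in single and double precision,
    // computed on the GPU. The float result drifts noticeably away from the expected
    // 1,000,000 because of accumulated rounding error; the double result does not.
    // Needs a GPU with double-precision support (compute capability 1.3 or newer,
    // i.e. the GTX 200 series discussed above).
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void accumulate(float* fsum, double* dsum, int n)
    {
        // One thread is enough; the point here is precision, not parallelism.
        float  f = 0.0f;
        double d = 0.0;
        for (int i = 0; i < n; ++i) {
            f += 0.1f;
            d += 0.1;
        }
        *fsum = f;
        *dsum = d;
    }

    int main()
    {
        float* df;  double* dd;
        cudaMalloc((void**)&df, sizeof(float));
        cudaMalloc((void**)&dd, sizeof(double));

        accumulate<<<1, 1>>>(df, dd, 10 * 1000 * 1000);

        float fresult;  double dresult;
        cudaMemcpy(&fresult, df, sizeof(float),  cudaMemcpyDeviceToHost);
        cudaMemcpy(&dresult, dd, sizeof(double), cudaMemcpyDeviceToHost);
        printf("single precision: %.2f\n", fresult);   // noticeably off from 1000000
        printf("double precision: %.2f\n", dresult);   // essentially 1000000
        cudaFree(df);  cudaFree(dd);
        return 0;
    }
    [/code]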
  10. Only a few of the cores are double precision; 99.9% of them are still single precision. Part of the reason is backwards compatibility (not everyone has a GTX 2XX or an ATI 4XXX), and part of the reason is that it would require a huge re-write (not going to happen). Again, it's the same thing with all software and applications - they would require re-writes almost from the ground up to take advantage (not going to happen). At least some progress has been made in CPU multithreading... but even that is a long way from being decent.

    http://foldingforum.org/viewtopic.php?f=16&t=9740
    http://fahwiki.net/index.php/Cores

    Don't get me wrong, I think all this new GPU development is great, and OpenCL looks very promising. But the CPU will not be gone anytime soon. GPUs can only do very specific things, unfortunately.
  11. Banned | Join Date: Aug 2002 | Location: beautiful
    This has got to be at least the 1000th time some enlightened soul here or there has seriously announced the "death of..." (the PC, the CPU, Windows, etc.).
    Yet ever since I was a kid with my first computer, I have seen everybody use the very same basic design: CPU, RAM, HDD, monitor, mouse, keyboard (and add Windoze for most people).

    Even if the GPU could and would replace the CPU, what would it change?
    The GPU would simply become the CPU at that moment, geez.
    When math co-processors were incorporated into the CPU, did we suddenly call the result a "CPU With Built-In Math Co-Processor"? No, it was still just a "CPU".
    When GPU architectures become as versatile and powerful as CPUs (and they will), of course there will be no need for a separate "CPU" anymore - but such a GPU will be the CPU of that system. And no one will call it a "GPU" at that point, since the chip will be the system's Central Processing Unit, aka the CPU.

    BTW
    deadrats, you already started another thread about this. What was the point of opening this one - you didn't like some of the replies in your previous thread?
    No matter how many threads you start, it won't change the fact that the GPU won't replace the CPU - and if it does, it will have become the CPU itself.
  12. Skith (Member) | Join Date: Oct 2004 | Location: Bottom of the ocean
    People are creative; they will always find a demand for more power, even if it is just for bragging rights.

    That said, I believe I read that Adobe CS4 does in fact have at least some support for CUDA. Even so, a CPU is still needed for many tasks, particularly in a high-end server environment.
    Some people say the dog is man's best friend. I say that man is the dog's best slave... At least that is what my dogs think.
  13. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Well, it appears that Intel has moved up mass production of its Clarkdale CPU to the 4th quarter of this year:

    http://www.engadget.com/2009/06/29/intels-32nm-clarkdale-cpus-moved-up-to-q4-a-full-year-ahead-of/

    For those of you who don't know, Clarkdale is the CPU with the integrated GPU. Here's a video of a pre-production sample in action:

    http://www.tweaktown.com/news/12378/intel_shows_clarkdale_system_with_on_chip_gpu/index.html
  14. Banned | Join Date: Aug 2002 | Location: beautiful
    My almost two-year-old PocketPC has a CPU with a built-in GPU.
    And still there are no programs that utilize the GPU's processing power.

    Intel is going in the right direction with the Clarkdale chip.
    Anyway.
    All the low-end office machines and such should always have had some cheap GPU co-processor built into the CPU's die or the chipset, eliminating the old AGP slot entirely (and if they removed the old PCI and other "expansion" slots as well, low-end business machines could fit inside the monitor's case by now, which is probably where this is headed).
    There are thousands of "low-end, cheap" office boxes that could have been shrunk to little more than the size of the CPU and heatsink if it weren't for all the useless ports and slots. Why did they ever need AGP, PCI or PCI-e slots on office machines? Those are usually just dumb terminals anyway. Two USB ports + VGA + LAN is all any office dumb box needs. Build the GPU into the CPU, build in the memory, and it fits inside any monitor.


