VideoHelp Forum
  1. http://news.cnet.com/8301-13924_3-10463931-64.html?tag=newsLatestHeadlinesArea.0

    Intel is expected to introduce a 6-core processor designed to crunch through the most 3D-intensive games in the coming weeks. The first glimpses of the chip running 3D-intensive games such as Napoleon: Total War could come at the 2010 Game Developers Conference next week, according to industry sources. The official roll-out of Intel's 6-core "Westmere" processors, however, is expected later this month.

    The Core i7-980X is distinguished primarily by being Intel's first 6-core "Extreme Edition" processor based on the chipmaker's cutting-edge 32-nanometer process technology. Generally, the smaller the manufacturing process, the more circuitry can be packed onto the chip, increasing performance. Most Intel processors still use "fatter" 45-nanometer technology.

    Like other Core i series processors, it features Hyper-Threading, which can double the number of tasks--or threads--a processor can execute. So, a 6-core processor can handle 12 threads. This technology is not offered on prior-generation Core 2 chips.

    Resellers--which have posted preliminary pricing--list the processor at just over $1,000 and show it running at 3.33GHz and packing 12MB of built-in cache memory. With an expected price of around $1,000, game boxes using the chip will not be cheap. PC makers that typically offer high-end gaming boxes include Falcon Northwest, Velocity Micro, and Dell's Alienware unit.
  2. It will simply be overkill; the software industry is not moving as fast as the hardware market. Heck, my Core i7 920 + GTX 260 SLI produces more than 100 FPS in 90%+ of games at full 1080p with maximum settings + 4x AA. Besides that, how many games and encoding programs out there actually even support quad-core? My guess is less than 5% of all applications. Sure, more cores mean better multitasking, but how many cores do you need? After all, it's a PC, not a supercomputer. And almost 95% of PC games are also released on consoles or ported from consoles, so unless we get new consoles and developers actually start making more advanced games in terms of physics and graphics, I say even a C2D is enough for most games. I am myself more interested in Fermi, though, which is due to be released near the end of March 2010.
  3. which have posted preliminary pricing
    I read that as prelim ICING, which is what the price is... I'm shocked, though. 32nm?? I've already gone to 22nm, FFS! And only six cores? Give me eight, or give me death. My light web-browsing taskload demands it.
  4. Originally Posted by sohaibrazzaq View Post
    Sure, more cores mean better multitasking, but how many cores do you need? After all, it's a PC, not a supercomputer.
    Bill Gates said 640K of memory is all we'd ever need.

    Today's computer is yesterday's supercomputer; such will always be the case. Saying they should stop increasing speeds, core counts, etc. because I don't have a personal use for it is short-sighted.
  5. Originally Posted by greymalkin View Post
    Bill Gates said 640K of memory is all we'd ever need.
    That quote is an urban myth; he never said it.
  6. Originally Posted by deadrats View Post
    Originally Posted by greymalkin View Post
    Bill Gates said 640K of memory is all we'd ever need.
    That quote is an urban myth; he never said it.
    Maybe, but I have a magazine upstairs with Bill Gates saying that spam would be stamped out and no longer exist by something like 2006!!


    LMAO!!

    I also remember some supposed "PC guru" talking, back when we all had 450MHz PCs, about how they would never break 700MHz because nothing that fast could be built without using "x-rays".
    Don't ask, because I don't know!!!
    I just know I was sitting there laughing to myself and thinking: yep, just keep talking and letting everyone know what an F'n moron you are!!
    LOL!!
  7. Originally Posted by deadrats View Post
    Originally Posted by greymalkin View Post
    Bill Gates said 640K of memory is all we'd ever need.
    That quote is an urban myth; he never said it.
    The quote, real or fabricated, is just an example of extreme short-sightedness. When Douglas Engelbart gave his famous interactive computing demo in 1968, most people scoffed at the ridiculous notion that ordinary people would use computers for personal purposes.
  8. And remember, IBM rejected the computer mouse as unneeded; it was Apple/Mac that decided it was a good idea.
  9. Originally Posted by lowellriggsiam View Post
    And remember, IBM rejected the computer mouse as unneeded; it was Apple/Mac that decided it was a good idea.

    Well, that's not true. Xerox used them for years first, and then both Apple, with their one-button mouse, and Microsoft/IBM, with the two-button, started in the early eighties. I have a Mouse Systems serial-port two-button mouse from 1982 that came with MS Word around here somewhere...
  10. Still, all the same, IBM rejected it as unneeded.
  11. Originally Posted by greymalkin View Post
    Originally Posted by sohaibrazzaq View Post
    Sure, more cores mean better multitasking, but how many cores do you need? After all, it's a PC, not a supercomputer.
    Bill Gates said 640K of memory is all we'd ever need.

    Today's computer is yesterday's supercomputer; such will always be the case. Saying they should stop increasing speeds, core counts, etc. because I don't have a personal use for it is short-sighted.
    I never said that; it's good as long as we have applications supporting it. What good is it if more than 90% of applications support only up to two threads? As I said before, the software industry is lagging far behind the hardware industry, so it's useless to invest in new hardware unless we get more demanding games or applications.
  12. Originally Posted by sohaibrazzaq View Post
    I never said that; it's good as long as we have applications supporting it. What good is it if more than 90% of applications support only up to two threads? As I said before, the software industry is lagging far behind the hardware industry, so it's useless to invest in new hardware unless we get more demanding games or applications.
    I understand you didn't literally say it, but you are still saying it.

    Who is "we"? "We" means myself and others who run applications like I do. Since you have no use for it, don't buy it! But even you admit there are some who would have a use for it (the 10% approximation). Are they not important because they don't fall into the "we" category? I can think of many non-"useless" medical, geological, architectural, etc. uses that may benefit others... and maybe even you (now are you interested?).

    It's the 10% of people who can use it who will pay the big bucks to get it, paying the early-adopter premium and soaking up the initial startup costs for the chip. Then, when the prices come down, the software that WE use will catch up, at a price that WE can afford. The software that WE use won't be written until it's economical to do so, and that's determined by the entirety of the market, even those who fall outside of "WE". "That which has been is that which will be, and that which has been done is that which will be done."
  13. Me, Myself and I = WE

  14. Originally Posted by stiltman View Post
    Me, Myself and I = WE

    Don't make me bust out the Venn diagrams!
  15. Originally Posted by greymalkin View Post
    Originally Posted by sohaibrazzaq View Post
    I never said that; it's good as long as we have applications supporting it. What good is it if more than 90% of applications support only up to two threads? As I said before, the software industry is lagging far behind the hardware industry, so it's useless to invest in new hardware unless we get more demanding games or applications.
    I understand you didn't literally say it, but you are still saying it.

    Who is "we"? "We" means myself and others who run applications like I do. Since you have no use for it, don't buy it! But even you admit there are some who would have a use for it (the 10% approximation). Are they not important because they don't fall into the "we" category? I can think of many non-"useless" medical, geological, architectural, etc. uses that may benefit others... and maybe even you (now are you interested?).

    It's the 10% of people who can use it who will pay the big bucks to get it, paying the early-adopter premium and soaking up the initial startup costs for the chip. Then, when the prices come down, the software that WE use will catch up, at a price that WE can afford. The software that WE use won't be written until it's economical to do so, and that's determined by the entirety of the market, even those who fall outside of "WE". "That which has been is that which will be, and that which has been done is that which will be done."
    Well, I'll tell you what: there will always be folks who buy the latest stuff even when they have no use for it, and that's the reason so many rebadged products sell like crazy; take the recent Nvidia example. I WILL definitely buy a new processor when I feel I have the need and when applications and games are truly benefiting from it.

    Suppose people had had Core 2 Quad systems in the 1990s; would those systems have been any good for them? No, because there was no software supporting them. It's becoming more or less the same now: we are getting faster and faster hardware, but I don't recall when the last revolutionary game that needed a monster system was released. Oh well, it was the badly optimized Crysis in 2007; any game after that performs extremely well on a two-year-old PC. I may be wrong, but can you name even 5 NLEs or encoders that fully utilize 4 cores of a PC?

    As a much-more-than-average PC user (gaming, video editing, I do it all), most of the time my PC is working under less than 5% load. I don't recall the last time I was encoding videos, listening to music, surfing the internet, and playing games simultaneously such that I felt the need for 6 cores. And if you are counting medical professionals and AutoCAD-using engineers under "WE", then you are mistaken; I can't help you there, because they use workstations anyway. By WE I mean PC users.
  16. I have just found an article; if you have time, I suggest you read it. It explains better what I am trying to say:
    http://www.tomshardware.com/reviews/future-3d-graphics,2560.html

    And if you don't have time to read it all, I strongly suggest you at least read this part:
    http://www.tomshardware.com/reviews/future-3d-graphics,2560-3.html

    A quote: "Simply put, software development has not been moving as fast as hardware growth. While hardware manufacturers have to make faster and faster products to stay in business, software developers have to sell more and more games."
  17. First of all, I do not disagree that most software cannot utilize 6-12 cores, or that software development is behind hardware in most situations. I also do not disagree that marketing this as a "gaming chip" when no game can use 6 cores is dumb on the surface... but people buy $1,500 handbags to stick their toy poodles in as well, so if it's made economical by any means, then it will be made, regardless of how useless it may be.

    What I cannot accept is the notion that 6-core processors are "useless" because you or 99% of the world cannot use them ("you" being a subset of "we", and "we" being the 99% of the world). Every example you used was how YOUR computer doesn't peg out doing x, y, z. You, or the rest of the 99% of the world. What should be said is that this is "useless to me".

    Why? Because there ARE PC applications that can use 8 or more cores. Edius is one such example, fitting since we are on a video editing site, running on dual i7 (i.e., non-workstation) processors. Therefore, I am not mistaken ("mistaken" being substituted for the nonsensical phrase "my bad").

    So just to recap, I'm not disagreeing with all of what you are saying:

    "software industry is lagging far behind hardware industry" - I Agree
    "so its useless to invest in new hardware unless we get more demanding games or applications" - By this do you mean it's useless for you to invest or it's useless for Intel/AMD to invest? If the former then I've wasted a lot of typing time today :P. I was assuming the latter..and if so it is still an incorrect conclusion.
  18. Allow me to add a thing or two: multiple threads are usually not needed, and threading software just for the sake of being threaded should not be encouraged. All it does is add complexity to the code, making it hard to maintain and upgrade, and it causes unneeded overhead.

    What we should be encouraging is highly optimized code, a la SIMD. In another thread I did an encoding test with x264vfw: in one run I set the encoder to use 4 threads with no CPU optimizations, and in the other I set it to use all available optimizations (I have an X4 620; the highest SIMD it supports is SSE3) but only 1 thread. The latter proved faster by a decent margin.
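    To make "CPU optimizations" concrete, here is a minimal C sketch of the idea -- my own illustration, not anything from x264: the same loop written once as plain scalar code and once with SSE intrinsics that process four floats per instruction. Built with something like gcc -O2 -msse3, the SSE version does a quarter of the iterations, with no thread start-up or synchronization cost at all.

        /* simd_demo.c -- hypothetical example, not x264 code.
           Build: gcc -O2 -msse3 simd_demo.c -o simd_demo */
        #include <stdio.h>
        #include <xmmintrin.h>   /* SSE intrinsics, available on any SSE3 CPU */

        #define N (1 << 20)      /* 1M floats */

        /* Scalar version: one multiply per loop iteration. */
        static void scale_scalar(float *dst, const float *src, float k)
        {
            for (int i = 0; i < N; i++)
                dst[i] = src[i] * k;
        }

        /* SSE version: one 128-bit multiply handles 4 floats at a time. */
        static void scale_sse(float *dst, const float *src, float k)
        {
            __m128 vk = _mm_set1_ps(k);           /* broadcast k to all 4 lanes */
            for (int i = 0; i < N; i += 4) {
                __m128 v = _mm_loadu_ps(&src[i]); /* load 4 floats */
                _mm_storeu_ps(&dst[i], _mm_mul_ps(v, vk));
            }
        }

        static float a[N], b[N];

        int main(void)
        {
            for (int i = 0; i < N; i++)
                a[i] = (float)i;
            scale_scalar(b, a, 0.5f);  /* time these two calls separately */
            scale_sse(b, a, 0.5f);
            printf("b[12345] = %.1f\n", b[12345]);  /* sanity check: 6172.5 */
            return 0;
        }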

    What I'm most excited about is seeing what Intel's video transcoding driver for the Clarkdales can do (it seems it will let the Clarkdales use 10 of the integrated GPU's 12 processing cores to speed up video transcoding), and what Sandy Bridge brings to the table with its new 256-bit SIMD and fully integrated on-chip GPU.

    If you have ever installed a base Linux distro like Fedora and then custom-compiled the kernel with the correct CPU optimizations selected, you have seen the difference optimized software makes.


