VideoHelp Forum
  1. Banned (joined Nov 2005, United States)
    Originally Posted by SLK001
    Originally Posted by RabidDog
    So which of those two calculation methods will calculate PI exactly?
    Pi can't be calculated exactly.
    Talk about giving someone a misleading answer. Pi is the ratio between the circumference of a circle and its diameter; the greater the precision of pi, the more perfect the circle associated with it.

    Now, technically it may be true that since pi is supposedly* a never-ending series it can never be calculated "exactly" (since it technically* doesn't have an exact value), but I think it's obvious that what he meant was: which method gives a more precise answer, or more accurately, which method gives a correct answer?

    *I am of the school of thought that doesn't believe that any series is actually infinite; you eventually reach a point at which that which the series describes has a finite limit, and thus any further precision becomes meaningless. That limit is what I call the "Planck limit".
  2. Member (joined May 2001, United States)
    What, exactly, is misleading about the correct answer? If this is NOT true, then give us the EXACT value of PI.

    But wait... YOUR own formula (pi = lim(n→∞) Pn/d) shows that the series is infinite, thus no exact answer.
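
    That limit is easy to watch numerically. A minimal sketch (an illustration of mine, not anything posted in the thread) of Archimedes' inscribed-polygon method, where Pn is the perimeter of a regular n-gon inscribed in a circle of diameter d:

    ```python
    # Archimedes' method: pi = lim(n -> infinity) Pn/d, where Pn is the
    # perimeter of a regular n-gon inscribed in a circle of diameter d.
    # Start with a hexagon in a unit-radius circle (side s = 1, d = 2) and
    # repeatedly double the side count: s' = sqrt(2 - sqrt(4 - s^2)).
    import math

    n, s = 6, 1.0
    for _ in range(15):
        print(f"{n:>8} sides: Pn/d = {n * s / 2:.12f}")
        s = math.sqrt(2.0 - math.sqrt(4.0 - s * s))
        n *= 2
    print(f"math.pi        = {math.pi:.12f}")
    # In exact arithmetic the estimate improves forever; in floating point
    # the subtraction eventually cancels, a limit of 64-bit doubles, not of pi.
    ```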

    But then, we digress... Since it obviously is true that the value of PI that we use is adequately accurate for our tasks, this still doesn't make our value of PI exact. "Adequate" is NOT equal to "exact" or "correct". This discussion could continue to infinity - or, applying your "planck limit", for a couple more pages.

    Also, your "belief" that no series is actually infinite seems like a ridiculous statement, since you actually show one that is (plus, there are many more that are also infinite).

    And finally, this statement "the greater the precision of pi the more perfect the circle associated with it" is ignorant, since the precision of PI has nothing to do with the "perfectness" of a circle.

    I have come to my "planck limit", so I will end this here.
    ICBM target coordinates:
    26° 14' 10.16"N -- 80° 16' 0.91"W
  3. Precision and accuracy are two distinctly different things.

    22/7 is an accurate but imprecise representation of pi. (Incidentally, it should be pronounced "pee".)

    3.1415926536 is an equally accurate but more precise representation. And is the precision to which my brain stores pi...

    1.22345234524562323452542534555683537783737 is a highly precise but woefully inaccurate representation.

    32-, 64- and 80-bit floating point representations used in most mainstream computing are all accurate but have increasing precision.
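
    A quick way to see those precision tiers, sketched in Python (numpy's float32 versus Python's native 64-bit float; the labels are mine, purely for illustration):

    ```python
    # Accuracy vs. precision for stored values of pi.
    import math
    import numpy as np

    representations = {
        "22/7 (accurate, imprecise)": 22 / 7,
        "float32 (~7 sig. digits)":   float(np.float32(math.pi)),
        "float64 (~16 sig. digits)":  math.pi,
    }
    for label, value in representations.items():
        # Error measured against the best double-precision pi available here.
        print(f"{label:30s} {value:.17f}  error={abs(value - math.pi):.2e}")
    # 80-bit extended precision is an x87 hardware format; plain Python
    # floats are 64-bit, so it isn't directly visible from here.
    ```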
  4. Banned (joined Nov 2005, United States)
    Originally Posted by jagabo
    The x87 instruction set will not be removed from processors and Windows support for it will not end in the foreseeable future. As JohnnyMalaria points out, virtually every Windows program would cease working.
    Yeah, you're right, because no company has ever moved all its computers to a whole new architecture, thereby rendering all existing software obsolete <cough> Apple </cough>.

    Seriously though, it has happened before. I'm not saying it will again, but it certainly wouldn't be unprecedented.

    Interesting side note: I wonder how hard Apple is kicking themselves over moving away from the POWER architecture and switching to Intel chips? They made the decision to move away from the 970 (the POWER4-based PowerPC G5) towards Intel's Conroes, and shortly thereafter IBM introduced the POWER5, then the Cell, and now the POWER6, a monster of a CPU.

    If Apple had just stuck with IBM a little while longer, they would have had a chip that would have made Apple's claims of superior performance a fact rather than a fantasy.
  5. Banned (joined Nov 2005, United States)
    Originally Posted by SLK001
    Also, your "belief" that no series is actually infinite seems like a ridiculous statement, since you actually show one that is (plus, there are many more that are also infinite).
    My belief arises from my physics background: all math has a corresponding physical counterpart. Since there is a physical limit to how precisely we can measure anything (called the Planck limit), there is also a point at which an infinite series ceases to be truly infinite, since that which it describes can't possibly exist.

    Capisce?

    Originally Posted by SLK001
    And finally, this statement "the greater the precision of pi the more perfect the circle associated with it" is ignorant, since the precision of PI has nothing to do with the "perfectness" of a circle.
    Actually, that is precisely what pi describes. As I have already said, pi is the ratio between the circumference of a circle and its diameter.

    Now assume we have a point in the center and we wish to draw a circle with extreme precision: the greater the precision of the pi we calculate, the more accurately we can place each corresponding point on the curve that is the circle's circumference.

    To understand, assume we have a point and use 3.1416 as an estimate of where we should place the other point. When we are done placing 360 degrees' worth of points, if we examined the placement we would find that gaps exist between said points, meaning that our circle wasn't perfect. As we increase the precision of pi, we will be able to place the points ever closer to one another (thus allowing us to use more points), and the resulting circle will be more perfect, since all the points will be more precisely spaced from the center.

    Now at some point, as I mentioned above, we reach a physical limit where we just can't add any more points; they are as close as they can get. At that point there is a corresponding value of pi, and any additional calculation of pi will result in extraneous digits, so we can think of ourselves as having reached the limit of the "infinite" value of pi.

    Are you capable of understanding this?
  6. Just because we cannot measure something with that precision doesn't mean the thing we are trying to measure isn't an irrational number that can be represented by an infinite series.

    pi/4 = 1 - 1/3 + 1/5 - 1/7 + 1/9...

    It goes on for infinity.
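
    A minimal sketch of those partial sums (my own illustration; the term counts are arbitrary):

    ```python
    # Partial sums of pi/4 = 1 - 1/3 + 1/5 - 1/7 + ... (the Leibniz series).
    # It converges to pi, but painfully slowly: the error after n terms is
    # roughly 1/(2n), so no finite number of terms is ever "exact".
    import math

    total = 0.0
    for k in range(1_000_000):
        total += (-1) ** k / (2 * k + 1)
        if k + 1 in (10, 1_000, 1_000_000):
            estimate = 4 * total
            print(f"{k + 1:>9} terms: {estimate:.10f} "
                  f"(error {abs(estimate - math.pi):.1e})")
    ```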

    Just because physical phenomena can be represented by mathematical shorthand, it doesn't follow that all mathematical constructs have a meaningful physical equivalent. Mathematics is pure. The concepts of infinite series etc. are absolute.

    Planck really has nothing to do with this. His work is concerned with the limits of measuring paired properties with the combined units of J·s. In the realm of relativity, it is meaningless.
  7. Banned (joined Nov 2005, United States)
    Originally Posted by JohnnyMalaria
    Just because we cannot measure something with that precision doesn't mean the thing we are trying to measure isn't an irrational number that can be represented by an infinite series.

    pi/4 = 1 - 1/3 + 1/5 - 1/7 + 1/9...

    It goes on for infinity.

    Just because physical phenomena can be represented by mathematical shorthand, it doesn't follow that all mathematical constructs have a meaningful physical equivalent. Mathematics is pure. The concepts of infinite series etc. are absolute.

    Planck really has nothing to do with this. His work is concerned with the limits of measuring paired properties with the combined units of J·s. In the realm of relativity, it is meaningless.
    1) Math is a language that describes the universe; there is nothing "pure" about it. Without something to describe, math becomes meaningless, such as when we get extraneous results while solving an equation and simply discard them because they have no direct correspondence to a physical property.

    2) Planck is relevant to this discussion in the form of the so-called Planck length, i.e. the smallest unit of length that has any meaning for us.
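
    For concreteness, the usual definition of that length, computed here in a quick Python sketch of mine with standard CODATA constants:

    ```python
    # Planck length: l_P = sqrt(hbar * G / c**3) ~ 1.6e-35 m.
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 299792458.0          # speed of light, m/s

    l_planck = (hbar * G / c**3) ** 0.5
    print(f"Planck length ~ {l_planck:.3e} m")   # ~1.616e-35 m
    ```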

    If you have any "infinite" series, such as the classic .999..., it can be proven that it eventually becomes 1 (I know some misinterpret the proofs as meaning it equals 1); the point at which it becomes 1 is the Planck limit. That is what the equation is telling us, i.e.:

    Assume you wish to accelerate a particle to 99% the speed of light; there is nothing in the laws of physics that prevents you from doing so. Likewise, nothing prevents you from accelerating to 99.999% the speed of light, and so on. Now, solving the math, one could assume that there is no limit to how arbitrarily close we can come to the speed of light, so one might incorrectly surmise that we could accelerate to 99.9999...% the speed of light. But at some point we are traveling at a velocity of the speed of light minus one Planck-length increment, the only increment available to us, and if we could increase by that increment we would be traveling at the speed of light.

    But relativity tells us we can't do that. Thus, at the point where the series .9999... leaves us no further increments by which to get arbitrarily closer to 1, the "infinite" series ceases to exist and in fact becomes one.

    3) And last but certainly not least: relativity, classical mechanics, modified Newtonian mechanics and quantum physics do not exist in isolation from one another. Relativity applies on the quantum scale just as much as the macro scale, and the uncertainty principle applies to the macro scale just as much as the quantum scale. Two billiard balls colliding on a pool table are governed by the same laws as two electrons colliding: all the laws regarding energy still apply, the laws of elasticity apply, and, perhaps most surprising to some, the two billiard balls have the same properties of uncertainty as the electrons. It is impossible to predict the exact location AND velocity of the two billiard balls after a collision, just as it is impossible to predict the exact location and velocity of two electrons post-collision.

    One of the fundamental laws of physics is that the rules do not discriminate; they apply to all objects in all reference frames uniformly.
  8. Sorry, you are wrong (or have been taught erroneously) on many levels.

    Our various approximations of the laws of the universe (classical Newtonian, relativistic, quantum mechanical) cannot be equally applied to all situations. Newtonian physics fails at both extremes of high speed and very small dimensions.

    Maths is pure. And it can be applied to the real world.

    Electrons don't collide like billiard balls. Our detection of the interaction of their wave functions (using mathematical constructs) yields observations that resemble our macroscopic experience.

    Planck refers to, as I mentioned before, the fundamental limit to which a pairwise combination of properties that have the units of J·s can be measured. Crudely, ΔE × Δt ≈ h. This is very important in the quantum world. It is irrelevant in the macroscopic world.

    The use of pi in many equations used to describe our world is a consequence of our choice of measurement system. E.g., equations describing electrostatics in the c.g.s. system differ from the m.kg.s. system by a factor of 2π (or 4πε).

    If we used a number system based on pi, the representation of pi would be trivial.

    Einstein rejected quantum mechanics. Sub-atomic physics cannot explain some critical observations of our universe. The Grand Unification Theory is the holy grail. It doesn't exist and, when it does, it will be a theory - nothing more. Just a mathematical *approximation* of our universe. Mathematics will continue to be both pure and applied. It can be used to describe worlds that do not follow the "laws" of our universe.
  9. Banned (joined Nov 2005, United States)
    Originally Posted by JohnnyMalaria
    Sorry, you are wrong (or have been taught erroneously) on many levels.
    I fail to see how I could have been "taught wrong"; I have about 60 college credits in physics and comp sci with a GPA of over 3.5 (I received A's in all my physics classes).

    Originally Posted by JohnnyMalaria
    Our various approximations of the laws of the universe (classical Newtonian, relativistic, quantum mechanical) cannot be equally applied to all situations. Newtonian physics fails at both extremes of high speed and very small dimensions.
    It is you who is so very wrong; the laws of physics are the same for all observers in all frames of reference. It is one of the most basic laws of physics:

    http://en.wikipedia.org/wiki/Principle_of_relativity

    quote"In physics, the principle of relativity is the requirement that the equations, describing the laws of physics, have the same form in all admissible frames of reference.

    For example, the Maxwell equations have the same form in all inertial frames of reference; the Einstein field equation has the same form in arbitrary frames of reference.

    Several principles of relativity have been successfully applied throughout science, whether implicitly (as in Newtonian mechanics) or explicitly (as in Albert Einstein's special relativity and general relativity)."

    Furthermore, so-called Modified Newtonian Dynamics can give, and has been shown to give, the exact same results as relativity:

    http://en.wikipedia.org/wiki/Modified_Newtonian_dynamics

    http://www.iop.org/EJ/abstract/1475-7516/2006/09/006

    And it can be applied in limited circumstances to quantum mechanics as well.

    Originally Posted by JohnnyMalaria
    Maths is pure. And it can be applied to the real world.
    What exactly does "math is pure" mean? Math is just a language that describes natural phenomena. Geometry describes the size, shape and position of objects in relation to one another (and some forms of geometry extend this to deal with space itself); algebra is the study of relation, structure, quantity and the averages of said properties; and calculus is the study of change, specifically instantaneous change. There is nothing pure about any of these areas of mathematics, nothing ethereal; it's just a language that we can use to analyze our surroundings, and that's all it is.

    Originally Posted by JohnnyMalaria
    Electrons don't collide like billiard balls. Our detection of the interaction of their wave functions (using mathematical constructs) yields observations that resemble our macroscopic experience.
    All collisions are fundamentally similar in nature: they are either elastic or inelastic, they all obey the conservation laws, and the colliding bodies always obey the uncertainty principle. Always.

    Originally Posted by JohnnyMalaria
    Planck refers to, as I mentioned before, the fundamental limit to which a pairwise combination of properties that have the units of J.s can be measured. Crudely, delta E x delta t = h. This is very important in the quantum world. It is irrelevant in the macroscopic world.
    You're right, I guess Max Planck had no idea what his own theories were talking about:

    http://en.wikipedia.org/wiki/Planck_length

    http://en.wikipedia.org/wiki/Planck_units

    Originally Posted by JohnnyMalaria
    The use of pi in many equations used to describe our world is a consequence of our choice of measurement system. e.g., equations describing electrostatics in the c.g.s system differ from the m.kg.s system by a factor of 2.pi (or 4.pi.epsilon)
    No, we do not invent equations; we discover equations that describe the phenomena we are observing. If pi shows up in said equations, it is because whatever system we are observing has some rotational or orbital component to it. A perfect example is Coulomb's law (the fundamental equation of electrostatics):

    F = (1/(4π·Ec)) · (q1·q2)/r²

    where F is force, q1 is charge 1, q2 is charge 2, r is the distance between the two charges, and Ec is the electric constant. What pi's appearance in the above equation tells us is that the shape of the electric field has a curved component to it.
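
    A quick numeric check of that formula in SI units, taking Ec to be the vacuum permittivity ε0 (the two-electrons-at-1-nm setup is just an illustration of mine):

    ```python
    # Coulomb's law: F = (1/(4*pi*eps0)) * q1*q2 / r**2 (SI units).
    import math

    eps0 = 8.8541878128e-12   # electric constant (vacuum permittivity), F/m
    e = 1.602176634e-19       # elementary charge, C

    q1 = q2 = e               # two electrons
    r = 1e-9                  # 1 nanometre apart

    F = q1 * q2 / (4 * math.pi * eps0 * r**2)
    print(f"F ~ {F:.3e} N")   # ~2.3e-10 N, repulsive
    ```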

    Originally Posted by JohnnyMalaria
    Einstein rejected quantum mechanics. Sub-atomic physics cannot explain some critical observations of our universe. The Grand Unification Theory is the holy grail. It doesn't exist and, when it does, it will be a theory - nothing more. Just a mathematical *approximation* of our universe. Mathematics will continue to be both pure and applied. It can be used to describe worlds that do not follow the "laws" of our universe.
    While Einstein is famous for saying "God does not play dice with the universe", you would do well to remember that it was he who showed an electromagnetic wave could be described by a particle called a photon, and this directly led to a theory of unity (i.e. particle-wave duality).

    Furthermore:

    quote: "Predictions of quantum mechanics have been verified experimentally to a very high degree of accuracy. Thus, the current logic of correspondence principle between classical and quantum mechanics is that all objects obey laws of quantum mechanics, and classical mechanics is just a quantum mechanics of large systems (or a statistical quantum mechanics of a large collection of particles). Laws of classical mechanics thus follow from laws of quantum mechanics at the limit of large systems or large quantum numbers."

    quote: "The defining postulates of both Einstein's theory of relativity and quantum theory are indisputably supported by rigorous and repeated empirical evidence. However, while they do not directly contradict each other theoretically (at least with regard to primary claims), they are resistant to being incorporated within one cohesive model."

    As for why we have so much difficulty combining quantum theory with relativity, the answer should be obvious: quantum mechanics can be thought of as the psychology of a single person, or of a small group interacting with one another, and general relativity can be thought of as group or "pack" behavior.

    Just as people don't behave the same way when they are alone or with a single friend as they do when they are part of a large group, so too particles cannot be expected to behave the same way when they are in the large groups that we call celestial objects.

    Thus we can think of quantum mechanics as the behavior of solitary particles and relativity as the behavior of lots of particles together.

    Ergo, there is nothing to unify.
  10. I guess I'll increase my carbon footprint and incinerate all the hard copies of my doctoral thesis then.

    Just to respond to a few of your misconceptions:

    MOND is not an alternative to relativity. It is an effective theory that provides an alternative to one aspect of relativity, namely the non-linearity of gravity. Did you read the article you reference?

    Re F = (1/(4π·Ec)) · (q1·q2)/r²? Did you not read my earlier reply regarding the choice of measurement system? Electrostatics uses different coefficients in the cgs and mks systems - one has pi, one doesn't.

    Re quantum mechanics: as I said before, it tends to Newtonian physics in the macroscopic limit. The consequences of uncertainty become irrelevant, as does particle-wave duality. When you are waiting for a bus, it doesn't diffract around the corner and miss the bus stop.

    The laws of conservation - I trust you mean conservation of momentum, energy, mass etc. What about nuclear fusion?

    Mathematics is pure. Describing physical phenomena is not a prerequisite for matrix algebra. Nor calculus. Or Boolean algebra. Mathematical models are tools to assist in understanding the physical world. There can be no absolute laws. We challenge those laws and either continue to use the existing models or develop new ones. Very complex models typically imply that something is missing. Mathematicians develop new concepts to assist in resolving such issues - usually when there is a great need such as the Manhattan Project.

    Planck units merely replace equations written in SI units so as to make them easier to handle. Nothing more than scaling factors. Nothing to do with precision, uncertainty, relativity, Newton. They have the same dimensions as the SI equivalents.

    Planck length is another scaling factor. The article you reference (wikipedia seems to be your oracle) even says its meaning is abstract.

    As I mentioned before, the importance of uncertainty is related to the uncertainty of the product of two observables having units of J·s. Such a pair is energy and time; also momentum and position. Probabilistically, the uncertainty (expressed as the standard deviation of many measurements) cannot be less than ħ (the reduced Planck constant).
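
    For reference, the textbook form of those relations (standard quantum mechanics, stated here only to pin down the symbols):

    ```latex
    \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
    \qquad
    \Delta E\,\Delta t \;\gtrsim\; \hbar,
    \qquad
    \hbar = \frac{h}{2\pi} \approx 1.0546\times10^{-34}\ \mathrm{J\,s}
    ```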
  11. Member tmw (joined Jan 2004, United States)
    Originally Posted by JohnnyMalaria
    Making it work on any CPU/GPU will cost you performance that could be had by creating code for specific architectures...

    The code still has to be compiled for a target, though. At the end of the day, an x86 processor can only work with x86 code. If you want true portability that allows something to run on any architecture without specific compilation, you'll have to resort to some kind of interpreter or JIT compiler.
    I think that is generally true. Anything above basic assembly language has to be compiled at some point. However, it looks like Khronos is effectively replacing the OS. So, while that adds another level of compiling, it removes one big bulky layer as well.
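
    That compile step is how OpenCL handles it in practice: the kernel ships as source and is built at runtime for whatever device is present. A minimal sketch using the pyopencl bindings (the vector-add kernel is just an illustration, not anything from this thread):

    ```python
    # Minimal OpenCL sketch: the kernel below is shipped as *source* and
    # compiled at runtime for whatever device the context picked (CPU or GPU).
    import numpy as np
    import pyopencl as cl

    KERNEL_SRC = """
    __kernel void vec_add(__global const float *a,
                          __global const float *b,
                          __global float *out)
    {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """

    a = np.random.rand(1024).astype(np.float32)
    b = np.random.rand(1024).astype(np.float32)

    ctx = cl.create_some_context()              # pick any available device
    queue = cl.CommandQueue(ctx)
    prg = cl.Program(ctx, KERNEL_SRC).build()   # the runtime "JIT" compile

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)
    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)
    ```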

    I also thought the list of sponsors and participants in Khronos OpenCL was very interesting. It seemed to include nearly everyone in the world, except one particularly large tech company that seemed mysteriously missing, unless I missed something. I guess they don't want to be easily replaced.
  12. Of course, "replace the OS" means replace the OS-specific ways of handling video and graphics. Some people make it seem like the whole notion of the OS will disappear - i.e., no Windows, MacOS, Linux etc(!)

    With Windows, the layer isn't really that big. The origins of Direct3D, DirectShow etc relate to direct access to the hardware, bypassing all the existing layers (such as GDI etc). It is quite lightweight.

    That said, OpenCL has the potential to really impact the video/graphics industry.
  13. Banned (joined Aug 2002, beautiful)
    Originally Posted by deadrats
    I would much rather they just took the P4, made it dual core, removed all SIMD capabilities, beefed up the FPUs to 3 per core (to match the ALUs) and made the L2 as big as they could. I have a gut feeling such a chip could smoke even the i7s, with 6 ALUs and 6 FPUs crunching away and a massive L2 to store everything.
    Agreed! My thoughts exactly.
    Unfortunately Intel will never make such a chip; there is less money in it (vs. chips with "their" technologies such as all that useless SIMD).
  14. Banned (joined Nov 2005, United States)
    Originally Posted by DereX888
    Originally Posted by deadrats
    I would much rather they just took the P4, made it dual core, removed all SIMD capabilities, beefed up the FPUs to 3 per core (to match the ALUs) and made the L2 as big as they could. I have a gut feeling such a chip could smoke even the i7s, with 6 ALUs and 6 FPUs crunching away and a massive L2 to store everything.
    Agreed! My thoughts exactly.
    Unfortunately Intel will never make such a chip; there is less money in it (vs. chips with "their" technologies such as all that useless SIMD).
    The thing is that Intel's business model, as it relates to consumer grade CPUs, is to soak us for every penny they can as often as they can, i.e. the planned obsolescence/forced upgrade business model.

    A chip like the one outlined above would have a shelf life of years, as it would be more than enough for even most hardcore video encoders and gamers, but that wouldn't be good for Intel's bottom line.

    I think this is the biggest reason Intel decided not to move toward the Itanium as they had originally planned; they looked at the idea of going with the Itanium on the desktop and said "are we out of our minds?"

    Why give the end consumer a CPU that is a floating point monster, 64-bit (at the time we were still at 32-bit) and has only been through 2 iterations (the Itanium 1 and 2), when they can instead march us from the P3 to the P4 (which required a new socket, new RAM and a new chipset <-- more licensing fees for Intel), then to a hyperthreaded P4 (which didn't work on most existing boards and chipsets), then to a dual core P4 (same deal with boards and chipsets), then to the Conroe (another socket/chipset/board combo), then to a quad core (same deal again with chipsets and boards), then to an upgraded Conroe (Penryn) with still different chipsets and boards, and finally to the i7, which also used a different socket, memory type and chipset (again, more licensing fees for Intel) and new boards?

    The thing that really pisses me off, and that really makes me buy AMD instead of Intel, is that the Itanium 2, which was released a while ago, is still faster than even the fastest i7s, and even the original Itanium, running at significantly lower clock speeds, is capable of beating the i7 under most circumstances.

    So basically we, the consumer grade chumps, get Intel's scraps of a CPU while the enterprise market gets the really good stuff. Yeah, I know the Itaniums are expensive, as are the boards and the memory, but when you add up all the upgrades on the consumer side, it's still cheaper by quite a bit.

    Note: for those that doubt the Itanium's performance, here are benchmarks running SPECjbb2005:

    http://www.intel.com/products/processor/itanium/specifications.htm?iid=Itanium+tab_specifications

    http://www.intel.com/performance/server/itanium/summary.htm?iid=itanium+body_performance

    http://www.intel.com/performance/server/xeon_mp/server.htm?iid=perf_server_lhn+mp_server

    The Xeon 7460 (6-core Penryn, aka Dunnington, 2.66 GHz, 16 MB L2) achieved a score of 531,669 business operations per second (bops); the Itanium (dual core, hyperthreaded, 667 MHz FSB, 24 MB L2, 1.66 GHz) achieved a score of, get this, 5,180,451 bops. That's almost 10 times as fast with 1/3 the cores and a 1 GHz lower clock speed.

    That means it would take almost 10 Xeon 7460s to match the performance of a single Itanium. Put another way, assuming a doubling of performance with every new CPU release, it would take more than 3 CPU generations before Intel graces the consumer market with a CPU as fast as what they offer the enterprise market today.

    Absolutely appalling.
  15. The P4 was Intel's worst version of the Pentium line. The PIII was far better but couldn't go as fast - until Intel developed the fabrication process required. History shows the P4 as a dead end and the PIII as the direct ancestor of current Intel chips.

    Planned obsolescence is essential to their business model. Without it, you'd still be using 8086s (or even Z80s). An extreme processor with multiple FPUs etc. would cost a fortune to develop, as would the new software and OSes needed to handle the disappearance of SIMD instructions (which would be as stupid as removing the FPU). And why develop such a thing if it provides the consumer with a processor with a useful life of x years?

    I find it quite laughable that people b-and-moan about a $50 CPU that just 5 years ago would have been out of reach for 99.9% of consumers. Ten years ago I was editing with a 500 MHz PIII Celeron that, at the time, cost $200 (the full PIII was about $400 and differed only in having L2 cache - which offered zero benefit). Its speed was amazing.

    Without the evil planned obsolescence, we'd have no $300 off-the-shelf dual-core, multigigahertz, multigigabyte computers.
  16. Banned (joined Nov 2005, United States)
    Originally Posted by JohnnyMalaria
    Planned obsolescence is essential to their business model. Without it, you'd still be using 8086s (or even Z80s). An extreme processor with multiple FPUs etc. would cost a fortune to develop, as would the new software and OSes needed to handle the disappearance of SIMD instructions (which would be as stupid as removing the FPU). And why develop such a thing if it provides the consumer with a processor with a useful life of x years?
    This is absolutely not true, on any level. The only one the "planned obsolescence" business model benefits is Intel, not the consumers.

    Other chips, aimed at different markets and made by different manufacturers, marketed using business models other than the "forced upgrade"/"planned obsolescence" model, resulted in vastly superior chips:

    IBM's POWER series
    Sun's UltraSPARCs (a 33 MHz UltraSPARC was capable of smoking a 400 MHz P2)
    and even Intel's own Itanium

    As for costing a fortune to develop: developing desktop CPUs and the new manufacturing processes that go along with each new generation already costs a fortune. Not to mention said companies have already spent the fortune and developed the CPUs I have described, i.e. Sun's T2, IBM's POWER6 and Intel's Itanium.

    Furthermore, I think we could survive the disappearance of SIMD quite well: the Itanium 1 didn't have any SIMD capabilities and Windows 2000 ran fine on it; the SPARC line of CPUs doesn't have SIMD and Linux, Unix and Solaris seem to run just fine; current Itaniums don't have any SIMD units, yet Linux and Unix run just fine on them; MMX and 3DNow! aren't used at all, yet Windows runs just fine.

    Make no mistake, SIMD is nothing more than a way of tying software to a particular hardware platform, and in order to make it enticing to developers, CPU manufacturers spend more on their own proprietary SIMD development than on FPU improvements.

    I remember back when the P3 had SSE and AMD's Athlon didn't; there was some SSE-optimized software that didn't run on AMD's chips (or Intel's older, non-SSE capable chips), and needless to say Intel greatly encouraged the use of SSE. The end result was that if you wanted to run said software, you had no choice but to either upgrade from Intel's non-SSE chips or switch from AMD's CPU to a P3.

    As for me, I hate being forced to do anything, much less buy a certain company's CPU because they stacked the deck and made sure that developers used their own proprietary technology rather than an open, cross-platform standard.

    That's also one of the reasons I hope GPGPU takes off. It will give consumers tons of computing power, more than any CPU on the market in the foreseeable future, and it would work no matter which company's video card or which CPU you used; just lots of processing horsepower.

    Originally Posted by JohnnyMalaria
    I find it quite laughable that people b-and-moan about a $50 CPU that just 5 years ago would have been out of reach for 99.9% of consumers. Ten years ago I was editing with a 500 MHz PIII Celeron that, at the time, cost $200 (the full PIII was about $400 and differed only in having L2 cache - which offered zero benefit). Its speed was amazing.
    Or you could have bought a 64-bit UltraSPARC clocked at 200 MHz that would have taken an 800 MHz P3 to school.

    Originally Posted by JohnnyMalaria
    Without the evil planned obsolescence, we'd have no $300 off-the-shelf dual-core, multigigahertz, multigigabyte computers.
    It does you no good to be able to buy a $300 computer that in less than 6 months is obsolete and needs an upgrade. It's much smarter to spend $1000 on a computer that will have a decent shelf life and won't need any upgrades for a while.

    The argument you make could be made about cars, shoes, you name it: I can walk into a Kmart or a C.H. Martin and buy a $15 pair of shoes that will last me all of 3 months and will not be all that comfortable, or I can spend $150 and buy a pair of Rockports, which are probably the most comfortable shoes out there and last for years (I have 2 pairs of Rockports that are over 10 years old and they are still more comfortable and durable than any new shoe I own).
  17. It does you no good to be able to buy a $300 computer that in less than 6 months is obsolete and needs an upgrade.
    Depends. My workhorse PC is a 3-year-old Pentium D 2.8 and serves my purposes admirably - high performance software development mainly, including handcoded SSE2. It cost $500 and has earned its keep many times over.

    Just because faster hardware comes along, it doesn't mean you have to be a sheep.

    Most people still use Windows XP, on machines at least 2 or 3 years old.

    For the processor market, the $300 machines and their consumers bankroll the incremental development of existing CPU architectures. This permits new, high performance hardware and software while maintaining backwards compatibility and lowering development costs. To throw away that entire revenue stream would be corporate suicide. Billion dollar companies generally survive through product line extensions, not by starting from square one every year.

    The thing is that Intel's business model, as it relates to consumer grade CPUs, is to soak us for every penny they can as often as they can.
    Name a mass consumer-oriented industry that doesn't. Just look at the advertising on TV - especially at this time of year - 99.9% pure shite targeted at people who part with their money when told "this is the best ever" or "you must get it". Snuggie, ShamWow, anything Billy Mays shouts about, satellite internet, VOIP, cell phones that will tell you when someone in Cambodia has farted, cars because your 3-year-old car is a death trap, doesn't have 5.1 surround sound and is an embarrassment, 60-foot TVs and houses to accommodate them, "food", etc.

    You already have the choice if you want a higher performing CPU architecture. But - true to our capitalist ways - it will cost ya.
  18. Member tmw (joined Jan 2004, United States)
    What is doubly entertaining about this thread is the banner ad at the bottom:
    The Intel(r) Core (tm) i7 Extreme Edition: It's the best performing processor on the planet.(2).

    (2) Based on SPECint_rate_base2006* scores. Results have been estimated based on internal Intel analysis and are provided for informational purposes. See http://www.intel.com/performance/desktop/extreme/index.htm for additional information.
    Talk about relevant placement.
  19. Banned (joined Aug 2002, beautiful)
    Originally Posted by tmw
    What is doubly entertaining about this thread is the banner ad at the bottom:
    The Intel(r) Core (tm) i7 Extreme Edition: It's the best performing processor on the planet.(2).

    (2) Based on SPECint_rate_base2006* scores. Results have been estimated based on internal Intel analysis and are provided for informational purposes. See http://www.intel.com/performance/desktop/extreme/index.htm for additional information.
    Talk about relevant placement.
    LOL
    Good find!
  20. Member (joined Mar 2005, United States)
    Originally Posted by lordsmurf
    Computers are a continual process of reducing the bottlenecks.
    I don't upgrade anymore unless it improves the bottleneck.


