#231
While nVidia is obviously doing what's best for itself, users complaining about it is anything but childish; it's about what's good for us.
For example, if 99% of the games 5 years from now use PhysX and you are forced to buy nVidia cards at grossly inflated prices due to lack of competition, you'll understand why people are complaining now, in an effort to steer things the customer's way while it's still early on.
#232
Quote:
So:
1st: blame the programmers
2nd: blame AMD for not having something similar
3rd: blame nVidia only if they OWN the programming studios.

Last edited by swiss; 11-15-2010 at 08:30 PM.
#233
Quote:
Also I'd only see a problem if a game NEEDED PhysX, rather than just benefitted from it, which I think is unlikely. If PhysX became an important factor in the sales of Nvidia cards, the competitors would probably come out with their own solution... I'm not surprised that they haven't yet.
__________________
DIY uni-joint / hall effect sensor stick guide: http://www.mycockpit.org/forums/cont...ake-a-joystick
#234
Well, first of all, I pointed out that both AMD and nVidia are guilty of anti-competitive practices, so the accusations that I'm part of the fanboisie are misplaced.
Secondly, I'd need to see a lot of evidence before I'd believe GPUs have some inherent advantage over CPUs for physics calculations; physics engines have been incorporated in numerous games for years, and hardly any games are CPU-limited on even the most basic machines. Il-2 has its own rigid body model for crashes, for example, one of many innovations.

People need to move past brand loyalty and see attempts to control the market for what they are: monopoly exploitation that will hurt consumers in the long run. PhysX, CUDA etc. are just attempts to balkanise the industry in exactly the way Netscape and Microsoft tried to with the internet. They took open standards like HTML and added proprietary extensions; the idea was that websites would look bad or simply break on their opponent's software. This had nothing to do with helping consumers and everything to do with gaining power over them.

I don't believe that AMD is more innocent than nVidia; it's just that these tricks serve the interests of the dominant player rather than the underdog. Two companies are already insufficient for proper competition. If either gets a lock on the market, everybody loses.

dduff
#235
Quote:
People used to use multiple CPUs to make fast computers; now they use multiple GPUs. http://en.wikipedia.org/wiki/GPGPU
CUDA is the nVidia name for it: http://en.wikipedia.org/wiki/CUDA
The AMD name for it is Stream: http://en.wikipedia.org/wiki/AMD_Fir...evelopment_Kit
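GPGPU just means offloading large, data-parallel workloads -- the same arithmetic applied to thousands of elements at once -- onto the graphics card. A rough Python/NumPy sketch of the idea (NumPy's vectorised array operation stands in here for the GPU's many parallel units; this is an illustration of the concept, not actual CUDA or Stream code, and all names are made up):

```python
import numpy as np

def update_velocities(velocities, dt=0.016, gravity=-9.81):
    """Apply gravity to every particle's y-velocity in one
    data-parallel step, instead of a per-particle loop.
    On a GPU (via CUDA or Stream), each element would be
    handled by a separate hardware thread."""
    velocities = velocities.copy()
    velocities[:, 1] += gravity * dt
    return velocities

# 100k particles, each with a vx/vy/vz velocity, all updated at once.
particles = np.zeros((100_000, 3))
stepped = update_velocities(particles)
```

The point of contention in the thread is exactly this kind of workload: it parallelises trivially, which is why physics middleware is an attractive showcase for GPU vendors.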
#236
Quote:
#237
From Wikipedia.
On anti-competitive practices:

Versions 186 and newer of the ForceWare drivers disable PhysX hardware acceleration when a GPU from a different manufacturer, such as AMD, is present in the system.[14] Representatives at Nvidia stated to customers that the decision was made due to development expenses, and for quality assurance and business reasons.[11][15] This decision has caused a backlash from the community that led to the creation of a community patch for Windows 7, circumventing the GPU check in Nvidia's updated drivers. Nvidia also implemented a time bomb in versions 196 and 197 which slowed down hardware-accelerated PhysX and reversed the gravity, leading to unwanted physical effects[16] - which was again remedied by the updated version of the community patch.[17]

On 5 July 2010, Real World Technologies published an analysis[21] of the PhysX architecture. According to this analysis, most of the code used in PhysX applications is based on x87 instructions without any multi-threading optimization. This could cause significant performance drops when running PhysX code on the CPU. The article suggests that a PhysX rewrite using SSE instructions may substantially lessen the performance discrepancy between CPU PhysX and GPU PhysX.

In response to the Real World Technologies analysis, Mike Skolones, product manager of PhysX, said[22] that SSE support has been left behind because most games are developed for consoles first and then ported to the PC. As a result, modern computers run these games faster and better than the consoles even with little or no optimization. Senior PR manager of Nvidia, Bryan Del Rizzo, explained that multi-threading is already available with CPU PhysX 2.x and that it is up to the developer to make use of it.
Automatic multi-threading and SSE will be introduced with version 3 of the PhysX SDK.[23]

It's hard to make sense of Mike Skolones' comment that "modern computers run these games faster and better than the consoles" because "most games are developed for consoles first and then ported to the PC". Some forms of physics modelling are suitable for parallelisation and some are not. I don't recall Havok-based games running into CPU bottlenecks. In fact, CPU-limited games are rarer than hen's teeth.

This conversation badly needs to get away from the nVidia vs AMD thing. When Apple were underdogs, they complained bitterly about Microsoft's dirty tricks; now they're on top, they're as bad as Microsoft ever were. Microsoft haven't gotten much better either. Back when Netscape was on top in internet applications, it wrestled with all its might for power over consumers; it only started complaining when it lost out to Microsoft. Such practices are nearly universal among companies that have the power to carry them out.

dduff
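The multi-threading complaint above comes down to work division: a physics step over N independent bodies can be split into chunks handed to separate cores, which is exactly what CPU PhysX 2.x reportedly leaves to the developer and SDK 3 promises to automate. A hedged Python sketch of that chunking structure (all function names are invented for illustration; in Python, the GIL means real speedups need native code or processes, so this shows the shape of the work division, not the performance gain):

```python
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(positions, velocities, dt=0.016):
    """Advance one independent chunk of bodies by one timestep.
    Chunks share no state, so they could run on separate cores."""
    return [p + v * dt for p, v in zip(positions, velocities)]

def step_world(positions, velocities, dt=0.016, workers=4):
    """Split the body list into chunks and integrate them
    concurrently -- the division of labour a developer would
    otherwise have to set up by hand."""
    n = len(positions)
    size = (n + workers - 1) // workers
    chunks = [(positions[i:i + size], velocities[i:i + size])
              for i in range(0, n, size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda c: integrate_chunk(*c, dt=dt), chunks)
    return [p for chunk in results for p in chunk]

positions = [float(i) for i in range(8)]
velocities = [1.0] * 8
stepped = step_world(positions, velocities)
```

The SSE question is orthogonal to this: SSE parallelises arithmetic *within* one core (several floats per instruction), while threading spreads chunks *across* cores; the Real World Technologies critique is that stock PhysX on the CPU used neither.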
#238
The supercomputers referred to run a restricted set of programmes over and over again. The performance figures referred to apply only to these specially tailored programmes which are suitable for parallelisation. Many software applications, including numerous physics applications, cannot be parallelised in this way. That many others can is neither here nor there -- CPUs don't struggle with modern games.
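The limit being described here is Amdahl's law: if only a fraction p of a program's work can be parallelised, the speedup from n processors is capped at 1 / ((1 - p) + p/n), no matter how large n grows. A quick sketch of the arithmetic (the workload fractions below are invented for illustration):

```python
def amdahl_speedup(p, n):
    """Maximum speedup with n processors when only fraction p
    of the work is parallelisable (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# A supercomputer workload that is 99% parallel scales well
# with processor count...
print(amdahl_speedup(0.99, 1024))

# ...but a hypothetical game loop that is only 30% parallel
# barely benefits even with unlimited cores: the ceiling is
# 1 / 0.7, about 1.43x.
print(amdahl_speedup(0.30, 1_000_000))
```

This is why benchmark figures from specially tailored parallel programmes say little about how an ordinary game would fare.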
Reference to supercomputers is in any case of no relevance to a discussion about computer games.

dduff
#239
I still don't get it:
(all from wiki) Quote:
Give them one reason why they should spend a single nickel to optimise it for CPUs they don't sell, or for ATI cards, which are a competitor's product. Quote:
It's not like NV is an NPO - they paid for it too. Maybe ATI wants to shove some green over? Since neither of the two is really dominating the market, tell ATI to remove the finger and give it a go themselves. also: Quote:
#240
Oh God...
You've just ignored most of the points made, as well as the numerous comparisons drawn with other companies that are uncontroversially held to have engaged in anti-competitive practices in the past.

The time bomb is inexcusable -- it's an attempt to make competitors' hardware look like it's malfunctioning when in fact it's nVidia sabotaging its own equipment. The disabling driver is equally inexcusable: the people affected own perfectly functioning PhysX-capable nVidia cards, but because the driver detects an AMD card also in the system, it shuts down. This isn't "nVidia's driver, nVidia's rules"; it's "gamers' cards, nVidia's rules". People shelled out on secondary nVidia cards only for nVidia to sabotage them after the fact with a driver "update". A tiny bit of attention paid to nVidia's excuses reveals them to be plain BS in each case.

If you're going to persist with the debate, please address the parallels drawn with the behaviour of Apple, Microsoft and Netscape described above.

dduff