Richey79

PhysX


That is a lie...I dare you to prove it!

(a link to realworldtechnologies would be an own goal from your side...do some digging ;))

And you are blaming the wrong people.

Why should gaming stand still, just because AMD can't get their own GPU physics up and running?

Cart before horse...not bright.

http://www.tgdaily.com/hardware-features/50554-does-physx-diminish-cpu-performance

Nvidia hobbles CPU PhysX performance by forcing CPU PhysX through the x87 instruction set. In case you weren't aware, x87 dates back to the 8087 coprocessor from 1980. 1980! So yeah... I'd say forcing a 30-year-old instruction set on CPU PhysX is a bit crippling...

Off Topic: Mafia 2's hardware-accelerated PhysX actually ran marginally well on medium for ATI users, if they went in and deleted the ApexCloth file. APEX is a lightweight subset of PhysX that, again, is purposely being used to boost the image of green team GPUs, despite the fact that CPUs could run the calculations just fine if they were calling on SSE4 instructions.

---------- Post added at 05:01 PM ---------- Previous post was at 04:58 PM ----------

It's a simple matter of performance (or the lack of performance).

Would you like a setting that enabled the full PhysX effects on the CPU...if that meant you had to play at 3-4 FPS, because your CPU simply isn't up to the task? :)

GPU > CPU for SIMD

A GPU creams a CPU in physics calculations...those are the hard facts.

This is absolutely true, but not on the scale that Nvidia makes most believe. Proof? http://www.havok.com/index.php?page=showcase&hl=en_US

Look at any of those videos, and after watching, remind yourself that every bit of that was CPU calculated physics.
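To make the "a modern CPU with SSE can handle this" claim concrete, here is a minimal sketch of my own (not taken from any of the linked articles): a toy physics-style update written with SSE intrinsics, integrating four particles per instruction instead of one. This is the kind of vectorised CPU path the x87 complaint says PhysX avoids.

```cpp
// Illustration only: integrating particle heights four at a time with SSE.
// Build e.g.: g++ -std=c++11 -O2 -msse particles.cpp
#include <xmmintrin.h>   // SSE intrinsics
#include <cstdio>

int main() {
    const int N = 1024;                        // particle count (multiple of 4)
    alignas(16) float posY[N], velY[N];
    for (int i = 0; i < N; ++i) { posY[i] = 100.0f; velY[i] = 0.0f; }

    const __m128 dt      = _mm_set1_ps(1.0f / 60.0f);
    const __m128 gravity = _mm_set1_ps(-9.81f);

    for (int step = 0; step < 60; ++step) {            // simulate one second at 60 Hz
        for (int i = 0; i < N; i += 4) {                // 4 particles per iteration
            __m128 v = _mm_load_ps(&velY[i]);
            __m128 p = _mm_load_ps(&posY[i]);
            v = _mm_add_ps(v, _mm_mul_ps(gravity, dt)); // v += g * dt (4 lanes at once)
            p = _mm_add_ps(p, _mm_mul_ps(v, dt));       // p += v * dt
            _mm_store_ps(&velY[i], v);
            _mm_store_ps(&posY[i], p);
        }
    }
    std::printf("particle 0 height after 1 s: %.2f\n", posY[0]);
    return 0;
}
```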

http://www.tgdaily.com/hardware-features/50554-does-physx-diminish-cpu-performance

Nvidia hobbles CPU PhysX performance by forcing CPU PhysX through the x87 instruction set. In case you weren't aware, x87 dates back to the 8087 coprocessor from 1980. 1980! So yeah... I'd say forcing a 30-year-old instruction set on CPU PhysX is a bit crippling...

Off Topic: Mafia 2's hardware-accelerated PhysX actually ran marginally well on medium for ATI users, if they went in and deleted the ApexCloth file. APEX is a lightweight subset of PhysX that, again, is purposely being used to boost the image of green team GPUs, despite the fact that CPUs could run the calculations just fine if they were calling on SSE4 instructions.

Hurr x86 is terrible stuff from 1978 right? PowerPC was so great from 1992 that it has all of the desktop share today right?

Fact is, this PhysX thingy is dividing the community into two classes:

the poor and the rich, those who have and have not :p

(even if it's just the luxury of the feeling of being 'on the safe side' for nVidia users :rolleyes:)

Or rather, the educated and the fanboys. Nvidia has had some awesome video performance in the last 6 years. But it's been proven time and time again that this generation, the highest-performing ATI cards are indeed faster than the highest-performing Nvidia cards, while still being cheaper. This is also why ATI recently took over the discrete market share majority.


Just to be on the safe side, I'm switching to a GTX 560...

Hurr x86 is terrible stuff from 1978 right? PowerPC was so great from 1992 that it has all of the desktop share today right?

x86 is a baseline 32-bit instruction set, and its evolution was the x64 instruction set. x87's evolution is the SSE, SSE2, SSE3, and SSE4 instruction sets. x64 requires more memory than the x86 instruction set, to address the extra bits in each cycle. Not only that, x86 still sees widespread use across nearly every industry rather than just the gaming industry, so backwards compatibility is a must. SSE variants have no such limitation, and the only reason NOT to use SSE instruction sets is backwards compatibility. Considering that Nvidia is in absolutely zero danger of anyone using their PhysX API on a 15-year-old or older processor, using x87 only points to ONE THING. In other words: your snarky one-liner argument holds no weight.
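For what it's worth, on the compiler side the x87-versus-SSE choice is literally a code-generation flag; here's a small sketch of my own (standard GCC/g++ options, nothing PhysX-specific) showing the same scalar code targeting either instruction set:

```cpp
// Illustration only: the same scalar floating-point source can be compiled to
// legacy x87 stack instructions or to SSE scalar instructions by compiler flags
// alone; no source changes are needed.
//
//   g++ -O2 -m32 -mfpmath=387 fp.cpp         ->  fld / fmul / faddp / fstp  (x87)
//   g++ -O2 -m32 -msse2 -mfpmath=sse fp.cpp  ->  mulss / addss              (SSE)
//   (on x86-64, SSE is already the default for scalar float math)
#include <cstdio>

// A toy "physics" step: integrate one particle's velocity and position.
static void step(float& pos, float& vel, float accel, float dt) {
    vel += accel * dt;
    pos += vel * dt;
}

int main() {
    float pos = 100.0f, vel = 0.0f;
    for (int i = 0; i < 60; ++i)
        step(pos, vel, -9.81f, 1.0f / 60.0f);
    std::printf("height after 1 s: %.2f\n", pos);
    return 0;
}
```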


http://www.tgdaily.com/hardware-features/50554-does-physx-diminish-cpu-performance

Nvidia hobbles CPU PhysX performance by forcing CPU PhysX through the x87 instruction set. In case you weren't aware, x87 dates back to the 8087 coprocessor from 1980. 1980! So yeah... I'd say forcing a 30-year-old instruction set on CPU PhysX is a bit crippling...

Off Topic: Mafia 2's hardware-accelerated PhysX actually ran marginally well on medium for ATI users, if they went in and deleted the ApexCloth file. APEX is a lightweight subset of PhysX that, again, is purposely being used to boost the image of green team GPUs, despite the fact that CPUs could run the calculations just fine if they were calling on SSE4 instructions.

I knew that FUD would come.

It's been debunked ages ago:

http://scalibq.wordpress.com/2010/09/17/physx-an-easy-target/

http://forums.anandtech.com/showpost.php?p=30102764&postcount=46

And it is no longer true anyway (even leaving the debunking aside):

http://physxinfo.com/news/5671/physx-sdk-3-0-has-been-released/

Like I figured...ignorance and lies are the opponents of PhysX.

---------- Post added at 05:01 PM ---------- Previous post was at 04:58 PM ----------

This is absolutely true, but not on the scale that Nvidia makes most believe. Proof? http://www.havok.com/index.php?page=showcase&hl=en_US

Look at any of those videos, and after watching, remind yourself that every bit of that was CPU calculated physics.

You think simple rigid-body physics will impress me?

Come back when Havok does this (tearable cloth), or this, or this:

[three embedded PhysX demo videos, not preserved in this export]

Ignorance is a bad foundation.

Please don't waste my time with more ignorance.

---------- Post added at 11:14 PM ---------- Previous post was at 11:13 PM ----------

Or rather, the educated and the fanboys. Nvidia has had some awesome video performance in the last 6 years. But it's been proven time and time again that this generation, the highest-performing ATI cards are indeed faster than the highest-performing Nvidia cards, while still being cheaper. This is also why ATI recently took over the discrete market share majority.

Document this..or I will call you the fanboy..and put your ramblings on ignore.


This Nvidia vs AMD fight is interesting, but if PhysX is applied on the CPU, all PC users can have the benefit of the technology whatever their GPU brand is.

Nvidia card users can perhaps then benefit from extra PhysX acceleration from their GPU.

Have a look at the system requirements; they clearly show a probable jump in the level of CPU needed:

ESTIMATED SYSTEM REQUIREMENTS

OS - Windows 7 / Vista

CPU - Intel Core i5 or AMD Athlon Phenom X4 or faster

GPU - Nvidia Geforce GTX 260 or ATI Radeon HD 5770, shader Model 3 and 896 MB VRAM, or faster

RAM - 2 GB

HDD - 15 GB free space

DVD - Dual Layer compatible

DirectX® - 10

This Nvidia vs AMD fight is interesting, but if PhysX is applied on the CPU, all PC users can have the benefit of the technology whatever their GPU brand is.

Nvidia card users can perhaps then benefit from extra PhysX acceleration from their GPU.

Have a look at the system requirements; they clearly show a probable jump in the level of CPU needed:

ESTIMATED SYSTEM REQUIREMENTS

OS - Windows 7 / Vista

CPU - Intel Core i5 or AMD Athlon Phenom X4 or faster

GPU - Nvidia Geforce GTX 260 or ATI Radeon HD 5770, shader Model 3 and 896 MB VRAM, or faster

RAM - 2 GB

HDD - 15 GB free space

DVD - Dual Layer compatible

DirectX® - 10

There is a reason too why AMD really, really, really, really, really, really, really, really, really, really, really, really, really, really would like to have GPU physics too:

Performance.

It's like night and day between the CPU and GPU in physics.


But you can't cut AMD/ATI owners off from the benefits of playing the game, unless you think the BIS guys are dummies.

They know quite well who their customers are and do not intend to push AMD/ATI owners - as I am ... 6970 powa! - out of the game.

But you can't cut AMD/ATI owners off from the benefits of playing the game, unless you think the BIS guys are dummies.

They know quite well who their customers are and do not intend to push AMD/ATI owners - as I am ... 6970 powa! - out of the game.

Why should the minority dictate the performance for the majority?

Don't blame BIS...blame AMD...for doing nothing but talk about GPU-physics.


I am blaming nobody; you will get a playable game whatever your GPU is, provided you get a powerful CPU.

Performance in ArmA was already quite CPU-dependent due to the way the game is built. You can see there is not so big a jump in GPU requirements as there is in CPU requirements; from what I understand, it means that the PhysX in ArmA 3 will be a CPU thing, perhaps needing a full thread on the required 4-core CPU.

Nvidia has been working this way for years. Of course they tend to make you believe, in their commercials, that an Nvidia GPU is needed, but they have already shown they can be damn adaptable, even selling SLI licences for future AMD motherboards; it's called realism ;)


I am blaming nobody; you will get a playable game whatever your GPU is, provided you get a powerful CPU.

Performance in ArmA was already quite CPU-dependent due to the way the game is built. You can see there is not so big a jump in GPU requirements as there is in CPU requirements; from what I understand, it means that the PhysX in ArmA 3 will be a CPU thing, perhaps needing a full thread on the required 4-core CPU.

I certainly hope that BIS hasn't botched their PhysX implementation so that they only offer CPU physics...
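For context, this is roughly how a PhysX 3.x scene gets wired to the CPU: the application hands the SDK a CPU dispatcher with a few worker threads, and GPU acceleration is an optional extra on top of that. A generic sketch based on the public SDK samples, not anything BIS has shown:

```cpp
// Rough sketch of PhysX 3.x CPU-side setup (patterned on the public SDK samples;
// not ArmA 3 code). The scene runs entirely on CPU worker threads unless a GPU
// dispatcher is explicitly provided as well.
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(3);   // e.g. 3 worker threads on a quad-core
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    // (A GPU dispatcher obtained from a CUDA context manager could be set here
    //  as well; that is the optional hardware-accelerated path.)
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation at 60 Hz for one second.
    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);   // block until the step is finished
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```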


But just because they're using PhysX doesn't mean that they will use it at the more extreme levels where you really need the GPU-based hardware. Regular ragdoll effects and basic vehicle physics won't stress the CPU much at all for ATI/AMD owners. It's the more advanced effects like organic physics, particles, massive destruction and wind/water that would need a GPU for calculations. I trust BIS will find some sort of balance here, which won't leave ATI/AMD users out as much as you might think.

I mean, I don't really care about realistic waving flags or advanced shattered glass and stuff like that, which we got to see in Mirror's Edge for example with the GPU physics activated. All I want is realistic physics on my grenades and vehicles, and some sort of ragdoll effect on animations.

Why should the minority dictate the performance for the majority?

Don't blame BIS...blame AMD...for doing nothing but talk about GPU-physics.

They aren't entirely all talk: http://bulletphysics.org

They aren't entirely all talk: http://bulletphysics.org

Yes they are.

Bulletphysics != AMD

Bullet 3.0 (GPU physics) is still not out = just talk.

They have adopted ATi's line...they promised (back in 2006) GPU physics in a few months...nothing has happened...except a lot of FUD, talk and empty PR.

Why do the people who think PhysX is BAD...and that AMD is just around the corner with their own GPU physics solution...always show a lack of insight into the topic?

---------- Post added at 11:56 PM ---------- Previous post was at 11:54 PM ----------

But just because they're using PhysX doesn't mean that they will use it at the more extreme levels where you really need the GPU-based hardware. Regular ragdoll effects and basic vehicle physics won't stress the CPU much at all for ATI/AMD owners. It's the more advanced effects like organic physics, particles, massive destruction and wind/water that would need a GPU for calculations. I trust BIS will find some sort of balance here, which won't leave ATI/AMD users out as much as you might think.

I mean, I don't really care about realistic waving flags or advanced shattered glass and stuff like that, which we got to see in Mirror's Edge for example with the GPU physics activated. All I want is realistic physics on my grenades and vehicles, and some sort of ragdoll effect on animations.

Anything that makes my favourite mil-sim more REALISTIC is a bonus for me.

You are advocating less realism...because AMD is doing nothing.

If you want less realism...you can always play CoD...problem solved.


Well, I'm guessing the PhysX will be like in Mafia II: only special effects that have no impact on gameplay and are just eye candy. Probably PhysX will be disabled in multiplayer and will be single player only. Though I hope I'm wrong and they do something innovative with PhysX in gameplay.

I'm on AMD and I don't see why so many are against PhysX; it's something you can turn off anyway, or run on your processor. Computing has always been about the more money you spend on your system, the more eye candy you get; if you want a level playing field, buy a console.

My two cents on PhysX is that I'm glad for the use of newer technologies and wish more developers would take advantage of newer software and hardware. Also, this is an excuse for me to justify spending hundreds if not thousands on my computer.


Those elitist rants of some nVidia fanboys are hardly bearable anymore :o:

Yes they are.

Bulletphysics != AMD

Bullet 3.0 (GPU physics) is still not out = just talk.

They have adopted ATi's line...they promised (back in 2006) GPU physics in a few months...nothing has happened...except a lot of FUD, talk and empty PR.

Bullet isn't AMD's the way PhysX is nVidia's, but they did hire the guy, which was my point. And Bullet already has hardware acceleration, so your statement isn't true.


There are many ways to handle physics in a game, such as Havok, Nvidia PhysX or Bullet.

BIS has chosen the Nvidia solution as a CPU solution.

All players, whatever their GPUs, are going to play ArmA 3.

But there is no reason to flame people because they have not bought an Nvidia card; the CoD remark was irrelevant and useless.

Those elitist rants of some nVidia fanboys are hardly bearable anymore :o:

Not as sad as the whining of AMD's fans over features they can't run...

---------- Post added at 12:09 AM ---------- Previous post was at 12:07 AM ----------

Bullet isn't AMD's the way PhysX is nVidia's, but they did hire the guy, which was my point. And Bullet already has hardware acceleration, so your statement isn't true.

Since the Bullet 3.0 SDK isn't out, it's hard to code for...right?

I have seen tech demos from Ati/AMD since 2006.

I am tired of tech demos...put up or shut up, AMD...


@terracide: You are going the wrong way trying to start an ATI-AMD/Nvidia war here; if you keep going like this, I will report you so you can cool off a bit.


Oh no...not a Capulets and Montagues war over video cards :(

I'm curious whether any VBS owners have already witnessed first hand PhysX's role, in terms of eye-candy fluff or more substantial features, and whether ATI users were missing out?

Not as sad as the whining of AMD's fans over features they can't run...

---------- Post added at 12:09 AM ---------- Previous post was at 12:07 AM ----------

Since the Bullet 3.0 SDK isn't out, it's hard to code for...right?

I have seen tech demos from Ati/AMD since 2006.

I am tired of tech demos...put up or shut up, AMD...

I guess you misunderstood, 2.xx has hardware acceleration.

I guess you misunderstood, 2.xx has hardware acceleration.

The last time I spoke with Erwin Coumans, he stated that GPU support wouldn't come until Bullet SDK 3.x.

I would like to see your source for GPU acceleration in SDK 2.x?


FYI, Bullet SDK 2.77 and 2.78 support OpenCL on both NVIDIA and AMD, afaik...

The first integration attempts started around 2.74/2.75, if I remember correctly.
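For the curious, the standard Bullet 2.x rigid-body pipeline is plain CPU code; the OpenCL work mentioned above lives in separate experimental solvers. A minimal sketch of the usual CPU setup, following the pattern of Bullet's own HelloWorld demo (shortened, so treat it as illustrative):

```cpp
// Minimal Bullet 2.x rigid-body setup (CPU path). This mirrors the pattern in
// Bullet's HelloWorld demo; the OpenCL-accelerated solvers in 2.7x are separate,
// experimental components and are not used here.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Core CPU pipeline: collision config, dispatcher, broadphase, constraint solver.
    btDefaultCollisionConfiguration     collisionConfig;
    btCollisionDispatcher               dispatcher(&collisionConfig);
    btDbvtBroadphase                    broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &collisionConfig);
    world.setGravity(btVector3(0, -9.81f, 0));

    // One falling sphere: 1 kg, radius 0.5 m, starting 10 m up.
    btSphereShape shape(0.5f);
    btVector3 inertia(0, 0, 0);
    shape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motionState(btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo bodyCI(1.0f, &motionState, &shape, inertia);
    btRigidBody body(bodyCI);
    world.addRigidBody(&body);

    // Simulate one second at 60 Hz and print the final height.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f, 10);

    btTransform trans;
    body.getMotionState()->getWorldTransform(trans);
    std::printf("sphere height after 1 s: %.2f\n", trans.getOrigin().getY());

    world.removeRigidBody(&body);
    return 0;
}
```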

