Richey79

PhysX


Eh, whenever any drop of intel is given, be it photo, video or text, the community will always discuss and debate it in great detail. It has always been that way.

Edited by NodUnit

Anything that makes my favourite mil-sim more REALISTIC is a bonus for me.

You are advocating less realism...because AMD is doing nothing.

If you want less realism...you can always play CoD...problem solved.

Maybe you should think more before trolling.

I'm definitely not advocating less realism. I'm just saying that most of the PhysX functions that would actually matter in a game like ArmA 2 won't need a dedicated GPU for good performance, like character and vehicle physics, or basic destruction of buildings and such. So that part should not be a problem for people who don't own an Nvidia card. As already mentioned, there has to be some minimum physics setting in order to sync and balance multiplayer for everyone, so the most logical thing for BIS to do is simply have that as an option, and then add higher PhysX settings for Nvidia users which would look cool but don't really change the gameplay itself. This is how it's been done for a lot of PhysX games so far anyway, and that's probably how it will be for ArmA 3 as well. So you AMD users shouldn't worry.
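The split described above can be sketched in code. This is a hypothetical illustration in Python; every name here (GAMEPLAY_BASELINE, effective_features, and so on) is invented, and neither BIS nor the PhysX SDK literally works this way. The point is just that a fixed gameplay baseline keeps multiplayer in sync, while cosmetic tiers can vary per client without affecting the shared simulation.

```python
# Hypothetical sketch: a fixed gameplay-physics baseline every client runs
# (so multiplayer stays in sync), plus optional cosmetic tiers that may
# differ per machine. All names are invented for illustration.

GAMEPLAY_BASELINE = {"vehicle_physics", "character_physics", "building_destruction"}
COSMETIC_TIERS = {
    "low": set(),
    "high": {"cloth", "debris_particles", "volumetric_smoke"},
}

def effective_features(cosmetic_setting: str) -> set:
    """Baseline features are always on; cosmetic extras vary per client."""
    return GAMEPLAY_BASELINE | COSMETIC_TIERS[cosmetic_setting]

def sync_relevant(features: set) -> set:
    """Only baseline features affect the shared simulation state."""
    return features & GAMEPLAY_BASELINE
```

Two clients on different cosmetic settings still agree on everything that matters for sync, which is why the setting can safely be per-client.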

Personally I prefer as many functions and effects as possible, and I don't have any problem buying whatever hardware is most suitable for a game I play and work with all week long. So I'm definitely not saying I don't want the extras, just that I don't really care about the things that won't matter gameplay-wise as long as I get the things that DO matter.

And again, for AMD/ATI users there are hacked Nvidia drivers that let you use PhysX with a dedicated Nvidia card in a secondary PCI-E slot. If I decide to keep my HD 6990 for next year as well, I'll probably get myself a cheap Nvidia card with passive cooling for PhysX. And that's that.


I would very much prefer OpenCL-accelerated physics, Bullet or Havok, but most of all I'm excited that the RV engine is finally (apparently) going to get some decent physics.

Post your opinions about these two videos:

Havok and OpenCL on AMD GPU:

PhysX Cloth on Nvidia:


I had the same reaction as many other AMD users in this thread; however, I have read that the latest version of PhysX uses the SSE instruction set and automatically multi-threads on the CPU. As long as BIS focuses on PhysX's use on a CPU rather than an Nvidia GPU, I'll be happy.
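As a toy illustration of the CPU path mentioned above (the SSE part is a compiler and runtime concern, but the multi-threading idea can be sketched): the example below splits a simple gravity-integration step across worker threads. This is hand-rolled Python, not the PhysX SDK, and all names are invented.

```python
# Toy multi-threaded CPU physics step: partition the bodies into chunks and
# integrate each chunk on a worker thread. Not the PhysX SDK; illustration only.
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(positions, velocities, dt, g=-9.81):
    """Semi-implicit Euler step for one chunk of bodies (vertical axis only)."""
    out = []
    for y, vy in zip(positions, velocities):
        vy += g * dt            # update velocity first ...
        out.append((y + vy * dt, vy))  # ... then position (semi-implicit Euler)
    return out

def step_world(positions, velocities, dt, workers=4):
    """One simulation step, with the body list split across `workers` threads."""
    n = len(positions)
    chunk = max(1, n // workers)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        futures = [
            ex.submit(integrate_chunk, positions[i:i + chunk],
                      velocities[i:i + chunk], dt)
            for i in range(0, n, chunk)
        ]
        results = [pair for f in futures for pair in f.result()]
    return [p for p, _ in results], [v for _, v in results]
```

Because each chunk is independent within a step, this kind of partitioning is embarrassingly parallel, which is roughly why a physics engine can scale across CPU cores.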


And again, for AMD/ATI users there are hacked Nvidia drivers that let you use PhysX with a dedicated Nvidia card in a secondary PCI-E slot. If I decide to keep my HD 6990 for next year as well, I'll probably get myself a cheap Nvidia card with passive cooling for PhysX. And that's that.

There are, but they don't always work properly, and you are stuck using older versions of the drivers.

Personally, I wouldn't go near any kind of 'jerry rigging'.

As many others have said, you will always have the option to run PhysX on the CPU regardless of what GPU you are running.

At the end of the day, you can't have it both ways. AMD users are forever berating PhysX, and I find it highly amusing that the usual scoffing has been replaced by whinging as soon as the perceived lack of features extends to something they actually like.

There is nothing wrong with AMD IMHO but similarly, there is no way Nvidia will ever support GPU PhysX for AMD cards. Do you actually believe for one second it would be different if the shoe was on the other foot?

That would effectively mean that Nvidia would have to provide some level of support for a competitor's product which is never going to happen.

Edited by BangTail


If PhysX hadn't been sold to Nvidia and had instead been licensed multi-platform, we wouldn't have these problems and we could have heavy GPU-based physics. But since Nvidia owns the rights, no game developer (BIS included) will build a commercial game around more than strapped-on GPU physics options, because they can't afford to alienate a crowd.

In the end it comes down to not, "why use Physx?", but instead "why not use Physx?"

Physx is a physics engine, like every other physics engine out there. The bonus of Physx is that it grants nVidia users bonus features, those features being disabled if you don't use an nVidia card.

But AMD/ATi users still get standard CPU-based physics and minor GPU-based effects (e.g. grass movement).

NVidia users get bonus features. If any other physics engine was used then it would be standard CPU based physics without the extra features to nVidia users.

So basically saying that using Physx is unfair because of the nVidia advantage is like saying that it's unfair for one kid to starve while another eats, then solving the problem by taking the food away so they both starve.

Oh wow, this is the most inane argument I've seen here.

Did you read any of the arguments from me or the other opponents of PhysX at all, or are you just talking out of your ass? We're saying that if you want to add physics to the engine, use a universal API instead of one that only works for part of the market.

or are you just talking out of your ass?

And until BI confirms whether or not their PhysX implementation will actually use hardware acceleration, all you anti-PhysX'ers are talking out of your asses too ;) :rolleyes:


Not here to start a war, but I'm glad PhysX will be used. As for the issue around those with AMD graphics cards: as far as I understand it, the game will still run, but you may not get the full benefit of PhysX. Easy enough to fix; an Nvidia card isn't that expensive, and although I'm sure my next comment will make me unpopular, IMO AMD has never been a good choice anyway.

FYI, I'm not just saying that to be an Nvidia fanboy; I'll support whichever card offers the best deal, but IMO Nvidia is the safer bet when it comes to gaming. Sure, in the end there is no real difference in performance between Nvidia and AMD, but that's not the point. At most the difference from one card to another is a few measly FPS that most people will never notice anyway. What's 50 FPS vs 49 FPS? Not frickin' much.

I'd still choose Nvidia over AMD because they've been around longer and are more established, and there isn't a single game that won't run on their cards, and run well. I can't say the same for AMD. Sure, many games do support AMD, but there are some that don't, or that don't work as well on AMD because they've used something like PhysX that isn't fully supported.

What's more, in my own experience during a number of workstation rollouts for companies like BHP and other large companies, I've often seen problems with AMD chips that don't seem to happen with Nvidia. When it comes to installing new drivers, the AMD ones seem rather messy: easy enough when you know what you're doing, but nowhere near as smooth as Nvidia's. Nvidia's installer can automatically clean off your old drivers before proceeding with the new install; it auto-unpacks to your drive and auto-launches the setup. AMD's, at least in my experience, did none of those things: it didn't remove old drivers, and while it did unpack, it certainly didn't auto-launch the setup. Perhaps the laptops I've seen with AMD chips were bad because HP, Lenovo etc. did a poor job, but I'd still choose Nvidia because I know for sure it works, always.

Anyone care to name a game that wont work on nVidia?

But you can't cut AMD/ATI owners off from the benefits of playing the game, unless you think the BIS guys are dummies.

They know quite well who their customers are, and they do not intend to push AMD/ATI owners (as I am ... 6970 powa!) out of the game.

The specs clearly show ATI cards will work with the game, meaning they will have a plan!

Everyone just chill, the game is over a year away! I'm sure BIS know exactly what they are doing, so until they provide the details just keep hush-hush and stop fighting! :mad:


I know nothing of PhysX. My question is this:

Is PhysX used only for the 'eyecandy' type effects (destructibility, explosions, water, etc.)?

Or can it be used for the more substantive elements, like calculating bullet drop/deflection, penetration, hit mapping, etc.?

Edited by Westsailor

Oh wow, this is the most inane argument I've seen here.

Did you read any of the arguments from me or the other opponents of PhysX at all, or are you just talking out of your ass? We're saying that if you want to add physics to the engine, use a universal API instead of one that only works for part of the market.

Your 'arguments' are the same ones I've been hearing for years.

Who would have thought that one company would want to offer incentives to encourage customers to choose their products over a competitor's :rolleyes:

I don't have a lot of sympathy, tbh. Too many AMD fanboys have told me that PhysX sucks and would never be a factor in determining which card they'd purchase.

If it sucks so much and it wasn't a factor, why the sour grapes all of a sudden? It's not like you won't be able to play ArmA 3 or be prohibited from using the PhysX engine. The only limitation will be that you will not have GPU physics.

If you're not happy with that, you can always buy an Nvidia card. It's not like ArmA 3 is releasing tomorrow.

Edited by BangTail

Oh wow, this is the most inane argument I've seen here.

Did you read any of the arguments from me or the other opponents of PhysX at all, or are you just talking out of your ass? We're saying that if you want to add physics to the engine, use a universal API instead of one that only works for part of the market.

Yep, PhysX works on all parts of the market; it's just that not every feature works outside of Nvidia hardware. I don't know of a physics engine that fully utilises hardware physics on both ATI and Nvidia cards, but if you have one in mind that's easy to implement into the RV engine, feel free to let BIS know.

I know nothing of PhysX. My question is this:

Is PhysX used only for the 'eyecandy' type effects (destructibility, explosions, water, etc.)?

Or can it be used for the more substantive elements, like calculating bullet drop/deflection, penetration, hit mapping, etc.?

I would explain it to you, but I'm waaaayyyyyy too lazy to type up an essay right now. Even too lazy to copy and paste from Wikipedia.

But hey, here is a link: PhysX

Also, bullet drop, deflection and penetration are already calculated by the CPU. It's pretty much part of the RV engine, if I'm not mistaken. So PhysX shouldn't have to calculate any of that.

Also, it's funny how mad ATI users are getting over here. If PhysX "can be" used, then ATI users will have to use their CPU. What's so bad about that? Oh wait, I see the problem... You want your game to be faster. You naughty gamers ;)
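On the bullet-drop point above: that really is a cheap CPU calculation. Here is a minimal worked example, ignoring air drag (which a real engine would model); the function name and the flat-fire simplification are mine, not anything from the RV engine or PhysX.

```python
# Minimal flat-fire bullet drop, no air resistance: after travelling distance d
# at muzzle speed v, the round has fallen roughly 0.5 * g * (d / v)**2 metres.
G = 9.81  # gravitational acceleration, m/s^2

def bullet_drop(distance_m: float, muzzle_speed_ms: float) -> float:
    """Vertical drop in metres over `distance_m`, drag ignored."""
    t = distance_m / muzzle_speed_ms  # time of flight for a flat shot
    return 0.5 * G * t ** 2
```

For a 900 m/s round at 300 m this gives roughly half a metre of drop, and doubling the range quadruples it, which is why this stays trivial for a CPU even across hundreds of projectiles.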

Edited by Haystack15
Grammar

I know nothing of PhysX. My question is this:

Is PhysX used only for the 'eyecandy' type effects (destructibility, explosions, water, etc.)?

Or can it be used for the more substantive elements, like calculating bullet drop/deflection, penetration, hit mapping, etc.?

PhysX is an entire physics API. PhysX itself is the base standard for it, which runs the non-'eyecandy' stuff that you're talking about. Stuff like destruction calculations, vehicles, hit mapping, animations, ragdoll calculations, etc.

PhysX APEX, on the other hand, is a subsection of PhysX that deals specifically with the eyecandy. It handles things like cloth physics calculations, 100,000+ particle effects, volumetric FX, and debris clutter.

Realistically, without APEX stuff turned on, Nvidia and ATI users will have identical experiences. It will all work the same, no matter what.

With APEX capabilities on though, Nvidia users will have much better graphics due to the fancy effects that APEX enables that I explained above. Gameplay will remain the same, but it will look even better.
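The base-versus-APEX split described above can be sketched as a simple capability gate: the base simulation runs everywhere, and the cosmetic extras are only scheduled when a capable GPU is present. This is illustrative Python with invented task names, not the real APEX API.

```python
# Hypothetical frame planner: base PhysX work always runs on the CPU path;
# APEX-style cosmetic effects are added only when GPU PhysX is available.
# Task names are invented for illustration.
def plan_frame(gpu_physx_available: bool) -> list:
    tasks = ["rigid_bodies", "ragdolls", "vehicles", "hit_mapping"]  # base, everyone
    if gpu_physx_available:
        tasks += ["cloth", "particles_100k", "volumetric_fx", "debris"]  # APEX extras
    return tasks
```

The key property is that the GPU-enabled plan is a strict superset of the base plan, so gameplay-relevant work is identical for everyone.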

----

I'm not really all that upset by this. I have the capability to continue using my ATI cards while still taking advantage of GPU-calculated PhysX. But that still doesn't change the fact that, like someone said above, most ATI users won't go near the 'jerry-rigged' way of getting it to work like I have.

ATI recently overtook Nvidia in discrete GPU market share, which indicates that despite Nvidia relying so heavily on PhysX and CUDA as selling points for its cards, consumers are definitely more interested in price-to-performance and efficiency than in over-advertised, anti-consumer 'features' that alienate over half of a PC game's potential user base.

Make no mistake: PhysX will drastically improve the Arma engine, even without the fancy effects. But it's still going to piss a lot of people off anyway, when there are comparable inter-compatible solutions out, or coming out in the near future, that could have been used instead.

PhysX is an entire physics API. PhysX itself is the base standard for it, which runs the non-'eyecandy' stuff that you're talking about. Stuff like destruction calculations, vehicles, hit mapping, animations, ragdoll calculations, etc.

PhysX APEX, on the other hand, is a subsection of PhysX that deals specifically with the eyecandy. It handles things like cloth physics calculations, 100,000+ particle effects, volumetric FX, and debris clutter.

Realistically, without APEX stuff turned on, Nvidia and ATI users will have identical experiences. It will all work the same, no matter what.

With APEX capabilities on though, Nvidia users will have much better graphics due to the fancy effects that APEX enables that I explained above. Gameplay will remain the same, but it will look even better.

----

I'm not really all that upset by this. I have the capability to continue using my ATI cards while still taking advantage of GPU-calculated PhysX. But that still doesn't change the fact that, like someone said above, most ATI users won't go near the 'jerry-rigged' way of getting it to work like I have.

ATI recently overtook Nvidia in discrete GPU market share, which indicates that despite Nvidia relying so heavily on PhysX and CUDA as selling points for its cards, consumers are definitely more interested in price-to-performance and efficiency than in over-advertised, anti-consumer 'features' that alienate over half of a PC game's potential user base.

Make no mistake: PhysX will drastically improve the Arma engine, even without the fancy effects. But it's still going to piss a lot of people off anyway, when there are comparable inter-compatible solutions out, or coming out in the near future, that could have been used instead.

Nvidia holds the performance crown, so whatever, but ATI sells more because they are cheaper.

That's the simple fact. If you're interested in pure performance (and things like PhysX) there is only one choice.

Edited by BangTail


Can you guys please cut the horrible fanboyism on both sides? As has been said before, none of us knows whether Nvidia users will get the option to use their GPU for hardware acceleration of PhysX or not. And since it seems PhysX isn't optional here like in other games, but mandatory and used for essential areas such as vehicle simulation, I'm assuming there won't be hardware acceleration until BIS advises further.


There isn't much point in implementing PhysX at all if they aren't going to implement it through hardware (might as well just use a non-hardware-based solution at that point), so my money is on it being HW accelerated.

PS: I like AMD just fine, but I have heard too many AMD users talking about how useless PhysX is, and the fact that they are now going to complain because they can't use it doesn't fly with me.

Either it sucks or it doesn't.

Edited by BangTail

There isn't much point in implementing PhysX at all if they aren't going to allow it through hardware so my money is on it being HW accelerated.

PS: I like AMD just fine, but I have heard too many AMD users talking about how useless PhysX is, and the fact that they are now going to complain because they can't use it doesn't fly with me.

Either it sucks or it doesn't.

Exactly. Fanboyism is something I personally profoundly despise.

I'm not an Nvidia fan, since I usually buy the brand with the best value for money, which has been ATI for quite a long time now... Still, since they are going to implement PhysX, it would be stupid not to use Nvidia GPUs for hardware acceleration just so ATI fans won't whine...

Edited by dunedain

Nvidia holds the performance crown, so whatever, but ATI sells more because they are cheaper.

That's the simple fact. If you're interested in pure performance (and things like PhysX) there is only one choice.

[Character Limit]

Exactly. Fanboyism is something I personally profoundly despise.

I'm not an Nvidia fan, since I usually buy the brand with the best value for money, which has been ATI for quite a long time now... Still, since they are going to implement PhysX, it would be stupid not to use Nvidia GPUs for hardware acceleration ... Too bad for ATI; I may buy Nvidia for my next computer.

Ditto. I've owned many AMD cards, some good, some bad; same with the green team.

The fact is that they are competitors and obviously they compete. Part and parcel of that is offering features that the other doesn't.

After Metro 2033, and as long as Nvidia remain competitive, I will not go back to AMD easily as the difference PhysX made was so noticeable. I'll grant you that there aren't that many games that make good use of it but there are a few and I'm really hoping ArmA 3 is going to be a good addition :D

Edited by BangTail

There isn't much point in implementing PhysX at all if they aren't going to implement it through hardware

Why do you say that? It's like saying the only point of implementing PhysX in ArmA 3 is to appeal to Nvidia users, which is wrong. As has been said before, it's a great and mature physics library regardless of whether it can run on a GPU or not.

PS: I like AMD just fine, but I have heard too many AMD users talking about how useless PhysX is, and the fact that they are now going to complain because they can't use it doesn't fly with me.

Either it sucks or it doesn't.

Your opinion about it is yours and that's okay, but it's just irrelevant to both ArmA 3 and any discussion of PhysX here, and it's frankly childish (on both sides, not just you). There's a word for it: fanboyism.

[Character Limit]

Dual-GPU cards don't count (and I don't say that to negate your argument, as there are plenty of benchmarks where the 590 wins).

GPU vs GPU, Nvidia is the leader, and that's all that matters. When you start convoluting things with those t00fer cards, there is always ambiguity because there is no baseline.

The 590 has less RAM, so it is limited at high resolutions etc.

When you use 3 GB 580s in SLI, the performance king is clear.

You won't find me arguing about price vs performance; AMD wins there.

---- Post added at 14:19 (previous post was at 14:19) ----

Why do you say that? It's like saying the only point of implementing PhysX in ArmA 3 is to appeal to Nvidia users, which is wrong. As has been said before, it's a great and mature physics library regardless of whether it can run on a GPU or not.

Your opinion about it is yours and that's okay, but it's just irrelevant to both ArmA 3 and any discussion of PhysX here, and it's frankly childish (on both sides, not just you). There's a word for it: fanboyism.

I'm no fanboy, bud. I've owned more AMD cards than you've had hot meals, and I don't appreciate you calling me childish either.

Edited by BangTail


Making us use an Nvidia display card is just like US Marines being forced to take the M16A4 to the battlefield. Follow to survive, or die trying.

