Richey79

PhysX

Shame that ArmA 3 will be one of the few games to allow itself to be tied to NVidia's proprietary (closed) PhysX technology. Good for NVidia, bad for BIS's AMD/ATI customers.

I was really looking forward to ArmA 3 but this has deflated my enthusiasm quite a bit.

From the first post:

Originally Posted by Damu (BIS Dev)

I can assure you both Nvidia and ATI (including others) will be supported. No panic, please. Nothing is going to be changed, except the world.

Stop being over-dramatic.

Well, I haven't read the entire thread but I guess it's still the case that the "support" will mean PhysX emulation using AMD card owner's CPU? Physics simulations are very processor intensive, so I guess my NVidia opponents will be very happy that I'm unable to target them effectively due to my system lagging while trying to process PhysX opcodes. :)

Well, I haven't read the entire thread but I guess it's still the case that the "support" will mean PhysX emulation using AMD card owner's CPU? Physics simulations are very processor intensive, so I guess my NVidia opponents will be very happy that I'm unable to target them effectively due to my system lagging while trying to process PhysX opcodes. :)

The information we have so far doesn't support that assumption.

Pettka said that there were no battles or anything going on, and he was not in a built-up area. The information really doesn't give us any reason to believe it either way, because he doesn't talk about any physical interactions going on at all.

I never claimed otherwise. There is little real information either way.

I am still very disappointed about this move. Does NVIDIA pay developers to use this engine over other alternatives?

Or is it just simpler for BI to port the PhysX implementation from VBS2 to this?

Either way, I still think that they should not have gone with any kind of proprietary physics engine, considering that they already have a small userbase. This may alienate users.

I already ordered a new motherboard, a 6-core AMD processor, and new DDR3 RAM. I still have yet to decide on a card; maybe I'll just go with NVIDIA this time.

I am still very disappointed about this move. Does NVIDIA pay developers to use this engine over other alternatives?

Actually, I wouldn't be surprised if some kind of incentive is provided for developers that go the PhysX route. Also, I believe I've read somewhere that if a game shows Nvidia's short intro video at startup, NVIDIA will help with marketing the game, or something along those lines.

Or is it just simpler for BI to port the PhysX implementation from VBS2 to this?

That is also a possibility.

Either way, I still think that they should not have gone with any kind of proprietary physics engine.

Agreed. I would have preferred Bullet.

As far as I am aware, processing that would otherwise be done by PhysX on the GPU can be offloaded to the CPU in VBS2, so I highly doubt it'll be any different in ARMA 3. Anyway, before you start freaking out at everything you perceive as evil (nVIDIA, I guess), wait until we get some more clarification.

It's amazing that no matter how many times it's repeated, many people still refuse to understand that you won't NEED an nVidia card. I guess those people cannot fathom that PhysX is also a software solution.

Worst-case scenario: nVidia users will have the option to offload some processing. I see this as similar to TrackIR users having some awareness advantage over non-TrackIR players, and yet I see no complaining about that.
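To make the "software solution" point concrete: in the standalone PhysX SDK a scene always gets a CPU task dispatcher, and GPU acceleration is strictly an optional extra on top of that. Below is a minimal sketch of a CPU-only scene setup, assuming the PhysX 3.x public API; it's purely illustrative and says nothing about how BIS actually integrates the SDK.

```cpp
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene description: gravity, default collision filtering, and a CPU task dispatcher.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(4);   // 4 CPU worker threads
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    sceneDesc.cpuDispatcher = dispatcher;

    // No CUDA context manager or GPU dispatcher is ever set, so rigid-body
    // simulation runs entirely in software -- no Nvidia card required.
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation at 60 Hz for ten seconds.
    for (int i = 0; i < 600; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);   // block until the step completes
    }

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}
```

Any CPU can run that path regardless of graphics vendor; the open question is only how fast it is once a mission gets busy.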

I don't think anyone is saying that here presently, DMarkwick. I think the concern is that certain users, and a certain company, are being favoured because of the proprietary solution. This puts some people at a performance disadvantage. As you may have heard, the ArmA series is hardware intensive :p

I don't think anyone is saying that here presently, DMarkwick. I think the concern is that certain users, and a certain company, are being favoured because of the proprietary solution. This puts some people at a performance disadvantage.

... but you just said exactly the thing that I described :D

Anyway. No-one complains that people with higher resolution monitors get an unfair advantage for observation. ;)

... but you just said exactly the thing that I described :D

Anyway. No-one complains that people with higher resolution monitors get an unfair advantage for observation. ;)

They don't; you can always adjust for it with FoV.

They don't; you can always adjust for it with FoV.

... and at any given FoV, the higher-resolution monitor will still give the bigger advantage.
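As a rough back-of-the-envelope illustration of that point (the resolutions and FoV below are just example figures, not measurements from the game): at a fixed horizontal FoV, the detail available for spotting scales linearly with horizontal pixel count,

\[
\text{pixels per degree} = \frac{W}{\mathrm{FoV}_h},
\qquad
\frac{2560}{90^{\circ}} \approx 28\ \text{px/deg}
\quad\text{vs.}\quad
\frac{1024}{90^{\circ}} \approx 11\ \text{px/deg},
\]

so a distant soldier subtending half a degree covers roughly 14 pixels on the larger screen but only about 6 on the smaller one, at identical FoV and detail settings.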

That doesn't matter. We're not talking about an advantage. We're talking about frame rates.

I don't understand the argument you're making. People don't complain about one thing but do complain about another; so what is the point? That if someone complains about one thing, the complaint isn't legitimate unless he also complains about everything else?

I think the concern is that certain users, and a certain company, are being favoured because of the proprietary solution. This puts some people at a performance disadvantage.
We're not talking about an advantage. We're talking about frame rates.

Now I'm thoroughly confused. :p

I have two Nvidia GPUs, but I always assign PhysX to the CPU, as the GPUs are busy rendering. I doubt it will be different in ArmA 3.

Maybe I'll get a third GPU for dedicated PhysX and put the 750W PSU to work :)

Now I'm thoroughly confused. :p

What DMarkwick sounded like he was talking about was a competitive thing. What I mean is a performance difference, regardless of how you use the game. Low frame rates are frustrating. It's also a monetary thing: frames cost money. For instance, if my computer commits seppuku every time there's a battle scene in ArmA 3, it's going to cost me money to remedy that problem. A smaller monitor resolution may actually make the game more playable at a given detail setting for some people, not less.

Edited by Max Power

I don't have much of a point to make, only that there is a persistent misunderstanding that an nVidia card will be required, and that other hardware advantages never seem to cause anyone any trouble. Just an oddity I noticed :)

For example (and at the risk of taking the thread even further off-topic), much noise is made about players wishing to use aiming reticles, friendly tags, etc.: stuff that helps people play more effectively. And yet there is no discussion about people who play at 2560x1600 vs people who can only manage 1024x768. There is a great advantage for spotting and identifying in that case.

I'm not saying it IS an issue, just that no-one seems to think that those advantages are really... advantages. In my own mind, the nVidia hardware PhysX issue is just the same, if it even IS hardware-accelerated at all, which has NOT been established yet.

I don't have much of a point to make, only that there is a persistent misunderstanding that an nVidia card will be required, and that other hardware advantages never seem to cause anyone any trouble. Just an oddity I noticed :)

For example (and at the risk of taking the thread even further off-topic), much noise is made about players wishing to use aiming reticles, friendly tags, etc.: stuff that helps people play more effectively. And yet there is no discussion about people who play at 2560x1600 vs people who can only manage 1024x768. There is a great advantage for spotting and identifying in that case.

I'm not saying it IS an issue, just that no-one seems to think that those advantages are really... advantages. In my own mind, the nVidia hardware PhysX issue is just the same, if it even IS hardware-accelerated at all, which has NOT been established yet.

But PhysX has its CPU path crippled by Nvidia, so it is not really good if you aren't running it on a GPU, especially if ArmA 3 makes use of APEX. The problem here is that GPU PhysX is calculated on CUDA cores; run it on a CPU instead and it will not run as well. So people with Nvidia cards will have a large FPS advantage if BIS supports GPU-calculated PhysX.
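For context on what "GPU-calculated PhysX" means at the SDK level: the application creates a CUDA context manager and hands its GPU dispatcher to the scene, and if that step fails (no Nvidia card, no working CUDA) the same scene simply runs on the CPU dispatcher alone. A rough sketch along the lines of the PhysX 3.x extensions headers; the exact names and signatures here (PxCreateCudaContextManager, getGpuDispatcher, PX_SUPPORT_GPU_PHYSX) vary between SDK versions, so treat this as an assumption-laden illustration rather than anything BIS has confirmed.

```cpp
// Sketch only: assumes the same foundation/physics objects as the CPU-only
// example earlier in the thread, and PhysX 3.x-era names for the CUDA glue.
PxScene* createSceneWithOptionalGpu(PxFoundation& foundation, PxPhysics& physics)
{
    PxSceneDesc sceneDesc(physics.getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);   // always present as the fallback

#if PX_SUPPORT_GPU_PHYSX
    // Only succeeds on an Nvidia GPU with working CUDA; otherwise we silently
    // stay on the CPU path -- the slower path being complained about above.
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cudaCtx = PxCreateCudaContextManager(foundation, cudaDesc);
    if (cudaCtx && cudaCtx->contextIsValid())
        sceneDesc.gpuDispatcher = cudaCtx->getGpuDispatcher();   // offload eligible work to CUDA cores
#endif

    return physics.createScene(sceneDesc);
}
```

Whether ArmA 3 will expose that GPU path at all is exactly what hasn't been announced yet.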

Hrmph, I think this whole argument is a little premature. I haven't seen any features that would separate Nvidia from ATI users just yet.

But I'd be inclined to support DM's view on this.

Tough to be a consumer, isn't it?

Hrmph, I think this whole argument is a little premature. I haven't seen any features that would separate Nvidia from ATI users just yet.

But I'd be inclined to support DM's view on this.

Tough to be a consumer, isn't it?

PhysX is the feature; it can be done on the GPU and it can be done on the CPU. If BIS can do it on the GPU and ALSO do it on the CPU without the CPU performance being terrible, fine by me.

But if they use APEX without an option to turn it off, it's goodbye from me if I don't have an Nvidia card by then.

But I'd be inclined to support DM's view on this.

Why, thanks.

(But I thought my view was that everyone complaining that they won't be able to run A3 on ATI cards is an idiot? :rolleyes2: )

I would suggest looking towards VBS: it uses PhysX, but it is NOT hardware accelerated...

Why, thanks.

(But I thought my view was that everyone complaining that they won't be able to run A3 on ATI cards is an idiot? :rolleyes2: )

I would suggest looking towards VBS: it uses PhysX, but it is NOT hardware accelerated...

Yes, it uses PhysX for things like ropes, but not for simulating the overall physics.

It is a bit different.

Yes, it uses PhysX for things like ropes, but not for simulating the overall physics.

It is a bit different.

Do some research, then come back and flap your gums on the forums ;)
