Richey79

PhysX

Recommended Posts

Can you put ALL of that into a language that I understand please? Like English!

When you write a program you can use programming languages like C, C++ etc... bla bla... Your CPU won't understand that language unless you compile it to machine code. While compiling you can target SIMD extensions (CPU instruction sets) so the code runs faster and more efficiently!!! Whether integer or floating point, the more advanced instruction sets allow for better performance if you optimize for them. Better to use a 1GHz CPU with support for AVX or FMA4 than a 5GHz CPU with only x87.

But let's say you have 2 processors: Intel Core i7 vs. Phenom II. The best the Intel supports is SSE 4.2, vs. SSE3 on the Phenom II. The Intel compiler will use SSE 4.2 code paths for the i7, but if it detects a Phenom, instead of using the SSE3 that AMD fully supports it will make it run only SSE2 or SSE, or in some cases even plain 386 code!!!

Ok, we know that Nehalem (first-generation i7's and i5's) is on the whole a better architecture than AMD's K10.5 (Phenom II), but the difference is not as great as people actually think. In some cases the K10.5 architecture has more throughput potential than Nehalem.

Optimizing is the keyword.

For the record, as settled by the court, Intel must not cripple AMD's performance in ICC, on penalty of losing 64-bit technology on their CPUs.

The problem is Intel is gonna surely circumvent this. :mad:

I have a nvidia gts 250 I can use for physx along w/ my AMD 6950. If that's not enough, maybe I'll snag a used 460.:confused:

Well, GRAW had Physx.

I could play it with NO PROBLEMs at all using a non-CUDA (i.e. not PhysX-ready) NVIDIA card and with my ATi, while seeing ragdolls and some fancy physics (the most important stuff).

When I played it with PhysX ON the effects were somewhat better (more dust, particles around), but not that much better, and it didn't change the gameplay at all.

So I will wait and see...

Yes it appears so.

And please people, stop saying ATI. ATI went from a company to just a brand, and now nothing. It's been AMD since 2006.

The graphics cards are still called ATI...

I have a nvidia gts 250 I can use for physx along w/ my AMD 6950. If that's not enough, maybe I'll snag a used 460.:confused:

8000 series upwards are fine for PhysX, although 8800 upwards is better for performance.

Thanks Carlos, I understand now I think! :p

Dale think about the times of Athlon64. Remember how Athlon64 was praised as the best CPU for gaming?

Well it was the best CPU for everything. Mainly because of 2 reasons:

1) Athlon 64 was a very good architecture (K8).

2) Intel's Pentium 4 (NetBurst) was a failure.

Still, it's funny how benchmarks compiled with ICC suddenly reported better performance on the Pentium 4.

Pure BS.

Information is good Dale.

This is section 2.3 from the court settlement:

2.3 TECHNICAL PRACTICES

Intel shall not include any Artificial Performance Impairment in any Intel product or require any Third Party to include an Artificial Performance Impairment in the Third Party’s product. As used in this Section 2.3, “Artificial Performance Impairment” means an affirmative engineering or design action by Intel (but not a failure to act) that (i) degrades the performance or operation of a Specified AMD product, (ii) is not a consequence of an Intel Product Benefit and (iii) is made intentionally to degrade the performance or operation of a Specified AMD Product. For purposes of this Section 2.3, “Product Benefit” shall mean any benefit, advantage, or improvement in terms of performance, operation, price, cost, manufacturability, reliability, compatibility, or ability to operate or enhance the operation of another product.

In no circumstances shall this Section 2.3 impose or be construed to impose any obligation on Intel to (i) take any act that would provide a Product Benefit to any AMD or other non-Intel product, either when such AMD or non-Intel product is used alone or in combination with any other product, (ii) optimize any products for Specified AMD Products, or (iii) provide any technical information, documents, or know how to AMD.

I have Nvidia cards, 3D Vision etc. and I would have preferred an ATI/Nvidia-friendly engine. If we all can accelerate physics then BIS can make it a requirement and thus take even greater advantage of the hardware. So far PhysX implementations have been fluff effects at best!

I have Nvidia cards, 3D Vision etc. and I would have preferred an ATI/Nvidia-friendly engine. If we all can accelerate physics then BIS can make it a requirement and thus take even greater advantage of the hardware. So far PhysX implementations have been fluff effects at best!

But they already use it for VBS2, so it is a tech they already have experience with and can use more readily.

My heart does go out to ATI users though, and I hope there is a working solution or compromise for them at the time of launch.

I hope they accelerate everything with PhysX.

It will all still work just fine on an AMD video card, with the physics being calculated on the CPU. And just think of how powerful CPUs will be in a year. The only difference is that Nvidia owners will get a boost. Looking at the scant list of PhysX titles available today, ArmA3 implementing PhysX is quite a welcome addition for Nvidia card owners.

And even if the hardware acceleration is limited to vehicles, isn't everything pretty much a vehicle anyway? Or at least, anything can be built as a "vehicle". Imagine entire buildings built of individual "vehicle" bricks, and then being blown to smithereens, all using the latest in hardware acceleration!

Ever tried running PhysX from the CPU when also trying to run a game? It runs like pure shit.

I have an overclocked i7-950 and I cannot even run Mafia II with PhysX enabled and get half decent framerates with crossfired 6950's

Most of us here who have ATI cards are "hardcore" enough to grab a cheap Nvidia card for PhysX if it really is that big of a deal. I just hope they super-optimize the coding.

I really don't think AMD card owners will miss much, as far as playing Arma 3 out of the box goes. Rather, everyone will benefit tremendously from the use of a modern physics engine like PhysX over the wonky and now decrepit ODE. Indeed, our beloved series has been plagued with terrible physics since the beginning in OFP. For this fact alone, AMD and nVidia card owners alike should rejoice at this announcement, knowing our shared affliction through all these years will finally and forevermore be banished.

Mods are quite another story though. I have a strong feeling that we will be seeing a plethora of mods that take full advantage of the new hardware acceleration capability, and it will all be simply mind-blowing! :eek:

In any event, those concerned have a full year to plan ahead and join the dark side, before the mighty power gets unleashed. :D

EDIT:

Ever tried running PhysX from the CPU when also trying to run a game? It runs like pure shit.

I have an overclocked i7-950 and I cannot even run Mafia II with PhysX enabled and get half decent framerates with crossfired 6950's

Well you can't turn on hardware accelerated physics without the hardware, now can you?!?

EDIT2:

Titles like Mafia 2 have that "hardware acceleration" option, which drastically increases the amount of physics going on, and only video cards with that capability can currently handle it. So yeah, I think ArmA3 out of the box needs to accommodate users both with and without capable nVidia cards. But hopefully ArmA3 will have a similar "hardware acceleration" option which really cranks things up. Anyway, it's the mods I'm most excited about!

But in the meantime, I really hope we get at least actual crumbling buildings out of the box. Maybe the chunk size can scale or something (where nVidia owners get many more and much smaller chunks, etc., after turning on HA).

Edited by MadRussian

Can't wait to see what they do with particle effects.

So how much additional pressure would PhysX put on the CPU of a machine with an ATI card?

Despite having an ATI card, and probably in a state to upgrade to another one, I'm not worried about this. Frankly, it gives me a reason to use that old 8800 GTS I have lying around.

People have been asking for physics upgrades for about a decade, and when BIS decides to integrate PhysX people bitch...

Can't say I'm surprised; it's the same story with any new feature people always want. People are never satisfied.

IMO regardless if you have Nvidia or AMD at least you'll finally be able to give Bulldozers and Sandy Bridges something to do.

What about the minority of us who have ATI cards? Will we not be able to use the new physics, since it uses NVidia's PhysX?

FYP:

http://store.steampowered.com/hwsurvey/

Ever tried running PhysX from the CPU when also trying to run a game? It runs like pure shit.

I have an overclocked i7-950 and I cannot even run Mafia II with PhysX enabled and get half decent framerates with crossfired 6950's

Even a quad-core CPU is inferior compared to a GPU for SIMD (physics) work.

Ask yourself this:

Would you render your graphics on your CPU?

It's the same issue...a CPU is "ok" at everything...but excels at nothing.

IMO regardless if you have Nvidia or AMD at least you'll finally be able to give Bulldozers and Sandy Bridges something to do.

No we won't, because NV crippled the CPU version of PhysX.

I have an AMD GPU, and IMO if a developer is happy to accept cash from NV in return for screwing customers who don't have the "correct" brand of GPU (by allowing significant feature differences between AMD and NV GPUs), I will not spend my money on their game.

No we wont - because NV crippled CPU version of PhysX.

I have AMD GPU and IMO if developer is happy to accept cash (if they allow significant feature differences from AMD to NV GPUs) from NV and in return screw customers who dont have "correct" brand GPU, I will (not) spend my money on their game based on that.

That is a lie...I dare you to prove it!

(a link to realworldtechnologies would be an own goal on your side...do some digging ;))

And you are blaming the wrong people.

Why should gaming stand still, just because AMD can't get their own GPU physics up and running?

Cart before horse...not bright.

Edited by Terracide

No we wont - because NV crippled CPU version of PhysX.

I have AMD GPU and IMO if developer is happy to accept cash (if they allow significant feature differences from AMD to NV GPUs) from NV and in return screw customers who dont have "correct" brand GPU, I will (not) spend my money on their game based on that.

Er, they didn't lock CPU PhysX in Mafia 2 to only some low level; you were allowed the full range. I'm sure they wouldn't restrict it now.

Er they didnt lock cpu physx for mafia 2 to only some low level, you were allowed the full range. I am sure they wouldn't restrict it now.

It's a simple matter of performance (or the lack of performance).

Would you like a setting that would enable full PhysX effect on the CPU...if that meant you had to play at 3-4 FPS, because your CPU simply isn't up for the task? :)

GPU > CPU for SIMD

A GPU creams a CPU in physics calculations...those are the hard facts.

Fact is, this PhysX thingy is dividing the community into two classes:

the poor and the rich, those who have and have not :p

(even if it's just the luxury of the feeling of being 'on the safe side' for nVidia users :rolleyes:)
