Baker65

Nvidia in trouble?

Recommended Posts

I tried Googling this, but I could only find one source other than the one you linked. There was a lot of speculation over whether Apple would drop nVidia, but I haven't seen a conclusive source saying which way the decision went. It would be rather difficult for them to turn their back on nVidia, considering the amount of OpenCL-related work they put into Snow Leopard. I'm not sure how well ATI's chips handle that kind of thing.
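
For what it's worth, OpenCL itself is vendor-neutral, so what a given ATI or nVidia chip actually supports is something you can query at runtime. A minimal sketch against the standard OpenCL 1.0 C API (the header path differs on OS X), with nothing vendor-specific assumed:

[code]
// Sketch: enumerate OpenCL platforms and GPU devices to see what a given
// ATI or nVidia chip actually exposes. Standard OpenCL 1.0 API only.
#include <CL/cl.h>   // <OpenCL/opencl.h> on OS X
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);
    if (numPlatforms > 8) numPlatforms = 8;   // only the first 8 were filled

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        char platName[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(platName), platName, NULL);
        printf("Platform: %s\n", platName);

        cl_device_id devices[8];
        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                       8, devices, &numDevices);
        if (numDevices > 8) numDevices = 8;

        for (cl_uint d = 0; d < numDevices; ++d) {
            char devName[256], devVersion[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(devName), devName, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_VERSION,
                            sizeof(devVersion), devVersion, NULL);
            // devVersion reads e.g. "OpenCL 1.0 ..." if the driver supports it
            printf("  GPU: %s (%s)\n", devName, devVersion);
        }
    }
    return 0;
}
[/code]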

Meh... I've had nVidias and ATIs over the years, and all had their pros and cons. My current card is the HD4870, which suits me perfectly: the pros are a good price/performance ratio and a perfectly working TV output that doesn't force me to burn everything to disc to watch it on my TV.

The con is that it somehow hates ArmA 2 and ArmA 1; in all other games it runs and looks superb.

Interestingly, it runs very well for me under Windows 7.

> I tried Googling this, but I could only find one source other than the one you linked. There was a lot of speculation over whether Apple would drop nVidia, but I haven't seen a conclusive source saying which way the decision went. It would be rather difficult for them to turn their back on nVidia, considering the amount of OpenCL-related work they put into Snow Leopard. I'm not sure how well ATI's chips handle that kind of thing.

AMD/ATI joined the Khronos Group's OpenCL effort well before nVidia did.

That said, the nVidia 8-series is already compatible with the OpenCL 1.0 specification.

Now nVidia is trying to strike back with Fermi, but by the time it's on the market, ATI will already be producing a revision of the Evergreen chip.

> AMD/ATI joined the Khronos Group's OpenCL effort well before nVidia did.
>
> That said, the nVidia 8-series is already compatible with the OpenCL 1.0 specification.
>
> Now nVidia is trying to strike back with Fermi, but by the time it's on the market, ATI will already be producing a revision of the Evergreen chip.

I don't think that matters, tbh. Fermi is still on track for December, and if the specs are anything to go by, it will demolish the 58xx series.

Time will tell.

Eth


Some of the Fermi previews look quite interesting. It seems like nVidia is taking a Larrabee-esque approach to GPUs. It will be interesting to see which one wins out.

> I don't think that matters, tbh. Fermi is still on track for December, and if the specs are anything to go by, it will demolish the 58xx series.

I don't know. It lacks a hardware tessellator, so their stream processors will really have to push hard.

Cache coherency and double-precision floating point are nice, but I don't think they'll matter for games yet.

Their architecture does seem very forward-looking, though; very GPGPU-oriented.

Maybe PhysX could finally add something to games. But it'll probably die before it gets that chance, once games ship with fully OpenCL/DirectCompute-compliant Bullet, Open Dynamics Engine, Newton, and Havok.
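
All of those engines run on the CPU today, no PhysX hardware required. As a rough sketch of what that looks like, here is a minimal CPU-only Bullet world using the standard Bullet 2.x API; the falling-box scene is just an illustration:

[code]
// Sketch: a minimal CPU-only Bullet rigid-body world (standard Bullet 2.x
// API, nothing vendor-specific). A 1 kg box falls for one simulated second.
#include <btBulletDynamicsCommon.h>

int main() {
    // Stock Bullet plumbing: collision configuration, dispatcher,
    // broadphase and constraint solver -- all CPU implementations.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A 1 kg, 1 m box dropped from 10 m up.
    btBoxShape box(btVector3(0.5f, 0.5f, 0.5f));
    btVector3 inertia(0, 0, 0);
    box.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo info(1.0f, &motion, &box, inertia);
    btRigidBody body(info);
    world.addRigidBody(&body);

    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);   // 60 Hz fixed step

    world.removeRigidBody(&body);             // tidy up before destruction
    return 0;
}
[/code]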


> I don't know. It lacks a hardware tessellator, so their stream processors will really have to push hard.

Odd, I thought tessellation was a part of the DirectX 11 standard? Or does that include software tessellation too?

> Out of my?

Lolwut?

> Odd, I thought tessellation was a part of the DirectX 11 standard? Or does that include software tessellation too?

It is part of the DirectX 11 spec, but nVidia chose to run the tessellation operations on the shader core, which is still hardware tessellation.

ATI uses a dedicated tessellator.
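
For reference, the DX11 spec only defines the pipeline stages; where the work physically happens is the vendor's business. A sketch of the host-side setup, assuming a device, context, and compiled hull/domain shader blobs already exist (hsBlob and dsBlob are placeholder names):

[code]
// Sketch: binding the two programmable DX11 tessellation stages. The spec
// mandates hull shader -> fixed-function tessellator -> domain shader;
// whether a GPU implements the tessellator as dedicated silicon or on the
// shader core is invisible at this level.
// Assumes 'device', 'context' and the compiled shader blobs 'hsBlob' /
// 'dsBlob' already exist (placeholder names).
#include <d3d11.h>

void bindTessellationStages(ID3D11Device* device, ID3D11DeviceContext* context,
                            ID3DBlob* hsBlob, ID3DBlob* dsBlob) {
    ID3D11HullShader*   hs = NULL;
    ID3D11DomainShader* ds = NULL;

    device->CreateHullShader(hsBlob->GetBufferPointer(),
                             hsBlob->GetBufferSize(), NULL, &hs);
    device->CreateDomainShader(dsBlob->GetBufferPointer(),
                               dsBlob->GetBufferSize(), NULL, &ds);

    // Tessellation consumes control-point patches rather than plain triangles.
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    context->HSSetShader(hs, NULL, 0);
    context->DSSetShader(ds, NULL, 0);
}
[/code]

Either way, the application binds the same hull and domain shader stages; the dedicated-tessellator-versus-shader-core difference never shows up in the API.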



I suppose nVidia's approach is better: more programmability, and so more flexibility. I hope tessellation gets used soon, because the usual LOD methods often look crappy.

As for nVidia, I don't think they're in bad shape. High-end gamer cards aren't the main source of $$$ for a GPU company; it's nearly all about embedded market share and big deals with computer manufacturers: Dell, IBM, PB, and so on.


You're forgetting that Fermi is a 'specialised' approach, while AMD hasn't yet unveiled any FireGL or specialised GPU chips based on the 5xxx series...

Many here are praising the 'pure' performance figures of Fermi from its presentation datasheet, but bear in mind that AMD can easily counter it with, e.g., dual-5870 boards with 4 GB of RAM.

No matter what, I don't want nVidia to 'vanish', because competition is the only thing that keeps any company on the innovation path...

...and every end user benefits from healthy competition.


While I prefer ATi cards for their amazing performance/cost ratio, I can say that nVidia won't go bankrupt (or whatever) just because of a few bad months; this happens all the time in the hardware industry (it happened to ATi with the HD2xxx series).

> While I prefer ATi cards for their amazing performance/cost ratio

I hope that isn't some kind of life philosophy, because the 'best value for money' crown bounces from one vendor to the other on a regular basis. It's ATI now, but it was nVidia for a long, long time before that. It may change back to nVidia when the GT300 family comes out; who knows...

> I don't know. It lacks a hardware tessellator, so their stream processors will really have to push hard.
>
> Cache coherency and double-precision floating point are nice, but I don't think they'll matter for games yet.
>
> Their architecture does seem very forward-looking, though; very GPGPU-oriented.
>
> Maybe PhysX could finally add something to games. But it'll probably die before it gets that chance, once games ship with fully OpenCL/DirectCompute-compliant Bullet, Open Dynamics Engine, Newton, and Havok.

I don't think the dedicated tessellator is a particularly big deal. It would be if the new Nvidia cards couldn't perform tessellation at all.

PhysX adds plenty to games, and while I disagree with what Nvidia has done to prevent users from pairing an ATI card for graphics with an Nvidia card for physics, it remains that games like Batman: AA look a LOT better with PhysX than without it.

The 5870s are nice cards (apart from ATI's signature fan "whine"). Having said that, I still game on my Nvidia-based rig more than on the ATI one; I don't like giving up things like nHancer, and I really don't like ATI's drivers.

At any rate, we'll see what Nvidia shows up with fairly soon. I'm still betting it will be significantly faster than the 5870.

Eth

> I don't think the dedicated tessellator is a particularly big deal. It would be if the new Nvidia cards couldn't perform tessellation at all.
>
> PhysX adds plenty to games, and while I disagree with what Nvidia has done to prevent users from pairing an ATI card for graphics with an Nvidia card for physics, it remains that games like Batman: AA look a LOT better with PhysX than without it.

Apart from the cape and the flying tiles/debris, everything in the game could just as well have been animated.

The cape and flying tiles can be done through CPU physics processing, like the cinematic physics seen in Half-Life 2: Episode Two.

It's really a lack of interest that developers haven't implemented such physics; cloth physics were already being done on the Open Dynamics Engine in 2003, on old Pentium 4s (see the sketch below).
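
The basic technique is nothing exotic either. Here's a bare-bones Jakobsen-style Verlet cloth step of the sort those engines used, pure CPU; all the names here are hypothetical:

[code]
// Sketch: a Verlet cloth step -- particles plus distance constraints,
// pure CPU. This is the kind of thing that already ran on a Pentium 4.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

struct Particle   { Vec3 pos, prev; bool pinned; };
struct Constraint { int a, b; float rest; };   // keep particles 'rest' apart

void clothStep(std::vector<Particle>& ps,
               const std::vector<Constraint>& cs, float dt) {
    const Vec3 gravity = {0.0f, -9.81f, 0.0f};

    // Verlet integration: velocity is implicit in (pos - prev).
    for (Particle& p : ps) {
        if (p.pinned) continue;
        Vec3 next = p.pos + (p.pos - p.prev) + gravity * (dt * dt);
        p.prev = p.pos;
        p.pos  = next;
    }

    // Relax the distance constraints a few times (Gauss-Seidel style).
    for (int iter = 0; iter < 4; ++iter) {
        for (const Constraint& c : cs) {
            Vec3  delta = ps[c.b].pos - ps[c.a].pos;
            float dist  = length(delta);
            if (dist < 1e-6f) continue;
            Vec3 corr = delta * (0.5f * (dist - c.rest) / dist);
            if (!ps[c.a].pinned) ps[c.a].pos = ps[c.a].pos + corr;
            if (!ps[c.b].pinned) ps[c.b].pos = ps[c.b].pos - corr;
        }
    }
}
[/code]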

And here we are, almost in 2010, still stuck with parallax and normal mapping to fake geometry detail.

Tessellation has been around since the early 2000s, and it looked better back then than the alternatives we have now.

It seems as if nVidia is trying to create a malicious vendor-lock-in environment à la Microsoft.

If it weren't for nVidia, we would already have had DirectCompute and (hardware) tessellation in the DirectX 10 spec.

Thank God we at least have consistent shader architectures (call them unified shaders, whatever you like) and high-bandwidth memory buses.

> Apart from the cape and the flying tiles/debris, everything in the game could just as well have been animated.
>
> The cape and flying tiles can be done through CPU physics processing, like the cinematic physics seen in Half-Life 2: Episode Two.
>
> It's really a lack of interest that developers haven't implemented such physics; cloth physics were already being done on the Open Dynamics Engine in 2003, on old Pentium 4s.
>
> And here we are, almost in 2010, still stuck with parallax and normal mapping to fake geometry detail.
>
> Tessellation has been around since the early 2000s, and it looked better back then than the alternatives we have now.
>
> It seems as if nVidia is trying to create a malicious vendor-lock-in environment à la Microsoft.
>
> If it weren't for nVidia, we would already have had DirectCompute and (hardware) tessellation in the DirectX 10 spec.
>
> Thank God we at least have consistent shader architectures (call them unified shaders, whatever you like) and high-bandwidth memory buses.

Could be, but unfortunately, isn't.

There is a big difference (visually) between Nvidia and ATI in Batman (and certain other games).

Anyway, PhysX isn't a major reason to stick with Nvidia at the moment. I would probably use the ATI rig a lot more if their drivers were better and if they had an nHancer equivalent.

Eth

> There is a big difference (visually) between Nvidia and ATI in Batman (and certain other games).

In situations like that, it's hard to know for certain whether the problem is with your hardware or with lazy devs...


I'll probably be helping out with nVidia's financial troubles the next time I buy a graphics card. I've never had serious issues with my GeForce cards, but this ATI 4850 has been nothing but trouble. I've downclocked it as much as I can to make it run more stably, but even then I get issues in plenty of games, albeit fewer than at the default clock.

ATI generally seems to have a better performance/price ratio, but with my experience of this card, and the fact that my friends generally have more issues with ATI than nVidia, I know ATI won't be my next choice.

> In situations like that, it's hard to know for certain whether the problem is with your hardware or with lazy devs...

I was referring to PhysX (which ATI can't use at the moment).

Batman with PhysX on is far more visually appealing than with it off.

Eth


Yeah, but that doesn't mean that the game couldn't have been designed to run smoothly using something other than PhysX.

> Yeah, but that doesn't mean that the game couldn't have been designed to run smoothly using something other than PhysX.

Yeah, it could have been, but it wasn't. And if Nvidia has their way, you will see more and more games using PhysX.

As I said before, I don't like that they have disabled PhysX support when the drivers detect an ATI card, but it doesn't change the fact that the more big-name games use PhysX, the less appealing ATI becomes.

Obviously, right now, there aren't too many games using PhysX, and even fewer implementing it well. Of course, that could go either way.

Eth



For some odd reason, the Batman game tells me I need to update my Nvidia drivers every time I launch it. I guess I will have to wait until another new batch is released. :rolleyes:

On topic: maybe it was my two 8800 GTXs dying on me that caused all this. :confused:


> PhysX adds plenty to games, and while I disagree with what Nvidia has done to prevent users from pairing an ATI card for graphics with an Nvidia card for physics, it remains that games like Batman: AA look a LOT better with PhysX than without it.
>
> Eth

All the effects removed from the non-PhysX version of Batman could be done on the CPU, or 'emulated' on the CPU to look like the PhysX ones (for example, the carpets hanging from the ceiling are missing completely).

Most of these 'effects' were done in ten-year-old games...

It's very easy to claim that a game looks great with an effect versus without it when you simply decide to remove that effect on hardware you don't want to write it for.

The real challenge is to make it work and look the same on ALL hardware.


> The real challenge is to make it work and look the same on ALL hardware.

I concur, and hopefully BI will not follow other devs.

When developers lean toward a specific video manufacturer, it kind of makes it look/sound as if the game may not play properly on your system otherwise.

