wazandy

An open letter to BIS Devs re: GTX 295 rigs.


May as well add me to this; I've spent so much cash on my system and the performance in Arma II is shocking.

I have heard a Bohemia rep say the drivers are the cause and that "we have contacted the respectable companies, ATI and Nvidia". I'm not sure if this is a typical "pass the hot potato" stunt where Bohemia don't want to accept ownership of the problem or not. I'd like to see some more input from the devs on this please.

I think the devs probably have better things to do :)

Talk to Nvidia about your "2 in 1" card. There is no doubt in my mind that the problem lies with that card specifically.

FC2 was the same thing with Nvidia's last "2 in 1" card (9800GX2). 3 months for working drivers.

Eth

Edited by BangTail


Milamber, nice post; I will try Win7 soon. The big question is: how is your FPS in the cities? I've got a 295 with an i7 and 30 FPS is my average.

Well, you're not going to.

Talk to Nvidia about your "2 in 1" card. There is no doubt in my mind that the problem lies with that card specifically.

FC2 was the same thing with Nvidia's last "2 in 1" card (9800GX2). 3 months for working drivers.

The fact that you opted to spend a lot of money is YOUR problem, not Bohemia's.

Eth

Oh please, a tiny handful of games have SLI profile problems with dual cards and that's why a top-end card like the 295 is 'flawed'?

Newsflash! I have dozens if not hundreds of PC games, covering a massive spectrum of genres, and the only game that has caused me any aggravation to date is Arma 2.

I don't doubt that driver problems contribute to issues. Not at all. But please spare us the lecture about how spending money on good kit is the reason for all the problems that we'll see in games. It's complete rubbish.

Oh please, a tiny handful of games have SLI profile problems with dual cards and that's why a top-end card like the 295 is 'flawed'?

Newsflash! I have dozens if not hundreds of PC games, covering a massive spectrum of genres, and the only game that has caused me any aggravation to date is Arma 2.

I don't doubt that driver problems contribute to issues. Not at all. But please spare us the lecture about how spending money on good kit is the reason for all the problems that we'll see in games. It's complete rubbish.

Thanks for the info :rolleyes:

Newsflash: Most games worked fine on my dual 9800GX2 setup at the time. A few didn't, FC2 being the most prolific offender, and it was TOTALLY down to Nvidia.

I don't consider the 295 to be "Top-end". It's 2 x 260s (OK, it has more shader power than a 260, but that's the only advantage). The 260 is a midrange card. The single-PCB 295 is OK but I still prefer 2 or more separate cards for SLI. The 280/285 is substantially faster than a 260. The 295 hits 32.2 gigapixels/s while 2 x 280 hits 40. That's a no-brainer if performance is your primary concern, and it's also cheaper.
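
For what it's worth, those fill-rate figures fall straight out of the published ROP counts and core clocks. A rough back-of-the-envelope check (a sketch only, assuming reference clocks, where pixel fill rate = ROPs x core clock x number of GPUs):

    # rough pixel fill-rate check, reference specs assumed
    def fill_rate_gpixels(rops, core_mhz, gpus=1):
        return rops * core_mhz * gpus / 1000.0  # gigapixels per second

    print(fill_rate_gpixels(28, 576, gpus=2))  # GTX 295: ~32.3 GP/s
    print(fill_rate_gpixels(32, 602, gpus=2))  # 2 x GTX 280: ~38.5 GP/s
    print(fill_rate_gpixels(32, 648, gpus=2))  # 2 x GTX 285: ~41.5 GP/s

That lands close enough to the 32.2 and 40 quoted above.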

I wasn't lecturing btw, just stating facts.

Eth

Edited by BangTail

Thanks for the info :rolleyes:

Newsflash: Most games worked fine on my dual 9800GX2 setup at the time. A few didn't, FC2 being the most prolific offender, and it was TOTALLY down to Nvidia.

I don't consider the 295 to be "good kit". It's 2 x 260s. The 260 is a midrange card.

I wasn't lecturing btw, just stating facts.

Eth

orly?

"The Nvidia GeForce GTX 295 dual-GPU graphics card features 1792MB of memory which corresponds to the memory capacity of two GeForce GTX 260 boards. However, unlike the Radeon HD 4870 X2 which is literally two Radeon HD 4870 products stuck together, the GeForce GTX 295 seems to be more of a hybrid, sharing specs of both GeForce GTX 260 and 280 products."

Bottom line is this. I play every single game that I own on almost maximum graphics settings at 1920x1200, including Crysis and Crysis Warhead. The only limitation is the level of AA I can add on top.

So what if a 'few' games cause some aggravation? Welcome to the world of PC gaming, my friend.

Oh and for some laffs, let's hear what your opinion on "good kit" graphics cards is, because I'd dearly like to hear what you think outperforms a "medium range" card like a 295, LMFAO.

Also, what I do find very interesting is that you seem to pop up in threads about other people's high-spec kit, seemingly trying to downplay any investment in what is widely regarded as "good kit". You come across as someone very embittered by the spec of other posters, to the degree that I'd doubt any of what you have in your spec is actually true.

More likely you invested in hardware a little too early, are now regretting it, and take every opportunity you can to diss what is clearly "good kit" owned by other people.

orly?

"The Nvidia GeForce GTX 295 dual-GPU graphics card features 1792MB of memory which corresponds to the memory capacity of two GeForce GTX 260 boards. However, unlike the Radeon HD 4870 X2 which is literally two Radeon HD 4870 products stuck together, the GeForce GTX 295 seems to be more of a hybrid, sharing specs of both GeForce GTX 260 and 280 products."

Bottom line is this. I play every single game that I own on almost maximum graphics settings at 1920x1200, including Crysis and Crysis Warhead. The only limitation is the level of AA I can add on top.

So what if a 'few' games cause some aggravation? Welcome to the world of PC gaming, my friend.

Oh and for some laffs, let's hear what your opinion on "good kit" graphics cards is, because I'd dearly like to hear what you think outperforms a "medium range" card like a 295, LMFAO.

Also, what I do find very interesting is that you seem to pop up in threads about other people's high-spec kit, seemingly trying to downplay any investment in what is widely regarded as "good kit". You come across as someone very embittered by the spec of other posters, to the degree that I'd doubt any of what you have in your spec is actually true.

More likely you invested in hardware a little too early, are now regretting it, and take every opportunity you can to diss what is clearly "good kit" owned by other people.

It is midrange tech. If you actually knew anything about the hardware you bought, you'd know that.

I never "dissed" anyone's kit. You should stop putting words in my mouth. I made a statement of fact and you got upset because you don't like the facts. Then you resorted to calling me embittered and a liar because of your inability to process the facts.

I never questioned his spending. I said that his spending was not Bohemia's fault. Again, you seem to have taken my statement out of context, and I have removed it to avoid any further confusion.

I "pop up" in threads concerning "2 in 1" cards because of my numerous bad experiences with them and it has absolutely NOTHING to do with how people spend their money.

I am certainly not mad that people choose to buy 295s (what a ridiculous concept), but I do understand the frustration, whether it's due to A2 or Nvidia.

Because you don't actually have a legitimate argument, you've decided to try and divert people's attention away from your obvious lack of knowledge by casting aspersions on the existence of my hardware.

I am getting a little sick and tired of people blaming BIS for something that I sincerely doubt has anything to do with them.

Judging by your knowledge of sound hardware (or distinct lack thereof), you shouldn't be telling anyone about PC hardware.

Thanks

Eth

Edited by BangTail


He's absolutely right. Your brand new PC is junk, send it to me and I'll be nice enough to give you a real machine back... and you know it's good 'cause it has 32 MB of onboard video powah.

He's absolutely right. Your brand new PC is junk, send it to me and I'll be nice enough to give you a real machine back... and you know it's good 'cause it has 32 MB of onboard video powah.

LAWL.

To be fair, I never called anyone's rig "junk".

I'm not a fan of "2 in 1" cards from past experience so consider me a little biased :)

Eth

It is midrange tech. If you actually knew anything about the hardware you bought, you'd know that.

Now you're attacking my specs? Jealous much?

I'm not embittered about anyone's specs. I am, however, getting a little sick and tired of people blaming BIS for something that I sincerely doubt has anything to do with them.

Judging by your knowledge of sound hardware (or distinct lack thereof), you shouldn't be telling anyone about PC hardware.

Thanks

Eth

LOL!

So let's hear it then: if the 295 is 'mid-range', what is 'high-range'? I'll tell you right now that my 295 blows a 280 clean out of the water. So I'm very eager to hear what your obviously solid hardware experience has to say on the matter.

"Speaking of performance crowns, if there's a dual-slot, dual-GPU, single-card performance crown, then the GeForce GTX 295 has snagged that one. The GTX 295 is an even more extreme solution than the 285, obviously, but it's the class of the exotics."

I'd so dearly like to hear what you would class as a superior card to the 295, so let us all hear it.

And, for the record, BIS absolutely should have been testing Arma 2 with a 295. They clearly did not. If they had, they could have contacted Nvidia much earlier and reported potential problems with the SLI profiles. They did not. They should shoulder some of the blame for this.

LOL!

So let's hear it then: if the 295 is 'mid-range', what is 'high-range'? I'll tell you right now that my 295 blows a 280 clean out of the water. So I'm very eager to hear what your obviously solid hardware experience has to say on the matter.

"Speaking of performance crowns, if there's a dual-slot, dual-GPU, single-card performance crown, then the GeForce GTX 295 has snagged that one. The GTX 295 is an even more extreme solution than the 285, obviously, but it's the class of the exotics."

I'd so dearly like to hear what you would class as a superior card to the 295, so let us all hear it.

And, for the record, BIS absolutely should have been testing Arma 2 with a 295. They clearly did not. If they had, they could have contacted Nvidia much earlier and reported potential problems with the SLI profiles. They did not. They should shoulder some of the blame for this.

Go away for gawd's sake.

Do you not understand that it's 2 midrange cards bolted together? That's why 2 x 280/285 creams it. It is not really a "single card" and its only real advantage is that it uses less power than 2 separate cards.

and enjoy ignore :)

I'm not derailing this any further.

Eth

Edited by BangTail


This is quite entertaining. Eth is arguing that the 260 is mid-range (which it is) and PP is arguing that the 295 is high range (which it is) so where's the argument again?

---------- Post added at 12:46 AM ---------- Previous post was at 12:45 AM ----------

I'm sure it's been noted that not everyone playing on a 295, or even a 295 with an i7 is having the same issues... btw

This is quite entertaining. Eth is arguing that the 260 is mid-range (which it is) and PP is arguing that the 295 is high range (which it is) so where's the argument again?

---------- Post added at 12:46 AM ---------- Previous post was at 12:45 AM ----------

I'm sure it's been noted that not everyone playing on a 295, or even a 295 with an i7 is having the same issues... btw

The 295 is a high-performance card built from midrange tech (same shader horsepower as the 280/285 but lower clocks and memory bandwidth: a 448-bit bus as opposed to 512-bit). This shows when it comes up against 280-series cards in SLI and is particularly telling when 3 x GTX 280s (3 GPUs) are tested against the 4 GPUs of 2 x GTX 295s. The 295 has much more in common with the 260 than it does with the 280.
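
To put rough numbers on that bandwidth gap, a quick sketch assuming reference memory clocks (bandwidth = bus width in bytes x effective memory clock, per GPU):

    # rough memory-bandwidth comparison, reference memory clocks assumed
    def bandwidth_gbps(bus_bits, effective_mem_mhz):
        return bus_bits / 8 * effective_mem_mhz / 1000.0  # GB/s per GPU

    print(bandwidth_gbps(448, 1998))  # GTX 295: ~112 GB/s per GPU
    print(bandwidth_gbps(512, 2214))  # GTX 280: ~142 GB/s
    print(bandwidth_gbps(512, 2484))  # GTX 285: ~159 GB/s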

It's sold as a single card (which it technically isn't) and, while it is faster than any single-GPU card, it suffers from the same limitations as its lineage and doesn't compete well with 280/285s in SLI configurations.

AFAICR, some of the guys with 295s do have it running well.

JFTR, I'm not dissing anyone's rig, I just think this particular problem is down to Nvidia and not BIS.

A 295 is 2 GPUs; put those 2 GPUs up against 2 x 280 GPUs and the 295 gets trounced. 1 x 280 vs 1 x 295 is not a fair comparison, as it is 1 GPU vs 2 GPUs. Here are some comparisons, and it is very clear that 280/285s in SLI perform much better than the 295. 2 GTX 280s are also cheaper than a 295.

http://www.overclock.net/graphics-cards-general/446500-gtx-295-vs-gtx-280-sli.html (As you can see from the FC2 and CoD benchmarks, the 295 runs 20 FPS slower than 2 x 285s)

http://www.guru3d.com/article/geforce-gtx-295-review-bfg/11

Eth

@Bulldogs : I have issues with A2 and the i7 under Win 7 64 and I'm not running a 295. I think there are problems with Win 7/Vista 64, the i7 and A2 (Specifically HT).

Edited by BangTail

Go away for gawd's sake.

Do you not understand that it's 2 midrange cards bolted together? That's why 2 x 280/285 creams it. It is not really a "single card" and its only real advantage is that it uses less power than 2 separate cards.

and enjoy ignore :)

I'm not derailing this any further.

Eth

Ignore me all you like, it's normally the action of a forum user who's backed into a corner with their own bullshit and has nowhere else to go.

I've already disproven your "it's only two 260s" claim, because the 295 actually uses a mixture of 260 and 280 spec.

And you still haven't shared your oracle-like vision with us all regarding what you class as a 'high-spec' card if a 295 is only 'medium-spec'.

Enjoy your blissful ignorance. I'm sure there are a lot of other posters on these boards who wish you'd ignore them too, so you wouldn't keep sticking your nose into other people's threads with nothing but drivel about how they've 'overspent' and yet 'under-specced'.

Muppet.

---------- Post added at 03:54 PM ---------- Previous post was at 03:52 PM ----------

This is quite entertaining. Eth is arguing that the 260 is mid-range (which it is) and PP is arguing that the 295 is high range (which it is) so where's the argument again?

---------- Post added at 12:46 AM ---------- Previous post was at 12:45 AM ----------

I'm sure it's been noted that not everyone playing on a 295, or even a 295 with an i7 is having the same issues... btw

It's patently obvious that he's very bitter about other posters' specs. I mean, he doesn't even own a 295, or even a 285, and yet he knows absolutely everything about why a 295 is 'medium spec' without telling us all what card trumps a 295.

Complete joker.


I'm sure it's been noted that not everyone playing on a 295, or even a 295 with an i7 is having the same issues... btw

Yes, many people seem to just say they've got a 295... not who the manufacturer is (e.g. EVGA), which version of the 295 from that manufacturer they have, etc.


Very good point. I remember back with the 7900 GTs, the wrong details were given to MSI and several other companies; as a result, every 7900 GT that was released ended up artifacting within a few months.


Holy shit my friends, just agree there are issues on both sides and put the digital peni away. Those are facts that are 'incontrovertible'. You each have points, and it seems these are not solid enough to convince the other side (or the other side's pride is too large, either way). Failing that, for the sake of the rest of us, please agree to disagree.

This is quite entertaining. Eth is arguing that the 260 is mid-range (which it is) and PP is arguing that the 295 is high range (which it is) so where's the argument again?

...

Yeah, funny that way... but the 295 is really 275s, which can be anything really, like a new improved 260 or a neutered 285... but in the end a pair of 285s in SLI IS faster than a 295... though ASUS has a true 285 pair in an uber 295 now... (hot hot).

With my 9800GX2 it was always a month or three behind in getting the up-to-date drivers that its 8800 brothers would get... nothing has changed on that from NVDA. A 1.03 patch, then an NVDA driver in mid-August, should do the trick.

Yeah, funny that way... but the 295 is really 275s, which can be anything really, like a new improved 260 or a neutered 285... but in the end a pair of 285s in SLI IS faster than a 295... though ASUS has a true 285 pair in an uber 295 now... (hot hot).

With my 9800GX2 it was always a month or three behind in getting the up-to-date drivers that its 8800 brothers would get... nothing has changed on that from NVDA. A 1.03 patch, then an NVDA driver in mid-August, should do the trick.

Yes, that dual 285 is a beauty (but should come with an air conditioner).

Yes, but the 275 wasn't around when the 295 was released, and that's why everyone says it's basically 2 x 260s (essentially it is, but with more shader power). You are technically correct, though: it could be viewed as either a neutered 280 or an improved 260.

Eth

Edited by BangTail

Yes, that dual 285 is a beauty (but should come with an air conditioner).

Yes, but the 275 wasn't around when the 295 was released, and that's why everyone says it's basically 2 x 260s. You are technically correct, though: it could be viewed as either a neutered 280 or an improved 260.

Eth

Well, the 275s are the chips in 295s now, and they are "either a neutered 280 or an improved 260". If you go quad, go ATI; if you go tri, go NVDA.

Well, the 275s are the chips in 295s now, and they are "either a neutered 280 or an improved 260". If you go quad, go ATI; if you go tri, go NVDA.

Cool. The single-PCB 295 is also supposedly much better at heat dissipation, which makes sense.

I might try ATI for my next build but it will depend heavily on what the GT300 series is like. I'm not liking Nvidia's drivers as of late.

Eth

Cool :)

I might try ATI for my next build but it will depend heavily on what the GT300 series is like. I'm not liking Nvidia's drivers as of late.

Eth

Yeah, if you believe the viral crap about ATI drivers, you get sucked into Nvidia, then find out about the complete failure that NVDA drivers have become in the last two years... But when they work, they work... as most will chime in soon.

Yeah, if you believe the viral crap about ATI drivers, you get sucked into Nvidia, then find out about the complete failure that NVDA drivers have become in the last two years... But when they work, they work... as most will chime in soon.

I'm still running 182.50.

The newer ones caused me no end of hassle: fan control issues and DDC problems, to name a few.

Others seem to have no problems with them but that's the way of things I guess.

As you say, the Nvidia driver situation has been in decline for a while now and shows no signs of imminent improvement.

I have no brand loyalty; I buy what's best. I'm more than happy to give ATI a try if their new series is roughly equivalent to the GT300 flagship card.

Eth


I remember 5-6 years back when the ATi drivers were so buggy that I switched to nVidia, and they were worse. My best run was when I went back to ATi and got the Omega drivers: stable and good performance... but after 2-3 years the Omegas seemed to go downhill and nVidia got better... but that wasn't bound to last.


Shouldn't this be an 'Open letter to nVidia'?

In any case, my experience with nVidia products is very good to excellent. My experience with their drivers suggests that moving to a 'unified' driver that was made specifically for a card newer than the one I'm running is generally a really bad idea.

The 9800GX2 also took a few driver cycles to hit on all cylinders with the then-current titles, and when it did, baby, jump back!!

The GTX 295 will be the same story, oh impatient ones... give the noise to nVidia, not BIS, and see if they can speed up their driver dates, but until then run what ya can and go play the darn game for a while.

BTW, I'm running the 190 drivers with the CFG jammed to the proper VRAM setting, on high settings with an OC'd i7. Lovin' every minute of it.
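
For anyone looking for that setting, it should live in ArmA2.cfg (usually under My Documents\ArmA 2). A minimal sketch only, with example values assuming 896 MB of VRAM per GPU; check your own card's memory before copying anything:

    // ArmA2.cfg excerpt (example values, not a drop-in fix)
    localVRAM=939524096;      // 896 MB expressed in bytes (896*1024*1024)
    nonlocalVRAM=2147483648;  // example: 2 GB of shared/system memory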

Rock on!!

Shouldn't this be an 'Open letter to nVidia'?

In any case, my experience with nVidia products is very good to excellent. My experience with their drivers suggests that moving to a 'unified' driver that was made specifically for a card newer than the one I'm running is generally a really bad idea.

The 9800GX2 also took a few driver cycles to hit on all cylinders with the then-current titles, and when it did, baby, jump back!!

The GTX 295 will be the same story, oh impatient ones... give the noise to nVidia, not BIS, and see if they can speed up their driver dates, but until then run what ya can and go play the darn game for a while.

BTW, I'm running the 190 drivers with the CFG jammed to the proper VRAM setting, on high settings with an OC'd i7. Lovin' every minute of it.

Rock on!!

There is no doubt that the 295 still has room for improvement via drivers, but as often happens with Nvidia, as soon as the new range of cards appears, their support for older cards falters. This was definitely the case with the 9800GX2.

There seems to be some debate as to whether we'll see the full-blown GT300 flagship card before 2010, although the lower-spec cards are supposed to be around before Xmas.

:(

Eth

