
ArmA2 dual GPU (X2 cards) support?


I'm wondering if someone, preferably a BIS developer, could tell me whether ArmA2 will work well with quad-core CPUs and dual-GPU cards.

If not, please make ArmA2 take advantage of quad-core CPUs and dual-GPU cards (GeForce GX2, Radeon X2, etc.).


Actually, Arma II will require dual or quad cores because of the AI.

Quote: Actually, Arma II will require dual or quad cores because of the AI.

He's talking about the GPU, not the CPU.

The short answer is "I have no idea". I suppose it will, though; it would be silly if it didn't, because pretty much every mid-range card has multiple cores. It just seems like a lot of unused potential to me.

Quote: Actually, Arma II will require dual or quad cores because of the AI.

He's talking about the GPU, not the CPU.

The short answer is "I have no idea". I suppose it will, though; it would be silly if it didn't, because pretty much every mid-range card has multiple cores. It just seems like a lot of unused potential to me.

Oh, I see. He did mention quad-core CPUs though, so my reply is still valid.


Multi-GPU should definitely be supported.

After the R800, AMD-ATI will only make multi-core GPU cards (to increase ROPs as well as shader ALUs) in the high-end sector.


I don't think it will be supported, but who knows. Dual GPUs don't seem to be liked by developers.

Quote: I don't think it will be supported, but who knows. Dual GPUs don't seem to be liked by developers.

That's their choice; naturally, it will be followed by a reaction, according to Newton's third law.

If multi-GPU is the future, developers should start to worry about their continuity.

It could be patched in later, of course.


I bought the first dual core. Were there any games for it? No. Did any games come out for it within the first few years? No. Was it praised, with everyone telling you to buy it? Yes.

It takes time for the devs to learn all the new programming, and mostly it needs huge overhauls in the code. Nothing you just do overnight. I'm still shocked by Falcon 4.0 AF though, which came out with a dual-core patch. I had 12 FPS in the campaign over airbases, and after the patch I had 40-50 FPS. And that was done with a patch. It wasn't "true dual core" coding, but it still worked brutally well.
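For the curious, a dual-core patch like that usually just moves some self-contained work (AI, flight model, etc.) off the render thread. Here is a minimal sketch of the idea in C++; the names (GameState, updateSimulation, renderFrame) are made up purely for illustration, and this is not how Falcon or ArmA actually do it:

    #include <thread>
    #include <functional>  // std::ref
    #include <utility>     // std::swap

    // Hypothetical game state and per-frame work, for illustration only.
    struct GameState { /* positions, AI state, ... */ };

    void updateSimulation(GameState& s) { /* AI and physics for the next frame */ }
    void renderFrame(const GameState& s) { /* draw the current frame */ }

    int main() {
        GameState current{}, next{};
        for (int frame = 0; frame < 1000; ++frame) {
            // Simulate frame N+1 on the second core while this core renders frame N.
            std::thread sim(updateSimulation, std::ref(next));
            renderFrame(current);
            sim.join();                // wait for the simulation to finish
            std::swap(current, next);  // the simulated frame becomes the one to draw
        }
    }

Even this crude two-way split can roughly double throughput when simulation and rendering each eat about half the frame time, which would be consistent with the jump described above.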

Still, sure, it would be nice to have support for all the new technologies, but I think it's harder to slap in than you think. Especially if the dev team doesn't have many spare coders, coders that must have training in all these technologies. The coders making games today didn't have multi-core/SLI technology on their schedules when they went to coding school.


I might be wrong though; in that case, go ahead and slap me.

Have a good one now.

Alex

Quote: I'm wondering if someone, preferably a BIS developer, could tell me whether ArmA2 will work well with quad-core CPUs and dual-GPU cards.

If not, please make ArmA2 take advantage of quad-core CPUs and dual-GPU cards (GeForce GX2, Radeon X2, etc.).

You do realize those cards use internal Crossfire/SLI, right? I'd rather the game be optimized properly for the latest hardware and have excellent scaling for both Crossfire and SLI. Also, you do realize the 9800 GX2 is no longer in production and is obsolete.

Quote: I'm wondering if someone, preferably a BIS developer, could tell me whether ArmA2 will work well with quad-core CPUs and dual-GPU cards.

If not, please make ArmA2 take advantage of quad-core CPUs and dual-GPU cards (GeForce GX2, Radeon X2, etc.).

You do realize those cards use internal Crossfire/SLI, right? I'd rather the game be optimized properly for the latest hardware and have excellent scaling for both Crossfire and SLI. Also, you do realize the 9800 GX2 is no longer in production and is obsolete.

Oh, my bad. It was an example. *laughs*

Don't come in here acting like a smartass when it brings nothing to the topic. You're just embarrassing yourself. I didn't mean any specific cards, just dual-GPU cards in general (like the GTX 295, is that good enough for you?).

I agree though, Crossfire/SLI support is of utmost importance.


But how many people own such a system? SLI and Crossfire systems are very expensive, and they are still a niche product for high-end systems.

Quote: I agree though, Crossfire/SLI support is of utmost importance.

I beg to differ.

On the Steam Hardware Survey (which, considering the popularity of Valve games, is quite a good representation of gaming hardware worldwide), only about 2% of users have SLI or Crossfire systems. I'd much rather the sort of coding effort required to make SLI work were spent on making the game run smoothly for the rest of us mere mortals, who have invested our money wisely in more sensible machines.

Edited for typos

Quote: I agree though, Crossfire/SLI support is of utmost importance.

I beg to differ.

On the Steam Hardware Survey (which, considering the popularity of Valve games, is quite a good representation of gaming hardware worldwide), only about 2% of users have SLI or Crossfire systems. I'd much rather the sort of coding effort required to make SLI work were spent on making the game run smoothly for the rest of us mere mortals, who have invested our money wisely in more sensible machines.

I've had plenty of SLI issues over the past two years, mostly due to Vista/Nvidia.

ArmA plays perfectly with one card, though I would love to have the performance of both cards, as I see in games that do support SLI.

It would be nice if ArmAII started off with SLI support.


I'm not even sure if that's entirely up to BIS.

It may very well be that decent SLI/Crossfire support is as much up to the driver teams of the GPU manufacturers as to the dev team of a game.

The question might actually be "will NVIDIA and AMD support this game enough to optimise their drivers for it and send in their support teams to get to this point?".

I don't really know, however (I've never worked at a company that makes games).

Quote (Mar. 08 2009, 17:56): I bought the first dual core. Were there any games for it? No. Did any games come out for it within the first few years? No. Was it praised, with everyone telling you to buy it? Yes.

It takes time for the devs to learn all the new programming, and mostly it needs huge overhauls in the code. Nothing you just do overnight. I'm still shocked by Falcon 4.0 AF though, which came out with a dual-core patch. I had 12 FPS in the campaign over airbases, and after the patch I had 40-50 FPS. And that was done with a patch. It wasn't "true dual core" coding, but it still worked brutally well.

Still, sure, it would be nice to have support for all the new technologies, but I think it's harder to slap in than you think. Especially if the dev team doesn't have many spare coders, coders that must have training in all these technologies. The coders making games today didn't have multi-core/SLI technology on their schedules when they went to coding school.

I might be wrong though; in that case, go ahead and slap me.

Have a good one now.

Alex

There weren't many games for it, other than Quake III with SMP support and perhaps IL-2 Sturmovik.

But there was always support for encoding/decoding, as well as video/graphics editing.

What I would like to see is game developers already starting to develop games with support for vector extensions and floating-point operations on the CPU.

In 2011-2012 CPUs will make a HUGE leap forward, perhaps to the point where the GPU becomes redundant.
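To give an idea of what "support for vector extensions" means in practice, here is a minimal sketch in C++ using SSE intrinsics; the function and its parameters (scale, data, n, factor) are invented purely for illustration. The CPU processes four floats per instruction instead of one:

    #include <xmmintrin.h>  // SSE intrinsics

    // Scale an array of floats four at a time.
    // Assumes n is a multiple of 4 and data is 16-byte aligned.
    void scale(float* data, int n, float factor) {
        __m128 f = _mm_set1_ps(factor);               // broadcast factor into 4 lanes
        for (int i = 0; i < n; i += 4) {
            __m128 v = _mm_load_ps(data + i);         // load 4 floats at once
            _mm_store_ps(data + i, _mm_mul_ps(v, f)); // multiply and store 4 at once
        }
    }

This is exactly the kind of data-parallel floating-point work a game could keep on the CPU if the vector units keep getting wider.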

Quote: On the Steam Hardware Survey (which, considering the popularity of Valve games, is quite a good representation of gaming hardware worldwide), only about 2% of users have SLI or Crossfire systems. I'd much rather the sort of coding effort required to make SLI work were spent on making the game run smoothly for the rest of us mere mortals, who have invested our money wisely in more sensible machines.

You assume that SLI/Crossfire is going to be a high-end, rarely used technology forever?

The technology behind SLI and Crossfire will be integrated into a single card in about half a year; if that doesn't work properly, you can forget about playing ArmA 2 on your new mid-range PC without the occasional hiccup that renders your game useless.

Quote: What I would like to see is game developers already starting to develop games with support for vector extensions and floating-point operations on the CPU.

In 2011-2012 CPUs will make a HUGE leap forward, perhaps to the point where the GPU becomes redundant.

Take a look at Intel's "Larrabee" project. Things seem to be heading the way you describe.

Quote: You assume that SLI/Crossfire is going to be a high-end, rarely used technology forever?

The technology behind SLI and Crossfire will be integrated into a single card in about half a year; if that doesn't work properly, you can forget about playing ArmA 2 on your new mid-range PC without the occasional hiccup that renders your game useless.

I'm confused; internal SLI has been around for quite a while now, since around 2006 with the 7950GX2. Yet that sort of technology has been stuck in the high end because nVidia has a bit of a chicken-and-egg problem: nobody supports SLI because nobody has bought it. That said, even when it's working at its best, two cards will only be about 50% faster than one. Therefore, the benefit is dubious at best. Of course, I would never be so cynical as to suggest that it's a cheap ploy by nVidia to make people purchase more cards than they need.

Quote: What I would like to see is game developers already starting to develop games with support for vector extensions and floating-point operations on the CPU.

In 2011-2012 CPUs will make a HUGE leap forward, perhaps to the point where the GPU becomes redundant.

Take a look at Intel's "Larrabee" project. Things seem to be heading the way you describe.

Quote: You assume that SLI/Crossfire is going to be a high-end, rarely used technology forever?

The technology behind SLI and Crossfire will be integrated into a single card in about half a year; if that doesn't work properly, you can forget about playing ArmA 2 on your new mid-range PC without the occasional hiccup that renders your game useless.

I'm confused; internal SLI has been around for quite a while now, since around 2006 with the 7950GX2. Yet that sort of technology has been stuck in the high end because nVidia has a bit of a chicken-and-egg problem: nobody supports SLI because nobody has bought it. That said, even when it's working at its best, two cards will only be about 50% faster than one. Therefore, the benefit is dubious at best. Of course, I would never be so cynical as to suggest that it's a cheap ploy by nVidia to make people purchase more cards than they need.

You are absolutely wrong with your scaling statistics. Most video cards scale at 80%+ today, depending on the video card configuration, drivers, etc. Just read some Anandtech reviews.


I took a look at the Anandtech review of the GTX 295. Some games were scaling as high as 60-70%, which shows things have obviously improved somewhat since the last generation of cards. Still, is spending 100% more to get 60-70% more really a good idea? And you're still ignoring the fact that most people don't have €800 to spend on two high-end graphics cards. If nVidia were going to go mainstream with SLI, I think they would have done it long before now. The technology has been kicking around for almost five years.
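To put rough numbers on that (€400 per card, 50 FPS for one card, and 65% scaling are assumed figures, picked only for illustration): a single card gives 50 FPS for €400, or €8 per frame per second; two cards at 65% scaling give 82.5 FPS for €800, or about €9.70 per frame per second. That is roughly 21% worse value per frame, before you even count the extra PSU and motherboard requirements.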


With an SoC like Lucid's Hydra Engine, maybe the developer never has to care about multi-GPU at all.

It depends on how it will work, and we will know before the end of this year.


About getting an extra 60-70% power: the question is not that you pay 100% more without getting a 100% improvement. The point is that no single card will deliver what the other two do (talking high-end). Of course you can play the game with the settings low and a view distance of 250 m with great performance, but I would love to play the game at 50-60 FPS with all settings on highest and a view distance of 15,000 m, if I have the money to pay for the required hardware.

Not every game out there meets the hardware demands this game can make at a given moment, especially as regards the view distance; therefore, making the game capable of taking advantage of every possible technology upgrade should be a must.

It's like purchasing a Bugatti Veyron with narrow tires... let us run!

Quote: With an SoC like Lucid's Hydra Engine, maybe the developer never has to care about multi-GPU at all.

It depends on how it will work, and we will know before the end of this year.

Lucid Hydra will probably be implemented in the CPU at some point, since Intel is doing some serious backing and co-research; or maybe it will actually be offloaded by the CPU with an extension superset or a co-processor to accelerate/emulate it.

Intel is already integrating its latest GMA GPU, and a separate QuickPath Interconnect-like interface to a discrete dedicated PCI Express graphics card, in its new Core i5.

I don't think the Lucid chip will last long; like Ageia and nVidia, it will probably be integrated in a whole different way that is more cost-effective.

In the long term, parallel (master-slave) GPU technology will disappear, but in the next couple of years it will still be used.

It is already integrated in the high-end 9800GX2 and GTX 295, and in the high- and mid-end HD*8*0x2.

It's always better to fully support short-term tech and make it modular enough that it can easily be patched for long-term tech.
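As a sketch of the kind of modularity meant here (all type and function names below are invented for illustration, not anything from ArmA or a real driver API): hide the frame-dispatch strategy behind a small interface, so that single-GPU today and alternate-frame multi-GPU tomorrow can be swapped in a patch.

    #include <memory>

    // Hypothetical interface isolating the multi-GPU strategy from the engine.
    struct IFrameDispatcher {
        virtual ~IFrameDispatcher() = default;
        virtual void submitFrame(int frameIndex) = 0;
    };

    // Today's path: everything goes to the one and only GPU.
    struct SingleGpuDispatcher : IFrameDispatcher {
        void submitFrame(int /*frameIndex*/) override {
            // issue draw calls to GPU 0
        }
    };

    // A later patch: classic alternate-frame rendering (AFR), the master-slave
    // scheme SLI and Crossfire use; even frames to GPU 0, odd frames to GPU 1.
    struct AfrDispatcher : IFrameDispatcher {
        void submitFrame(int frameIndex) override {
            int gpu = frameIndex % 2;
            (void)gpu; // issue draw calls to the selected GPU
        }
    };

    int main() {
        std::unique_ptr<IFrameDispatcher> d = std::make_unique<AfrDispatcher>();
        for (int frame = 0; frame < 4; ++frame)
            d->submitFrame(frame);  // frames alternate between the two GPUs
    }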

Quote: I don't think the Lucid chip will last long; like Ageia and nVidia, it will probably be integrated in a whole different way that is more cost-effective.

Intel is putting it on their next-gen X58 motherboard, or at least considering it.

EDIT: Source

Quote: I don't think the Lucid chip will last long; like Ageia and nVidia, it will probably be integrated in a whole different way that is more cost-effective.

Intel is putting it on their next-gen X58 motherboard, or at least considering it.

EDIT: Source

That would be very contradictory to what Nehalem/QPI/X58 is all about.

See, the idea of making a NUMA-architecture CPU was that mobocos (motherboard companies) wouldn't need to put so many chips on their boards.

It could be true, but I doubt it. I can't think of anybody who wants a motherboard that is more expensive than their Core i7 Extreme Edition.

And I hope I am right.


Intel is supposedly one of the major investors in this project. Nope, it ties in perfectly.

Intel already provides SLI and Crossfire support on the X58 (or at least provides the option to board makers), yet it doesn't want to pay licensing fees. There's now also the problem that they have fallen out with nVidia big time, and nVidia may stop letting manufacturers have SLI on motherboards with Intel chipsets. Add in the fact that Intel is releasing its own graphics chipset early next year and will probably want some sort of dual-card system.

I have no idea where you got the idea that including this sort of tech is going to make the motherboard cost more than €1,000...

Quote: Intel is supposedly one of the major investors in this project. Nope, it ties in perfectly.

Intel already provides SLI and Crossfire support on the X58 (or at least provides the option to board makers), yet it doesn't want to pay licensing fees. There's now also the problem that they have fallen out with nVidia big time, and nVidia may stop letting manufacturers have SLI on motherboards with Intel chipsets. Add in the fact that Intel is releasing its own graphics chipset early next year and will probably want some sort of dual-card system.

I have no idea where you got the idea that including this sort of tech is going to make the motherboard cost more than €1,000...

Yes, it does tie in perfectly.

And I was being a tad ironic about the mainboard being priced higher than 1,000 euros. But the problem is that PCBs (in this case the motherboard) get more expensive when you put more silicon on them. If you can integrate as much as possible on the CPU die, it's going to be cheaper, as the prices for CPUs are constant.

Lots of AMD-hardware boards are much cheaper than pre-X58 Intel-hardware boards.

The Lucid Hydra is a great technology, but why add more of the same if you can change your existing setup a bit?

You don't need a micro-OS to run the Hydra software on, as you can do that through drivers; you don't need the chip, as you can add a SIMD extension and offload it to the CPU (or the CPU's integrated GPU) with minimal load; and you have direct access to the RAM.

Maybe it works over a very short timespan, but I can't imagine mobocos having a long enough breath to keep this up for more than half a year.

