infiltrator_2k

How Much Is There To See?

Recommended Posts

Card upgrades obviously mean we not only get faster performance, but also get to see and appreciate more detail. Although I'm just wondering... how much detail is there actually to see? For example: do the latest cards allow us to see it all? Or is there detail locked away that can only be seen with future cards?


Somehow I am not able to understand your threads at all. Just try the highest settings and see how it turns out.


It's a question of how many fps and how much stuttering you're willing to accept while gaming.

With a bad PC you get 5 fps and heavy stuttering with all details switched on. A PC that can run everything on very high with a 10 km view distance, without stuttering and at a constant 60 fps, hasn't been invented yet.


If you're wondering, set up a scene, take a screenshot, crank the settings, take another screenshot, and compare the difference.
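The suggestion above can be sketched in code: compare a low-settings capture against a high-settings capture pixel by pixel and see how much actually changed. This is a toy illustration only; the nested lists and threshold stand in for real screenshot data and an image library.

```python
# Toy sketch of the screenshot-comparison idea: diff two "screenshots"
# pixel by pixel. Real screenshots would be loaded with an image library;
# here plain nested lists of grayscale values stand in for pixel data.

def diff_pixels(img_a, img_b):
    """Return per-pixel absolute differences between two same-sized images."""
    return [
        [abs(a - b) for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

def changed_fraction(img_a, img_b, threshold=8):
    """Fraction of pixels that differ by more than `threshold` levels."""
    diff = diff_pixels(img_a, img_b)
    total = sum(len(row) for row in diff)
    changed = sum(1 for row in diff for d in row if d > threshold)
    return changed / total

# Two 2x4 "screenshots": low settings vs high settings.
low  = [[10, 10, 10, 10], [20, 20, 20, 20]]
high = [[10, 50, 10, 10], [20, 20, 90, 20]]
print(changed_fraction(low, high))  # 2 of 8 pixels changed -> 0.25
```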

If you're wondering, set up a scene, take a screenshot, crank the settings, take another screenshot, and compare the difference.

I thought the overall image quality was dictated by the card's technologies, but obviously not :o


The graphics quality outside is stunning, not to mention the surround sound and the utterly hyperrealistic physics. Just too bad I can't wave a gun around :( :p

The graphics quality outside is stunning, not to mention the surround sound and the utterly hyperrealistic physics. Just too bad I can't wave a gun around :( :p
Oh, you can, but the damned thing is so heavy, the shoes are bad, the mud is so muddy, it may be raining, and being yelled at by an Unteroffizier is often not that funny at all... and in the end they'll send you to A-stan for at least 4 months.

The graphics quality outside is stunning, not to mention the surround sound and the utterly hyperrealistic physics. Just too bad I can't wave a gun around :( :p

Anything's possible in the Matrix... You just need to focus :D


Don't want to hijack your thread, infiltrator, but about the Army: I sorted it out by having myself sorted out. I waited for my conscription call, but then I found a good doc who understood that I had other plans, and he wrote them a letter that rendered me unserviceable. So I unfortunately never even got to shoot a fish in a barrel. But I've heard that doing the 4 months in the Kunduz camp is very well paid... damn

I thought the overall image quality was dictated by the card's technologies, but obviously not :o

It is, but so are the monitor, your eyes, the ambient lighting of the room, etc. Image quality has nothing to do with game detail like you were talking about in your first post. ArmA 2 is a DirectX 9 game, so it should display on any Shader Model 3 card. The elements of the images in the game (frame rate notwithstanding) are model complexity, texture resolution, and post-process effects (I don't think the lighting scales in ArmA 2).

What you're talking about in this post is the kind of obscure hardware difference that reviewers spend one paragraph on in a 10-page hardware review. ATI tends to produce slightly more contrasty images than Nvidia, and image quality probably differs between different types of cards by some 1 to 5 percent. Compared to the differences you will see in ArmA by changing the PP from Low to High, it's not even worth talking about. Even in my experience with DX11 compared to DX9 in games like Aliens vs. Predator (2010), the differences between the two APIs are not really something you notice unless you're staring at very specific static imagery.

Edited by Max Power

You have an ATI 6870, what the F are you talking about? o.O

So that's it... I've hit the ceiling as far as seeing all the detail goes, by being able to game at 1920x1200 on very high settings at a decent frame rate. No matter what card comes out in the future, it won't allow me to see any more detail than I can already see. I get it now, although I honestly believed there was a lot more to it than that.


The amount of detail is fixed by design; any game engine can only manage and draw so many objects before it goes knobs-up.

So that's it... I've hit the ceiling as far as seeing all the detail goes, by being able to game at 1920x1200 on very high settings at a decent frame rate. No matter what card comes out in the future, it won't allow me to see any more detail than I can already see. I get it now, although I honestly believed there was a lot more to it than that.

You could have figured that out yourself :rolleyes: :

Fire up the good ol' OFP. It doesn't look much different from what it did in 2001, right? :D


I'm quite ignorant when it comes to the science behind graphics processing, so I'd always envisaged the developers sat behind powerful computers modelling graphics with professional cards that cost several thousand pounds: cards too advanced and expensive for the consumer market at the time of the game's development. That in turn would leave the consumer waiting for the same processing technology used to create the graphics to become affordable before they could visually appreciate the game and see it as it was intended to be seen. I now know this isn't the case, which leaves me with mixed emotions. I'm glad I'm now viewing ArmA 2 how it's supposed to be viewed, although knowing that future upgrades will only let me raise the resolution and frame rate is a little disappointing. The idea and anticipation of seeing more and more detail with every card upgrade was, until now, quite exciting. :p


Modelling doesn't actually require graphics cards; in fact, 3D modelling software has only just started to use any hardware acceleration at all!

Back in the day, your ideas about the industry were true. I have actually worked on a Silicon Graphics machine, a computer dedicated to video production. Nowadays, though, any personal computer has the horsepower necessary to run the tools. The special things these companies have now are proprietary software tools and pipelines, render farms (where a room full of computers shares render tasks instead of taking up the resources of the artists' computers), and so on. The specialization is more or less in the methods rather than in any kind of fancy hardware.

Edited by Max Power

So that's it... I've hit the ceiling as far as seeing all the detail goes, by being able to game at 1920x1200 on very high settings at a decent frame rate. No matter what card comes out in the future, it won't allow me to see any more detail than I can already see. I get it now, although I honestly believed there was a lot more to it than that.

The better the card, the more detail you will get, and you will get nicer, smoother edges with a better card.

He's not talking about performance; he's just asking whether it will look much better with a nicer card.


Attention to detail is dictated by the artist and by the player's level of spatial acuity, or how observant he is. A graphics card won't contribute to it at all.

If you mean detail by itself, then yes, a better card means you can afford to crank up the settings, which raises the level of detail. But attention to detail is a separate thing.

Attention to detail is dictated by the artist and by the player's level of spatial acuity, or how observant he is. A graphics card won't contribute to it at all.

If you mean detail by itself, then yes, a better card means you can afford to crank up the settings, which raises the level of detail. But attention to detail is a separate thing.

I think more often than not it's limited by time, budget, and processing constraints rather than by a bottleneck at the artist.


Back in the days of 'Return To Castle Wolfenstein', I remember upgrading my card and finding myself staring at the game's brickwork walls, thinking "I've not seen that before!" Graphics that previously looked like blocks of multi-coloured murky blur now actually looked like textured, weathered blocks of natural stone. The soldiers' faces also looked smoother, more defined and more realistic. The difference was unreal!


Whoa guys, really, back up a sec; there's a bit of misunderstanding here. A graphics card that is capable of all the rendering procedures will give the best possible graphics. For example, trying to run a game that uses Shader Model 3.0 on a card that only has Shader Model 2.0 means you will not see the SM3.0 effects. If all the rendering features are supported, however, from then on it's just a case of how fast the graphics card (and, to a lesser extent, the CPU) is. It's not viable to play a game at 5 fps with everything absolutely maxed out, so you, the user, will lower the resolution, texture detail, etc. until it becomes playable.

When 3D graphics (models, textures, etc.) are created, I can almost guarantee that what you see in the game is NOT what the artist initially created. Usually they create assets at a higher resolution and then scale them down for the end product to meet certain requirements (such as frame rate on the 'recommended specs'). What you have on the CD or in the game installation is that scaled-down version, not the original. No matter how good your graphics card is, it cannot recreate detail that was never there in the first place. The only exception is how things are rendered. With anti-aliasing, for example, the game still renders the same models and textures, but AA is applied by your graphics card afterwards to smooth things out. Anisotropic filtering is another: it makes textures appear sharper at oblique angles because of the rendering technique. But at no point does either touch the actual texture or game model; neither adds detail to models or textures. They may apply a filter over the top, but that's about it.
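The point about downscaled assets can be illustrated with a toy example: once neighbouring texels have been averaged down, no amount of upscaling on the player's end brings the original detail back. The 1-D texel row and box filter here are illustrative stand-ins, not real engine code.

```python
# Sketch of why a scaled-down texture can't get its detail back: averaging
# neighbouring texels throws information away, and upscaling afterwards can
# only repeat (or interpolate) what survived. A 1-D row of texels keeps it simple.

def downscale(texels):
    """Halve resolution by averaging each pair of texels (a box filter)."""
    return [(texels[i] + texels[i + 1]) / 2 for i in range(0, len(texels), 2)]

def upscale(texels):
    """Double resolution by repeating each texel (nearest-neighbour)."""
    return [t for t in texels for _ in range(2)]

original  = [0, 100, 0, 100]        # fine alternating detail in the source art
shipped   = downscale(original)     # what ends up on the disc
recovered = upscale(shipped)        # the best any card can reconstruct

print(shipped)    # [50.0, 50.0] -- the alternating pattern is gone
print(recovered)  # [50.0, 50.0, 50.0, 50.0] -- no card can restore it
```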

Back in the days of 'Return To Castle Wolfenstein', I remember upgrading my card and finding myself staring at the game's brickwork walls, thinking "I've not seen that before!" Graphics that previously looked like blocks of multi-coloured murky blur now actually looked like textured, weathered blocks of natural stone. The soldiers' faces also looked smoother, more defined and more realistic. The difference was unreal!

This, for example, would simply be a case of the previous graphics card not supporting a certain rendering feature the game is capable of, forcing the game to leave it out. The newer card supported it and thus gave a better appearance. But it did not 'improve' the original graphics in any way; rather, the older card had to 'lower' the graphics quality because it wasn't capable of showing them correctly in the first place.
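That fallback behaviour can be sketched roughly like this: an unsupported effect isn't rendered badly, it's simply skipped. The shader-model numbers and effect names below are made up for illustration; this is not a real graphics API.

```python
# Hedged sketch of the "missing feature" point: a card that doesn't support
# an effect doesn't render a worse version of it -- the engine simply leaves
# it out. Shader-model numbers and effect names here are illustrative only.

def render_pass(card_shader_model, effects):
    """Return the effects that actually get drawn on this card."""
    return [name for name, required_sm in effects if card_shader_model >= required_sm]

effects = [
    ("base textures",  2.0),
    ("normal mapping", 2.0),
    ("HDR lighting",   3.0),   # hypothetical SM3.0-only effect
]

print(render_pass(2.0, effects))  # ['base textures', 'normal mapping']
print(render_pass(3.0, effects))  # all three effects get drawn
```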

Edited by Millenium7


Many thanks for explaining things, Millenium7 ;) I might be talking utter b****cks now, but from what I can understand, I think it's much more to do with how well the card can handle the way the game engine animates the graphics, as well as how well it applies the dynamic effects that create the realism. I mean, almost any PC can display a static image as it's supposed to be seen, as long as it's displayed at the right colour depth and resolution. However, once images become animated and their dynamic effects fluid, it's obviously going to demand huge resources from the card. So, in short, what I'm trying to say is: I believe the detail is all there to see when you have ArmA 2 maxed out, although how smoothly and crisply you'll see it move will be down to the performance of the card.

Edited by Infiltrator_2K


Not exactly. The processor is also very important when it comes to ArmA 2's visuals. At home I can run ArmA 2 at high and very high settings with a 3 km view distance and good frame rates. But when I go to play it at a friend's house, I play on normal settings with a 1.6 km view distance. The only difference between my PC and the one at my friend's place is the processor: the GPUs are similar, but I have a quad-core and his is a dual-core.


As a matter of fact, I dare say that a better CPU would provide more benefit than a better GPU in almost every case. A better CPU allows higher view distances and more AI at a time, with less hitching from scripts and triggers.
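A rough back-of-the-envelope sketch of why view distance hits the CPU so hard: the area the engine has to simulate grows with the square of the view distance, so the number of objects and AI units (all CPU-side work) grows quadratically too. The object density below is an invented figure for illustration, not anything measured from ArmA 2.

```python
# Rough sketch: CPU-side workload vs view distance. The area inside the view
# circle grows with the square of the distance, so the object/AI count the
# CPU must manage grows quadratically. Density is a made-up illustrative number.
import math

def objects_in_view(view_distance_km, objects_per_km2=500):
    """Estimate objects inside the view circle (density is illustrative)."""
    area = math.pi * view_distance_km ** 2
    return int(area * objects_per_km2)

for d in (1.6, 3.0, 10.0):
    print(d, objects_in_view(d))
# Going from 1.6 km to 3 km more than triples the CPU-side workload,
# and 10 km is roughly 39x the 1.6 km load -- which is why "everything
# maxed with a 10 km view distance" isn't playable on any current machine.
```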


Yeah, ArmA 2 is known for being CPU-intensive; that's the main reason for my OC. I guess it's about building a machine whose components run level with each other in performance.
