ASUS A93S (K93SV) laptop
Intel HD3000 + Nvidia GeForce GT 540M
Win 7 64-bit
Steam version of Arma II: Combined Operations
Arma II uses the Nvidia chip: crisp, clear menus and splash screen, decent graphics and textures, and 30 FPS (as expected on this laptop).
I also have America's Army; after adding it to the Nvidia 3D Settings program list, it too uses the Nvidia chip.
In fact, I can add virtually any software (Word, etc.) to that list and it will use the Nvidia chip.
However, Operation Arrowhead absolutely refuses to use the Nvidia chip (or is refused?); it only uses the Intel HD3000.
I can tell straight away because the text in the splash screen is blurry (although the images are fine), the game landscape has
texture anomalies and glitches, the in-game menu text is blurred (again, even though the images are fine), and the frame rate is
down to 15 FPS. The monitoring tools I have (GPU-Z, Optimus Tools and Nvidia Inspector) all show that the Nvidia chip is not
being used; GPU-Z shows 100% GPU load on the HD3000 chip.
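
As far as I understand Optimus, the laptop display is always wired through the Intel chip, so a Direct3D 9 application typically enumerates only the HD3000 even when the Nvidia chip is rendering behind it. That makes it genuinely hard to tell which GPU a game actually got. A minimal sketch of the kind of check I mean, assuming a plain Win32/D3D9 build (the adapter names are whatever the driver reports):

    // Lists the display adapters a Direct3D 9 application can see.
    // On Optimus machines this often reports only the Intel chip,
    // because the Nvidia GPU renders offscreen and hands the frames
    // to the Intel display pipeline.
    #include <d3d9.h>
    #include <cstdio>

    #pragma comment(lib, "d3d9.lib")

    int main() {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        UINT count = d3d->GetAdapterCount();
        for (UINT i = 0; i < count; ++i) {
            D3DADAPTER_IDENTIFIER9 id = {};
            if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
                std::printf("Adapter %u: %s\n", i, id.Description);
        }

        d3d->Release();
        return 0;
    }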
I've spent hours trying to figure this out. Here's a broad list of what I've done:
· Using the right-click context menu to start the game via the 'Run with graphics processor' option.
· Nvidia 3D Settings: adding, deleting and editing the game profile with Nvidia Inspector after exhausting the options in the 3D Settings dialogue itself.
· Updating Nvidia drivers. Rolling back drivers. Installing original drivers from ASUS website. Choosing random old drivers from Nvidia.
· Reinstalling Arma (including deleting the player profiles/config, and using CCleaner to clear out the Windows Registry). Installing Arrowhead betas.
· Using the -winxp parameter so that DirectX 9 is forced (I thought the text blur might be caused by DirectX); example shortcut target below.
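
For reference, the -winxp test was done with a shortcut target along these lines (the install path is just an example from a default Steam layout; yours may differ):

    "C:\Program Files (x86)\Steam\steamapps\common\Arma 2 Operation Arrowhead\ArmA2OA.exe" -winxp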
My first question has to be: does Arma II use some kind of clever architecture that makes it appear to
monitoring tools like GPU-Z, Optimus Tools and Nvidia Inspector as though it is using the integrated
graphics instead of the Nvidia chip?
If not, then my second question is: has anyone heard of this issue before? Is the cause known?
What could cause ONE SINGLE application (and incidentally the most graphics-intensive application!) not to use the Nvidia GPU?
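
The only per-application mechanism I've come across, besides the driver-side profiles, is a hint that the executable itself can export; from what I've read, 302-series and newer Optimus drivers check for it. If Arma II's exe carries that hint and Arrowhead's doesn't, it could produce exactly this one-game behaviour. A minimal sketch of that developer-side export in C++ (illustration only; obviously not something I can patch into the game myself):

    // Optimus application hint: exporting this symbol from the .exe asks
    // drivers that support it (302-series and newer, as far as I know)
    // to prefer the Nvidia GPU for this process.
    #include <windows.h>

    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; // 1 = prefer the dGPU
    }

    int main() {
        // A real game would create its D3D/OpenGL device here;
        // the driver only inspects the exported value above.
        return 0;
    }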
Thanks.