WhiskeyBullets

Dual Core User try the following


Seriously, Whiskey, you need to stop now. You have gone beyond ridiculous to insane. You sound like Baghdad Bob telling reporters how the Iraqi army was slaughtering US forces while the building he was in was surrounded by US troops. I mean c'mon dude, get real. The facts are right there in print; you listed them yourself. C2D and Nvidia cards will rule ArmA in performance. That's because Nvidia cards use dual cores to improve performance, even in (get this) single-threaded applications.

What I found offensive was you telling a guy to buy a new power supply based on watts alone. Not one symptom he listed was an indication of power issues. You were right that he needed more power with that rig, but you were wrong in your diagnosis. Watts mean nothing in a PSU; it's all about amps and stability. I've seen a 600-watt PSU with 45A on the 3.3V rails and 15A on the 12V rails: basically a 350-watt PSU with too much power on the low-voltage rails. So by not even asking him about the amps, you showed me you know nothing about the subject, yet you told him to spend $$$ as if you were an authority. For that alone (if I were an admin) I would have pulled your account.
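For anyone following the amps-versus-watts point, here's a rough sanity check of that example PSU. This is just the arithmetic behind the quote above (I'm assuming the "3V" rails mentioned are the usual 3.3V rails; the amp figures are the ones quoted, not measurements):

```python
# Back-of-the-envelope check: a PSU's usable power is volts x amps per rail,
# not the number printed on the box. Figures are from the example above.
rails = {
    "3.3V": {"volts": 3.3, "amps": 45},   # "45A on the 3V rails"
    "12V":  {"volts": 12.0, "amps": 15},  # "15A on the 12V rails"
}

for name, r in rails.items():
    print(f"{name} rail: {r['volts'] * r['amps']:.1f} W")

total_watts = sum(r["volts"] * r["amps"] for r in rails.values())
print(f"Combined: {total_watts:.1f} W")  # ~328 W from a "600 W" unit
```

Which is why a "600 W" unit with its amps on the wrong rails behaves like a ~350 W one under a 12V-heavy gaming load.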

So please, if you're incapable of admitting you're wrong, then deleting your account would suffice instead.


lol ethne, your computer knowledge would fit on the head of a pin. You seemed to want to make this personal, which is sad. I never made this personal from the start, but you only want to keep insulting, which is fine. The main reason I started this thread is to help the users out there know why dual core was not what the game required and to educate them as to why. So keep up with the insults; I'm quite enjoying them, because at the end of the day I will be right and you will be wrong.

It all boils down to this: the game was designed for the spec listed above. Why do you think that dual core users are having issues with VRAM usage? It's simple really: it is the timing effect affecting different devices.

What the hell... more crap? I've just looked through the entire troubleshooting forum, and the only mentions of VRAM issues I can find are of people with high-end cards (8800 or 7950GX2) not using all the VRAM (stutters as textures get loaded and dumped) and people with low-end cards having issues as their cards struggle to load/dump the textures into their limited VRAM. Most people having these issues are also using single-core Athlons or P4s...

Good luck with threatening me with a warning level, because everything I stated in this thread is 100% correct, and the forum admins know it.

It wasn't a threat, and it certainly wasn't a threat of warning levels. It was simply referring to the Biki sysops, who would be speaking to whoever edited the specs page to read "dual core not supported", which is untrue.

lol ethne, your computer knowledge would fit on the head of a pin. You seemed to want to make this personal, which is sad. I never made this personal from the start, but you only want to keep insulting, which is fine. The main reason I started this thread is to help the users out there know why dual core was not what the game required and to educate them as to why. So keep up with the insults; I'm quite enjoying them, because at the end of the day I will be right and you will be wrong.

Really, I've been in Networking since I was 16. I've owned my own business since I was 19. I am now 35. I'm not taking it personally at all, I'm just sick and tired of watching you lie to people. If my computer knowledge would fit on the head of a pin, the space where your computer knowledge would fit must be immeasurably smaller.

One thing I can do which you seem to have a very serious problem doing, is accepting fact. Fact that is plastered across every Tech site on the WWW. C2D is absolutely without equal at present. It wipes the floor with your poxy little FX-57 in any and every test. Please STFU now, you really have nothing else to say.

E

It all boils down to this: the game was designed for the spec listed above. Why do you think that dual core users are having issues with VRAM usage? It's simple really: it is the timing effect affecting different devices.

I don't have any issues with VRAM usage. My game works fine, and my PC outperforms PCs with similar specs but only one core. All the tests prove it. What's wrong with you? How can you ignore all those tests? Do you think we made them all up?



This is comical.

From the foliage thread:

whiskeybullets: what type of power supply are you using?

Me thinks someone has a power supply fetish. The horror...The horror...


This is comical.

From the foliage thread:

whiskeybullets: what type of power supply are you using?

Me thinks someone has a power supply fetish. The horror...The horror...

Yes, you really have to wonder about this guy. I don't think I've ever seen someone be so wrong for so long with absolutely no intention of accepting reality. Most of us would touch the hot stove, realise it was hot, and step back. This guy would hold on until his whole arm had burned off, the whole time refusing to admit it was hot.

E


Ahem....

[image: mugabeve5.jpg]

There is inadequate data on Armed Assault performance, and none of it was gathered on clean-room machines. Therefore it is all entirely spurious and utterly useless. I frankly don't care if you're running it in a WINE VM on an IBM z9 or on your PDA. The point is those numbers mean absolutely nothing to anyone. The only person who gives a damn is the sales dude laughing about how you blew all your beer money buying cooked sand from him to float his own beer money.

Now for the technical flamage. The entire dual-core hardware transition is an idiotic attempt to short-circuit the dereliction of duty by the software designers. In this sense Microsoft is not wholly to blame, as the Unix world has in large part failed to make any cohesive attempt at standardized multitasking capabilities, and academia again proves its lethargic uselessness by speculating in theoretical realms devoid of practical reality.

System virtualization has recently gained prominence as a method to increase hardware utilization by bypassing the software's limitations and operating multiple concurrent OS sessions, well within the capabilities of the hardware. But that's for all intents and purposes a hack, not an enhancement. If the underlying foundation were truly functional, there wouldn't be a need for virtualization.

Similarly with multi-core systems: the problems actually lie on the software side of the world. Intel tried to hack around it with hyperthreading and other predictive preprocessing on the Pentium side of their R&D house, but that remained artificial guesswork, and the only reason Pentiums maintained positive performance was their high clock speed.

AMD, on the other hand, fixed the external bottlenecks in the standard memory controller and local buses, which resulted in net performance comparable to the Pentiums'. That's the key point there: while the internal mechanisms were dramatically different, the net result was very close, unless your myopic narrow-mindedness was focused solely on the last half of one percent or less in ultimate performance.

The Core series is an entirely new architecture, and does a number of things differently. Most significantly, though, they took some lessons from their work on mobile-specific designs and realized that shorter pipelines not only solved their heat problems but also helped speed. Combined with Intel's cache-based compensation for memory interface throttling, the net results have been remarkable, for now.

AMD's response has been a marketing one, and if you check many of the recent review whores' sites, AMDs consistently maintain their price-for-performance lead for normal, sane, or otherwise sensibly minded people who don't live the thug or chav life.

From early hints, though, it appears that AMD is pushing down the distributed-core path, potentially further than Intel is. The fallout from the ATI merger is that the industry - like it or not - is headed in the direction of the Cell, replacing the CPU and GPU with essentially a bunch of intermediate processing units that can be distributed to different tasks. Also, while virtualization is presently focused on multiple OS sessions, it's quite likely that there will be a shift to process sessions running packages in a pseudo-OS structure.

Going back to the original point, however: all the new machines are Core2Duo boxes, and all the old machines have low performance. That is neither statistically meaningful nor reflective of actual performance. As far as I'm aware, the only consideration of multi-tasking or multi-threading at BIS has been solely to speed up executable compile times, and nothing more.

So when you spam crappy numbers, do your numbers reflect the altered performance of Windows, the altered performance of Armed Assault, the hidden load of spyware and malware on your system, the altered state of your hardware realigning the earth's magnetic field and twisting the cross-talk out of your network cables, or just the altered state of your mixed cocktail of drugs and booze?


Well, I guess since my thread has been hijacked again into power supply questions, I need to go into detail on how a power supply works for a computer system.

Power supplies are required by the manufacturer's spec to be plugged directly into the wall outlet to get the correct wattage to each device. Now if you plug your system into an outlet strip, you will not be getting the correct wattage to supply what the power supply needs. So, for instance, a 350-watt power supply will act like a 300-watt power supply, and each device you have plugged into the outlet strip will drop the wattage supplied to the power supply. So if said user still has issues, then you look at the wattage required by the devices he has.

Some causes for underpowered devices are as follows:

Crashes to desktop when loading a program

Video card issues

OS issues [BSOD]

Hard drive issues

Now if these happen, then you need to get a higher-wattage power supply, and that's why I suggested a 600-watt one to the other forum member. Most computer upgrades done by users fail to update the power supply, to save money, and not settling for a 500-watt one makes it easy to upgrade in the future without buying another power supply at a higher wattage.

Most power supply issues do not show up unless the system is under high load, which Armed Assault gives you.

Now if the computer has enough wattage to supply all devices under heavy load and then some, you start to look at the video card, which the forum member was having issues with.

The main issue with an underpowered video card under load is that after the program starts and runs for a time, it will start showing artifacts and then crash to desktop or lock up. Next I would check whether the rail supplying the extra power required by most video cards was delivering the correct wattage to it. There shouldn't be anything else plugged into that rail, not even a case fan.

So now, IF I'm getting the correct wattage through my power supply and to all my devices (i.e. the video card), then it's either a driver issue, a dead video card, or the program itself doesn't support it.

If you have ever ordered a high-end system from a high-end builder, they require it to be plugged into the wall outlet only.


First

[image: techcat.jpg]

Seems to be an accurate description of WB...

Second, aside from a single reference to your repeated statements of "two words >> power supply" from other threads, there aren't any references to power supplies in this thread.

Third, whoa there, highspeed. Sharing an extension lead with other items will NOT downgrade the performance of your PSU. That just goes to show your further ignorance on the subject of computers.

Please, stop now, you really are making yourself look like more and more of an idiot with every post...

Quote: Second, aside from a single reference to your repeated statements of "two words >> power supply" from other threads, there aren't any references to power supplies in this thread.

Look at the above post

Quote: Third, whoa there, highspeed. Sharing an extension lead with other items will NOT downgrade the performance of your PSU. That just goes to show your further ignorance on the subject of computers.

How much money would you like to bet on that?


Now, now highspeed, let's not all go down to my level and be the forum jackass. It's obvious that there are opposing viewpoints. Step back and ask yourself, who really gives a damn? Is it really worth refreshing this page just to see what the next guy will write? ...Probably not.


DeadMeatXM2, ethne and olemissrebel, if you can't discuss this topic in a more civilized manner I'll be happy to PR you.

Shinraiden and DeadMeatXM2, those kinds of off-topic pictures you posted are just childish.

Stick to the topic.

Quote: Second, aside from a single reference to your repeated statements of "two words >> power supply" from other threads, there aren't any references to power supplies in this thread.

Look at the above post

I see one reference to a post of yours in another thread, and a page back I see a person taking offense at your power supply recommendations. I hardly call that a thread jack. In fact, you seem to be the one jacking your own thread, as you've run out of argument...

Quote: Third, whoa there, highspeed. Sharing an extension lead with other items will NOT downgrade the performance of your PSU. That just goes to show your further ignorance on the subject of computers.

How much money would you like to bet on that?

Actually, quite a lot.

You see, the power supply's ability to generate the power needed is not affected by the number of other appliances sharing the socket. What IS affected is the total amount of power that the socket can provide to the power supply for it to be able to do its job.

The reason "high end builders" will recommend that their "high end rigs" be plugged straight into the wall socket is that the mains voltage in the US means that running a lot of demanding (read: power-hungry) electrical appliances off an extension lead can draw enough current to cause the lead to overheat and catch fire. (This also affects any other country which uses 110V mains.)

There is also the issue of overloading the wall outlet by plugging dozens of high-drain appliances into extension cords, causing them to trip the circuit breakers. Again, nothing to do with the PSU being able to do its job...
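A rough sketch of the arithmetic behind that point. The appliance wattages here are made-up illustrative figures, and 15 A is a typical North American strip/breaker rating, not a universal constant:

```python
# Current drawn from a 110 V outlet is watts / volts. One PC is fine;
# stacking several power-hungry appliances on one strip is what overheats
# the lead or trips the breaker. The PSU itself is unaffected either way.
LINE_VOLTS = 110.0
STRIP_RATING_AMPS = 15.0  # common rating for a US power strip / breaker

appliances_watts = {"gaming PC": 450, "CRT monitor": 120, "space heater": 1200}

total_amps = sum(w / LINE_VOLTS for w in appliances_watts.values())
print(f"Total draw: {total_amps:.1f} A on a {STRIP_RATING_AMPS:.0f} A circuit")
print("Overloaded" if total_amps > STRIP_RATING_AMPS else "Within rating")
```

The same appliances on 230 V mains would draw roughly half the current, which is why this failure mode is more of a concern on 110 V systems.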

And I'm not even American (in reference to the US building codes / power supplies).

Edit: And to stay on topic, Shin: yes, the results aren't "clean room" results. But since the C2Ds are generally outperforming every other processor, we can make the assumption (I know, assumptions = the devil) that the C2Ds are providing the best performance.

I think the initial point of the thread is somewhat lost, but to sum it up:

Who cares if the extra performance is being achieved by "nefarious" means? As long as the C2Ds are providing the sort of performance documented by both "dirty" results such as ArmA-Mark and "clean" results from the various "review whores", it cannot be argued that the outdated FX-57 will provide you with the "best ArmA experience", because it's been proven that the C2Ds outperform it.


Deadmeat, who changed the Biki now? Guess you or whoever needed to, and that's really sad. You or whoever should have left it as BIS stated it, not your or whoever's opinion.

Deadmeat who changed the Biki now...

Discussions about the Biki content goes on the discussion pages at the Biki-site, not here.

Deadmeat, who changed the Biki now? Guess you or whoever needed to, and that's really sad. You or whoever should have left it as BIS stated it, not your or whoever's opinion.

If you read the comment I made upon changing the Biki page (yes, I did update it), you will see I edited out the phrase which stated that "Dual-core CPUs are unsupported", which is NOT true, since if they weren't supported the game simply wouldn't run.

And the previous edit (which introduced the "dual-core unsupported" quote) was NOT made by a member of the BI staff, but by someone by the name of JustinSavidge.

And what I edited it to is not opinion, it's fact: ArmA is not dual-core optimised, but it will run on dual-core CPUs.

Quote: And what I edited it to is not opinion, it's fact: ArmA is not dual-core optimised, but it will run on dual-core CPUs.

So does that mean it will run worse, the same, or better than with single-core support?

Quote: And what I edited it to is not opinion, it's fact: ArmA is not dual-core optimised, but it will run on dual-core CPUs.

So does that mean it will run worse, the same, or better than with single-core support?

It means it'll RUN, which is the truth not an opinion.

Whereas unsupported means it will NOT run.

As for better/worse/the same: it's clear from the linked ArmA-Mark results that C2Ds fill the top 24 spots, with a single Athlon in the middle of them. From that you can draw your own conclusions, but it appears to me that C2D provides better performance than any of the other listed CPUs...


I have to give you credit, Dead, for admitting you changed it. Second, that is what it is in your opinion.

But don't you think, at this moment, that there might be issues related to dual core, and that it should be marked as "undetermined as of now" in case there ARE issues, and let the person viewing it make the choice?

Guest

Here's what helped me, generally speaking (I've gained about 10 fps overall) - dual core user.

Got nHancer for use with my Nvidia GeForce 6800 256MB card. I now can actually turn on both cores for ArmA, as well as turning V-Sync off and setting the 'quality' settings in there to 'performance'. This right here has done most of the fps gain.

Then I've found a few key things (for me anyhow) in the ArmA video options.

Of course, I got rid of shadows: disabled. They eat up too many fps.

I thought that, having done that, shading detail would matter not, but it does, big time.

Shading detail is another big fps eater, and IMHO is not necessary; I set it to very low. This gained an easy 3 fps, if not more.

Left object detail and texture detail at normal. I found setting texture detail to very low will cause distant shrubs and whatnot to turn blobby-looking when viewed with a scope or binocs; I noticed about 1 fps difference for each texture detail setting, so considering the graphical difference of each setting against its cost, I went for normal.

Object detail does kill fps too, but each setting makes a considerable difference in the appearance of stuff in game, so personally I went for the average cost and kept normal.

Terrain detail is set to very low, basically because I've noticed little difference in overall quality of gameplay from this setting, and I lose between 2 and 3 fps per setting.

Anisotropic filtering: the only difference I've noticed in gameplay is whether it's set to very high or not. The very high setting appears to add a very small bit of detail to all the trees and whatnot (focus on a tree and change it back and forth, you'll see a small change), and also costs me around 3-4 fps, so I said forget it and set it to low; not worth the difference IMHO.

Antialiasing: set to very high. I noticed, if anything, an increase of 1 fps going from disabled/low to very high. No idea really what it does, nor have I noticed any difference in gameplay, but if very high doesn't seem to hurt anything...

Overall quality is set to very high. IMHO I haven't messed with this one that much, so I'll leave that for each to decide on his own.

Also, of course, all the other stuff suggested: updated spyware removers, an updated defragger, turned off the virus stuff.

That's that.
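To tally the poster's own estimates above: all of these numbers are the poster's anecdotal figures for a GeForce 6800, straight from the post, not benchmark results.

```python
# Rough tally of the per-setting fps savings the poster reports.
# These are the poster's own estimates, not measured benchmarks.
reported_fps_gain = {
    "shading detail -> very low": 3,    # "an easy 3 fps, if not more"
    "terrain detail -> very low": 2,    # "between 2 and 3 fps per setting"
    "anisotropic filtering -> low": 3,  # "around 3 - 4 fps" saved vs very high
}

total = sum(reported_fps_gain.values())
print(f"Approximate gain from these three settings alone: {total} fps")
# The poster reports ~10 fps overall once the nHancer tweaks and
# disabling shadows are included.
```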


Well, it's not opinion, as many users are using C2D (or AMD X2) CPUs and having no problems. That can hardly be an opinion. Even Suma has said (I think) somewhere that ArmA isn't dual-core optimised. But at the end of the day it DOES work on dual core.

So saying ArmA is "not optimised for dual core" is much more fair than saying it is "unknown if dual core is supported", since the latter implies that ArmA won't work on dual-core CPUs, which it clearly does.

It's much more fair to simply state that it'll work on dual core than to state that it's "unknown if it will even work" on dual core, as that will dissuade people with dual-core rigs from even trying the game.

The biggest problem I see with this WHOLE thread seems to be a misunderstanding of the words "optimised" and "supported".

Whilst ArmA (or Windows) isn't dual-core optimised, it does support dual core. And with the C2D's new architecture, C2D is consistently providing better results.


This thread needs to be closed now. It is a pointless debate because there is NO debate. C2D is the fastest desktop processor out there for Arma. Arma-mark proves this. It's also the fastest desktop processor available for anything else. Refusing to accept this is very much the same thing as refusing to accept the sun will rise tomorrow.

E


ArmaMark proves nothing at all, other than the single instance of an isolated test on a unique system located in a black hole. It is both unscientific, and spurious in sourcing - given the limited ability to both log and compensate for the eccentricities of both the hardware platform and the abuser mashing the buttons in gluttonous lust.

C2Ds, for all the testing done so far - perhaps present in greater quantities to justify their owners' recent spending guilt - appear to be uniformly faster. Contrary to any reliable testing methodology, there is no cohesive explanation of gross or net performance, nor plausible reasoning as to why ArmA has a performance differential between dissimilar hardware architectures.

Therefore, as this flamewar is devoid of rational reasoning and is firmly uprooted in illogical fallacies, I hereby declare a war of extermination on carrots, since that has just as much rightful place in this most glorious intellectual exercise.

