Polymath820

CPU and GPU overclocking does sweet f a.

Recommended Posts

Don't believe me?

I think they were joking... maybe. Anyway.

I have a 2600K here, currently overclocked. What do you need?

Don't bother doing anything :p

To do an accurate comparison you would need both a 2600K and a 4770K, so you can test them in machines with otherwise identical specifications.

Edited by Sneakson

You don't like it when people talk to you the way you talk to them, huh? Sounds like one of them "you" problems, bro. About the CPU comparison - you're wrong again! Imagine the odds of that. Claiming that his 2600K is equal to your 4770K is ridiculous, especially in a game as CPU-intensive as ARMA 3. And also because it's factually incorrect.

Here's some proof for you.

Seriously, you gave people purchasing advice? I hope they didn't follow it.

You saying that what I say holds less water because I haven't really bothered posting on here is about as stupid a thing as you've said in the three days I've been correcting your amateur-hour panto tech support act. How sad. Honestly, do you really believe that?

Since you asked, my machine is significantly better than your machine, and the fact that you try to use it as a point of contention thinking you're getting somewhere is embarrassing. You're not, and your single player framerates are as utterly meaningless and pithy as your backpedalling over your snarky reply I quote-caught is funny. I'm tempted to tell you what I use but I think it'll be more fun staying quiet and smugly superior.

Oh, and I've played for several hundred hours - that's in competent player time, which is about 1600 hours after being converted into your units.

I look forward to you trying harder at your next attempt :)

Wow, this is really quite embarrassing. I mean, if you had built yourself this sh*t hot machine you would know there is no appreciable difference in Arma performance between a 2600K and a 4770K running at the same frequency. Wouldn't you?

Edited by jiltedjock

Don't believe me?

No, you kinda proved your expertise with the title of this thread and numerous links related to nothing at all, as well as a lot of technical jargon related again to nothing at all.

No, you kinda proved your expertise with the title of this thread and numerous links related to nothing at all, as well as a lot of technical jargon related again to nothing at all.

1. Bit-width interface: this is the number of "pipelines" running from the processor die into the memory chips. It is known to affect the video card, because if your memory can't keep up you have nowhere to go; GDDR5 is rated for somewhere between 900 MHz and up to 8000 MHz effective, off the top of my head.

2. Memory clock speed: the rate at which data can be moved in and out of memory to the processor die (see the rough bandwidth sketch after this list).

3. Processor clock: the rate at which the ALU-based operations connected to the DRAM (memory) can be carried out.

4. Video-card memory: the fact is, the amount of memory does not mean a thing. At all. ARMA 3, going by its config file, uses just 768 MB of video memory on ultra textures, plus another 32-64 MB (at a guess) for the shader operations.

5. Number of unified shaders, also known as a "general graphics computing chip", whereby the system divides the work up across the cores - for example processing pixel shaders, vertex shaders or geometry shaders. But this is also affected by how much of it the game is capable of using: a CPU with 8 cores and 8 threads can run more parallel operations than a CPU with 4 cores and 4 threads, provided the game can actually scale across those cores. In an ideal world, if you had an E-ATX motherboard with three Intel Core i7s, each connected to the next by QPI (QuickPath Interconnect), and one or two GTX 780s, you would in theory see the video card being used a lot more, since you would now have 4 * 3 = 12 cores and 8 * 3 = 24 threads, with a lot more data lines coming into them.

6. SLI mainly works for non-real-time applications; it is much better suited to that, because it is bottlenecked at about 1 GB/s across the SLI bridge. Your video card will be well and truly outperforming that bridge through the PCI-E slot, which gives roughly 8 GB/s at x16 2.0 and roughly 16 GB/s at x16 3.0, while dropping a card to x8 3.0 brings it back down to about the same data rate as x16 2.0. The more SLI bridges you use and the narrower the slots your video cards sit in, the more speed you lose, unless you explicitly bought a motherboard with two (or more) true x16 slots.
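To put some rough numbers on points 1, 2 and 6, here is a quick back-of-the-envelope sketch in Python. The bus width, memory clock and slot figures are assumed examples (roughly GTX 770-class GDDR5), not measurements from any particular card:

# Rough bandwidth comparison: on-card GDDR5 vs. the SLI bridge vs. the PCI-E slot.
# All figures below are illustrative assumptions.

def memory_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak bandwidth = bus width in bytes * effective transfer rate."""
    return (bus_width_bits / 8) * effective_mt_s / 1000  # MB/s -> GB/s

vram_bw = memory_bandwidth_gb_s(256, 7000)   # 256-bit bus, 7000 MT/s GDDR5 -> ~224 GB/s

sli_bridge_bw = 1.0      # classic SLI bridge, roughly 1 GB/s
pcie2_x16_bw  = 8.0      # PCI-E 2.0 x16, roughly 8 GB/s per direction
pcie3_x16_bw  = 15.75    # PCI-E 3.0 x16, roughly 16 GB/s per direction

print(f"On-card GDDR5 : {vram_bw:6.1f} GB/s")
print(f"SLI bridge    : {sli_bridge_bw:6.1f} GB/s")
print(f"PCI-E 2.0 x16 : {pcie2_x16_bw:6.1f} GB/s")
print(f"PCI-E 3.0 x16 : {pcie3_x16_bw:6.1f} GB/s")

The point being that anything crossing the bridge or the slot is an order of magnitude (or two) slower than what the card can do against its own memory.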

And before poo-pooing Bohemia for their bad game, just remember: it took billions of dollars to develop a processor architecture, so it is more than likely going to cost a lot to take full advantage of it in a game.

1. Bit-width interface, which is the amount of bandwidth the pipelines have to each memory chip. Say, for example, each pipeline to each memory chip is 8 bits - that is a single byte. Video cards use the same technology as system RAM, just with a higher memory clock speed; DDR3 stands for Double Data Rate RAM, version 3.

This means that with each clock cycle it pumps two transfers of data. So technically a 256-bit bus moves the equivalent of 512 bits per clock, full duplex - full duplex just means signals travelling in and out at the same time. The reason the bit-width interface affects the GPU's performance is that you are storing the texture data into the video RAM "on the fly".

On top of this you have anti-aliasing. All anti-aliasing does is blow the image up to get rid of the jagged edges: a 2048-pixel image gets supersampled 2x, which makes it 4096 pixels, so you are loading a sort of "shadow copy" of the game's texture data at a larger size for extra smoothness. The effect of this is a very big hit to performance; the algorithm used to perform FSAA (full-screen anti-aliasing) is a mathematical approximation to the gradient curve of the original image (a short worked example follows at the end of this post). 768 MB is the base amount of video memory stated in ARMA 3's config file for ultra settings on textures alone.

2. Processor clock? You think that's jargon? Not even remotely... the processor clock drives the ALU, and the ALU (Arithmetic Logic Unit) is a glorified calculator. That's it.

3. Video memory only really matters in multi-monitor setups. And hardly anyone actually uses multi-monitor setups for anything other than video games and TeamSpeak -_- (a waste of power and a waste of computing power).

4. Unified shaders are just a concept where they said "we won't bother using separate vertex and pixel shaders; instead we will create a unified (universal) shader processor that can be repurposed depending on the task".

This is what is inside the GPCs (Graphics Processing Clusters) of the Kepler architecture currently used in most Nvidia cards: http://images.bit-tech.net/content_images/2012/03/nvidia-geforce-gtx-680-2gb-review/gtx680-20b.jpg

And here is the outer design of the cores: http://images.bit-tech.net/content_images/2012/03/nvidia-geforce-gtx-680-2gb-review/gtx680-21b.jpg

If you think this is jargon because you can't or don't want to understand it, so be it.
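To make the DDR and supersampling arithmetic in point 1 concrete, here is a short Python sketch. The 800 MHz DDR3 clock, 2048x2048 render target and 4 bytes per pixel are assumed example figures:

# Illustrative arithmetic only; all figures are assumed examples.

# DDR ("double data rate") transfers data on both clock edges,
# so the effective transfer rate is twice the memory clock.
memory_clock_mhz = 800                  # e.g. DDR3-1600 runs an 800 MHz clock
effective_mt_s = memory_clock_mhz * 2   # -> 1600 MT/s

# 2x supersampling per axis quadruples the pixel count (2048 -> 4096 per side).
base_w = base_h = 2048
ss_w, ss_h = base_w * 2, base_h * 2

bytes_per_pixel = 4                     # 32-bit colour buffer
base_mb = base_w * base_h * bytes_per_pixel / 2**20
ss_mb = ss_w * ss_h * bytes_per_pixel / 2**20

print(f"DDR3-1600: {memory_clock_mhz} MHz clock -> {effective_mt_s} MT/s effective")
print(f"2x supersampling: {base_w}x{base_h} -> {ss_w}x{ss_h}, "
      f"{ss_mb / base_mb:.0f}x the pixels ({base_mb:.0f} MB -> {ss_mb:.0f} MB per buffer)")

That 4x jump in pixels (and the matching jump in memory traffic) is why supersampling hits performance so much harder than cheaper anti-aliasing methods.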


I've been OC'ing since my DX2-66, but after reading that LifeHacker article I'm a changed man.

OC'ing is dangerous!

What is the fps difference in-game though? ;)

That's the important thing... sorry to stick a spear in the side of your inky god.

Why 150% sampling if I may ask? I've never touched the setting because I assume it eats performance massively and I assume it only does the same thing as anti-aliasing basically?

A 780 should be a good deal stronger than my 770 and still I have 60% longer object distance and 60% higher framerate... that's sort of sad.

Isn't it?

Why 150% sampling? Because the graphics are significantly improved. You know those eye-candy screenshots where the game looks near perfection? That's why. I can't get a steady 60, and at times with the view distance reduced I can get a "fairly" stable 50, but I would rather have a consistent 30 in all situations with the game looking outstanding. The game doesn't need 60, although it would be ideal in an ideal world, but it's very similar to DCS World: I can play with an annoyingly fluctuating frame rate or a solid 30 near maxed. I choose the latter. By doing so I can focus on playing the game smoothly, which is more important than playing a fluctuating one.
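For anyone wondering what that trade-off looks like in frame times, here is a tiny Python sketch; the frame rates are just the ones mentioned above:

# Frame-time view of the "fluctuating 50 vs. consistent 30" trade-off.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 50, 30):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# A rate swinging between roughly 50 and 30 fps means frame times jumping
# between ~20 ms and ~33 ms, which reads as stutter even though the average
# looks fine; a locked 30 delivers every frame at a steady ~33 ms.

In other words the locked 30 trades peak smoothness for consistency, which is exactly the point being made above.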

I only play these games as they're sims as opposed to the huge rack of junk games out there and as it happens they are pc only, thankfully.

ARMA 3 at a smooth 30, near maxed, with TrackIR and an X52 is unbeatable in terms of its gameplay and graphics; they go hand in hand, and in simulators where you're the marine or the pilot it's got to look real to feel real.

Edited by Dav

