Polymath820

CPU and GPU overclocking does sweet f a.

Recommended Posts

I overclocked ages ago: no problems, and a major performance increase in ARMA, same for three of my friends. But maybe we're all just delusional... WAIT, WHAT? There's so much utter BS in this thread that I have a hard time comprehending it. YES, running a CPU faster within safe temps and volts will make it faster; there's nothing else to it. And "uuh, it degrades faster"? What, do you want to use the same CPU for 5+ years? Is buying a new chip safer? Yes, but which one is cheaper?! Plus you can't really buy a better chip than the market has to offer, and if you want to top that, you need to OC it. Kudos to all the OC'ers here for already pointing out the things that needed pointing out. And for the love of god, guys, OC the sh*t out of your CPUs and GPUs!

> "It's over in the user missions section. Google for ARMA benchmark and it should pop up."

Got it, thanks!

Stock clocks, two runs; results were the same both times:

[screenshot: stock-clock benchmark results]

Then I ran it twice at 4.7 GHz; results were again identical across both runs:

[screenshot: 4.7 GHz benchmark results]

So, from the results we can draw the conclusion that OCing is the meaning of life, and that it also gives you a considerable gain in FPS.

In actual gameplay the experience is much better and the game runs considerably smoother. Of course I don't have any numbers to show for that, but I think everyone gets the idea.
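If anyone wants to replicate the comparison, the relative gain is a one-liner. This is just a sketch; the fps values below are made-up placeholders, not the numbers from my screenshots:

```python
def fps_gain(stock_fps: float, oc_fps: float) -> float:
    """Percentage framerate improvement of an overclocked run over stock."""
    return (oc_fps / stock_fps - 1.0) * 100.0

# Placeholder numbers, for illustration only:
print(f"{fps_gain(40.0, 50.4):.1f}% faster")  # 26.0% faster
```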


A 26% increase in framerate.

Perhaps now we can stop feeding the troll who thinks he's found the fount of all knowledge on the internet and wonders why overclocking a POS CPU results in nothing more than a warmer POS CPU.

> "overclocked ages ago, no problems major increase in performance at arma [...] oc the sh*t out of your cpu's / gpu's !"

Some people do use their stuff for more than 5 years, you know.

My CPU was about 5 years old when I upgraded this year. I could easily have kept using it for a while, and would have if I didn't have money to spare right now, knowing I won't over the next few years; so I seized the moment and upgraded now.

> "26% increases in framerate. [...] overclocking a POS cpu results in nothing more than a warmer POS cpu."

Overclocking results speak for themselves, LOL @ OP what a nub!!

Again, you two are acting like idiots.

The 2600K is, as far as I know, the best overclocking CPU around. I'd assume the 2500K is close.

A 1 GHz overclock isn't even possible on a Haswell without breaking it open and modding it as far as I know.

Earlier in the thread I also ran the Showcase: Infantry benchmark on a 4770K, stock vs 4.4 GHz, with only a 53 to 56 fps increase... so overclocking isn't the solution to world hunger just because it works for someone with a 2500K.

Not to mention OP based his opinion on a Lifehacker article. Moan about the lazy article instead of his opinion.


OK, joking aside: I'm running a 2600K at the moment, clocked at 4.6 GHz, and before that an i7 920 at 4.0 GHz.

It does make a difference to playing this game compared to stock, especially in the minimum frames, where it counts. OP's claim that it does sweet FA is his opinion and he's entitled to it, but that doesn't make it so.

> "Some people do use their stuff for more than 5 years you know. [...]"

> "A 1 GHz overclock isn't even possible on a Haswell without breaking it open and modding it as far as I know. [...] Not to mention OP based his opinion on a Lifehacker article."

1 GHz on a 4770K is more than possible without delidding, but it takes some luck, like everything in overclocking; heck, my 4770K runs at 4.5 GHz. A 2600K isn't magically going to OC better: I've had two, and they "only" OC'd to 4.5 and 4.6 GHz.

While OC'ing isn't the solution to world hunger, it's a decent free performance boost, in some cases even on a stock cooler, as long as you're careful.


PRAISE OVERCLOCHTHULU, HE OF THE VOLT AND BUS

Warranty-voiding caution aside, there is no logical or practical reason why anyone conversant in even the shallow end of gaming would object to or naysay overclocking. Choosing to ignore or deride any % performance increase is truly a sad indictment, the PC equivalent of owning a car but not understanding the first thing about how it works.

He who chooses not to overclock will forever spend roughly 30-40% more money for the same effective practical performance as someone savvy. All else aside, it boils down to simple value for money versus the ability to realise the potential savings and gains available.

SO SAYETH OVERCLOCHTHULU, TENTACLED SHEPHERD OF GOOD STEPPINGS AND THE RITE OF THE BURN-IN

> "while oc'ing isn't the solution to world hunger it's a decent free performance boost [...] as long as you're careful"

Nobody can perform meaningful research without the aid of computers; ain't nobody going to develop a supergrain or anything really useful without them. So, logically, the faster they can work through their equations, the sooner their discoveries will be published and shared, and the sooner the effects of that research flow downwards to settle like water finding its own level. Ipso facto, overclocking has true potential to increase the quality of life of everyone on this planet to any degree between barely and massively.

Trufax.

Edited by mausAU


All I know is (I didn't read every post) that a lot of the CPUs out on the market are either ready to be overclocked or even meant to be, like the whole "K" lineup from Intel, all unlocked and ready to be messed with; same with the "FX" lineup from AMD. I have an 8350, I can get that bad boy up to 5.0 GHz with no issues and no overheating, so there's no real risk here!


My CPU is 3.4 GHz stock and clocked at 4.2 GHz with liquid cooling; I didn't touch the voltage at all. I would rather play ARMA at 4.2 than 3.4. I'll dump the CPU early next year.

> "my cpu is stock 3.4 and clocked at 4.2 with liquid cooling. [...] i'll dump the cpu early next year."

overclochthulu is pleased


All hail "overclochthulu",

my GPU has K-Boost and an overclock too

I run at ultra

30fps

150% sampling

2500 dist & obj

Game looks mind blowing, even after 600 hours :o


A full analysis of ARMA 3's capacity.

ARMA 3 appears to take its largest performance hit when pushing large numbers of polygons, so my 384 CUDA cores (unified shaders) are doing a lot of the work for ARMA 3's scene complexity, e.g. "Object level: Ultra", which puts about 1,000,000 vertices in the scene. Interestingly, no matter how much stress you put it under, ARMA 3 never likes to use more than one GPU compute node, which in my case is approximately 36 cores. That said, the memory on my video card is more than sufficient, and the memory clock as well; the one thing it lacks is a wider bus, namely the 128-bit memory interface to the video card's processor. I probably should have looked at a 256-bit or 384-bit memory interface; I would be utilising much more of my video card.

ARMA 3 has a lot of texture operations:

Looking closer, ARMA 3 appears to do a lot of texture reads and writes, with texture sizes of 2048 x 2048 px; my current video card is capable of 36 GTexel/s. On top of that come the vertices: ARMA 3 uses about 1,000,000 vertices on Ultra and about 500,000 on Very High. ARMA 3 also does a lot of pixel-shader work: anti-aliasing is computed as a pixel-shader operation, and so are shadows and HDAO/SSAO.

Not only that, but the thing causing the biggest hit to ARMA 3's performance is HDR. For those unfamiliar with the term, HDR ("High Dynamic Range") is the colour precision of the game world: ARMA 3 uses 16-bit HDR precision (32-bit also exists), which means that instead of the limited 8-bit range you see on "low", the renderer gets a much wider colour range to work with, so both the dark and the bright ends gradate more evenly. It's the same problem a lot of cameras face, and I have taken HDR-like photos by getting the lighting just right; but processing it on the fly eats a lot of frames, even though it looks nice, because these again are pixel-shader operations making things more "realistic".
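To illustrate what HDR tone mapping does on each frame, here is a minimal sketch using the simple Reinhard operator purely as an example (ARMA 3's actual operator isn't public):

```python
def reinhard_tonemap(hdr_luminance: float) -> int:
    """Compress an unbounded HDR luminance into an 8-bit display value
    using the simple Reinhard operator L / (1 + L)."""
    ldr = hdr_luminance / (1.0 + hdr_luminance)  # maps [0, inf) -> [0, 1)
    return round(ldr * 255)

# Bright values well above 1.0 (sun glare, muzzle flash) still land
# inside 0-255 instead of clipping to pure white:
print(reinhard_tonemap(0.5), reinhard_tonemap(4.0), reinhard_tonemap(100.0))  # 85 204 252
```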

ARMA 3 uses the CPU more than the GPU:

From what I can see, ARMA 3 uses a lot of CPU, particularly for operations that should be done on the GPU, such as pixel shaders, vertex shaders, and geometry shaders.

GPU analysis:

http://imgur.com/rhc59QM

As you can see above, I have only 36 cores active, with a single compute node stressed to 100%; 36 cores out of 384 total is about 9.4% GPU usage. It would explain why people with more CUDA or Stream unified shaders get better performance: they have more cores in each node. A person I talked to, whose video card has 2880 unified shaders, had 5% of his GPU active (approximately 97 cores) when running Metro: Last Light; he had 3 nodes active, never all of them, but definitely more than what ARMA 3 is leveraging. Another benchmark saw it using 2 nodes under 3ds Max and Maya.
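The utilisation figure is worth double-checking; a quick sketch of the arithmetic, using the 36-of-384 numbers read off the monitoring tool above:

```python
def active_core_pct(active_cores: int, total_cores: int) -> float:
    """Share of the GPU's shader cores that are actually busy."""
    return active_cores / total_cores * 100.0

print(f"{active_core_pct(36, 384):.2f}%")  # 9.38%
```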

This is very disappointing to see from Bohemia, it really is. A CPU is designed for generalised computing, not specialised computing such as games and pixel shaders, and expecting it to handle them is sacrilege.

CPU Analysis:

http://imgur.com/jIYNQFz

As you can see, ARMA 3 is scaling across the CPU cores, but it's only using about 40%. So my video card could "potentially" be the bottleneck at the memory interface, but how can that be if the video card is only reporting ~9% core usage?

So the conclusion is that Bohemia either has a bottleneck in their code, similar to what BF4 had, or they deliberately did everything on the CPU expecting a good result. I do believe my video card might be the bottleneck, but I'm unsure.

Ha, maybe I should get an E-ATX motherboard, three i7s QPI'd together, and two GTX 780s.

P.S. The poor core usage could also be down to Amdahl's law: with only 4 threads and 384 cores, only about 50% of the workload is parallel.
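Amdahl's law is easy to sanity-check numerically. A sketch, assuming the ~50% parallel fraction guessed above:

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Upper bound on speedup when only part of a workload parallelises
    (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# With half the work serial, even 384 cores barely beat 4:
for n in (2, 4, 384):
    print(n, round(amdahl_speedup(0.5, n), 2))  # 2 1.33 / 4 1.6 / 384 1.99
```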

Edited by Polymath820
Why Bohemia Why?

> "My CPU is stock 3.4 and clocked at 4.2 with liquid cooling. [...] I'll dump the CPU early next year."

What is the fps difference in-game though? ;)

That's the important thing... sorry to stick a spear in the side of your inky god.

> "All hail 'overclochthulu', my Gpu has a k-boost and overclock too. I run at ultra, 30fps, 150% sampling, 2500 dist & obj. Game looks mind blowing, even after 600 hours :o"

Why 150% sampling, if I may ask? I've never touched the setting because I assume it eats performance massively and basically just does the same thing as anti-aliasing?

A 780 should be a good deal stronger than my 770 and still I have 60% longer object distance and 60% higher framerate... that's sort of sad.

Isn't it?

> "What is the fps difference in-game though? ;) [...] Why 150% sampling if I may ask? [...] A 780 should be a good deal stronger than my 770 and still I have 60% longer object distance and 60% higher framerate... that's sort of sad. Isn't it?"

Oh, I thought you were an expert on all this stuff? Hmm, must have been fibbing. That's ok, I'm quoting you so I'll have a nice record :)

Look at your CPU. Now look at his CPU. Now think about that.

> "Oh, I thought you were an expert on all this stuff? [...] Look at your CPU. Now look at his CPU. Now think about that."

I am an expert. I've tested all settings thoroughly, except for scaling.

My question wasn't so much a question as rhetorical advice telling him to change his settings, unless he can tell me how he benefits from scaling.

I've given more than a little buyer's/tech advice on this site, and benchmarked settings, overclocks, CPU settings, NVIDIA settings and various other settings/software. What have you done, Mr 13 Posts?

What's your machine like? How many hours have you even played ARMA? Ten? Thirteen?

Okay.

I've looked at my CPU.

I've looked at his CPU.

They're equal.

Next: he has a ~40% stronger graphics card than mine.

Next I have 60% higher view/object distance and 60% higher framerate...

Am I missing something, expert?

Edited by Sneakson

> "I am an expert. Tested all settings thoroughly except for scaling. [...] Am I missing something, expert?"

Super-sampling combined with anti-aliasing can get rid of jaggy edges. I also discovered a strange thing: if I set my screen resolution to 720p and force 150% upscaling, I can get a constant 30 frames, but the nasty side effect is a per-texture shimmering/jagging that makes the screen feel blurry, even though I'm scaling to 1080p in 3D space.
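For anyone wondering why sampling eats frames: the scale applies per axis, so 150% means shading 2.25x the pixels. A sketch:

```python
def shaded_pixels(width: int, height: int, render_scale_pct: int) -> int:
    """Pixels the GPU actually shades before scaling to the output size."""
    s = render_scale_pct / 100.0
    return round(width * s) * round(height * s)

native = shaded_pixels(1920, 1080, 100)        # 2,073,600 pixels
supersampled = shaded_pixels(1920, 1080, 150)  # 4,665,600 pixels
print(supersampled / native)                   # 2.25
```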

"40%" stronger graphics card is somewhat "relative" You can have a good video-card with a 128bit interface and a higher clock speed and it will be capable of pumping the same amount of data as a video-card with a lower clock speed and a larger bit interface. I could probably overclock my GTX 650 2GB from 5000Mhz to 5500Mhz and probably improve the performance or so bits 128bit * 5000Mhz clock = 80GB/s then overclock to 5500Mhz * 128bit = 88GB/s but if I had bought a video-card with a 256bit interface * 5000Mhz memory clock I would have 160GB/s of memory bandwidth nearly double that of my 128bit interface.

You see, the things that directly affect the performance of a video card are:

1. Bus width: the number of "pipelines" running from the processor die into the memory chips. A wider bus helps, but if your memory can't keep up you have nowhere to go; GDDR5 is rated for something between 900 MHz and up to 8000 MHz effective, off the top of my head.

2. Memory clock speed: the rate at which data can be moved between memory and the processor die.

3. Processor clock: the rate at which the ALU operations connected to the DRAM (memory) can be done.

4. Video card memory: the fact is, the amount of memory barely matters here. Judging from its config file, ARMA 3 on Ultra uses just 768 MB of video memory, plus another 32-64 MB (at a guess) for shader operations.

5. Number of unified shaders: a "general graphics computing" design where the system divides the work across the cores, e.g. pixel shaders, vertex shaders or geometry shaders. But this is also limited by how much of it the game can actually use: a CPU with 8 cores and 8 threads can run more parallel operations than one with 4 cores and 4 threads, but the game has to be able to scale across those cores. In an ideal world, with an E-ATX motherboard carrying three Intel Core i7s connected by QPI (QuickPath Interconnect) and one or two GTX 780s, you would in theory see the video card used a lot more, since there would be 4 * 3 = 12 cores running, 8 threads * 3 = 24 threads, and far more data lines feeding them.

6. SLI mainly suits non-real-time applications, because it is bottlenecked at 1 GB/s across the SLI bridge; your video cards will well and truly outrun that bridge, with PCIe slots running at roughly 8 GB/s for x16 2.0 or ~16 GB/s for x16 3.0. An x8 slot halves that, bringing a 3.0 x16 card down to roughly x16 2.0 data rates. The more SLI bridges, and the more slots your video cards occupy, the more speed you lose, unless you explicitly bought a motherboard with two or more full x16 slots.
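The bandwidth figures in points 1 and 2 follow directly from bus width times effective memory clock; a sketch, using the clocks quoted above:

```python
def mem_bandwidth_gbps(bus_bits: int, effective_mhz: float) -> float:
    """Theoretical memory bandwidth in GB/s:
    bus width in bytes x effective transfer rate."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(mem_bandwidth_gbps(128, 5000))  # 80.0  GB/s, stock GTX 650
print(mem_bandwidth_gbps(128, 5500))  # 88.0  GB/s, memory overclocked
print(mem_bandwidth_gbps(256, 5000))  # 160.0 GB/s, a 256-bit card at the same clock
```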

And before pooh-poohing Bohemia for their bad game, just remember: it took billions of dollars to create a processor architecture, so it's more than likely going to cost a lot to take full advantage of it in a game.

Edited by Polymath820
Additional Information

> "Super-sampling combined with anti-aliasing can get rid of 'jaggy edges' [...] it took Billions of $$$ to make a processor architecture it is more than likely going to cost a lot to implement it into a game."

40% better gaming performance is what I meant, based on benchmarks. Nothing else.

Bold and underlined? Must be true then.

Yep, when someone posts a wall of text with bolded and underlined text, it means the poster knows his shit, take note people!

I am an expert. Tested all settings thoroughly except for scaling.

My question wasn't as much a question as a rhetorical advice telling him to change his settings unless he can tell me how he benefits from scaling.

Given more than a little buyer's/tech advice on this site, benchmarked settings, overclocks, CPU settings, NVidia settings and various other settings/software and what have you done, Mr 13 posts?

What's your machine like? How many hours have you even played ARMA? Ten? Thirteen?

Okay.

I've looked at my CPU.

I've looked at his CPU.

They're equal.

Next he has a 40%~ stronger graphics card than mine.

Next I have 60% higher view/object distance and 60% higher framerate...

Am I missing something, expert?

You don't like it when people talk to you the way you talk to them, huh? Sounds like one of them "you" problems, bro. About the CPU comparison: you're wrong again! Imagine the odds of that. Claiming his 2600K is equal to your 4770K is ridiculous, especially in a game as CPU-intensive as ARMA 3. And also because it's factually incorrect.

Here's some proof for you.

Seriously, you gave people purchasing advice? I hope they didn't follow it.

Saying that what I say holds less water because I haven't bothered posting much on here is about as stupid a thing as you've said in the three days I've been correcting your amateur-hour panto tech-support act. How sad. Honestly, do you really believe that?

Since you asked, my machine is significantly better than your machine, and the fact that you try to use it as a point of contention thinking you're getting somewhere is embarrassing. You're not, and your single player framerates are as utterly meaningless and pithy as your backpedalling over your snarky reply I quote-caught is funny. I'm tempted to tell you what I use but I think it'll be more fun staying quiet and smugly superior.

Oh, and I've played for several hundred hours, and that's in competent-player time, which converts to about 1600 hours in your units.

I look forward to you trying harder at your next attempt :)

> "You don't like it when people talk to you the way you talk to them, huh? [...] I look forward to you trying harder at your next attempt :)"

Are you trying to be funny now? Because you are.

Your source is a comparison of the two CPUs at STOCK CLOCKS in BENCHMARK SOFTWARE and ENCODING, with the Haswell being up to twice as good at some things and significantly worse at others compared to the Sandy.

If you think synthetic benchmarks are comparable to games at all, then you clearly have a very poor understanding of how computers work, and all I can tell you is to stay in school.

Meanwhile I’ll bring you this:

http://www.ocaholic.ch/modules/smartsection/item.php?page=0&itemid=1123

Now go away. We don't need a forum fool here. I have disproved every wrongful claim you have made in every thread you've appeared in, and if you continue trolling you will be banned, because the mods here are quite strict.

Your not saying what sort of computer you have only shows you couldn't even dream up what a good one is.

I haven't been able to find a direct 2600K vs 4770K ARMA3 benchmark but you're free to find one.

Based on all other evidence, my CPU and his are about as close to equal in gaming performance as they come, and after looking his graphics card up more closely, it should be about 16% stronger than mine in ARMA 3 specifically.

That's why I questioned his use of 150% sampling, which means he has to run other settings very low compared to mine.

Edited by Sneakson

> "Yep, when someone posts a wall of text with bolded and underlined text, it means the poster knows his shit, take note people!"

Don't believe me?

Here's one:

http://www.tomshardware.com/reviews/graphics-card-myths,3694-5.html

Here's two:

http://www.tomshardware.com/reviews/graphics-beginners-2,1292.html

Here's Three:

http://www.tomshardware.com/reviews/graphics-beginners-3,1297.html

Here's Four:

http://www.tomshardware.com/reviews/graphics-card-myths,3694-7.html

That is all the evidence you need.

Not to mention I read:

PCAuthority

PCworld

Technet

cnet.news

APCmag

IEEE.org

Phys.org

Nvidia News

ATI News

Atomic magazine(Condensed into APCmag)

PCPowerPlay

Techradar

Maximum PC and finally

techspot

ZDnet

Extremetech

DefenceTech

do I need to keep going?

Intel and AMD news as well.

I by no means class myself as an expert, but I am willing to admit I need more information, and so I seek more. Even though I have been called an expert, I don't have the "papers" to claim it.

Edited by Polymath820


> "I haven't been able to find a direct 2600K vs 4770K ARMA3 benchmark but you're free to find one."

I have a 2600K here, currently overclocked. What do you need?


I took this one at random as an example of you reading but not understanding. Earlier you claimed ARMA would not use more than 700 MB of VRAM, with no reference to the fact that VRAM use increases significantly with anti-aliasing, downsampling, or running at a high native resolution.

