Posts posted by Polymath820


  1. Call Of Duty Ghosts is suffering from performance problems:

    http://www.reddit.com/r/CODGhosts/comments/1q0zlm/pc_settings_and_tweaks_that_will_greatly_improve/

    Battlefield 4 is suffering performance problems:

    There are literally hundreds of posts...

    http://www.tomshardware.com/answers/id-1864126/battlefield-performance-issues.html

    ArmA 3 is suffering performance problems.

    Correct me if I am wrong, but isn't it a bit suspicious to be seeing this across multiple platforms and game engines? It implies the bottleneck is not in the games themselves.

    Why are all 3 games suffering effectively the same problems?


  2. Okay.

    No, it probably will. The precursors to that technology already exist: nanomotors (propulsion devices), artificial muscles (woven carbon nanotubes activated by electricity or heat), nano-batteries, and piezoelectric nano-generators to recharge those batteries, with the nanites controlled by an on-board artificial intelligence system. Add biological software/hardware interfaces and BCIs (brain-computer interfaces connected non-invasively to the relevant part of the brain to intercept threat messages), plus artificial intelligence and immunology, to develop an artificial immune system that is not a narrow-band operation like antigens and adjuvant-enhanced vaccines, which are easily overrun, but a system capable of storing genetic information about viral and other intrusions in full detail, much like a nano-sized database, and analysing and responding to them.

    Basic needs will be overtaken: the body will no longer need food, because nanites can break down surrounding matter, assemble it at the molecular scale and synthesise glucose for the brain. Nor will we suffer disease, thanks to the advanced artificial immune system.

    Humanity will become something else.

    The sky's the limit. And beyond.

    You only have to look at http://phys.org/ every day to know what is happening.

    http://phys.org/news/2014-04-week-bestquantum-mechanics-breakthrough-d.html

    The world is changing very fast. You'd only need a few billion dollars and research teams to develop the nanosuit now, with converging fields of research all moving towards one goal:

    Industrial Engineering (Solve manufacturing processes)

    Artificial Intelligence (develop BCIs and genetic-code interfaces)

    Mathematicians (Design optimization models)

    Computer engineering, together with the biotechnology sector and bio / chemical computing research

    Nanotechnologists (design the machines involved, such as the nano-batteries, and integrate them into the muscles etc. to provide power supplies, plus self-feeding nano-wires to connect to and reinforce the muscles)

    If people knew the true reality, they would be terrified.


  3. ArmA 3 is not modern war. It's set in the future, but if you look at the "campaign", the equipment is "dated" for 2035, based on what I know.

    http://defensetech.org/

    Any of the weapons in ArmA 3 are probably 2016 - 2018 era or earlier.

    This is the sort of thing you'd likely be seeing by 2035 - 2040:

    http://defensetech.org/category/cyber/future-wars/

    http://www.army-technology.com/projects/raytheon-xos-2-exoskeleton-us/

    http://web.mit.edu/ssp/publications/working_papers/wp-00-2.pdf

    http://www.darpa.mil/Our_Work/AEO/Programs/One_Shot_XG.aspx

    http://www.darpa.mil/Our_Work/TTO/Programs/

    http://www.darpa.mil/Our_Work/BTO/Programs/

    Ebola 2.0

    http://spectrum.ieee.org/nanoclast/biomedical/devices/nanomotors-could-churn-inside-of-cancer-cells-to-mush

    Give them their own micro-magnetic power supply and a self-replication protocol that seeks out gold in the human body, and you have a weapon that is immune to antibiotics and antivirals.

    This is the real future of war and it is ugly.

    And based on the rate of change in technology, the Crysis 2 nanosuit-type stuff probably isn't too far beyond 2050 - 2060.


  4. Note: Sorry about the other thread; it was not so objective, more a critical review.

    ArmA 3 from a critical standpoint: although ArmA 3 is a good game, with excellent graphical fidelity and unique characteristics not seen in many games, such as being able to command units, it can still be improved.

    Environmental experience.

    ArmA 3 has some details that are quite obvious when it comes to breaking the immersion, for example the lightning and thunder. A simple improvement would be to add thunder claps with a random loudness value drawn from a min-max range and a pitch processed on the fly. I would find ArmA 3 much more enjoyable if, while sitting on top of a hill with a sniper rifle, a flash of lightning above me were followed by a loud bang that made me jump out of my seat. It would definitely add the "adrenaline rush" feel normally associated with high-speed action games such as Battlefield 4, although in those games the rush wears off after a while and it becomes a rinse-and-repeat system.
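    A minimal SQF sketch of the kind of thing I mean (the sound file path, volume and pitch ranges below are placeholders, not real game values):

    // Thunder clap with randomised delay, loudness and pitch after a lightning flash.
    [] spawn {
        private _volume = 2 + random 8;       // loudness drawn from a min-max range
        private _pitch  = 0.8 + random 0.4;   // slight on-the-fly pitch variation
        sleep (1 + random 4);                 // delay between the flash and the clap
        playSound3D [
            "A3\Sounds_F\placeholder_thunder.wss",  // placeholder path - substitute a real thunder sample
            player, false, getPosASL player, _volume, _pitch, 0
        ];
    };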

    Mission development and Editor Fidelity.

    ArmA 3 has a very dynamic mission editor, which makes missions very interesting, but it lacks a certain "flow": controlling events or cinematic experiences is a pure nightmare. From my own experience, AI told to go inside a building and wait there fail even the simplest tasks needed for a cinematic sequence, such as Bohemia's opening camera scene where you are in the helicopter flying towards base.

    ArmA 3 also lacks non-physical mission design tools, such as planning/design aids. There is a blatantly obvious problem where a trigger set to activate on BLUFOR presence has a delay (a polling interval) while it checks whether the condition is true or false before activating. Additionally, synchronising objects is a nightmare: in very complex mission design, synchronising objects for activation or units becomes a spider-web of a mess, much like what Bohemia suffered with their animation-state diagram.

    You could quite easily fix this by adding a "first and last" synchronisation tree: show the synchronised modules and units in a tree view and group them together into a sort of "virtual call sign", so an entire set of AI, buildings, etc. can be grouped and micro-managed.

    Scripting in ArmA 3 is also a nightmare. I am the sort of person that gets caught up in the details; when I started scripting I couldn't handle un-optimised missions, so I went looking into how to optimise, and about all I found was this article:

    https://community.bistudio.com/wiki/Code_Optimisation

    OK, it tells you what is faster, but it does not tell you the reasons why it is faster. I also pursued the idea of compiled vs non-compiled vs pre-compiled functions, and getting them to work as planned is not exactly easy (a rough sketch of the difference is below). Sure, there are functions on the wiki, but has anyone actually looked at how disorganised the wiki is? It's worse than reading Mandarin while hanging upside down. Not to mention that debugging ArmA 3's .sqf is hell; it could be so much better if a simple text editor / parser, or even an IDE, were introduced, and clear documentation of every single scripting command would be nice as well. SQF lacks a lot of the comprehensive documentation found for C++, Java, Python, Ruby, etc. Yes, steps to improve this have been started by Karel Moricky, but a lot more needs to be done.
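    A rough sketch of the difference, for anyone wondering (the file and function names are just examples):

    // execVM parses and compiles the file every time it is called, then runs it in the scheduled environment.
    _handle = [] execVM "myScript.sqf";

    // compile + preprocessFileLineNumbers: parse the file once (e.g. at mission start),
    // keep the compiled code in a variable and reuse it with call.
    TAG_fnc_doThing = compile preprocessFileLineNumbers "fn_doThing.sqf";
    [] call TAG_fnc_doThing;    // no re-parsing on later calls

    // compileFinal does the same, but the resulting code can no longer be overwritten.
    TAG_fnc_doThingLocked = compileFinal preprocessFileLineNumbers "fn_doThing.sqf";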

    Zeus DLC.

    Zeus is a nice touch, but the dungeon-master idea needs serious refinement; maintaining a Zeus mission is hard, very hard. The lack of an ability to create a pre-stored array of custom objects makes designing Zeus missions difficult. For example, I made a Zeus mission around destroying transformer nodes to cause a power overload at an enemy transmitter tower; a nice touch would have been that, when the transformers' synced "destroyed" condition became true, an electrical sparking sound and particle effect appeared at the top of the radio transmitter along with a small fire effect inside the transformer control boxes. There also need to be more informational Zeus components: the briefing is insufficient, and Zeus needs to be able to see whether the players' objectives have been activated or de-activated, for example green text on an objective module for "Completed", red for "Failed", yellow for "Cancelled" and the standard grey for "Assigned", shown in the Zeus interface only. That would make paying attention to what is active and what is not much easier.

    There is also a lack of triggers that could automate everything: when a destroy condition becomes true on a synchronised object it activates the next object, and so on, automating a lot of hard work instead of Zeus having to maintain the entire mission (something like the sketch below). Zeus is a step in the right direction. I also know ArmA 3 has a 3D map editor, although incomplete, with an interface that appears to date back to the ArmA 2 era; maybe it was a prototype experiment, but I don't know.
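    Something like this sketch is what I mean (the object names and the fire-effect classname are placeholders; adjust them to the mission):

    // Trigger condition - becomes true once the synchronised transformer object is destroyed:
    !alive transformer_1
    // Trigger on-activation - start a small fire/spark effect at the transmitter:
    "test_EmptyObjectForFireBig" createVehicle (getPos transmitter_1);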

    Artificial Intelligence.

    ArmA 3 uses FSM-based AI, which is more like an expert system than artificial intelligence. AI attempting to execute a command repeatedly fail to find their way around objects, walking right around a vehicle you told them to get into and then randomly stopping as if their FSM just "froze"; at other times the AI wander around as if they were bump-bots with random error-correction procedures. The AI also do some other pretty stupid things: when I am in the commander seat of a tank and say "target that tank at 230 degrees", the AI attempts to track the tank and fails, resulting in me being blown to bits, even when I explicitly issue "target" through the command menu (~, 3, 3) and then order them to fire.

    Could Bohemia not use neural-network-based AI to work out how to get around obstacles, and then use FSMs for more "static" responses?

    There is an even more blindingly annoying issue with the AI: take a few jets, set them into a specific formation, e.g. a simple V, and watch what they do. Even if you stay on a straight course they fail to get into the V-formation correctly and quite often crash into each other, which causes further pain when they have to recalculate and compensate after one AI goes down, changing formation and moving closer together or further apart. I remember that in a lot of games I played as a kid, particularly aircraft ones, and even Starlancer (epic game), the AI formed up seamlessly and responded to your targeting commands as well. So how can a 3D game made by Microsoft in around 2001 form formations so easily, while ArmA 3 can't even do that in flight?

    Graphics and Fidelity.

    ArmA 3's graphics are absolutely beautiful, amazing and downright epic, but there are a few very glaring problems. For example, ArmA 3 does not like deferred CSAA (Coverage Sampling Anti-Aliasing): 8x anti-aliasing appears to make things worse and more jagged. ArmA 3 also has quite a few problems with shadows, trees and other non-linear objects: it calculates the light coming through trees, but for some odd reason, even on Ultra, it does not get rid of the slight pixelation. Compare this to CryEngine (Crysis 2) and look at the trees: they trace the outline of the leaves perfectly, albeit the movement / animation of the trees themselves is a bit fast by realistic standards, and they do it without hassle. Granted, most people wouldn't notice it while running around, but is that a justification for erroneous light-tracing of objects, just because people wouldn't notice?

    Another point: I know a lot of games do this to improve performance, but paper-cutout cardboard bushes are just depressing; again, CryEngine / Crysis 2 achieves more dynamic, independent model movement than ArmA 3, and it also has a much more dynamic destruction engine. Making buildings simply collapse into a basic model really cuts back on the realism, and Crysis 2 also had a unique capability which Battlefield 4 appears to have ripped off: interactive environmental components, such as switching the lights off, kicking a car, breakable windows, destructible pretty much everything. And PhysX is supposed to be an innovation? I find that hard to believe. PhysX also supports accelerated liquid simulation; did Bohemia utilise this for the map's water (e.g. the sea)? Rain lacks the GPGPU effects whereby you see water progressively pool in places over time, as expressed in CryEngine 4.0.

    There are a lot of questions that need to be asked of Bohemia.

    If you would like to know more about game AI programming please refer to:

    https://software.intel.com/en-us/articles/designing-artificial-intelligence-for-games-part-1

    If you would like to know more about CryEngine 4.0 please refer to:

    If you would like to know more about coverage sampling anti-aliasing, please go to:

    Note: coverage sampling anti-aliasing is faster than MSAA.

    http://www.nvidia.com/object/coverage-sampled-aa.html

    P.S.: Maybe Bohemia should do a joint effort combining CryEngine and Real Virtuality 4 and come up with a new engine.


  5. 1. Do me a favour: go to Control Panel and find Event Viewer, then look for errors where the display adapter stopped working, normally indicated by an X on a red shield (Fault / Error). What does it say?

    2. What resolution is the screen you are playing on? What might be happening is that you have a 1080p screen, and from what I can tell that video card will have a hard time driving anything higher; if, for example, you are using a 1440p screen, 1 GB of video memory will not cut it.

    Yep, just confirmed it: not only does that video card have a very weak GPU core clock (unified shaders), it also uses DDR3 rather than the GDDR5 most video cards are using, and its 128-bit memory interface is very narrow. A narrow bus can be compensated by a higher memory clock, but your memory clock is also very low; being low-end DDR3 it will be around 600 MHz effective. So the maths is a 128-bit memory interface x 600 MHz memory clock, for a total memory bandwidth of about 9.6 GB/s, compared to a GTX 650, which also has a 128-bit interface but a 5000 MHz effective clock, amounting to 80 GB/s.
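    The rough arithmetic, written out as a throwaway SQF snippet so anyone can check it (clock figures are approximate effective rates):

    // bandwidth (GB/s) = bus width in bytes * effective memory clock in GHz
    private _ddr3  = (128 / 8) * 0.6;   // 128-bit @ ~600 MHz effective  -> ~9.6 GB/s
    private _gddr5 = (128 / 8) * 5.0;   // GTX 650: 128-bit @ 5000 MHz effective -> 80 GB/s
    systemChat format ["DDR3: %1 GB/s vs GDDR5: %2 GB/s", _ddr3, _gddr5];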

    If you want to play ArmA 3 you need at least a dedicated video card. The 6620G is an integrated video core built into your A8 APU, and APUs are designed for low power consumption; the A8 you have is also only 2.5 GHz. ArmA 3 is failing to run because it: 1. Can't create the video buffer required to run the game.

    2. Is starved for clock speed on both the CPU and GPU side.

    My advice: gaming laptops of any kind are a waste of money.

    If you are going to throw away $3500+ on a new laptop you may as well buy a top-of-the-range desktop system. Gaming laptops are extremely expensive and rapidly outdated. At least if you spend the money on a desktop, say $300 - $600 on an E-ATX motherboard, you get a board with plenty of PCI-E slots, including two 16x and two 8x.

    Laptops are designed for work-related tasks, non-realtime activities and "soft" gaming. Personally, to give that laptop a new lease on life, I would remove Windows and put Fedora, Ubuntu, Mageia or Debian on it.


  6. Mods: A3

    Distribution: 0

    Version 1.16.123633

    Fault time: 2014/04/12 --------

    Fault address: 009505D3 01:0054F5D3 C:\Program Files (x86)\Steam\steamapps\common\Arma 3\arma3.exe

    file: mp_zgm_m11

    world: Altis

    Prev. code bytes: 01 F3 0F 11 00 C2 04 00 CC CC CC CC CC 56 8B F1

    Fault code bytes: 8B 06 8B 90 E4 08 00 00 FF D2 85 C0 75 20 8B 06

    Registers:

    EAX:5CE32B00 EBX:5CE32B04

    ECX:00000000 EDX:00000005

    ESI:00000000 EDI:5CE32F90

    CS:EIP:0023:009505D3

    SS:ESP:002B:01E9F1E8 EBP:00000000

    DS:002B ES:002B FS:0053 GS:002B

    Flags:00010206

    =======================================================

    System Specifications:

    CPU: I5-2400 3.4Ghz(Turbo) 3.1Ghz(Stock) Quadcore

    RAM:

    Part Number: CMV8GX3M1A1333C9

    Model: PC3-10700 (667 MHz)

    Brand: Corsair

    HDD: WDC WD10EZEX-00KUWA0

    GPU: NVIDIA

    Model: GeForce GTX 650

    PSU: Aywun Mega Power Elite 750W

    Motherboard: MSI

    Model: H61M-P35 (MS-7680) (SOCKET 0)

    Chipset Model: Sandy Bridge

    Chipset Version: H61M

    Motherboard Form-factor: M-ATX

    ===========================================================

    Reproduction for me:

    Join as a Zeus Game-Master and then (Arma 3 has stopped working) Fault: C0000005

    ===========================================================

    Event Viewer Dump:

    arma3.exe

    1.16.123.633

    53455a72

    arma3.exe

    1.16.123.633

    53455a72

    c0000005

    005505d3

    f74

    01cf55e52534dbcc

    C:\Program Files (x86)\Steam\SteamApps\common\Arma 3\arma3.exe

    C:\Program Files (x86)\Steam\SteamApps\common\Arma 3\arma3.exe

    =============================================================

    Windows Memtest in progress.

    Result: No memory faults.

    Windows Diskchk in progress.

    Result: No errors were found

    Windows file system verification in progress.

    Result: No integrity violations were detected


  7. Contrary to what people believe about artillery, it is not 100% accurate: as you increase the firing distance of the shell, its inaccuracy increases. Just look at the M4 Scorcher when you are in it: it has a "spread report", which is the distance the shell can spread no matter what. You must also maintain a complete and stable LOS at all times for the target to be hit; trees, bushes, terrain, rocks, anything will obstruct the laser. A complete LOS, otherwise it will not hit.

    Guided shells: most of ArmA 3's guided shells are "passive", which means a point-and-shoot concept: you lock on from a distance using T and the shell flies to that point, spread report included. Passive systems are prone to error; you can confuse a missile guidance system in quite a few ways, including chaff, deceptive transmitters, flares, etc. (Interesting question: could you throw off an AA missile by firing a standard flare round from a grenade launcher?)

    If you wish to know more about missile guidance systems please follow the following link: http://www.britannica.com/EBchecked/topic/1357360/rocket-and-missile-system/57325/Passive


  8. This tutorial shows you how to make a leather texture using "generated hexagons"; you can adjust how neat the leather looks. What you'll want to do is follow this: http://design.tutsplus.com/tutorials/create-your-own-leather-texture-using-adobe-illustrator--vector-5572

    1. Create a hexagon array at full neatness, bump / bevel the hexagon layer by a small amount (adjust as you like) and widen their "branches" slightly.

    2. Add a filter to get rid of a lot of the intense highlights and blend in some alpha colour to show lighting effects over the entire image.

    3. With the bevelling you'll want a "sharp point of light", meaning there is no smooth gradient transition at the vertices and lines; the lighting sits along the very edge.

    4. The camo texture can be generated using a Plasma or Noise render; distort it as you see fit. Colourise that noise map a certain colour and add a gradient map to give it the "blotchy" effect.

    5. Add another noise map and use a layer mask, which is a similar, non-destructive way of editing the opacity of the image, so you can smooth the transition throughout the entire image.

    6. You are all done


  9. Don't know about the flexibility part. You could quite easily use a well-developed scripting language such as Python to script missions, reducing development time considerably and allowing mainstream training staff to develop the missions. Simply make it so a "stripped down" version of Python can call / manipulate the game's functions and components, and that is it; that eliminates the concern about "viruses".


  10. No it wouldn't... It's just aggravating to see a CryEngine 3-based military simulator outstrip ArmA 3 on so many levels.

    ArmA 3 also suffers from (but not limited to):

    1. Shadow Approximation errors (Pixelation of shadows (found in arma 2 as well))

    2. Anti-aliasing that costs far more than necessary, while jagged-edge removal on models results in "fuzzy edges"

    3. Ambient occlusion that costs more than necessary for barely a noticeable difference

    4. Terrain mapping and bump-mapping that make the terrain look slightly bumpier from the air, but the difference is not noticeable

    5. Ugly post processing that adds no value to immersion

    .....

    the list goes on.


  11. And yet BIS's simulation arm has much better documented SQF scripting than ArmA 3 does. And you tell me it's not the same engine? It has execVM and compile etc., all components that are part of the SQF virtual machine. And it's not the same engine? It sure looks like it is.

    Just like Bohemia says "Zeus" is an innovation, yet it was already in VBS 2.0 as "Instructor". I don't think it is a coincidence.


  12. http://www.rt-immersive.com/media

    This is really depressing.

    Not only does ArmA 3 suffer from performance problems, but I get the same GPU usage and lower CPU usage in Crysis 2 and the CryEngine 3 development kit than in ArmA 3, all on a GTX 650 and an i5-2400 at 3.4 GHz. Bohemia, pull your socks up. And before anyone makes the excuse that "the terrain is too large"...

    I am sorry, but we need to stop making excuses for Bohemia and face it: they are going to lose their VBS contract (http://www.pcgamer.com/au/2011/05/31/us-military-commission-57m-virtual-reality-training-sim-powered-by-cryengine-3/). It's only a matter of time before they get pushed out of the military simulation sector as well.

    We have

    http://unigine.com/

    Valley Benchmark...

    Honestly, Bohemia, you are holding back so badly. It is infuriating.


  13. No, you kinda proved your expertise with the title of this thread and numerous links related to nothing at all, as well as a lot of technical jargon related again to nothing at all.

    1. Bit-width interface: this is the number of "pipelines" running from the processor die into the memory chips. It is known to affect the video card, but if your memory can't keep up you have nowhere to go; GDDR5 is rated somewhere between 900 MHz and up to 8000 MHz effective, off the top of my head.

    2. Memory clock speed, which is the rate at which data can be moved between the memory and the processor die.

    3. Processor clock: the rate at which the ALU-based operations connected to the DRAM (memory) can be performed.

    4. Video card memory: the fact is, the amount of memory hardly matters at all. From its config file, ArmA 3 on ultra graphics uses just 768 MB of the video card's memory, plus another 32-64 MB (at a guess) for the shader operations.

    5. Number of unified shaders, also known as general-purpose graphics compute cores, where the system divides the work across the cores, for example processing pixel shaders, vertex shaders or geometry shaders. This is also affected by how much of the hardware the game is capable of using: a CPU with 8 cores and 8 threads can run more parallel operations than a CPU with 4 cores and 4 threads, but the game has to be able to scale across those cores. In an ideal world, if you had an E-ATX motherboard with 3 Intel Core i7s, each connected to the next by QPI (QuickPath Interconnect), and one or two GTX 780s, in theory you would see the video card being used a lot more, as there would now be 4 x 3 = 12 cores with 8 x 3 = 24 threads, and a lot more data lines feeding the cards.

    6. SLI works mainly for non-real-time applications; it is much more suited to those, as it is bottlenecked at about 1 GB/s across the SLI bridge, which your video cards will be well and truly outrunning. A PCI-E slot runs at about 8 GB/s at 16x 2.0 or about 16 GB/s at 16x 3.0; dropping a card into an 8x slot brings a 16x 3.0 card down to roughly the data rate of 16x 2.0. The more SLI bridges and the more slots your video cards occupy, the more speed you lose, unless you explicitly bought a motherboard with two or more full 16x slots.

    And before poo-pooing Bohemia for their bad game, just remember: it took billions of dollars to create a processor architecture, and it is more than likely going to cost a lot to take proper advantage of it in a game.

    1. Bit-width interface: the combined width of the "pipelines" to each memory chip. Say each pipeline to a memory chip is 8 bits, i.e. a single byte. Video cards use the same basic technology as system RAM, just with a higher memory clock; DDR3 stands for Double Data Rate RAM, version 3, which means data is transferred on both edges of each clock cycle, so two transfers happen per cycle. The reason the bit-width interface affects the GPU's performance is that you are storing texture data into that RAM on the fly.

    Not only this, but you have anti-aliasing. Supersampling-style anti-aliasing essentially blows the image up to get rid of jagged edges, so a 2048-pixel image supersampled 2x becomes a 4096-pixel image; you are loading a sort of shadow copy of the game's textures at increased size for smoothness, and the effect is a very big hit to performance. The algorithm used for FSAA (full-screen anti-aliasing) is a mathematical approximation of the gradient curve of the original image (a small worked example follows below). The 768 MB figure is the base amount of video memory stated in ArmA 3's config file for ultra texture settings only.

    2. Processor clock? You think that's jargon? Not even remotely... The processor clock drives the ALU, and the ALU (Arithmetic Logic Unit) is a glorified calculator. That's it.

    3. Video memory matters mainly in multi-monitor setups, and do people actually use multi-monitor setups for anything other than video games and TeamSpeak? -_- (A waste of power and of computing power.)

    4. Unified shaders are just a concept where they said: "we won't bother using separate vertex and pixel shaders; instead we will create a unified, universal shader processor that can be re-purposed depending on the task."

    This is what is inside the GPCs (Graphics Processing Clusters) of the Kepler architecture currently used in most Nvidia cards: http://images.bit-tech.net/content_images/2012/03/nvidia-geforce-gtx-680-2gb-review/gtx680-20b.jpg

    And the overall layout of the cores: http://images.bit-tech.net/content_images/2012/03/nvidia-geforce-gtx-680-2gb-review/gtx680-21b.jpg
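    The small worked example mentioned above, showing why supersampling hits performance so hard (numbers are illustrative):

    // Pixel count grows with the square of the resolution scale, and the memory and
    // fill-rate behind it grow with it.
    private _base = 2048 * 2048;    // ~4.19 million pixels
    private _ssaa = 4096 * 4096;    // 2x supersampled in each axis -> ~16.8 million pixels
    systemChat format ["%1 -> %2 pixels (%3x the work)", _base, _ssaa, _ssaa / _base];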

    If you think this is jargon because you can't or don't want to understand it, so be it.


  14. Yep, when someone posts a wall of text with bolded and underlined text, it means the poster knows his shit, take note people!

    Don't believe me?

    Here's one:

    http://www.tomshardware.com/reviews/graphics-card-myths,3694-5.html

    Here's two:

    http://www.tomshardware.com/reviews/graphics-beginners-2,1292.html

    Here's Three:

    http://www.tomshardware.com/reviews/graphics-beginners-3,1297.html

    Here's Four:

    http://www.tomshardware.com/reviews/graphics-card-myths,3694-7.html

    That is all the evidence you need.

    Not to mention I read:

    PCAuthority

    PCworld

    Technet

    cnet.news

    APCmag

    IEEE.org

    Phys.org

    Nvidia News

    ATI News

    Atomic magazine(Condensed into APCmag)

    PCPowerPlay

    Techradar

    Maximum PC and finally

    techspot

    ZDnet

    Extremetech

    DefenceTech

    do I need to keep going?

    Intel and AMD news as well.

    I by no means class myself as an expert, but I am willing to admit I need more information, and so... I seek more. Even though I have been called an expert, I do not have the "papers" to prove it.


  15. I am an expert. Tested all settings thoroughly except for scaling.

    My question wasn't so much a question as rhetorical advice telling him to change his settings, unless he can tell me how he benefits from scaling.

    I've given more than a little buyer's/tech advice on this site, benchmarked settings, overclocks, CPU settings, NVidia settings and various other settings/software; and what have you done, Mr 13 posts?

    What's your machine like? How many hours have you even played ARMA? Ten? Thirteen?

    Okay.

    I've looked at my CPU.

    I've looked at his CPU.

    They're equal.

    Next he has a 40%~ stronger graphics card than mine.

    Next I have 60% higher view/object distance and 60% higher framerate...

    Am I missing something, expert?

    Super-sampling combined with anti-aliasing can get rid of "jaggy edges". I also discovered a strange thing: if I set my screen resolution to 720p and force an upscaling of 150%, I can get a constant 30 frames, but the nasty side effect is a per-texture shimmering / jagging that makes the screen feel blurry, even though I am scaling to 1080p in 3D space.

    "40%" stronger graphics card is somewhat "relative" You can have a good video-card with a 128bit interface and a higher clock speed and it will be capable of pumping the same amount of data as a video-card with a lower clock speed and a larger bit interface. I could probably overclock my GTX 650 2GB from 5000Mhz to 5500Mhz and probably improve the performance or so bits 128bit * 5000Mhz clock = 80GB/s then overclock to 5500Mhz * 128bit = 88GB/s but if I had bought a video-card with a 256bit interface * 5000Mhz memory clock I would have 160GB/s of memory bandwidth nearly double that of my 128bit interface.

    You see, the things that directly affect the performance of a video card are:

    1. Bit-width interface: this is the number of "pipelines" running from the processor die into the memory chips. It is known to affect the video card, but if your memory can't keep up you have nowhere to go; GDDR5 is rated somewhere between 900 MHz and up to 8000 MHz effective, off the top of my head.

    2. Memory clock speed, which is the rate at which data can be moved between the memory and the processor die.

    3. Processor clock: the rate at which the ALU-based operations connected to the DRAM (memory) can be performed.

    4. Video card memory: the fact is, the amount of memory hardly matters at all. From its config file, ArmA 3 on ultra graphics uses just 768 MB of the video card's memory, plus another 32-64 MB (at a guess) for the shader operations.

    5. Number of unified shaders, also known as general-purpose graphics compute cores, where the system divides the work across the cores, for example processing pixel shaders, vertex shaders or geometry shaders. This is also affected by how much of the hardware the game is capable of using: a CPU with 8 cores and 8 threads can run more parallel operations than a CPU with 4 cores and 4 threads, but the game has to be able to scale across those cores. In an ideal world, if you had an E-ATX motherboard with 3 Intel Core i7s, each connected to the next by QPI (QuickPath Interconnect), and one or two GTX 780s, in theory you would see the video card being used a lot more, as there would now be 4 x 3 = 12 cores with 8 x 3 = 24 threads, and a lot more data lines feeding the cards.

    6. SLI works mainly for non-real-time applications; it is much more suited to those, as it is bottlenecked at about 1 GB/s across the SLI bridge, which your video cards will be well and truly outrunning. A PCI-E slot runs at about 8 GB/s at 16x 2.0 or about 16 GB/s at 16x 3.0; dropping a card into an 8x slot brings a 16x 3.0 card down to roughly the data rate of 16x 2.0. The more SLI bridges and the more slots your video cards occupy, the more speed you lose, unless you explicitly bought a motherboard with two or more full 16x slots.

    And before poo-pooing Bohemia for their bad game, just remember: it took billions of dollars to create a processor architecture, and it is more than likely going to cost a lot to take proper advantage of it in a game.


  16. A full analysis of arma 3's capacity.

    ArmA 3 appears to take the largest performance hit when using large amounts of polygons, so my 384 CUDA cores (unified shaders) are doing a lot of the work for ArmA 3's scene complexity, e.g. "Object" level Ultra, which is about 1,000,000 vertices in the scene. Interestingly, no matter the stress you put it under, ArmA 3 never likes to use more than one GPU compute node, which in my case is approximately 36 cores. That said, the memory on my video card is more than sufficient, and the memory clock as well; the one thing it lacks is a wider bus, namely the 128-bit memory interface to the video card's processor. I probably should have looked at a 256-bit or 384-bit memory interface; I would be utilising much more of my video card.

    Arma 3 has a lot of texture operations:

    Also, looking at ArmA 3, it appears to do a lot of texture in/out operations, because the texture size is 2048 x 2048 px and my current video card is capable of about 36 GTexel/s. On top of that there are a lot of vertices: ArmA 3 uses about 1,000,000 vertices on Ultra and about 500,000 on Very High. ArmA 3 also does a lot of pixel shader operations; the anti-aliasing ArmA 3 calculates is one type of pixel shader operation, and so are shadows and HDAO or SSAO.

    Not only this, but the thing that causes the biggest hit to ArmA 3's performance is HDR. For those unfamiliar with the term, HDR (High Dynamic Range) is the precision with which the game's world represents colour: ArmA 3 offers 16-bit HDR precision (there is also 32-bit), and 16-bit means that instead of the limited 8-bit range you see on "low", the renderer has a much wider colour range to work with, so gradients in both dark and bright areas are more even. It's a known problem with a lot of cameras, and I have taken HDR-like images by getting the lighting just right, but processing it on the fly eats a lot of frames, even though it looks nice. Again, these are pixel shader operations, and they are what make things look more realistic.
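    To put rough numbers on the precision difference (per colour channel):

    // 8-bit gives 256 steps per channel; 16-bit gives 65536, so gradients band far less.
    private _levels8  = 2 ^ 8;     // 256
    private _levels16 = 2 ^ 16;    // 65536
    systemChat format ["8-bit: %1 levels, 16-bit: %2 levels", _levels8, _levels16];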

    Arma 3 uses the CPU more than GPU:

    From what I can see, ArmA 3 is using a lot of CPU, particularly for operations that should be done on the GPU, such as pixel, vertex and geometry shaders.

    GPU analysis:

    http://imgur.com/rhc59QM

    As you can see above, I have only 36 cores active, with one single compute node stressed to 100%; 36 cores out of a total of 384 is about 9.4% GPU usage. It would explain why people with more CUDA or Stream unified shaders are getting better performance, as they have more cores in each of the nodes. A person I talked to, who had a video card with 2880 unified shaders, had about 5% of his GPU active (roughly 97 cores by his count) when running Metro: Last Light; he had 3 nodes active, never all of them, but definitely more than ArmA 3 is leveraging. Another benchmark saw 2 nodes in use under 3ds Max and Maya.

    This is very disappointing to see from Bohemia, it really is. A CPU is designed for generalised computing, not specialised work such as a game's pixel shaders, and expecting a game to run them on the CPU is sacrilege.

    CPU Analysis:

    http://imgur.com/jIYNQFz

    As you can see, ArmA 3 is scaling across the CPU cores, but it's only using about 40%, so my video card could "potentially" be the bottleneck at the memory interface; but how can that be if the video card is only reporting around 9% core usage?

    So the conclusion is that Bohemia either has a bottleneck in their code, similar to what BF4 had, or they deliberately did everything on the CPU expecting a good result. I also believe my video card might be the bottleneck, but I am unsure.

    Ha, maybe I should get an E-ATX motherboard, three i7s linked by QPI, and two GTX 780s.

    P.S. The poor core usage could also be related to Amdahl's law: with only 4 CPU threads feeding 384 cores, I am guessing only about 50% of the workload is parallel.
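    A quick sketch of Amdahl's law using the 50% figure I guessed above (p is the parallel fraction, n the number of workers; both values are just assumptions):

    // Amdahl's law: speedup = 1 / ((1 - p) + p / n)
    private _fnc_speedup = { params ["_p", "_n"]; 1 / ((1 - _p) + _p / _n) };
    private _fourCores = [0.5, 4] call _fnc_speedup;     // ~1.6x
    private _manyCores = [0.5, 384] call _fnc_speedup;   // ~2.0x at best, capped by 1 / (1 - p)
    systemChat format ["4 workers: %1x, 384 workers: %2x", _fourCores, _manyCores];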


  17. I don't think it works like that. Arma still uses all cores. Forcing it to only one core degrades performance immensely.

    Not what I meant. CPUs are always running asynchronously; just make the cores ArmA 3 is hogging all the time run faster than the rest.

    And it has been stated in computer science circles that a single-core, single-thread CPU can outperform a multi-core one.

    This explains it quite well:

    http://www.extremetech.com/computing/116561-the-death-of-cpu-scaling-from-one-core-to-many-and-why-were-still-stuck/3

    So if we keep going we are going to reach a flat-lining of Moore's law. You want to make systems faster? Learn how to design a new architecture (completely parallel based).

    This is where you delve into the area of memristors:

    http://phys.org/news190016024.html

    Graphene Transistors:

    http://phys.org/news/2013-02-graphene-transistor-principle.html

    Photonic CPU's:

    http://phys.org/news/2011-09-fujitsu-compact-silicon-photonics-source.html

    3D transistor Lithography:

    http://phys.org/news/2013-03-transistors-dimension.html

    And last but not least single-atom transistors:

    http://phys.org/news179331125.html

    Regardless of chip organization and topology, multicore scaling is power limited to a degree not widely appreciated by the computing community… Given the low performance returns… adding more cores will not provide sufficient benefit to justify continued process scaling. Given the time-frame of this problem and its scale, radical or even incremental ideas simply cannot be developed along typical academic research and industry product cycles… A new driver of transistor utility must be found, or the economics of process scaling will break and Moore’s Law will end well before we hit final manufacturing limits

  18. You make it look like Overclocking is new to you and/or the cause for all the bad things happening in the world.

    I bought an i7 4770k last year and after some testing at the stock clock I overclocked it to 4.5ghz and arma3 feels like it's running on an entirely different cpu, the performance increase is ridiculous. (at least for me having upgraded from an i7-920@4.2 which was running at that speed since it came out)

    Especially for arma3 overclocking your cpu and ram is totally worth it.

    I have been warned off it several times. I had a long discussion with an IT technician who said he had lost count of how many PCs came into the shop that had been overclocked and died.

    Overclocking shortens the life of components A LOT.

    Come to think of it, there is an interesting way to test whether ArmA 3 benefits mainly from CPU clock speed. Why don't people try overclocking one core of the CPU, and only one core, and make ArmA 3 run on it? It's been said that programs can run faster on a single-thread processor than on a quad-core if the clock speed is high enough.

    Method:

    1. Overclock a single core

    2. Run ArmA 3 with -cpuCount=1

    3. Check FPS in-game (see the logging sketch after this list)

    4. If FPS is higher compared to the quad-core run, we can definitively say that ArmA 3 does not use multi-core CPUs very well.

    5. If performance is worse, then maybe running one CPU core faster than the others that ArmA 3 does not use could offer a benefit? E.g. asynchronous clocks.

    6. Repeat 5 times

    7. Draw conclusions based on averaging data.
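    A minimal SQF sketch for step 3, so runs can be compared objectively; it samples for 60 seconds and writes the result to the RPT log (the duration is arbitrary):

    [] spawn {
        private _sum = 0;
        private _minFps = 1000;
        for "_i" from 1 to 60 do {
            _sum = _sum + diag_fps;            // diag_fps = average of the last 16 frames
            _minFps = _minFps min diag_fpsMin; // lowest FPS seen during the run
            sleep 1;
        };
        diag_log format ["Benchmark - avg FPS: %1, min FPS: %2", _sum / 60, _minFps];
    };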


  19. http://lifehacker.com/a-beginners-introduction-to-overclocking-your-intel-pr-5580998

    They clearly state there is a lot of risk. I know about the voltage potentials of the CPU, and if I am going to mess with the voltage I'd be stepping in increments of millivolts or less. As has been said, it's safer to just go and buy a new CPU than run the risk of causing damage; it is even explicitly stated that overclocking can be dangerous and can shorten the life of, or permanently damage, some of your components if something goes wrong.

    So if I have an i5-2400 running at about 1.24 V (1240 millivolts), you should step in increments of 0.05 V, which is 50 millivolts. And again, the more voltage you add, the more heat you will add. Adding more voltage to the CPU also puts more stress on the electroplated traces (around 0.01326 A under peak voltage), and more current draw along those already-stressed traces generates more overall heat. How do I know this? I had a GTS 450 on a 12 V rail that decided to draw 12.8 V under maximum load, and together with my little i5-2400 it pushed my motherboard to 80°C.

    And it killed the HDD.

    Lastly, remember that no two systems will overclock the same—even if they have the exact same hardware.

    And you can't escape the resistive heating that comes with higher voltages and increased current draw.

    The old adage is "just because you can does not mean you should". Voiding your CPU warranty is not cool. That being said, getting a qualified professional to overclock your system will likely result in a much better outcome, because if they cause your CPU to BBQ, they wear the cost, not you.
