
I came across this old article on X-Plane

https://developer.x-plane.com/2007/04/cpu-or-gpu/

 

and wondered if there was a corresponding article for DCS.

 

 

If your X-Plane frame rate is low, or you want to increase your rendering quality, you might think “time for a new graphics card.” But is it?

Some rendering settings actually tax the CPU more than the GPU (graphics card). Here’s a simple rule of thumb: if you increase the setting (and restart X-Plane) and your frame-rate does not go down, a new graphics card isn’t going to make it go up!
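The rule of thumb above can be sketched as a tiny decision helper. This is illustrative only: the function name and fps numbers are made up, and in practice you would read the frame rate from the sim's own fps counter after restarting with the new setting.

```python
# Hypothetical helper illustrating the article's rule of thumb -- not part
# of X-Plane or DCS. Raise one GPU-side setting, restart, compare fps.
def gpu_upgrade_would_help(fps_before: float, fps_after: float,
                           tolerance: float = 0.5) -> bool:
    """If raising a GPU-side setting barely moved the frame rate, the GPU
    was not the limit, so a faster card will not raise fps either."""
    return (fps_before - fps_after) > tolerance

# Turning FSAA up cost almost nothing -> the GPU has idle headroom
print(gpu_upgrade_would_help(fps_before=60.0, fps_after=59.8))  # False
# Turning it up cost 15 fps -> the GPU is the limit, an upgrade helps
print(gpu_upgrade_would_help(fps_before=60.0, fps_after=45.0))  # True
```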

For example, if you have one of those new-fangled GeForce 8800s, you may have noticed that when you turn on FSAA the framerate doesn’t dip at all. That’s because the 8800 is insanely overpowered for X-Plane (at normal monitor resolutions) and has plenty of extra capacity that will be sitting idle on an older PC. When you turn up FSAA, you are simply wasting less of the card’s excess capacity. It goes without saying that if there were a card faster than the 8800, it wouldn’t improve your fps any more than the 8800, it would simply be even more bored.

Here’s a rough guide to which features tax the CPU vs GPU:

CPU-Intensive

World Level of Detail

Number of Objects

Draw Cars On Roads

Draw Birds (not that expensive for modern machines)

Draw Hi Detail World

World Field Of View (wider view means more CPU work!)

GPU-Intensive

Texture Resolution (requires more VRAM)

Screen Resolution

Full Screen Anti-Aliasing (FSAA)

Anisotropic Filtering (most cards can do at least 4x)

Draw Hi-Res Planet Textures From Orbit

Cloud Shadows and Reflections (not that expensive)

Draw Hi Detail World

A few specific framerate-optimization warnings:

FSAA is equivalent to a higher screen resolution – that is, running at 2048×2048 with no FSAA is similar to running at 1024×1024 with 4x FSAA. Both tax the video card with virtually no CPU increase. This is probably the only setting that can be helped only by a video-card upgrade.
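The equivalence is simple sample arithmetic, and can be checked directly (a back-of-the-envelope sketch; real GPUs add caveats like MSAA sample compression, which this ignores):

```python
# Verify the claim: 2048x2048 with no FSAA and 1024x1024 with 4x FSAA
# make the GPU fill roughly the same number of samples per frame.
def shaded_samples(width: int, height: int, fsaa: int = 1) -> int:
    """Total samples filled per frame: resolution times FSAA factor."""
    return width * height * fsaa

hi_res = shaded_samples(2048, 2048, fsaa=1)   # 4,194,304 samples
lo_res = shaded_samples(1024, 1024, fsaa=4)   # 4,194,304 samples
print(hi_res == lo_res)  # True: same GPU fill load, ~zero extra CPU cost
```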

Texture resolution: do not worry if the total size of all textures loaded is larger than the VRAM of your card. To find out if more VRAM would help, measure frame-rate with your normal settings, with texture resolution down a notch, and with anisotropic filtering down a notch. If turning texture resolution down increases fps more than turning down anisotropic filtering, more VRAM may help. Machines with faster graphics busses (like PCIe x16) will be less sensitive to VRAM.
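The VRAM diagnostic in that paragraph amounts to comparing two fps deltas. A minimal sketch, with made-up fps numbers standing in for measurements you would take yourself:

```python
# Sketch of the article's VRAM test. All fps values are illustrative;
# measure them in the sim with each setting lowered one notch.
def vram_probably_limited(fps_baseline: float,
                          fps_lower_textures: float,
                          fps_lower_aniso: float) -> bool:
    """More VRAM may help if dropping texture resolution buys more fps
    than dropping anisotropic filtering does."""
    texture_gain = fps_lower_textures - fps_baseline
    aniso_gain = fps_lower_aniso - fps_baseline
    return texture_gain > aniso_gain

# Dropping textures gained 8 fps, dropping aniso only 2 -> VRAM-bound
print(vram_probably_limited(30.0, 38.0, 32.0))  # True
```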

Most important: never turn “World Detail Distance” above the default setting – you will simply destroy your fps and chew up your CPU for no benefit. I strongly recommend trying “low” for this setting; if you like a lot of objects, it can make a big difference in performance.

The number of objects is virtually always a factor of how fast your CPU is, not your GPU — that is, most GPUs can draw about a gajillion objects if the CPU could only get through them fast enough. If you are unhappy with the number of objects you can draw, do not expect a new graphics card to help – it probably won’t.

Cars on roads hurt fps on machines that don’t have the fastest CPU.

Draw Hi Detail World is doubly dangerous – it uses both the CPU and GPU. Quite literally, this is where we stash “luxurious” options. Everything this checkbox does chews up framerate. (If these options didn’t, we’d leave them on all the time!) So you should not use this option if you aren’t happy with fps, if you don’t have a fast CPU, or if your graphics card isn’t that modern. (HINT: if your nVidia card has “FX” in the title, don’t use this!)

Start with the default settings and experiment – turn a setting up one notch, then restart, then turn it down and try another. Different machines will be faster for some things and slower for others.

EDIT: one user correctly determined (by observing CPU utilization relative to settings) that puff-style 3-d clouds bottleneck the GPU, not the CPU! This was not what I expected – when Austin originally wrote that code, our measurements indicated that sorting the puffs from far to near taxed the CPU a lot, making this CPU-intensive. At the time, the old Rage 128s would also get bogged down filling in translucent puffs as you flew right into the thick of a cloud.

Times have changed, and neither the sorting nor the alpha-drawing is even remotely expensive on a modern machine. So I was surprised to see the CPU not being used. After some investigation, it turns out that while the CPU and GPU have both gotten a lot faster over time, the communications channel between them has not. They both do their jobs so quickly that they clog up the communications channel… the CPU simply can’t talk to the GPU fast enough to get the clouds out.


Edited by LaLa

Very interesting.

 

Since I am gradually upgrading my PC, I've been doing a lot of research. My choices were first based on budget, but I quickly realized that playing DCS in VR would put a lot of strain on my new system.

 

I was very lucky to find a brand-new EVGA GeForce GTX 1080 Ti 11GB, which only cost me £361 including a voucher exchange for my previous graphics card at CEX; that reduced the cost of the upgrade tremendously.

 

Reading other players' posts, I was struck by how many PC specs were based solely on the supposed performance of either the CPU or the GPU, but rarely on the pairing between the two.

 

Since I run my 1080 Ti 11GB with an AMD Ryzen 3 3200G, I was curious what pairing the card with the right CPU would yield, so I did more research on the subject. I also ran 3DMark benchmarks on my current setup and compared.

 

Two benchmark results show the speed of the card itself: 6914 in 3DMark Fire Strike Ultra and 9717 in Time Spy. By comparison, a Nitro+ 5700 XT with a Ryzen 5 3600X would give a graphics score of 9071 and 61.81 fps, vs my 9717 and 62.27 fps in Graphics Test 1.

 

I have to say that the 1080 Ti is moderately overclocked using MSI Afterburner, both core clock (+107) and memory clock (+399). The gain is not huge but noticeable, and more to the point, the whole system runs stable and well below temperature limits; fan speed is set to reach 100% just above 60°C.

 

But my CPU score is only 3592 / 12.7 fps vs 13288 / 44.65 fps (Nitro+ 5700 XT with Ryzen 5 3600X), which gives you an idea of how much the CPU can influence the results.

 

Moving to a Ryzen 5 3600X would result in a gain of about 19% at 1080p and 19% in average 4K performance.

 

It's a no-brainer: anyone with a 1080 Ti intending to play in VR should make sure the bottleneck is as low as possible; the impact of the Ryzen 3 3200G on the 1080 Ti's fps is -20.0%.

 

Bottleneck percentages for the Ryzen 3 3200G vs the Ryzen 5 3600X are:

19.65% at 1080p, 11.79% at 1440p and 7.14% at 4K, vs 4.52% at 1080p, 6.65% at 1440p and 8.34% at 4K.
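The linked calculators don't publish their exact formula, but the idea behind a bottleneck percentage can be sketched as the share of the GPU's potential frame rate lost to the CPU. The figures below are illustrative only, not taken from the calculators:

```python
# Sketch of the idea behind a "bottleneck percentage": how much of the
# GPU's potential fps the CPU costs you. The exact formula used by the
# linked sites is an assumption here, and the numbers are made up.
def bottleneck_pct(fps_with_cpu: float, fps_gpu_limit: float) -> float:
    """Percentage of the GPU's potential frame rate lost to the CPU."""
    return round(100.0 * (1.0 - fps_with_cpu / fps_gpu_limit), 2)

# A CPU that caps an otherwise 100 fps-capable GPU at 80 fps:
print(bottleneck_pct(fps_with_cpu=80.0, fps_gpu_limit=100.0))  # 20.0
```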

 

But the 3200G's lower bottleneck percentage at 4K doesn't mean its performance comes anywhere near that of the 3600X, even though it is below 10%; and 4K is precisely the resolution I will need for my next headset.

 

So from where I am standing, making sure the CPU and GPU are well matched is just as important; otherwise you can end up with performance barely better than that of a much slower GPU, despite the 1080 Ti's potential.

 

Next upgrade includes:

 

AMD Ryzen 5 3600X CPU.

 

Arctic Freezer 7 X Compact Quiet Intel AMD CPU Air Cooler.

 

Crucial Ballistix 32GB Kit (2 x 16GB) DDR4-3200 Desktop Gaming Memory (Black) (64GB total).

 

Case fans.

 

At the end of this month I will run the same 3DMark tests and compare them to those of my current system; I expect some serious gains in performance. The next bottleneck is my Oculus CV1.

 

 

https://www.gpucheck.com/en-usd/compare/nvidia-geforce-gtx-1080-ti-vs-nvidia-geforce-gtx-1080-ti/amd-ryzen-5-3600x-vs-amd-ryzen-3-3200g/ultra/ultra/-vs-

 

https://pc-builds.com/calculator/Ryzen_5_3600X/GeForce_GTX_1080_Ti/0Um0XF8A/32/100/


Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 

I have the Ryzen 5 3600. It runs like a charm for the price.

 

 

Yeah, it can only be an improvement over the Ryzen 3 3200G, which is by no means a bad processor; it just isn't strong enough to be a good match for a 1080 Ti 11GB.

 

I kept running tests and had to revise my plans: I'm already at the recommended minimum RAM for heavy missions in VR, so that is not the priority.

 

On the other hand, my case is showing its limitations when it comes to cooling, so it's a change of plan after some more research.

 

October Upgrades:

 

AMD Ryzen 5 3600X CPU Six Core 4.4GHz Processor Socket AM4

 

Arctic Freezer 7 X Compact Quiet Intel AMD CPU Air Cooler

 

Fractal Design Define C Tempered Glass - Compact Mid Tower Computer Case - ATX/

 

NF-F12 IP2000P Noctua NF-F12 industrialPPC-2000 PWM, 120mm, for exhaust; the two existing 120mm fans will go in the front.

 

 

The Arctic Freezer 7 blows directly toward the exhaust fan and will be supported by the front fan; the second front fan will cool the power supply at the bottom of the case.

 

The graphics card doesn't need as much airflow; it draws air in from the back, so higher case pressure limits its ability to achieve maximum cooling this way.

 

That's why I chose the strongest exhaust fan possible, to favor lower pressure: one front-top fan, plus the CPU fan blowing backward and a strong exhaust fan, should help both CPU and GPU cooling.

 

 

 



Edited by Thinder


 
