
Intel or AMD



Hi everyone, I am going to start building my new PC in a few weeks and need some help with the CPU. I can't decide if I should go Intel or AMD. I am asking because I can get more bang for my buck with AMD, but does DCS run well with AMD? I was thinking either an i7 8700 or 9700, or a Ryzen 7 2700X, 3700X, or 3800X. Thanks for your input.


Any of those is a great CPU for DCS, but I would only buy the "K" models from Intel: not the 8700 but the 8700K or 9700K.

 

If I were you, my choice would be 100% AMD. I am about to put a 3800X together for a friend who was a strict no-go-AMD guy. It took me 5 minutes to convince him with facts. He also ordered a 5700 XT over an Nvidia card that costs several hundred € more. He even got a better PSU than he usually buys; that took the longest to convince him of... LOL

Take 32GB as well. By the way, 3200MHz or faster RAM is mandatory with AMD to achieve the best performance.

The board we chose is a Gigabyte Aorus Ultra, ATX format, with 3200MHz RAM certified for AMD.

 

Maybe that helps you.


Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


DCS runs excellently with AMD. Hair-splitting aside, you will experience the same fluid gameplay on AMD as you would on a blue rig. E.g. on my system - see sig - with a non-Ti card I pretty much get 100+ frames at 2K with all the bells and whistles turned on. I can't speak to 4K as I don't have a 4K monitor.

 

Sent from my SM-G965U using Tapatalk

Ryzen9 5800X3D, Gigabyte Aorus X570 Elite, 32Gb Gskill Trident DDR4 3600 CL16, Samsung 990 Pr0 1Tb Nvme Gen4, Evo860 1Tb 2.5 SSD and Team 1Tb 2.5 SSD, MSI Suprim X RTX4090 , Corsair h115i Platinum AIO, NZXT H710i case, Seasonic Focus 850W psu, Gigabyte Aorus AD27QHD Gsync 1ms IPS 2k monitor 144Mhz, Track ir4, VKB Gunfighter Ultimate w/extension, Virpil T50 CM3 Throttle, Saitek terrible pedals, RiftS

 


What's your budget and what resolution are you planning on playing at? At 1080p the CPU will be the limiting factor and at higher resolutions the GPU will be the limiting factor.
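To make the CPU-limited vs GPU-limited point concrete, here's a toy frame-time model. This is my own simplification with made-up numbers, not anything from a benchmark suite: each frame costs some milliseconds of CPU work and some of GPU work, and the slower of the two stages sets your frame rate. CPU cost barely changes with resolution, while GPU cost grows with pixel count.

```python
# Toy model: the slower per-frame stage (CPU or GPU) caps the frame rate.
# All millisecond costs below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical: CPU needs 7 ms per frame regardless of resolution,
# GPU cost grows with resolution.
print(round(fps(cpu_ms=7.0, gpu_ms=4.0)))   # 1080p: CPU-bound -> 143 fps
print(round(fps(cpu_ms=7.0, gpu_ms=12.0)))  # 4K: GPU-bound -> 83 fps
```

In the first case a faster CPU raises your fps; in the second, only a faster GPU does. That's why the same CPU comparison can look dramatic at 1080p and flat at 4K.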

 

You should take people's opinions with a grain of salt. Best to look at the objective benchmarks.

 

Jarred Walton at PC Gamer and Steve at Gamer's Nexus have done the most thorough benchmarks I'm aware of. They don't test DCS, unfortunately, but they have tested all the current Intel and AMD CPUs on DX11 and DX12 games, and Intel is still currently beating AMD in real-world gaming performance.

 

https://www.pcgamer.com/best-cpu-for-gaming/

 

For example, if you flip through the slides in that article and check Far Cry 5, which is based on the DX11 graphics API like DCS is, Intel's 9600K and even the 7700K still beat AMD's newest processors, even the high-end 3900X, which costs almost the same as a 9900K.

 

[Image: Far Cry 5 CPU benchmark slide from the PC Gamer article]

 

DCS is still currently limited by the single-thread performance of the CPU. The 9700K is as good as or better than the 9900K in most games and is essentially the same chip without Hyperthreading/multithreading. DCS does not use multithreading; that's really only exploited by software like Blender and Premiere for rendering and editing.

 

[Image: second benchmark slide from the PC Gamer article, averaged gaming performance]

 

Here's Gamer's Nexus review of the 3700X but they compare it to all the current CPUs in their benchmarks.

 

 

One of the testers mentioned in the forums that they use the 9700K on the machines at the DCS office. I'd go with that one as it has great performance and is much cheaper than the Intel 9900K or the AMD 3900X.


Edited by Sn8ke_iis

 

 


Absolutely loving my i9 9900k/Z390mb setup, next week will mark 1 year since I did the build.

Performs quite well in my flight sims and other games, even in VR.

Don B

EVGA Z390 Dark MB | i9 9900k CPU @ 5.1 GHz | Gigabyte 4090 OC | 64 GB Corsair Vengeance 3200 MHz CL16 | Corsair H150i Pro Cooler |Virpil CM3 Stick w/ Alpha Prime Grip 200mm ext| Virpil CM3 Throttle | VPC Rotor TCS Base w/ Alpha-L Grip| Point Control V2|Varjo Aero|


...

The 3900X and 3800X are about the same performance as the 9900K with the newest AGESA versions. Stop living in August of 2019 and spreading false and/or outdated information. Thank you.

 

Edit: There you go, you can see it on the second graph you yourself posted.


Edited by Der Hirte

The 3900X and 3800X are about the same performance as the 9900K with the newest AGESA versions. Stop living in August of 2019 and spreading false and/or outdated information. Thank you.

 

Edit: There you go, you can see it on the second graph you yourself posted.

 

Beg your pardon? What misinformation am I spreading exactly? Those graphs are from Dec 9, 2019, not August. The benchmarks speak for themselves. Opinions don't really matter when you have objective benchmarks. In every one of the benchmarks except for 2 use cases, Intel beats AMD in gaming performance. In Assassin's Creed the TR 3960X ($1,400) has a slight edge, and in Total War AMD has the edge, because those games use multicore processing. DCS is still bound by single-thread performance, as stated in my post.

 

The OP said he was looking at the Intel 8700 or 9700, or AMD's 3700X or 3800X. I don't think he wants to spend $1,400 on a CPU that will have no benefit in DCS. The 9700K beats those AMD processors in every gaming benchmark except for the 2 multicore games previously mentioned. The 9700K is also cheaper than the 3800X right now in the US.

 

In various productivity software AMD has an edge. The Gamer's Nexus benchmarks also confirm and replicate the same results. If you have better benchmarks please share.

 

On the second graph I posted the results are very clear, perhaps you should look at it again. I never said they weren't "about" the same performance. All the processors are "about" the same performance. That's why we look at objective benchmarks with a quantitative score and compare prices. I think you need to actually read the article and watch the video. Nothing I stated was false or outdated and the evidence is clear for everyone to see.

 

If you feel the results are in error or there is a flaw in the benchmarking methodology you need to contact Jarred or Steve directly as I did not conduct these benchmarks myself. I just posted the results.

 

Also, as you get to 4K resolution there's almost no difference, because you are hitting GPU limits. High-framerate 1080p gaming is where you see the biggest differences between CPUs. That's why I asked the OP what resolution and budget he was looking at.

 

 


Another thing to consider is how many years you want to invest in this system. Are you the type of person to run out and grab the latest stuff hitting the mainstream, or is the system you want to build intended to stand a modest test of time? If you want to stare at a frame rate counter at 1080p looking for a modestly larger number, and want the absolute top-performing system in DCS in its current state, then go blue. If you want a more future-proof system that yields the same experience with the frame rate counter turned off, offers more cores for your dollar, is on a socket that's just hitting maturity, and excels at gaming beyond 1080p, then I'd lean towards red team. Either way you go is going to yield a well-performing system in DCS with all the above advice. Vulkan is coming - see today's update - and performance-wise it's anybody's guess, but one thing is certain: DCS will no longer be a single-core pony, so there's that.

 

Sent from my SM-G965U using Tapatalk

Ryzen9 5800X3D, Gigabyte Aorus X570 Elite, 32Gb Gskill Trident DDR4 3600 CL16, Samsung 990 Pr0 1Tb Nvme Gen4, Evo860 1Tb 2.5 SSD and Team 1Tb 2.5 SSD, MSI Suprim X RTX4090 , Corsair h115i Platinum AIO, NZXT H710i case, Seasonic Focus 850W psu, Gigabyte Aorus AD27QHD Gsync 1ms IPS 2k monitor 144Mhz, Track ir4, VKB Gunfighter Ultimate w/extension, Virpil T50 CM3 Throttle, Saitek terrible pedals, RiftS

 


...
I initially scrolled to your post and saw the first graph, which gives the impression that the 9900K is far superior in gaming to the 3800X or 3900X, which is not true. It was true before the latest AGESA versions came out.

 

So I posted, and a moment later edited, because I saw your second graph, which is much closer to the truth. The more you look at the latest benchmarks, or bench yourself, the more you see that there really is no difference between these processors anymore; it only depends on the application you're benching with/using in general.


Edited by Der Hirte

If you invest in a Z390 chipset you will fall short when a card superior to the 2080 Ti arrives.

 

The 2080 Ti is likely the last single-card/max-performance GPU that can get by with PCIe v3.

 

Seeing that as a fact voids any Intel option if you plan to use the rig for 3-5 years and upgrade the GPU every 1-2 years.

 

Source: der8auer in one of his recent YT videos.

 

Using SLI on an Intel desktop chipset (i.e. Z370/Z390) with 2 x 2080 Ti at high fps will cut your performance, whereas those 2 cards could thrive with 2 x 8x PCIe v4.

 

https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/


Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


I initially scrolled to your post and saw the first graph, which gives the impression that the 9900K is far superior in gaming to the 3800X or 3900X, which is not true. It was true before the latest AGESA versions came out.

 

So I posted, and a moment later edited, because I saw your second graph, which is much closer to the truth. The more you look at the latest benchmarks, or bench yourself, the more you see that there really is no difference between these processors anymore; it only depends on the application you're benching with/using in general.

 

Which benchmarks? I always look for them. The benchmarks I posted from PC Gamer are from Dec 9, 2019. Just posting that there are different benchmarks without linking to them doesn't really help anybody. I'm not aware of any source that does better benchmarking and testing than Gamer's Nexus and Jarred Walton.

 

Most people use their gaming rigs for gaming. Sometime in the next year I hope to build a second rig for CC and it will probably be an AMD 3900X as AMD performs really well in video editing and encoding benchmarks. Then I can run some comparison benchmarks for DCS specifically. I would love to do some benchmarks as thorough and exhaustive as GN and PCG but would need a budget of $20,000+ to buy all the chips and motherboards.

 

DCS is limited by single-thread performance. You should read the article and watch the video. The 9900K beats the 3900X by 10-30 fps in all of the benchmarks except for the 2 that I already mentioned. The mean difference on the first slide was 10 fps.

 

The OP says he was looking at AMD's 3700X, 3800X and Intel's 8700K and 9700K. The 9700K is currently cheaper than the 3800X in the US.

 

I was just providing the OP with data so that he can make an informed decision rather than rely on opinions which are subject to bias.


Edited by Sn8ke_iis
typo

 

 


Just to add to this i run on a 9700K with an EVGA RTX2080 and this gives me solid performance in VR with a Rift S.

 

I debated the i9 when i bought the new rig a year ago however just thought it was a waste of money given that DCS is my main interest.

 

TO note it runs Cod, Battlefield all brilliantly in 4K also.

---------------------------------------------------------------------------------------------------------------------------

 DCS & BMS

F14B | AV-8B | F15E | F18C | F16C | F5 | F86 | A10C | JF17 | Viggen |Mirage 2000 | F1 |  L-39 | C101 | Mig15 | Mig21 | Mig29 | SU27 | SU33 | F15C | AH64 | MI8 | Mi24 | Huey | KA50 | Gazelle | P47 | P51 | BF109 | FW190A/D | Spitfire | Mossie | CA | Persian Gulf | Nevada | Normandy | Channel | Syria | South Atlantic | Sinai 

 Liquid Cooled ROG 690 13700K @ 5.9Ghz | RTX3090 FTW Ultra | 64GB DDR4 3600 MHz | 2x2TB SSD m2 Samsung 980/990 | Pimax Crystal/Reverb G2 | MFG Crosswinds | Virpil T50/CM3 | Winwing & Cougar MFD's | Buddyfox UFC | Winwing TOP & CP | Jetseat


PS - I should mention that whilst i have mine on an OC it runs really well without.

---------------------------------------------------------------------------------------------------------------------------

 DCS & BMS

F14B | AV-8B | F15E | F18C | F16C | F5 | F86 | A10C | JF17 | Viggen |Mirage 2000 | F1 |  L-39 | C101 | Mig15 | Mig21 | Mig29 | SU27 | SU33 | F15C | AH64 | MI8 | Mi24 | Huey | KA50 | Gazelle | P47 | P51 | BF109 | FW190A/D | Spitfire | Mossie | CA | Persian Gulf | Nevada | Normandy | Channel | Syria | South Atlantic | Sinai 

 Liquid Cooled ROG 690 13700K @ 5.9Ghz | RTX3090 FTW Ultra | 64GB DDR4 3600 MHz | 2x2TB SSD m2 Samsung 980/990 | Pimax Crystal/Reverb G2 | MFG Crosswinds | Virpil T50/CM3 | Winwing & Cougar MFD's | Buddyfox UFC | Winwing TOP & CP | Jetseat


If you invest in a Z390 chipset you will fall short when a card superior to the 2080 Ti arrives.

 

The 2080 Ti is likely the last single-card/max-performance GPU that can get by with PCIe v3.

 

Seeing that as a fact voids any Intel option if you plan to use the rig for 3-5 years and upgrade the GPU every 1-2 years.

 

Source: der8auer in one of his recent YT videos.

 

Using SLI on an Intel desktop chipset (i.e. Z370/Z390) with 2 x 2080 Ti at high fps will cut your performance, whereas those 2 cards could thrive with 2 x 8x PCIe v4.

 

https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/

 

The OP didn't mention anything about SLI (1 x 1070), and the link you posted doesn't test PCIe 4.0. It's not a fact yet, and it doesn't void Intel; you are just speculating, as the new cards would have to surpass the theoretical bandwidth threshold of PCIe 3.0. Which, if they do, would be very impressive.

 

I tried to find the video you were referring to and couldn't find it. I would be interested to watch it, as I respect his work and learned a lot about overclocking from him. I did find this one.

 

 

Here's a good explanation of PCIe 3.0 vs PCIe 4.0. On its face, having a motherboard that supports 4.0 would make sense for future-proofing, but the reality is more nuanced and complex.

 

https://www.pcworld.com/article/3400176/pcie-40-everything-you-need-to-know-specs-compatibility.html

 

"...because few games ever saturate the 32GBps of data today’s x16 PCIe 3.0 slot can carry."

 

"Ryzen 3000 “only” can support a single-slot x16 PCIe 4.0,..."

 

So there is actually no difference among current gaming CPUs here, as Intel's 9000 series/Z390 chipset and AMD's 3000 series/X570 chipset can each still only support 1 full PCIe x16 link to the GPU. Both can have multiple x16-length slots, but the CPU and chipset can still only fully utilize 1 PCIe x16 slot at a time.

 

To get more PCIe lanes for dual x16 you have to use the expensive HEDT X-series or Xeon processors with the X299 chipset, or AMD Threadripper. Those chips do not have better single-thread performance in synthetic benchmarks.

 

https://www.cpubenchmark.net/singleThread.html

 

In that benchmark, and on AMD's website, it does show good single-thread performance for the PRO processors, but those don't seem to be available for DIY systems, only prebuilts.

 

Here's some real world gaming tests comparing PCIe 3.0 to 4.0 in 20+ games.

 

https://www.techpowerup.com/review/pci-express-4-0-performance-scaling-radeon-rx-5700-xt/

 

CONCLUSION:

 

"Looking at the results, we can see a whole lot of nothing. PCI-Express 4.0 achieves only tiny improvements over PCI-Express 3.0, in the sub-1-percent range. When averaged over all our benchmarks, we barely notice a 1% difference to PCIe Gen 3. I also included data for PCI-Express Gen 2, data which can be used interchangeably to represent PCIe 3.0 x8 (or PCIe 4.0 x4). Here, the differences are a little bit more pronounced, but with 2%, not much to write home about, either. These results align with what we found in previous PCI-Express scaling articles.

 

That's of course a good thing as it confirms that you do not need an expensive PCI-Express 4.0 motherboard to maximize the potential of AMD's new Radeon RX 5700 XT. It also produces strong evidence that PCIe 4.0 won't be needed for even more powerful next-gen graphics cards because our three tested resolutions reveal more details.

 

If you look closely, you'll notice that lower resolutions show bigger differences in performance when changing the PCI-Express bandwidth, which seems counter-intuitive at first. Doesn't the graphics card work harder at higher resolutions? While that may (mostly) be true, graphics card load does not increase PCI-Express bandwidth; it actually lowers it because frame rates are lower. The amount of data transferred over the PCIe bus is fairly constant—per frame. So if the graphics card can run at higher FPS rates because the resolution is lower, the PCIe bus does have more traffic moving across it.

 

So even if next-gen graphics cards significantly increase performance, we won't see huge differences in PCIe requirements because you'd not use those graphics cards at 1080p, but rather 4K. In such a scenario, with FPS increasing on 4K, the difference in scores would be more similar to this review's data for 1440p, or even 1080p.

 

When looking at individual game results, the effects of constrained PCIe bandwidth vary wildly. Some games, Shadow of the Tomb Raider, for example, barely show measurable differences, while titles like Rage 2 and Wolfenstein are much more dependent on PCI-Express bandwidth. I can't see a clear trend between APIs or engines as it rather looks like a dependency on how the game developer chooses to implement their game and how much data they copy from the CPU to the GPU, or even back.

 

These results are also good news for people who consider running their graphics card at reduced link width, like x8 or even x4, to free up precious PCI-Express lanes for other devices, like storage."
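To put rough numbers on the quoted logic (per-frame data is roughly constant, so bus traffic scales with fps), here's a back-of-envelope sketch. Only the ~32 GB/s x16 figure comes from the PCWorld quote above; the per-frame payload is a number I made up purely for illustration.

```python
# Back-of-envelope: PCIe traffic ~= (data per frame) x (frames per second),
# so LOW resolutions at HIGH fps can push more over the bus than 4K does.
# The 60 MB/frame payload below is invented for illustration.

PCIE3_X16_GBPS = 32.0  # approx. one-way bandwidth of a PCIe 3.0 x16 slot
PCIE3_X8_GBPS = 16.0   # half the lanes, half the bandwidth

def bus_traffic_gbps(mb_per_frame: float, fps: float) -> float:
    """GB/s moved over the bus if every frame copies mb_per_frame MB."""
    return mb_per_frame * fps / 1000.0

print(bus_traffic_gbps(60, 300))  # 18.0 GB/s: over the x8 limit at 300 fps
print(bus_traffic_gbps(60, 80))   # 4.8 GB/s: easily fits x8 at 4K frame rates
```

Same hypothetical game, but the high-fps 1080p case needs almost four times the bus bandwidth of the 4K case, which matches the article's counter-intuitive finding.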

 

Hopefully we'll have the Vulkan API update that takes better advantage of modern processors by this time next year. I'll definitely run some comparison benchmarks when it arrives.

 

 


I think that Intel is better. Unfortunately, DCS is presently a single-core game and multi-threading is of no use. The Intel i5 9600K/KF is the perfect CPU for DCS when it overclocks to 5 GHz, because its single-core capability beats all the AMD CPUs.

3rd-gen AMD Ryzen is weak at overclocking and its frequency is still too low compared to Intel. Moreover, memory latency is still a big problem for Ryzen.

However, in the future DCS will switch to Vulkan, which is good news for AMD Ryzen and AMD Navi. AMD will win the future. If you don't want to wait, just choose Intel.


Attache ta tuque avec d'la broche.



 

:megalol:

 

They have to convert 4.3 million lines of code. It might actually be a while before the new engine is out; this time next year is probably optimistic at best. There's a popular GA sim that's also doing a Vulkan conversion, and it's taking a long time.


Edited by Sn8ke_iis
typo

 

 


I would need to check my YT history to find the video where he mentions that you will lose fps if you cut the 2080 Ti down to PCIe v3 x8. The card needs somewhere around a "best guess" x12 worth of bandwidth: too much for x8, but still fitting through the x16 bandwidth door. A new card could need x14-16 and be just about the last card able to run on x16 v3, or the first card that needs x16 plus 1 or 2 lanes' worth, so it would need x16 v4 (then with bandwidth to spare).

Time will tell how much bus bandwidth the new cards can use when used in 1080p and ultra high fps.

 

SLI is a different beast. With those cards, even x8 v4 would by that logic be a bottleneck, which means v5 needs to come sooner than thought.

 

If you play above 1080p, all this is no issue. I have no data on how much VR fills the bus, maybe more maybe less, no clue.

 

I would be eager to see your 2080 Ti SLI at 1080p on your 9900K, and then on a 3900X or 3950X "IF" that card were a v4 card, but it isn't. So using such a combo to drive 1080p at 300Hz would limit the achievable frames, as x8 v3 is too narrow for 2080 Tis in high-fps scenarios.

 

That is basically what I meant.

 

* It could also well be that when you watch der8auer in English, this little side note isn't even mentioned. The videos are not word-for-word the same, from what I have seen.


Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


I'm totally hijacking the thread off topic. Hope the OP doesn't mind. I sold my second 2080 Ti, got a good price for it too. Shipped it off a few days ago. I did get some good benchmark data before I shipped it. The CPU is definitely the bottleneck with 2080 Tis, even with a 9900K. I'm planning a more thorough post so it's searchable on the forums, but here's the TL;DR of what I got. SLI really only helps at 1440p and up, and with SSAA and MSAA settings. At 1080p, current CPUs just can't issue enough draw calls with the current engine. These were all on the latest Open Beta with December's Nvidia driver, on the stock mission over Tbilisi, right over the airfields and the city with lots of trees and smokestacks, so it's a good stress test. I wanted to do some runs for the F-18 too with all the MFDs on, but I ran out of time.

 

DCS TF-51 Single GPU

1080p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 16:26:17 DCS.exe benchmark completed, 2970 frames rendered in 21.579 s

Average framerate : 137.6 FPS

Minimum framerate : 122.4 FPS

Maximum framerate : 143.6 FPS

1% low framerate : 97.7 FPS

0.1% low framerate : 58.1 FPS

 

DCS TF-51 Dual GPU SLI

1080p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 21:20:37 DCS.exe benchmark completed, 4354 frames rendered in 31.047 s

Average framerate : 140.2 FPS

Minimum framerate : 137.1 FPS

Maximum framerate : 143.3 FPS

1% low framerate : 114.1 FPS

0.1% low framerate : 99.1 FPS

 

DCS TF-51 Single GPU

1440p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 19:27:37 DCS.exe benchmark completed, 2222 frames rendered in 18.063 s

Average framerate : 123.0 FPS

Minimum framerate : 120.7 FPS

Maximum framerate : 128.1 FPS

1% low framerate : 112.7 FPS

0.1% low framerate : 97.2 FPS

 

DCS TF-51 Dual GPU SLI

1440p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 20:11:45 DCS.exe benchmark completed, 3897 frames rendered in 27.688 s

Average framerate : 140.7 FPS

Minimum framerate : 136.4 FPS

Maximum framerate : 145.7 FPS

1% low framerate : 121.0 FPS

0.1% low framerate : 102.4 FPS

 

DCS TF-51 Single GPU

4096x2160p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 23:08:58 DCS.exe benchmark completed, 2098 frames rendered in 25.547 s

Average framerate : 82.1 FPS

Minimum framerate : 79.9 FPS

Maximum framerate : 84.6 FPS

1% low framerate : 78.8 FPS

0.1% low framerate : 58.2 FPS

 

DCS TF-51 Dual GPU SLI

4096x2160p High Preset, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

05-01-2020, 23:44:22 DCS.exe benchmark completed, 5352 frames rendered in 38.656 s

Average framerate : 138.4 FPS

Minimum framerate : 130.7 FPS

Maximum framerate : 143.4 FPS

1% low framerate : 106.3 FPS

0.1% low framerate : 80.2 FPS

 

It worked out to almost 70% scaling in SLI at 4k resolution at high preset. I was pleasantly shocked. That's way better than most of the games Gamer's Nexus tested in SLI.

 

I then went balls to the wall and cranked up all the important settings: extreme draw distance, 4x MSAA, and 2x SSAA. The results were even better.

 

DCS TF-51 Single GPU

4096x2160p Custom Extreme, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

06-01-2020, 00:33:24 DCS.exe benchmark completed, 2411 frames rendered in 43.844 s

Average framerate : 54.9 FPS

Minimum framerate : 53.5 FPS

Maximum framerate : 56.0 FPS

1% low framerate : 52.3 FPS

0.1% low framerate : 51.7 FPS

 

DCS TF-51 Dual GPU SLI

4096x2160p Custom, Mirrors On, Caucasus, Instant Action: Flight Over Tbilisi

 

06-01-2020, 00:58:18 DCS.exe benchmark completed, 4455 frames rendered in 44.172 s

Average framerate : 100.8 FPS

Minimum framerate : 95.0 FPS

Maximum framerate : 105.4 FPS

1% low framerate : 63.6 FPS

0.1% low framerate : 36.8 FPS

 

That's 83% scaling. I saw those numbers and was :huh: but I double-checked and did some more runs, and the math checked out again. They've definitely made some improvements to the engine over the last year, and it looks really, really beautiful in full 4K with all the eye candy cranked. Even on a 65" big screen you can't see any jaggies. I was flying along the Normandy coast around Mont St Michel and the frame rate counter hit 160+ out over the water. :)
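For anyone who wants to sanity-check numbers like these: the averages, the 1%/0.1% lows, and the SLI scaling are all simple arithmetic on the frametime log. Below is my own sketch of how such stats are commonly derived (capture tools differ in the exact percentile method), run on a fake log I made up for illustration:

```python
def avg_fps(frametimes_ms):
    """Average fps over a log of per-frame render times in milliseconds."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def low_fps(frametimes_ms, fraction):
    """fps of the slowest `fraction` of frames, e.g. 0.01 for the 1% low.
    Sketch only: real capture tools vary in how they pick the percentile."""
    worst = sorted(frametimes_ms, reverse=True)
    k = max(1, int(len(worst) * fraction))
    slow = worst[:k]
    return 1000.0 * len(slow) / sum(slow)

def sli_scaling_pct(single_fps, sli_fps):
    """How much faster the SLI average is, as a percentage."""
    return 100.0 * (sli_fps / single_fps - 1.0)

# Fake log: 990 smooth 7 ms frames plus 10 stutters at 20 ms.
log = [7.0] * 990 + [20.0] * 10
print(round(avg_fps(log), 1))        # 140.3 average fps
print(round(low_fps(log, 0.01), 1))  # 1% low: 50.0 (just the stutter frames)

# The scaling figures above, recomputed from the 4K averages:
print(round(sli_scaling_pct(82.1, 138.4), 1))  # 68.6 (the "almost 70%" run)
print(round(sli_scaling_pct(54.9, 100.8), 1))  # 83.6 (the "83% scaling" run)
```

Note how a handful of 20 ms stutters barely moves the average but tanks the 1% low, which is why the lows are worth reporting alongside the averages.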

 

2O44Vnl.jpg

 

31eEI8I.jpg

 

It's getting late here and I've got to get some sleep; hopefully people found all the data useful. Hopefully I get that rig built sometime this year for a good Intel vs AMD comparison. The 3080 Ti Ampere is definitely on my list when it's released, but I'm tapped out on disposable fun money for the next couple months. Meanwhile I've been learning Blender to do video editing for YT videos.


Edited by Sn8ke_iis
typo

 

 

Link to comment
Share on other sites

Your SLI is not bad at all at high/crazy LOD and high res, which makes sense: those settings hardly touch the CPU (apart from view range, I guess) and do not add more bus bandwidth.

 

At 4k and crazy LOD your SLI shined, respect.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 



 

DCS does not use multithreading

 

i don’t know where you read this but it is factually incorrect.

 

you can see for yourself with task manager or any other process tool.

 

please stop posting this misinformation. it’s misleading and simply wrong.


Afaik and iirc, it does use threading wherever possible, according to a statement from a lead dev here in the forum not too long ago, so both are somehow correct.

 

It still hammers too much on 1 core, but they are midway to multithreading whatever allows it as they redo the engine.

 

I wish there was a sticky post from a lead dev about this topic to give an end to our endless discussions about this. It would calm down the crowd :helpsmilie:
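The "hammers too much on 1 core" point is essentially Amdahl's law: if only part of the per-frame work can run in parallel, extra cores give limited returns. A purely illustrative sketch follows; the parallel fractions in it are invented, I'm not claiming to know DCS's actual numbers.

```python
# Amdahl's law: with a parallel fraction p spread over n cores, the
# serial remainder (1 - p) still runs on one core and caps the speedup.
# The p values below are made up for illustration, not measured in DCS.

def speedup(p: float, n: int) -> float:
    """Overall speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# If only 30% of the frame work were parallel, 8 cores would barely help:
print(round(speedup(0.3, 8), 2))   # 1.36x
# If a rework (e.g. Vulkan) lifted that to 90%, core count starts to pay:
print(round(speedup(0.9, 8), 2))   # 4.71x
```

So "it spawns many threads" and "it's still limited by one core" can both be true at once: plenty of threads exist, but the serial part dominates the frame time.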

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 



 

open a process tool and look at the “threads” section. you can easily see that there are at least 40 threads in the process.

 

next, take a look at how much cpu time is used by each thread. you will see that every thread has some cpu usage, so there are multiple threads in the process and they are each using cpu time.

 

dcs is multithreaded.

 

if it wasn’t, you could lock it to a single (1) core and get the same or better performance.

