
AMD 5000 Series CPU Performance in DCS


Recommended Posts

From what I can see, all the AMD bashing we've seen since the launch of Zen 3 is going down in flames.

 

I haven't been playing DCS for a while because of issues with my G2, but all in all I'm more than pleased with the performance I got from my 5600X.

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 



13 hours ago, danielwwm said:

Hopefully the new CPU will address this issue for me as average and peak FPS have been generally ok.

Don't expect too much from the current graphics engine. The performance leap going from a 2600K to a 5900X will be much more pronounced once we have Vulkan in Q3 or thereabouts.


Ryzen 9 5900X | 64GB G.Skill TridentZ 3600MHz CL16 | Gigabyte RX6900XT | ASUS ROG Strix X570-E GAMING | Samsung 960Pro NVMe 1TB | HP Reverb G2
Pro Flight Trainer Puma | TM Warthog (with custom spring, 10 cm extension, custom TDC, replacement pinky switch) on Wheelstand Pro | TPR rudder pedals

My in-game DCS settings (PD 1.0 SteamSS 76%):

[screenshot of in-game settings]

 

13 hours ago, danielwwm said:

May I ask you guys who have already upgraded to the new Ryzen 5000 series how much the frame rate dips are when both MFDs are pulled up? Hopefully the new CPU will address this issue for me as average and peak FPS have been generally ok.

I made a frametime measurement using the Oculus Debug Tool. Low level in Syria in the F/A-18C at around 500kts with MFDs off, I get 9-12ms CPU frametimes at 500-1000ft AGL over the coast, including cities, which is enough for 80fps (the target for my Rift S).

 

Once I pull up both the FLIR and MAVR pages and play around to lock onto something, I jump to an 18ms average (ranging from 16 to 22ms). My fps dips to 40 in that case; still decent with reprojection, but noticeable.

 

This was done with a stock Ryzen 5 5600X, 32GB of dual-rank DDR4-3000 CL16, and the game on a dedicated NVMe drive.

 

Overall I am impressed by the CPU change; coming from a 2600X I saw a large increase in overall smoothness, especially in large missions and multiplayer. I am sure you will be happy with yours too.


Edited by Qiou87

AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS

On 12/29/2020 at 11:07 AM, Thinder said:

My Ryzen 5 5600X was dispatched today, I should get it tomorrow or Thursday, then I'll run a benchmark to compare with my 3600X...

 

 

 

On 1/3/2021 at 8:34 AM, aleader said:

I didn't read every single post in this topic, but has anyone who had a 3600/X/XT and upgraded to a 5600X actually tested DCS yet at 1440p?  It's a no-brainer that anything higher than a 5600X for gaming is a waste of cash (look at the benchmarks), but DCS is an old, unoptimized beast and I'm wondering if the bump in clock speed makes any difference.

 

I am either going to give my 3600 to my son (I gave him my MSI Carbon B450 board) and get a 5600X (I can pick it up tomorrow at Memory Express), or just have him buy his own 3600.  I only paid $230 CAD ($170 USD) for my 3600 and they are charging $420 CAD for the 5600X, so basically $200 more.  From what I can see, the benefit at 1440p is about 3%.

 

Anyways, has anyone actually tested it?

Yeah, help a brother out. Can anyone point me to some posts that share numbers from before and after, not just after? Thanks Juimo. A lot of us don't have time to read through all of these.

SimRig: Winter '18/19

Asus Maximus XI Formula | Intel 9900K @ 5.4 GHz Single Core, 5.2 GHz All Core | Nvidia/EVGA 3090 K|NGP|N @ ???? MHz | EKWB Direct Die Custom Liquid Cooling Loop | Gskill 32 GB DDR4 @ 3600 MHz | Samsung 512 GB, 1 TB NVMe SSDs | Asus/Seasonic 1200w PSU

Samsung Q6FN 65" 4K | Realsim FSSB | TM Cougar HOTAS | Vipergear IPC | Buddyfox UFC | Saitek Trim Wheel | MFG Crosswinds Rudder Pedals | Playseat Flightseat | Monstertech HOTAS mount | Gametrix Jetseat

 

4 hours ago, Sn8ke_iis said:

Yeah, help a brother out. Can anyone point me to some posts that share numbers from before and after, not just after? Thanks Juimo. A lot of us don't have time to read through all of these.

Here are my before/after measurements, coming from a 2600X, so a wider gap than from a 3600 or 3600X:

 

Scenario #1: low-level F/A-18 in Syria, take-off from Rene Mouawad in single player, fly south along the coastline, max 1000ft AGL.
2600X: min 14ms, average 17ms, max 21ms
5600X: min 9ms, average 11ms, max 12ms

Scenario #2: roll and take-off from the Supercarrier in the F/A-18 in Syria, multiplayer on the 4YA server with ~20 active players.
2600X: min 18ms, average 23ms, max 37ms
5600X: min 12ms, average 14ms, max 19ms

 

To go from frametime to frames per second, just compute 1/frametime, with the frametime in seconds. E.g. 11ms: 1/0.011 ≈ 90 fps. Of course the actual fps also depends on the GPU; these numbers are only from the CPU measurement.
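
To make the conversion concrete, here is a small sketch in Python (the frametime samples are hypothetical, standing in for a real log export):

# Convert a list of CPU frametimes (in milliseconds) into min/average/max
# and the equivalent fps. The sample values below are made up for illustration.
frametimes_ms = [9.1, 10.4, 11.0, 11.8, 12.3]

def to_fps(ms):
    # 1 / frametime, with the frametime converted from milliseconds to seconds
    return 1000.0 / ms

avg = sum(frametimes_ms) / len(frametimes_ms)
print(f"min {min(frametimes_ms):.1f} ms -> {to_fps(min(frametimes_ms)):.0f} fps")
print(f"avg {avg:.1f} ms -> {to_fps(avg):.0f} fps")
print(f"max {max(frametimes_ms):.1f} ms -> {to_fps(max(frametimes_ms)):.0f} fps")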


AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS


I'd be more interested in seeing the difference from a 3600 to a 5600X, as it's a far more widely installed CPU at this point and it did offer a pretty good jump over the 2600, even at higher resolutions.  I just installed my 5600X about a week ago and I see literally no difference in DCS at 1440p (coming from the 3600 that I gave to my son).  This new testing from HW Unboxed sheds light (yet again) on how little you gain by buying anything more than a 3600 for gaming, especially at 1440p and beyond:

 

https://www.youtube.com/watch?v=PSxuiWih_Z8

 

You are always far better off spending your cash on a GPU upgrade.


Edited by aleader

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth


In gaming in general I agree.

 

However, in DCS, when there are a lot of assets on a busy map/scenario, every tick of the CPU on a single thread makes a difference - especially with the jets, where the radar systems are basically a whole other (simplified) computer system being modelled inside the aircraft.

 

Even on the WW2 maps - get a flight of 20+ B-17s on a bombing run, plus the associated flak, little friends, and the red team, and single-core performance matters in a big way.

 

I'd suggest that in those scenarios you'd see a difference between a 3600 and 5600X in DCS.

 

There are also a number of graphics elements in DCS that are CPU-bound - if you have them dialed back/off then you'd likely not notice the difference either.

 

$0.02

11 hours ago, reece146 said:

In gaming in general I agree.

 

However, in DCS, when there are a lot of assets on a busy map/scenario, every tick of the CPU on a single thread makes a difference - especially with the jets, where the radar systems are basically a whole other (simplified) computer system being modelled inside the aircraft.

 

Even on the WW2 maps - get a flight of 20+ B-17s on a bombing run, plus the associated flak, little friends, and the red team, and single-core performance matters in a big way.

 

I'd suggest that in those scenarios you'd see a difference between a 3600 and 5600X in DCS.

 

There are also a number of graphics elements in DCS that are CPU-bound - if you have them dialed back/off then you'd likely not notice the difference either.

 

$0.02

 

Without any actual benchmark data in DCS, this is always just pure speculation.  It's too bad that MSFS wasn't included in their tests as it is very similar to DCS, although based on a much more modern platform obviously.  RDR2 is also a very CPU-driven game and the results are pretty much the same.  I just don't believe that DCS is so special that it is the only game that requires a 5950X.  Maybe the most poorly-optimized game in existence...😉

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

On 1/17/2021 at 5:26 AM, aleader said:

 

Without any actual benchmark data in DCS, this is always just pure speculation.  It's too bad that MSFS wasn't included in their tests as it is very similar to DCS, although based on a much more modern platform obviously.  RDR2 is also a very CPU-driven game and the results are pretty much the same.  I just don't believe that DCS is so special that it is the only game that requires a 5950X.  Maybe the most poorly-optimized game in existence...😉

No one said anything about a 5950X, as it has too many cores and DCS really just needs one. But I did do a lot of measurements (it is not hard, you can try it for yourself; you just need software that can record CPU and GPU frametimes, and to repeat the same mission/flight), and yes, graphics is extremely important (especially in VR when running supersampling, roughly equal to 2160p gaming for me). However, even in these circumstances you can find situations where CPU frametimes are HIGHER than GPU frametimes. This means your GPU has finished calculating the frame but the CPU has not, so you get a lower framerate because of the CPU. Situations where this happens, according to my own measurements, are very heavy SP missions (although only momentarily, such as when multiple weapons are launched in sync) and multiplayer (again, on a heavy map with lots of players).
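
To make the comparison concrete, here is a rough sketch in Python (the per-frame values are hypothetical; tools like fpsVR or the Oculus performance HUD report CPU and GPU frametimes separately):

# Hypothetical per-frame CPU and GPU times in milliseconds.
frames = [
    (9.5, 11.2),   # GPU is slower  -> GPU-bound frame
    (14.8, 10.9),  # CPU is slower  -> CPU-bound frame
    (18.2, 12.1),
    (10.1, 11.8),
]

for cpu_ms, gpu_ms in frames:
    bound = "CPU-bound" if cpu_ms > gpu_ms else "GPU-bound"
    frame_ms = max(cpu_ms, gpu_ms)  # roughly, the slower side paces the frame
    print(f"CPU {cpu_ms:5.1f} ms | GPU {gpu_ms:5.1f} ms | {bound} | ~{1000 / frame_ms:.0f} fps")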

 

Does this mean you should spend more on CPU than GPU for DCS? Absolutely not. But the CPU matters a lot, especially single-threaded performance. Even in my situation (very high resolution in VR with a powerful but not top-of-the-line GPU) I see a lot of improvement in smoothness since my CPU switch. And contrary to what you were saying, the gap from the 2600X to the 3600 was actually quite small (exclusively IPC, ~13% on average), while the gap from the 3600 to the 5600X is bigger (~19% IPC improvement plus ~10% frequency improvement).
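
As a rough sanity check on those percentages (the gains quoted above are approximate generation-over-generation averages, not DCS-specific measurements), the IPC and frequency improvements multiply rather than add:

# Rough compounding of the quoted gains: ~13% IPC for 2600X -> 3600,
# ~19% IPC plus ~10% frequency for 3600 -> 5600X. Illustrative only.
gain_2600x_to_3600 = 1.13
gain_3600_to_5600x = 1.19 * 1.10  # IPC and clock gains multiply, not add

print(f"3600 over 2600X:  ~{(gain_2600x_to_3600 - 1) * 100:.0f}% single-thread")
print(f"5600X over 3600:  ~{(gain_3600_to_5600x - 1) * 100:.0f}% single-thread")
print(f"5600X over 2600X: ~{(gain_2600x_to_3600 * gain_3600_to_5600x - 1) * 100:.0f}% single-thread")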


AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS

20 hours ago, Qiou87 said:

No one said anything about a 5950X, as it has too many cores and DCS really just needs one.

Not true. DCS has more than 20 threads running in the background. Also, part of the roadmap for 2021 is to implement better multithreading. I guess the 5950X might be handy. BTW, I ordered one.


Edited by max22
1 hour ago, max22 said:

Not true. DCS has more than 20 threads running in the background. Also, part of the roadmap for 2021 is to implement better multithreading. I guess the 5950X might be handy. BTW, I ordered one.

 

It has been on the roadmap for many years; better multithreading is a journey, not a destination. I will believe it when I see it. Many things in DCS have been on the roadmap for many years; that doesn't mean they are coming in two weeks.

DCS has multiple threads but only pushes one core to 100%. That means single-core performance is the key determining factor in DCS CPU performance, and having 12, 18 or 24 cores won't change that.
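
A quick way to check this on your own machine is to sample per-core load while flying; the sketch below assumes Python with the third-party psutil package installed (Task Manager's per-core graphs show the same thing):

import psutil  # third-party package: pip install psutil

# Sample per-core utilization once per second for 30 seconds while DCS runs.
# If the game is single-thread bound, one core should sit near 100%
# while the others stay comparatively idle.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| busiest core: {max(per_core):.1f}%")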

 

But hey, we all need to convince ourselves that we spend our money in the best way possible, and the 5950X is an incredible processor. Right now though, you'd get exactly the same performance in DCS (CPU-wise) from a 5600X with PBO2 to equalize the frequency deficit.

 

And once multithreading is implemented, you might ask? Well, that's not bad news for any quad- or hexa-core either: since they also have cores doing almost nothing in DCS today, they will benefit as well. In the end I just don't think we are going to see a point, multithreading optimization or not, where running a 12- or 16-core CPU provides significant benefits over a 6- or 8-core CPU in DCS. Why do I think that? Well, today the single-core performance of a Zen 3 CPU is enough to reach comfortable FPS levels in DCS, and then the graphics card becomes the determining factor. Of course you can always run into more CPU bottlenecks if you run an RTX 3090 at 720p and aim for 200fps, but realistically I don't think anyone is aiming higher than 90fps (for VR headsets) or 60fps (for pancake). Flight simulation has never been about very high framerates (just ask the Flight Simulator crowd).

And so if a 5600X, whilst using mostly one core, is able to reach 90fps in the current engine, I really don't see how it would be at a disadvantage once more cores are used. After all it does have 5 under-utilized cores today in DCS, so plenty of headroom.

 

The 5950X is, again, a great processor; it provides incredible performance and even has reasonable power consumption. But don't fool yourself into thinking that such a CPU will be mandatory to run DCS once more multithreading is implemented, because in my humble opinion that time will not come for a long while (and by then we will all have 24-core CPUs with 1TB of DDR6).


Edited by Qiou87

AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS

3 hours ago, Qiou87 said:
It has been on the roadmap for many years; better multithreading is a journey, not a destination. I will believe it when I see it. Many things in DCS have been on the roadmap for many years; that doesn't mean they are coming in two weeks.

DCS has multiple threads but only pushes one core to 100%. That means single-core performance is the key determining factor in DCS CPU performance, and having 12, 18 or 24 cores won't change that.

But hey, we all need to convince ourselves that we spend our money in the best way possible, and the 5950X is an incredible processor. Right now though, you'd get exactly the same performance in DCS (CPU-wise) from a 5600X with PBO2 to equalize the frequency deficit.

And once multithreading is implemented, you might ask? Well, that's not bad news for any quad- or hexa-core either: since they also have cores doing almost nothing in DCS today, they will benefit as well. In the end I just don't think we are going to see a point, multithreading optimization or not, where running a 12- or 16-core CPU provides significant benefits over a 6- or 8-core CPU in DCS. Why do I think that? Well, today the single-core performance of a Zen 3 CPU is enough to reach comfortable FPS levels in DCS, and then the graphics card becomes the determining factor. Of course you can always run into more CPU bottlenecks if you run an RTX 3090 at 720p and aim for 200fps, but realistically I don't think anyone is aiming higher than 90fps (for VR headsets) or 60fps (for pancake). Flight simulation has never been about very high framerates (just ask the Flight Simulator crowd).

And so if a 5600X, whilst using mostly one core, is able to reach 90fps in the current engine, I really don't see how it would be at a disadvantage once more cores are used. After all it does have 5 under-utilized cores today in DCS, so plenty of headroom.

The 5950X is, again, a great processor; it provides incredible performance and even has reasonable power consumption. But don't fool yourself into thinking that such a CPU will be mandatory to run DCS once more multithreading is implemented, because in my humble opinion that time will not come for a long while (and by then we will all have 24-core CPUs with 1TB of DDR6).

 


Yeah, I agree.
Also, when multithreading becomes a thing, it is not likely that a 16-core CPU will be 50% faster than an 8-core CPU.
Usually with this kind of computing there are diminishing returns.

 


Edited by VirusAM

Vincent "Virus" DThe

PC: R9 5900x/RTX2080Ti, 64GB RAM.

Joystick bases: Virpil T-50CM2 with 20cm extension (Center)

Joystick grips: Realsimulator F18

Throttles: Winwing Super Taurus

Hardware: Saitek Combat Rudder, 4x Thrustmaster Cougar MFD, Logitech G13, Winwing Panels

VR: Valve Index 

Monitor: Samsung Odyssey G5, TrackIr v5

On 1/19/2021 at 3:20 AM, Qiou87 said:

No one said anything about a 5950X, as it has too many cores and DCS really just needs one. But I did do a lot of measurements (it is not hard, you can try it for yourself; you just need software that can record CPU and GPU frametimes, and to repeat the same mission/flight), and yes, graphics is extremely important (especially in VR when running supersampling, roughly equal to 2160p gaming for me). However, even in these circumstances you can find situations where CPU frametimes are HIGHER than GPU frametimes. This means your GPU has finished calculating the frame but the CPU has not, so you get a lower framerate because of the CPU. Situations where this happens, according to my own measurements, are very heavy SP missions (although only momentarily, such as when multiple weapons are launched in sync) and multiplayer (again, on a heavy map with lots of players).

 

Does this mean you should spend more on CPU than GPU for DCS? Absolutely not. But the CPU matters a lot, especially single-threaded performance. Even in my situation (very high resolution in VR with a powerful but not top-of-the-line GPU) I see a lot of improvement in smoothness since my CPU switch. And contrary to what you were saying, the gap from the 2600X to the 3600 was actually quite small (exclusively IPC, ~13% on average), while the gap from the 3600 to the 5600X is bigger (~19% IPC improvement plus ~10% frequency improvement).

 

Obviously I was saying that a 5950X is the extreme example of what a LOT of DCS players on this site seem to think you need for DCS (see the post above).  I understand frametimes, but I don't have the time or patience to do all that testing; I rely on sites like HW Unboxed to do the work for me.  As I said, DCS is not that special, other than its incredibly poor optimization.  To me it's on the same level as Combat Mission, Steel Beasts, IL-2, etc.  All sim games are very poorly optimized for modern CPUs.  Wait until the new clouds come; if it's anything like the lighting release, it will be a shi*tshow. 

 

As I think I've said before, I did see a 'smoothness' improvement moving from my i5 4670K @4.5GHz to a 3600, but the frames were identical.  I did not 'see' the improvement this time moving to the 5600X.  Maybe it's a different story at 1080p, but I play at 1440p, so I don't care about 1080p.  Certainly not worth double the price of the 3600.  All that matters to me is the performance I can see and feel in the game.  I would be much smarter taking the extra $300 or $400 and spending it on a 3070/6800. 


Edited by aleader

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

15 hours ago, Qiou87 said:

It has been on the roadmap for many years; better multithreading is a journey, not a destination. I will believe it when I see it. Many things in DCS have been on the roadmap for many years; that doesn't mean they are coming in two weeks.

DCS has multiple threads but only pushes one core to 100%. That means single-core performance is the key determining factor in DCS CPU performance, and having 12, 18 or 24 cores won't change that.

But hey, we all need to convince ourselves that we spend our money in the best way possible, and the 5950X is an incredible processor. Right now though, you'd get exactly the same performance in DCS (CPU-wise) from a 5600X with PBO2 to equalize the frequency deficit.

And once multithreading is implemented, you might ask? Well, that's not bad news for any quad- or hexa-core either: since they also have cores doing almost nothing in DCS today, they will benefit as well. In the end I just don't think we are going to see a point, multithreading optimization or not, where running a 12- or 16-core CPU provides significant benefits over a 6- or 8-core CPU in DCS. Why do I think that? Well, today the single-core performance of a Zen 3 CPU is enough to reach comfortable FPS levels in DCS, and then the graphics card becomes the determining factor. Of course you can always run into more CPU bottlenecks if you run an RTX 3090 at 720p and aim for 200fps, but realistically I don't think anyone is aiming higher than 90fps (for VR headsets) or 60fps (for pancake). Flight simulation has never been about very high framerates (just ask the Flight Simulator crowd).

And so if a 5600X, whilst using mostly one core, is able to reach 90fps in the current engine, I really don't see how it would be at a disadvantage once more cores are used. After all it does have 5 under-utilized cores today in DCS, so plenty of headroom.

The 5950X is, again, a great processor; it provides incredible performance and even has reasonable power consumption. But don't fool yourself into thinking that such a CPU will be mandatory to run DCS once more multithreading is implemented, because in my humble opinion that time will not come for a long while (and by then we will all have 24-core CPUs with 1TB of DDR6).

 

 

I agree, I'll believe it when I see it.  It's probably two years out by the time we actually see any in-game improvements, and by then all of these CPUs will be 'obsolete'.  My 5600X, I'm sure, will be quite sufficient for 5 to 7 more years; games have a lot of catching up to do.  The 5950X is maybe incredible for people who have IT jobs and can actually use the extra cores for something?  As indicated in the scaling video I linked, I don't see what exactly it's so 'incredible' for when it comes to gaming, other than its incredible ability to separate certain people from their cash.  

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

8 hours ago, aleader said:

As I think I've said before, I did see a 'smoothness' improvement moving from my i5 4670K @4.5GHz to a 3600, but the frames were identical.  I did not 'see' the improvement this time moving to the 5600X.  Maybe it's a different story at 1080p, but I play at 1440p, so I don't care about 1080p.  Certainly not worth double the price of the 3600.  All that matters to me is the performance I can see and feel in the game.  I would be much smarter taking the extra $300 or $400 and spending it on a 3070/6800. 

 

Totally agree; an improvement in an area that is not the bottleneck doesn't bring anything. For example, if I go from 32 to 64GB of RAM when my current memory is never the limit, I will not see any improvement.


Generally for gaming, upgrading the GPU is the smarter choice, except if you know for a fact that your CPU is the limiting factor. In my case it was in some scenarios, and the benefits are tangible: multiplayer, the Supercarrier and some heavy WWII SP missions. Another one: with the 2600X I could only accelerate time to x4, because beyond that I had only 5 frames per second and it was horrible in VR; now I can accelerate to x10 with no reduction in smoothness and no discomfort.

 

Do not take what people on this forum do as a general trend in gaming, though; most people here are older and wealthier than the average PC gamer - that is true for most flight sim enthusiasts, not specific to DCS. It means that for many the 5950X is actually not that expensive; they don't really mind spending $3000+ on a gaming PC. Especially for the CPU, which has an effective lifespan of at least 5 years, there is usually less resistance to investing a certain amount.


Edited by Qiou87

AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS

6 hours ago, Qiou87 said:

Totally agree; an improvement in an area that is not the bottleneck doesn't bring anything. For example, if I go from 32 to 64GB of RAM when my current memory is never the limit, I will not see any improvement.


Generally for gaming, upgrading the GPU is the smarter choice, except if you know for a fact that your CPU is the limiting factor. In my case it was in some scenarios, and the benefits are tangible: multiplayer, the Supercarrier and some heavy WWII SP missions. Another one: with the 2600X I could only accelerate time to x4, because beyond that I had only 5 frames per second and it was horrible in VR; now I can accelerate to x10 with no reduction in smoothness and no discomfort.

 

Do not take what people on this forum do as a general trend in gaming, though; most people here are older and wealthier than the average PC gamer - that is true for most flight sim enthusiasts, not specific to DCS. It means that for many the 5950X is actually not that expensive; they don't really mind spending $3000+ on a gaming PC. Especially for the CPU, which has an effective lifespan of at least 5 years, there is usually less resistance to investing a certain amount.

 

 

Right, I also didn't see any improvement going from 16GB to 32GB of RAM.  8GB to 16GB, however, was almost a necessary upgrade, especially for DCS.  RAM was so cheap though that I couldn't not do it. 

 

Ha, yeah, I've heard the 'if you can afford it' nonsense trotted out on a lot of sim forums over and over again.  It's a smug way for gaming nerds to think they're superior to other gaming nerds.  I am a 47-year-old civil engineer and my wife is a registered nurse; I can easily 'afford' it.  I do, however, have other interests besides the *maybe* 1 hour I spend on my PC on an average day.  I also plan on retiring in 7-8 years, and everywhere you save cash quickens that.  I just checked, and the 5950X would cost me (if you could even get one) $1,250 CAD.  That's $1,000 (!!) more than I spent on my 3600 when I bought it about a year ago ($165 USD).  That $1,000 is MUCH better spent on my pontoon boat 😉  

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

On 1/16/2021 at 4:02 PM, aleader said:

I'd be more interested in seeing the difference from a 3600 to a 5600X, as it's a far more widely installed CPU at this point and it did offer a pretty good jump over the 2600, even at higher resolutions.  I just installed my 5600X about a week ago and I see literally no difference in DCS at 1440p (coming from the 3600 that I gave to my son).  This new testing from HW Unboxed sheds light (yet again) on how little you gain by buying anything more than a 3600 for gaming, especially at 1440p and beyond:

 

https://www.youtube.com/watch?v=PSxuiWih_Z8

 

You are always far better off spending your cash on a GPU upgrade.

 

 

Personally I noticed a ~21% improvement in the Physics score of the 3DMark 4K benchmark.

 

My settings for both CPUs were exactly the same, in Ryzen Master as well as in 3DMark: Fire Strike Ultra at 3840 x 2160 with MSAA x2, which is full 4K.

 

The difference is noticeable, but the drawback is that my GPU bottleneck has now increased. The card is no slouch, but while the 3600X could easily accommodate an RTX 3080 and still be in the green when it comes to bottlenecks, that is not the case with the 1080 Ti, even with its 11GB. So the Physics score is higher, but the GPU results are a bit lower.

 

[3DMark CPU/GPU bound screenshots]

 

This suggests to me that it isn't really the CPU core speed or the number of cores that matters, but the pairing between CPU and GPU.

 

A 5600X can run at 4.850GHz on all cores; it has enough power, if set up properly, to handle most of what DCS can throw at it. But AMD has designed Zen 3 components to work together, so, as I was expecting, paired with one of the new AMD GPUs it should work better anyway. Give it a stronger GPU and you'll get better results than with a 3600X, and the same goes for RAM.

 

Since everyone who doesn't see the improvement keeps mentioning complicated scenarios and maps, Fire Strike Ultra at 3840 x 2160 with MSAA x2 in 3DMark is perfectly relevant, and a nearly 21% better score with no particular OC other than the usual Ryzen Master boost settings should say more than just opinions.

 

The 5600X is noticeably faster than the 3600X. If it doesn't show in DCS, it is mainly because the extra power is not used by the game and the CPU is not paired with another Zen 3-generation part, which is what it was designed for in the first place. What you can expect from the 5600X is to be able to handle much faster GPUs and still not be the bottleneck.

 

The 5600X is AMD's gaming CPU and should be treated as such, meaning properly paired (RAM, GPU), set up and overclocked; it has the capabilities, and the difference between it and the other Ryzen 5000 series CPUs, when it is set up properly, is marginal. The price difference between the 5600X and the other Zen 3 parts is a lot harder to justify than the one between it and the 3600X.

 

I got mine for £308.98, and the best thing about it is that I don't need to upgrade my cooler (Arctic Freezer 7X), and that even with the strongest GPU on the market I'd still be safe with my power supply (Corsair 750W). It's a kit of parts that must work together, not a single unit.

 

I didn't buy the 5600X hoping to see a bump in FPS; I purchased it as part of a complete upgrade including the GPU, with a Radeon RX 6800 or RX 6800 XT and a better RAM pairing in mind, something the 3600X wouldn't be able to cope with. You can choose another AMD CPU to do the same; it doesn't make the 5600X a bad choice at this price.

 


Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 

1 hour ago, Thinder said:

A 5600X can run at 4.850GHz on all cores; it has enough power, if set up properly, to handle most of what DCS can throw at it. But AMD has designed Zen 3 components to work together, so, as I was expecting, paired with one of the new AMD GPUs it should work better anyway. Give it a stronger GPU and you'll get better results than with a 3600X, and the same goes for RAM.

 

The 5600X is noticeably faster than the 3600X. If it doesn't show in DCS, it is mainly because the extra power is not used by the game and the CPU is not paired with another Zen 3-generation part, which is what it was designed for in the first place. What you can expect from the 5600X is to be able to handle much faster GPUs and still not be the bottleneck.

That is actually completely wrong and misleading. The idea that an AMD processor only works at its full potential with an AMD GPU is wrong. There are currently small benefits to having an AMD RDNA2 GPU on an AMD Zen 3 / 500-series platform, through a function called Resizable BAR or SAM, but this has proven to be temperamental and to bring marginal gains in most situations, and even a performance loss in a few games. And this functionality will be coming to nVidia Ampere GPUs and Intel platforms as well.

 

Don't buy into the internet legend that you have to couple Intel with nVidia and AMD with AMD for optimal performance; that has simply never been proven. I have personally swapped Intel for AMD CPUs, both ways, and likewise for GPUs between nVidia and AMD, and never noticed any processor performing "better" once a specific type of GPU was installed.

 

However, some of what you say is true: CPU and GPU have to be balanced, and a super-fast CPU won't bring any benefits if you have a weak GPU relative to your screen resolution. That is also an important point: a 3060 Ti can crush everything at 1080p but struggles a bit at 4K; therefore that card will require a faster CPU at 1080p than at 4K to avoid being bottlenecked. Of course, this assumes you are trying to hit the highest framerate; if you cap framerates at 60fps, the same CPU performance is needed irrespective of resolution.

 

I also don't understand the obsession with "CPU bottleneck calculators"; they are just estimates based on some metrics, and not necessarily applicable to every game or every situation in every game. In DCS I have situations where my CPU is basically sleeping (flying at 40,000ft, no AI) and others where the CPU is working hard (low level, heavy multiplayer server).

 

 

AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS

1 hour ago, Qiou87 said:

That is actually completely wrong and misleading. The idea that an AMD processor only works at its full potential with an AMD GPU is wrong.

 

Well, you obviously didn't research the matter long and hard enough, and I didn't say "only"; it will depend on what NVIDIA does to make their GPUs work with what AMD came up with...

 

They've been advertising the Zen architecture for a while now, and Zen 3 devices are designed to work together. AMD scalable memory has been in use even with the 3000 series, but AMD has been advertising Smart Access Memory since the Zen 3 launch.

 

Quote

Smart Access Memory: the simple switch that makes an all-AMD gaming PC sing

By Jacob Ridley November 24, 2020

Never turn your nose up at free performance.

https://www.pcgamer.com/uk/amd-smart-access-memory-benchmarks-performance/

 


 

Quote

I also don't understand the obsession with "CPU bottleneck calculators"; they are just estimates based on some metrics, and not necessarily applicable to every game or every situation in every game.

 

If you want to optimize your system, you'll need to reduce all bottlenecks: CPU, RAM and GPU. One simple example speaks volumes: let's say you have purchased a 3080 Ti but now have a 15% bottleneck with your old CPU; your new GPU is 25% faster than the old one, so in effect you only gained around 10% at best.
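
The general idea can be sketched like this (numbers are purely illustrative; the point is that the effective framerate is roughly capped by whichever of the CPU or GPU is slower):

# Illustrative only: effective fps is limited by the slower of CPU and GPU.
cpu_limited_fps = 85                # hypothetical: what the old CPU can feed
old_gpu_fps = 80                    # hypothetical: old GPU, ignoring the CPU
new_gpu_fps = old_gpu_fps * 1.25    # a GPU that renders 25% faster

effective_old = min(cpu_limited_fps, old_gpu_fps)   # 80 fps
effective_new = min(cpu_limited_fps, new_gpu_fps)   # capped at 85 fps by the CPU
gain_pct = (effective_new / effective_old - 1) * 100

print(f"Headline GPU gain: 25% | gain actually seen: ~{gain_pct:.0f}%")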

 

I don't see why one should pay £600+ for this sort of gain, so before committing to splashing that kind of dosh on a CPU, you'd better know in advance what it is capable of.

 

A CPU bottleneck calculator is certainly not an obsession; it's a tool to decide on your specs before committing to a purchase, and if people were less prone to being argumentative on forums and did a little more homework, we would have fewer complaints about their in-game results in the first place.

 

What this "just a metric" shows, is that the 5600X is perfectly capable of handing a card such as the RTX 3080 Ti, and the reasson for this is its <> 21% better performances over the 3600X as reflected by the 3D Marks bench test.

 

Quote

In DCS I have situations where my CPU is basically sleeping (flying at 40,000ft, no AI) and others where the CPU is working hard (low level, heavy multiplayer server).

 

That's irrelevant to the topic of system optimization; what you manage with your gear and settings is completely unique to you. Knowing what does what, and why, on the other hand, is what I am talking about:

 

I mentioned the motherboard, bus speeds, and now the need to have the right specs for this kind of optimization, which leads me to what I was about to write before you replied: I still have to upgrade my RAM (G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-pin DDR4-3200 (PC4 25600) kit) and GPU (Radeon RX 6800 MBA 16GB GDDR6, or the XT depending on budget). 

 

The motherboard will then become the bottleneck, limited to PCIe2 and not supporting Smart Access Memory.

 



Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 


Before I replied to Qiou87, I was about to lay out my complete upgrade path and how I figured out the best way to achieve it:

 

First, there is up to a ~10% FPS gain possible by using a 4-stick RAM kit.

 

Second, I used pangoly.com to figure out my RAM-to-motherboard compatibility; the newer B550 motherboard is also compatible with this RAM, and it is low latency (CAS 14).

 

Here are the video and link.

 

https://pangoly.com/en/review/msi-b450-gaming-plus-max/compatibility/ram

 

 

So, in order, since my G2 went back to HP for a refund after being found faulty and there are none available at the moment:

 

G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-pin DDR4-3200 (PC4 25600) kit, an AMD RX 6800 series GPU, and an MSI B550 series motherboard (PCIe 4.0).

 

This combination should allow me to reduce bottlenecks at all levels and use Smart Access Memory.


Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 

1 hour ago, Thinder said:

 

Well, you obviously didn't research the matter long and hard enough, and I didn't say "only"; it will depend on what NVIDIA does to make their GPUs work with what AMD came up with...

 

They've been advertising the Zen architecture for a while now, and Zen 3 devices are designed to work together. AMD scalable memory has been in use even with the 3000 series, but AMD has been advertising Smart Access Memory since the Zen 3 launch.

Why do you keep referring to "all Zen3 devices" when Zen 3 is a CPU architecture and the GPU architecture is called RDNA2? Also: SAM is a function that is part of the PCI Express spec; it has been available for a while (since 2012 if I remember correctly, don't quote me on this). The fact that AMD chose to implement it is commendable; however, there is technically no reason why it should be limited to 500-series chipsets or Zen 3 processors. It is only limited like this for AMD to make a profit, because enabling it widely for Zen 2 or Zen+ processors and 300/400-series motherboards wouldn't bring them cash.

 

I actually research and read about this stuff; I don't limit myself to the AMD brochure. Here is a test of SAM/Resizable BAR applied to a multitude of games, where you can see that, yes, it can have benefits, but sometimes it is actually not beneficial (Flight Simulator...). In many cases it actually seems to increase CPU overhead.

 

As it is, SAM/re-size BAR is a nice feature, but definitely not a must-have. When it comes to DCS World, we actually do not know if it brings benefits or not.

 

Your motherboard is PCIe 3.0 and does not represent a bottleneck from a CPU/GPU performance standpoint. Using a PCIe 4.0 GPU in a PCIe 3.0 x16 slot does not carry any tangible performance penalty, because PCIe 3.0 x16 does not bottleneck even high-end GPUs. PCIe 4.0 is mainly useful for NVMe SSDs and certain scenarios where very high data transfer rates actually matter.
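
For reference, a back-of-the-envelope bandwidth calculation for an x16 slot (PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with 128b/130b encoding):

# Theoretical one-direction bandwidth of an x16 link, ignoring protocol overhead.
def x16_bandwidth_gb_s(gt_per_s):
    lane_gb_s = gt_per_s * (128 / 130) / 8  # 128 payload bits per 130 bits, 8 bits/byte
    return lane_gb_s * 16

print(f"PCIe 3.0 x16: ~{x16_bandwidth_gb_s(8):.1f} GB/s")
print(f"PCIe 4.0 x16: ~{x16_bandwidth_gb_s(16):.1f} GB/s")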


Edited by Qiou87

AMD R5 5600X | 32GB DDR4 3000MHz | RTX 2070 SUPER | HP Reverb G2 | VKB Gunfighter Pro Mk3 | Thrustmaster TCWS


Yawn....

 

I didn't think it was possible for someone to talk this much about their computer. And how long did you spend getting all these posts together?

 

You still haven't run a relevant DCS benchmark. Posting a Fire Strike Physics score is completely irrelevant to DCS; DCS doesn't scale with cores and threads like Fire Strike does. It's a synthetic benchmark for racing and comparing computers against each other. 

 

Instead of pulling all this information from AMD marketing materials, could you actually generate some original data and do a benchmark in DCS comparing the 3600 and 5600X? If you have this much time on your hands, we'd appreciate it if you could donate some of it and do a useful benchmark. 

 

People don't come here to learn how to build a PC; there are plenty of other forums for that. And we already know that DCS's graphics engine is limited by single-core CPU performance. 

 

What we don't know, and what people are curious about, is what the actual measurable performance gains are when upgrading from a 3600 to a 5600X while keeping the same graphics card and all other specs the same. All the gibberish about 'bounds' and percentages is just weird...

 

Now, if you can show an actual improvement in framerate or frametime from some of this proprietary AMD tech, that would be interesting, but you're just speculating without any data. 


Edited by Sn8ke_iis

SimRig: Winter '18/19

Asus Maximus XI Formula | Intel 9900K @ 5.4 GHz Single Core, 5.2 GHz All Core | Nvidia/EVGA 3090 K|NGP|N @ ???? MHz | EKWB Direct Die Custom Liquid Cooling Loop | Gskill 32 GB DDR4 @ 3600 MHz | Samsung 512 GB, 1 TB NVMe SSDs | Asus/Seasonic 1200w PSU

Samsung Q6FN 65" 4K | Realsim FSSB | TM Cougar HOTAS | Vipergear IPC | Buddyfox UFC | Saitek Trim Wheel | MFG Crosswinds Rudder Pedals | Playseat Flightseat | Monstertech HOTAS mount | Gametrix Jetseat

 


Really? So CPU benchmarks are irrelevant?

 

Give us a break.

 

You talk about relevant benchmarks for DCS but completely ignore the CPU performance in 3DMark; sorry, this doesn't compute.

 

FACT is, the 5600X performs 21% better, and as I pointed out it is perfectly capable of being paired with a top GPU, so stop lecturing and asking for "relevant benchmarks" when you've been given one in the first place.

 

As for your pal, he was WRONG. As I was saying, an ALL-AMD system is designed to perform better: motherboard, CPU and GPU. But of course, you know better. You bunch make me laugh; did you even read the tests?...

 

I don't think so, so don't be surprised if I don't take your "specialist" opinions too seriously. The day I need advice, I know where to ask, and you're not on my list. Something else: it's not about "my computer" but about how and where I found RELEVANT info. Cheers.

 

 


Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 

42 minutes ago, Qiou87 said:

 

I actually research and read about this stuff; I don't limit myself to the AMD brochure. Here is a test of SAM/Resizable BAR applied to a multitude of games, where you can see that, yes, it can have benefits, but sometimes it is actually not beneficial (Flight Simulator...). In many cases it actually seems to increase CPU overhead.

 

As it is, SAM/re-size BAR is a nice feature, but definitely not a must-have. When it comes to DCS World, we actually do not know if it brings benefits or not.

 

Your motherboard is PCIe 3.0 and does not represent a bottleneck from a CPU/GPU performance standpoint. Using a PCIe 4.0 GPU in a PCIe 3.0 x16 slot does not carry any tangible performance penalty, because PCIe 3.0 x16 does not bottleneck even high-end GPUs. PCIe 4.0 is mainly useful for NVMe SSDs and certain scenarios where very high data transfer rates actually matter.

 

 

There is much you don't know, and it doesn't make up for posting whatever as a first reply... So you don't know whether BAR works in DCS, and that's your reason for not optimizing your PC? I'll have mine built at home, thanks.

 

Correction:

My motherboard is PCIe 3.0 with a PCI-E x16 slot; PCI 3.0, 3.1 or 4.0 is something else. As I said, you bunch talk a lot but don't provide people with the proper info.

 

The PCI_E1 slot on the B450 is PCIe 3.0 x16 for 1st, 2nd and 3rd Gen AMD Ryzen processors.

 

New BIOS update:

Version: 7B86vHB1 (Beta version)
Release date: 2020-12-10
Description:
- Updated AMD AGESA ComboAm4v2PI 1.1.0.0 Patch D.
- Support Re-Size BAR function to enhance GPU performance.

 

So that is conflicting information versus the AMD launch documentation (which lists 500-series motherboards in the specs).


Edited by Thinder

MSI B450 GAMING PLUS MAX 7B86vHB1(Beta version) BIOS, AMD Ryzen 5 5600X, EVGA NVIDIA GeForce GTX 1080 Ti 11GB, 32GB G.SKILL TridentZ RGB (4 x 8GB) DDR4 3200 CL14, Thrustmaster T.16000M FCS HOTAS. My G2 is DEAD, I'll get VR again when headsets will be better.

M-2000C. F/A-18C Hornet. F-15C. MiG-29 "Fulcrum". 

Avatar: Escadron de Chasse 3/3 Ardennes.

 

