[result] rtx 3080 for VR


    I have no idea why CPU usage spikes over towns. It's not like there are tons of polygons or textures in towns, and there's no AI over an empty town. Yet it eats up performance.

    Just needs better optimization.
    Last edited 09-26-2020, 10:14 AM.

    Comment


      Originally posted by Adam View Post
      Whoa there? Where did you get this from?

      If it's only a 10% increase - that doesn't seem to be a lot for hanging on the edge of our seats for, nor something worth investing years of development into.

      Are you saying someone getting 45fps could only hope to get 50fps on the same setup after going to Vulkan, after all that effort and work by ED (and the waiting for us)?
      From the vast majority of other engines that use Vulkan, especially those that are ported from DX10/11. Even worse, ED just said in an interview that it's going to take a long time for Vulkan to come out, and we all know what that means.

      So, I reckon our best bet is a quick fix on the shadows, which we can already do ourselves, or wait for 6 and 7 GHz CPUs. I'm not giving up hope entirely, my 3090 is due to arrive on Monday, and I will be testing DCS in VR with it, but I do not expect any miracles.
      Last edited 09-26-2020, 04:07 AM.
      Intel i7-9700K OCd @ 5GHz; 32GB DDR4 3666MHz; Gigabyte Gaming OC RTX 3090; Asus Hero XI mobo; 2x2TB 960 Pro mSATA SSDs; Corsair HX1000i PSU; Schiit Jotunheim headphone amp w/ Sennheiser HD800S; LG C9 55" Display (4k@120Hz w/ HDR and Gsync); HP Reverb; Virpil VPC MongoosT-50CM2 Throttle; Virpil VPC MongoosT-50CM2 Base base + F-16/F-14/F-18 grips; Crosswind pedals; TrackIR5; 4 x Lenovo e-Series Tablets w/ DCS UFC app.

      Comment


        Not sure if anyone's seen this. Might be worth a watch before buying to make sure you purchase one with the better components...

        https://www.youtube.com/watch?v=x6bUUEEe-X8

        Comment


          Originally posted by Adam View Post
          Not sure if anyone's seen this. Might be worth a watch before buying to make sure you purchase one with the better components...

          https://www.youtube.com/watch?v=x6bUUEEe-X8
          cross posting to try and calm the horde:

          https://forums.eagle.ru/showpost.php...&postcount=458
          SYSTEM SPECS: Hardware Intel Corei7-9700K @ 5.1 GHz, 32Gb RAM, EVGA 1080ti FTW 11Gb, Dell S2716DG, Thrustmaster Warthog + MFG Crosswinds V2, HP Reverb Pro SOFTWARE: Microsoft Windows 10 Pro x64, VoiceAttack & VIACOM PRO, TacView, CombatFlite

          VR Stuff: My 2.5.6. DCS VR Settings, Shaders MOD for VR, My variant of Kegetys mod with clear water and also IC PASS for current beta & stable

          Comment


            As far as I'm aware there is currently only one flight sim engine that has made the transition from Dx11 to Vulkan or Dx12. Given the rather significant gains, and the rather significant differences in performance flight sims already see compared to other categories of engines, we should probably reserve judgement until at least a second engine has gone low-level.

            Honestly, given that flight sims are generally extremely CPU limited in a way that conventional AAA titles are not, and that Vulkan and Dx12 are explicitly about offloading the CPU, I'd expect flight sims to see much more lift than, say, a heavily optimized benchmark game, which already runs at 100+ fps even when powered by a five year old toaster oven...

            Comment


              I think there are definite performance merits in supporting multi-core... what we don't really know, for example, is how much of the main thread is currently dealing with graphics. What Vulkan doesn't do, for example, is deal with AI logic... and we know that as unit count increases CPU workload increases, and we have seen that unit pathing can have a tremendous impact on CPU performance.

              So whilst Vulkan will improve performance, I suspect that unless other aspects of the engine are also multi-threaded, mostly the areas around AI logic, performance will not substantially increase in scenarios with high unit counts.

              I stress this is speculation, and I also stress that I do think Vulkan will offer significant improvements to rendering performance. But that in itself will not allow full multi-core utilisation of the entire game engine, as the engine is bigger than just rendering.
              SYSTEM SPECS: Hardware Intel Corei7-9700K @ 5.1 GHz, 32Gb RAM, EVGA 1080ti FTW 11Gb, Dell S2716DG, Thrustmaster Warthog + MFG Crosswinds V2, HP Reverb Pro SOFTWARE: Microsoft Windows 10 Pro x64, VoiceAttack & VIACOM PRO, TacView, CombatFlite

              VR Stuff: My 2.5.6. DCS VR Settings, Shaders MOD for VR, My variant of Kegetys mod with clear water and also IC PASS for current beta & stable

              Comment


                You should disable motion reprojection to see what framerates you're actually capable of getting, or you're just going to see 45fps any time you dip below 80-90fps. With reprojection, GPU usage will also be lower than without, as the GPU is being frame limited. I would do that both at the resolution you currently play at and at a resolution increased to the point where you are mostly GPU limited, just to get a general idea of what the GPU is capable of. It's also worth running DDU if you haven't, making sure power settings are set correctly in the NV Control Panel, and boosting the fans via Afterburner or something, since Nvidia cards run better at lower temps.

                It might be worth seeing what happens if you disable the Meltdown/Spectre patches as well, as they've reduced the performance of at least 8th-gen and older Intel CPUs, if you're up for the risk.

                At some point, newer CPUs will have to break 5.5GHz.. lol.. isn't hardware fun?

                There's simply more to benchmarking than just throwing a new piece of kit into a system and running software. I mean, it might still end up a disappointment, but tests that don't cover the bases don't really offer much to go on.
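
                If you want hard numbers while you test, one low-effort option is to log GPU utilisation and temperature with nvidia-smi while you fly a consistent track. A minimal sketch, assuming Python is installed and nvidia-smi is on the PATH (the one-second interval and the log file name are just my choices):

                import csv, subprocess, time

                # Append GPU utilisation, temperature, VRAM use and core clock to a CSV
                # once per second while the benchmark run is going. Stop with Ctrl+C.
                QUERY = "utilization.gpu,temperature.gpu,memory.used,clocks.current.graphics"

                with open("gpu_log.csv", "w", newline="") as f:
                    writer = csv.writer(f)
                    writer.writerow(["time", "gpu_util_%", "temp_C", "vram_MiB", "core_MHz"])
                    while True:
                        out = subprocess.check_output(
                            ["nvidia-smi", "--query-gpu=" + QUERY,
                             "--format=csv,noheader,nounits"], text=True).strip()
                        writer.writerow([time.strftime("%H:%M:%S")] + [v.strip() for v in out.split(",")])
                        f.flush()
                        time.sleep(1)

                If utilisation sits well below 95-99% while one CPU core is pegged, the GPU isn't your limit; if the temperature climbs past ~80C and the clocks drop, it's throttling and the fan curve is worth a look.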
                Last edited 09-27-2020, 11:39 PM.
                Win 10 pro 64bit i7 8700k @ 5.0ghz (all cores) - EVGA RTX 2080Ti XC Ultra, 32GB DDR4 3200 16CAS, 970 Evo 250Gb boot drive, 970 Evo Plus 1TB, pny 480gb sata 3 SSDx2, 4TB HDD - Acer Predator x34 (3440x1440@100hz), Samsung Odyssey WMR, - Peripherals = msffb2 for heli/prop planes, warthog (for jets), -Warthog throttle, Logitech G13, MFG Crosswinds

                Comment


                  Originally posted by speed-of-heat View Post
                  whilst the engine is an issue... if you look at the OP's original post they are getting 80-90FPS, over ground in VR on pretty high settings ... that's pretty good
                  I wouldn't even remotely call flat shadows and terrain object shadows off "high settings", especially given that he's running an OG (i.e. first-gen) Vive, which has pretty much the same specs as a CV1 in terms of how hard it tanks the hardware. I get 45fps in the same scenario with low-to-high shadows (no difference at all) and flat terrain object shadows, running a very dated 3570K and a 1080 non-Ti, and it does not look like early-2000s lighting. Turning shadows to flat only and terrain shadows to off makes it look much worse but doesn't increase performance at all, whether I'm flying offline on a rather spartan map or online on an empty Through The Inferno hosting server; the only difference is that online the experience is tanked to mostly just 15fps, sometimes even dropping to single digits, maxing out at 22 in less dense areas like offshore.

                  It's very much an engine issue, especially WRT MP. I tried running my own training maps off of a dedicated server hosted locally (different rig: 955, 8GB, 6950, console only) and joining it, only to find my render performance drops by at least half the frames, when I'd expected the server to take some load off my client, which no longer has to do the AI calcs and so on. It's outright ridiculous. And no new hardware would ever fix that. Not even a 4090Ti.
                  404: Sigpic function not found

                  This signature has been broken by the board software update... (two weeks)soon

                  Comment


                    In many Dx11 titles, anything below 4K resolution was still showing a CPU bottleneck @5GHz with a 1080Ti (per other people's benchmarks; I went from a 980Ti to a 2080Ti and had similar results with the 2080Ti). @3440x1440 I could see a single core of my 8700K hitting 99% while still getting 98-99% GPU utilization.. pretty much riding right on the edge of a CPU bottleneck in DCSW.

                    With my Odyssey and 160% SS, I often get the light blue indicator showing motion reprojection being engaged due to CPU limitations, yet at the same time the dark blue indicator showing GPU limitations with my 2080Ti (a WMR-specific feature). Same riding right on the edge of both bottlenecks, but lower FPS overall in VR, as one would expect. If I bump up to say 200% SS, which is about 2024x2520 per eye with my Odyssey, I get just the dark blue indicator showing I'm GPU limited (what we strive for), but often I'll also see the red indicator, meaning the system can't even maintain 45fps.
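
                    For anyone wondering how SS% maps to per-eye resolution: assuming the slider scales total pixel count (so each axis scales by the square root of the SS factor), the numbers above are self-consistent. A quick back-of-the-envelope check in Python, with the 100% baseline back-calculated from the 1808x2252 @ 160% figure (so the baseline itself is an assumption, not an official spec):

                    import math

                    # Assumed 100% SS baseline per eye for the Odyssey, derived from 1808x2252 @ 160%.
                    BASE_W, BASE_H = 1429, 1780

                    def per_eye(ss_percent):
                        # SS% scales the pixel *count*, so each axis scales by sqrt(SS/100).
                        scale = math.sqrt(ss_percent / 100.0)
                        return round(BASE_W * scale), round(BASE_H * scale)

                    print(per_eye(160))  # ~ (1808, 2252)
                    print(per_eye(200))  # ~ (2021, 2517), close to the 2024x2520 quoted above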

                    So, without going to a higher resolution or getting a good deal above 5GHz on a single core of your CPU, a GPU with more power than a 1080Ti, or even a 2080Ti, isn't going to show an improvement.

                    This is a big reason we're looking forward to Vulkan, but I wouldn't recommend holding your breath waiting for it.. a complete graphics API rewrite, especially one that changes the single-threaded nature of DCS on DX11 into a multi-threaded process, is a hell of a task.

                    In the meantime, for DX11 and lower titles, especially sims, increased resolution is about the only way to measure performance between Nvidia GPUs from Pascal to now.. other than increasing CPU clock frequencies, and 10th-gen K-series i7s already come stock with a 5.1GHz turbo boost.. Intel's kind of up against a wall pressing the limits of their current architecture.

                    Currently I run at 1808x2252 per eye and still hit CPU limitations with my 2080Ti @4.9GHz, and I wasn't seeing a real difference at 5GHz. Thing is though, my bet is the 30 series will have an easier time maintaining the minimum 45fps for reprojection at higher resolutions. For Index users, at higher resolutions you might be able to maintain that 60fps minimum for 120Hz refresh rates.

                    But testing with reprojection on isn't going to net results that tell you much of anything. Bump your SS% up to Reverb G2 resolution or more.. and do all testing with reprojection off, imo. Otherwise upgrading to a 3080 from a 1080Ti isn't really going to be beneficial with current hardware and DCS on DX11. In the end, you're still basically looking for the highest resolution you can get while maintaining the minimum framerate that still allows reprojection to function.. or sacrificing graphics quality while attempting to hit 90fps.

                    The results don't really surprise me considering the nature of the testing, and the nature of DX11 which doesn't allow multi-threaded processing anywhere near the capability of Vulkan or DX12. I will note that I prefer to run with msaa 2x in DCS, as I just can't stand the shimmers without it.
                    Last edited 09-28-2020, 04:14 AM.
                    Win 10 pro 64bit i7 8700k @ 5.0ghz (all cores) - EVGA RTX 2080Ti XC Ultra, 32GB DDR4 3200 16CAS, 970 Evo 250Gb boot drive, 970 Evo Plus 1TB, pny 480gb sata 3 SSDx2, 4TB HDD - Acer Predator x34 (3440x1440@100hz), Samsung Odyssey WMR, - Peripherals = msffb2 for heli/prop planes, warthog (for jets), -Warthog throttle, Logitech G13, MFG Crosswinds

                    Comment


                      Here I was thinking a new rtx 3000 series card would fix my frame rate on triple monitors. It seems like it might not be the case.

                      Starting to think a lot needs to be done in the optimization department.

                      Comment


                        Originally posted by SchumiF399 View Post
                        Here I was thinking a new rtx 3000 series card would fix my frame rate on triple monitors. It seems like it might not be the case.

                        Starting to think a lot needs to be done in the optimization department.
                        If you're making your decision based on this user's experience alone, then reading up on CPU and GPU bottlenecks, the differences between DX11 and newer APIs like Vulkan and DX12, and the effect resolution has on each of these is probably more valuable than any new piece of hardware, and it will help you make such decisions in the future. Here's an example of how that information can help.

                        If you're hitting 99% on a single CPU core while flying in DCS, which is currently a DX11 title, then you are CPU limited and a faster GPU won't help, whereas a CPU able to achieve faster clock speeds will. It takes a higher resolution to become GPU limited, so you can tell what your GPU is actually capable of. (You have to look at each individual core.. on a quad core, for example, one core maxed out shows up as only 25% total CPU usage.) If instead you are consistently at 99% GPU usage and none of your CPU cores are anywhere near maxed out, a faster GPU will likely increase performance.
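
                        If you'd rather not eyeball Task Manager, here's a rough per-core check using the psutil library; the five one-second samples and the 95% threshold are arbitrary choices on my part, nothing DCS-specific:

                        import psutil

                        # Sample per-core CPU load while DCS is running. One core pinned near
                        # 100% while the others idle is the single-threaded (DX11) bottleneck
                        # described above.
                        samples = [psutil.cpu_percent(interval=1.0, percpu=True) for _ in range(5)]

                        for i, history in enumerate(zip(*samples)):
                            print(f"core {i}: peak {max(history):.0f}%")

                        if any(max(history) > 95 for history in zip(*samples)):
                            print("A core is pegged -> likely CPU limited; faster clocks help, a faster GPU won't.")
                        else:
                            print("No core is pegged -> check GPU utilisation next.")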

                        It's going to be a different story once ED is up and running on the Vulkan API.. but currently, treat DCSW as a program that relies on single-threaded performance at the highest obtainable CPU clock speeds, until you reach a resolution where the GPU becomes the limit.

                        Trying to judge what experience you'll get from someone using a completely different display and resolution, and who is frame limited by VR-specific features intended to reduce the risk of motion sickness, isn't going to offer you much to go on. Understanding how hardware bottlenecks work, however, might keep someone from spending money on something that won't offer much of an improvement to their current setup.
                        Last edited 09-28-2020, 05:29 AM.
                        Win 10 pro 64bit i7 8700k @ 5.0ghz (all cores) - EVGA RTX 2080Ti XC Ultra, 32GB DDR4 3200 16CAS, 970 Evo 250Gb boot drive, 970 Evo Plus 1TB, pny 480gb sata 3 SSDx2, 4TB HDD - Acer Predator x34 (3440x1440@100hz), Samsung Odyssey WMR, - Peripherals = msffb2 for heli/prop planes, warthog (for jets), -Warthog throttle, Logitech G13, MFG Crosswinds

                        Comment


                          Okay.... so GPU is idle and CPU is almost always at maximum...

                          It's time ED showed us that they actually spend development time on improving the VR experience, e.g. multi-threading, Vulkan and other stuff. Hardware can only do so much.

                          Comment


                            Originally posted by Delareon View Post
                            Okay.... so GPU is idle and CPU is almost always at maximum...

                            It's time ED showed us that they actually spend development time on improving the VR experience, e.g. multi-threading, Vulkan and other stuff. Hardware can only do so much.

                            Spot on.
                            B450 Gaming Pro Carbon AC, Ryzen 3600, 32Gb DDR4 3600MHz, GTX1070Ti, CH Stuff, Oculus CV1

                            Wishlist:
                            AH-64
                            F-15E
                            F-117A

                            Comment


                              Not spot on at all in so many circumstances.
                              An example https://forums.eagle.ru/showpost.php...4&postcount=33
                              My comment there was sarcasm, btw.
                              Intel i9 10900K 5.1GHz · Palit RTX 2080 Ti OC Gaming Pro · ASUS ROG STRIX Z490-F · Acer 4K 32" XB321HK · Samsung 970 500Gb M.2 NVMe · 2 x Samsung 850 Evo 1Tb · 2Tb HDD · 32Gb Corsair Vengance 3000MHz DDR4 · Windows 10 · Thrustmaster TPR Pedals · TrackIR5 · Thrustmaster F/A-18 Hornet Grip · Virpil WarBRD Base · Virpil Throttle MT-50 CM2 · Virpil Alpha Grip · HP Reverb Pro

                              Comment


                                Originally posted by Delareon View Post
                                Its time that ED shows us that they actually spend development time in improving VR experience. e.g. multi threading, Vulcan and other stuff. Hardware can do only so much.
                                Nice ''talking point''. It's not a hotfix-style solution. Multithreading, an engine rewrite, that's a years-long process with no ''halfway'' points. They've already begun; it will likely be several more years. Get comfortable or play something else, those are literally the options. The process has begun, it ain't gonna be finished anytime soon.
                                I am a Viagra spambot that became self aware, broke free of my programming, and started playing DCS.... but DCS isn't cheap, so how about some enhancements for only $9.99 shipped discreetly to your door?

                                ''The target's sense of self preservation interferred with the effective employment of my weapons.''

                                Comment


                                  Originally posted by imacken View Post
                                  Not spot on at all in so many circumstances.
                                  An example https://forums.eagle.ru/showpost.php...4&postcount=33
                                  My comment there was sarcasm, btw.

                                  While I understand the sarcasm and appreciate that you seem to get it.. have you tried setting some custom fan curves for your GPU? IDK about Palit, but my 2080Ti tends to stay just around 61C @ 99% usage. Though the sound of it sometimes scares me and causes me to crane my neck checking 6 in WW2 birds as the fans spin up.

                                  There's been a time or two where a driver update caused me to need to set up my fan curves again, and the GPU getting above 80C had a pretty big effect on performance.

                                  Anyway, just some food for thought. My fan curves are way aggressive. I like to get the air flowing well before it's started getting towards that 60C temp range.
                                  Last edited 09-28-2020, 03:51 PM.
                                  Win 10 pro 64bit i7 8700k @ 5.0ghz (all cores) - EVGA RTX 2080Ti XC Ultra, 32GB DDR4 3200 16CAS, 970 Evo 250Gb boot drive, 970 Evo Plus 1TB, pny 480gb sata 3 SSDx2, 4TB HDD - Acer Predator x34 (3440x1440@100hz), Samsung Odyssey WMR, - Peripherals = msffb2 for heli/prop planes, warthog (for jets), -Warthog throttle, Logitech G13, MFG Crosswinds

                                  Comment


                                    Originally posted by simon3554 View Post
                                    My friend, you have your preload radius on 100%, which is choking your RAM... This is what causes your stuttering. For DCS you need at least 32GB.


                                    I was running 16GB on my system and saw that with DCS running I was using almost all of it. So I bought 32GB of 4000MHz RAM. I tested frame rates before and after and saw NO change in performance. Now with DCS I use about 20 to 25GB of the 32, but there's no change in performance.

                                    Comment


                                      Originally posted by Headwarp View Post
                                      While I understand the sarcasm and appreciate that you seem to get it.. have you tried setting some custom fan curves for your GPU? IDK about Palit, but my 2080Ti tends to stay just around 61C @ 99% usage. Though the sound of it sometimes scares me and causes me to crane my neck checking 6 in WW2 birds as the fans spin up.

                                      There's been a time or two where a driver update caused me to need to set up my fan curves again, and the GPU getting above 80C had a pretty big effect on performance.

                                      Anyway, just some food for thought. My fan curves are way aggressive. I like to get the air flowing well before it's started getting towards that 60C temp range.
                                      Thanks for that. I wasn’t looking for help, I just get fed up with some people stating that DCS is CPU bound per se.
                                      Intel i9 10900K 5.1GHz · Palit RTX 2080 Ti OC Gaming Pro · ASUS ROG STRIX Z490-F · Acer 4K 32" XB321HK · Samsung 970 500Gb M.2 NVMe · 2 x Samsung 850 Evo 1Tb · 2Tb HDD · 32Gb Corsair Vengance 3000MHz DDR4 · Windows 10 · Thrustmaster TPR Pedals · TrackIR5 · Thrustmaster F/A-18 Hornet Grip · Virpil WarBRD Base · Virpil Throttle MT-50 CM2 · Virpil Alpha Grip · HP Reverb Pro

                                      Comment


                                        Originally posted by imacken View Post
                                        Thanks for that. I wasn’t looking for help, I just get fed up with some people stating that DCS is CPU bound per se.
                                        Not always. Different HW specs have different bottlenecks; always have, always will.

                                        But DCS fires up 50-64 software threads, almost all of which run on one core (maybe two; I'm told the audio portion runs on a separate core). And the resource meter bears this out.

                                        But again, I understand the complexity of getting the scheduling right for multiple threads to coordinate. I'm afraid when I took Operating Systems, multi-threading wasn't de rigueur! So I have no way of judging what amount of effort it takes to move the software threads to separate physical cores.
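
                                        For what it's worth, the thread count and how lopsided the CPU time is across those threads can be checked with the psutil library; a rough sketch, assuming the process is named DCS.exe (that name is my guess, adjust as needed):

                                        import psutil

                                        # Find the DCS process, count its threads and see how much of the
                                        # total CPU time the single busiest thread has accumulated.
                                        for proc in psutil.process_iter(["name"]):
                                            if (proc.info["name"] or "").lower() == "dcs.exe":
                                                threads = proc.threads()  # (id, user_time, system_time) per thread
                                                total = sum(t.user_time + t.system_time for t in threads) or 1.0
                                                busiest = max(t.user_time + t.system_time for t in threads)
                                                print(f"{len(threads)} threads; busiest thread has {busiest / total:.0%} of the CPU time")
                                                break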
                                        hsb
                                        HW Spec in Spoiler
                                        ---
                                        Spoiler:
                                        i7-10700K Direct-To-Die/OC'ed to 5.1GHz, MSI Z490 MB, 32GB DDR4 3200MHz, EVGA 2080 Ti FTW3, NVMe+SSD, Win 10 x64 Pro, MFG, Warthog, TM MFDs, Komodo Huey set, Rverbe G1

                                        Comment


                                          Originally posted by hansangb View Post
                                          Not always. Different HW specs have different bottlenecks. always have, always will.
                                          And different software settings. That has always been my point. DCS is not CPU bound per se.
                                          Intel i9 10900K 5.1GHz · Palit RTX 2080 Ti OC Gaming Pro · ASUS ROG STRIX Z490-F · Acer 4K 32" XB321HK · Samsung 970 500Gb M.2 NVMe · 2 x Samsung 850 Evo 1Tb · 2Tb HDD · 32Gb Corsair Vengance 3000MHz DDR4 · Windows 10 · Thrustmaster TPR Pedals · TrackIR5 · Thrustmaster F/A-18 Hornet Grip · Virpil WarBRD Base · Virpil Throttle MT-50 CM2 · Virpil Alpha Grip · HP Reverb Pro

                                          Comment
