
Nvidia RTX 2080 and 2080Ti Benchmarks are starting to be released



RTX-compatible drivers have now been released (I just installed these on my GTX 1080): v411.63 WHQL.

 

https://forums.guru3d.com/threads/geforce-411-63-whql-drivers-download-discussion.423061/

 

Please share the benefits/performance gains, if any, you see from this driver with your 1080.

 

Cooler Master HAF XB EVO , ASUS P8Z77-V, i7-3770K @ 4.6GHz, Noctua AC, 32GB Corsair Vengeance Pro, EVGA 1080TI 11GB, 2 Samsung 840 Pro 540GB SSDs Raid 0, 1TB HDD, EVGA SuperNOVA 1300W PS, G930 Wireless SS Headset, TrackIR5/Wireless Proclip, TM Warthog, Saitek Pro Combat Pedals, 75" Samsung 4K QLED, HP Reverb G2, Win 10


The implementation of DLSS will be the make-or-break for this generation of cards. Right now not many games use it (heck, just one, I think), but if game devs pick it up it should be a pretty big difference-maker.

 

 

I am holding off to see if the games I play (DCS being the major one) will implement it once they make the move to Vulkan. If they do, that alone could be a reason to buy a 2000-series card.

 

 

 

5900X - 32 GB 3600 RAM - 1080TI

My Twitch Channel

~Moo


Wags has stated unequivocally that DCS will not implement GameWorks or any AMD equivalent. That means DCS will not support DLSS, as RTX is part of GameWorks.

 

Uhhh, ok. You have zero idea what you're talking about. :doh: :rolleyes:



Could you elaborate on this statement of yours? Especially seeing this: https://developer.nvidia.com/rtx/ngx

 

That link shows you're wrong. Not a single mention of GameWorks.

 

Raytracing is a standard within the graphics APIs.

 

DLSS is a feature that's provided for free and taken care of by Nvidia.

 

So a 40-50+% improvement in AA performance without degradation of image quality via DLSS is something that Matt Wagner specifically said DCS will not have?

 

Are you serious? :doh:




 

Raytracing still needs to be supported by the developer for its benefits to be seen. ED/Wags have already stated they are not planning to implement support - I believe they are going to wait on Vulkan, which will have native support as well.

 

DLSS also needs to be supported by the developer. The developer needs to reach out to Nvidia and provide whatever data they need to build the DLSS database for a particular game. I would not hold my breath for this support; it may come - one day.

Win 10 Pro 64Bit | 49" UWHD AOC 5120x1440p | AMD 5900x | 64Gb DDR4 | RX 6900XT



 

Maybe I should've been clearer: where on that page do you see a difference from GameWorks? RTX still requires an SDK to be included with the engine:

NGX software: The RTX feature is integrated into any game, application or plugin through the use of the NGX SDK. This is the main code for the AI-accelerated functionality, but it also requires a trained neural network in order to function.

On top of that, RTX is listed on the GameWorks page, indicating it will be integrated into it as soon as it's ready.
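To make that integration point concrete, here is a rough sketch of the shape such a hook takes. Every name in it is invented for illustration - this is not the real NGX API, just the "engine work plus trained network" dependency the quoted text describes:

```python
# Purely illustrative sketch - every name below is invented and is NOT the
# real NGX API. The point being argued: DLSS is not a driver-side toggle.
# The engine must be modified to feed the SDK per-frame data, and it needs
# a per-game trained network blob before anything works at all.

class HypotheticalNGXFeature:
    """Stand-in for an NGX-style SDK object owned by the renderer."""

    def __init__(self, model_blob: bytes):
        # The trained neural network Nvidia produces offline; without it
        # the SDK has nothing to evaluate on the tensor cores.
        self.model = model_blob

    def evaluate(self, low_res_color, motion_vectors, depth):
        # In the real thing, AI reconstruction of a full-resolution frame
        # would run on the tensor cores here.
        raise NotImplementedError("tensor-core work goes here")


def render_frame(engine, ngx_feature: HypotheticalNGXFeature):
    # Engine-side integration work: render at a reduced resolution, export
    # the extra buffers the network wants, then hand everything to the SDK.
    color, motion, depth = engine.render(resolution_scale=0.67)
    return ngx_feature.evaluate(color, motion, depth)
```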

 

And if it's just about performance gains: why would they refuse to integrate GameWorks (and VRWorks) to begin with?

 

Personally, I'd call it highly speculative to state openly that they will include any of this. It would be nice for them to optimize performance, but I can't see any indicators as to why GameWorks RTX would be the only exception.

Best regards


Not holding my breath, but DLSS would definitely be nice compared to current anti-aliasing options within DCS.

 

The devs claim 4K won't need anti-aliasing, but I know VR and my 3440x1440 21:9 monitor have to use MSAA 2x at least or else almost everything looks pretty bad/jaggy/shimmery, and FXAA didn't do much to alleviate it. I also know that MSAA wrecks our framerates.

 

I don't know the science well enough, but if an anti-aliasing technique is invented that offers double the performance and likely a clearer/smoother image, especially in a program as resource-intensive as DCS World, it makes no sense not to research and eventually implement it. People who choose not to opt for an RTX card can stick to the old methods, but completely ignoring this tech sounds silly to me.
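For a rough sense of where such a gain could come from (illustrative numbers, not measured ones), compare the pixel counts involved in rendering at a lower internal resolution and reconstructing, versus adding MSAA on top of native:

```python
# Back-of-envelope arithmetic behind the "double the performance" idea
# (illustrative numbers only; the internal resolution here is an assumption).
native = 3840 * 2160        # pixels shaded per frame at native 4K
internal = 2560 * 1440      # a plausible lower internal render resolution
print(f"native / internal = {native / internal:.2f}x")  # 2.25x fewer pixels

# MSAA goes the other way entirely: extra coverage samples and resolve work
# on top of the full native-resolution render - hence the framerate hit.
```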

 

The NGX SDK will supposedly be available for download in a few weeks, and I could be mistaken, but in the interviews I've been watching, nVidia will train the AI neural network on their super-expensive supercomputer and implement code for DLSS support for free. The basic interpretation I got was: "Hey, we want our technology to succeed because it's honestly pretty awesome, and we want your games to run well, and we're sure your userbase will appreciate it; submit a development copy of your game and we'll get it working for you." Win/win if you ask me.

 

https://developer.nvidia.com/rtx/ngx

 

Early adoption may be a slow process given that the majority of developers develop for console or mobile, but this just isn't PhysX or HairWorks. AI-enhanced graphics are the future, real-time ray tracing aside. We'll have to wait for games with DLSS support and Windows updates to be certain, but so far it sounds like DLSS is going to be a substantial performance/quality increase compared to current graphics technologies.

 

 

 

DLSS and tensor cores? There are BOOKS' worth of complaints about the current state of anti-aliasing in DCS World within these forums. DLSS COULD be the thing that allows for 90 FPS VR in DCS with a picture much clearer than what we can achieve currently.

 

I won't keep visiting these threads, as I've said similar things in another, but as a fan of this game who will likely be spending money on it for years to come, hoping these guys improve and go the distance, I strongly encourage Eagle Dynamics to look into what it would really take to implement DLSS functionality - of course, after we see some real performance comparisons in upcoming games released with DLSS support.

 

We've got a lot of folks soothsaying because of the high price tags, which is understandable; however, I strongly believe that if a feature were implemented that allowed near-double performance with BETTER image quality, quite a few DCS World "pilots" would be forking out the dough for a card capable of supporting it - and would finally witness the full potential of deferred shading without compromising framerates or resolution.

 

I'll stop posting in these threads about this subject, but the thought does plague my mind regarding DLSS and DCS World. It does seem like a solution for a lot of performance complaints, and it's not like adding it will make the game any less playable for people who don't adopt capable cards. Although I'm of the mindset that EVENTUALLY, even if a couple of years down the road, we'll all be sporting some kind of card capable of AI-enhanced graphics.

 

*edit* - At least one of the games already announced to feature DLSS (Ark) isn't even DX12-exclusive, so - realm of possibility here, anyone?


Edited by Headwarp

Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives 

Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles.   Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener.   Obutto R3volution gaming pit.  

 



 

Agreed.

 

Unfortunately a lot of people, well, all of us, are disappointed with the pricing; it is pretty much gouging.

 

Some peeps have been waiting for the card and can't justify the price, and some can. A small minority of the ones who can't indulge themselves in the politics of envy. It is plain stupid to decry a whole new generation of cards because of the price and the fact that they refuse to pay it.

 

I paid £800 to pre-order the Rift and Touch controllers, and look how that turned out: one of the best purchasing decisions for my hobby I ever made, up there with DCS, the Crosswinds and the Warthog.

 

Some have said this hobby is relatively inexpensive... It is, on an hour-to-hour basis. Try calculating the cost per hour of renting a light aircraft, or of owning a boat, a high-performance motorcycle or a motocross bike. I have, and it's eye-watering.

 

Those who can't justify the costs may have other interests they indulge in that demand money be thrown at them, with DCS a secondary diversion. They may have a young family and crippling mortgage payments. I have also been there.

 

Then there are those of us who have been through all of the above, all the hair-on-fire interests, the mortgage and kids, and come out the other side with DCS as their main interest (cheaper than light aircraft, which I have not been able to afford since the mortgage and kids appeared) and a little discretionary income to pursue that hobby. Have to say I would prefer to have my youth back, though.

 

Indulging in the politics of envy and, in some perverse way, trying to insinuate that those who adopt early are stupid is very short-sighted. The early adopters pave the way for such technology to become mainstream and affordable.

 

I see a lot of thinly disguised envy in this thread, and a wish for the technology to fail, just because some can't afford it at the moment.

 

EDIT: on reflection, envy may not be entirely appropriate; I think bitterness would suit better. Semantics - the end result is the same.


Edited by Tinkickef

System spec: i9 9900K, Gigabyte Aorus Z390 Ultra motherboard, 32Gb Corsair Vengeance DDR4 3200 RAM, Corsair M.2 NVMe 1Tb Boot SSD. Seagate 1Tb Hybrid mass storage SSD. ASUS RTX2080TI Dual OC, Thermaltake Flo Riing 360mm water pumper, EVGA 850G3 PSU. HP Reverb, TM Warthog, Crosswind pedals, Buttkicker Gamer 2.


Being an early adopter and buying blindly, without any knowledge of how the thing actually performs, are two different things; don't mix them up.

 

With the Rift, you had people showing demos and reviewing the hardware all over the internet; the units were just not available for regular purchase until Oculus produced them in sufficient quantity.

 

With RTX, up until now, you had zero benchmarks in actual games, other than the usual overly optimistic slides from Nvidia themselves. As it turns out, those who pre-ordered the 2080 would have been better off just buying a 1080Ti for less money and the same performance, rather than believing Nvidia PowerPoint presentations. You still have zero knowledge of the RTX performance of any of these cards, because there is no software to test it with, let alone use it.

 

basically what this guy said:

 

I can still buy an RTX 2080Ti in 2-3 months and call myself an early adopter. I just won't be buying a pig in a poke.

Hardware: VPForce Rhino, FSSB R3 Ultra, Virpil T-50CM, Hotas Warthog, Winwing F15EX, Slaw Rudder, GVL224 Trio Throttle, Thrustmaster MFDs, Saitek Trim wheel, Trackir 5, Quest Pro


Some first impressions with a 2080Ti in VR with a Vive Pro, replacing a 1080Ti.

- Jack of many DCS modules, master of none.

- Personal wishlist: F-15A, F-4S Phantom II, JAS 39A Gripen, SAAB 35 Draken, F-104 Starfighter, Panavia Tornado IDS.

 

| Windows 11 | i5-12400 | 64Gb DDR4 | RTX 3080 | 2x M.2 | 27" 1440p | Rift CV1 | Thrustmaster Warthog HOTAS | MFG Crosswind pedals |



 

 

Valid point. I think Tinkickef makes another very valid point, though: there is one more thing to consider, and that's track record. If Ferrari announces a new car, would people assume it's going to be great or garbage? If Yugo announced a new car (back from the dead), what would people assume?

 

NVidia is more on the Ferrari side than the Yugo side, so people gave them credit and pre-ordered. I did too, because I want the best possible VR experience.

 

When I got my DK1, that was a leap of faith. When I ordered the CV1, it wasn't, because I had already owned the DK1 and DK2.

 

When I KS backed Pimax, *that* was a leap of faith. After all, the 4K Pimax was very much less than stellar. But I liked the concept and took the plunge.

 

Let's compare it to DCS. I backed the WWII project before ED rescued it (thank you), and I pre-order every module except the MiGs (don't care for them and couldn't care less). That's because I have faith that ED and its partners will deliver a quality product. I haven't been disappointed.

 

For that same reason, I pre-ordered the 2080Ti. I believe NVidia will deliver, and a 50% increase in some games - especially at 4K resolution - is nothing to sneeze at.


Edited by hansangb
spacing

hsb

HW Spec in Spoiler

---

 

i7-10700K Direct-To-Die/OC'ed to 5.1GHz, MSI Z490 MB, 32GB DDR4 3200MHz, EVGA 2080 Ti FTW3, NVMe+SSD, Win 10 x64 Pro, MFG, Warthog, TM MFDs, Komodo Huey set, Rverbe G1

 


Some first impressions with a 2080Ti in VR with a Vive Pro, replacing a 1080Ti.

 

 

 

 

So it's not the Holy Grail many hoped it could be.

 

 

When I was reading "...I am mostly at 45fps now and dont drop to 30 so often..." I stopped reading and moved on.
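For context on those numbers, here is a quick sketch of the frame-time budgets involved; the 90/45/30 steps come from reprojection on a 90 Hz headset:

```python
# Why VR framerates in these reports snap to 90 / 45 / 30: on a 90 Hz
# headset (Vive Pro, Rift), missing the refresh deadline drops you to an
# integer divisor of the refresh rate via reprojection.
refresh_hz = 90
for divisor in (1, 2, 3):
    fps = refresh_hz / divisor
    print(f"{fps:4.0f} fps -> {1000 / fps:5.1f} ms frame budget")
# 90 fps -> 11.1 ms, 45 fps -> 22.2 ms, 30 fps -> 33.3 ms.
# "Mostly at 45 now, not dropping to 30 so often" means frame times moved
# below ~22.2 ms more of the time - better, but still a long way from 90.
```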

 

 

Wake me up when the 4080 xTX-ti arrives; I will likely skip them all till then.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 



 

Having a 980Ti at the moment, I was sure I would get an increase in performance.

 

Nvidia have a track record of a 25-35% increase over the previous flagship card. Hence I was comfortable forecasting the same for the 20 series, in addition to the rumoured VR enhancements. I never gave ray tracing the slightest look, as I knew DCS did not have it.

 

When you are looking at tenders from prospective contractors, you look at track record as well as price. When you hire someone, you look at their track record and testimonials.

 

You then have to decide to take a leap of faith that you are engaging the right person for the job.

 

In this world there are the risk-takers and the risk-averse. It is not a question of intelligence, or a lack of it, nor any lack of due diligence. It is just a case of: can I afford it? Yes. What if I'm wrong? I will still have a card that comfortably outperforms my current one. Should I buy the cheaper 1080Ti instead? Nope - why buy outdated tech when you don't need to? And I have a policy of skipping a generation, so the 20 series was always the target product. Gut feeling? The reputation is solid, so take the leap.


Edited by Tinkickef

System spec: i9 9900K, Gigabyte Aorus Z390 Ultra motherboard, 32Gb Corsair Vengeance DDR4 3200 RAM, Corsair M.2 NVMe 1Tb Boot SSD. Seagate 1Tb Hybrid mass storage SSD. ASUS RTX2080TI Dual OC, Thermaltake Flo Riing 360mm water pumper, EVGA 850G3 PSU. HP Reverb, TM Warthog, Crosswind pedals, Buttkicker Gamer 2.


Quoting hansangb: "Valid point. I think Tinkickef makes another very valid point. [...] For that same reason, I pre-ordered the 2080Ti. I believe NVidia will deliver."

 

Exactly my point.

System spec: i9 9900K, Gigabyte Aorus Z390 Ultra motherboard, 32Gb Corsair Vengeance DDR4 3200 RAM, Corsair M.2 NVMe 1Tb Boot SSD. Seagate 1Tb Hybrid mass storage SSD. ASUS RTX2080TI Dual OC, Thermaltake Flo Riing 360mm water pumper, EVGA 850G3 PSU. HP Reverb, TM Warthog, Crosswind pedals, Buttkicker Gamer 2.


It's the same for me. I wasn't replacing my 980Ti with a 10-series card as a matter of principle. And, like any other time I've upgraded my GPU (previously a GTX 680 4GB, and prior to that a GTX 285), I'm buying the most powerful single-GPU solution I can (albeit this time I'm spending way more money), and it will likely last me 3-5 years. I'll get my money's worth. Nobody's saying someone running a 1080Ti or another 10-series card should run out and upgrade. Nobody's saying that if you can't afford a $1200 GPU you should feel bad about buying a used 1080Ti, either. Just like nobody's going to make me feel bad about my upcoming GPU upgrade.

 

Ray tracing is cool and all, but again, the potential of DLSS could be pretty groundbreaking/game-changing on its own, and it seems like nVidia is poised to go out of their way to help developers implement it with minimal effort. What these tensor cores are doing is the biggest change to how graphics are handled since the release of the Monster Voodoo, which also took the support of game developers to become as popular as it was when "3D gaming" first came to light.

 

Provided DLSS offers the performance boost over TAA that we witnessed at Gamescom, it won't be until developers begin adopting that capability that we really see what these cards can do.

 

So far, the potential of DLSS says that if a gamer's selection of games implements support for this feature, the performance value of the 2080 to that gamer goes way up.

 

And we'll still have to wait and see, but to me that's nothing to scoff at. That's a new set of hardware taking over the role of traditional AA, taking the load off of whatever part of the GPU handled it in the past.

 

As someone who planned to upgrade GPUs whenever the next thing came out anyway, and is still going to do so, and who doesn't believe AMD is going to suddenly become competitive in the GPU market, DLSS is the thing I want to experience the most, especially in DCS with its MSAA performance.

 

Even without it, however? I'm getting a huge upgrade from a 980Ti to a 2080Ti, albeit I might as well be giving up an appendage. Not blindly, either: I didn't pre-order. I run at resolutions, both on my monitor and in VR, that demand a fair amount of GPU power. And to me, especially given my experience with MSAA performance in DCS, DLSS is a very exciting feature/capability. I can only hope my upcoming purchase helps pave the way for such a feature to become a future standard.

 

That being said, I can easily see ED being stubborn about this, no disrespect intended to the team, so I'm not overly hopeful. But I'd love to be surprised and find DLSS as an option in DCS in the future; I think that is likely where the value in these high-priced cards will begin to show. For real: think about how DCS performs when you disable MSAA - those lovely high framerates you get. Now imagine getting that SAME performance, only with a supersampled image provided by the tensor cores on an RTX card, clearing up all the jaggies and shimmer on straight and curved lines. Yeah, you'd quickly open your wallet, I'm sure.

 

Benchmarks are showing that, for traditional methods, a 1080Ti is still a valid/practical option.

 

But I'm looking at it from the viewpoint of: how long will it be before game devs give in and show us what deep-learning AI can really do for graphics in gaming? This is innovation, gentlemen; we've never had this opportunity before. Where would we be if id Software and Parallax/Interplay had never adopted 3D graphics in their games? Look how that blew up. Without the S3 ViRGE, ATI Rage, 3dfx Monster Voodoo, and the devs jumping on that tech to showcase it, where would graphics be today? Before the current state of console gaming, game developers and hardware engineers complemented each other well as the driving force of innovation in this hobby of ours. Ever since? It seems like an uphill battle trying to see the benefits of what could be some pretty big strides in technological advancement.

 

Oh, by the way, the majority of signatures in these forums seem to include an nVidia GPU proudly displayed along with the other hardware in the system. DCS seems to make us chase better performance, which DLSS could rightly offer. nVidia is like Charlie Sheen, brosephs: "Winning." Only without the health issues.

 

I had to edit one more point in here... how many of you were building PCs for gaming when a top-of-the-line GPU cost US$120 + tax, some 20+ years ago? Everything in the US, and perhaps the world, has been growing its price tags, not just GPUs.


Edited by Headwarp

Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives 

Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles.   Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener.   Obutto R3volution gaming pit.  

 


When it can prove it will consistently outperform my 1080Ti, which, other than during loading, very rarely drops below 45 FPS, I will get it. It is a price thing with me: I am retired, and spending $1,600.00 on a card is asking a lot.


Window 10, i9-9900,2080TI, 32GB ram Puma Pro Flight Trainer, 2 x 1TB WB SSD NVMe HP Reverb



 

I don't think anyone is doubting at all that the 2080Ti will consistently outperform a 1080Ti; it absolutely does, by around 30%. Most of the debate is around whether or not the huge price tag is justified.
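As a rough value check, here is that trade-off in numbers, assuming typical prices at the time (both the prices and the 30% figure are approximations, not measurements):

```python
# Rough perf-per-dollar check using the ~30% figure above and assumed
# street prices at the time (both approximations, not measurements).
cards = {"1080 Ti": (1.00, 700.0), "2080 Ti": (1.30, 1200.0)}
for name, (relative_perf, price_usd) in cards.items():
    print(f"{name}: {relative_perf / price_usd * 1000:.2f} perf per $1000")
# 1080 Ti: ~1.43, 2080 Ti: ~1.08. On rasterization alone the new card costs
# more per frame; the wager is that DLSS/RT adoption changes that math.
```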

PC Specs / Hardware: MSI z370 Gaming Plus Mainboard, Intel 8700k @ 5GHz, MSI Sea Hawk 2080 Ti @ 2100MHz, 32GB 3200 MHz DDR4 RAM

Displays: Philips BDM4065UC 60Hz 4K UHD Screen, Pimax 8KX

Controllers / Peripherals: VPC MongoosT-50, Thrustmaster Warthog HOTAS, modded MS FFB2/CH Combatstick, MFG Crosswind Pedals, Gametrix JetSeat

OS: Windows 10 Home Creator's Update



 

Indeed. If I had a 1080Ti, I would NOT have pre-ordered the 2080Ti.

 

1. As I have said previously, I always skip a generation. Buying the latest whizz-bang card every year or so is not cost-effective for me.

 

2. I really don't think the increase in performance over the 1080Ti justifies the eye-watering price tag on my level of income. In my case, I am expecting a huge increase in performance over my 980, and therefore it works for me.

 

However, if you can afford it and own a 1080Ti, there is absolutely nothing wrong with pre-ordering the 2080Ti. It is not stupid, ill-advised, misguided or any other term that somehow tries to convey that the buyer is wrong.

 

It's your money, and you have the God-given right to spend it as you wish, without others judging you for it.


Edited by Tinkickef

System spec: i9 9900K, Gigabyte Aorus Z390 Ultra motherboard, 32Gb Corsair Vengeance DDR4 3200 RAM, Corsair M.2 NVMe 1Tb Boot SSD. Seagate 1Tb Hybrid mass storage SSD. ASUS RTX2080TI Dual OC, Thermaltake Flo Riing 360mm water pumper, EVGA 850G3 PSU. HP Reverb, TM Warthog, Crosswind pedals, Buttkicker Gamer 2.


Can't help myself, but I am achieving these fps with a 1080Ti on much higher settings (in multiplayer, with an Oculus Rift).

 

I feel like the problem is more in the users. Everyone thinks he understands hardware and software optimization, but the truth is often a long way off.

 

In my past, the greatest upgrades were things like one dollar for heat-conducting paste and renewing the contact between processor and cooler; that instantly cuts down loading times and increases fps.

Vacuum the PC case, and so on. But this is more about physics than hardware specifications.

 

Wait for real tests from people who don't hesitate to spec their whole system, and not just state that they bought the 2080 Ti. You don't know what hardware generations someone is mixing, what parts he has overclocked (mostly destabilized), or under what conditions he runs the complete system. Wait for the hardware specialists' tests.
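In that spirit, here is an illustrative sketch of the kind of spec sheet that makes a posted number worth something (the fields are a suggestion, not any official template):

```python
# Illustrative template only: a bare "I get X fps on my 2080 Ti" claim
# says little without the rest of the system and the test conditions.
benchmark_report = {
    "gpu": "RTX 2080 Ti (stock or overclocked?)",
    "cpu": "model and clock speed",
    "ram": "size and speed",
    "display": "monitor resolution or VR headset",
    "settings": {"MSAA": "?", "pixel density": "?", "shadows": "?"},
    "scenario": "the same mission or track replayed for every run",
    "fps": {"min": None, "avg": None},
}
for field, value in benchmark_report.items():
    print(f"{field}: {value}")
```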

Ryzen 7 2700X | MSI Trio 1080Ti | MSI X470 Plus Motherboard | 32GB Kingston HyperX Predator 2933 DDR4 | M.2 XPG GAMMIX S11 Pro SSD | Virpil Mongoost-50 throttle | Thrustmaster Warthog Stick | MFG Crosswind | Rift S



 

 

True words. With only 20-30 fps more at QHD/WQHD, from what I have read, it is close to what you can gain or lose with a single bad click.

I'd bite my behind if I had a 2080Ti + 16GB + stutter... and that can happen.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

