
Video Card Rumours



(embedded YouTube video: qMMm9nHFe0Y)

 

 

TL;DW:

 

 

  • Nvidia got cocky with TSMC (7nm) and played them against Samsung (8nm)
  • TSMC told NV to pound sand so new NV products will be 8nm
  • Expect 7nm "Super" refresh within a year
  • 3080Ti is ~30-40% faster than 2080Ti
  • 3080Ti will consume 300W at full trot, 400W OC'ed b/c 8nm
  • AMD is playing humble
  • AMD RDNA2 is expected to be ~40-50% faster than the 2080Ti, roughly 2x the current 5700XT, maybe even more (based on PS5 rumours) - quick sanity check below the list
  • AMD will have 5nm CPUs in 2021 to bring even more hurt to Intel; video cards to follow by 2022
  • Cards from both will drop Sept-October
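
Rough sanity check on that RDNA2 bullet (my own numbers, not from the video): if you assume the 5700XT lands at roughly 70-75% of a 2080Ti, the commonly quoted ballpark, then "2x the 5700XT" lines up neatly with the 40-50% figure.

```python
# Back-of-envelope check of the RDNA2 rumour. The 0.70-0.75 ratio of
# 5700XT-to-2080Ti performance is an assumed ballpark, not a measurement.
for ratio_5700xt_to_2080ti in (0.70, 0.75):
    rdna2_vs_2080ti = 2.0 * ratio_5700xt_to_2080ti   # "2x the 5700XT" rumour
    uplift_pct = (rdna2_vs_2080ti - 1.0) * 100
    print(f"5700XT at {ratio_5700xt_to_2080ti:.0%} of a 2080Ti "
          f"-> RDNA2 would be ~{uplift_pct:.0f}% faster than a 2080Ti")
```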

 

Not said in the video: AMD has made TSMC lots of money, so they probably have a good relationship. Between TSMC being flush with cash and not needing to put up with NV's bullsh!t, the fallout for NV re: TSMC makes a lot of sense.

 

Interesting times. If Nvidia stays cocky on their pricing I'll be going AMD regardless of AMD "first release driver" issues. :)


Edited by reece146

Nvidia and AMD are both in TSMC's hands, and TSMC's delays (or whatever else happens inside TSMC) hurt both companies.

Nvidia is also about to revamp the Ampere lineup on 5 nm next year. The problem with Ampere was delays with the TSMC contracts and the 7 nm lineup (problems Samsung might have been able to solve, but without luck).

Samsung already has a 5 nm roadmap, but it's not as strong as TSMC's, so they may or may not deliver the 5 nm process that Nvidia desperately needs.

Nvidia has reached a new contract with TSMC, and (with Huawei now out of the picture) it won't be a problem to have new Nvidia cards on 5 nm and on track by fall 2021. But, yes: the Ampere 2020 consumer cards will be HOT (and may occupy 3 PCI slots).

The HPC cards (the HBM2-memory ones) are on TSMC 7 nm.

The remaining problem is the AMD drivers (especially Vulkan and VR, which are always a step behind Nvidia's).


Edited by Leaderface

I would love to see AMD take the lead and bring GPU prices back down to earth. The GTX 1080 is my first ever nVidia card. I had a Voodoo 5500 and, when 3dfx folded, got a Radeon 8500, then a 9800 Pro. The 9700 Pro had smoked nVidia and put ATi in the lead, and ATi remained competitive for a long time. I only recently retired the 7970 GHz Edition in my son's PC to give him a GTX 1080 to support VR and 4K.

 

I dealt with ATi, later AMD, gpu drivers for a very long time. Now having experience with nVidia, I don't see any superiority with nVidia drivers. There is no one driver set that works best across the board with all my applications/flight sims. I had to go to a very old driver just to solve some problems in one sim and found it had fixed a bunch of other little problems as well as giving me solid performance in DCS.

 

If AMD produces a card that is at the right price/performance ratio for me, I will jump on it in a heartbeat. But recent history (like the past several years) suggests that AMD always overhypes their next release and nVidia sweeps in with a huge performance advantage and jacks up prices again.

 

I shudder at the idea of paying $1,200 for a gpu when that costs as much or more than the rest of my PC. When I bought my GTX 1080 for $480 (the lowest I had seen at that time) right before bitcoin mining surged the prices up to over $1,000, I still couldn't believe I paid that much for a gpu. I got my son's GTX 1080 (same MSI Duke model as mine) used for $350 and it has been every bit as good as mine (thankfully not burned out from bitcoin mining).

 

Every cycle, AMD claims they are finally going to equal and/or surpass nVidia's GPUs and Intel's CPUs. Every time I am left with disappointment. I am Charlie Brown and AMD is Lucy holding the football. Maybe this time she won't pull it away when I run to kick the ball. Otherwise, I see myself going from a GTX 1080 to a 3080 Ti if my bank account is still being topped off by overtime.



I'm still rocking a GeForce 1070 waiting on the 3070 (or whatever I can afford, I'm spending my 2020 income tax return on the best 30xx card I can buy). Gaming is a minor concern compared to the renders I do in DazStudio and I would love to see a ton more VRAM on these new cards (which is rumored, but it might just be a rumor). DazStudio uses nVidia iRay, so I'm stuck with nVidia graphics.

Windows 10 64-bit | Ryzen 9 3900X 4.00GHz (OC) | Asus Strix B450-F | 64GB Corsair Vengeance @ 3000MHz | two Asus GeForce 1070 Founders Edition (second card used for CUDA only) | two Silicon Power 1TB NVMe in RAID-0 | Samsung 32" 1440p Monitor | two ASUS 23" 1080p monitors | ASUS Mixed Reality VR | Thrustmaster Warthog HOTAS | MFG Crosswind

 

A-10C Warthog | AV-8B Harrier (N/A) | F/A-18C Hornet | F-16C Viper | F-14B Tomcat | UH-1H Huey | P-51D Mustang | F-86F Saber | Persian Gulf | NTTR


I pretty much need whatever runs flight sims best, and DCS in particular. It doesn't help that the DCS graphics engine is a little behind as well as the core code that is pretty much limited to using only one or two cpu cores. But no other sim has what DCS gives me. For World War 2, there are several options that are fairly competitive, but for Cold War jets like the F-86, MiG-15, MiG-19, MiG-21, and F-5E, the only other game in town is SF2, which hasn't received any patches/updates in years and runs fine with the hardware I have had for years.

 

DCS is what drove me to try VR. It is what drove me to try the VKB stick and Winwing throttle. So, I would really love to see hardware that can deliver the best performance possible for DCS and I would love DCS to improve its core code to permit using untapped potential in existing hardware as well as the near future. But alas, all I can do is wait for hardware that will give me a useful and cost-effective boost in performance over what I have now. It sounds like the 3080 will give me the useful boost, but there is no way it is going to be cost effective. All I can do is wait and see and hope I get called into work a lot to put up a nest egg to pay for whichever option I end up picking.

 

I don't believe I have a single application that uses ray tracing, unless one of the newer flight sims has introduced it in a patch. So all I am looking at are raw performance numbers, and you can never have enough GPU RAM the way DCS World and other flight sims are going with textures and 3D meshes.



RTX30 uses new 12 PIN Power Connector

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs


RTX30 uses new 12 PIN Power Connector

 

I thought that was just speculation? I'd imagine a new plug would overly complicate things for those looking to upgrade.

ASUS TUF GAMING X670E with AMD RYZEN 5 7600X, 64GB DDR4, ASUS TUF GAMING 4080

Pico 4, Thrustmaster Warthog HOTAS

Formerly Known As: wpnssgt (google it :smilewink: )


I thought that was just speculation? I'd imagine a new plug would overly complicate things for those looking to upgrade.

 

It's gotta happen sooner or later. nVidia wants to push it, the same way Intel pushes non-industry-standard interfaces.

 

Most of the GPUs using the new 12-pin will be upwards of $700, so chances are it will come with a 2x 8-pin/6-pin to 1x 12-pin adapter of some sort.

 

nVidia's about to release a portable heater unit again.


Edited by SkateZilla

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs


  • 2 weeks later...

I don't know why everyone should go for Nvidia. My upgrade is entirely AMD, and next month I go full VR with another combo: an AMD Radeon RX 5700 XT 8GB GDDR6 and a Ryzen 5 2600X six-core CPU at 4.2GHz; another two months to go for full funding.

 

 

The important thing here is to get a good result at the resolution I'll need to run VR, 2160p/4K, which means as little bottleneck as possible.

 

 

If you run this combination through pc-builds.com/calculator/, you end up with an average bottleneck percentage of 6.43%, below 10%, which should ensure maximum efficiency of the GPU.
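
Just to illustrate what that number means (this is not the calculator's actual formula, only a toy heuristic with made-up scores): a bottleneck percentage is essentially how far the weaker component lags the stronger one, and anything under ~10% means the GPU is rarely left waiting.

```python
# Toy bottleneck heuristic (NOT pc-builds.com's real formula).
# The CPU/GPU scores below are invented purely for illustration.
def bottleneck_pct(cpu_score: float, gpu_score: float) -> float:
    """Percentage by which the weaker component lags the stronger one."""
    weaker, stronger = sorted((cpu_score, gpu_score))
    return (1.0 - weaker / stronger) * 100

# Hypothetical relative scores for a Ryzen 5 2600X + RX 5700 XT at 4K:
print(f"{bottleneck_pct(cpu_score=93.6, gpu_score=100.0):.2f}% bottleneck")
# Under ~10% is usually read as a well-balanced pairing.
```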

 

 

Right now, I'm using a new motherboard (B450 Gaming Plus Max), 32 GB of 3200 DDR4 and a Ryzen 3 3200G paired with a Radeon RX 5500 XT. It's still too weak for VR, but I have absolutely no issue with drivers or the quality of the Radeon software while running the game with good graphics and frame rate at 1920x1080.

 

My Oculus client is showing ready to go, integrated in the Steam client; all I need is my next upgrade and I'll be ready to run VR at the resolution I want.

 

MSI service and support are really good, which is more than can be said about Oculus or Thrustmaster, since the last hurdle is being able to use my HOTAS in DCS in VR, which is not yet the case.

 

Some fine tuning should solve this problem soon enough.

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 


I don't know why everyone should go for Nvidia. My upgrade is entirely AMD, and next month I go full VR with another combo: an AMD Radeon RX 5700 XT 8GB GDDR6 and a Ryzen 5 2600X six-core CPU at 4.2GHz; another two months to go for full funding.

 

May I ask why you chose the 2600x over the 3300x? Availability ?


I got similar simulated results pairing them together, with more power on tap, 6 cores vs 4... (I do video editing).

 

I order all my gear from the same company, very competent techies.

 

We looked at various solutions based on the RX 5700 XT graphics card; they suggested trying to obtain the best pairing possible, which is why the Ryzen 3 remains an option. It will depend on budget.


Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 


I got similar simulated results pairing them together, with more power on tap, 6 cores vs 4... (I do video editing).

 

I order all my gear from the same company, very competent techies.

 

We looked at various solutions based on the RX 5700 XT graphics card; they suggested trying to obtain the best pairing possible, which is why the Ryzen 3 remains an option. It will depend on budget.

 

While the 5700XT is a fairly decent GPU for the money, it's not especially great at handling DCS in VR. It's fine in many other VR-specific titles, but I can't get anywhere near the same pixel density as my 1080ti in DCS. This could be a matter of the lower available VRAM, which is why I opted for the 1080ti in the first place. I have a Rift CV1 (the kids') and a Rift S. I bought the 5700XT for my kids' computer, and there are things I like about it that give me hope for AMD's next gen. If it outperforms the 2080ti, has more than 8GB of VRAM, and is reasonably priced, I'm in.


I'm curious to know what your other devices were, because the whole point here is the pairing.

 

A new 1080ti is roughly £270 more expensive than the 5700XT, more than a Ryzen 5 2600X six-core costs, meaning that with my budget I'll have to keep this processor, a Ryzen 3 3200G.

 

I have a solution though, since I want max res for VR: sell this GPU (5500 XT, new) to CEX Computer Exchange and buy one of their 1080ti cards; they are guaranteed for 24 months.

 

The Ryzen 3 3200G is a pretty strong processor; I can easily pull an extra 10% out of it, and I have a good Arctic Freezer cooler for it.

 

Average bottleneck percentage: 7.14%, reduced to 4.89% at a 110% CPU setting; good enough at 2160p/4K with little loss from imperfect pairing, and then I can upgrade the processor later.

 

 

My VR kit is a brand new Rift CV1 purchased at a low price on eBay, and the power supply is a Corsair 750W...

 

 

 

(attachment: NewPlan.jpg)


Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 


I'm currently on the Ryzen 9 3900X with 32GB of DDR4-3600 and the Rift S, but started with a CV1 and an R7 2700 that eventually became my kids' rig. I've had the 1080ti for 3 years now, and I eventually bought the 5700XT to replace the R9 Fury for better VR performance in their PC. I occasionally steal it for other reasons, but I haven't specifically tested the combination of the CV1 and the 5700XT in DCS. Some research indicated there is a difference in the video processing offloaded to the GPU between the CV1 and the Rift S; the inside-out tracking is probably more intensive. It might be a non-issue with the CV1, but the S is far less smooth in DCS on the 5700XT.


I'm currently on the Ryzen 9 3900X with 32GB of DDR4-3600 and the Rift S, but started with a CV1 and an R7 2700 that eventually became my kids' rig. I've had the 1080ti for 3 years now, and I eventually bought the 5700XT to replace the R9 Fury for better VR performance in their PC. I occasionally steal it for other reasons, but I haven't specifically tested the combination of the CV1 and the 5700XT in DCS. Some research indicated there is a difference in the video processing offloaded to the GPU between the CV1 and the Rift S; the inside-out tracking is probably more intensive. It might be a non-issue with the CV1, but the S is far less smooth in DCS on the 5700XT.

 

 

Understood.

 

As long as my CPU and GPU are well paired at the highest resolution, it's OK. I wasn't considering an Nvidia card, but my knowledge of VR is limited, so I didn't have a clue.

 

 

I have a good motherboard; it supports a lot of Ryzen CPUs and 32 GB of DDR4 3200 RAM, so I can still upgrade later.

 

 

I started to set my Rift up with this combo, with only one sensor installed, just to make sure I learn a little about it. It looks pixelated and needs to be set to higher resolutions; the 5500 obviously can't handle that, but tracking and viewing are smooth.

 

 

I decided to go VR because I just can't track a target visually with PoV hats and zoom.

 

 

I can go with the 1080ti because the low-cost solution exists and I have some headroom for overclocking my CPU.

 

 

Thanks for the tip anyway!

:thumbup:

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 


  • 2 weeks later...

I'm having a pause or stutter issue with my 1080. I just upgraded my Mobo, CPU, PSU, and RAM (Gigabyte Z390 Aorus Pro, i9-9900K, Rosewill 1000W, and 32GB of 3200MHz RAM, respectively).

 

This was in an attempt to finally eliminate this pausing (random, about 1 second long), but to no avail. I've got a high-end system (with the exception of the GPU), but still the same problem. I am running the 451-series Nvidia driver. Do you think going back to an older driver may be the answer? If so, how far do I need to go back?

System Specs:

AMD 5950X (liquid-cooled), Gigabyte Z590 Aorus Pro Motherboard, 32 GB RAM DDR4 3200MHz, Samsung Evo 970 Plus 2 TB, Seagate 2TB SSD, Geforce RTX 4080 GPU, Rosewill Glacier 1000W Power Supply, Thrustmaster Warthog HOTAS (Stick, Throttle), Thrustmaster TPR Rudder Pedals, NaturalPoint TrackIR 5 w/ProClip, (1) Vizio 40" 4K Monitor, TPLink Dual Band Wireless Card, Window 11 OS


I'm having a pause or stutter issue with my 1080. I just upgraded my Mobo, CPU, PSU, and RAM (Gigabyte Z390 Aorus Pro, i9-9900K, Rosewill 1000W, and 32GB of 3200MHz RAM, respectively).

 

This was in an attempt to finally eliminate this pausing (random, about 1 second long), but to no avail. I've got a high-end system (with the exception of the GPU), but still the same problem. I am running the 451-series Nvidia driver. Do you think going back to an older driver may be the answer? If so, how far do I need to go back?

Have you tried adding all the DCS folders as exceptions in Windows Defender or whatever antivirus you have?

The left Alt+Enter thing too?

(Many people are solving stutters on the latest OB with things like those.)


I'm having a pause or stutter issue with my 1080. I just upgraded my Mobo, CPU, PSU, and RAM (Gigabyte Z390 Aorus Pro, i9-9900K, Rosewill 1000W, and 32GB of 3200MHz RAM, respectively).

 

This was in an attempt to finally eliminate this pausing (random, about 1 second long), but to no avail. I've got a high-end system (with the exception of the GPU), but still the same problem. I am running the 451-series Nvidia driver. Do you think going back to an older driver may be the answer? If so, how far do I need to go back?

 

Interestingly, with my new build (below) I used the default driver Windows installed (Nvidia 432.00) and have never updated it; that's how well the default runs.

9700k @ stock , Aorus Pro Z390 wifi , 32gb 3200 mhz CL16 , 1tb EVO 970 , MSI RX 6800XT Gaming X TRIO , Seasonic Prime 850w Gold , Coolermaster H500m , Noctua NH-D15S , CH Pro throttle and T50CM2/WarBrD base on Foxxmounts , CH pedals , Reverb G2v2


I'm having a pause or stutter issue with my 1080. I just upgraded my Mobo, CPU, PSU, and RAM (Gigabyte Z390 Aorus Pro, i9-9900K, Rosewill 1000W, and 32GB of 3200MHz RAM, respectively).

 

This was in an attempt to finally eliminate this pausing (random, about 1 second long), but to no avail. I've got a high-end system (with the exception of the GPU), but still the same problem. I am running the 451-series Nvidia driver. Do you think going back to an older driver may be the answer? If so, how far do I need to go back?

 

 

is HPET disabled?


Have you tried adding all the DCS folders as exceptions in Windows Defender or whatever antivirus you have?

The left Alt+Enter thing too?

(Many people are solving stutters on the latest OB with things like those.)

Indeed, add exceptions in Malwarebytes, AVG, Avast, Windows Defender, etc... whatever you use. But do add scan exceptions for your DCS app install, plus your user's DCS (Saved Games) folder too! Heck, I took it a step further and added my Oculus app folder as well.
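
For Windows Defender specifically, the exclusions can also be scripted instead of clicked through; a minimal sketch (the paths are examples only, adjust them to your own install, and it needs an elevated prompt):

```python
# Minimal sketch: add Windows Defender scan exclusions via the built-in
# Add-MpPreference PowerShell cmdlet. Run from an elevated (admin) session.
# The paths are EXAMPLES - point them at your own DCS install and
# Saved Games folders (and, if you like, your Oculus app folder).
import subprocess

example_paths = [
    r"C:\Program Files\Eagle Dynamics\DCS World",   # example install location
    r"C:\Users\YourName\Saved Games\DCS",           # example user folder
]

for path in example_paths:
    subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         f"Add-MpPreference -ExclusionPath '{path}'"],
        check=True,
    )
```

This only covers Defender; Malwarebytes, AVG, Avast and the rest each have their own exclusion settings.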

MSI MAG Z790 Carbon, i9-13900k, NH-D15 cooler, 64 GB CL40 6000mhz RAM, MSI RTX4090, Yamaha 5.1 A/V Receiver, 4x 2TB Samsung 980 Pro NVMe, 1x 2TB Samsung 870 EVO SSD, Win 11 Pro, TM Warthog, Virpil WarBRD, MFG Crosswinds, 43" Samsung 4K TV, 21.5 Acer VT touchscreen, TrackIR, Varjo Aero, Wheel Stand Pro Super Warthog, Phanteks Enthoo Pro2 Full Tower Case, Seasonic GX-1200 ATX3 PSU, PointCTRL, Buttkicker 2, K-51 Helicopter Collective Control


Two speculative videos that I found interesting, for those that missed them:

 

 

(skip to 08:22 if you just want to hear about PC hardware)


Edited by LucShep

CGTC Caucasus retexture mod  |  A-10A cockpit retexture mod  |  Shadows reduced impact mod  |  DCS 2.5.6  (the best version for performance, VR or 2D)

aka Luke Marqs; call sign "Ducko"


Win10 Pro x64 | Intel i7 12700K (@5.1/5.0p + 3.9e) | 64GB DDR4 @3466 CL16 (Crucial Ballistix) | RTX 3090 24GB EVGA FTW3 Ultra | 2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue) | Corsair RMX 850W | Asus Z690 TUF+ D4 | TR PA120SE | Fractal Meshify C | M-Audio USB + Sennheiser HD-599SE | 7x USB 3.0 Hub | 50'' 4K Philips 7608/12 UHD TV (+Head Tracking) | HP Reverb G1 Pro (VR) | TM Warthog + Logitech X56 

 


:book: More rumours about the Nvidia "Ampere" RTX3000 Series...

 

Source: https://www.gpumag.com/nvidia-geforce-rtx-3000-series/

 

There have been leaks all over the place claiming that Nvidia is preparing an RTX 3090 card, and speculation has run rampant ever since. Many are saying that this is Nvidia's power move designed to show its dominance over AMD. Others are taking a different approach and claiming that Nvidia might step away from the Ti/Super suffixes and that this rumored RTX 3090 is simply what we believed to be the RTX 3080 Ti.

There is also speculation that the top GPU, which we consider to be the RTX 3090/3080 Ti, is, in fact, the next Titan RTX card. But for now, we’re going to stick with calling it the RTX 3080 Ti, as rumors about that name seem the most reliable.

 

The latest reports suggest that both the RTX 3080 and the RTX 3080 Ti will release on September 17, 2020. The RTX 3070 will release in October 2020, and the RTX 3060 in November 2020.

 

The latest leaks seemingly confirm the entire lineups’ prices:

 

  • RTX Titan 2 (GDDR6X 24GB?) - $2,000 (speculated)
  • RTX 3090/RTX 3080 Ti (GDDR6X 12GB?) - $1,399
  • RTX 3080 (GDDR6X 10GB?) - $799
  • RTX 3070 Ti (GDDR6X 8GB?) - $699 (speculated)
  • RTX 3070 (GDDR6 8GB?) - $599
  • RTX 3060 (GDDR6 8GB?) - $399

We believe these are pre-Big Navi prices and that, come RDNA 2's release, and especially come the holiday season, we might see these prices drop, first temporarily and then (depending on how good Big Navi is) permanently.

 

Related: AMD RDNA 2 Release Date, Price And Specs

 

The next generation of consumer GPUs will have the most unpredictable prices, and we might be lucky enough that AMD gives Nvidia a big blow with its RDNA 2 GPUs. That would result in a price war between them, which would be very good for consumers.

 

Specifications

Micron (a memory manufacturer) recently came out with a document suggesting that the RTX 3090 will be equipped with 21 Gbps GDDR6X memory. When GDDR5X first dropped, it was a significant improvement over GDDR5, but not quite a generational jump. If GDDR6X follows in the same steps, this could be important news. Another thing worth noting is that 21 Gbps memory speed itself, which is more than a 30% improvement over the 14 Gbps used by the previous generation's best representative, the RTX 2080 Ti.
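
For context, peak memory bandwidth is just the per-pin data rate times the bus width; a quick calculation (the 2080 Ti figures are known, while the 384-bit bus for the rumoured card is an assumption on my part):

```python
# Peak memory bandwidth = data rate (Gbps per pin) x bus width (bits) / 8.
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# RTX 2080 Ti: 14 Gbps GDDR6 on a 352-bit bus (known figures) -> ~616 GB/s.
print(mem_bandwidth_gb_s(14, 352))
# Rumoured 21 Gbps GDDR6X; the 384-bit bus here is an assumption -> ~1008 GB/s.
print(mem_bandwidth_gb_s(21, 384))
```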

Two GPUs showed up with 118 and 108 streaming multiprocessors (SMs), respectively. Given Nvidia's track record, this would equate to 7,552 CUDA cores in the former and 6,912 CUDA cores in the latter. This handily surpasses the RTX 2080 Ti's 4,352 CUDA cores and almost sounds too good to be true. These two are most likely professional-grade Quadro cards, but there are some hints that the RTX 3080 might be based on the lesser of the two.
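
The arithmetic behind those core counts is straightforward if you assume the 64 FP32 CUDA cores per SM that Nvidia's then-current parts used (the article's own reasoning, not a confirmed Ampere spec):

```python
# CUDA core counts implied by the leaked SM counts, assuming 64 FP32
# CUDA cores per SM (the ratio on Nvidia's then-current GPUs).
CUDA_CORES_PER_SM = 64

for sm_count in (118, 108):
    print(f"{sm_count} SMs -> {sm_count * CUDA_CORES_PER_SM} CUDA cores")

# For comparison, the RTX 2080 Ti has 68 SMs:
print(f"2080 Ti: {68 * CUDA_CORES_PER_SM} CUDA cores")
```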

The newest rumors surrounding the RTX 3070 and a (supposedly much later) RTX 3070 Ti suggest that the latter may use GDDR6X, while the former will stick with GDDR6. The latest specification speculation is that the 3070 will have 2,994 CUDA cores.

 

As we're getting closer to the release date, there will be more and more benchmark leaks. The latest leak is more concretely grounded in reality, as it relates to Time Spy Extreme benchmark scores in which the RTX 3080 shows roughly a 35% performance boost over the RTX 2080 Ti, while the RTX 3080 Ti/RTX 3090 offers 50% more speed compared to the same card.

This is especially interesting given the leaks coming from the red camp (i.e. AMD) that suggest their flagship RDNA 2 card will be 40-50% faster than the RTX 2080 Ti.

 

As for power consumption, the latest rumors concern the probable 12-pin power connector required for the RTX 3080. Speculators are split on the possible ramifications, with some saying a whole new connector would be needed and others saying that two 6-pin power connectors will do just fine.

It has been confirmed that the new 12-pin connector will only be present in RTX 3090/RTX 3080 Ti Founders Edition.

There have also been some PCB leaks (non-FE, from AIB partners), and one of the most interesting things to note is that there are three 8-pin power connectors, which definitely points towards high power consumption.

One thing that is certain is that these GPUs will require a PSU with at least 650W, which means that some people will have the additional hassle of upgrading their PSU.
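
As a rough guide to why the connector count matters: the standard PCIe power limits are 75W from the slot, 75W per 6-pin and 150W per 8-pin (the new 12-pin's rating wasn't public at this point, so it's left out of the sketch below).

```python
# Rough board-power ceiling implied by a card's power connectors,
# using the standard PCIe limits: 75W slot, 75W per 6-pin, 150W per 8-pin.
# The rumoured 12-pin connector is omitted because its rating is unconfirmed.
def max_board_power_w(pins_6: int = 0, pins_8: int = 0, slot_w: int = 75) -> int:
    return slot_w + pins_6 * 75 + pins_8 * 150

print(max_board_power_w(pins_8=2))   # typical 2x 8-pin card: up to 375W
print(max_board_power_w(pins_8=3))   # the leaked 3x 8-pin AIB boards: up to 525W
```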

 

NVCache

The first rumors about NVCache started back in May, and as time goes on they appear more and more concrete. On paper, this looks to be the push that will launch the concept into the mainstream.

And that concept is something else. The idea is that the GPU will dynamically utilize the bandwidth from system RAM, VRAM, and SSD to execute multiple tasks simultaneously at a much higher speed.

This is presumably a response to Playstation 5’s custom memory solution that is said to be able to increase the loading speed a hundredfold (compared to PS4). Of course, those are Sony’s claims, but we’ll have to see both how the console and the PC market will handle this technology jump.

 

Tensor Memory Compression

This tech will apparently use tensor cores for compression and decompression of items stored in VRAM. Estimates suggest that this application of tensor cores could lead to anywhere between 20% and 40% less VRAM usage.
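
Put in concrete terms (the 20-40% range is the rumour; the 10 GB workload is just an example figure of mine):

```python
# What 20-40% lower VRAM usage would mean for an example 10 GB workload.
workload_gb = 10.0

for savings in (0.20, 0.40):
    used = workload_gb * (1 - savings)
    print(f"{savings:.0%} savings: {workload_gb:.0f} GB workload "
          f"-> {used:.1f} GB of VRAM actually used")
```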

 

DLSS 3.0

Another rumored improvement is DLSS 3.0, which aims to let game developers utilize the feature more easily. It's key to note that DLSS 3.0 won't work in all games, as it needs to be manually enabled by the developers, but the rumors suggest it will be more accessible for them.


Edited by LucShep

CGTC Caucasus retexture mod  |  A-10A cockpit retexture mod  |  Shadows reduced impact mod  |  DCS 2.5.6  (the best version for performance, VR or 2D)

aka Luke Marqs; call sign "Ducko"


Win10 Pro x64 | Intel i7 12700K (@5.1/5.0p + 3.9e) | 64GB DDR4 @3466 CL16 (Crucial Ballistix) | RTX 3090 24GB EVGA FTW3 Ultra | 2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue) | Corsair RMX 850W | Asus Z690 TUF+ D4 | TR PA120SE | Fractal Meshify C | M-Audio USB + Sennheiser HD-599SE | 7x USB 3.0 Hub | 50'' 4K Philips 7608/12 UHD TV (+Head Tracking) | HP Reverb G1 Pro (VR) | TM Warthog + Logitech X56 

 


Latest I've heard is that the 3090 is the replacement for the Titan.

 

I have seen zero reference to a 3080Ti.

 

The 3080 is what I'm eyeing to replace my 2070 Super, but I still fully intend to wait and see what AMD has coming in Oct-Nov. Rumours say 12GB of VRAM; I'm hoping for more and will see what AMD does.

 

Interesting times.

