
Radeon VII, what do you guys think ?



Oh, I didn't mean to be pushy or something, mate. Just from my point of view, Nvidia's DLSS is sort of limited, especially in regard to resolution: as you've found yourself, it's limited to 4K only. AMD says its DirectML alternative is not, so it would benefit users across all resolutions and, who knows, maybe even in VR. And it would benefit RTX cards just as much as any other GPUs.

 

I agree that Tensor cores are not going anywhere. Neither are RT cores. Both are mandatory for Ray Tracing.

 

It was only limited to 4K in FFXIV, and I'm guessing that's mostly due to the amount of time required to train the neural network, and also wanting to get a public test base ASAP. Perhaps also a state of not knowing the future of the game, as the head guy for FFXIV resigned. 3DMark Port Royal allows for testing RT and DLSS at 1920x1080, 2560x1440, and 4K. Albeit, framerates were low as crap @4K due to the amount of ray tracing: like 15-17 fps without DLSS to 30-35 fps with DLSS.

 

 

With the way it worked in FFXIV, allowing for enabling it at 5160x2160, I'm quite certain that as long as it's implemented in a game at 2560x1440, it will work for me at 3440x1440.

 

[Screenshot attachment: PRDLSS.PNG]

 

2560x1440 DLSS results; bear in mind there's a lot of ray tracing going on in the benchmark, with some lovely shadows and reflections. I don't think most game devs will take it this far, nor do you have to have RT enabled to use DLSS outside of this benchmark, but that's almost a 50% improvement at 2560x1440.
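For anyone who wants to sanity-check the uplift, the arithmetic is trivial (a quick Python sketch; the fps figures are just the rough ranges from my runs above, nothing official):

```python
def percent_gain(fps_off: float, fps_on: float) -> float:
    """Percentage frame-rate improvement from enabling DLSS."""
    return (fps_on - fps_off) / fps_off * 100.0

# Port Royal @4K: ~15-17 fps without DLSS vs ~30-35 fps with it.
# Using the midpoints of those ranges:
print(round(percent_gain(16, 32.5)))  # roughly a 2x uplift at 4K
```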

 

[Screenshot attachment: 1440dlsss.PNG]

 

Frankly, with the 2060 being on par with the GTX 1080, the 2070 and up seem like too much for anything below 2560x1440, at least in DX11 or earlier.

 

Nvidia has mentioned VR in regard to DLSS in interviews, but only briefly, so I'm inclined to think they'll eventually get to a point where they can offer it over a broader range of resolutions to support the various headsets and supersampling levels. We'll just have to wait and see. And there's the plus of NGX being able to work with all versions of DirectX and likely Vulkan.

 

But as it stands, it's still in the early stages. It's going to be the game developers that make or break the tech.

 

I didn't feel a hint of pushiness btw so no foul there. I've just been keeping my eye on DLSS since before I bought my card, and it's grown into excitement overall for the future of the NGX SDK. ;) I could live without raytracing, but I want to see where machine learning can take us in gaming.

 

*edit* Just an afterthought: I'm not saying a 20-series card should be in everyone's immediate future. I don't tell people what to buy so much as share personal experience regarding options. Throughout 2019 and a bit onward, performance measured by traditional methods should be adequate.. and, as you say, AMD is working on their counter to AI-enhanced graphics. We're just getting our first tastes throughout 2019.


Edited by Headwarp
Spoiler

Win 11 Pro, z790 i9 13900k, RTX 4090, 64GB DDR 6400, OS and DCS are on separate PCIe 4.0 drives 

Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles.   Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener.   Obutto R3volution gaming pit.  

 


Nvidia will rethink DLSS and XYZ when they take a 2nd look at their market value.

 

LoL, real business is not DLSS or RTX, it's SELLING CARDS to the masses, and that, Nvidia forgot.

 

I am very confident my next GPU has AMD written on it, despite my Gsync screen.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


Really curious how this baby will handle VR in DCS. A lot of 1080p tests have shown slightly less performant results than the 2080. But I just read a 5K test where the R7 was better than the 2080 in most cases.

 

VR, especially in MP, uses all 8GB of VRAM on my 1080. So the 16GB of VRAM on the R7 seems like a good option.

 

I think right now it's a bit too pricey. AMD should sell it for 600€. That would be a statement to Nvidia, and a lot more people would be convinced to buy AMD again.

GeForce RTX 4090 Founders Edition - AMD Ryzen 7 5800X3D - 64Gb RAM - Win11 - HP Reverb G1 - Thrustmaster Warthog HOTAS (40cm extension) - VKB Sim T-Rudder MKIV Pedals


Nvidia will rethink DLSS and XYZ when they take a 2nd look at their market value.

 

LoL, real business is not DLSS or RTX, it's SELLING CARDS to the masses, and that, Nvidia forgot.

 

I am very confident my next GPU has AMD written on it, despite my Gsync screen.

 

We'll see. BFV is getting DLSS tomorrow apparently, Anthem is touted to release soon and will be getting DLSS support as well, and according to Nvidia, RT+DLSS is going to be very close to equal with the performance of both features off. I can only imagine that means DLSS without RT is going to equate to a performance increase overall.. with a 2060 supposedly able to claim 90fps @2560x1440 without either feature as it stands. And that leaves the thought of DLSS 2x without any ray tracing, which afaict doesn't involve any upscaling. I'm very curious how that compares to current AA methods in both quality and performance.

 

If the image quality is acceptable and it provides a significant performance increase.. I'd have to say developers not moving to support such a thing is kind of stubborn. I'm looking at you, MSAA performance hit with deferred rendering. *cough cough* This is undoubtedly going to be a thing with future generations of Nvidia GPUs. Hopefully all goes well with the BFV patch tomorrow and I can share my experiences with that.

 

And that's not an argument against you buying an AMD card in the future Bitmaster. Innovation is risky. I'm content taking part in the first steps of a new tech being utilized in the gaming world, and the 2080Ti still has the cuda count for games that won't utilize tensor/RT cores. It's been a long time since hardware really tried to change things up to this level.

 

It's good that AMD is catching up after a slump of nvidia dominating the gaming gpu market, and this time with a reasonable pricetag. Despite the shifty launch from nvidia and the high pricetags while they felt they could get away with it.. they are laying groundwork that could change gaming as we know it at the level of development, and even streamline things like lighting and shadows from a database of real world physics. And their low range offerings right now aren't exactly horrible. If I were on a tight budget a 2060 doesn't look too bad at all. And I'm all for AMD bringing more bang for the buck, causing nvidia to rethink pricing and counter. We'll see.

 

Personally.. I'm just thinking PC gaming is going to get pretty yummy throughout the next decade. At which point I might just be too old for it lol.


Edited by Headwarp

 


I didn't previously pay that much attention to the DLSS details, but it appears there are serious quality drawbacks in that tech, which basically result in lower picture quality in exchange for better performance:

 

 

Plus it's introducing shimmering and jagged lines. All in all, I can tune graphics myself.

AMD Ryzen 5900X @ 4.95 Ghz / Asus Crosshair VII X470 / 32 GB DDR4 3600 Mhz Cl16 / Radeon 6800XT / Samsung 960 EVO M.2 SSD / Creative SoundBlaster AE-9 / HP Reverb G2 / VIRPIL T-50CM /
Thrustmaster TPR Pendular Rudder Pedals / Audio Technica ATH-MSR7


I didn't previously pay that much attention to the DLSS details, but it appears there are serious quality drawbacks in that tech, which basically result in lower picture quality in exchange for better performance:

 

 

Plus it's introducing shimmering and jagged lines. All in all, I can tune graphics myself.

 

 

I'm going to get pretty hands on with it when the BFV patch comes.

 

In the FFXIV benchmark at 4K, TAA seems perhaps slightly sharper if you look really hard, but the difference between it and DLSS seems negligible to me just looking at the edges of the car model. I mean, I really had to look to see the difference, and typical gameplay doesn't usually involve such inspection. For comparison, the same benchmark using FXAA produced immediately noticeable jaggies and shimmers along the edges of various surfaces of the car model. But 4K on my monitor is more like 2560x1440 + DSR, so hopefully BFV allows for DLSS at my native resolution, where I'll have a better ability to gauge. Minimum framerates went from 60 to 80.. which I'm not scoffing at for 4K resolution.

 

So BFV has a resolution slider, which I just played with a bit in regard to the video's mention of 1800p being upscaled to 4K via DLSS.

The more you downscale said resolution, the more you begin to notice jagged edges. I lowered it to 75% just to play around with ray tracing at acceptable performance levels, and ultimately didn't like the experience. But doing some math: 1800p is 3200x1800, and 3200 is 83.3% of 3840 while 1800 is 83.3% of 2160. So I dove in, took to the practice range, and there is a visible difference between 83% and 100% of my native resolution. It's hard to describe; 17% less resolution again requires inspection to notice, and I go from like 140-168 fps to 160-188 fps (DX12, no ray tracing). The difference in this case was more noticeable than my experiences between TAA and DLSS in the FFXIV benchmark, but 'tis a different game running at a different aspect ratio and resolution. I guess what it boils down to is how well the AI neural network handles anti-aliasing for the upscaled image to retain the performance of said lower resolution, and how comparable to TAA it is.
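Worth noting the slider percentage applies per axis, so the actual number of pixels rendered drops by more than the slider suggests. A quick Python check on the resolutions above (plain arithmetic, no assumptions):

```python
native = (3840, 2160)   # 4K
scaled = (3200, 1800)   # 1800p, i.e. the ~83% slider setting

axis_scale = scaled[0] / native[0]                           # per-axis scale
pixel_scale = (scaled[0] * scaled[1]) / (native[0] * native[1])  # total pixels

print(f"per axis: {axis_scale:.1%}")          # 83.3%
print(f"pixels rendered: {pixel_scale:.1%}")  # only 69.4% of native
```

So an "83%" render is really rendering about 30% fewer pixels, which lines up with the fps gain being larger than the slider number implies.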

 

I think it'll also be good to measure the performance of DLSS 2x, and I hope it's an option in BFV, as it will basically be applying the effects of DLSS to my native resolution; that should tell us what NGX and the tensor cores are capable of in terms of taking the AA load off the back of the streaming processors.

 

Nvidia claims at least 64 samples per rendered pixel for DLSS. In DCS we currently have the option of 1.5 to 2 samples per rendered pixel. Something like DSR, which renders more pixels rather than taking more samples per pixel but has a similar effect on performance, goes up to 4x.
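Putting those sampling figures side by side (this is my paraphrase of the claims above, not measured data, and the labels are my own framing):

```python
claims = {
    "DLSS training":  64.0,  # nvidia: "at least 64 samples per rendered pixel"
    "DCS SSAA max":    2.0,  # DCS's in-game option tops out around 2
    "DSR max factor":  4.0,  # DSR renders up to 4x the pixels instead
}

baseline = claims["DCS SSAA max"]
for name, samples in claims.items():
    # Express each figure relative to the best DCS currently offers
    print(f"{name}: {samples:g} ({samples / baseline:g}x the DCS max)")
```

Even DSR's maximum is a factor of 16 short of what the DLSS network is supposedly trained against, which is the whole pitch: do the expensive sampling offline, once.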

 

https://developer.nvidia.com/rtx/ngx

 

 

So I'll give that an honest shake with the upcoming BFV patch as I play with DLSS and share some hands on experience rather than continue to speculate.

 

 

Edit* My apologies in advance to forum mods - I'm more interested in the anti-aliasing specific features than I am the titles mentioned.

 

**another edit - I just wanted to state clearly, I'm glad AMD still has a dog in this fight. I don't think any of us were expecting this kind of performance from the next amd card.

 

RTX and DLSS were not the selling point of the 2080Ti for me. I'd held out long enough, and knew I'd be making a purchase when Nvidia launched their next line. I spent the previous year watching 1080Tis sell out and be listed for more than I paid for my 2080Ti. I'd held out long enough with a 980Ti @3440x1440, and recently picked up a VR headset that wanted more juice as well. At the same time I kept hearing about deep learning AI for professional use and I'm like "so.. what use does this thing have in gaming?" I bought the fastest option I could, and while I was salty at the pricetag, I've recovered from the expense and am loving the crap out of my GPU. DLSS is kind of just an added surprise, and MSAA performance ever since DCS went to deferred rendering IMO has been asking for a solution.

 

Personally, if AMD counters DLSS and both companies have their own version of machine-learning anti-aliasing, and if it offers an improvement over MSAA, I'd hope both could find their way into the sim. Proprietary tech or not, an fps boost is an fps boost; it's not ED's fault if one company's tech causes a player base to gravitate towards a brand. That's up to the brand and what they bring to the competition. I get not supporting Hairworks, Turfworks, and silly stuff like that. But when talking image quality and performance provided by a feature said company will do the brunt of the work to implement.. YES PLEASE. Get on that "will support" list and be seen by every person who clicks on Nvidia articles. I'll create my own thread for discussing DLSS, pending the chance to experience it. But that plea really only stands if it turns out to be good.

 

So pardon my interest in what AI-enhanced graphics cards can bring to the table, and I'm sorry for that turning into the focus of my posts in this thread. I'm all for AMD sticking it to Intel and Nvidia, even though I'm set on hardware for several years. There's always something better around the corner. My 8700K was outmatched by the 8086K in 9 months, and by 9th gen after a year. It's an endless dance I don't think too much about as long as my needs are covered.


Edited by Headwarp

 


I have a Vega 64 and I'm playing DCS on High/Ultra across all settings at 1440p with no issues at all. I'm getting fps capped to my monitor's refresh rate of 75Hz while in flight. On the ground it's worse, where I see dips down to 40-50 fps on busy missions, but as I use Freesync it does not bother me. I think I'm mainly CPU-bound in DCS; the i7 4790K is starting to show its age even though it's a very good CPU, and running a lot of things in the background is noticeable... like TrackIR, all sorts of game store launchers, Chrome, etc.

 

So if Radeon VII is a pure improvement in performance, power efficiency, and memory bandwidth, it's a pure win for me. Playing with voltage on Vega is great, and it will be even better on the VII.

Do, or do not, there is no try.

--------------------------------------------------------

Sapphire Nitro+ Rx Vega 64, i7 4790K ... etc. etc.


Welp.. I don't need my own thread for DLSS based on my experiences with BFV. lol. It's pretty unimpressive.

 

For some reason they locked it to only work with ray tracing. And in some instances, lowering my resolution to 83% without DLSS enabled actually looked slightly better, with pretty equal performance if not slightly better. Both looked worse than my native resolution. But pretty much in line with the dude's thoughts from the video I responded to in my previous post.

 

This still leaves the question of DLSS 2x and the effects it might have on traditional performance and image quality. And still no way to test that.

 

But frankly.. so far, even with both ray tracing and DLSS, they should not be a reason for buying a 20-series GPU. I do see the 2060/2070 as decent mid-range options for now, but not for RTX features. I'll quit bugging you guys and ED with the thought of DLSS unless Nvidia actually impresses me in this regard. lol.

 

Nvidia's got some work ahead of them .. and while the ray-tracing thing is honestly kind of impressive from the perspective of the fact that it was "impossible" before now, and the reflections and shadows in a not so highly ray-traced game do show potential for something kind of awesome in the future.. it's going to take future hardware and cooperation from the development community to take it to a standard. I've stated in other threads that I see the 20 series as kind of a hybrid, with enough CUDA cores and speed to keep me satisfied with the performance of the 2080Ti for some time to come..and a taste of potential tech.

 

Benchmarks I'm seeing put the Radeon VII quite close to the 2080 and 1080Ti performance-wise, neck and neck @4K in some titles. The 2080Ti is still kind of holding on to the performance lead, but for $500-600 more? Ehhh, subjective. Not for most people. 3440x1440 or 4K? Ehhhhhh, well, you're obviously not shy about throwing money at hardware. Mayyybe, if you're unsatisfied with the performance of what you're using now and aren't starving anyone.

 

I saw some things watching benchmarks that might indicate the need for driver optimization on AMD's part, and they might get more performance out of it yet. But the Radeon VII is rivaling the 1080Ti and 2080 very closely at the moment. That being said, looking at Newegg I'm seeing RTX 2080s for $699.

 

It's going to be very interesting to see what AMD does in 2019. Seems Nvidia, in trying to prep the world for ray tracing, has allowed AMD to catch up.. and I'm currently way more impressed with that than with RTX features lol. Especially given that they're putting it to Intel as well with the upcoming Ryzen 3000.


Edited by Headwarp

 


Ray tracing will not see any major adoption until consoles support it, which requires AMD and Microsoft/Sony cooperation. From what I've seen so far, the main goal of the new consoles is 4K at 30-60fps; I do not see how they could combine that with ray tracing in mind.

 

It will probably take the next generation (not a refresh) of GPUs to handle that comfortably at 1440p. I'm all for it, ray tracing is the future of photorealistic graphics, but it will take time, and it will mainly be good only for some games; you probably do not want RT in everything...



Ray tracing will not see any major adoption until consoles support it, which requires AMD and Microsoft/Sony cooperation. From what I've seen so far, the main goal of the new consoles is 4K at 30-60fps; I do not see how they could combine that with ray tracing in mind.

 

It will probably take the next generation (not a refresh) of GPUs to handle that comfortably at 1440p. I'm all for it, ray tracing is the future of photorealistic graphics, but it will take time, and it will mainly be good only for some games; you probably do not want RT in everything...

 

I could see it adding to most of the games I play, but yeah, with future hardware. There are definitely a few where it wouldn't matter at all.

 

Maybe 5-10 years from now. But ray-tracing performance is pretty much what I expected it to be after the launch announcement. We're not there yet by any means imo. Still.. more titles adopting the tech are coming, so we'll see if anybody else makes it work better. I think that's going to take more of a ground-up approach though, and still, future hardware.


 


perhaps the best Radeon VII review so far:

 

[YouTube video: 7j2PFKkYrVg]


My PC specs below:

Case: Corsair 400C

PSU: SEASONIC SS-760XP2 760W Platinum

CPU: AMD RYZEN 3900X (12C/24T)

RAM: 32 GB 4266Mhz (two 2x8 kits) of trident Z RGB @3600Mhz CL 14 CR=1T

MOBO: ASUS CROSSHAIR HERO VI AM4

GFX: GTX 1080Ti MSI Gaming X

Cooler: NXZT Kraken X62 280mm AIO

Storage: Samsung 960 EVO 1TB M.2+6GB WD 6Gb red

HOTAS: Thrustmaster Warthog + CH pro pedals

Monitor: Gigabyte AORUS AD27QD Freesync HDR400 1440P

 


perhaps the best Radeon VII review so far:

 

[YouTube video: 7j2PFKkYrVg]

 

 

He is spot on about the AMD drivers. They are SO GOOD. When I switched from nvidia to AMD, it was the first thing that made me so satisfied with the switch.

 

Also, how easy it is now to undervolt the cards with the new drivers is just mind-blowing.



He is spot on about the AMD drivers. They are SO GOOD. When I switched from nvidia to AMD, it was the first thing that made me so satisfied with the switch.

 

Also, how easy it is now to undervolt the cards with the new drivers is just mind-blowing.

 

They must have hired a new team. IMHO, AMD drivers took forever to mature and sucked long after the card was released.

ASUS ROG Maximus VIII Hero, i7-6700K, Noctua NH-D14 Cooler, Crucial 32GB DDR4 2133, Samsung 950 Pro NVMe 256GB, Samsung EVO 250GB & 500GB SSD, 2TB Caviar Black, Zotac GTX 1080 AMP! Extreme 8GB, Corsair HX1000i, Phillips BDM4065UC 40" 4k monitor, VX2258 TouchScreen, TIR 5 w/ProClip, TM Warthog, VKB Gladiator Pro, Saitek X56, et. al., MFG Crosswind Pedals #1199, VolairSim Pit, Rift CV1 :thumbup:


They must have hired a new team. IMHO, AMD drivers took forever to mature and sucked long after the card was released.

 

 

No longer true. I've had the Vega for a year now, and I've had only one bad experience, where a clean installation of the drivers solved the problem. The drivers are really good now; they are always ready for new games, with new releases every month. The feature set and quality-of-life features are night and day compared to nvidia.

 

Compare that to my experience with nvidia drivers when I had a 980Ti: I bricked my computer 3 times before I just stopped installing new drivers altogether and opted only for those that I really needed. Maybe I joined the nvidia club in a bad year, I do not know, but it was awful. I had an AMD graphics card before that and had no problems, but at that time I was not installing the official drivers but the Omega package, which does not exist anymore if I'm not mistaken.



I second that. I had a Fury before this 1080Ti and my driver experience was very good. Quite frankly, after switching to nvidia I've considered the driver side of things a downgrade.

 

My 1080Ti warranty will run out in June. If only the Radeon 7 were a bit faster and had HDMI 2.1, I'd gladly dump the 1080Ti and buy it.



It's a sidegrade between the two. Better to wait for the next gen.


 


It's a sidegrade between the two. Better to wait for the next gen.

 

I know, but as I've said, my warranty is running out, so if anything happens to the GPU I will have to resort to buying a new one anyway. And yes, it is a sidegrade. It is disappointing that my 2-year-old GPU delivers more punch than the new RTX 2080 and has more VRAM, so for at least another year or more we're still going to have performance at the same level as we have had for the past 2+ years. I don't consider the RTX 2080Ti to be a viable option due to its extreme pricing; the offered performance improvement is not worth the added cost compared to the 1080Ti. So yes, Radeon VII is a sidegrade, unfortunately.



I would consider buying the Radeon VII only once non-reference cards are released. Or once water blocks are available. The reference cooler is not good.

 

I bought my Vega from Sapphire, and it was the best possible choice as people suggested; the cooler is just so good. All the other brands like Gigabyte, Asus, etc. had problems with non-reference cards, where the coolers were not designed well and had hot spots that could kill the cards if you were unlucky, something similar to what the new 2000 series from nvidia suffers from.



Oh yes, if I were to buy it, I'd intend to swap the cooler for a waterblock once any decent one arrives.

 

As for the second part: that's because they used their coolers previously designed for nvidia GPUs. So no wonder Asus or Gigabyte have issues.



Seems like recent drivers fixed overclocking, or at least made it easier. GPU default clock speeds are around 1750-1800 MHz for the core, and I've been seeing overclocking scores on a few forums (and now video) of 2050-2150 MHz. Once this 7nm process matures, it seems it will give a lot of room for OC.
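Back-of-the-envelope, those clocks imply a decent headroom range (a Python sketch; the stock and OC figures are the forum-reported numbers quoted above, not measurements from my own card):

```python
stock_mhz = (1750, 1800)  # reported default core clock range
oc_mhz = (2050, 2150)     # forum-reported overclock range

# Worst case: lowest OC vs highest stock; best case: highest OC vs lowest stock
worst = (oc_mhz[0] - stock_mhz[1]) / stock_mhz[1] * 100
best = (oc_mhz[1] - stock_mhz[0]) / stock_mhz[0] * 100
print(f"headroom roughly {worst:.0f}%-{best:.0f}%")
```

So somewhere in the mid-teens to low-twenties percent over stock, if those reports hold up.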

 



Hi guys,

 

 

Wanted to post a first glance of Radeon 7 Performance on my PC.

 

 

Ryzen 1700X @ 3800MHz, 1.35V

32 GB G.Skill 3200 CL14

R7 at 1901MHz/1018mV (just playing around with the slider for some minutes)

 

 

 

Just loaded the Su-27 ramp up training mission and hopped into one of the Su-27s flying around. The first screen I think was 58 and the second 46 or so; don't know why these screens look so blurry. Just have no time today for better ones, and the real fun won't start before this baby runs on water.

 

 

I can't say if it's good or not, so tell me please what you think.

gratz

[Screenshot attachments: Bild2.jpg, Bild3.jpg]


Hard to judge; by that picture you have 49 FPS, but that's an external view. Now jump into the pit and bench it. Better even, try flying online. You're running the game at 4K, so having around 50-60 FPS is optimal with the current generation of monitors. However, I wonder if by cutting shadows down to flat and removing motion blur and lens flare you can get more than 60, hopefully closer to 100. And whether additional tweaking in the Radeon control panel has any impact on DCS.

 

I have a 2700X running at 4.4 GHz and faster RAM than you, so I wonder how much that would benefit FPS.



4K combined with 2xAA ... those are good results. :)

 

Do you really need AA at 4K though? What screen size do you have?



I use 1440p with MSAA x2 and SSAA x1.5 and it works smoothly, between 50 and 80 FPS. And with a 35" panel I feel like a bit of AA is needed, so maybe at 4K as well?


