Everything posted by Waxer

  1. JayzTwoCents has a video where he takes the backplate off the XC3 to show the cheaper capacitors. The solution is likely a software patch to lower the allowed overclock bins. Pretty confident the higher specification FTW3s would not repeat the same mistake. Edit: the XC3 uses 5 of the cheaper caps, not 6, with 1 bank of the more expensive mini caps (not 2, not 0). And EVGA's position is that for the XC3 this is "within specifications", i.e. don't expect stability if you push the overclock. FTW3: EVGA re-engineered the card to get rid of two of the cheaper caps and install two banks of the many smaller caps. This explains the shipping delay at launch, but the card should be good for retail release.
  2. Never mind: [interesting. Please share details: specifically which AIBs?] For people interested in the details, look at the latest Igor's Lab or JayzTwoCents videos. Low-end EVGA, Zotac and Gigabyte boards are all using lower specification capacitor arrays. This may not be the case (hope not) on more expensive boards.
  3. Sure, sure. But the OP made an unsubstantiated comment. He was told he was wrong. He doubled down and offered proof. So yeah... I'd like to see nessuno0505's data. I'll laugh if he links to the pre-release leaker that was showing game benchmark results extrapolated from synthetic benchmarks. But yeah... let's see what you got nessuno0505.
  4. Right. Somewhere else on this forum - can't remember where - people were insisting that their mid-range 650W, 750W and 850W PSUs would definitely be sufficient. What these people don't seem to understand is that hitting the overall rating - let's take a typical nice mid-range PSU at 750W - is NOT THE ONLY ISSUE. Far more fundamental is that these PSUs have detailed specifications limiting the power draw on individual rails. Even if total power draw is within the PSU's overall rating, peak current on a single +12V rail can still exceed that rail's specified maximum (see the rail-current sketch at the end of this list). What happens then? A system crash, artefacts, or an unexpectedly slow, unstable system.

     GDDR6, used on Turing cards and many earlier generations of Nvidia GPUs, is not error correcting: if the GPU asks for more current and it is not supplied quickly enough, non-error-correcting memory will often show artefacts on screen before the system ultimately crashes. GDDR6X, used on Ampere, is error correcting, so if the GPU detects memory inconsistencies it tries to resolve them by redoing calculations. Hence before a crash your system might just run slowly, and only if the GPU really can't figure out what is going on will it crash. Igor's Lab has done testing showing the GDDR6X on the Founders Edition 3080, with backplate, running at around 100 deg C. The memory is rated to 105 deg C, so that is within specification... but it is awfully close. For that reason - while Micron produce GDDR6X capable of running at 21Gbps - Nvidia chose 19Gbps rated GDDR6X for the 3080 and 3090. If they fitted the higher clocked parts they would have bigger problems cooling them and the memory controller of their 8nm Samsung GPU die.

     So on top of all the usual reasons for RMAs (manufacturing defects, component failures etc.) we have two additional sources from the design compromises resulting from Nvidia's choice of Samsung 8nm and GDDR6X: 1) high system power draw, causing instability on systems with insufficiently powerful (or old, degraded) PSUs, and 2) crashes caused by system components - very often the memory - getting too hot. Note that electronic components wear out (electromigration), and this happens faster when silicon runs close to its maximum temperature for long periods. Hence the importance of 3-year-plus warranties and an AIB service centre with customer-friendly turnaround.

     I am not saying don't get Ampere... I might get one myself. But I am saying buyer beware. And AMD's offering this season deserves a look, as it uses a more power-efficient TSMC node for the GPU and GDDR6, which is likely to be easier to cool. This is why Nvidia's Ampere Quadro cards use the "inferior" GDDR6, not 6X. "Moore's Law is Dead" did a good video on this for people interested in the detail. You might not care about a melting polar ice cap - maybe you do, maybe you don't - but I do expect you care whether your GPU works and keeps working over its lifecycle.
  5. I've seen that claim and it is just not credible. Either some settings have been changed, or there is a CPU or memory bottleneck.
  6. No. No. No. More like 10% in the best case scenario of 4K gaming; 18% only in benchmarking. It is all in the thread already, if you bother to read it. (Or read or listen to the numerous reviews out there.)
  7. Crikey... that sounds like an advert for the 3090. :megalol: We really need the fps number for a 3080 in the same test conditions to put that in greater perspective.
  8. I am thinking along the same lines as you guys. My intention is to wait until October and see what the 6900 XT is like. While I think it will be good, I am not expecting it to surpass the 3080, except in memory capacity. There might be specific titles where it outperforms, but I doubt it will be even a marginal win, let alone a significant win overall. It might be a bit cheaper, but even if we are talking about a c.$100 difference I would rather get the higher performing card. AMD does have ray tracing (not that I care), but its DLSS equivalent is not as powerful as DLSS 2.0 yet, so big advantage to Nvidia there. And overall Nvidia's software support is stronger. This might well change over the years with software development done primarily for the new consoles, but by the time that has happened it will be GPU upgrade time again. The 3080 / 20 will be interesting for sure, but if the 6900 XT does not clearly beat the 3080 / 10 then I imagine Nvidia will be quite greedy on the pricing of that card, especially considering how high the demand was at the 3080 / 10's launch. Meanwhile I am not yet convinced that 20GB of GDDR6X or 16GB of GDDR6 is necessary for DCS at 4K. 10GB will probably be fine. Even if you find yourself hitting GPU memory limits, all it takes is lowering one or two settings and you are off to the races. I will probably get a 3080 / 10: either the beautifully designed Founders Edition, a Strix or an FTW3. But I still want to wait and see what the 6900 XT is like. ... besides... I wasn't as fast as the bots on launch day!
  9. Screen res: I actually use a 5K monitor, so about 15 million pixels compared to a 4K monitor's 8 million (see the pixel-count sketch at the end of this list). So yeah, I am interested in 1440p --> 4K performance because it gives me a taste of what to expect at 5K. (And this explains why I get peeved with people talking about CPU bottlenecks who refuse to accept that someone else with a different system could be hitting a GPU bottleneck.) CPU: on the comment about "not much you can do to improve CPU performance" - assuming you are already using a Comet Lake at 5.0-5.3 GHz, yes and no. Clearly if you have such a rig then you will be using a relatively fast, low latency memory kit such as 3200 CL14 or 3600 CL16. But I've seen some people get a few extra per cent of performance with a 10xxx series chip by using expensive Samsung B-die overclocking memory and tuning the timings carefully. Here is a link to one such guy who has managed to squeeze a little extra performance from his system. Note that he is using an Asus Z490 Apex motherboard. (The other option would be the unobtanium EVGA Z490 Dark). https://imgur.com/a/OkMl9sN Results from memory testing on an earlier build of his: https://kingfaris.co.uk/ram/15
  10. We've got some folks in the community upgrading from the 2080 Ti, so rather than speculate about DCS by extrapolating from MSFS 2020 (they are quite different executions of a flight simulator, and MSFS is also very early in release and will no doubt get lots of optimisations in the future), let's just wait a little to get feedback from the early adopters within our community. And yeah... I have no patience for a CPU bottleneck discussion. It has been done to death. Yes, we know it is important, and it could be an issue for some people with some setups. But let's wait and see people's actual experience.
  11. Wasn't much luck for the poor soul having a heart attack either. Hope you were able to help them. :thumbup:
  12. The testing shows the 3090 is at most 18% better than the 3080 at 4K, and some 4K results show as little as 4%. The typical / average result is 8-11%, so Nvidia are not flat out lying, but erring on the side of generosity to their new product, as you would expect. Some people just want the best because they can. (They make money, and they don't spend it on racing yachts or private jets etc.) And I don't think you should overlook the fantastic Founders Edition cooler: using 3 slots, I expect it to perform very favourably against the AIBs. The 3080 Founders Edition cooler suffered in comparison to the AIBs as it was only 2 slots, against two and a half to three slots on the AIB cards, so for a 2 slot solution it was punching above its weight. Anyway, I would much rather have a 3090 at RRP than overpay for a 3080 right now.
  13. I had a chat with a very nice / helpful sales guy at one of the stores. He basically told me that they had 3080s (the conversation was not about the 3090) in stock currently. However, because "they didn't make much money from the GPUs", all of the 3080s they had were reserved for their full system builds, where overall sales and margins are a lot higher. They expect to be freeing up GPU stock for self builds / upgrades in October. I doubt many system builds will include 3090s, and I imagine margins are a little better, so maybe the retailers are more willing to just pass these on without keeping many for their system builds? I did hear of some nutters on Reddit who were buying entire systems just so they could get their paws on 3080s at launch. Seems all a bit ridiculous to me. But that is capitalism for you.
  14. There you go: enjoy! And remember your mates on the forum... let us know how much of an improvement from the 2080 Ti in DCS.
  15. Ahh, that sucks. The OP is in Sweden, not the UK. I guess we are a bigger market for the bots to target. Never mind... you get to wait until all the information is in to make your choice. I remember you saying that you were more focused on ultimate performance than on price, but even so you will be armed with the data and hopefully have DCS benchmark numbers to make your decision. Even if you still go for the BFGPU in the end! Besides, the 2080 Ti is still no slouch.
  16. That is great. Congrats. Glad the actual humans got in on this one.
  17. Well... this isn't the same as RMA figures over a 12 month period, but this is an inauspicious start. https://videocardz.com/newz/geforce-rtx-3080-sees-increasing-reports-of-crashes-in-games
  18. Upgrade? Yes, if you can easily afford it. *** BUT *** You are literally 2 weeks away from the announcement of Zen 3. Sure... the bots will be rebooted for action, but I would wait myself. I know you can always make this argument... but 2 weeks...
  19. You mean in DCS? Probably marginal. I think 3080 is really ideal as a 4K card. But sure... let's see the actual results. We'll find out soon.
  20. In terms of CUDA cores / the GPU chip itself, I don't think Nvidia have anything between the 3090 and 10GB 3080. I think the only thing in between will be the 20GB 3080. And I don't think that will be a six month wait. More like a 1-2 month wait. Probably either just before the 6900 XT launch (to spoil the launch) or soon after once they know the pricing of the AMD part. And we already know there is something over and above the 3090, except that it will cost >$4,000 and not be supported by game drivers.
  21. It is not quite as simple as that. In DCS there will always be a bottleneck somewhere. The simplistic statement that many people make is "the CPU is the bottleneck" or "the GPU is..." or whatever. In some people's systems - where components are not well matched - it might well be one component of the PC that is consistently the bottleneck. In a well optimised system the bottleneck will move from component to component depending on what you are doing at any particular time. And it is generally true that most people's systems sit more often towards the CPU end of the spectrum, or something related to it, like memory bandwidth (feeding information to the CPU from memory). But the further you go from 1080p, to 1440p, to 4K, to 5K (my setup), to 8K, the more you shift the bottleneck towards the GPU more of the time. (There is some guy on here who keeps droning on about CPU bottlenecks, and while he is not wrong in general terms, many recent threads have been focusing on the GPU bottleneck. So I feel his repeated comments about the CPU are singing a song that a lot of people already know the words to.) I digress.

      If your CPU is the bottleneck 100% of the time (at your monitor resolution and settings), you can move from a 980 Ti to a 3090 and it will literally do nothing for you. That is one possible reason why you could be seeing no GPU scaling (a toy frame-time model of this is sketched at the end of this list).

      On the thing the OP just mentioned - Ampere CUDA cores are not equal to Turing CUDA cores - this is true. But Nvidia have a very good and legitimate reason for counting CUDA cores as they do, independent of the marketing spin. For details listen to this, but be warned that it gets rather technical. The long preamble is partially necessary to understand the part where he starts to explain the difference with Ampere; otherwise you will not understand what he is talking about.
  22. I have a very high level of interest in a high quality F-16 throttle & grip, and if it were high enough quality I would be prepared to pay a realistic price for it. (In other words, I know it would not be as cheap as buying a Warthog throttle.) But they need to get the quality sufficiently high.
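
Since post 4 above leans on the per-rail argument, here is a minimal Python sketch of the arithmetic it describes. The 320W figure is the 3080's published board power; the spike multiplier and the 30A per-rail limit are made-up assumptions, so substitute the numbers from your own card and your own PSU's label.

    # Rough per-rail current check for a single +12V rail feeding the GPU.
    # The rail limit and spike factor below are assumptions, not measurements.
    def rail_current_amps(watts: float, rail_voltage: float = 12.0) -> float:
        # Current drawn from one rail at a given wattage (I = P / V).
        return watts / rail_voltage

    gpu_board_power_w = 320.0        # 3080 rated board power
    transient_spike_factor = 1.7     # assumed short transient excursion
    rail_limit_amps = 30.0           # hypothetical per-rail limit on a multi-rail PSU

    steady = rail_current_amps(gpu_board_power_w)
    spike = rail_current_amps(gpu_board_power_w * transient_spike_factor)

    print(f"steady: {steady:.1f} A on a {rail_limit_amps:.0f} A rail")
    verdict = "exceeds" if spike > rail_limit_amps else "stays within"
    print(f"spike : {spike:.1f} A, which {verdict} the rail limit")

The point is simply that a PSU can be nowhere near its total wattage rating while a single rail is briefly pushed past its own limit.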
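
The pixel-count sketch referenced in post 9: nothing clever, just the raw resolutions, which is why a 5K panel pushes roughly 1.8 times the pixels of a 4K one.

    # Pixel counts for common monitor resolutions, compared against a 4K panel.
    resolutions = {
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "5K":    (5120, 2880),
    }

    base = resolutions["4K"][0] * resolutions["4K"][1]
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:4.1f} million pixels ({pixels / base:.2f}x a 4K panel)")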
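
And the toy frame-time model referenced in post 21. The millisecond figures are invented purely for illustration, and real GPU cost does not scale perfectly with pixel count, but it shows why a CPU-bound system gains nothing from a faster GPU and why the bottleneck shifts towards the GPU as resolution climbs.

    # Toy model: whichever of the CPU or GPU takes longer sets the frame time.
    CPU_MS = 8.0              # assumed CPU frame time, roughly resolution-independent
    GPU_MS_AT_1440P = 5.0     # assumed GPU frame time at 1440p for a given card

    # Pixels relative to 1440p; GPU time assumed to scale linearly with pixel count.
    pixel_scale = {"1440p": 1.0, "4K": 2.25, "5K": 4.0}

    for res, scale in pixel_scale.items():
        gpu_ms = GPU_MS_AT_1440P * scale
        frame_ms = max(CPU_MS, gpu_ms)     # the slower component dominates
        limiter = "CPU" if CPU_MS >= gpu_ms else "GPU"
        print(f"{res}: {1000.0 / frame_ms:5.1f} fps, {limiter}-bound")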