We covered AMD's Ryzen and Navi announcements at E3 throughout the week, but there's still one aspect of the story left to discuss. We've talked about Navi and its RDNA architecture, but not about the software enhancements AMD plans to offer alongside its next GPUs. Some of these gains will also be available on GCN cards.
Let's talk about some features and improvements.
First, there are the quality-of-life improvements coming to AMD's Radeon software. With Navi, the software automatically switches your TV into low-latency game mode, if the display supports one. You'll be able to save your settings to separate files and reimport them if you need to install the driver completely from scratch or reinstall your entire operating system. Improvements have also been made to the way WattMan reports its results.
The AMD Link streaming application now supports streaming to TVs, including Apple TV and Android TV. Wireless VR streaming is now supported as well. These enhancements are not tied to any specific GPU.
Radeon Chill is AMD's technology for reducing GPU power consumption during games. The software can now set frame rate limits on 60Hz displays, reducing the number of frames rendered when you aren't actively controlling your character (while you're AFK, for instance).
AMD's footnote on Radeon Chill is worth reading. Under the right circumstances, this can significantly reduce GPU power consumption, though it does affect frame rate, and the size of the total gain varies from title to title. Any GPU that previously supported Radeon Chill can take advantage of these enhancements.
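AMD hasn't published Chill's exact heuristics, but the general idea, dynamically capping the frame rate based on how recently the player provided input, can be sketched in a few lines. The function names and thresholds here are illustrative, not AMD's:

```python
def chill_target_fps(seconds_since_input, fps_max=60, fps_min=30, idle_after=2.0):
    """Pick a frame-rate target based on how recently the player gave input.

    Active play gets the full cap; once input has been quiet for `idle_after`
    seconds (e.g. the player is AFK), drop to the low cap to save power.
    """
    return fps_max if seconds_since_input < idle_after else fps_min

def frame_delay(seconds_since_input):
    """Minimum time the render loop should spend per frame, in seconds."""
    return 1.0 / chill_target_fps(seconds_since_input)

# Active input: cap at 60fps, ~16.7ms per frame
print(round(frame_delay(0.5) * 1000, 1))   # 16.7
# Idle for 5 seconds: cap at 30fps, ~33.3ms per frame
print(round(frame_delay(5.0) * 1000, 1))   # 33.3
```

A real driver would also ramp the cap smoothly rather than snapping between two values, which is presumably part of why AMD's results vary from title to title.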
Next up is Radeon Anti-Lag. AMD says it has devised a method of reducing the time between when you press a button in a game and when you see the result on screen. To do this, some CPU work is delayed so that it completes just in time for the GPU to consume it, rather than finishing well in advance.
Honestly, I can't say I observed a difference between Radeon Anti-Lag enabled and disabled. AMD demonstrated the effect with custom-built latency monitors attached to displays, and I'll take the company's word for it; the monitor I tested on may simply have had slightly higher latency. I'm also at an age when motor reflexes have already begun to decline, and if I'm honest, I was never a very good twitch player to begin with.
In the best case, this feature reduces your total latency by a few milliseconds. If you're good enough to compete at that level, it could be worth something; that's not a call I feel qualified to make.
Anti-Lag is supported in DX11 on all AMD GPUs. DX9 support is a Navi-only feature. DX12 games are not currently supported because of that API's very different implementation requirements.
Radeon Image Sharpening is a feature that combines contrast-adaptive sharpening with GPU upscaling to improve the quality of the base image without paying the penalty of native 4K rendering. The following slides compare RIS enabled and disabled.
RIS is disabled in the slide above.
RIS is enabled in this slide. The effect is quite subtle; you may want to open the two images above in separate tabs, zoom in carefully, and then compare the final product. While there is a clear improvement in image quality in the "ON" picture, it's a small one.
Nevertheless, small improvements to image quality are generally welcome. RIS was designed by Timothy Lottes, who created FXAA while at Nvidia. Using the feature should not meaningfully affect performance (AMD estimates the impact at 1 percent or less). RIS is a Navi-only feature and is only supported in DX12 and DX9.
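AMD hasn't published RIS's shader here, but the core idea behind contrast-adaptive sharpening, sharpen flat regions strongly and back off near high-contrast edges to avoid halos, can be sketched as a toy grayscale filter. All parameter choices below are my own illustrations, not AMD's:

```python
def cas_sharpen(img, strength=0.5):
    """Toy contrast-adaptive sharpening on a 2D grayscale image (values 0.0-1.0).

    For each interior pixel, inspect its 4-neighborhood, estimate local
    contrast from the min/max range, and blend in an unsharp-mask term
    that is weaker where contrast is already high (to avoid haloing edges).
    This captures the general idea behind CAS, not AMD's actual shader.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            n = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            contrast = max(n + [c]) - min(n + [c])
            # Adaptive weight: full sharpening in flat areas, tapering
            # toward zero as local contrast approaches the full range.
            w_adapt = strength * (1.0 - contrast)
            laplacian = 4 * c - sum(n)           # unsharp-mask style term
            out[y][x] = min(1.0, max(0.0, c + w_adapt * laplacian))
    return out

# A pixel slightly brighter than its neighbors gets boosted;
# a perfectly flat region is left untouched.
img = [[0.5] * 3 for _ in range(3)]
img[1][1] = 0.6
print(cas_sharpen(img)[1][1])   # 0.78
```

The adaptive weight is the part that distinguishes CAS from a plain unsharp mask: at a hard edge (contrast near 1.0), the sharpening term nearly vanishes, which is why RIS can run on upscaled frames without producing obvious ringing.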
Finally, there is FidelityFX.
FidelityFX is AMD's new addition to GPUOpen, offered to any developer who wants to take advantage of it. Its contrast-adaptive sharpening can be used on any GPU if developers choose to implement it.
Here are some extra hardware details on Navi that weren't in our previous articles but probably should have been (blame a frenetic briefing schedule and some scrambled note-taking):
AMD plans to keep GCN GPUs on the market to handle HPC workloads. The AMD engineer we spoke with compared GCN to a sword, extremely effective if balanced properly but relatively tiring to use, while RDNA was more of a lightsaber, focused on elegance and economy of movement. GPUs like the MI50 and MI60 also offer much larger memory bandwidth and memory pools than any of the Navi cards coming to market.
RDNA should eventually replace GCN in this space and correct some of the slow-path anomalies GCN suffers from. Irregular performance with certain texture formats has been fixed, for example, and RDNA has larger caches to prevent pipeline bubbles. Overall performance should be more predictable on RDNA-derived GPUs than on GCN.
There's nothing groundbreaking in these details, but I thought I'd include them for completeness's sake. This concludes our E3 coverage.
AMD announced its new Radeon GPUs at E3 last night. The new Radeon RX 5700 and 5700 XT are positioned as responses to Nvidia's RTX 2060 and 2070 family rather than a frontal assault on the RTX 2080. As previously announced, the Radeon VII will remain on the market against the RTX 2080.
Navi, however, looks like a significant step forward for AMD on several fronts. We'll have a deep architectural dive in the near future, but for now, let's look at the speeds, feeds, and competitive positioning. You can click any of the slides to open a larger version in a new window.
The Radeon 5700 XT is a 40 compute unit design with 2,560 stream processors, 9.75 TFLOPS of floating-point performance, and clocks far higher than anything we've seen from AMD before. AMD's new RDNA architecture, which finally replaces GCN, is significantly more efficient than its predecessor, with a projected 1.25x increase in performance per clock. The TDP on the 5700 XT is 225W, compared with 295W on the Vega 64. Power is supplied via one 8-pin and one 6-pin connector.
The Radeon 5700 is a 36 CU design with 2,304 stream processors and the same 8GB RAM pool as the 5700 XT. As expected, there's no sign of HBM on these products. The 5700 carries a 180W TDP and the same 1x 8-pin + 1x 6-pin power delivery.
According to AMD, its new RDNA architecture is a major improvement over GCN, with substantial gains in performance per clock, raw clock speed, and performance per watt.
The 1.25x performance-per-clock improvement does not include clock speed gains; it's additive with them. This is a good moment to talk about AMD's newly defined clock scheme, so let's do that.
The base clock on these cards is roughly what you'll see if you run a power-virus workload like FurMark. The "Game Clock" is a conservative estimate of the clock you'll see when running current titles over long periods. According to AMD, it isn't the GPU's median clock rate over time during gaming; it's actually set a bit lower than expected, to allow for silicon and cooling variation from one system to another. AMD derived its Game Clock values by measuring average GPU clock speeds across 25 different games.
The Boost Clock is an opportunistic clock that the GPU will try to hit when possible. Even this value does not represent the maximum potential speed (AMD has described it as "close to the maximum"). With improvements to the underlying architecture and overall design, GPU clocks are much higher than anything we've seen before from AMD. (We will talk about this in more detail in the coming days.)
According to AMD, these improvements give RDNA 1.5x better performance than an equivalent GCN configuration in power-limited environments. The move to 7nm represents just over 20 percent of the total gain, with design and power improvements contributing roughly 15 to 18 percent. Most of the improvement comes from the GPU core itself. We'll offer one teaser for the deep dive: RDNA can issue instructions every cycle, compared with GCN, which issued once every four cycles. The net result is a GPU significantly better than the cards it replaces, dramatically improving performance while simultaneously reducing power consumption.
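AMD's 9.75 TFLOPS figure for the 5700 XT falls straight out of the shader count and clock, since each stream processor can execute one fused multiply-add (two floating-point operations) per cycle. A quick sanity check; note that the 1,905MHz and 1,725MHz boost clocks below are the cards' advertised figures, not numbers from this piece:

```python
def peak_tflops(stream_processors, clock_ghz, flops_per_sp_per_clock=2):
    """Peak FP32 throughput: SPs x ops per clock (FMA = 2) x clock in GHz."""
    return stream_processors * flops_per_sp_per_clock * clock_ghz / 1000.0

# Solve the other direction: what clock does AMD's 9.75 TFLOPS imply?
implied_clock_ghz = 9.75e12 / (2560 * 2) / 1e9
print(round(implied_clock_ghz, 3))          # 1.904

print(round(peak_tflops(2560, 1.905), 2))   # 9.75  (5700 XT at boost)
print(round(peak_tflops(2304, 1.725), 2))   # 7.95  (5700 at boost)
```

The implied ~1.9GHz clock lines up with the 5700 XT's boost clock, which is a reminder that quoted TFLOPS figures assume the opportunistic boost rate, not the more conservative Game Clock.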
AMD expects the Radeon 5700 XT to deliver roughly 1.14x the performance of the Vega 64 while consuming 23 percent less power. The rated TDPs on the 5700 XT and 5700 are still higher than their Nvidia counterparts', but TDP is not a substitute for measured power draw; we'll have to test the hardware to see how the GPUs actually compare. The performance improvement per unit of area is substantial: Vega 64 was a 495mm² part, while Navi is 251mm². The RTX 2060 and 2070, for their part, are 445mm².
These gains should put the 5700 XT slightly above the GTX 1080 and the RTX 2070 in overall performance. AMD also told us it has taken reviewers' feedback on reference cooler noise to heart.
The 5700 and 5700 XT still use blower-style coolers that vent heat out of the system, but AMD promises to cap them at 43dBA. (It isn't clear whether that's the absolute maximum volume, or the maximum provided you don't manually set the fan to 100 percent.) This should address one of the recurring complaints about AMD's reference cards: that they're often much louder than the competition. New buyers will also receive a card redeemable for a three-month subscription to Microsoft's Xbox Game Pass for PC service.
We'll have much more to say about Navi and its underlying architecture in the days to come. The GPU appears to take a significant step toward closing AMD's power-consumption gap with Nvidia, and it seems to hit a higher performance-per-watt target than the Radeon VII.
Both cards are priced to move, given the performance on display. The Radeon RX 5700 XT will be available for $449, while the RX 5700 is $379. The RX 5700's price is meaningfully higher than current RTX 2060 cards, which sell for $335 at the bottom of the market. The RX 5700 XT is a $450 card against the RTX 2070's $500 price point. A $500 50th Anniversary edition of the card will also be available.
Microsoft announced the Xbox Next at E3 yesterday, and while we still don't know what the next-generation system will be called, we finally know what kind of specifications it will offer.
First, let's talk availability. Microsoft says it plans to launch its next-generation console for holiday 2020, implying a November-to-December window. That would align fairly closely with the Xbox One, which debuted on November 22, 2013. It also means the Xbox One won't have had as long a run as its predecessor, even though it received a major mid-cycle upgrade in the form of the Xbox One X. The Xbox 360 lasted eight years, from November 22, 2005 (there's that date again) until 2013. The Xbox One/X, by contrast, will have had a seven-year life. Microsoft's E3 reveal trailer for the platform is embedded below:
As for features, we know the console, like Sony's, will include an SSD, with storage performance Microsoft claims is up to 40 times higher than current systems'. That's plausible for SSDs, which offer access times typically measured at 0.1ms or less, compared with the 10-12ms access times of hard drives.
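It's worth noting that the raw access-time gap alone is bigger than Microsoft's "up to 40 times" claim, which suggests the figure describes end-to-end load times (decompression, CPU work, and so on) rather than pure seek latency. The arithmetic, using the ballpark latencies above:

```python
hdd_access_ms = 11.0   # midpoint of the 10-12ms hard drive figure above
ssd_access_ms = 0.1    # typical NAND read latency, roughly 100 microseconds

# Raw access-time ratio: about 110x, well beyond the quoted 40x figure
ratio = hdd_access_ms / ssd_access_ms
print(round(ratio))    # 110
```

In other words, the 40x number is likely a conservative whole-system measurement, since real game loading is bottlenecked by more than storage seeks.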
If all goes well, that means the inevitable Mass Effect remake will have fewer interminable elevator rides to nowhere. The other specs are about what you'd expect. Zen 2 and Navi are both confirmed, as is ray tracing (no word yet, however, on which ray tracing capabilities or effects will be specifically supported).
Supposedly, the platform will support both 120fps output and 8K resolutions, but we don't expect 8K to be a playable target resolution for mainstream AAA titles. Technically, the console may be capable of outputting at that resolution, but so are modern PCs, if you play a fairly old game with supersampling enabled. Similarly, 120fps gaming may be possible, but console games have historically targeted 30fps or, more recently, 60fps.
Could the Xbox Next support higher frame rates? Yes, but only by sacrificing more visual fidelity than most developers would likely choose to give up. That said, it would be interesting to see more developers experiment with this kind of tradeoff, or simply offer players the option of lower detail and higher frame rates. Maybe the message here is less about the likelihood of widespread support and more about providing flexible capabilities should developers choose to exploit them. Whatever the case, we don't expect 8K gaming to be standard.
Microsoft also seems to have learned from the legendarily awful debut of the Xbox One: instead of spending its unveiling talking about everything but games, Xbox chief Phil Spencer assured the audience that gaming performance would be the focus. "For us, the console is vital and central to our experience," Spencer said. "We heard you: a console should be designed, built, and optimized for one thing and one thing only: games."
There's still a lot we don't know about the console, including how it compares with the PS5 in speeds and feeds. Price is another major question that hasn't been answered. Both companies have likely made these decisions already, but they're staying quiet to make sure they don't undercut sales of their current generations. Despite the recent news about collaboration on cloud gaming, Microsoft and Sony remain determined competitors in the ongoing console war.
AMD has been very quiet about Navi, its upcoming 7nm GPU architecture designed to challenge Nvidia in the midrange GPU market. News leaked from a Sapphire PR official who reportedly spoke to the Chinese press (via a since-deleted blog post) points to a GPU designed to challenge Nvidia's RTX cards directly, with pricing to match.
If these leaks are true (and for the moment, all the usual caveats apply), it means AMD will position Navi directly against the RTX GPUs Nvidia launched at higher prices last year. The lower-end Navi GPU would target $399 and the RTX 2060, while the higher-end card would target $499 and the RTX 2070. The Radeon VII would remain on the market, anchoring the $700 price point. According to the Sapphire representative, Navi is a two-GPU family, and AMD doesn't plan to sell the product at lower prices or into lower market segments. As we've said before, pricing is often the last aspect of a launch to be finalized, but we're close enough to Navi's expected launch window that AMD should have its plans locked down.
The idea that Navi apparently can't reach down into lower-end markets is a surprise. Admittedly, Polaris can give Nvidia's low-end GTX cards a run for their money, but only by consuming roughly twice as much power. The RX 570 is such strong competition for the GTX 1650 that Nvidia didn't even sample that card for review, and it withheld the driver until launch day to make sure no one could test it in advance. But again, power is a real problem here.
It's possible that this aspect of the rumor is wrong, or that AMD has a budget 7nm chip it hasn't revealed. Polaris is built at GlobalFoundries, so it's unlikely that design will be ported to 7nm, but AMD could theoretically equip it with faster GDDR6. Then again, Polaris is already reasonably well supplied with memory bandwidth; it isn't clear that moving from GDDR5 to GDDR6 would deliver a significant benefit.
As for pricing, this forecast is exactly the opposite of the supposed $250 RTX 2070 killer that has been floating around since December. We always expected pricing to depend on how aggressively AMD wanted to hurt its larger competitor. Nvidia's price increases at the start of this latest GPU cycle were not popular, and enthusiasts hoped AMD would hit the reset button on the cost curve. Again, let me stress: these prices are not confirmed. They are, however, much closer to what I expected. AMD chose not to position the Radeon VII at a lower price, and it appears to be doing the same with Navi, betting that Nvidia has already eaten the flak from PC gamers upset over the GPU price hikes.
AMD was never going to launch a $250 RTX 2070 killer while the RTX 2070 is a $500 GPU, because doing so would leave a disproportionate amount of money on the table. If AMD chooses to position Navi against RTX at comparable prices, minus ray tracing (which Navi 10 reportedly lacks, per this same rumor), it sends a message about exactly how much AMD thinks that feature is worth in 2019. Whether the market accepts that argument is a different question, and one that could be complicated by the ongoing trade war between the United States and China.
For now, this rumor implies that Navi is essentially a midrange GPU refresh, not aimed at the high-end market where the Radeon VII and similar GPUs play. It also apparently won't refresh the budget GPU market. Presumably another GPU family will serve that space, unless AMD intends to keep Polaris in the segment for another 9 to 12 months. And if the rumor proves accurate, higher GPU prices are apparently a feature of the space that everyone will have to live with.
Last week, PlayStation guru Mark Cerny detailed what to expect from Sony's next-generation console. The specs were appealing: an eight-core CPU, standard SSD storage, and a 7nm SoC suggest Sony will position itself aggressively to challenge Microsoft for the performance lead (the PlayStation 4 held the performance crown against the Xbox One in the vast majority of games until the Xbox One X launched).
One aspect of the disclosure got a little garbled along the way, however. Wired UK ran a piece with multiple confusing statements in it. To quote:
According to what Cerny detailed, it will use an eight-core 7nm Ryzen processor, support ray tracing with 8K compatibility, and run on the AMD Navi 20 graphics processor. This suggests Sony is using the AMD Ryzen 3600G processor presented at CES 2019. According to Smithers, this processor should cost between $180 and $220 per unit, the ideal amount for a $399 console.
Literally everything in this paragraph is somewhere between confused and wrong. No disrespect to Wired (it's a good publication, and everyone makes mistakes), but that paragraph isn't accurate.
It's strange to describe a processor as "supporting ray tracing" or being "8K compatible," though you can read both statements as vague references to compute power. That covers the "confused" part of the paragraph. As for the wrong part:
Wired's interpretation of Cerny's comments appears to mix incorrect rumors with a misunderstanding of how console margins and pricing work.
The Ryzen 5 3600G rumor started here. It's wrong; as we've discussed on two separate occasions, that table does not represent products AMD will actually bring to market. But it's equally wrong to claim this type of chip would end up in the PS5.
AMD's desktop APUs use dual-channel DDR4 memory. While we once hoped for a GDDR5- or HBM-equipped APU, no such project is in development: HBM costs never fell far enough for the technology to make sense in low-cost chips, and the only integrated solution offering HBM is Intel's Kaby Lake-powered Hades Canyon, which sells for far more than any AMD APU.
The APUs used in the PS4/Xbox One families rely on AMD IP, but they don't use the same hardware configuration as any product AMD sells in the PC market. That's by design. Neither Sony nor Microsoft is interested in competing with its own hardware vendor, and console manufacturers typically cover the initial SoC development costs. Part of being paid to build someone else's product is accepting certain restrictions on how you can sell it.
Any PS5/Xbox Next APU will use an advanced memory type: GDDR5, GDDR5X, GDDR6, and HBM2 are all potential candidates, with HBM2 or GDDR6 the favorites. Dual-channel DDR4 is not.
It's also highly unlikely that Sony would pay $199 for the APU in a $400 console. Those two figures matter.
The bill of materials on the PS4 at launch revealed that the APU and 8GB of DRAM together accounted for $188 of the $400 price: $100 for the APU, $88 for the DRAM. It's possible these two components will represent a larger fraction of the PS5's total price, but there's no way Sony can dedicate $200 of a supposed $400 price to the APU alone. That doesn't leave enough money for everything else.
The biggest cost difference between the PS3 and the PS4 was by far the 10x higher cost of RAM, not the modestly higher price of the CPU and GPU. It's very unlikely Sony will sell its consoles at a loss (that policy was catastrophic for PS3-era earnings), and putting a large SSD in the PS5 is going to cost it some money, too. If anything, squeezing the PS5 into the PS4's price structure looks a little tight. If you compare component prices between the PS3 and PS4, most either didn't decrease at all or decreased only slightly (the optical drive is the lone exception).
The Wired article concludes that Sony will probably emphasize the PS5's "8K capability," which, I'm sorry, it will not. Yes, Japan intends to broadcast the Tokyo 2020 Olympic Games in 8K. But the first 4K UHD broadcast took place in 2008, and despite that, television and streaming are still anchored at 720p/1080p. The first 4K TV went on sale in 2013; 4K TVs are expected to pass 50 percent of total television shipments only this year, in 2019. 8K is expected to hold a 0.2 percent market share this year.
Nobody is targeting the PlayStation 5 at 8K, because there are effectively no 8K TVs on the market, there will be very few by 2020, and there's no point in launching an 8K streaming service that would absorb titanic amounts of network bandwidth. The video standard intended for 8K content, Versatile Video Coding (VVC), won't be finalized until 2020 at the earliest, which means hardware support won't be ready until 2021 or 2022. That matters, because each generation of video codec has cut bandwidth requirements by 40 to 50 percent, which is what makes higher resolutions practical to deliver. H.265/HEVC is roughly 50 percent more efficient than H.264; VVC targets 1.5x better compression at the same quality level as HEVC. Like HEVC, it will take more power to decode. Like HEVC, it will take years before hardware supports it. There's no reason to think 8K is on some kind of accelerated deployment schedule compared with 4K.
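The pixel math makes the bandwidth problem concrete: resolution quadruples at each step from 1080p to 4K to 8K, while a codec generation claws back only about half. The bitrates below are illustrative ballparks of my own, not figures from any standard:

```python
def pixels(w, h):
    return w * h

p1080 = pixels(1920, 1080)   # about 2.07 megapixels
p4k = pixels(3840, 2160)     # about 8.29 megapixels
p8k = pixels(7680, 4320)     # about 33.18 megapixels

print(p4k // p1080, p8k // p4k, p8k // p1080)   # 4 4 16

# If bitrate scaled linearly with pixel count (a crude upper bound),
# one codec generation at ~50% savings recovers only half of the
# 4x jump from 4K to 8K:
hevc_4k_mbps = 25.0                     # illustrative HEVC 4K stream
vvc_8k_mbps = hevc_4k_mbps * 4 * 0.5    # 4x pixels, ~50% codec savings
print(vvc_8k_mbps)                      # 50.0, still double a 4K stream
```

Real codecs do somewhat better than linear scaling at higher resolutions, but the direction of the problem is the same: even with VVC, an 8K stream needs substantially more bandwidth than today's 4K streams.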
The PS5 is not an 8K console. It could one day be updated to support 8K video decoding in software, if Sony wants to tackle the problem. It may be technically possible to play 8K games on the PS5, just as it's technically possible on a PC today to play, say, Quake 3 at an effective 8K with supersampling enabled, with the final output downscaled to 1080p or 4K to fit your monitor's limits. But we won't see AAA games targeting that resolution on the PS5 as standard, and we won't see a "Ryzen 3600G" in the PS5. We also won't see an 8K game-streaming service, at least for a year or two. Even if we did, without VVC most Americans wouldn't have the bandwidth needed to stream the content. Streaming services still struggle to deliver acceptable latency and performance with a 1080p stream, and 8K is 16x the pixels. It may happen, but it won't happen soon.
With new information swirling around the PlayStation 5, it's no surprise that we're also hearing hints about the GPU that will power it. The leaks around the chip fall into a few major categories.
Last year, AdoredTV suggested AMD would aggressively bring a set of GPUs to market to undercut the value of Nvidia's Turing lineup. Those leaks collectively looked like this:
Now, an anonymous poster on 4channel (the SFW variant of 4chan) claims to work at AMD and offers additional information on the "Radeon RX 3080" (the idea being that AMD will align the Radeon and Ryzen brands around similar model numbers for each family).
According to this leak, Navi has 1MB of additional L2 cache compared with Polaris (3MB total). The L1 is now 32KB, up from 16KB. The high-end GPU offers 410GB/s of memory bandwidth on a 256-bit bus, compared with 256GB/s on the current RX 590.
None of this conflicts with the earlier leak, but the following information suggests Navi will target Vega 56/GTX 1080 performance rather than GTX 1080/RTX 2070 performance, and that's new. It's not good news if you were hoping AMD would land a devastating blow on Nvidia's RTX family.
Here's the problem: first, the gap between the Vega 56 and the GTX 1080 is 10 to 20 percent, depending on which games you compare. The two weren't positioned as equivalents; the Vega 64 was the closer comparison point for the GTX 1080. The RTX 2070, meanwhile, is about 8 percent faster than the GTX 1080. A Navi that merely matches the Vega 56 is therefore considerably slower than the RTX 2070.
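Those two gaps compound multiplicatively, which is worth making explicit. Using the figures above (Vega 56 trailing the GTX 1080 by 10 to 20 percent, and the RTX 2070 leading the GTX 1080 by about 8 percent), a Vega 56-class Navi would trail the RTX 2070 by roughly 16 to 23 percent:

```python
def deficit_vs_2070(gap_1080_over_v56, gap_2070_over_1080=0.08):
    """Fractional deficit of a Vega 56-class card against the RTX 2070,
    compounding the 1080-over-Vega-56 gap with the 2070-over-1080 gap."""
    rtx_2070 = (1 + gap_1080_over_v56) * (1 + gap_2070_over_1080)
    return 1 - 1 / rtx_2070

for gap in (0.10, 0.20):
    print(f"{deficit_vs_2070(gap):.0%}")   # 16%, then 23%
```

That's the difference between "competitive with the RTX 2070" and "a full performance tier below it," which is why this detail of the leak matters.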
While a Vega 56-class Navi would be disappointing relative to expectations, matching Vega 56 performance while cutting power and price would constitute a larger collective set of improvements than the Radeon VII delivered over the Vega 64 earlier this year.
The Radeon VII significantly improved performance over the Vega 64, with only minor changes in power and noise. This hypothetical Navi would cut prices, increase performance, and significantly reduce power (and therefore noise). Arguably, that's the larger set of improvements, just not necessarily in the place people want them most.
If the earlier leak is true, of course, Navi really would be shooting for the stars. A $250 GPU competing with a $500 GPU would tear gaping holes in Nvidia's price structure and product line.
The main reason to think AMD would make a move like that is its own position in the GPU market. AMD may own most of the console space, but it has been all but driven out of the high-end desktop and laptop markets. Coming out of the gate swinging could put the company back in gamers' minds.
The biggest reason to think it won't? Let's be honest: in the six years since Hawaii launched, AMD's graphics execution hasn't been great. Hawaii was a strong competitor, but its problems weren't solved until third-party cooler designs became available. Fury X had a pump problem in its water cooler and couldn't quite beat the GTX 980 Ti.
When the Polaris family launched, we ran into problems and concerns about improper power draw over the PCIe connector, caused by how the GPU distributed its 12V load. Vega 56 and Vega 64 were available only in limited quantities, at exorbitant prices (the 2017 GPU price boom didn't help), and in two different SKUs with two different cooler z-heights. That may sound like inside baseball, but I promise you the last thing AMD wanted for its niche GPU was to design two niche cooling solutions, only one of which can be used depending on the case you own. The Radeon VII's commendable performance improvements were overshadowed by its noise and a general lack of new features compared with other cards on the market.
AMD has had some bright spots since then. The Radeon Nano was a well-received niche product, and the HD 7790 cost-effectively improved AMD's low-end performance over the HD 7770. But AMD needs more than a niche product or a single well-received budget card. It needs a new architecture that shows it can compete with Nvidia from top to bottom. Even if it doesn't launch new cards in every segment, it must demonstrate an architecture that can scale. Any honest review of the past six years shows AMD has struggled to do exactly that. The company has earned hundreds of millions of dollars from the console market, and it still has a very strong lock on the budget space where the RX 570 plays, but it has been a long time since AMD pulled off a GPU launch with no problems in sight.
That's not to say I think Navi will be a failure. A $250 Vega 56/GTX 1080 equivalent would represent a significant leap over current AMD silicon at a fraction of the power. The ray tracing situation is still unknown, because we don't know whether the Navi silicon AMD brings to market in 2019 is identical to the Navi silicon the PS5 will use in 2020. To date, AMD has said very little about ray tracing, other than that it doesn't intend to introduce the feature until it can do so top to bottom. That could imply it's reserving the capability for a future GPU, possibly aligned with the PS5's official debut. There are also rumors of different flavors of Navi with different performance targets, some arriving in 2020. Alternatively, those could belong to Arcturus, Navi's supposed successor.
The gamers calling on AMD to deliver a "Ryzen moment" in gaming performance aren't wrong; the company needs one. While AMD still matches Nvidia's performance in certain price brackets, it does so while drawing nearly twice the power. That's an ugly, unsustainable position. Will Navi be the architecture that changes everything? I hope so. But we've been waiting six years for a proper follow-up to Hawaii, 4chan is not exactly a reliable source of data, and AMD's GPU execution over those six years hasn't been solid enough to take anything on faith.