The next generation of consoles will arrive in less than 18 months, and Microsoft is starting to share a little more about its priorities for the next generation of Xbox consoles. Playability, load times, and backward compatibility for controllers and software are Redmond's top priorities for the launch of Xbox Next.
"I think the area we really want to focus on next generation is frame rate and playability of the games," Spencer told Gamespot:
Making sure the games load incredibly fast, making sure the games run at the highest frame rate possible. We're also the Windows company, so we see the work that's being done on the PC side by developers. People love games at 60 frames per second, so ensuring that game design works at 4K 60 (fps) is essential.
What's interesting is that this generation, we really focused on 4K visuals, on how we bring movies in via 4K Blu-ray and video streaming. With Xbox One X, enabling games to run with 4K visuals will still deliver really important visual enhancements next generation, but playability is probably the bigger focus for us this generation. How fast do games load? Do I feel like I can get into the game as fast as possible, and how does it feel while I'm playing? Does this game feel different from anything else I've seen? That's our focus."
That's more or less what ExtremeTech predicted earlier this year. 60fps is a far more realistic target for Xbox Next than the 240fps rumor that's been circulating. Despite various vague claims that the Xbox Next will support 8K, Spencer makes no mention of it as a sensible game resolution target. There's no chance a 2020 console will have a GPU powerful enough to drive games at that resolution, and we're happy to see the company focus on other aspects of the experience.
According to Microsoft, backward compatibility is a key pillar of the Xbox's evolution. Xbox One, Xbox 360, and OG Xbox games will all continue to be supported on Xbox Next, Spencer told Gamespot. The company promised that this backward compatibility commitment would also apply to controllers, stating: "So, really, the products you've bought from us, whether it's the games or the controllers you're using, we want to make sure they are forward compatible with the newest version of our console, which at that time will obviously be the one we just launched."
Historically, only a handful of games have specifically targeted 60 frames per second on consoles; it was an unusual frame rate target. The Xbox One X and PS4 Pro expanded the list of titles hitting that mark by encouraging developers to release updates for new and existing games, adding new resolution options or allowing play at higher frame rates than the base title supported. Moving the entire console game industry to a 60fps target really would be a feat.
There is reason to think the two console manufacturers could pull it off. The Xbox Next and PlayStation 5 will both deliver substantially more performance than the existing Xbox One X and PS4 Pro. Using Ryzen CPU cores and an RDNA-derived GPU in both platforms ensures a large jump in console performance, but the perceived visual-quality improvement each console generation offers over the last has shrunk every cycle. Rather than simply chasing new levels of detail, Spencer wants developers to focus on consistency and load times, two other areas where major generational gains can be had, including through the adoption of SSDs.
A major question is how the 1080p/4K split will be handled. Spencer refers to a 4K/60fps target, but 1080p sets still account for a high percentage of TVs sold, and the installed base for the older standard is huge. The easiest way for Microsoft to handle 1080p output is to render internally at 4K and then downsample to 1080p. That effectively applies supersampled AA to the entire image and dramatically improves image quality compared with native 1080p rendering. With the PS4 Pro and Xbox One X, Microsoft and Sony gave developers many ways to leverage the new consoles' added power to enhance the base experience, and we expect a similar approach. One benefit of pairing a powerful GPU with a low-resolution display is that you can enable secondary features like AA without worrying about the performance hit. We hope Microsoft brings some of this flexibility to its Xbox Next design.
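A minimal sketch of why 4K-internal/1080p-output works as supersampled AA (my own illustration, not Microsoft's actual pipeline): each 1080p output pixel can average a 2x2 block of rendered 4K samples, a simple box filter. Real downsampling filters vary, but the principle is the same.

```python
import numpy as np

# Illustrative only: treat a 4K frame as 4 samples per 1080p output pixel.
uhd = np.random.rand(2160, 3840, 3)  # a rendered 4K frame (height, width, RGB)

# Group pixels into 2x2 blocks and average them (box-filter downsample).
fhd = uhd.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

assert fhd.shape == (1080, 1920, 3)  # a 1080p frame with 4x supersampled AA
```

Each output pixel is the mean of four rendered samples, which is exactly the sort of "free" image-quality win a 4K-capable GPU can hand a 1080p display.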
The PC gamer in me can't help but notice that the already-blurry line between consoles and PCs will get even thinner next cycle. Consoles have offered backward compatibility before, but it often came with qualifiers tied to your hardware revision and was limited to a single previous platform. Microsoft will not only support Xbox One games on Xbox Next; it will continue to support Xbox 360 and OG Xbox titles, as well as Xbox One accessories. That's exactly the kind of backward compatibility we've come to expect when upgrading from one PC to the next, and it's nice to see consoles catching up after a few decades.
The flip side, of course, is that the console-versus-PC debate gets murkier every generation. At this point you might as well just ask "controller or keyboard?" (Keyboard, natch.) Functionally, at the hardware level, these are all PC games now.
There is a strange rumor going around that AMD has discontinued, or intends to discontinue, its reference RX 5700 and RX 5700 XT designs. Custom AIB cards are only just arriving on the market. The rumor started with the French site Cowcotland, which ran a headline:
The translation of that headline claims AMD's reference GPUs for the 5700 and 5700 XT reached EOL status just five weeks after launch. This is not true. According to AMD, the intent here is not to compete with AIB partners. "We expect the Radeon RX 5700 Series graphics card offering to remain strong in the market, and that many models are starting to arrive from our AIB partners," said AMD. "In line with usual practice, once the AMD reference card inventory has been sold, AMD will continue to support new partner designs with the Radeon RX 5700 Series Reference Design Kit."
AMD provides reference designs for AIBs that want to get cards to market faster without designing their own coolers or PCBs. The first boards available are usually based on these reference products. The gap between reference card availability and AIB shipments can be relatively short or stretch out a few weeks. Some enthusiasts are unhappy that five weeks have passed without AIB designs, though the same has happened with Nvidia launches. AMD is not discontinuing its reference cards, and they will continue to be manufactured in the future.
The enthusiast community isn't particularly happy with the delay in custom cards, or with the fact that the blower-style 5700 and 5700 XT are louder than the equivalent Nvidia GPUs. The hope is that dual- or triple-fan open-air coolers will offer better acoustics than AMD's default reference designs. That's usually a very good bet.
After testing the 5700, 5700 XT, Vega 64, and Radeon VII, plus an associated mix of RTX 2060, 2070, 2080, and 2080 Ti parts (both Nvidia-built and not), I'd honestly say the blower-versus-open-air-cooler battle can be a bit overblown. Thermally, there's an obvious difference between the two solutions (blowers exhaust hot air out of the case, while open-air coolers simply move it around inside the chassis). What that difference means for your system depends heavily on its configuration. Open-air coolers can offer better performance in spacious cases with good airflow, while blowers deliver more consistent results. The relative loudness of the two solutions depends on the specific cooler design: a blower may be louder than an open-air cooler, or vice versa. The 5700 XT (a blower) is much quieter than the Vega 64 (another blower), while the Vega 64 and the Radeon VII (an open-air design) have very similar noise profiles.
One interesting aspect of the Navi reviews, however, is how much the noise measurements diverge across review sites. Anandtech, for example, measures the 5700 XT at 54 dB(A), compared with 61 dB(A) for the Radeon Vega 64.
That 54/61 dB(A) split matches my own subjective experience with the Radeon Vega 64, Radeon VII, 5700 XT, and associated Nvidia GPUs. I say that because, to my ear, the 5700 XT is far better than the Vega 64 or Radeon VII, which recall the bad old days of high-power GPUs like the R9 290X.
Other reviewers, however, report very different results:
Guru3D reports that the Vega 64 and Radeon 5700 XT are identical in dB(A) and that the Radeon VII is much louder. Since distance to the target obviously affects noise measurements, the fact that Anandtech and Guru3D report different absolute levels doesn't concern me. What's more interesting is that one article shows the Vega 64 and 5700 XT as comparable, while the other does not.
TechPowerUp has a third ranking, with the 5700 XT and 5700 scoring identically and the Radeon VII measuring lower than the Vega 64. Three well-regarded tech review sites, three distinct results. From my own subjective experience, Anandtech's is the one that "feels" most correct, but a number of factors affect noise measurements: relative background noise levels, open-bench versus closed-case testing, distance from the target, and the equipment used to perform the test. Individual GPU sample variation may also be at work here.
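To show why measurement distance alone can explain divergent dB(A) readings, here's a quick sketch using the free-field point-source approximation, under which sound pressure level falls roughly 6 dB per doubling of distance. The distances below are my own invented examples, not any site's actual test setup.

```python
import math

def spl_at(spl_ref, d_ref, d):
    """Estimate sound pressure level (dB) at distance d, given a reading
    spl_ref taken at reference distance d_ref (free-field point source)."""
    return spl_ref - 20 * math.log10(d / d_ref)

# A card measured at 54 dB(A) from 0.5 m reads about 48 dB(A) from 1 m.
print(round(spl_at(54, 0.5, 1.0)))  # -> 48
```

In other words, two labs measuring the same card from different distances can legitimately report numbers 6 dB or more apart, which is why relative rankings within one site's data matter more than absolute figures.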
In my opinion, the 5700 and 5700 XT land firmly on the acceptable side of the "is this GPU quiet enough to use?" question. They're not as quiet as the RTX 2060 or 2070 we tested for the same review, but they're dramatically quieter than the Radeon VII or the Vega 64. I've been known to wear earplugs when testing those two cards on an open bench to avoid hearing damage, though the fact that I already have hearing damage in my left ear has made me paranoid about injuring it further. I used a Vega 64 in my own system, and I didn't like how loud it was for gaming without headphones. The Radeon 5700 XT doesn't cause the same problem.
Radeon AIB cards have often been quieter than reference models, so that's likely to remain the case here. We'll check whether these cards offer reasonable value for the money when they reach the market in larger quantities. Reference card designs will continue to exist alongside these new cards.
In February, I wrote about how AMD and Nvidia had collectively delivered one of the least-popular high-end GPU refresh cycles in the video game industry's history. With AMD's Navi-based 5700 and 5700 XT launched and Nvidia's reply in the RTX 2060 Super and 2070 Super, it makes sense to revisit that conclusion. How have things changed a little over six months later?
In fact, they've improved quite a bit, provided you're buying at the top of the market. Before getting into the details of what's changed, let me clarify some terms. Historically, GPU price brackets have looked like this:
Budget: $150 or less
Midrange: $150-$300
High end: $300-$500
Ultra-high-end: $500+
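The brackets above can be sketched as a tiny classifier; note that how the exact boundary prices fall (e.g. whether $300 counts as midrange or high end) is my own assumption for illustration.

```python
def gpu_bracket(price):
    """Map a GPU's launch price (USD) to the historical bracket names above."""
    if price < 150:
        return "budget"
    if price < 300:
        return "midrange"
    if price < 500:
        return "high end"
    return "ultra-high-end"

print(gpu_bracket(399))  # RX 5700 XT -> high end
print(gpu_bracket(699))  # RTX 2080 Super -> ultra-high-end
```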
When Nvidia introduced the RTX family, prices rose considerably. Instead of the GTX 1070 at around $370 and the GTX 1080 between $500 and $550, the RTX 2070 was a $500 GPU, the RTX 2080 was $700, and the 2080 Ti actually sold for between $1,100 and $1,200 ($1,000 technically, but nobody could ever find one at that price, as far as I know).
A publication like ours has two basic options: maintain our own price brackets and slot the new cards into them, or shift our brackets upward to match the manufacturers'. If you take the latter approach, AMD's Navi graphics cards are now "midrange" cards, despite price tags of $350 and $400. It's also how you end up with articles referring to the $750 iPhone XR as "entry-level" or "budget," as if Apple hadn't just killed the only pseudo-budget device it offered, the $350 iPhone SE.
Adjusting price brackets to reflect what businesses are selling isn't wrong, as long as it matches what customers are actually buying. Nvidia's next quarterly figures should provide further confirmation here, but the available data suggests Turing sales trailed well behind Pascal's at launch and may not have recovered since. If Nvidia really believed it had established ray tracing as a feature players were willing to pay for, the RTX 2060, 2070, and 2080 GPUs would not have gotten price cuts.
As far as ExtremeTech is concerned, at least for now, the Navi 5700 and 5700 XT are high-end cards, as are the RTX 2060, 2060 Super, 2070, and 2070 Super. The RTX 2080, 2080 Super, and 2080 Ti belong to their own distinct ultra-high-end category.
We recently measured long-term performance evolution across various GPUs, and we can put that data set to a different use here. Keep in mind that in the graphs below, the GeForce RTX 2080 (non-Super) delivers nearly the same performance as the RTX 2070 Super. (The 2070S typically lands between 95 and 105 percent of RTX 2080 performance.)
Comparing the RTX 2070S/2080 with the GTX 1080, minimum frame rates are 1.18x higher at 1080p, 1.28x higher at 1440p, and 1.4x higher at 4K. Average frame rates across our full game suite are 1.3x higher at 1080p, 1.4x higher at 1440p, and 1.44x higher at 4K.
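One plausible way to aggregate per-game results into a single uplift figure like these is a geometric mean of the per-game frame-rate ratios. To be clear, both the method and the sample numbers below are my own illustration, not the article's actual data set.

```python
from math import prod

def uplift(new_fps, old_fps):
    """Geometric mean of per-game frame-rate ratios (new card / old card)."""
    ratios = [n / o for n, o in zip(new_fps, old_fps)]
    return prod(ratios) ** (1 / len(ratios))

# Invented per-game 4K averages, chosen to land near the quoted 1.44x figure.
gtx1080_4k = [40, 55, 33]
rtx2080_4k = [58, 79, 47]
print(f"{uplift(rtx2080_4k, gtx1080_4k):.2f}x")  # -> 1.44x
```

A geometric mean keeps one outlier title from dominating the aggregate the way a plain average of ratios can.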
I don't have the same depth of data on the GTX 1070 versus the RTX 2060 Super, but we know the 2060S also improves performance by roughly 1.15x, since it performs almost identically to the original RTX 2070. Its new $400 price puts it closer to the original GTX 1070's pricing than to the OG 1080's.
As for AMD, the 5700 and 5700 XT effectively replace the Vega 56 and 64. The slideshow below contains the results of our RX 5700 and 5700 XT tests. The Radeon RX 5700 matches the Vega 64 in almost every test but costs $350 instead of $500. It draws 74 percent of the Vega 64's power while outperforming the RTX 2060.
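Those two figures imply a tidy efficiency gain: roughly equal performance at 74 percent of the power works out to about a 1.35x improvement in performance per watt. A quick sanity check of that arithmetic:

```python
# Relative figures from our testing: the RX 5700 roughly matches the Vega 64
# while drawing 74 percent of its power.
rel_perf = 1.00   # RX 5700 performance relative to Vega 64
rel_power = 0.74  # RX 5700 power draw relative to Vega 64

print(f"{rel_perf / rel_power:.2f}x perf per watt")  # -> 1.35x perf per watt
```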
As upgrades for current Vega 56 and Vega 64 owners, the best case is moving from a Vega 56 to an RX 5700 XT. Even in that scenario, there are gains, but I'm fairly sure they're not as large as the improvement Turing offered over Pascal at Turing's prices. The Vega 56 was typically 1.08x to 1.12x slower than the Vega 64, but the 5700 XT's lead over the Vega 64 varies significantly depending on the game. In some cases, the two GPUs are tied.
AMD players with older cards, or Nvidia players looking to switch sides, are the likeliest customers for the RX 5700 and RX 5700 XT, and the performance these cards offer makes them a potentially compelling upgrade for those markets.
The new AMD launches have restored a better, more consumer-friendly balance at the upper end of the GPU market. The ultra-high-end remains less friendly. The RTX 2080 Super offers the smallest performance improvement of all the "Super" cards and doesn't do a very good job of justifying its $200 premium over the RTX 2070 Super. The Radeon VII and the RTX 2080 Super are only justifiable if you game at 4K and, honestly, they aren't all that compelling even then.
AMD has not yet announced plans for the midrange market, but the company needs refreshed cards for that space as well. Hopefully it won't be long before far more efficient and powerful chips are ready to replace the RX 570, 580, and 590.
As for whether Navi or Turing is the better upgrade path, that depends a bit on what you want: a little more speed (relative to the competition), or features such as ray tracing? Some users may not think even these gains are enough, which I understand. But we can at least say there are now performance-per-dollar gains over the previous generation. Six months ago, that was not the case.
It's no secret that high-end GPU prices have fallen recently, thanks to AMD's launch of the RX 5700 and RX 5700 XT. AMD has now stirred the pot a little, claiming it managed to bluff Nvidia into cutting prices, only to pull the rug out from under it and cut prices even further.
This comes from Hot Hardware's interview/podcast with Scott Herkelman, vice president of Radeon at AMD. Herkelman describes how AMD carefully planned its move, assessing the RTX cards' capabilities in terms of Nvidia's clock speeds, die sizes, revenue targets, expected margins, and more. AMD's initial prices for the RX 5700 and 5700 XT were $379 and $449, but after Nvidia unveiled its Super family, the company cut them to $350 and $400. According to AMD, this was the plan all along.
Herkelman explains how AMD carefully analyzed the RTX GPU family, including its prices, die sizes, and the margin of safety Nvidia had left itself. AMD chose its initial RX prices expecting Nvidia to undercut them, which set up AMD's own price cuts:
The prices we originally posted: we waited to see what they would post, then we made the appropriate move, not only to dislodge their Super series but also to block their 2060 and 2070 series. Because we knew they were having slower success, and we wanted to do a double jebait: not only block their Super strategy, but also slow down 2060 and 2070 sales.
First, let me say that everything AMD said about analyzing Turing's safety margin, its pricing, and other market factors strikes me as entirely plausible from the outset. We perform similar analyses ourselves, and AMD is better positioned than we are to understand certain aspects of Nvidia's manufacturing situation. Nvidia has maintained very high margins on its GPUs; company-wide gross margin last year was about 60 percent. When it raised prices with Turing, we argued that Nvidia did so partly because it faced no real competition from AMD.
Similarly, it makes perfect sense that AMD brought a part to market that gives it an advantage. Navi's die sizes are much smaller than Turing's: the RX 5700 and 5700 XT are 251 mm², while the RTX 2060 and RTX 2070 are 445 mm². The RTX 2070S, 2080, and 2080S are even bigger, at 545 mm², though AMD doesn't compete directly against the RTX 2070S.
Without information on wafer yields and costs, however, we can't directly compare what Nvidia and AMD likely pay per finished chip. AMD has a decisive die-size advantage, but AMD is also on a new node and presumably pays at least somewhat more per wafer. How all of this nets out, in the end, is uncertain. But clearly, AMD felt it had a usable advantage.
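A common back-of-the-envelope approximation (not AMD's or Nvidia's actual yield math, which accounts for defect density, scribe lines, and more) illustrates how much the die-size gap matters before cost-per-wafer enters the picture: candidate dies per 300 mm wafer.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die-per-wafer estimate: wafer area over die area, minus an
    edge-loss correction term. Ignores yield, scribe lanes, and aspect ratio."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(251))  # Navi 10 (RX 5700/XT)  -> 239 candidates
print(dies_per_wafer(445))  # TU106 (RTX 2060/2070) -> 127 candidates
```

Roughly 1.9x as many Navi 10 candidates per wafer as TU106 candidates, which is why a higher 7nm wafer price could still leave AMD with a cost edge, depending on yields.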
Nevertheless, once Nvidia announced the "Super" family, AMD had little choice but to cut its prices. At the original prices, the RX 5700 and 5700 XT would have been squaring off against the RTX 2060S and RTX 2070S instead of the RTX 2060 and RTX 2070. That's a tougher matchup for both cards, at higher prices. AMD would also have been forced into a near-repeat of the Radeon VII debut, in which AMD matched the RTX 2080 on performance and price but offered no new features, and did not receive a warm welcome. Yes, Navi does include new features, which we still want to cover in more detail, but none of them are revolutionary enough to have changed the equation.
At the original prices, the 5700 and 5700 XT would have been positioned worse against the RTX 2060S and 2070S than they are today. AMD made the decision to position its GPUs more advantageously by cutting prices. Do I think they planned this? Absolutely. Do I think AMD would have kept its prices higher if the 5700 and 5700 XT had been faster? Yes; I can hardly argue otherwise. I spent six months writing articles about how AMD would not give away its CPUs if it achieved performance parity with Intel, simply because some fanboys thought that would be a good idea. There's no reason to think the company wants to improve its CPU margins while happily selling GPUs for a fraction of what it could charge. Our slideshow, with the results of our RX 5700 XT and 5700 reviews compared against the RTX 2060, 2070, and 2080 (with the 2080 standing in for the 2070S), is below:
If Nvidia had not cut prices with its Super cards, I very much doubt AMD would have abandoned its own higher prices. That's not to say AMD didn't have a plan in advance, but it was a fairly predictable, straightforward one.
The biggest and most important takeaway here is that companies absolutely will raise prices when there is no competition. Nvidia didn't magically find a way to cut Turing's costs the same month AMD launched new GPUs. It raised prices with Turing partly because there was no competition from AMD to stop it.
As soon as AMD re-entered the market with a competitive part, GPU prices fell again. Had AMD been able to field competitive parts last year, Nvidia might never have been able to raise prices in the first place. If Nvidia hadn't been focused on squeezing gamers like a Juicero bag (and hadn't misread the crypto boom as new gaming sales), it might not have raised prices anyway. AMD pulling off a relatively prosaic price cut against Team Green isn't really the masterstroke it's being framed as, but high-end GPUs are undeniably cheaper today than they were. That's a win for everyone, no matter which hardware you prefer.
The GPU market has shifted considerably in the last month, thanks to the launch of AMD's Navi and Nvidia's reaction to it. As soon as AMD announced RX 5700 and 5700 XT prices of $379 and $449, Nvidia announced a new family of RTX "Super" cards. These Turing refreshes substantially improved the performance of the RTX 2060 and RTX 2070. In response, AMD cut its own GPU prices, repositioning the 5700 and 5700 XT at $350 and $400.
Now the RTX 2080 Super is arriving, with a smaller improvement at the top of the stack than Nvidia already delivered with the RTX 2060 Super and RTX 2070 Super. As a reminder, here's how the new cards compare with the ones they replace:
The RTX 2080 Super brings some improvements, including a fully enabled TU104 GPU, higher base and boost clocks than the RTX 2080, and, for the first time, slightly higher memory bandwidth. The GPU's TDP also rises to 250W, though tests showed the actual increase in power consumption was smaller than that.
The performance increase over the RTX 2080 is about 8 percent. That won't blow the doors off or drive upgrades, but it's still an 8 percent year-over-year improvement at the same price as the original RTX 2080. The gains are smaller than those the RTX 2060 Super or RTX 2070 Super delivered, and the RTX 2080 Ti remains the distant leader.
Both PCMag and Anandtech welcomed the performance improvement at the same $699 price as the original RTX 2080, though both also acknowledged the gains are smaller than on the lower-priced cards. At the moment, the RTX 2080 Super and especially the RTX 2080 Ti deliver less performance per dollar than their lower-end counterparts, so there's a strong case for buying around the more reasonably priced RTX 2060S or RTX 2070S instead. AMD's RX 5700 XT is another option for players who'd rather save $300 on a new GPU.
Chris Stobing of PCMag writes:
We think the Super cards are priced the way the RTX line should have been all along. Anyone who bought into RTX in 2018 certainly paid an early-adopter fee. AMD's lack of competition in the elite 4K gaming space has allowed Nvidia to price much of the GeForce RTX line freely so far.
Anandtech's take is similar. The RTX 2080 Super is not "a card that dramatically changes the calculus of the video card market. Instead, it is exactly what it says on the tin: a slightly faster 2080, offering a bit more performance (and performance per dollar) than before."
Whenever a large chip designer makes a node transition, it must choose a foundry partner to work with. These days, that comes down to one of two (possibly three) choices: TSMC, Samsung, and potentially Intel, which still technically operates a foundry business but hasn't made any significant customer announcements in some time. For the most part, it's a two-horse race between TSMC and Samsung.
TSMC has executed well at 7nm, locking up most of the big companies as early customers. The two foundries pursued different 7nm strategies: Samsung decided to wait until its extreme ultraviolet (EUV) lithography was ready, while TSMC introduced 7nm as a conventional-lithography node and plans to deploy EUV for mass production next year.
Samsung has been a little light on customers, though we know IBM will build chips with the foundry long-term. At a press conference in Seoul this week, Nvidia Korea chief Yoo Eung-joon confirmed that the two companies are collaborating on future GPU designs.
"It is significant that Samsung Electronics' 7-nanometer process is being used in the manufacture of our next-generation graphics processor," Yoo said. "Until recently, Samsung was working very hard to find partners for cooperation in the foundry."
The exact terms of the agreement haven't been revealed, nor which parts will be built at Samsung, but it's possible Nvidia will move entirely to its new partner. Designs for one foundry can't be easily or simply transferred to another, which means building the same chips at two leading-edge foundries doubles the design work. Nvidia could choose to use both companies, but sticking with Samsung alone would simplify some aspects of the design process. Samsung has presumably priced its 7nm capacity very competitively in hopes of winning the business.
As for when we might see 7nm Nvidia chips, nobody knows. AMD launches its 7nm Navi chips this Sunday, and Nvidia has already responded by adjusting prices on its existing RTX cards. But we also know Navi is expected to make the jump to "Big Navi" next year, and that's where Nvidia will likely want a 7nm answer of its own. If AMD's 7nm push proves effective, Nvidia won't want to meet it with an aging Turing family. If AMD is still lagging in overall competitiveness, Nvidia can afford to take its time with its next launch.
In any case, we don't expect Turing to hold the market as long as Pascal did. Pascal holds the record for the longest-serving GPU architecture in the consumer space, having dominated from May 2016 to September 2018. We don't know at all when Nvidia's 7nm parts will arrive, but a launch 12 to 14 months from now is a pretty reasonable bet.