Unfortunately, Radeon, AMD's graphics division, is dying, and the near future does not look positive, even less so with the arrival of Intel
The situation of AMD's Radeon division is very complicated right now, and the company's future in graphics hangs by a thread. The situation is critical, almost desperate, with a market share of 18.8% against NVIDIA's 81.2%. Sure, we are talking about only two players, but at the end of the year Intel enters the graphics market. Those who say Intel will barely compete with NVIDIA's mid-range and will simply go up against AMD are wrong.
Radeon is on the brink of demise
The days when ATI competed head-to-head with NVIDIA are long gone. Something went wrong the moment AMD bought ATI, and since then things have gone from bad to worse. The remedy was to take 60% of the budget away from the Radeon division to save the processor side. Ryzen is a success, but that success means Radeon is at least two years behind NVIDIA (being generous).
To this we must add a clearly wrong bet on AMD's part. NVIDIA releases graphics cards for the here and now, with a two-to-three-year horizon in mind, while AMD develops graphics for the future. Radeon is characterized by designs with large memory bandwidth, a technology of the future, but one that today cannot compete with GeForce.
The immediate future does not play in their favor either, not in the least. GCN is an outdated, dead architecture, as the Radeon VII shows. Navi will not be much better than Vega despite the hype; it will cover the mid-range and perhaps touch NVIDIA's high-end. And at the end of the year a new player, Intel, arrives, and at that point the situation will become even more complicated.
Vega highlights Radeon's problems
A closer look at the Radeon VII is the best way to see that Vega is a failed architecture and GCN an outdated design. But before getting into that, I want to stop at the fact that the Radeon VII is nothing more than a Radeon Instinct MI50 converted for gaming.
The first thing to look at is this card's power consumption. Vega 10 (RX Vega 56 and RX Vega 64) is built on 14nm, while Vega 20 (Radeon VII) is built on 7nm. That is a big lithography leap, yet curiously the new Radeon VII consumes more than the Vega 64. Clearly something is not working correctly: shrinking the lithography is supposed to significantly reduce consumption, and that has not happened.
The lithography jump should not only make the card more efficient, it should also make it much more powerful. There should be a 25-40% performance improvement, but in reality there is barely a 15% improvement at best. It is, without a doubt, yet another example of AMD's problems.
Lack of competitiveness in the high end
While it is true that Polaris GPUs (RX series) are cheaper than NVIDIA's cards, it is no less true that they barely compete in the mid-range. Vega 64 and Radeon VII are examples of an inability to compete in the high-end. Although the Radeon VII almost matches the RTX 2080 in performance, it does so at the cost of power consumption and temperature, two factors that make AMD's solution a complete disaster. On top of that, the price is quite similar, and the RTX 2080 additionally supports ray tracing and DLSS.
I should clarify at this point that, in my view, Turing is an architecture that arrived a year early. Ray tracing is barely implemented in games, which makes it unattractive, and DLSS still needs more development to be optimal. But despite everything, NVIDIA has been able to skip the Volta architecture and launch Turing early for a simple reason: THERE IS NO COMPETITION.
Sure, AMD has been competitive in the mid-range and low-end, mostly on price, but let's face it: those low prices mean less profit and less money for development. It is for this very reason that Radeon has been split into a separate division. Although it remains within AMD, Radeon is already treated as a sub-company. A logical decision, since Radeon is running a complete deficit.
Intel comes to conquer the GPU market, not to steal AMD's leftovers
Ever since Intel announced, with the signing of Raja Koduri, that it wanted to enter the graphics card market in earnest, it has been said that they are coming to compete in the entry range and, at most, the mid-range. This has been repeated by different outlets, even by people at NVIDIA and AMD. Anyone who thinks Intel is here to pick up the leftovers has no idea what they are talking about.
Intel's track record in graphics consists of its iGPUs and Larrabee, two real botches: the first merely gets the job done, and the second was a huge failure. The company is aware that another embarrassment would be lethal to its credibility. That is why they have bet big and are not skimping on expenses.
Raja Koduri arrives at Intel with a blank check
Betting on Raja Koduri is betting on a sure thing in graphics card development. Although Polaris is not as good as it should have been and Vega 10 has been a disaster, few people know more about GPU architecture. One reason for his departure from AMD was the 60% budget cut to the Radeon division and Lisa Su's insistence on bringing to market an unfinished architecture, poor in both performance and efficiency.
This Intel signing is a master move, and it happened not just because they put a large sum of money on the table (that too), but because they gave him a blank check. Koduri was offered all the financial, human and technical resources he needed. One result is that Intel will open a research and development center in India, Koduri's home country.
Obviously, having all those resources available comes with pressure: he must deliver a solution that competes with NVIDIA's high-end, no half measures. Failing to do so could cost him dearly, but everything seen so far suggests he will succeed. Intel recently released an open-source library for real-time ray tracing.
What's after Navi?
For me, that is the big question right now. I have the feeling Navi will arrive and go unnoticed, and I hope I am completely wrong. It has already been rumored that entry-level Navi will arrive first and then the mid-range, a completely wrong strategy. What has not been said is whether there will be a high-end Navi, something I do not expect to happen.
Navi, for me, will quite possibly mark the end of a sad and painful cycle. I think AMD knows there is little left to squeeze out and may already be working on a new GPU design that is much more efficient and powerful.
I should also say that AMD has stated several times that it will not implement ray tracing until it can run on entry-level graphics cards. This makes me think that whatever they are designing after Navi will have this technology integrated, and that it will not actually be present in Navi; if it is, it will be only partial, with a heavy performance cost.
Some will think I hate AMD and the Radeon division, but I'm sorry to tell you I don't. I like NVIDIA because it has found its way and offers powerful graphics cards and a very broad ecosystem. Yes, their products are expensive and Turing is overpriced, that's undeniable, but AMD has nothing to counter with.
Hopefully AMD can compete with Navi, but I don't think it will. This post is not intended as an attack on the company, but as an X-ray of Radeon's reality. The near future looks bleak: they cannot cope with GeForce, and the arrival of Intel may cost them even more market share.
Honestly, I don't think AMD's situation will improve by 2020, rather the opposite, but I am hopeful that by 2021, when a new GPU design and architecture should arrive, we will see it become competitive again. Above all, I hope so for the sake of everyone's wallet, since three companies competing tooth and nail will benefit us in price and also in technology, as they will have to innovate and take far more risks.