
[RUMOR] NVIDIA Ampere GPUs would replace Pascal, and we would not see NVIDIA Volta GPUs

Rumor has it that the next generation of NVIDIA GPUs will be called Ampere, and that it will arrive not as a successor to Volta but as a direct replacement for the Pascal GPUs.

There are still a few months to go before NVIDIA Volta, the new generation of graphics cards that, according to all forecasts, will be presented at CES 2018 in Las Vegas and launched during the first quarter of the year. Now the name of the following generation of NVIDIA GPUs has just surfaced, and it is a fairly logical one: Ampere. We say logical because Volta refers to the volt and Ampere to the ampere, so it would make sense for a later generation to reference the watt or the ohm, as a clear tribute to Ohm's law.
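As a quick aside (basic electrical relations, added here for context and not part of the original report): Ohm's law and the electrical power law are what tie these unit names together, which is why a hypothetical Ohm or Watt generation would round out the theme:

\[
V = I \cdot R \quad \text{(volts = amperes} \times \text{ohms, Ohm's law)}, \qquad
P = V \cdot I \quad \text{(watts = volts} \times \text{amperes)}
\]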

It was the German outlet Heise.de that revealed the name of the next GPU generation, which, as we mentioned, will be Ampere, and which could be presented at GTC 2018. We do not have the slightest bit of data on Ampere yet, and be careful, because NVIDIA could skip Volta entirely: according to this outlet, the replacement for the Pascal graphics cards (GTX 1000 series) will be Ampere, not Volta as planned and as we all had in mind, and more information could be revealed at the GPU Technology Conference 2018.

We are going to treat this information as a rumor, but in light of recent events, such as the recent arrival of the GTX 1070 Ti on the market, it could be plausible. Let's not rule out NVIDIA launching a refresh of the Pascal cards and reserving Volta for data centers, servers and artificial intelligence. Logically, this is only a rumor and, like all rumors, it should be taken with a grain of salt. Jumping straight from Pascal to Ampere without passing through Volta seems quite unlikely, although such a move could be prompted by the AMD RX Vega fiasco.

Source: wccftech


Robert Sole

Director of Content and Writing at this website, technician in renewable energy generation systems and low-voltage electrical technician. I work in front of a PC, in my free time I am in front of a PC, and when I leave the house I am glued to the screen of my smartphone. Every morning when I wake up I walk through the Stargate to make some coffee and start watching YouTube videos. I once saw a dragon... or was it a Dragonite?


3 comments

  1. Uff, AMD PACTS WITH INTEL, Intel wants to develop graphics... I see it as a very bad time to fall asleep in the graphics world. Not continuing to advance and leave AMD far behind could cost NVIDIA dearly; this move, instead of saving them one architecture and a lot of money, could cost them... Remember that many tech giants have disappeared over a single critical error, and it could happen to NVIDIA.

  2. Well, look at what NVIDIA did with Pascal and Volta, and how the performance difference between the Titan V and the Titan Xp does not match the difference in specs and power consumption.
    They set the bar very high for themselves. Pascal continues the philosophy of the 700/900 series, optimized for games, while with Volta they went back to the Tesla philosophy because of AMD (it was ridiculous that in artificial intelligence an AMD GPU from a few years ago, and a cheaper one at that, such as the Radeon Pro Duo, the Fury or the 480, performed better and consumed less in those areas than their P100). The problem is that this backfires on them, since they left the compute side quite far behind in those years.
    The worst part is that the Vega FE and the NVIDIA V100 are not that far apart in consumption and performance (in AI, both AMD and NVIDIA are pushing hard: AMD with open source, which makes things easier for public research agencies, and NVIDIA with proprietary technologies for which they charge extra for every bit of support), but the pricing is worse. Worse still, NVIDIA's AI cores are harder to take advantage of and only show gains after long periods; and since they are not open source, you have to pay to use them and to apply changes, whereas AMD's are expected to be open source, which would gain them even more ground if they become the standard (imagine the problems NVIDIA will have if their AI cores end up incompatible with AMD's, should that happen).
    All the more so because open source is becoming more relevant in those areas.
