Does the Battle Between AMD and NVIDIA Benefit Consumers?

When it comes to buying a GPU there are lots of people that will tell you AMD or NVIDIA is better for “x” reasons, and some of those arguments may be right. However, for the most part there is only a slight difference when it comes to non-reference cards, or cards built outside of AMD or NVIDIA but still on those architectures. That’s why when comparing AMD and NVIDIA’s latest flagship cards we’ll stick with the reference versions.

Solid card: the Radeon R9 290X

First up, let’s talk about my personal favorite GPU maker: AMD. AMD’s latest and greatest card, the R9 290X, is a direct response to NVIDIA’s dominant GTX 780. With this card AMD gave us the best of both worlds: an inexpensive graphics card that is on par with, if not better than, the GTX 780. As AnandTech pointed out in their review of the 290X, when “placed against NVIDIA’s lineup the primary competition for 290X will be the $650 GeForce GTX 780, a card that the 290X can consistently beat.” This is probably due to the higher boost clock of the 290X and its larger pool of VRAM.

When this card first hit the market it was priced very competitively at $550, forcing NVIDIA to react by dropping the GTX 780 to $500. At the time of writing, the current price for a reference R9 290X on Newegg.com is $700. There are many reasons for this price jump, be it the rising popularity of Bitcoin mining, supply issues, or memory prices shooting up, but the fact is that this card should no longer be considered competitive in any sense of the word. With the non-X R9 290 costing as little as $580 on Newegg.com and being only slightly less powerful, there is no good reason to pick up the X version unless money just isn’t a concern for you.

Even with the R9 290X’s jump in price, AMD is still the place to go for the best bang for your buck in GPUs. The R9 290 and 290X are the only new cards AMD has turned out so far under the new branding system, so all the cards below them in the 200 series are just rebranded cards from the HD 7000 series. Those cards will therefore be much cheaper, especially if you can still find them under their HD 7xxx names.

Rising star: the GeForce GTX 780 Ti

Now it’s the green team’s turn, and boy did NVIDIA come back swinging! With the release of the 290X from AMD, all eyes were on NVIDIA to see how they could possibly combat such a promising card. Lots of people, including myself, were betting on them just lowering the price of the GTX 780 to be more competitive.

Then it happened: not only did they lower the price, but they also gave us a new card that fully unlocked the potential of the GK110 GPU and put any notion of AMD being on top to rest. This new card is known as the GeForce GTX 780 Ti, and it is a monster of a card. Not only is it the fastest single-GPU card you can buy, but it is also the same price as the R9 290X.

This card screams performance, and that shows in what it is capable of at HD and higher-than-HD resolutions. For gamers playing at 1080p or 2.5k, the 780 Ti really shows off its strength. This was demonstrated by AnandTech and Eurogamer, with the card running BioShock Infinite at 111 fps and 73 fps, respectively, on ultra settings. By comparison, the R9 290X runs BioShock Infinite at 95 fps at 1080p and 62 fps at 2.5k. The 780 Ti can also handle games reasonably well at 4k, averaging around 36 fps on ultra settings.

Not only is the pricing the same for the two cards, with the GTX 780 Ti significantly ahead of its competition, but according to AnandTech’s review, NVIDIA’s flagship also uses less power and produces less heat. The R9 290X draws 375 watts under load in Crysis 3 and runs at 94 C, just outpaced by the 780 Ti at 372 watts under load and 83 C. So the GeForce GTX 780 Ti runs cooler, makes less noise, uses less power, costs the same, and comes with all of NVIDIA’s fancy features like GeForce Experience, game streaming to SHIELD, and ShadowPlay for you Twitch streamers.
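For readers who want to put numbers on the gap, the figures above can be turned into a rough performance-per-watt comparison. This is just a back-of-the-envelope sketch using the 1080p BioShock Infinite fps and Crysis 3 load-power figures cited in this article; real-world efficiency varies by game and settings.

```python
# Rough perf-per-watt sketch using the figures cited above (illustrative only):
# 1080p BioShock Infinite fps and Crysis 3 full-load power draw.

cards = {
    "GTX 780 Ti": {"fps_1080p": 111, "load_watts": 372},
    "R9 290X":    {"fps_1080p": 95,  "load_watts": 375},
}

for name, c in cards.items():
    eff = c["fps_1080p"] / c["load_watts"]  # frames per second per watt
    print(f"{name}: {eff:.3f} fps/W")

# Relative fps advantage of the 780 Ti at 1080p:
advantage = (111 / 95 - 1) * 100
print(f"780 Ti is ~{advantage:.0f}% faster at 1080p")  # → ~17% faster
```

With near-identical power draw, the 780 Ti’s fps lead translates almost directly into better efficiency, which matches the review conclusions quoted above.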

Switching sides

With all of this said and the GPU wars over for now, I’m still a big AMD fan and really love what they are trying to do with Mantle and cheaper GPUs. However, I recently upgraded my PC and decided to switch to the green team for the simple fact that the GTX 780 Ti is the fastest single-GPU card you can buy, and I love using the SHIELD to play my PC games while lying in bed. My computer uses less power and produces less heat, GeForce Experience optimizes all of my games for me, and I can even share my crazy moments with my friends through ShadowPlay if I want.

Additional resources

Here is a useful link to a benchmarking website that not only shows you how the various cards perform in its benchmark but also the price-to-performance ratio for each GPU. Here are some additional links to AnandTech’s reviews of the R9 290X and the GTX 780 Ti.

  • StephenRinger

    First off, the answer to the title’s question is a definite yes.
    Second off, as a software engineer with a lot of experience working with graphics across multiple generations of both manufacturers’ products, I can say there is a massive difference between the target consumers of these cards. I’ll keep it simple, though, and stick to shorter explanations.

    Let’s start with AMD/ATI – the GEICO of GPUs.
    I say Geico because, generally, their aim is to be cheaper. And that goal they usually meet pretty easily. And, just like Geico, for most users it’s probably exactly what they want. For professionals, or anybody who appreciates consistency and longevity in their products, they haven’t met the standard in nearly a decade. While their cards have significantly higher clock speeds, they also have significantly higher noise levels and power draw, and lack decent driver support. Just look at any benchmark tackling OpenCL (general-purpose computing on a GPU). Fortunately, most ‘gamers’ don’t care about this yet, but they should start, as more and more developers utilize this functionality for faster computations for things like physics and whatnot.

    Secondly, NVIDIA – the Apple of PC components.
    Nvidia is like the gold standard for PC component manufacturers. Like Apple, their products always have been and always will be the first to innovate in the graphics space (not only with the 780 Ti, just look at their history, or their recently announced Tegra K1). Unlike AMD, their focus is not solely on affordability and ‘performance for the most demanding and popular games.’ Their focus is top-notch performance in anything you’d ever like to use your GPU for. AMD has been better about updating their drivers recently to fix their long-standing bugs, but usually only testing and targeting whatever the most popular games are at the time. Nvidia, however, tests and patches on a nearly bi-weekly basis (or faster) for the most demanding professional graphics software, which is generally more intensive and demanding than any game out today. That, and the fact that nearly everyone who gets paid to make games or anything with real-time graphics uses Nvidia cards, should be enough to tell you that they know what they’re doing.

    And now why both are awesome and completely relevant.
    It’s capitalism. Everyone benefits from competition. Just look at all the cable and internet providers that DON’T have competition – they fucking suck. The prescription meds industry sucks ass, too. So how has the competition between AMD and Nvidia helped consumers? AMD gives frugal gamers the ability to run awesome games (even if it may run up their power bills significantly higher… e.g. the 290X will probably add ~$6 per month to your power bill compared to the 780… be careful). Not only that, they both learn from each other. AMD has historically been the best in non-discrete graphics, e.g. the ones embedded on those (crappy… I’ll rant about that some other time) AMD CPUs in laptops and cheap all-in-ones. Nvidia has obviously studied their success, and that’s how we got the Atom/Ion combo and the Tegra chips. And then there’s the obvious price battle that you mentioned.

    Anywho, I’ve typed far too much, and now I can’t even remember where I was going with this. Good article, though! I’d like to see more hardware nerdism from here. (you should do ram and ramdisks next :P)

    • Tyler Beckham

      I agree with you completely about AMD being the Geico of the two companies. I think that’s my big draw to them, though. The first complaint I hear about PC gaming from people who are more console oriented is usually that you spend as much money on one GPU as you do on an entire Xbox or PSWhatever. With AMD you get very nice performance for USUALLY a great price.

      This, however, was the big reason I went with NVIDIA this time instead of AMD. I wanted to get rid of my two HD 6970’s, which really did act as a room heater, for one mega card that could handle 2.5k gaming on ultra settings. I was going to stick with AMD until the price of the 290X shot through the roof, at which point there was no possible way you could justify buying it over the 780 Ti since they cost the same.

      Ever since then I have been loving my 780 Ti and all the premium features that NVIDIA throws in with it. SHIELD streaming is one of the best things ever and means I don’t even need to build a Steam Machine, thanks to the console mode already built in. Also, if you have a really good internet connection at home, you can stream from up to 100 miles away with the help of some fancy know-how.

      I also think the competition between the two is a good thing, but I hate that you can’t get both experiences on one card. AMD is doing a nice job with Mantle, and NVIDIA has their whole PhysX thing going on as well as G-Sync coming soon. I know they both say they are not going to lock down some of these features, at least not Mantle, but then again neither of them is going to adopt the other’s tech simply because it would be like admitting the other side wins.

    • I’d have to agree with the whole competition aspect. That’s one of the reasons I love the AMD/NVIDIA battle, and even battles like Amazon/Barnes & Noble. It always benefits the consumer.

      I think your arguments about AMD vs. NVIDIA in relation to video cards also play out in the AMD vs. Intel battle as well. Same kind of philosophy applies (which you also hinted at) – AMD provides a cheaper alternative, while Intel provides the premium experience. However, I think things are closing in a bit more in that space.

      Also, thanks for your comments! We’re looking at expanding our PC-based article selection (and we appreciate Tyler’s contributions so far), so look for more in this space in the future (monitors/displays, RAM, etc.).
