AMD R9 290X Review & Benchmarks: The New GPU King?

Updated: October 25, 2013

Okay, so the dust is starting to clear as AMD’s latest flagship card has finally been released to an eager public, and the results may surprise some and confirm for others what they thought would be the real deal in the end. This round we were very lucky that our friend and tech peer Linus Sebastian was able to get his hands on a reference R9 290X, and that he was willing to do a video review for us so we could have some launch-day coverage.

The R9 290X is looking like one serious contender for NVIDIA. The drivers used in today’s tests are brand new from AMD, and as they mature, performance should only improve. Even with immature drivers, AMD has put the pressure back on NVIDIA’s plate and made them wake up and, well, smell the AMD coffee. The card tested was a reference sample from AMD, and it still shows it can compete with and beat the GTX 780 and Titan in many of the tests.

Take this into consideration as well: AMD just released this beast, and so far we have only seen a reference-cooled card, not anything from the big players who put aftermarket cooling on their high-end cards. With better cooling, or for those who brave the liquid-cooling route, I think the new R9 290X has much more to offer under the hood. As of now the card simply runs too hot for any crazy overclocking, and the tests were run with that in mind, since if the card gets too hot it could be destroyed, leaving no test sample at all.

On a personal level I am very happy to see AMD mount a real comeback with a product that shows ingenuity and progress for the company. There is a lot of new technology coming to the table, but until Mantle, TrueAudio, and XDMA really start to rear their heads, we will not know exactly how much of an effect this new technology will have on our gaming world. The card running very hot, though, is a trend that all reviewers are seeing, and the one Achilles’ heel it openly has. The better cooling that will come on aftermarket cards will hopefully address that problem and improve the thermals all around.

We should begin to get our review samples in a few days, but I think I may have found a reference card we can borrow, so we will have a fully written review shortly. For now, Linus was totally cool enough to share the information with you good folks, the fans of Tech Of Tomorrow, and I wish you guys would give him a thanks in the comments below. So far it seems AMD has pulled a rabbit out of their hat and created something magical for their fans, which is good for us all. Thank you for reading, and be sure to watch the video to get all the fun facts and performance numbers for the new R9 290X. Are you AMD fans a happy bunch of campers? Let us know your thoughts and feelings in the comments below. Happy Friday, peeps, and thanks again, Linus.

R9 290X Aftermath
Thoughts on the 290X now that it's officially released?
  • Yaz Akiera

    Very happy to see AMD finally coming back strong: $550 for a card that beats the Titan and 780 in most of the tests. But I guess NVIDIA will bring a 780 Ti and Titan Ultra to the table, as it took AMD too long to release a card that beats a one-year-old Titan. I was really hoping to see double the Titan’s performance, or at least 30-40% ahead.

    • Seth Hoke

      What, where did it beat the Titan in most tests? lol

      • Cameron

        Beats the Titan at stock clocks in most tests.

      • Serpent of Darkness

        It’s going to beat the GTX Titan in BF4 without a doubt. Have you seen the BF3 benchmarks in its Quiet and Uber modes?

  • Cal

    I don’t know why the Titan is compared so often; a lot of the drivers that come out really boost the 780, not the Titan. The Titan is mainly a compute card that can push high-end gaming as well.

    • Serpent of Darkness

      Essentially, all cards are “compute cards.” Workstation cards really just have error-correcting memory (ECC) and better support, besides a few minor perks; there is really nothing significantly different between gaming and workstation cards… You can still use three R9 290s in CrossFireX to do the same number-crunching tasks as a workstation card, but then 64-bit floating-point performance kicks in. Now, if you want to compare the GTX Titan to the K6000, it’s really just the CUDA core count, frame buffer size, and ECC that differ: the Titan has fewer cores, less RAM, and no ECC. You can use the K6000 for gaming, but it would be a waste. It has more CUDA cores, so frame times would be lower, but you’ll never use the 12 GB frame buffer unless you’re in surround with six 1440p monitors. The Titan is really just a premium card, like the ASUS ROG line of graphics cards. The ARES II and MARS II cards are basically once-a-year, premium, best-of-the-best binned GPUs on a PCB. It doesn’t matter which brand they come from; they’re simply the king cards. You can say the GTX Titan was like that for 2012 to 2013, but the ARES II still did better in performance. Bottom line, to help clear things up for you: the GTX Titan isn’t a generation card. The 780 is this year’s generation card, and that’s why the drivers will be more tuned for it…

  • Alio

    I’m very, VERY disappointed with AMD. This was supposed to be their comeback, the salvation for us consumers to get better prices without being gouged by current NVIDIA pricing. The GTX 690 is way better than a Titan and was $1000 USD; now you almost can’t find it on the market because they’re trying to sell their little “Titan.” Still, the better option is still NVIDIA, and I really hope AMD gets 300% improvements with Mantle; otherwise, I’m waiting for next year.

    P.S.: I KNOW the 690 is a dual-GPU card, I KNOW the price is much better, but I’m talking about PURE PERFORMANCE, so please don’t be dumb and come saying that!!

    • Squirtleyourmom

      If you want a 690 “killer,” look at the 7990; it was released quite a while back. (I put “killer” in quotes because it didn’t blow away the 690, just barely edged it, but it does have Mantle support.)

      • Alio

        Yeah, I know what you’re talking about and I support you, but the thing is, I’m talking about single-GPU cards, and I was particularly interested in seeing AMD take the crown this time; but I already talked about that anyway.
        Why did I mention the 690, then? Just for the price comparison. That kinda gets me angry. The Titan should have been announced at $800 AT LEAST back when they showed it. But hey, what can I do?

        • Serpent of Darkness

          You won’t find an AMD solution with 1.8x the streaming processors to match the GTX Titan, however you define your belief in the words “pure performance.” Pure performance could be looked at as a sum of different elements of a graphics card. Most importantly, I think it’s the amount of FLOPS the card produces, but this matters less with PC games; it matters more with OpenGL and OpenCL… This ideal of pure performance in a single graphics card could happen with “Tenerife” in 2014 if AMD follows through with that, but who knows for sure. Originally it was going to be the upcoming contender for Maxwell on a 22nm node. It was supposed to have 2x the streaming processors of the Tahiti chip on a single die, so 2048 x 2, though in truth it could be 2816 streaming processors x 2 on a single die, plus 16 additional streaming processors doing serial computing, unlike the 2816 doing parallel computing… If Hawaii XT is over the 5.63 TFLOPS line, Tenerife could be 5.5 to almost 6.0 TFLOPS at stock. My calculations say 11.26 TFLOPS, but that seems far-fetched; 2048 x 2 would push up to 7.37 TFLOPS…

          • Serpent of Darkness

            On another note, the GTX Titan and 780, at stock settings, are under the 5.0 TFLOPS line. The 690 easily flies over this mark because it’s a dual-GPU card…

          • Alio

            What I meant by “pure performance” was how well it would perform in games, generally. No need to talk about future architectures, bro!
            P.S.: My bad, I should have been more specific… To conclude, it’s an amazing-value card, but not the king of “FPS” (trying to get it right for ya!)

          • Serpent of Darkness

            PC games are going to be biased toward each brand. A lot of D3D9 games are optimized for NVIDIA; you can see this in the “old” COD games, the Batman games, PlanetSide 2, etc., especially the ones coded and rendered on NVIDIA graphics cards. NVIDIA does pretty well in D3D11, but they have no hardware support for D3D11.1, only software support, so their performance is kind of sucking in BF3, and it will suck a lot more in BF4 without Mantle. This is due to the Kepler architecture; we probably won’t see full D3D11.1 support, and zero D3D11.2 support, until Maxwell. GCN is more optimized for D3D11, 11.1, 11.2, DirectCompute, and OpenGL, so it’s easy to spot which games will be better with which graphics-card brand. That said, increasing the CUDA core or streaming-processor count can overcome API-optimization hurdles; this can be seen in the recent BioShock title (AMD-optimized), where NVIDIA still pulls a few more frames than AMD. NVIDIA has never been able to top AMD in Tomb Raider because it’s optimized for D3D and DirectCompute. A lot of people are predicting BF4 will heavily favor AMD graphics cards… I suspect at minimum settings you can play the game on D3D11.0, while at max settings it can use the D3D11.1 API.

          • Fiberton

            The reason many games ran so well on NVIDIA cards is NVAPI. BF3 uses it for NVIDIA cards, but I can all but assume that support is no longer there for BF4, as EA/DICE has decided AMD is who they’d rather be associated with. I think the competition is great. NVIDIA is in a tight spot, as they are just a graphics company overall. They already lost 79 cents per share since the 290X release; that is about a $450M market-cap loss. Yikes.

          • Fiberton

            AMD will be switching to 20nm sometime in 2014 as was said on their last earnings call.

    • Nope.pdf

      You shouldn’t be. It costs much less than the 780 and performs better than anything else; you will never get pure performance and price at the same time. The R9 290X is the best way to go, unless you are willing to spend a shitload of money.

      • Alio

        Read again! It’s important, disregarding anything about price, for a company to have the best product in its category. Is price important? Of course it is!! But as angry as I am about NVIDIA pricing the Titan at $1000, that won’t change the fact that it is the best card. I’m talking about the performance race, so when talking about that, please don’t even mention price! It’s not the subject!! :)

  • Greg Reavis

    This is a good card. Yes, the cooler is terrible in my opinion, but the card does keep up with and surpass the 780 in some tests. For a card that is, at the moment, $100 cheaper, it’s awesome. Not only that, when the aftermarket coolers come around we will see the card at its full potential. I hope the 780 gets a price drop soon; otherwise (once the aftermarket coolers come out) there is no reason to spend the extra on the 780 for gaming alone.

    • Seth Hoke

      But if the 780 gets dropped to $550, I don’t see why you would buy the 290X. And then there’s the 780 Ti to think about; if they get that out at $550-$600, well, that would be interesting, very interesting.

    • Ashton Harris

      Just wait for the next drivers, then we’ll be much closer to its true potential.

    • Squirtleyourmom

      More like in ALL tests, and close enough to the Titan. And keep in mind that AMD paid EA and Activision to use Mantle in their AAA games (Frostbite games from EA: Mass Effect 4, Mirror’s Edge, Star Wars Battlefront, etc.), AND TrueAudio seems more useful than PhysX. My recommendation would be to buy this card with an aftermarket cooler or liquid cooling, or if you can’t go AMD, then wait till Q3 next year for Maxwell GPUs.

      • Fiberton

        AMD never paid EA; they already came out and said this never happened. On Twitter the Frostbite developer said that once BF4’s Mantle support is finished, all other Frostbite games will be able to use Mantle. He said over 15 titles.

    • Serpent of Darkness

      I think the cooler is crap because AMD knows people will invest another $100 in a water-cooling solution. Even when overclocked, the cards won’t go above 50°C unless the person’s loop is set up really badly…

      • Josh Peet

        ASUS, MSI, EVGA, etc. dual-fan cooling solutions will dramatically reduce the thermal issues like always, and increase the stock core frequencies, for 10-30 bones more than the reference models. H2O is too hardcore for me ;-)

        I only liquid-cool my CPU, and that’s a closed-loop one.

        • Serpent of Darkness

          That’s true, but EVGA doesn’t sell AMD. MSI and ASUS air-cooling solutions are top-notch, but they still have limitations, and water does a lot better at cooling your investments. I have had two 6990s in CrossFireX at full load stay under 50°C at 1060 MHz core and 5500 MHz memory. I will agree that the non-reference models with better air cooling do a better job than the stock solution on the AMD cards, but they aren’t more efficient than water-cooling. In this scenario, the money you invest is in some way proportional to the performance: if money invested goes up, performance goes up, and performance in this case is cooling. I wouldn’t consider dihydrogen-monoxide cooling hardcore; it’s more like the best method there is. Unless you start using other methods like a chiller, or LN2, or antifreeze with two radiators in a cooler filled with ice, water-cooling is the less complicated, but just as OP, way to approach it… Going with a chiller or anything that extreme requires you to worry about condensation…

  • SGT Smith

    Great vid, guys. I guess AMD is not outdone.

  • Josh Peet

    Great review, guys. I remember back when new cards came out, the prices of the previous models dropped drastically. I also remember when new cards never cost more than $500 for the latest and greatest… at least for NVIDIA. I am glad that AMD still stays within the $150-550 price range each generation; it’s kinda making me want to switch over to the AMD team. I hope this Mantle tech proves to be true, because the prices have been getting out of hand. I cannot justify paying more than 600 bones for a card for just 15% more performance over a card that costs $200 less. And why is it that SLI and CrossFire are still so choppy or poorly optimized except for a handful of games? I want double the performance, not 150-175%; mainly talking NVIDIA here. Just because NVIDIA has a better card, does it really justify $100-200 more for borderline-better performance compared to AMD? NVIDIA needs to make that investment more worthwhile than just bragging rights.
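A quick note on the peak-throughput figures traded in the comments above: they come from the standard formula for theoretical single-precision compute, 2 floating-point operations (one fused multiply-add) per shader per clock, times shader count, times core clock. The sketch below uses the public reference specs of the cards discussed; note the results are teraFLOPS, so the thread’s “GFlops” figures are off by a factor of 1000.

```python
# Theoretical peak FP32 throughput: each shader/CUDA core can issue one
# fused multiply-add (2 floating-point ops) per clock cycle.
def peak_tflops(shaders: int, core_mhz: float) -> float:
    return 2 * shaders * core_mhz * 1e6 / 1e12

# Reference-card specs: (shader/CUDA core count, core clock in MHz)
cards = {
    "R9 290X":   (2816, 1000),     # "Uber mode" boost clock
    "GTX Titan": (2688, 837),      # base clock
    "GTX 780":   (2304, 863),      # base clock
    "GTX 690":   (2 * 1536, 915),  # dual GPU, base clock
}

for name, (shaders, mhz) in cards.items():
    print(f"{name:>9}: {peak_tflops(shaders, mhz):.2f} TFLOPS")
```

This is where the 5.63 number for Hawaii XT comes from, and why the Titan and 780 sit under the 5.0 TFLOPS line at base clock while the dual-GPU 690 clears it. Kepler boost clocks raise the NVIDIA numbers somewhat, and peak FLOPS rarely maps directly onto game frame rates.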