Is Nvidia Playing Fair With Their New Development Tools?
from the dirty-tricks dept
There are some heavy details in all of this, many of them at least somewhat technical, so let's dispense with the typical introductions and get right to the meat of this GPU-industry sandwich. It's no secret to anyone paying attention to the video game industry that the graphics processor war has long been waged primarily between rivals Nvidia and AMD. What you may not realize is just how involved those two companies are with the developers that use their cards and tools. It makes sense, of course, that the two primary players in PC GPUs would want to get involved with game developers to make sure their code is optimized for the systems on which their games will be played. That way, gamers end up with games that run well on the cards in their systems, buy more games, buy more GPUs, and everyone is happy. According to AMD, however, Nvidia is attempting to lock out AMD's ability to get involved with developers who use the Nvidia GameWorks toolset, and the results can already be seen in the hottest game of the season thus far.
Some as-brief-as-possible background to get things started. First, the GameWorks platform appears to be immensely helpful to developers creating graphically impressive games.
Developers license these proprietary Nvidia technologies, like TXAA and ShadowWorks, to deliver a wide range of realistic graphical enhancements to things like smoke, lighting, and textures. Nvidia engineers typically work closely with the developers on the best execution of their final code. Recent examples of Nvidia GameWorks titles include Batman: Arkham Origins, Assassin's Creed IV: Black Flag, and this week's highly anticipated Watch Dogs.
Now, while this is and should be a licensing-revenue win for Nvidia, aspects of the GameWorks agreement may actually extend that win into a realm that threatens the larger gaming ecosystem. As mentioned previously, both Nvidia and AMD have traditionally worked extremely closely with developers, even going so far as assisting them in optimizing the game code itself to offer the best experience on their respective cards. How? Well, I'll let AMD's PR lead, Robert Hallock, chime in.
"Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products," Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: "Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines," Hallock continues. "This change coincides with NVIDIA's decision to remove all public Direct3D code samples from their site in favor of a 'contact us for licensing' page. AMD does not engage in, support, or condone such activities."In other words, the dual symbiotic relationships that have always existed between developers and both Nvidia and AMD becomes one-sided, with AMD being locked out of the process in some very important ways. It means that an essential information repository and communications lines for development and game code optimization nearly become proprietary in favor of Nvidia. And, lest you think one shouldn't simply take the word of a rival PR flack on this kind of thing, other tech journalists appear to not only agree, but have predicted this exact outcome nearly a year ago when the GameWorks program was first rolled out.
"AMD is no longer in control of its own performance. While GameWorks doesn't technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It's impossible for AMD to provide a quick after-launch fix. This kind of maneuver ultimately hurts developers in the guise of helping them."Forbes' Jason Evangelho then digs into the title du jour, Watch Dogs, an Ubisoft production developed within the GameWorks platform. When a tech journalist is this surprised by how stark the difference in performance is between two rival GPU manufacturers, it's worth taking him seriously.
I've been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It's evident that Watch Dogs is optimized for Nvidia hardware, but it's staggering just how un-optimized it is on AMD hardware. I guarantee that when the game gets released, a swarm of upset gamers are going to point fingers at AMD for the sub-par performance. Their anger would be misplaced.
The benchmark graphic in Evangelho's piece may not appear all that staggering at first, until you understand the cards involved and what it actually represents. The two cards in question aren't remotely in the same category of power and cost. The AMD card that is barely keeping up with the Nvidia card is a $500 workhorse, while the Nvidia card is a mid-range, $300 staple of Nvidia's lineup. Both cards were updated with the latest drivers for Watch Dogs prior to testing. The problem, as suggested above, is that the level of optimization done for the Nvidia cards far outpaces what's been done on AMD's end, thanks to the way the GameWorks platform is licensed and controlled. Games outside of that platform, tested with the exact same cards, tell a far different story.
To further put this in perspective, AMD's 290x graphics card performs 51% better than Nvidia's 770 on one of the most demanding PC titles around, Metro: Last Light — which also happens to be an Nvidia optimized title. As you would expect given their respective prices, AMD's flagship 290x can and should blow past Nvidia's 770 and compete with Nvidia's 780Ti on most titles. To really drive the point home, my Radeon 290x can hit 60fps on Metro: Last Light with High quality settings and 4x anti-aliasing, at a higher resolution of 1440p.
There's some history here, with Nvidia having a reputation for being more proprietary than AMD, which has always been seen as more of an open-source, open-dialogue, open-competition company. Indeed, Nvidia even has some history of trying to hide collusion with competitors behind trade secret law. But if it's allowed to simply lock up the open dialogue that everyone agrees makes for the best gaming ecosystem all around, the results could be quite poor for the PC gaming community as a whole. Particularly if upset AMD GPU owners who aren't aware of the background end up pointing fingers at their co-victims of Nvidia rather than at the villain itself.
Filed Under: development, gameworks, gpus, optimization, video games
Companies: amd, nvidia
Reader Comments
Re: Re: Re:
Generally, things work like this: Nvidia introduces a proprietary solution, and AMD pushes for the same thing less expensively and in an actual standard. Take the G-Sync stuff, where you need specific Nvidia hardware in the display, while AMD got the very same functionality introduced into the VESA DisplayPort standard. Or the proprietary CUDA stuff, where AMD has been helping to establish OpenCL.
Nvidia is the one doing the shady business practices. AMD actually does things to help the overall computing world instead of locking things up behind proprietary crap.
Re: Re: Re: Re:
http://www.amd.com/en-us/innovations/software-technologies/mantle#overview
Re: Re: Re:
On the other hand, AMD often prevents developers from giving builds to Nvidia until a few days before release. Tomb Raider, anyone?
And the idea that AMD is seen as less proprietary or any more open is equally laughable. AMD would ship stripped .o files to partners while Nvidia sent out source. Instead of working on an open low-level standard, AMD asked developers to write yet another code path for an already over-stratified set of platforms.
Optimization is an overall problem on PC games because there are far too many sets of hardware to be optimized for all of them all of the time. So yes, when Nvidia invests a large amount of resources into making sure their codepath is optimized, that time isn't just taken from Nvidia; it's taken from the developers as well. Yes, it's sad that the market is in such a state that, in order to have a phenomenal-looking game (if you can even consider Watch Dogs that), it's going to be phenomenal only on a specific set of hardware. Need I remind people that Watch Dogs had already been delayed?
Speaking of which, have these people seen Watch Dogs? It's not exactly much to talk about. Frankly it doesn't seem well optimized in general, but that's just me.
Re:
That's not to say that Watch Dogs isn't horribly optimised, because it is. But I would give more latitude to AMD than to Nvidia in the GPU space over the decisions it makes.
The press battle talks about this as tools, which is a nebulous term. It's really pretty much libraries, and education on how to use those libraries.
Nvidia spent years and huge bucks developing the techniques to make these effects, and more money writing and tuning the code necessary to implement the ideas. Many movie and game effects weren't written by the studios; they came from Nvidia.
AMD does the same thing. Or I should say, used to do the same. When they ran into a cash crunch they stopped investing. ATI was still very profitable, but rather than investing in its long-term market position, it was used as a cash cow to subsidize the money-losing parts of AMD (the entire rest of the company, including the GlobalFoundries wafer-start commitments). It was probably needed to save the company, but you still have to look at it as cutting R&D.
What's happening now is what is supposed to happen in a competitive marketplace. Companies that invest in the next generation have a more advanced product and a competitive advantage. Companies that don't invest, or invest in the wrong area, end up with a less desirable product.
AMD chose to invest in next-generation console chips. They won every major console, displacing Nvidia GPU IP. Nvidia invested in visual computing, including libraries that do realistic rendering, simulating massive numbers of objects, flame effects, fog effects, hair, and facial and gesture realism. AMD has... TressFX for hair.
These features add huge value to games. Or I should say, figuring out how to do these things is innovative and inventive. Being able to do these effects in real time is astonishing. People don't know how to put a price on these features. But they can compare two implementations and buy the one that does the best job. Or pay the same for two that have equivalent performance.
In the GPU business the hard metric has long been compute speed and FPS (frames per second). That's easy to measure. But increasingly, customers have realized that driver quality and feature support, the expensive part of the GPU business, are far more important. Those are hard to measure... until features like these make a difference big enough to show up in FPS.
Pathetic Reporting Techdirt
http://www.forbes.com/sites/jasonevangelho/2014/05/28/nvidia-fires-back-the-truth-about-gameworks-amd-optimization-and-watch-dogs/
Re: Re: Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm
On Ubuntu 12.04 you had the Restricted Drivers Manager (separate from the manager under Software Sources) to manage proprietary drivers, which wasn't palatable enough for nVidia, so they nixed the auto-configs to force you to install from the command line. Bear in mind this was while nVidia was still officially 'working with' Linux.
Still, at that point you could have installed the nouveau (non-proprietary) drivers from the Restricted Drivers Manager.
Just like that time they told Ubisoft to remove DX 10.1 or else they would be excluded from the "Meant to be Played" money-for-free program:
http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
Or that time they "helped" Crytek with tessellation, coincidentally just as they released a GPU with super-fast tessellation performance:
http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2
Or when they were shipping PhysX compiled for the 586/x87 target instead of using modern SSE/AVX instructions (a rough sketch of that difference follows below):
http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/
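For anyone wondering what the x87-versus-SSE distinction actually means in practice, here is a minimal, hypothetical C sketch (not anything taken from PhysX) showing the kind of scalar floating-point loop at issue; the compiler flags in the comments are standard GCC options for choosing between the legacy x87 FPU and SSE scalar math.

/* fp_target_demo.c -- hypothetical example, not anything from PhysX.
 * The same source can target different FP units (standard GCC flags):
 *   gcc -O2 -mfpmath=387 fp_target_demo.c        -> legacy x87 FPU math
 *   gcc -O2 -msse2 -mfpmath=sse fp_target_demo.c -> scalar SSE math,
 *                                                   generally faster and
 *                                                   vectorization-friendly */
#include <stdio.h>

/* A physics-style inner loop: accumulate squared lengths. */
static float sum_of_squares(const float *v, int n)
{
    float acc = 0.0f;
    for (int i = 0; i < n; i++)
        acc += v[i] * v[i];   /* compiles to x87 or SSE math depending on flags */
    return acc;
}

int main(void)
{
    float v[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
    printf("%f\n", sum_of_squares(v, 4));
    return 0;
}

The Ars Technica piece linked above argues that PhysX's CPU fallback shipped with the first of those targets even though the second was readily available.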
BTW, Watch Dogs on consoles is optimized for the AMD GPU perfectly. But as soon as that same x86 codebase moves from console to PC, it gets the Nvidia GameWorks "upgrade," and BAM, there's your result.
"Meant to be Played" is an Nvidia program that "certifies" your game and slaps a two-second Nvidia banner on it every time you run said game. Nvidia pays developers for it like it's just an ad impression, but if you look deeper into the contract it gets pretty iffy. Nvidia gives you money, but also tells you how to optimize, what features to implement, and how. They don't just advise; they directly give you code to inject into your product. Code that makes competitors' hardware appear slower.
Intel used to do the very same thing to AMD with their ICC compiler. The compiler injected a piece of code that checked the CPU vendor string EVERY TIME your code ran. Change the vendor string and the program gets faster ON THE SAME HARDWARE.
http://www.osnews.com/story/22683/Intel_Forced_to_Remove_quot_Cripple_AMD_quot_Function_from_Compiler_
The FTC forced Intel to stop this practice. Who will force Nvidia to stop?
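To make that vendor-string check concrete, here is a hedged C sketch of the general dispatch pattern being described. It is only an illustration of the technique, not Intel's actual compiler output, and fast_path/generic_path are hypothetical placeholders.

/* vendor_dispatch_demo.c -- illustrative only; not ICC's real generated code. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang helper header for the CPUID instruction */

static void fast_path(void)    { puts("optimized (e.g., SSE-tuned) routine"); }
static void generic_path(void) { puts("slow generic routine"); }

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = { 0 };

    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX. */
    if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
    }

    /* The pattern at issue: branch on the vendor name itself rather than
     * on the feature bits CPUID also reports, so non-Intel CPUs get the
     * slow path regardless of which instructions they actually support. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        fast_path();
    else
        generic_path();

    return 0;
}

Dispatching on the feature bits CPUID reports (SSE2 support, for example) rather than on the vendor name would avoid the problem entirely.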
For instance, Nvidia helped Crytek use a new, sophisticated feature of their upcoming GPUs. Nobody was misled. It was simply: "Use this technique to get better visual quality. It was too expensive to do with old GPUs, but we now have the hardware and software advances to make it feasible."
How is that legitimately bad for anyone but competitors?
And "They way it's meant to be played" program.. Nvidia advises game creators how to make their game faster by looking at effectively it's utilizing the GPU, and moving some of the physical simulation from the CPU to GPU. They aren't making the game slower for AMD users, they are making the game faster for Nvidia users.
How is that legitimately bad for anyone but competitors?
It's not at all like the Intel compiler generating deliberately bad code for AMD processors. It wasn't a case of not generating special extension instructions such as SSE. Telling the compiler to optimize for AMD resulted in code that would run slower everywhere. Telling it to optimize for generic Intel processors, without SSE, resulted in code that ran fast on AMD as well. That's trying to trip your competitor rather than running faster yourself.
But it's hard to support Nvidia when they do something like this.
Still, it's just the opinion of AMD; I'm not going to hate on either of them until a dev says it's bad.
Re:
Nvidia is one of the shadiest businesses in the current hardware industry.
Oh, and also, they use paid shills to perpetuate the "AMD makes horrible drivers" myth (by the way, it was Nvidia drivers that actively destroyed hardware because they fucked up fan control, and I also remember the statistics from Microsoft where Nvidia drivers were responsible for about 10 times as many system crashes as AMD/ATI drivers) and to generally badmouth AMD hardware, especially when new GPU releases are upcoming.
Fact is, Nvidia lies and cheats, and has been doing so for a very long time now.
Market forces should handle this.
2) People buying the game with such hardware will be disappointed. A large number of them will understand point #1.
3) Such folks will become reluctant to buy further Ubisoft games.
The normal pain-feedback cycle applies here, unlike in many other, non-competitive situations (broadband, anything related to movies/records, etc.). If Ubisoft can make more money pissing off 40% of their target market, great. If not, they'll work harder to make sure the game performs well on both chipsets.
Laying any blame at nVidia's feet (or AMD's) is silly. Ubisoft makes the call, Ubisoft reaps the results. They don't have a monopoly on the market and Watch Dogs isn't big enough to make people switch display adapters.
I don't use those words casually.
It's something completely obvious, but only in retrospect. Before there was a solution, nobody knew it was a problem.
The usual competition in the GPU wars is an immediate claim of "that doesn't matter", followed by implementing a similar feature a year or two later, when it suddenly matters.
A good example is fixing frame rate variability, which caused jerky game play even with high FPS rates.
The reaction to G-Sync was "no one cares," followed a few days later by "[oh shit]" and a mad scramble by AMD to find something... anything... that would have the same benefits.
AMD figured out an alternate approach. The AMD approach is not quite as good, but it still gets most of the benefit. It was possible to reprogram the latest AMD hardware to put out a variable sync signal, and that capability could be quickly added to an existing VESA standard.
AMD would not have done this on their own. The motivation was an innovation by Nvidia. AMD was strictly reacting.
Re:
Also, it takes far longer to get amendments to a standard ratified than it does to build a frame-buffering/repeating device to fake a variable-frame-rate display. Yes, fake: LCD tech requires a refresh at certain intervals to maintain the image, so the nVidia solution is to replay the old image at that next interval, then display the new image as soon as it is ready (even if an LCD refresh isn't needed).
This is just display-level double buffering, something that really shouldn't require much extra display logic or hardware. You can keep the last completed frame in the frame buffer of the video card and time updates relative to the forced display refresh (20 Hz, as an example). If you can get a completed frame into the frame buffer before the next forced refresh, you update the frame buffer and trigger a display refresh with a signal to the display (the only new display 'logic' required). This update resets the clock until the next required refresh.
If you can't make the deadline, you simply leave the frame buffer intact and wait until the refresh, then you update the buffer again and trigger a display redraw.
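As a rough sketch of the scheduling logic described above (render_frame, copy_to_front_buffer, and trigger_display_refresh are hypothetical stand-ins for real driver and display calls, and the 20 Hz floor is just the commenter's example figure):

/* adaptive_refresh_demo.c -- a toy model of the presentation logic above.
 * The display helpers are hypothetical stubs, not any vendor's API. */
#include <stdbool.h>
#include <stdlib.h>
#include <time.h>

#define MAX_REFRESH_INTERVAL_NS 50000000LL   /* 20 Hz forced-refresh floor */

static bool render_frame(void)            { return rand() % 2; }  /* stub: "is a new frame ready?" */
static void copy_to_front_buffer(void)    { }                     /* stub: update the frame buffer */
static void trigger_display_refresh(void) { }                     /* stub: tell the panel to scan out */

static long long ns_since(struct timespec then)
{
    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);
    return (now.tv_sec - then.tv_sec) * 1000000000LL + (now.tv_nsec - then.tv_nsec);
}

int main(void)
{
    struct timespec last_refresh;
    clock_gettime(CLOCK_MONOTONIC, &last_refresh);

    for (int presented = 0; presented < 100; ) {
        if (render_frame()) {
            /* New frame finished before the deadline: show it immediately
             * and reset the clock until the next forced refresh. */
            copy_to_front_buffer();
            trigger_display_refresh();
            clock_gettime(CLOCK_MONOTONIC, &last_refresh);
            presented++;
        } else if (ns_since(last_refresh) >= MAX_REFRESH_INTERVAL_NS) {
            /* Deadline missed: leave the buffer intact and re-scan the
             * previous frame so the LCD image doesn't decay. */
            trigger_display_refresh();
            clock_gettime(CLOCK_MONOTONIC, &last_refresh);
        }
    }
    return 0;
}

The key property is that a fast frame never waits for a fixed refresh tick, while a slow frame never lets the panel go unrefreshed past its floor.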
nVidia's solution is crude and invasive; its only purpose is to make it appear like they are doing something special or innovative.
This is not a case of Nvidia helping optimize; they are paying developers for CRIPPLING their products.
You don't get to sell millions of copies as a graphics card bundle (http://www.geforce.com/getwatchdogs) by simply optimizing; you bend over backwards to please your slave master.
Re:
Oh the shame of the free market!
A biased post. Here is the Nvidia response
To put this particular argument to bed, I told Cebenoyan I wanted crystal clear clarification, asking “If AMD approached Ubisoft and said ‘We have ideas to make Watch Dogs run better on our hardware,’ then Ubisoft is free to do that?”
“Yes,” he answered. “They’re absolutely free to.”
And there’s nothing built in to GameWorks that disables AMD performance? “No, never.”
Perhaps more fascinating was Nvidia’s response when I flipped the situation around. What about AMD-partnered titles like Battlefield 4 and Tomb Raider? How much lead time did Nvidia receive — and how much would they need — to optimize Nvidia GPUs for those games? While I didn’t receive a direct answer, what I got was Nvidia returning fire.
“It varies. There have been times it’s been more challenging because of what we suspect stems from deals with the competition,” Cebenoyan says. “It doesn’t happen often. But when it does there’s a fair amount of scrambling on our part. I can tell you that the deals that we do, and the GameWorks agreements, don’t have anything to do with restricting anyone’s access to builds.”