Is Nvidia Playing Fair With Their New Development Tools?

from the dirty-tricks dept

There are some heavy details in all of this, many of them at least somewhat technical, so let's dispense with the typical introductions and get right to the meat of this GPU industry sandwich. It's no secret to anyone paying attention to the video game industry that the graphics processor war has long been waged primarily between rivals Nvidia and AMD. What you may not realize is just how involved those two companies are with the developers that use their cards and tools. It makes sense, of course, that the two primary players in PC GPUs would want to get involved with game developers to make sure their games are optimized for the systems on which they'll be played. That way, gamers end up with games that run well on the cards in their systems, buy more games, buy more GPUs, and everyone is happy. According to AMD, however, Nvidia is attempting to lock out AMD's ability to get involved with developers who use the Nvidia GameWorks toolset, and the results can already be seen in the hottest game of the season thus far.

Some as-brief-as-possible background to get things started. First, the GameWorks platform appears to be immensely helpful to developers creating graphically impressive games.

Developers license these proprietary Nvidia technologies like TXAA and ShadowWorks to deliver a wide range of realistic graphical enhancements to things like smoke, lighting, and textures. Nvidia engineers typically work closely with the developers on the best execution of their final code. Recent examples of Nvidia GameWorks titles include Batman: Arkham Origins, Assassin's Creed IV: Black Flag, and this week's highly anticipated Watch Dogs.
Now, while this is and should be a licensing-revenue win for Nvidia, aspects of the GameWorks agreement may actually extend that win into a realm that threatens the larger gaming ecosystem. As mentioned previously, both Nvidia and AMD have traditionally worked extremely closely with developers, even going so far as assisting them in optimizing the game code itself to offer the best experience on their respective cards. How? Well, I'll let AMD's PR lead, Robert Hallock, chime in.
"Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products," Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: "Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines," Hallock continues. "This change coincides with NVIDIA's decision to remove all public Direct3D code samples from their site in favor of a 'contact us for licensing' page. AMD does not engage in, support, or condone such activities."
In other words, the dual symbiotic relationships that have always existed between developers and both Nvidia and AMD become one-sided, with AMD being locked out of the process in some very important ways. It means that an essential information repository and the lines of communication for development and game code optimization become nearly proprietary in favor of Nvidia. And, lest you think one shouldn't simply take the word of a rival PR flack on this kind of thing, other tech journalists not only appear to agree, but predicted this exact outcome nearly a year ago when the GameWorks program was first rolled out.
"AMD is no longer in control of its own performance. While GameWorks doesn't technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It's impossible for AMD to provide a quick after-launch fix. This kind of maneuver ultimately hurts developers in the guise of helping them."
Forbes' Jason Evangelho then digs into the title du jour, Watch Dogs, an Ubisoft production developed within the GameWorks platform. When a tech journalist is this surprised by how stark the difference in performance is between two rival GPU manufacturers, it's worth taking him seriously.
I've been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It's evident that Watch Dogs is optimized for Nvidia hardware, but it's staggering just how un-optimized it is on AMD hardware. I guarantee that when the game gets released, a swarm of upset gamers are going to point fingers at AMD for the sub-par performance. Their anger would be misplaced.


The graphic above may not appear all that staggering at first, until you understand the cards involved and what it actually represents. The two cards in question aren't remotely in the same category of power and cost when compared with one another. The AMD card that is barely keeping up with the Nvidia card is a $500 workhorse, while the Nvidia card is a mid-range $300 staple of their lineup. Both cards were updated with the latest drivers for Watch Dogs prior to testing. The problem, as suggested above, is that the level of optimization done for the Nvidia cards far outpaces what's been done on AMD's end, and that is thanks to the way the GameWorks platform is licensed and controlled. Games outside of that platform, with the exact same cards being tested, tell a far different story.

To further put this in perspective, AMD's 290x graphics card performs 51% better than Nvidia's 770 on one of the most demanding PC titles around, Metro: Last Light — which also happens to be an Nvidia optimized title. As you would expect given their respective prices, AMD's flagship 290x can and should blow past Nvidia's 770 and compete with Nvidia's 780Ti on most titles. To really drive the point home, my Radeon 290x can hit 60fps on Metro: Last Light with High quality settings and 4x anti-aliasing, at a higher resolution of 1440p.
There's some history here, with Nvidia having a reputation for being more proprietary than AMD, which has always been seen as more of an open-source, open-dialogue, open-competition company. Indeed, Nvidia even has some history of trying to hide collusion with competitors behind trade secret law. But if it's allowed to simply lock up the open dialogue that everyone agrees makes for the best gaming ecosystem all around, the results could be quite poor for the PC gaming community as a whole. Particularly if upset AMD GPU owners who aren't aware of the background end up pointing fingers at their fellow victims rather than at the villain itself.



Filed Under: development, gameworks, gpus, optimization, video games
Companies: amd, nvidia


Reader Comments



  1. identicon
    Anonymous Coward, 30 May 2014 @ 6:40pm

    So, it's not patent protection, it's keeping information that Nvidia is developing secret from a competitor? It seems that many of the posters here have been arguing that keeping secrets is a better alternative than patents. While the secrets in this case may be hurting AMD, it still seems like Nvidia is playing by generally accepted rules. Sure, you may be critical of Nvidia for hurting AMD's customers, but, they ARE competitors, and it would seem like Nvidia would do anything to maintain an edge, and they are not using patents or copyright to do that. So, what's the problem? Keeping secrets is now bad too?


  2. identicon
    Aerilus, 30 May 2014 @ 6:45pm

    Nvidia is just pissed that AMD is riding their bumper when they used to have ATI lapped. AFAIK, AMD chips are in the new Xbox and PlayStation; don't know why you would want to optimize for Nvidia if you are releasing to console. AMD has better Linux support, better support for OpenGL, and a better price-to-performance ratio. What's not to love?


  3. icon
    Dark Helmet (profile), 30 May 2014 @ 6:55pm

    Re:

    The point is that they're changing the game. AMD could do something like this as well, but they don't, and the end result of Nvidia doing it is a worse off gaming consumer.


  4. identicon
    Anonymous Coward, 30 May 2014 @ 6:57pm

    AMD's no better than Nvidia in this space and them trying to play the sole victim here is laughable. Look at any single AMD-optimized game and you get the same result - AMD performance will far outpace Nvidia performance when trying AMD-proprietary settings. Tomb Raider, anyone? Mantle?

    And the idea that AMD is seen as less proprietary or any more open is equally laughable. AMD would ship stripped .o files to partners while Nvidia sent out source. Instead of working on an open low-level standard, AMD asked developers to write yet another code path for an already over-stratified set of platforms.

    Optimization is an overall problem for PC games because there are far too many hardware configurations to optimize for all of them all of the time. So yes, when Nvidia invests a large amount of resources into making sure their code path is optimized, that time isn't taken just from Nvidia - it's taken from the developers as well. Yes, it's sad that the market is in such a state that in order to have a phenomenal-looking game (if you can even consider Watch Dogs that), it's going to be phenomenal only on a specific set of hardware. Need I remind people that Watch Dogs had already been delayed?

    Speaking of which, have these people seen Watch Dogs? It's not exactly much to talk about. Frankly it doesn't seem well optimized in general, but that's just me.


  5. identicon
    Anonymous Coward, 30 May 2014 @ 7:18pm

    Re:

    It's an anti-competitive practice that interferes with the business relationship between developers and AMD. Real competition is making a better product that customers want to purchase more than your competitor's product. It's not forcing developers to avoid working with, or be unable to work with, your competitor.


  6. identicon
    Anonymous Coward, 30 May 2014 @ 7:52pm

    Response to: Anonymous Coward on May 30th, 2014 @ 6:57pm

    AMD is seen as more open because they actually release open-source driver code and card interface documents, and Nvidia tends not to.


  7. icon
    DB (profile), 30 May 2014 @ 8:06pm

    Disclaimer: I'm a little biased on this topic, but I'm fairly well informed. Evaluate for yourself.

    The press battle talks about this as tools, which is a nebulous term. It's really pretty much libraries, and education on how to use those libraries.

    Nvidia spent years and huge bucks developing the techniques to make these effects, and more money writing and tuning the code necessary to implement the ideas. Many movie and game effects weren't written by the studios; they came from Nvidia.

    AMD does the same thing. Or I should say, used to do the same. When they ran into a cash crunch they stopped investing. ATI was still very profitable, but rather than investing in its long-term market position it was used as a cash cow to subsidize the money-losing parts of AMD (the entire rest of the company, including the GlobalFoundries wafer-start commitments). It was probably needed to save the company, but you still have to look at it as cutting R&D.

    What's happening now is what is supposed to happen in a competitive marketplace. Companies that invest in the next generation have a more advanced product and a competitive advantage. Companies that don't invest, or invest in the wrong area, end up with a less desirable product.

    AMD chose to invest in next generation console chips. They won every major console, displacing Nvidia GPU IP. Nvidia invested in visual computing, including libraries that do realistic rendering, simulating massive numbers of objects, flame effects, fog effects, hair, facial and gesture realism, etc. AMD has.. TressFX for hair.

    These features add huge value to games. Or I should say, figuring out how to do these things is innovative and inventive. Being able to do these effects in real time is astonishing. People don't know how to put a price on these features. But they can compare two implementations, and buy the one that does the best job. Or pay the same for two that have equivalent performance.

    In the GPU business the hard metric has long been compute speed and FPS (frames per second). That's easy to measure. But increasingly customers have realized that driver quality and feature support -- the expensive part of the GPU business -- are far more important. That's hard to measure.. until features like this make a huge difference that can be measured in FPS.


  8. identicon
    Bios, 30 May 2014 @ 8:21pm

    Pathetic Reporting Techdirt

    Odd how you choose to leave out the response from Nvidia, which mind you was published a couple days ago. This is not what I've come to expect from Techdirt.

    http://www.forbes.com/sites/jasonevangelho/2014/05/28/nvidia-fires-back-the-truth-about-gameworks-amd-optimization-and-watch-dogs/


  9. identicon
    Anonymous Coward, 30 May 2014 @ 9:11pm

    Simple solution, AMD, open up the specifications for your cards so developers don't HAVE to ask you for help.


  10. identicon
    Anonymous Coward, 30 May 2014 @ 9:24pm

    Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm

    They already do that. It's part of why the open-source Radeon drivers aren't as much of a PITA as the reverse-engineered Nouveau drivers for Nvidia.


  11. identicon
    Anonymous Coward, 30 May 2014 @ 9:42pm

    And AMD pushes Mantle that doesn't work AT ALL on NVidia or Intel GPUs. Pot calling the kettle black.


  12. identicon
    rasz_pl, 30 May 2014 @ 10:15pm

    Sure they are

    Just like that time they told Ubisoft to remove DX 10.1 or else they would be excluded from the "meant to be played" free-money program:

    http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1


    or that time they "helped" Crytek with tessellation, coincidentally just as they released a GPU with super-fast tessellation performance:

    http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

    Or when they were shipping PhysX compiled for a 586/x87 target instead of using modern SSE/AVX instructions.

    http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/
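
    For anyone wondering what "compiled for x87 instead of SSE" means in practice, here is a toy comparison (my own illustration, nothing to do with Nvidia's actual PhysX source): the same dot product written as plain scalar code, roughly what a 586/x87 target produces, and with SSE intrinsics that handle four floats per instruction.

    ```cpp
    #include <immintrin.h>
    #include <cstdio>

    // Scalar version: one multiply-add at a time, as an x87-targeted build would emit.
    float dot_scalar(const float* a, const float* b, int n) {
        float sum = 0.0f;
        for (int i = 0; i < n; ++i) sum += a[i] * b[i];
        return sum;
    }

    // SSE version: four lanes per iteration (n assumed to be a multiple of 4 here).
    float dot_sse(const float* a, const float* b, int n) {
        __m128 acc = _mm_setzero_ps();
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);
            __m128 vb = _mm_loadu_ps(b + i);
            acc = _mm_add_ps(acc, _mm_mul_ps(va, vb));
        }
        float lanes[4];
        _mm_storeu_ps(lanes, acc);
        return lanes[0] + lanes[1] + lanes[2] + lanes[3];
    }

    int main() {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        std::printf("scalar=%f sse=%f\n", dot_scalar(a, b, 8), dot_sse(a, b, 8));
    }
    ```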


    BTW, Watch Dogs on consoles is optimized for AMD GPUs PERFECTLY. But as soon as that same x86 codebase moves from console to PC it gets the NVIDIA GameWorks "upgrade" and BAM, there is your result.

    "means to be played" is NVIDIA program that "certifies" your game and slaps 2 second Nvidia banner every time you run said game. Nvidia pays developers for it like its just an ad impression, but if you look deeper in the contract it gets pretty iffy. Nvidia gives you money, but also tells you how to optimize, what features to implement, and how. They not only advise, they directly give you code to inject into your product. Code that makes competitors hardware appear slower.


    Intel used to do the very same thing to AMD with their ICC compiler. The compiler injected a piece of code that checked the CPU vendor string EVERY TIME your code ran. Change the vendor string = the program gets faster ON THE SAME HARDWARE.
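
    A minimal sketch of the mechanism being described (an illustration of the idea, not Intel's actual compiler output): the startup code reads the CPUID vendor string and only takes the fast path when it sees "GenuineIntel", regardless of whether the CPU actually supports the relevant instructions.

    ```cpp
    #include <cpuid.h>
    #include <cstring>
    #include <string>

    static std::string cpu_vendor() {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};
        if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
            std::memcpy(vendor + 0, &ebx, 4);   // vendor string comes back in EBX,
            std::memcpy(vendor + 4, &edx, 4);   // EDX, ECX, e.g. "GenuineIntel"
            std::memcpy(vendor + 8, &ecx, 4);   // or "AuthenticAMD"
        }
        return vendor;
    }

    // Stand-ins for a vectorized kernel and a slow generic kernel.
    void compute_fast_sse(float* out, const float* in, int n) {
        for (int i = 0; i < n; ++i) out[i] = in[i] * 2.0f;
    }
    void compute_baseline_x87(float* out, const float* in, int n) {
        for (int i = 0; i < n; ++i) out[i] = in[i] * 2.0f;
    }

    void compute(float* out, const float* in, int n) {
        // Dispatching on the vendor *name* instead of on feature bits is what
        // the FTC objected to: an AMD CPU with SSE still gets the slow path.
        if (cpu_vendor() == "GenuineIntel")
            compute_fast_sse(out, in, n);
        else
            compute_baseline_x87(out, in, n);
    }
    ```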

    http://www.osnews.com/story/22683/Intel_Forced_to_Remove_quot_Cripple_AMD_quot_Function_from_Compiler_

    FTC forced Intel to stop this practice. Who will force Nvidia to stop??


  13. icon
    DB (profile), 30 May 2014 @ 11:09pm

    Actually, almost all of those situations make the opposite of the point you suggest.

    For instance, Nvidia helped Crytek use a new, sophisticated feature of their upcoming GPUs. Nobody was misled. It was simply "Use this technique to get better visual quality. It was too expensive to do with old GPUs, but we now have the hardware and software advances to make it feasible."

    How is that legitimately bad for anyone but competitors?

    And "They way it's meant to be played" program.. Nvidia advises game creators how to make their game faster by looking at effectively it's utilizing the GPU, and moving some of the physical simulation from the CPU to GPU. They aren't making the game slower for AMD users, they are making the game faster for Nvidia users.

    How is that legitimately bad for anyone but competitors?

    It's not at all like the Intel compiler generating deliberately bad code for AMD processors. It wasn't a case of not generating special extension instructions such as SSE. Telling the compiler to optimize for AMD resulted in code that would run slower everywhere. Telling it to optimize for generic Intel processors, without SSE, resulted in code that ran fast on AMD as well. That's trying to trip your competitor rather than running faster yourself.


  14. identicon
    Anonymous Coward, 31 May 2014 @ 12:31am

    AMD's main strategy is to add more cores; Nvidia "fixed" DX11...
    But it's hard to support Nvidia when they do something like this.
    Still, it's just AMD's opinion; I'm not going to hate on either of them until a dev says it's bad.


  15. identicon
    Anonymous Coward, 31 May 2014 @ 1:23am

    Re:

    Considering that AMD have managed to leverage a display standard in order to improve frame synchronisation rates (FreeSync using DisplayPort 1.2a) and openly distribute their GPU drivers for Linux, and they aren't actively harming consumers' collective choice, I'd say that AMD have the upper hand here.

    That's not to say that Watch Dogs isn't horribly optimised, because it is. But I would give more latitude to AMD over Nvidia in the GPU space over the decisions it makes.


  16. icon
    Watchit (profile), 31 May 2014 @ 2:27am

    Re:

    The problem isn't Nvidia keeping their code secret, it's that the GameWorks platform prevents developers who use it from getting any feedback from AMD. Pretty much barring AMD from even helping out with code that has nothing to do with Nvidia's GameWorks.


  17. icon
    Watchit (profile), 31 May 2014 @ 2:33am

    Damn that pisses me off. Just recently got a new AMD card! Nvidia taking a piss on AMD users sure as hell isn't winning them any points on me buying a new card of theirs.


  18. identicon
    Anonymous Coward, 31 May 2014 @ 2:37am

    Re: Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm

    Installing Nvidia drivers on Linux is an extreme pain, at least when you are learning how to Linux (me, 3 years ago), especially when Ubuntu 12.04 didn't have a driver manager thing (which made me leave Ubuntu forever). I had to boot into recovery mode at a root prompt to install drivers for my two GeForce 230GT cards working in SLI.


  19. identicon
    Anonymous Coward, 31 May 2014 @ 4:51am

    Re: Re:

    Based on numerous comments, AMD has done something similar, and the Techdirt reporting is a bit lopsided on this one, in favor of AMD, when it should not be.


  20. identicon
    Anonymous Coward, 31 May 2014 @ 6:42am

    Re: Re: Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm

    Ubuntu does, and DID, have a driver manager thing; nVidia prevented them from using it for nVidia drivers.

    In Ubuntu 12.04 you had the restricted drivers manager (separate from the manager under Software Sources) to manage proprietary drivers, which wasn't palatable enough to nVidia, so they nixed the auto-configs to force you to install from the command line. Keep in mind this was still while nVidia was officially 'working with' Linux.

    Still, at that point, you could still have installed the nouveau (non-proprietary) drivers from the restricted drivers manager.


  21. icon
    Michael Donnelly (profile), 31 May 2014 @ 7:44am

    Market forces should handle this.

    1) Ubisoft knew the product would under-perform on AMD hardware of a similar class. They chose to release it this way.

    2) People buying the game with such hardware will be disappointed. A large number of them will understand point #1.

    3) Such folks will become reluctant to buy further Ubisoft games.

    The normal pain feedback cycle applies here, unlike in many other non-competitive situations (broadband, anything related to movies/records, etc). If Ubisoft can make more money pissing off 40% of their target market, great. If not, they'll work harder to make sure the game performs well on both chipsets.

    Laying any blame at nVidia's feet (or AMD's) is silly. Ubisoft makes the call, Ubisoft reaps the results. They don't have a monopoly on the market and Watch Dogs isn't big enough to make people switch display adapters.


  22. identicon
    Anonymous Coward, 31 May 2014 @ 9:03am

    Re: Re: Re:

    When has AMD done such a thing?

    Generally things work like this: Nvidia introduces a proprietary solution, and AMD pushes for the same thing in a less expensive form and as an actual standard. Like the G-Sync stuff, where you need specific hardware from Nvidia in the display, while AMD got the very same functionality introduced into the VESA DisplayPort standard. Or the proprietary CUDA stuff, where AMD has been helping to establish OpenCL.

    Nvidia is the one doing the shady business practices. AMD actually does things to help the overall computing world instead of locking things up behind proprietary crap.


  23. identicon
    AC, 31 May 2014 @ 9:03am

    Re: Re:

    About time someone got it right. Did the commenters spouting that Nvidia did nothing wrong even read the article? The whole of it is that if you use the GameWorks platform from Nvidia, you are NOT allowed to get help to make the game run better on AMD cards as well. So this GameWorks scheme is the problem, not AMD and Nvidia keeping secrets.


  24. identicon
    Anonymous Coward, 31 May 2014 @ 9:17am

    Re:

    You mean like how they "fixed" DX10, where displacement mapping (essentially tessellation) was stripped out because Nvidia was unable to make a compatible GPU in the required timeframe? Coincidentally, that was the reason ATI's Radeon HD 2900 series performed like arse: it was built for the un-neutered DX10.

    Nvidia is one of the most shady businesses in the current Hardware industry.

    Oh, and also, they use paid shills to perpetuate the "AMD makes horrible drivers" myth (btw, it was Nvidia drivers that actively destroyed hardware because they fucked up fan control, and I also remember the statistics from Microsoft where Nvidia drivers were responsible for about 10 times as many system crashes as AMD/ATI drivers) and to generally badmouth AMD hardware, especially when new GPU releases are upcoming.

    Fact is, Nvidia lies and cheats, they have been for a very long time now.


  25. icon
    DB (profile), 31 May 2014 @ 10:59am

    G-sync was an innovation.. an actual invention.

    I don't use those words casually.

    It's something completely obvious, but only in retrospect. Before there was a solution, nobody knew it was a problem.

    The usual competition in the GPU wars is an immediate claim of "that doesn't matter", followed by implementing a similar feature a year or two later, when it suddenly matters.

    A good example is fixing frame rate variability, which caused jerky game play even with high FPS rates.

    The reaction to G-sync was 'no one cares' followed a few days later by '[oh shit]' and a mad scramble by AMD to find something.. anything.. that would have the same benefits.

    AMD figured out an alternate approach. The AMD approach is not quite as good, but it still gets most of the benefit. It was possible to reprogram the latest AMD hardware to put out a variable sync, and the capability could be quickly added to an existing VESA standard.

    AMD would not have done this on their own. The motivation was an innovation by Nvidia. AMD was strictly reacting.


  26. identicon
    Anonymous Coward, 31 May 2014 @ 8:44pm

    A cardinal rule of politics is never to repeat an allegation in the course of denying it, e.g. Nixon telling the press, “I am not a crook.” Nvidia’s name certainly doesn’t inspire confidence: “invidia” is Latin for “envy” and the root of “invidious.” Maybe a re-branding is in order…


  27. identicon
    Anonymous Coward, 31 May 2014 @ 8:51pm

    TechDirt for Twitter: “Nvidia Gameworks. AMD game doesn’t.”


  28. identicon
    Anonymous Coward, 31 May 2014 @ 9:32pm

    Re: Re: Re:

    Frankly, I haven't seen a single ounce of proof that people are actively prevented from working with AMD if they work with GameWorks. They are prevented from giving others the source for the GameWorks middleware - but not their own builds.

    On the other hand, AMD often prevents developers from giving builds to Nvidia until a few days before release. Tomb Raider, anyone?


  29. identicon
    rasz_pl, 1 Jun 2014 @ 4:11am

    Technically no one is prevented. Nvidia simply will not "certify" your game as "meant to be played", and in effect WON'T PAY YOU, the developer, a huge bribe^^^ad revenue.

    This is not a case of Nvidia helping optimize, they are paying developers for CRIPPLING their products.


    You don't get to sell millions of copies as a graphics card bundle (http://www.geforce.com/getwatchdogs) by simply optimizing; you bend over backwards to please your slave master.


  30. icon
    Jack Of Shadows (profile), 1 Jun 2014 @ 6:55pm

    Re: Video Sync

    Sorry but that feature dates all the way back to the Amiga 1000 [1985]. Innovation? Go talk to the ghost of Jay Miner on that one.



  32. identicon
    Whatever, 2 Jun 2014 @ 2:12am

    Re:

    So what you are saying is that in a free market, they are not allowed to decide to code their games to work better on the cards of a company that helps them gain sales and makes them money.

    Oh the shame of the free market!


  33. identicon
    Grrr, 3 Jun 2014 @ 8:32am

    A biased post. Here is the Nvidia response

    From the Forbes article linked in a previous comment:

    To put this particular argument to bed, I told Cebenoyan I wanted crystal clear clarification, asking “If AMD approached Ubisoft and said ‘We have ideas to make Watch Dogs run better on our hardware,’ then Ubisoft is free to do that?”

    “Yes,” he answered. “They’re absolutely free to.”

    And there’s nothing built in to GameWorks that disables AMD performance? “No, never.”

    Perhaps more fascinating was Nvidia’s response when I flipped the situation around. What about AMD-partnered titles like Battlefield 4 and Tomb Raider? How much lead time did Nvidia receive — and how much would they need — to optimize Nvidia GPUs for those games? While I didn’t receive a direct answer, what I got was Nvidia returning fire.

    “It varies. There have been times it’s been more challenging because of what we suspect stems from deals with the competition,” Cebenoyan says. “It doesn’t happen often. But when it does there’s a fair amount of scrambling on our part. I can tell you that the deals that we do, and the GameWorks agreements, don’t have anything to do with restricting anyone’s access to builds.”


  34. icon
    John Fenderson (profile), 3 Jun 2014 @ 8:43am

    Re: A biased post. Here is the Nvidia response

    That hardly puts the argument to bed. It just says what we already knew: every player says the other guy is the one doing bad things. It seems likely that they're all doing bad things.


  35. identicon
    Anonymous Coward, 4 Jun 2014 @ 10:12am

    I've never cared for nVidia.


  36. identicon
    looncraz, 10 Oct 2014 @ 10:51pm

    Re:

    Variable frame-rate displays have been around in various forms for decades. The only thing new is that it is being done on an LCD instead of a CRT or projector.

    Also, it takes far longer to get amendments to a standard ratified than it does to build a frame-buffering/repeating device to fake a variable frame rate display. Yes, fake. LCD tech requires a refresh at certain intervals to maintain the image, so the nVidia solution is to replay the old image at that next interval, then play the new image as soon as it is ready (even if an LCD refresh isn't needed).

    This is just display-level double buffering - something that really shouldn't require much extra display logic or hardware. You can keep the last completed frame in the frame buffer of the video card and time updates relative to the forced display refresh (20 Hz, as an example). If you can get a completed frame into the frame buffer before the next forced refresh, you update the frame buffer and trigger a display refresh with a signal to the display (the only new display 'logic' required). This update resets the clock until the next required refresh.

    If you can't make the deadline, you simply leave the frame buffer intact and wait until the refresh, then you update the buffer again and trigger a display redraw.
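
    The scheduling logic described above is simple enough to sketch in a few lines of C++ (purely an illustration of the idea as laid out here, not any vendor's actual implementation):

    ```cpp
    #include <chrono>
    #include <cstdio>

    using Clock = std::chrono::steady_clock;

    struct Display {
        // Stand-in for the signal that tells the panel to scan out the current front buffer.
        void refresh_now() { std::puts("panel refresh"); }
    };

    struct VariableRefreshScheduler {
        Display& panel;
        std::chrono::milliseconds max_hold{50};       // force a refresh at least every 50 ms (the 20 Hz example)
        Clock::time_point last_refresh = Clock::now();

        // Called when the GPU finishes rendering a new frame: present it
        // immediately and reset the clock until the next required refresh.
        void on_frame_complete() {
            panel.refresh_now();
            last_refresh = Clock::now();
        }

        // Called periodically: if no new frame arrived before the deadline,
        // leave the frame buffer intact and replay the previous image so the
        // LCD keeps a valid picture.
        void on_tick() {
            if (Clock::now() - last_refresh >= max_hold) {
                panel.refresh_now();
                last_refresh = Clock::now();
            }
        }
    };

    int main() {
        Display panel;
        VariableRefreshScheduler sched{panel};
        sched.on_frame_complete();  // frame ready early: refresh right away
        sched.on_tick();            // deadline not reached yet: nothing happens
    }
    ```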

    nVidia's solution is crude and invasive - the only purpose it has is to make it appear like they are doing something special or innovative.


