Last week in London, AMD gathered just over a dozen journalists from all over Europe at an event it called its "GPU Technology Conference". We sat down for a round-table discussion of all things AMD GPU with Richard Huddy, Worldwide Developer Relations Manager, and Neal Robison, Global Director of Developer Relations, both from AMD, as well as Adrian Thompson, Vice-President of Global Marketing at Sapphire; Gareth Thomas, Senior Programmer at Codemasters; and Chris Kingsley, CTO of Rebellion.
For the most part, we've already covered AMD's position on many of these topics in our interview with Richard Huddy, AMD's Worldwide Developer Relations Manager, so we won't reiterate them here.
That's not the only reason I've opted for this blog post, though; the second is that I feel AMD runs out of steam once it has covered DirectX 11 and Eyefinity, and that its "open" attitude, however noble it might appear, may be covering for other inadequacies.
We'll come back to that in a minute, though - first, the main points made in the round-table:
- Early on in the discussion, Rebellion CTO Chris Kingsley stated that he believes this year will see PC gaming on the up again. He pointed to the industry's rapid uptake of DirectX 11, how easy it is to code for and to port to consoles, and argued that, given this generation of graphics cards, PC games should technically look a jump ahead of their console counterparts.
This pattern of continued PC evolution versus static consoles also played out when the PS2 and Xbox were at the back-end of their life cycles. However, we feel his enthusiasm might be overstated, since console gaming is stronger now than in any previous generation: marketing budgets and anti-piracy measures are greater, and the integration of internet services has people locked into that environment.
- AMD claims that Eyefinity is not elitist. We were told that driving three cheaper monitors is doable on a mainstream HD 5670, and that the gaming experience is better. AMD dropped in a sly excuse that some developers do not want Eyefinity in their games because it gives an unfair advantage in competitive multiplayer!
We accept that gaming can be better on surround monitors - providing the gameplay suits it - and competitive multiplayer (to the degree of standardising equipment) is a niche compared to PC gaming as a whole. However, we still maintain that a single, better-quality monitor beats having many cheaper ones - the same logic as a single, faster graphics card versus multi-GPU.
We're also very dubious about the claim that the Radeon HD 5670 has the pixel-pushing power necessary for Eyefinity resolutions. In our review, the card could barely manage playable framerates on a single 1680x1050 monitor, let alone three. Sure, you can turn the detail level right down to compensate, but would you really want to?
- Continuing on Eyefinity, AMD claimed that Samsung is working on narrow-to-zero-bezel monitors that should be available later this year. These should remove the distracting black bars.
If that wasn't enough though, AMD also said there should even be a stand to support all three together. Nice!
- Staying with the topic of buying multiple monitors in one go, we probed AMD and Sapphire about bundling cards with multi-monitor purchases (as Nvidia has done with its stereoscopic 3D kits), or working with LCD manufacturers to offer discounts for multi-buys.
Adrian Thompson, Vice-President of Global Marketing at Sapphire Technology, piped up at this point and shot down the idea with a barrel full of real-world experience. He claimed that the global channel (for product distribution) just doesn't work like that: any multi-product discount incentive created by the manufacturer will be stripped apart by the distributors or retailers to make extra money, or bundled with their own incentives to create a different offer for their own customers.
He elaborated with several convincing scenarios he'd witnessed - some illegal - that go completely outside of the control of both AMD and Sapphire.
- When the discussion moved on to game development, Chris Kingsley from Rebellion spoke about Aliens versus Predator and his relationship with its publishers.
Chris claimed that for years games have tried to emulate movies (Mafia being one of the very few notable examples that worked well) - this was still true even five years ago, when the method of designing a game started primarily with story and ideas, followed by engine and game design.
However, he says, these days at Rebellion the engine and technology are discussed before the game is even fleshed out. The team decides what kind of features it wants, and then knows the boundaries within which to construct the interaction and story. Chris claims this is also how the movie industry, with its growing reliance on CG, gets its ground-work done: work begins on the engine that drives the latest CG effects before the scripts are finalised (providing the budgets are in place, of course).
With regards to publishers, Chris informed the non-gaming-savvy crowd that publishers want the maximum effect for minimum cost, as you'd expect. This means they drive down the minimum spec of a game so that more people can play (and buy) it. Conversely, he said it was very rare for a game like Crysis or GTA4 to make it out of the door, because those games ship with detail settings in excess of what any PC on the market was (and still is) capable of.
For him and his team, this goes against the grain of professionals creating a product to the best of their ability as a talent showcase. This discussion brought him into alignment with AMD, as he explained that the tessellation in DirectX 11 offered Rebellion the potential of far greater detail in the up-close-and-personal encounters. He even revealed that some scenes of the game were scripted so that these encounters are destined to occur, and that his team had cranked up the potential texture detail and tessellation factor accordingly.
Naturally, once Nvidia releases its own DirectX 11 products, the same should be available on its cards as well.
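As a purely illustrative sketch of the idea - Rebellion hasn't published its heuristic, so the function, names and default numbers below are our assumptions - a distance-based tessellation factor might be computed like this, so that those scripted close-up encounters receive the densest geometry:

```python
def tessellation_factor(distance, min_factor=1.0, max_factor=64.0, falloff=20.0):
    """Fade subdivision linearly from max_factor right at the camera
    down to min_factor at `falloff` world units away and beyond.

    64 is the maximum per-edge tessellation factor Direct3D 11 allows;
    the other defaults are made-up tuning values for illustration.
    """
    t = max(0.0, 1.0 - distance / falloff)
    return min_factor + (max_factor - min_factor) * t

print(tessellation_factor(0.0))    # in the player's face: 64.0
print(tessellation_factor(10.0))   # mid-range: 32.5
print(tessellation_factor(100.0))  # distant geometry: 1.0
```

In a real Direct3D 11 title a value like this would be written per patch edge by the hull shader's patch-constant function, but the shape of the distance falloff is the same idea.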
- Finally, in another, related publishing discussion, Neal Robison, Global Director of Developer Relations at AMD, who has previously worked for Vivendi and Universal Studios, gave his opinion of Valve's Steam service.
Neal claimed that, speaking as an ex-publisher, Steam was the best thing to happen to the PC gaming industry in terms of reducing piracy and killing off the resell market. However, he was also keen to point out that Steam needs a well-known, international competitor to make sure it remains competitive and acting in the best interests of the game-buying community. Fair enough!
Our German friend, Nico Ernst from Golem.de, reminded Neal that even though some markets have embraced online purchasing, the German market has not - although Germany also has greater problems with PC gaming censorship, and different gaming tastes as a result.
You may or may not have noticed that there was a lot of discussion of DirectX 11 and Eyefinity as the future of gaming - including the "rooms of monitors for ultimate immersion" fantasised about by a few at AMD. However, very little was said about all the other aspects of GPU computing, which, at a "GPU Technology Conference", seems like a glaring omission.
The bottom line is: AMD does not have much to say about it.
We already know from our interview with Richard Huddy that AMD will have OpenCL-accelerated Bullet Physics this year, but AMD openly admits it is taking a very hands-off approach to GPU-accelerated computing. Its attitude is to simply supply the tools and let others get on with it.
After much discussion in the office over the last month, between this and AMD's position of being an "engineering company, not a marketing company", we feel it's a noble front for a deeper issue of underfunding and understaffing. AMD's developer relations applies itself to specific projects, rather than the more general campaign Nvidia runs with its TWIMTBP (The Way It's Meant To Be Played) program. AMD might crack out the angel wings to harp on about how developers come running to it from the Nvidia dev-rel program - GSC, developer of S.T.A.L.K.E.R., being AMD's trump card - but if Nvidia's program was that bad, AMD would likely have many more developers turning up on its doorstep than it could deal with.
We just don't see that being the case; simply, some companies prefer an intimate approach, with the intensive co-operative bundling and marketing that AMD does, while others prefer Nvidia's general support and sticker program.
One thing's for sure, though: AMD does the intimately co-operative stuff a hell of a lot better than Nvidia, because its open attitude is still much more acceptable to the end gamer. AMD has never pushed a game feature that you cannot run on Nvidia hardware, whereas the opposite still cannot be said (I'm looking at you, ATI GPU and Nvidia PhysX card combination).
The thought had crossed our minds that AMD could still want to push its CPUs to some degree - understandable, considering it devotes a considerable amount of its (admittedly more limited) resources to developing them. However, the ATI and AMD engineering teams largely remain very separate (Neal Robison highlighted this fact, noting that ATI still works closely with Intel, which gets right up "AMD's" nose), and its GPUs consistently prop up its CPU business.
In comparison, for all its faults, Nvidia is (ironically) following Intel's tried and tested method of actively pushing the software side to sell hardware. Intel did it with specific compilers (ignoring the anti-competitive "GenuineIntel" tag), broad developer support and real-world demonstrations. Nvidia did the same in its recent Fermi deep-dive, when it showed off a real-world CUDA application and how the new technology improved a sample of work from a company it had been co-operating with.
Where Nvidia gets off its backside to inject engineers into companies and show them how CUDA accelerates their products, AMD sits back and hopes OpenCL - a solution without a killer problem - will be taken up by developers in their own time and, more importantly, at their own cost. We appreciate why the Nvidia (and Intel) way works: it pays for itself, because the companies you help develop software have customers who then buy your product.
By comparison there's little incentive to develop open solutions, because there's no guaranteed return - and while we always like the idea of open standards in the PC industry, it's undeniable that Nvidia has already gained more GPGPU traction. If it convinces enough of the industry to adopt CUDA extensions - something that's very likely given the lack of Larrabee and AMD's apathy - then, proprietary to its hardware or not, it will do well.
AMD's only hope is that Microsoft makes GPU physics a part of a future DirectX API built on the OpenCL standard - which would leave Nvidia in the position Creative was in with EAX versus OpenAL.
We have approached Nvidia recently to discuss its side of the coin and understand the depth of its recent bad-boy attitude as the pusher of its own standards. The growing perception among the vocal online community is that Nvidia is corrupting itself, and we want to find out how deeply that has taken root - or whether AMD is successfully playing on that fear to hide its own lazy-slash-underfunded attitude to wider industry support.