Thursday, August 13, 2009

Diminishing returns

One thing that has been on my mind since SIGGRAPH is the problem diminishing returns poses: when do you abandon an approach, algorithm, or model because the gains still to be had are shrinking?

The specific thing that has got me thinking about this is the rapid approach of fully programmable GPUs. So far this is not looking like another evolutionary change to the venerable D3D/OpenGL programming model; it looks like a radical change in the way we program graphics. Which is just another way of saying it will be a change in the way we *think* about graphics.

At SIGGRAPH there was a panel of various industry and academic luminaries discussing the ramifications -- is the OpenGL/D3D model dead? (not yet); what will replace it? (no one knows); is this an interesting time to be a graphics programmer? (yes). A colleague pointed out that the panel lacked a key constituency: a representative from a game studio that's just trying to make a game without a huge graphics programming team. The old model is on its last legs, and the new world is so open that to call it a "model" would be an insult to programming models. If you're an academic or an engine maker, this doesn't present a problem; in fact, it's a huge opportunity -- back to the old-school, software-renderer days. Anything's possible!

But for your average game developer, it could mean you are one poor middleware choice away from disaster. You don't have the resources of the engine creators, so being ripped from the warm embrace of the familiar D3D/OpenGL model can be a little terrifying. To put it another way: the beauty of a model like D3D/OpenGL is that no matter what engine or middleware you use, when it comes to the renderer there is a lot of commonality. In this new world, there are a bunch of competing models and approaches -- that's part of the point. Engine creators will have a bevy of approaches to choose from -- but if you're just trying to get a game done, and you find your engine's choice of approach doesn't match what you need to do, well, you've got a lot of work all of a sudden.

But we face these choices in software development all the time: when to abandon an algorithm or model because of diminishing returns. Change too soon and you've done a lot of extra work you could have avoided by just refining the existing code. Change too late and you miss opportunities that could differentiate your offerings. We like to pretend that doing cost/benefit analysis on this kind of thing is easy, as if we were comparing a Volvo against a Toyota at the car dealer. But often the issues are quite complex, and the fallout quite unexpected.
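The trade-off above can be sketched with toy numbers (everything here is hypothetical, not a real renderer-cost model): refining the existing code has no switching cost but its gains taper fast, while switching pays an upfront cost for fresher, slower-decaying gains. The break-even depends entirely on how many milestones you have left.

```python
# Illustrative sketch (all numbers made up): compare cumulative gains
# from refining an existing approach vs. switching to a new one.

def cumulative_gain(initial_gain, decay, upfront_cost, periods):
    """Total benefit over `periods`, where each period's gain shrinks
    by `decay` (diminishing returns), minus a one-time switching cost."""
    total = -upfront_cost
    gain = initial_gain
    for _ in range(periods):
        total += gain
        gain *= decay
    return total

# Refining: no switching cost, but the gains are nearly tapped out.
refine = cumulative_gain(initial_gain=10, decay=0.5, upfront_cost=0, periods=8)
# Switching: big upfront cost, but fresh gains that taper slowly.
switch = cumulative_gain(initial_gain=25, decay=0.9, upfront_cost=60, periods=8)

print(f"refine: {refine:.1f}  switch: {switch:.1f}")
# → refine: 19.9  switch: 82.4
```

With a horizon of two periods instead of eight, the same numbers flip the answer (refining wins, switching never recoups its cost), which is exactly why the timing of the decision is hard.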

It's a cliché, but we live in interesting times.


  1. The really scary stuff is the potential for extreme divergence. I'm gonna bet that the middleware winners will be the ones that actually go out of their way to avoid breaking the common DCC tools. Another thing middleware engine providers are going to need to be more up-front about is what breaks the engine. Managing expectations has often been a really big problem for a lot of them.

    For example, when a certain popular engine (ahem) underwent a major overhaul that completely broke games with lots of dynamic elements... well, let's just say that won't be something as easily recovered from going forward.

  2. Speaking of engines, have you seen Tim Sweeney's HPG talk? Interesting stuff, particularly about when they anticipate shipping a game on UE4: