I hope to have some more substantial thoughts on GDC later, but one nice trend was the number of talks focused on moving graphics work off the GPU and onto CPUs (in this case, the SPUs on the PS3).
For the last couple of years there has been a major push by the graphics card manufacturers to get non-graphic-y workloads onto the GPU. Cloth, physics, heck, I'm sure there's even a GPU AI example out there somewhere. These are things that the console game developers I know don't particularly want or need.
The lion's share of console games are GPU bound. The last thing I want to do is put more work on the GPU. So even if your cloth or physics solution runs really fast on the GPU, I'm not going to use it, because there is no room at the inn. Even a slower CPU solution is fine, since I have spare CPU capacity while waiting on the GPU, or processing elements that sit idle during a frame.
What I want to do is offload as much as possible to the CPU, since most games probably still aren't maxing out the CPU capabilities of the PS3 or 360. It was nice to see some talks on hybrid GPU/CPU approaches to things like lighting and post-processing, and I imagine this trend will continue.