Thursday, April 1, 2010

GPUs and science?

I just attended an invited talk on using GPU computing to process bioacoustic data, which yielded roughly a 10x speedup. Our research computing department has been kicking around whether and how it should support or engage in GPU computing. Inevitably the question arises: is this useful, and when?

I've found myself sort of drawn to GPU computing and CUDA since I first learned about it. Maybe it's the pretty bar graphs showing the speedups, or maybe it's the idea that it's really "cutting edge" computing, but I've been thinking about it a lot all year. A lot of people have made the very good point that for doing science it may be too early to jump on board, unless your problem parallelizes easily onto a GPU and you need the answer now. The major issues with GPU adoption are:

1) It's so young. Whatever API eventually dominates is likely not one that's available today. Nvidia released the CUDA SDK in 2007, and they've just released an update that changes it pretty drastically, adding C++ support among other things.

2) GPUs are the next math co-processor. It's only a matter of time until they're fully integrated onto CPUs: AMD plans to do this by 2012 with Bulldozer, and Intel recently attempted it with Larrabee. Once that happens you won't need any fancy-schmancy GPU code; everything will just happen automatically.

3) Which leads us to the last major issue: coding for the GPU, while definitely easier than it was before CUDA and OpenCL, is still a major pain in the ass. So far I've mainly coded in high-level languages like Java, Perl, Python, and Ruby, with fun, convenient things like garbage collection. With GPU computing I'm suddenly much closer to the hardware. I never learned lower-level coding because it seemed tangential to what I wanted to do, but with the dramatic potential speedups GPU computing offers, it's really tempting to struggle with it.
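To give a flavor of what "closer to the hardware" means, here's a minimal CUDA sketch (my own toy example, not anything from the talk) of adding two vectors. Everything a high-level language hides — allocating device memory, copying data back and forth, mapping threads to array elements — is spelled out by hand:

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Each GPU thread computes one output element; blockIdx/threadIdx
// give every thread a unique global index into the arrays.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)  // guard: the last block may have extra threads
        c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host-side arrays, managed manually -- no garbage collector here.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Explicit device allocation and host-to-device copies.
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(d_a, d_b, d_c, n);

    // Copy the result back and check one element.
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]);

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}
```

That's a few dozen lines of boilerplate for what would be one line of NumPy — and this is the easy, embarrassingly parallel case.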

Despite these major issues, I'm still going to forge ahead, I think. Thinking in these massively parallel ways has proven to be a lot of fun in its own right, and that's the point of academia in my mind: to have fun and increase the knowledge of humanity.
