Ever since I mentioned Steve’s tech demo a few days ago, it sure has made the rounds on ze web, and not necessarily in a very positive way. Now I’m not completely innocent in the matter, but then again Steve could just admit that teasing users this way was a stupid thing to do. ;-)
Anyway, the whole discussion is a painful reminder that GPU acceleration isn’t the end-all, be-all of all things, and that we as users have to bear the burden of it every day. In fact I was just working on a project this morning and guess what? I couldn’t use two plug-ins from different vendors that use OpenGL at the same time, because my graphics card couldn’t handle it or the plug-ins were leaking and interfering with one another due to some perhaps not so great programming. Similar experiences crop up every day, and ultimately it doesn’t even matter which acceleration method a plug-in or program uses. Be it OpenGL, CUDA, OpenCL or, as in Steve’s demo, OptiX, they’re all bound to cause trouble somewhere along the way in some combination. I, for instance, can easily make Cinema 4D crash while After Effects is running, just as I can make Optical Flares crash when using FreeForm Pro inside After Effects. All of that is just mundane OpenGL! So why would I want to impair my workflows even further by adding ever more accelerated features – more potentially risky and unstable parts to the equation?

That’s the part Adobe need to understand – unless they can guarantee that such problems don’t occur, there is no point whatsoever. And they can’t! Obviously After Effects itself has a track record of flaky, non-working OpenGL, and to this day we still simply recommend on the forums that users turn it off. Similarly, nobody forced them to use graphics hardware for Photoshop Extended’s 3D and a few other toyish features there, when half the time it doesn’t work right. So why should I trust a company that has never managed to get it right in the last 10 years with an old, evolved API to get it right now with a brand-new, untested one? And they all are untested, not just OptiX.
What also figures into the discussion is the inability of said graphics hardware vendors to provide reliable, slim and stable drivers that don’t scare the shit out of you every time you need to update them. At least I’m always on the verge of a nervous breakdown whenever I need a new driver version, since by and large chances are good that it will crash the system, that the machine will only boot in safe mode, or that programs that were perfectly able to use OpenGL before suddenly won’t work anymore. And then there are, of course, the unavoidable lesser side effects like losing color profiles and calibration, or multi-screen setups suddenly reverting to single screens and re-arranging your desktop. Would I want to go through yet another of these painful experiences every time Adobe or nVidia decide it’s time for a new version of their acceleration framework, and risk losing valuable production time? I don’t think so!
On a completely different level, of course, this hardware acceleration stuff puts you on a leash. Anyone remember the first version of the Mercury Playback Engine in Premiere Pro CS5? It ran on only a handful of certified nVidia cards – four or five models – and the cheapest one would have been a GeForce GTX 285, the others being more expensive Quadro models. So anyone who actually wanted to use it had to spend a lot of money and lock themselves into a specific configuration. The only consolation here is that edit suites have traditionally faced the same limitation forever because of their video capture cards and other specific hardware – people stuck with their equipment for rather long anyway, so there it is less of a problem. However, why should the same apply to other tools? Why would anyone with brains turn what is still a resolution-independent, multi-format compositing tool like After Effects into a hardware-dependent program just for some realtime 3D features? Would you want to potentially spend a lot of extra money just so you can use a feature, and then get stuck with an ailing workstation for the next couple of years? I certainly wouldn’t, as that’s exactly the situation I’m currently experiencing at work, and it frustrates the hell out of me. It’s more than slightly ridiculous seeing Adobe make such moves when you cannot even run CS5 on Windows XP 64bit, if you get my meaning.
So ultimately, is there really a need for this? I tend to think not. True, After Effects could use some better 3D, but what’s shown in the demo is not it. Not only do I think that this extrusion stuff (and a lot of other 3D features) is overrated and actually pretty irrelevant for many workflows, but I also feel that as long as other fundamental issues in the program are not resolved, any enhancements in that area go to waste – and equally, the team are only wasting their time researching it and building prototypes. Think about it: what good would this do you if you still cannot animate properly in 3D space, because there is neither a unified real-world unit system nor an updated graph editor/animation curve model? Imagine importing differently sized 3D items and trying to animate them consistently. If all you have is the current composition 3D space and graph editor, this would still be a major nightmare. And even if you get extrusions and all that, wouldn’t it be sufficient if this stuff were displayed in a fast wireframe or OpenGL-shaded environment, as opposed to dragging through molasses like we currently do? I’d be lying if I said that I didn’t love FPrime in Lightwave or modo’s preview renderer for lighting my scenes, but even they would be useless if I couldn’t navigate fluidly. If I could achieve the same level of interactivity in After Effects, that would go a longer way toward improving the 3D experience than any fancy GPU rendering stuff ever could without these other enhancements…