Ah, love your raytracer! I for one am quite happy with what modo has on offer; I very much liked LightWave's renderer, and despite its awful performance, even the Cinema 4D renderer has become kinda okay-ish in recent versions. It still needs a 5000% speed boost, though, which I realized once more while trying to render a heavy machinery scene with transparency, refractions and lots of reflections for some "peek inside me" visualization.
Having two and a half reasonably good renderers at hand has limited my exploration of others to a short tryout of M4D (which didn't quite work out), but of course I'm keeping an eye on things developing out there. In fact, if V-Ray for C4D developed more in sync with the main branch instead of always lagging a year behind, I'd give that a whirl, too. And then of course there's always fryrender/Arion, Thea, Maxwell, Octane and what have you, plus experimental ones like Mitsuba. Since a new one comes out every week (like Corona for MAX), depending on what program you use, you could turn into a rendering TD and evaluate their pros and cons all day.
Since I don't usually have the time, I normally stay away from that, but just for fun I had a look at Fluid Ray RT. Feature-wise it's your bog-standard physically based still-image renderer with a terrible user interface, noisy transparency sampling and bad antialiasing, but it piqued my interest because it uses Intel's Embree stuff, meaning it's mostly CPU-based. And therein lies the good part. I've never been a friend of all that GPU mumbo-jumbo because it doesn't really solve problems, it merely shifts them to a different playing field while imposing its own severe limitations. If nothing else, the GPU raytracer with its exorbitant hardware requirements to do some shitty 3D text in After Effects CS6 is a clear sign of that, as are many of the renderers I mentioned being constrained by the graphics card's limited memory or even things like your PCI bus's transfer speeds.
And why am I telling you all this? By way of my telepathic powers, I'm telling you that this is likely going to be in CS7. It's so inevitably logical, it makes you want to burst into tears. You know, Macs have long been Intel-based, and many of those users got shafted last time because they were part of the AMD/ATI graphics card faction. This is a chance on Adobe's part to make good. Of course, on the other hand, we can always count on Adobe telling people that the existing stuff is good enough, especially now that Apple have switched most of their products to nVidia chips, but I hope they're not that lazy and a bit smarter than that. And imagine how that would feel on a Xeon Phi. Mmh, that could be interesting, if only we weren't relegated to using the legacy OBJ format to shuffle our stuff around…