Caustic Graphics 100% Raytracing (vimeo.com)
31 points by mixmax on April 22, 2009 | 5 comments


Proud dad: my son is one of the three founders of Caustic. In the video, James (someone else's son :-) demonstrates the developer version implemented in an FPGA. A 10x version in custom silicon is due out next year.

The founders raised their nest egg by cashing in their Apple stock. They created a software emulator and then raised $5.5M from angels, not VCs. They did use a VC to find a CEO whose function was to raise the capital. He got no salary and no stock until he got the funding. Even after funding the founders still had control. Take heart, it's possible to do hardware, even these days.


Well, you're right to feel proud.

It's a fine and refreshingly hype-light product. The only change I'd like to see is a simple application included that takes a common 3d file format (blender, pov or something) and spits out a rendered bitmap.

Instead of waiting for developers to add Caustic support to their 3d apps over the summer, this would allow a 3d artist to start saving time tomorrow, even knowing that it wasn't optimized. The little app displayed in the video demo would be more than sufficient; it just doesn't seem from the website like it comes with any ready-to-use tools for the non-programmer.


Really quite impressive. Looks like they come from good stock. Congrats.


Impressive technology. They will certainly gain a lot of speed by turning their FPGA architecture into a fixed chipset. This is the kind of hardware/software that would be interesting to see in next-generation consoles. Video game production would require the same number of modelers/texturers/animators, but far fewer shader/graphics programmers, as many of the material hacks from the current GPU generation would become obsolete.

I've experimented a bit with real-time raytracing on this project: http://www.jrbedard.net/projects/C/raytracer/raytracer.htm It worked in real time with a simple scene and basic reflective/refractive materials, but I had to rely on OpenGL to speed up some of the transformations and blitting. Trying to fit raytracing onto the current generation of GPUs/APIs is a pain and feels like a hack, even with CUDA. Their product seems to solve that problem of reaching photo-realistic real-time performance more easily.
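For anyone curious, the reflective/refractive part of a toy raytracer like this boils down to two direction formulas per bounce. Here's a minimal Python sketch of the standard math (illustrative only, not code from the linked project):

```python
import math

def reflect(d, n):
    # Mirror the incoming unit direction d about the unit surface normal n:
    # r = d - 2(d . n)n
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def refract(d, n, eta):
    # Bend unit direction d through a surface via Snell's law, where eta is
    # the ratio of refractive indices n1/n2. Returns None when the ray
    # undergoes total internal reflection instead of transmitting.
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * di + (eta * cos_i - cos_t) * ni for di, ni in zip(d, n))
```

A recursive ray tracer just calls these at each hit point and spawns secondary rays in the returned directions, which is exactly the branching workload that maps badly onto a rasterization-oriented GPU pipeline.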


Very cool. Does anyone know what level of global illumination they are able to reach? I noticed they did have an ambient occlusion setting which produced some noisy-looking shadows (though that could've been a video artifact). Obviously difficult (i.e. highly indirect) lighting situations will seriously impact the framerate, but it would still kick ass if they could hardware accelerate it. Given that they use GLSL to define their surface properties, I'd be interested in how they figure out where to cast their random rays.
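For reference, the usual way a raytracer picks those random rays is cosine-weighted sampling of the hemisphere around the surface normal, and the noisy shadows are exactly what a low sample count produces. A minimal Python sketch of a Monte Carlo ambient-occlusion estimator (hypothetical code, not theirs; `occluded` stands in for a scene intersection test):

```python
import math
import random

def ortho_basis(n):
    # Build any orthonormal tangent frame (t, b) around the unit normal n.
    t = (0.0, -n[2], n[1]) if abs(n[0]) < 0.9 else (-n[2], 0.0, n[0])
    norm = math.sqrt(sum(c * c for c in t))
    t = tuple(c / norm for c in t)
    b = (n[1] * t[2] - n[2] * t[1],
         n[2] * t[0] - n[0] * t[2],
         n[0] * t[1] - n[1] * t[0])
    return t, b

def ambient_occlusion(normal, occluded, samples=64, rng=random):
    # Fire cosine-weighted random rays over the hemisphere and return the
    # fraction that escape: 1.0 = fully open sky, 0.0 = fully blocked.
    t, b = ortho_basis(normal)
    hits = 0
    for _ in range(samples):
        u1, u2 = rng.random(), rng.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        x, y = r * math.cos(phi), r * math.sin(phi)
        z = math.sqrt(max(0.0, 1.0 - u1))
        d = tuple(x * t[i] + y * b[i] + z * normal[i] for i in range(3))
        if occluded(d):
            hits += 1
    return 1.0 - hits / samples
```

With only a handful of samples per pixel the estimate is visibly grainy, which would explain the noisy shadows in the demo if they're trading sample count for framerate.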

Baking this into an ASIC instead of an FPGA would be awesome. But I'm sure they're already onto that.




