IRTC entry

General discussion about JPatch

Postby dcuny » Mon Aug 07, 2006 11:07 pm

sascha wrote:
Ray differentials work when the texture extends over a large area of the texture map, but when the area becomes too large, it's sort of impractical to average the texture.

I'm not sure I fully understand.
No, sorry... I was referring to the problem of having to average a texture when the dx/dy value was large - handling something that's far away. But I think it's pretty easy to have the texture routine be smart enough to handle that. As you said, it's a problem for the shader, not the raytracer.
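To illustrate the large-dx/dy case: a mip-mapped lookup sidesteps averaging a huge texture region at shade time by selecting a prefiltered level from the ray-differential footprint. This is a rough sketch of the standard level-selection math, with names and parameters of my own choosing (not from any renderer mentioned here):

```python
import math

def mip_level(dudx, dvdx, dudy, dvdy, tex_w, tex_h, max_level):
    # Footprint of the ray differentials measured in texels
    span_x = math.hypot(dudx * tex_w, dvdx * tex_h)
    span_y = math.hypot(dudy * tex_w, dvdy * tex_h)
    footprint = max(span_x, span_y)
    # A 1-texel footprint maps to level 0; each doubling adds one level
    level = math.log2(max(footprint, 1.0))
    return min(max(level, 0.0), max_level)
```

So a distant surface with a footprint of, say, 4 texels simply reads from mip level 2 instead of averaging 16 texels per shading point - the averaging was done once, when the mip chain was built.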

RenderMan has built-in support for LOD models, but I have to check if it can switch between shaders too.
One of the things they had in Cars was an automatic LOD feature - check out the Stochastic Pruning paper.

Did you know that they used BMRT as a "ray server" for prMan in "A Bug's Life"? prMan got native raytracing support in a later version (I think it was 11).
Yeah, and then Pixar sued ExLuna off the face of the earth. :?
dcuny
 
Posts: 2902
Joined: Fri May 21, 2004 6:07 am

Postby sascha » Tue Aug 08, 2006 7:58 pm

I was referring to the problem of having to average a texture when the dx/dy value was large - handling something that's far away. But I think it's pretty easy to have the texture routine be smart enough to handle that. As you said, it's a problem for the shader, not the raytracer.

One interesting option is to "bake" procedural textures (2D and/or 3D) to 2D (u/v mapped) image maps. This would make them applicable to mip-mapping, and low-res versions of the baked u/v-maps could be used for interactive display using OpenGL. If the texture is not animated (i.e. the image-maps can be re-used for all frames) this would even speed up rendering a lot. I think this trick should work not only for image-maps, but for bump- or displacement maps as well.
It isn't applicable for all types of surface shaders, but should work in most cases.
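The baking idea above can be sketched in a few lines: sample the procedural texture once into a u/v image, then box-filter it down into a mip chain that both the raytracer and an OpenGL preview could reuse. The checkerboard stand-in and all function names are illustrative assumptions, not anything from JPatch:

```python
def checker(u, v, scale=8):
    # Stand-in procedural texture: a black/white checkerboard
    return float((int(u * scale) + int(v * scale)) % 2)

def bake(tex_fn, size):
    # Evaluate the procedural texture at each texel center of a size x size map
    return [[tex_fn((x + 0.5) / size, (y + 0.5) / size)
             for x in range(size)] for y in range(size)]

def downsample(img):
    # Box-filter 2x2 blocks to produce the next (half-resolution) mip level
    n = len(img) // 2
    return [[(img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
              img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(n)] for y in range(n)]

def build_mip_chain(tex_fn, size):
    # Bake once, then average down to 1x1 - reusable for every frame
    # as long as the texture is not animated
    levels = [bake(tex_fn, size)]
    while len(levels[-1]) > 1:
        levels.append(downsample(levels[-1]))
    return levels
```

The same bake-then-filter pass would apply to bump or displacement data stored as u/v maps, since it only assumes the shader output is a value per texel.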

check out the Stochastic Pruning paper.

Wow - this is a cool page. Bookmarked :)

Yeah, and then Pixar sued ExLuna off the face of the earth.

I don't know who's right here. But IIRC Larry Gritz now works for NVIDIA and develops their Gelato renderer (which uses the GPU of NVIDIA cards to speed up rendering).
sascha
Site Admin
 
Posts: 2792
Joined: Thu May 20, 2004 9:16 am
Location: Austria
