Question about renderers

General discussion about JPatch

Postby squirrelhavoc » Mon Jan 30, 2006 1:55 am

Sorry if this is a double post, I could have sworn I already posted it, but I can't seem to find it.

Sascha has added support for different kinds of renderers, such as POV-Ray, RenderMan, and Inyo. I don't know a lot about the internals of renderers, but I've seen that one supports this while the others don't, and another supports that while the others don't. My question is: is JPatch always going to produce the same final output with all renderers, by only using common features, or will it support renderer-specific features that aren't cross-compatible (if that's the right word)? I would prefer the latter, because if I prefer, say, RenderMan, then I would like to take advantage of all it has to offer, whether it's supported by other renderers or not. Of course, I don't have a preference yet, but I'm starting to get into Inyo.

Speaking of Inyo, what does the future hold for it, in terms of new features, better quality, or better speed? The only thing that really keeps me from using Inyo more is that it seems (on my PC at least) to be super slow. I don't know what all the options do, so maybe I'm enabling something I don't need, like caustics or AO. I'm still learning about such things. Perhaps a tutorial in the Wiki that explains the different renderer options is in order. I'd write it myself, but I don't know what they all do yet.

Anyway, just a few off topic questions. Thanks for your time!
Squirrel Havoc

We live as we think, very very slowly....

Postby dcuny » Mon Jan 30, 2006 9:30 am

As far as speed goes, I've already tried to make Inyo as fast as I know how. The real killer is the global illumination code, which is slow because it takes a lot of samples in order to reduce noise. This includes ambient occlusion. :?
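To see why this is inherently expensive, here is a minimal Monte Carlo ambient-occlusion sketch (illustrative only, not Inyo's actual code; the occluder geometry is made up). Estimating visibility at a single shading point means averaging many random hemisphere rays, and the noise only falls off as 1/sqrt(N), so halving the noise quadruples the per-pixel cost:

```java
import java.util.Random;

public class AoSketch {
    // Hypothetical occluder: every direction with z below 0.5 is blocked,
    // so the true hemisphere visibility is exactly 0.5.
    static boolean occluded(double[] dir) {
        return dir[2] < 0.5;
    }

    // Estimate hemisphere visibility at one shading point with n samples.
    static double ambientOcclusion(int n, Random rng) {
        int visible = 0;
        for (int i = 0; i < n; i++) {
            // Uniform direction over the upper hemisphere (z >= 0).
            double z = rng.nextDouble();                  // cos(theta)
            double phi = 2 * Math.PI * rng.nextDouble();
            double r = Math.sqrt(1 - z * z);
            double[] dir = { r * Math.cos(phi), r * Math.sin(phi), z };
            if (!occluded(dir)) visible++;
        }
        return (double) visible / n;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        // More samples -> less noise, but cost grows linearly per pixel.
        for (int n : new int[] { 16, 256, 4096 }) {
            System.out.printf("%5d samples: visibility ~ %.3f%n",
                    n, ambientOcclusion(n, rng));
        }
    }
}
```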

I'm working on an irradiance cache, which should make things go faster. Instead of generating new samples, Inyo will interpolate values from existing samples. I should be able to use this with any of the different global illumination algorithms. I should also be able to use it to accelerate the subsurface scattering code.

Here's an image of the old irradiance cache in action. The dots indicate where Inyo has taken samples. Compare that to having to sample every pixel, and you can see where the irradiance cache would accelerate things. Of course, it takes time to search through the cache, so it's not all gravy.
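The reuse idea can be sketched in a few lines of Java (a simplified, Ward-style weighting scheme under my own assumptions, not Inyo's implementation): expensive irradiance samples are stored with their position, normal, and a validity radius, and nearby shading points interpolate from them instead of tracing fresh hemisphere samples.

```java
import java.util.ArrayList;
import java.util.List;

public class IrradianceCacheSketch {
    static class Sample {
        final double[] p, n;  // position and surface normal
        final double e;       // cached irradiance value
        final double r;       // validity radius (e.g. mean occluder distance)
        Sample(double[] p, double[] n, double e, double r) {
            this.p = p; this.n = n; this.e = e; this.r = r;
        }
    }

    final List<Sample> cache = new ArrayList<>();
    final double alpha;       // accuracy knob: smaller = stricter reuse

    IrradianceCacheSketch(double alpha) { this.alpha = alpha; }

    void store(double[] p, double[] n, double e, double r) {
        cache.add(new Sample(p, n, e, r));
    }

    // Weight falls off with distance and with diverging normals.
    static double weight(Sample s, double[] p, double[] n) {
        double dx = p[0]-s.p[0], dy = p[1]-s.p[1], dz = p[2]-s.p[2];
        double dist = Math.sqrt(dx*dx + dy*dy + dz*dz);
        double dot = Math.max(0, n[0]*s.n[0] + n[1]*s.n[1] + n[2]*s.n[2]);
        double denom = dist / s.r + Math.sqrt(1 - Math.min(1, dot));
        return denom > 0 ? 1 / denom : Double.MAX_VALUE;
    }

    // Interpolated irradiance, or NaN if no cached sample is close enough
    // (meaning a new, expensive sample must be taken and stored).
    double lookup(double[] p, double[] n) {
        double num = 0, den = 0;
        for (Sample s : cache) {
            double w = weight(s, p, n);
            if (w > 1 / alpha) { num += w * s.e; den += w; }
        }
        return den > 0 ? num / den : Double.NaN;
    }

    public static void main(String[] args) {
        IrradianceCacheSketch cache = new IrradianceCacheSketch(0.2);
        cache.store(new double[]{0,0,0}, new double[]{0,0,1}, 1.0, 1.0);
        // A nearby point with the same normal reuses the cached value:
        System.out.println(cache.lookup(new double[]{0.05,0,0}, new double[]{0,0,1}));
    }
}
```

The cost of searching the cache is what makes it "not all gravy": a real implementation would put the samples in an octree or similar structure rather than scanning a list.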


I'm also adding in support for photon maps, but I don't expect it to be much faster than any of Inyo's existing global illumination algorithms.

As far as new features go, there are a number that I've coded in but haven't had a chance to test yet. For example, there's a slew of different material algorithms. I also plan on adding materials that make use of the Fresnel angle.
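A common way to get that view-dependent behavior is Schlick's approximation of the Fresnel reflectance term; this is a sketch of the general technique, not necessarily what Inyo will use:

```java
public class FresnelSketch {
    // cosTheta: cosine of the angle between surface normal and view direction.
    // f0: reflectance at normal incidence (~0.04 for plastic or glass).
    static double schlick(double cosTheta, double f0) {
        double m = 1 - cosTheta;
        return f0 + (1 - f0) * m * m * m * m * m;
    }

    public static void main(String[] args) {
        System.out.println(schlick(1.0, 0.04)); // head-on: ~4% reflective
        System.out.println(schlick(0.0, 0.04)); // grazing: ~100% reflective
    }
}
```

This is what makes glass, water, and varnished surfaces turn mirror-like at grazing angles while staying mostly transparent or diffuse head-on.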

I've also got untested code for spotlights and different falloff types.

Alpha channel support is also on my list, which should make it possible to composite characters onto pre-rendered backgrounds. To some extent, that will depend on JPatch supporting compositing layers of some sort - something not likely to happen until version 0.8 goes out.
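Compositing a foreground with an alpha channel onto a background is the standard "over" operator; a sketch of the math (premultiplied-alpha convention, not JPatch/Inyo code):

```java
public class OverSketch {
    // fg and bg are premultiplied RGBA values in [0,1]:
    // out = fg + (1 - fg.alpha) * bg, applied per channel.
    static double[] over(double[] fg, double[] bg) {
        double[] out = new double[4];
        for (int i = 0; i < 4; i++)
            out[i] = fg[i] + (1 - fg[3]) * bg[i];
        return out;
    }

    public static void main(String[] args) {
        // 50% opaque red over an opaque blue background:
        double[] red  = { 0.5, 0, 0, 0.5 };   // premultiplied by alpha
        double[] blue = { 0, 0, 1, 1 };
        double[] out  = over(red, blue);
        System.out.printf("%.2f %.2f %.2f %.2f%n", out[0], out[1], out[2], out[3]);
    }
}
```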

I've also made some changes which should solve some of the aliasing problems Sascha saw with his drill. I suspect that a good chunk of my time will be spent tracking down bugs.

You can find an overview of the Inyo settings here.

Did that answer some of your questions?

Postby sascha » Mon Jan 30, 2006 10:33 am

I try to keep JPatch as independent from any renderer as possible.
My point of view is that a modeler (or, for that matter, the animator) should create the geometry, and that it's the renderer which is responsible for, well, rendering. There is a gray area in between (e.g. you'll of course want to set up lighting and cameras in the animator), but it's still the renderer that's responsible for how to interpret e.g. the light sources. Shading is also done in the renderer.

Long story short: if JPatch used only features that are common to all renderers, this would be quite a limitation.

Renderers don't even share a common geometry format: while all of them can render triangle meshes, they diverge a lot with curved surfaces. POV-Ray, for example, supports bicubic patches, while RenderMan offers a lot more: NURBS and subdivision surfaces.
POV-Ray has proprietary support for procedural textures; RenderMan has its shading language, which has become something of a standard. I have written a basic framework to support procedural textures in Inyo (which was used to create the rock texture on the Moais in my first IRTC entry), but right now it is very limited (though extensible).
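On the curved-surface point: a bicubic patch is just a 4x4 grid of control points weighted by Bernstein polynomials, which is why renderers that only know triangles can't accept it directly. A sketch of evaluating one coordinate of such a patch (illustrative math, not JPatch's export code):

```java
public class BicubicSketch {
    // Cubic Bernstein basis: B_i(t) = C(3,i) * t^i * (1-t)^(3-i).
    static double bernstein(int i, double t) {
        double[] c = { 1, 3, 3, 1 };
        return c[i] * Math.pow(t, i) * Math.pow(1 - t, 3 - i);
    }

    // ctrl[i][j] holds one coordinate of the 4x4 control grid.
    static double eval(double[][] ctrl, double u, double v) {
        double p = 0;
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                p += bernstein(i, u) * bernstein(j, v) * ctrl[i][j];
        return p;
    }

    public static void main(String[] args) {
        // A planar patch whose x control values are i/3 reproduces u exactly
        // (linear precision of the Bernstein basis).
        double[][] ctrl = new double[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                ctrl[i][j] = i / 3.0;
        System.out.println(eval(ctrl, 0.25, 0.8)); // ~0.25
    }
}
```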

To not limit its usability, JPatch (currently) doesn't even try to create materials that can be used in every renderer. You can set up some basic attributes (color, highlights, etc.), but that's all. By default JPatch creates a plastic-like "texture" that uses the settings from the material editor; to get rid of the plastic look, you'll have to change the "shader code" JPatch exports. You can edit this for each material, for both POV-Ray and RIB output.

I recommend defining the textures in external files (a POV-Ray .inc file; with RenderMan you'll have to do this anyway, because you must compile the shader before you can use it) and just telling the renderer to use the material defined in that external file.
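For the POV-Ray case, such an include file could look like this (the file and texture names here are made up for illustration):

```pov
// mymaterials.inc -- hypothetical external material file
#declare Brushed_Metal =
  texture {
    pigment { color rgb <0.7, 0.7, 0.75> }
    finish { specular 0.6 roughness 0.02 metallic }
  }
```

The exported shader code for the material would then shrink to something like 'texture { Brushed_Metal }', with the scene pulling in "mymaterials.inc", so the material can be tweaked without re-exporting from JPatch.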

You can also use "external" objects: arbitrary POV or RIB instructions can be specified for each model in the animator, and these lines will simply be inserted into the .pov or .rib output file JPatch generates.

I think this way is the most flexible one, but you have to know your renderer very well in order to take advantage of that flexibility.
At some point I may add some kind of graphical texture editor that would then create POV or shading-language code, but this will always be more limited than what you could do by writing POV or RenderMan shaders on your own.

For Inyo I see two possibilities (which don't exclude each other). It could also take advantage of that (yet to be written) graphical material editor. The second option is BeanShell scripts that set up the materials (using the aforementioned framework); this would be similar to setting up textures with POV's scene-description language, and it's how I made the textures used in the "Travelling" animation.
