Hash, Inc. - Animation:Master

Recommended Posts

Posted

Playing with features I had overlooked in the past... I see in the Choreography that I can turn ON Radiosity. A quick forum search turned up nothing (been having bad luck with our search engine... google almost does better...) I remember back in the day Yves put TONS of time and information into this subject on the forum... is it still around?

 

Has Radiosity improved with the newer versions of A:M as far as render times go?

Is Radiosity an alternative to or does it work with A:M's lights... raytraced only?

Is radiosity what's known in other apps as 'global illumination'?

 

Now that I have NetRender maybe I can take a little more of a hit to my render times if it means more realistic renders...


Posted

Matt,

 

I tried to play around with it about a year ago, but the results for me were horrific. I guess if I had written the code for it, it would make sense, but it was ghastly.

The render times for me were never-ending, only to see a great mess at the end.

 

It seems to me like a stone-age tool compared to things like Element, IRAY, Lumion, Unity, Marmoset, etc.

  • Hash Fellow
Posted

Yves has a radiosity primer thread on the forum. (Oh, I see you mentioned that. I'm sure it's still there.)

 

I haven't used it in a long time, but I recall being able to get good results by following his general process. I think it is more suitable for stills than for animation.

Posted

Is Radiosity an alternative to or does it work with A:M's lights... raytraced only?

Is radiosity what's known in other apps as 'global illumination'?

Radiosity is not an alternative to A:M's lights; it uses them to compute a global illumination solution. So yes, it is what's known as "Global Illumination" in other apps.

 

The word "Radiosity" is misleading, as it refers to an old, very limited global illumination technique that is no longer used. The name was kept because it was already in use in the old days when A:M had an implementation of the actual "Radiosity" technique.

 

The Global Illumination Technique implemented in A:M is called "Photon Mapping". It is a good technique but is a little bit difficult to use due to the numerous parameters that need to be set just right.

 

"Multiple Importance Sampled Bidirectional Path Tracing" (IMS BPT) is the currently preferred Global Illumination technique and it has no parameters. You just let it render until the render is subjectively noise free enough.

 

Indeed, "Photon Mapping" is not too suitable for animation because it produces moving noise in the animation. This can be solved by cranking all parameters to their highest values, greatly increasing render time in the process. But then, IMS BPT also requires much longer render times when frames are rendered for animation.
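As an aside for readers unfamiliar with the workflow: "let it render until it is subjectively noise free enough" amounts to a progressive accumulation loop. Here is a minimal Python sketch, assuming a hypothetical `trace_sample` that returns one noisy radiance sample per pixel (nothing here is A:M code):

```python
import random

def trace_sample(x, y):
    # Hypothetical stand-in: one noisy radiance sample for pixel (x, y).
    # A real path tracer would trace a full light path through the scene here.
    return 0.5 + random.uniform(-0.25, 0.25)

def render_progressive(width, height, passes):
    # Accumulate many noisy passes; the running average converges toward
    # the true radiance, so visible noise falls off as passes complete.
    accum = [[0.0] * width for _ in range(height)]
    for _ in range(passes):
        for y in range(height):
            for x in range(width):
                accum[y][x] += trace_sample(x, y)
    # Divide by the pass count to get the averaged image.
    return [[v / passes for v in row] for row in accum]

image = render_progressive(4, 4, passes=256)
```

In practice you would stop the loop whenever the preview looks clean enough, which is exactly the "no parameters" appeal Yves describes.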

  • Hash Fellow
Posted

Once again, we said your name three times and you appeared!

 

Thanks for the answers, Yves!

"Multiple Importance Sampled Bidirectional Path Tracing" (IMS BPT) is the currently preferred Global Illumination technique and it has no parameters. You just let it render until the render is subjectively noise free enough.

Indeed, "Photon Mapping" is not too suitable for animation because it produces moving noise in the animation. This can be solved by cranking all parameters to their highest values, greatly increasing render time in the process. But then, IMS BPT also requires much longer render times when frames are rendered for animation.




Can you think of anyone, any likely suspects, who might be engaged to implement this new, preferred method in A:M?

Posted

Without a heavy working knowledge of the "widget", it requires much time and many trials. I think I experienced the noise Yves describes in an animated setting. Without his expertise in the subject, it is a long haul.

 

BTW.....Yves...Thanks for your efforts over the years in providing the ability to implement these types of things in A:M. I'm certainly not knocking it. As artists, we always tend to complain about things we don't fully understand.

 

If you have a massive amount of time and the eagerness to just get it to work, I'm quite sure you can get the desired result. But in my case, I have neither luxury. :)

  • Hash Fellow
Posted

"Radiosity" seems to be for cases where you need the color bounce from adjacent surfaces. For most situations that's not a major factor.

 

Global ambiance and ambient occlusion should be able to do most of your realistic shading needs.

Posted

 

Global ambiance and ambient occlusion should be able to do most of your realistic shading needs.

SO- by global ambiance, that simply means the choreography setting where you select a global color or HDRI image? If I select a color and add a value, the scene becomes less dependent on lights but does NOT become more realistic; it becomes more 'toony' (flat, shadeless) if anything. Maybe I need to play around more with HDRI... I know Stian uses that and gets great results, and it is the method that more and more programs like Element3D use for their primary lighting. QUESTION: Is there a render 'hit' using HDRI?

  • Hash Fellow
Posted

 

 

Global ambiance and ambient occlusion should be able to do most of your realistic shading needs.

SO- by global ambiance, that simply means the choreography setting where you select a global color or HDRI image? If I select a color and add a value, the scene becomes less dependent on lights but does NOT become more realistic...

 

 

It's not simply choosing it... you have to use it right and for the right reasons.

  • Admin
Posted

Radiosity is awesome but takes considerable time.

If you just want to splash some color (ala pseudo-radiosity), there are faster ways to get that done.

One way is to add a second image into the mix that 'radiates' the scene as desired (such as the original image blurred heavily and then placed back over the frame).

 

Here's a quick proof of concept for quickly adding radiant color to nearby objects (floor, walls, etc.):

 

 

 

 

*with apologies to Matt Bradbury whose image I grabbed from Yves's Radiosity thread. His original image is displayed on the left for comparison

 

Edit: Perhaps starting with an image rendered with radiosity is cheating a bit so I need to find/render another image and start with that. ;)

fakerosity_comp1.mov
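For the curious, the blur-and-overlay trick Rodney describes can be sketched in a few lines. This is a toy grayscale version using a box blur; the `fake_radiosity` helper and its `strength` parameter are illustrative inventions, not A:M features:

```python
def box_blur(img, radius):
    # Heavy box blur: each pixel becomes the average of its neighborhood.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def fake_radiosity(img, strength=0.3):
    # Overlay the heavily blurred frame on the original so bright areas
    # 'radiate' a soft glow onto their darker neighbors.
    blurred = box_blur(img, radius=2)
    return [[min(1.0, p + strength * b) for p, b in zip(prow, brow)]
            for prow, brow in zip(img, blurred)]

# A frame with one bright vertical stripe; after the overlay, the
# pixels beside the stripe pick up some of its brightness.
frame = [[1.0 if x == 2 else 0.0 for x in range(5)] for _ in range(5)]
glowed = fake_radiosity(frame)
```

In a compositor you would do the same with the full RGB frame and a large Gaussian blur instead of this box blur.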

Posted

Global Ambiance can be combined with other techniques, and that is the key. It is very simple to use, actually: put it to, let's say, 50% and do the rest of the 50% with lights. Then use AO or SSAO on the image too. It takes much less time to render than real lights/radiosity, and it will give you a product-shot feeling quite easily.

I think I will do a tutorial on the subject. It is not creating 100% realistic lighting situations or stuff like that, but it can create nice-looking renderings like product shots and that "high-dollar look" stuff.

 

I have learned that that is what many of your clients want: it is not really the "extremely realistic" stuff they are after, it just has to look like a very clean and cool product shot. (cool in both meanings)

 

See you

*Fuchur*

Posted

Can you think of anyone, any likely suspects, who might be engaged to implement this new, preferred method in A:M?

I know a lot of people implementing IMS BPT renderers as a hobby, myself included. Personally, my pet project has been in development on and off (but mostly off) for 5 years. While it is a BPT, it is mostly an experimental project where I can test ideas concerning parallelism, concurrency and vectorization in a path tracer context. At work, I maintain and improve the renderer (see attached example). This takes all my programming time. When I get back home, I'm usually not in a mood to keep focusing on programming again.

 

I have played with the idea of porting my pet path tracer to A:M a few times already. But there are a few difficulties.

 

First, this could not be ported as a plugin. It would need to be integrated in the code base.

 

Then there is the material and light issue. Material definitions and light definitions need to be physically-based in order for a path tracer to work. This basically requires the implementation of a separate physically-based material system in A:M.

 

I know from experience that this would not be appreciated. I'm not talking about the A:M community in particular but in general. There is a surprisingly strong resistance to adopting physically-based material definitions in traditional CG circles. People have invested a lot of time learning how to get materials they like in their legacy 3D application of choice and don't like to have to change that.

 

And materials in A:M can become quite complex, especially when procedurals are used. It is possible to constrain A:M material definitions so they are physically plausible. But this gives unexpected results in renders: although the materials are physically plausible, they are still generally physically improbable, so they don't look like real-world materials. And it usually does not match the regular render results either.

 

Bottom line: This would be a large project.

 

Anyone wanting to "hire" an implementer should post an offer on the ompf forum. There you can find computer graphics students (and also professionals) interested in path tracing of all sorts.

Samples.jpg

  • Hash Fellow
Posted

Thanks, Yves.

 

"Physically based" is not something I'm familiar with.

 

What are parameters someone would need to be setting that don't exist in our conventional materials?

 

Would you have to model anything differently? The shapes in your sample pics all look like things that could be made in A:M so that wouldn't be true, would it?

  • Hash Fellow
Posted

To pursue the modeling angle a bit...

 

Materials aside...there shouldn't be any reason that your proposed separate renderer couldn't work with A:M spline models directly instead of them having to be converted into polygon model files first, right?

Posted

Models are fine. No need to convert to polygons. Lights are usable too as their units could be converted easily. It is really about the materials. If you really want to know the details, hold on.

 

I like to distinguish material definitions used in CG world by two categories: "Physically-Based" and "Effect-Based".

 

Physically-Based

 

Describe materials by their physical structure and composition. Essentially using material properties that can (or could) be measured from real-world materials.

 

A material is described by its layers, each layer being described by its reflectance, index of refraction (IOR), absorption/extinction, density, scattering/roughness, and emission properties. Typically, providing the reflectance/IOR and roughness for each layer is enough for most materials.

 

Natural materials are single layers composed of raw wood, metal, minerals, etc.

 

Synthetic materials are usually double-layer: a base wood material with a varnish coating, for example. Paint or plastic is double-layer too, because the substrate is a transparent layer in which colored particles or pigments are suspended.
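As a hedged illustration of the layer description above (all names here are invented for this sketch, not A:M's), the parameters map naturally onto a small per-layer record:

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    # Physical properties that can (or could) be measured from real materials.
    reflectance: tuple = (0.5, 0.5, 0.5)  # linear RGB albedo
    ior: float = 1.5                      # index of refraction
    roughness: float = 0.5                # 0 = mirror-smooth, 1 = fully diffuse
    absorption: float = 0.0               # extinction per unit distance
    emission: tuple = (0.0, 0.0, 0.0)     # emitted radiance, if any

@dataclass
class Material:
    # A material is a stack of layers: one for raw natural materials,
    # two for coated synthetic ones (e.g. wood base + varnish top coat).
    layers: list = field(default_factory=list)

# Single-layer raw material: bare metal.
steel = Material(layers=[Layer(reflectance=(0.6, 0.6, 0.62), roughness=0.2)])

# Double-layer coated material: rough wood base under a smooth varnish.
varnished_wood = Material(layers=[
    Layer(reflectance=(0.4, 0.25, 0.1), roughness=0.8),            # base
    Layer(reflectance=(1.0, 1.0, 1.0), ior=1.5, roughness=0.05),   # coating
])
```

Note how few numbers each layer needs compared with an effect-based surface definition; reflectance/IOR plus roughness covers most cases, as the post says.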

 

Knowing the composition and the physical properties, we can model how light is being reflected off a material. When a "photon" hits a physically-based material, we can compute the probability of it being reflected or absorbed and if reflected, the probability of the direction it will be reflected to.
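The reflect-or-absorb probability mentioned above can be estimated from the IOR. A common sketch uses Schlick's approximation of Fresnel reflectance together with Russian roulette; this is standard path-tracing practice, not A:M code:

```python
import random

def schlick_r0(ior):
    # Fresnel reflectance at normal incidence for a dielectric in air.
    return ((ior - 1.0) / (ior + 1.0)) ** 2

def fresnel_schlick(cos_theta, ior):
    # Schlick's approximation: reflectance rises toward 1 at grazing angles.
    r0 = schlick_r0(ior)
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

def photon_is_reflected(cos_theta, ior, rng=random.random):
    # Russian roulette: reflect the photon with probability equal to the
    # Fresnel reflectance, otherwise transmit/absorb it.
    return rng() < fresnel_schlick(cos_theta, ior)

# Glass (IOR 1.5) reflects about 4% of photons at normal incidence...
print(round(schlick_r0(1.5), 3))  # 0.04
# ...but far more at grazing angles (cos_theta near 0).
print(fresnel_schlick(0.05, 1.5) > 0.7)  # True
```

The reflected direction would then be drawn from the layer's roughness-dependent scattering distribution, which is the other half of the probability Yves describes.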

 

Effect-Based

 

Describe materials by an accumulation of visual effects. Diffuse, specularity/shininess/glossiness, reflectivity, and ambience are all visual effects. Each of those visual effects has a color and an intensity.

 

Given an assemblage of those visual effects with their properties, it is impossible to infer the physical properties that would produce the resulting material appearance. At best, one of those effects may be probabilistically selected when a photon is being scattered, which would constrain the material into being at least physically plausible. But this does not make the material physically probable: even though a physically-plausible material respects all physical laws, the resulting material is very unlikely to exist in nature. And this produces more noise in the final render, thus requiring much longer render times.
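The probabilistic selection Yves mentions can be sketched as picking one effect "lobe" per scattering event (an illustrative toy with invented names; the cap on the total is what keeps the material physically plausible, by turning any excess intensity into absorption rather than amplification):

```python
import random

def pick_lobe(diffuse, specular, reflectivity, rng=random.random):
    # Normalize the effect intensities into selection probabilities.
    # If they sum past 1, rescale so no energy is created; if they sum
    # below 1, the leftover probability becomes absorption.
    total = diffuse + specular + reflectivity
    scale = 1.0 / max(total, 1.0)
    u = rng()
    if u < diffuse * scale:
        return "diffuse"
    if u < (diffuse + specular) * scale:
        return "specular"
    if u < total * scale:
        return "mirror"
    return "absorbed"  # photon terminated

# Effects summing past 1.0 get rescaled, never amplified.
counts = {"diffuse": 0, "specular": 0, "mirror": 0, "absorbed": 0}
for _ in range(10000):
    counts[pick_lobe(0.6, 0.3, 0.3)] += 1
```

Even with this constraint, the mix of arbitrary lobes rarely corresponds to any real-world material, which is the "plausible but improbable" problem described above.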

 

It is possible to describe a physically plausible and probable material with effect-based properties. But this requires expertise in actual material composition, expertise in the actual implementation of each effect for the material representation in the 3D application of choice, and a good dose of math. BTW, this is another difficulty one faces when using Radiosity/Photon Mapping in A:M. For best results, materials need to be set up in a physically-plausible way, and doing this is not trivial.

 

Effect-based material descriptions are intrinsically full of contradictions:

  • The ambience property assumes the environment is reflected by a perfectly rough material while the reflectivity property assumes the environment is reflected by a perfectly smooth material.
  • The diffuse property assumes light is reflected by a perfectly rough material while the specularity/shininess/glossiness assumes light is reflected by a more or less rough material.
  • In a physically-based renderer, there is no distinction between light and the environment. Everything is a source of illuminance, either directly or indirectly. But the ambience and reflectivity specify how the environment is reflected, while diffuse and specularity/shininess/glossiness specify how light sources are reflected.
  • All the effect properties can make much more sense (less contradictory) if materials are assumed to be double-layer (a base material with a transparent coating), but there are no properties that indicate how to separate the properties between the base layer and the coating layer. Such a "separation" property would be impossible to implement anyway.

 

Those are the main differences.

  • Hash Fellow
Posted

thanks, again, Yves.

 

It sounds like the hard task for the user with a physically-based renderer is to define the materials, while the modeling and lighting techniques don't change much.

 

But many people are using such renderers, so I presume most of them make do with material definitions that have been prepared by someone else, much like people who can't model are limited to models someone else has made. It's like clipart.

 

Once a physically-based material has been made it can be re-used on any model and doesn't have to be remade for every scene, right?

 

 

A material is described by its layers, each layer being described by its reflectance, index of refraction (IOR), absorption/extinction, density, scattering/roughness, and emission properties. Typically, providing the reflectance/IOR and roughness for each layer is enough for most materials.

 

 

This doesn't sound like too many parameters to manage. I count six there, which is less than the number of parameters in our current "Surface" definitions.

 

If the program used the standard units that these properties are typically described with, one could look up the values in appropriate references (or copy them from another program's material definitions B) ) without needing a testing laboratory to ascertain the values from scratch for every material.

 

And someone with general knowledge of the parameters could take an existing material and vary parameters to arrive at a desired appearance by experimentation.

Posted

It sounds like the hard task for the user with a physically-based renderer is to define the materials, while the modeling and lighting techniques don't change much.

 

Yes, you are right. The hardest part is getting used to a different material representation. Lighting techniques need to be adapted too, because indirect lighting takes care of a lot of the additional lights that are typically added in traditional CG scenes. Modeling techniques are the same.

But many people are using such renderers, so I presume most of them make do with material definitions that have been prepared by someone else, much like people who can't model are limited to models someone else has made. It's like clipart.

This doesn't sound like too many parameters to manage. I count six there, which is less than the number of parameters in our current "Surface" definitions.

 

And someone with general knowledge of the parameters could take an existing material and vary parameters to arrive at a desired appearance by experimentation.

Once a physically-based material has been made it can be re-used on any model and doesn't have to be remade for every scene, right?

 

Indeed, a physically-based material definition is usually easier to set up. Not only are there fewer parameters to tweak, but the parameters are more meaningful and intuitive. The expected render result is very predictable.

There are basically two material setups: one for single-layer raw materials and one for double-layer coated materials. Once this is set up, all that is left to do is change the color/reflectance maps and the layers' roughness, and add bump/normal maps where appropriate. It becomes very intuitive very quickly.

Once a material is set up, it can be reused on any model and in any lighting situation. Because it is physically based, it will always look the same no matter the environment or lighting conditions it is in. The material just reacts to light as it would in real conditions.

If the program used the standard units that these properties are typically described with, one could look up the values in appropriate references (or copy them from another program's material definitions B) ) without needing a testing laboratory to ascertain the values from scratch for every material.

 

Yes, absolutely. BTW, "reflectance" is a big word, but it is just a regular color map with gamma correction removed.
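"Gamma correction removed" just means converting the sRGB texture values back to linear light before the renderer uses them. The standard sRGB inverse transfer function looks like this:

```python
def srgb_to_linear(c):
    # Invert the sRGB transfer function for a channel value c in 0..1.
    # Values below the toe are linear; above it, a 2.4 power curve applies.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Middle gray in an sRGB texture (~0.5) is about 21% linear reflectance.
print(round(srgb_to_linear(0.5), 3))  # 0.214
```

Applying this per channel to a color map yields the linear reflectance values a physically-based renderer expects.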

 

Posted

It was explained to me that reflectance is the combination of specularity and reflectivity...

 

IF someone could go through and implement Yves' 'path-traced PBR' renderer... no guarantee it would be any faster...

 

I have fun playing with physically based materials (PBR) in Element 3D; there is no real brain surgery in the way it is set up there, and you get great results and can zoom in very close with no loss.

Posted

Man, those are really, really nice renderings. :)

Thanks for all the insights, Yves, it really is very interesting.

 

Concerning including you in the code base: that is very likely a no-brainer. You have already worked on the A:M code, and I think that is very likely something no one has a problem with at all :).

If you are interested, now or in the future, in implementing such a cool thing, I will ask Steffen and Jason in an instant about that :).

 

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

 

Somehow I am a little embarrassed to show my (first!) approach to the Global Ambiance tutorial right now, but I made a first video tutorial on it.
(Again: I am not totally happy with it myself, and because of that I am sure I will try it again, but it may give some interesting approaches for some of you.)

 

So here it is:

http://www.patchwork3d.de/am-global-ambiance-186-en

 

See you

*Fuchur*

Posted

It was explained to me that reflectance is the combination of specularity and reflectivity...

Reflectance is used in two contexts:

Used alone, it just means the color or texture.

Used in "Bidirectional Reflectance Distribution Function" (BRDF), it means the color and the scattering pattern.

 

In A:M, "Reflectivity" refers to the color of the reflection and "Specularity" refers to the width of the specular highlight which is determined by the scattering pattern. So this is not too far from the actual thing.

 

Technically, though, a "Specular" surface is a perfectly smooth surface and "Reflectivity" usually refers to the reflectance of a perfectly smooth surface.
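To make the smooth-vs-rough distinction concrete: a perfectly specular surface sends each incoming ray into exactly one mirror direction, computed as r = d - 2(d·n)n, while roughness spreads that single direction into a cone. A minimal sketch of the mirror case:

```python
def reflect(d, n):
    # Mirror-reflect direction d about the unit surface normal n:
    # r = d - 2(d.n)n
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray heading down-and-right onto a floor (normal +Y) bounces up-and-right.
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (1.0, 1.0, 0.0)
```

A rough surface would instead perturb this direction randomly, with the spread governed by the roughness parameter, which is what widens the specular highlight.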

  • Hash Fellow
Posted

As far as it being a different renderer... we have different renderers now that we choose from

 

Wireframe

Shaded

Final

(and it seems like our current "radiosity" is also a different renderer)

 

So having one more option wouldn't be crazy to find within A:M.

 

 

I'm of two minds on it all...

 

On the one hand, most of our users barely scratch the potential of what we have already and they would probably be quite alarmed at the longer render times of even this speedier radiosity technique. (It IS faster, right?)

 

On the other hand, there is a strong desire for this technique among our advanced users; it would get A:M more into the modern CG age and most of all it would permit us to use lighting techniques that are more like the real-world techniques we learn about in photography classes.

 

But if it is still glacially slow I'm doubtful people would be happy with it. Can this be sped up with hardware computing? Can any portion of the process be sped up?

Posted

It isn't intrinsically faster than the current Photon Mapping.

 

However, it is a much simpler rendering algorithm so it is often implemented on the GPU.

 

But in order to be efficiently hardware accelerated, a lot more than just the renderer must be ported to the GPU. This basically amounts to an almost complete rewrite of the application. It would be a huge undertaking.

 

I also think that most users wouldn't use such a renderer because of the longer render time.

  • Hash Fellow
Posted

 

But in order to be efficiently hardware accelerated...

 

Could it be inefficiently implemented and still be faster? Powerful graphics cards will only get cheaper and more powerful with time.

Posted

Could it be inefficiently implemented and still be faster? Powerful graphics cards will only get cheaper and more powerful with time.

From the results I saw on the ompf forum, no.

 

The bottleneck is not so much the power of the GPU but the bandwidth between the CPU and the GPU. People who have attempted a "mixed" implementation all came to the same conclusion: You don't gain any speed unless everything is run on the GPU.

 

Note that I have no experience implementing a path tracer on the GPU. All I know is from the ompf forum discussions, blog entries, and technical articles.

  • Hash Fellow
Posted

And when you say "everything"... what is that really? Does, for example, the timeline and graph editor all have to run from code on the GPU? I can't imagine it.

 

Some of our users have been exporting OBJ sequences or "pointclouds" and then taking those files to 3rd party GPU renderers and getting fast renders and/or specialized rendering results from that.

 

That would seem to be the ultimate "bottleneck" between the CPU and GPU, and yet it is still advantageous for them to do so.

 

??

  • Hash Fellow
Posted

Here's part of my confusion...

 

Currently in A:M, as I scrub through an animation or let it play in real time, for every frame interval of 1/24th of a second, is A:M transmitting the entire geometry and lighting and color info of the scene over and over again to the graphics card, which then renders it (in shaded mode)?

 

If that is the case that seems like a small overhead for the transfer of the scene data from the CPU to the GPU.

 

If that is not the case... what IS getting transmitted every 1/24th of a second?

  • *A:M User*
Posted

"Radiosity" seems to be for cases where you need the color bounce from adjacent surfaces. For most situations that's not a major factor..

 

Global ambiance and ambient occlusion should be able to do most of your realistic shading needs.

 

And you could probably fake color bounce well enough with some appropriately placed lights.
