Hash, Inc. - Animation:Master

ypoissant

Hash Fellow
Everything posted by ypoissant

  1. Yes. Gamma should be put on by A:M on the final render. If post processing is to be done, such as NLA and compositing, then the gamma should be put on after the final compositing. The sRGB standard states that the video card and the monitor should both be set so the monitor gamma is 2.2. This is where the gamma chart is useful. But, and this is a very important but, setting the monitor's gamma to 2.2 does not mean the same thing as gamma correcting an image at 2.2. They are exactly the reverse of one another. If the monitor is set to gamma 2.2 and the image is corrected for gamma 2.2, they cancel each other. They don't add to each other. So it is not applying gamma twice. Now, I just realized that some of the confusion may come from my own inappropriate word usage. I used to say that the render should be corrected with a gamma of 2.2. What I should say is that a render should be corrected for a gamma of 2.2. What really happens, when we select a gamma 2.2 correction (NTSC), is that the actual gamma curve applied is 0.45, not 2.2. 0.45 is the reciprocal of 2.2: 1/2.2 ≈ 0.45. So the net result is that the 0.45 gamma corrected image displayed on a gamma 2.2 monitor results in a gamma 1.0 image. Yes. Absolutely. I agree with that. Correct. There are no guarantees of any sort that a projected render will look exactly the same as on a monitor, even if calibrated. But starting with a properly set monitor and a linear workflow puts as many chances as possible on your side. That is a good rule of thumb.
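To make that reciprocal relationship concrete, here is a minimal Python sketch. The 2.2 figure is the sRGB/NTSC convention discussed above; the 0.5 input and the variable names are purely illustrative:

```python
# A minimal sketch of how the 0.45 encoding and the monitor's 2.2 gamma cancel out.
linear_value = 0.5                   # a value straight out of the renderer (illustrative)

encoded = linear_value ** (1 / 2.2)  # "correcting for gamma 2.2" actually applies a ~0.45 curve
displayed = encoded ** 2.2           # the gamma 2.2 monitor's response

print(round(encoded, 3))    # ~0.73 - the encoded pixel looks brighter
print(round(displayed, 3))  # ~0.5  - net result is the original linear value (gamma 1.0)
```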
  2. ???? Robert. You indeed took stock objects from the CD and placed them in a default lighting scene, and that is why it looks OK without gamma correction but not with it. The default A:M lighting scheme is specifically designed to make renders look good without gamma correction, precisely because until very recently almost nobody knew much about the intricacies of linear workflow, and even today almost nobody understands it, even though a lot more people are aware of the issue. Furthermore, most users just want to get a render that looks good when they hit the render button and are not even aware of color space issues. It is not your fault, Robert, nor A:M's fault. Every other 3D application works that way too, because that is the way it developed throughout the years, and hacks, tweaks and tricks were designed by artists to compensate for that misunderstanding, but that is changing. Even large production studios have their production pipelines organized to compensate for a lot of those issues that pop up in renders because of their use of a non-linear workflow. That is changing too, thanks to more and more use of global illumination techniques, which cannot be hacked that way and which raised awareness about those issues. It is the "workflow", Robert. Every step of the production must be "linear" throughout the whole production pipeline. If every step is "linear", then you will get much more realistic renders at the end. But if you mix non-linear steps, such as lighting setups designed specifically for a non-linear workflow and gamma corrected photos as textures, with a linearization step, such as applying gamma correction to the final render, then this will not work, because the final result will look like gamma correction has been applied twice.
  3. Of course, the right image looks way too washed out. What else did you expect, Robert? You have a scene designed to look good in non-linear light. Of course, once you apply a gamma 2.2 on it, it will look bad. This scene does not look realistic at all. There are shadows on the floor but not inside the boxed cube. So I guess you used z-buffered shadows with shadows set to 80%, which means that once gamma 2.2 is applied they will result in shadows of only about 60%. This poll just proves that with hacks and tweaks, you can get an image to look good under any specifically selected circumstances. I don't want to look like I'm on a mission to evangelize this gamma and linear workflow stuff here and annoy people with it. I learned something important for CG artists and I just wanted to share it. There are no right and wrong ways about this. Just a different way of working. If your own workflow is right for you, then there is no need to change. If you are not looking for realism or photorealism, then there is certainly no need to care about this. If, however, you are reaching for more realistic lighting, even in non-photorealistic scenes, and you are running into difficulties and starting to add more and more lights or negative lights, then maybe that linear workflow stuff is worth looking into.
  4. Yes. Because that is the old default non-gamma corrected setting and users have been using this setting for all their projects. This can't be changed without the user explicitly setting it. The "linear workflow" is a "workflow" because it does not only involve changing the render gamma. You also have to texture and light in a linear setting. It involves the whole workflow. It is almost guaranteed that if you did all your texturing and lighting in a gamma compensated setting, then gamma correcting the render will not look good. Yes. A gamma 2.2 monitor is a quite dark monitor, BTW. That is very unlikely. If your workflow is set up right, then your image will look right on any monitor. At least it will look just as right as any digital photo on the same monitor.
  5. The more I have to explain this gamma / tone correction stuff to different people, the more I narrow it down to a simplified set of rules: 1) Renderers are radiometric calculating applications, so the data we feed them should be linear and the data they produce will be linear. 2) The data we humans are used to working with is not linear. It is always adjusted for our human vision. And so this data should always be linearized before the renderer uses it. 3) The data that is produced by the renderer (read: the rendered image) is linear, and so it should always be adjusted for our human vision. -------------------------------------------------------------- This set of rules is very simple. Yet the counter-argument that comes up every time in a gamma discussion is the "device" argument: the fact that different devices have different gammas and that we cannot control the gamma of the end viewer's equipment. It is not the artist's responsibility to try to solve such a technical and unsolvable problem. It is the engineers' and the industry's responsibility. Indeed, the industry took its responsibility and adopted the sRGB standard to solve those device issues. There is a lot of complexity behind the sRGB standard so that an image can be used on many different device types, each with its specific color gamut. But the beauty of this standard is that it can be boiled down to this extremely simple rule: gamma correction is 2.2. And gamma correction is the standard transformation that allows us to transform linear radiometric data for human vision comfort and vice versa. So forget all you may have read concerning device gamma and device color gamuts. That is good-to-know stuff for engineers, but for anyone else, trying to figure out what is what in there just throws more confusion on the issue. You just need to remember that the devices, any devices you are using, are designed so the colors and tones a human sees on them are comfortable to human vision, but they are not linear. Colors coming from any device that is adjusted for human vision comfort should be linearized before being fed to the renderer. That includes not only photos used as textures but also any color coming from picking a color in a photo, or picking or setting a color in a color selection widget. Since picking a color is done on a human-comfortable device, those should be linearized as well. Concerning pointers to other tutorials on this, you just need to type "linear workflow" in Google and you will get most of the discussions and tutorials on the subject. A word of caution, though: a lot of those discussions are from people who have just discovered the issue and figured out their own workflows, so there is a lot of confusion in there.
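As a minimal sketch of rules 2 and 3, assuming the simplified pure 2.2 power curve described above (the actual sRGB standard uses a slightly different piecewise curve, but 2.2 is the working approximation; the function names here are made up for illustration):

```python
GAMMA = 2.2

def linearize(display_value):
    """Rule 2: convert a gamma-corrected value (texture pixel, picked color)
    to linear light before handing it to the renderer."""
    return display_value ** GAMMA

def correct_for_display(linear_value):
    """Rule 3: convert the renderer's linear output back to a
    human-vision-friendly, gamma-corrected value."""
    return linear_value ** (1.0 / GAMMA)

# A mid-grey picked in a color widget (0.5) is much darker in linear light:
print(round(linearize(0.5), 3))              # ~0.218
# ...and a linear render value of 0.218 displays back as the expected mid-grey:
print(round(correct_for_display(0.218), 3))  # ~0.5
```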
  6. Just one point of precision. Texture files, when used as color decals, should always be restricted to low dynamic range because they must represent surface reflectance. Since a surface cannot reflect more light than it receives, the texture data should always be restricted to between 0 and 1. So compressing an HDR texture image with an S curve would be wrong. An HDR image file that is to be used for color texturing should be scaled down to bring the white to about 0.9. Also, by definition, an HDR file is necessarily linear. It is not gamma corrected or tone corrected. This said, some people don't understand that and apply a tone correction before storing the image in an HDR file.
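A rough sketch of that scaling step, assuming the HDR image has already been loaded as a linear float array. The function name and the toy pixel values are made up; the 0.9 target white is the figure suggested above:

```python
import numpy as np

def hdr_to_reflectance(hdr_pixels, target_white=0.9):
    """Scale a linear HDR image into the 0..1 reflectance range by bringing
    its brightest value down to ~0.9, instead of compressing it with an S curve."""
    peak = hdr_pixels.max()
    if peak <= 0:
        return np.zeros_like(hdr_pixels)
    return hdr_pixels * (target_white / peak)

# Example: a toy "image" whose white point sits at 4.0 in linear light.
image = np.array([[0.1, 1.0],
                  [2.5, 4.0]])
print(hdr_to_reflectance(image))  # brightest pixel becomes 0.9, the rest scale with it
```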
  7. I agree that it depends, but I don't agree that it depends on the system or the display. It depends on what your intents are. If you are going for photorealistic rendering of lights and shadings and shadows, then you must gamma correct throughout the whole production pipeline. If you are not going for photorealism, then you may do whatever you want. The "Mac is gamma 1.8" thing is an old story. Mac displays are no longer corrected for a gamma of 1.8. They follow the rest of the industry, which is oriented toward the sRGB standard, and that requires a gamma 2.2 correction on photos. And that includes digital 3D renders too. Digital photos and computer displays are corrected for gamma 2.2. That means that any material colors are also gamma corrected. Before being used by a renderer, they must be linearized, meaning that their gamma correction must be reversed. Applying a gamma 2.2 correction to renders is just doing the same type of correction that is applied to digital photos. Just as there is no "it depends" for photos, there is no "it depends" for renders. Different media production pipelines are geared to correctly print or display photos, which are gamma corrected, so if your renders are also gamma corrected, they are already suited for print, display on a computer screen or TV, or printing to film. Not doing the reverse gamma correction on textures and material colors and not doing the gamma correction on the renders is just plain wrong. It is working in the wrong color space all along. Here is another page with further explanation as to why not gamma correcting is wrong.
  8. I know that Dusan has finished the animation on it and he is rendering the very last scenes.
  9. OK. Here is a WOW from me too. The radiosity still does not do it justice, though. You need to gamma correct the final render. See the attachment and see my tutorial on Tone Correction.
  10. Great model. Bravo. I would add some HDRI environment reflection to that.
  11. Cory Collins is a very talented artist with years of experience who can draw wonderfully, sculpt marvelously, and has an inherent and incredible talent for animation. He was recently hired by a large game studio where he will be working alongside people such as Sthalberg and Targete. Some of the other things he did in A:M: a big crab-like meatbug monster, Thom gets attacked and, of course, Slap Happy.
  12. That's a nice little job. Professional video graphics. Clever!
  13. I see several issues. First, the squash and stretch is way too exaggerated. The stretch before it hits the ground should not be there, especially if you are using motion blur. The motion is not right. The first bounce is 1/2 the height from which the ball first falls. So the second bounce should also be 1/2 the height of the first bounce, and the third bounce should be 1/2 the height of the second bounce, etc. The acceleration when the ball falls and the deceleration when the ball rises are almost linear. There is very little sense of acceleration or deceleration. The distance the ball travels when it falls is proportional to the square of the elapsed number of frames. So the distance the ball travels down between frames 0 and 2 is 4 times the distance traveled between frames 0 and 1, and the distance the ball travels down between frames 0 and 3 is 9 times the distance traveled between frames 0 and 1, etc. The timing is not right. Assuming that you keep a 1/2 height bounce, the timing between each floor hit should be 70% of the time of the previous floor hit. I counted 10 frames from the fall to the first hit, and this is equivalent to half a bounce, so if it was a full bounce it would have taken 20 frames. That means that the second floor hit should be 14 frames away. It is currently 20 frames away. Then the third floor hit should be 10 frames away, then 7 frames, then 5 frames, then it gets complicated because we start getting half frames, so the next one would be 3 frames away and then 2 frames away, and then less than 2 frames can't be perceived except as some blurring. BTW, this 70% is not just arbitrary. It follows the law of gravity. The duration of a bounce is proportional to the square root of its height, and the square root of 50% is about 70%. The deceleration at the end is too short. The ball should continue to roll and decelerate much more slowly. Once you have those issues straightened out, then you can start playing with the timing and the motion to convey a different feeling, but IMO you need to get those straight first.
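As a rough check of those bounce numbers, here is a small Python sketch. The 10-frame first half-bounce is the count from the post above; everything else is derived from the 1/2 height and ~70% timing rules:

```python
import math

FIRST_HALF_BOUNCE_FRAMES = 10         # frames from the release to the first floor hit
HEIGHT_RATIO = 0.5                    # each bounce reaches half the previous height
TIME_RATIO = math.sqrt(HEIGHT_RATIO)  # ~0.707: bounce duration scales with sqrt(height)

full_bounce = 2 * FIRST_HALF_BOUNCE_FRAMES  # a full first bounce would take 20 frames
height = 1.0

for hit in range(2, 7):
    full_bounce *= TIME_RATIO
    height *= HEIGHT_RATIO
    print(f"hit {hit}: ~{full_bounce:.1f} frames after the previous hit, "
          f"height {height:.2f} of the original drop")
# Prints roughly 14, 10, 7, 5, 3.5 frames between successive floor hits.
```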
  14. Interesting that you animated the resistor symbol instead of a resistor. One critique I have is that the animation does not show the true nature of a resistor, though. One way to show that would be to have electrons enter the resistor and gradually slow down as they pass through it, exiting at a much reduced speed.
  15. Start here: http://www.ypoart.com/tutorials/photon/index.php You will find the Cornell Box tutorial there, and to answer your questions regarding other room sizes, look at the "Calculating the Sample Area" and "Visually Finding Optimal Photon Properties" pages.
  16. This is really awesome. There is something in those trees and the fog which reminds me of Walt Disney's Ave Maria.
  17. That was discussed a while ago on the forum. See http://www.hash.com/forums/index.php?showtopic=27347
  18. @jakerupert: This is a house facade from Quebec City's old town, so this is French architecture. BTW, this is the 400th anniversary of Quebec City this year. Quebec City is the oldest city in North America and there are some very old and nice houses in the old town. @c-wheeler: This house facade was meant to be seen only as a background, so the modeling is already quite detailed for that purpose. This said, the model turned out better than I expected, so I might decide to use it a little nearer to the camera than I originally planned. But apart from adding more details, it would need bevels, and that is the sort of detailing that is better planned from the start of the modeling process rather than added as an afterthought. @kamikaze: It is a 4-minute film based on a (French) song from a local artist. A music video, in a way. It is the story of a 15-year-old boy, new to a neighborhood, who is having difficulty being accepted by the local group of kids. The song is very slow and moody and the lyrics are not explicit about the story, but rather evocative. The film and animation will be very moody as well. Not much action.
  19. Here is a model for my short film project. It is far from complete and this is not the main building model of the project, but I put it together so I can start testing lighting setups. C&C welcome.
  20. For backward compatibility with old models made with older A:M versions that had the old bump maps.
  21. Impressive! Both the machine and the model are impressive!
  22. Very good point. This scene is an excellent example where a "linear workflow" would have immensely helped. See my tutorial on the subject. Indeed, the image is very dark on my monitor. A tone correction would help bring back a more natural illumination even though it is night time. The hard cutoff is an acceleration trick. The way to get the light to continue and fade off smoothly is to use OpenEXR as the output file type.
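One way to read that OpenEXR advice, sketched with made-up pixel values: a float output keeps the dim tail of the light's falloff that an 8-bit file crushes into a couple of levels, so a tone correction in post can bring the fade back. This is only an illustration of the principle, not A:M's actual light code:

```python
import numpy as np

# Linear render values along a light's falloff, fading toward darkness (illustrative).
falloff = np.array([0.25, 0.06, 0.015, 0.004, 0.001, 0.0002])

# Quantized to 8 bits, the dim tail collapses to almost nothing,
# which shows up as a visible hard cutoff where the light "stops".
eight_bit = np.round(falloff * 255).astype(int)

# Kept as float (as in an OpenEXR file), the tail survives, and a gamma /
# tone correction in post lifts it back into the visible range smoothly.
corrected = falloff ** (1 / 2.2)
corrected_8bit = np.round(corrected * 255).astype(int)

print(eight_bit)       # [64 15  4  1  0  0]   <- falloff dies abruptly
print(corrected_8bit)  # [136 71 38 21 11  5]  <- smooth, continuous fade
```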
  23. Very interesting and nice character design. I like them all. And I also prefer them without the specularity.