Hash, Inc. - Animation:Master

robcat2075

Hash Fellow
  • Posts: 28,206
  • Joined
  • Last visited
  • Days Won: 394

Everything posted by robcat2075

  1. You can put anything in a zip. We need four files: an OBJ file from A:M, an MTL file from A:M, an OBJ file from Blender, and an MTL file from Blender.
  2. I started to do it but it says long posts will be compressed without tags and I figured that would turn all the quotes into wall-of-text.
  3. The files need to be in readable text format like A:M's export files are, not compressed binary format. There is almost certainly an option for that if that is not the default.
  4. Post both here
  5. I did a test with OBJ export and it retained the original JPG format. It didn't convert either the file or the name to BMP. Just in case you missed it, I'm repeating this... this is the fastest way to find out what is going wrong: Make a very simple model with the same groups in both, then open their OBJ exports up in a text editor and the differences will probably be apparent.
  6. Make a very simple model with the same groups in both, then open their OBJ exports up in a text editor and the differences will probably be apparent.
  7. Is the problem not getting groups or not getting maps? If I make a model in A:M, create three groups on it ("Top", "Middle", "Bottom"), and export to OBJ, the group names and definitions are contained in a .MTL file that is written along with the .OBJ. If I look at the MTL file in a text editor, the group names are indeed there. I think the problem is not that A:M is not exporting the groups but that the target application is not reading them in correctly. Tell me if I have completely misunderstood what you are trying to do.
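The "open the exports in a text editor" advice above works because OBJ and MTL are plain text. A minimal sketch of that comparison in Python, pulling out the group and material names so two exporters' output can be diffed quickly (the sample text below is hypothetical, not from an actual A:M export):

```python
def names_with_prefix(text, prefix):
    """Collect whatever follows `prefix` on each matching line."""
    return [line[len(prefix):].strip()
            for line in text.splitlines()
            if line.startswith(prefix)]

# Hypothetical OBJ fragment: "g" lines declare groups.
sample_obj = """g Top
v 0.0 1.0 0.0
g Middle
v 0.0 0.0 0.0
g Bottom
v 0.0 -1.0 0.0
"""
# Hypothetical MTL fragment: "newmtl" lines declare materials.
sample_mtl = """newmtl Top
newmtl Middle
newmtl Bottom
"""

print("OBJ groups:    ", names_with_prefix(sample_obj, "g "))
print("MTL materials: ", names_with_prefix(sample_mtl, "newmtl "))
```

Running the same extraction on the A:M export and the Blender export and comparing the two lists should show quickly whether the names are missing or merely ignored by the importer.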
  8. Experiment using environment map to create the highlights
  9. I presume Cycles is faster but he was never able to show us any more than a speckly preview in real time. It still takes a long time to get a finished result. Last time I tried a Cycles demo it was very slow and pokey. And none of the beauty shots he showed were something that an amateur did with one light. For your Lego project, I'd distribute the sets with lighting set up already.
  10. That's a good word for it. In real film they used to do something called "timing" or "color timing" because color correction had something to do with timing how much light the film got and how long it was developed for different contrast treatments. There's a process they had to get a very blown-out look that was different than just over-exposure, but they had to do it by bleaching the original camera negative. That would be scary.
  11. But the LCD has worse black levels so the dynamic range gained on the top is more than lost on the bottom.
  12. That original "sRGB" version is how every interior scene of the White household looked in "Breaking Bad" so it's not necessarily "wrong".
  13. There is also something not-quite-right in his demonstration of "desaturation". I'm still looking into that... I should note that his lighting technique of lighting the entire room with the light bouncing off the wall... that is something the A:M renderer can only do with Radiosity, which is very time-consuming. I could light that scene to get a very similar result without radiosity, but that is another issue.
  14. It takes me a while to watch that because I have to pause it every three or four minutes to go vomit. Listening to him chuckle at his own jokes is maddening. From what I can see, "Filmic Blender" is a render to a high dynamic range format like OpenEXR with a drop-down list of post-processing presets that do what one would typically do in After Effects. Look at the image below full screen. On the top is his before-and-after comparison of "sRGB" and "Filmic Blender". On the bottom, I've applied a slight "Curve" adjustment and a "Level" adjustment to just the sRGB side. It's kind of like a "gamma" change. Is that result not very similar? Mine is very noisy because I just have a screen grab off YouTube and not the real render. But just that one change has gotten a very similar result. I'm not an expert, but I'm not convinced that the Blender Guru is speaking expertly about all the things he is speaking about.
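The "curve adjustment that is kind of like a gamma change" mentioned above can be sketched numerically. This is only an illustration of the general technique, not the exact adjustment used on the screen grab; the gamma value 1.8 is an arbitrary choice:

```python
def gamma_adjust(value, gamma):
    """Map an 8-bit value through a gamma curve, back to 0-255.

    Black (0) and white (255) stay pinned; midtones are lifted
    when gamma > 1, which is the "brighter without clipping"
    look a simple curve adjustment gives.
    """
    return round(255 * (value / 255) ** (1.0 / gamma))

# Arbitrary illustrative gamma, not a value from the video.
for v in (0, 64, 128, 255):
    print(v, "->", gamma_adjust(v, 1.8))
```

Applying this per channel to every pixel is roughly what a one-point "Curve" or "Gamma" adjustment in a compositor does.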
  15. The basic problem is that A:M and any other 3D software can calculate the brightness of the Sun and total darkness and everything in between. However, common image formats like JPG or Targa don't have enough numeric range to represent all that. They store values of 0 to 255. If 0 is black and 1 is the dimmest detail visible then you can only double brightness 8 times before you run out of numbers and start clipping values. However, that is also about all that the monitors we look at can show: about 8 doublings of brightness from top to bottom. Maybe 9 on a great monitor, maybe way less on a cheap one. Actual film formats, viewed in person, can do better. Print does worse. Example: If I take a picture of the Sun and then display that on my monitor or print it on paper, that image will not be as bright as the Sun was. So, for any display format, choices have to be made about what part of that nearly infinite range of brightness you could calculate is crammed into what can actually be displayed from min to max. OpenEXR renders let you store a far greater range of brightnesses... 30 doublings, I think... but you still have the problem that no display can reproduce that. In addition, there is the problem of color....
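The doubling arithmetic above can be checked directly. The half-float constants used here (largest finite value 65504, smallest positive normal value 2**-14) are the limits of the 16-bit float type OpenEXR commonly stores:

```python
import math

# 8-bit storage: if value 1 is the dimmest visible detail, the ratio
# from 1 up to 255 covers about 8 doublings (photographic "stops").
stops_8bit = math.log2(255 / 1)
print(f"8-bit range: {stops_8bit:.2f} stops")

# OpenEXR half float: from the smallest positive normal value (2**-14)
# to the largest finite value (65504) is roughly 30 doublings,
# matching the "30 doublings, I think" estimate above.
stops_half = math.log2(65504 / 2**-14)
print(f"half-float range: {stops_half:.1f} stops")
```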
  16. There are several statements in the first few minutes that I am extremely doubtful about, such as the assertion that a CRT monitor has an inferior dynamic capability to an LCD monitor. Just that is provably wrong. His comparison of the photo and render at 6:09 completely misstates why those two look different. I haven't gotten to the end of it yet. His solution (haven't gotten there yet) may work in a practical manner, but I sense that he doesn't really know what he's talking about. Again, I haven't watched it to the end yet... but if more dynamic range is the key, that's what OpenEXR is for. That is ostensibly "linear data" untainted by any color management. You can pretty much render to that and do a curve adjustment in post to include or exclude any part of the dynamic range you wish. I'll watch some more later. I'll also note that Yves Poissant has made a number of posts about color management and A:M and that if you say his name three times he will appear.
  17. I made a few tests with your sample in v19, still using the same 0.15 thickness and Jitter set to 100%: 1 pass 3 passes 16 passes 64 passes 256 passes
  18. Cool-looking shots!
  19. That is odd. I guess that's not where the ground plane ends. Maybe that's where the top of the sphere is?
  20. I have sometimes wished there was a way to reduce more than one channel at a time. Do you have an example case?
  21. It works normally here... model window > select spline > Plugins > Wizards > Sweeper
  22. Someone is getting a head start on "Planes, Trains and Automobiles"
  23. Just to try it, I rendered Tore's test project with no blur and added blur in post with After Effects' "Directional Blur". I manually keyframed the size and direction of the blur for each frame. This took longer than either of the rendered blurs. testMB AEDirectBlur_.mov