jakerupert Posted February 22, 2010

I did some more test rendering for a project of mine: full HD format with quite a few big models in it, with shadows, reflections and toon render switched on. I know I should use a single-image file format, but just as a general test of whether it would work at all, I chose QuickTime at high quality. What happens is that my scenes render quite o.k., with times from 5 to 15 minutes per frame, until all of a sudden, from one frame to the next, the render time jumps to 40 minutes or more, as if the render had stalled somehow. Can I forget it and quit now, or does it make sense to show some more patience? What might be the reason? Not enough RAM maybe? Would this happen with TGA or PNG as well? (With those I could quit and start again from the problem frame.) Any ideas or experiences, anybody?
John Bigboote Posted February 22, 2010

Another reason why we recommend rendering to an image sequence. You can watch the render readout to see what it is currently working on, and that may give a clue to the bottleneck. If you don't really, really need full 1920x1080 HD, you could use 1280x720 or even 864x486... and you might want to employ a second render instance, one on the even frames and one on the odds, maybe even a third.
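(A minimal sketch of the even/odd split idea above, in Python rather than anything built into A:M; the frame range and instance count are made-up example values, and you would still launch the render instances yourself with the resulting frame lists.)

```python
# Sketch: split a frame range into interleaved batches,
# one batch per render instance (evens and odds for two instances).
# Frame range and instance count below are arbitrary examples.

def split_frames(first, last, instances=2):
    """Return one list of frame numbers per render instance."""
    frames = range(first, last + 1)
    return [[f for f in frames if f % instances == i] for i in range(instances)]

if __name__ == "__main__":
    batches = split_frames(0, 20, instances=2)
    for i, batch in enumerate(batches):
        print(f"instance {i}: frames {batch}")
```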
Hash Fellow robcat2075 Posted February 22, 2010

"What happens is that my scenes render quite o.k., with times from 5 to 15 minutes per frame, until all of a sudden, from one frame to the next, the render time jumps to 40 minutes or more, as if the render had stalled somehow."

An interesting experiment would be to render that frame at a smaller size and see if there is a proportionally similar increase in render time. If there is, you know it's something to do with the models or whatever else is in that shot.

But really, if I had something that took even 5 minutes per frame, I'd render to TGAs. When a program is rendering to QuickTime it has to keep track of all the frames it has rendered, not just the one it is working on, so it's possible there is some sort of accumulating memory problem.
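(To make that proportionality check concrete, here is a small sketch with invented timings; in practice you would note the per-frame times from the render readout at both sizes. If the slow frame is just as disproportionately slow at the reduced size, the scene content is the likely culprit; if it only stalls at full size, suspect memory or the output format instead.)

```python
# Sketch of the "render smaller and compare" diagnostic.
# All timings below are invented example values.

def slowdown(times, frame):
    """How much slower the given frame is than the average of the other frames."""
    others = [t for f, t in times.items() if f != frame]
    return times[frame] / (sum(others) / len(others))

full_hd = {10: 6.0, 11: 5.5, 12: 42.0, 13: 6.5}   # minutes per frame at 1920x1080
quarter = {10: 1.5, 11: 1.4, 12: 10.5, 13: 1.6}   # minutes per frame at a smaller size

frame = 12
if abs(slowdown(full_hd, frame) - slowdown(quarter, frame)) < 1.0:
    print("Slowdown is proportional at both sizes: suspect the scene content.")
else:
    print("Slowdown only appears at full size: suspect memory or output settings.")
```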
jakerupert (Author) Posted February 23, 2010

Seems it had nothing to do with QuickTime or single image files. I guess it was one of my older models that had internal patches. I presume the light rays can somehow get caught inside the model, where they bounce around endlessly and stall the render. To illustrate the dangers of internal patches, I'm posting a model of a hangar door from the time when I was carelessly extruding away. Note that in picture A you don't see anything odd or unusual, but in picture B you can see the internal patches. (That's why I nowadays always model with "show backfacing polygons" OFF; it's ON by default, though maybe it should be the other way round.) So when you later discover that you have accidentally created internal patches like these, you can easily delete those splines by comma-clicking them and hitting the delete button. Everything seems to render quite smoothly now, and I am really stunned at what A:M (and I) can render these days: really huge scenes, both in size and in patch count. I think I will be able to post some examples soon.
Hash Fellow robcat2075 Posted February 23, 2010

"Seems it had nothing to do with QuickTime or single image files. I guess it was one of my older models that had internal patches."

That's a great insight.
HomeSlice Posted February 23, 2010

Thanks for this post. One more A:M mystery solved!