Hash, Inc. - Animation:Master

Luuk Steitner

Hash Fellow
  • Posts: 623

Everything posted by Luuk Steitner

  1. You can already give it a try with the new squetch rig; David uploaded it today. He also included exaggeration poses in his rig and an example BVH file. But it's better to track your own video so you know what you're going for. The release is getting very close. I have beta 5 ready, but this one only works with an installer, so I'm making an installer now. I hope this will be the last beta. When this is done we'll do some more testing, and I'll finish the manual and update my website. I expect this will take one or two weeks.
  2. That's nice, Al! Not bad at all for a second attempt.
  3. I'm not sure, but I think it should work. You should give it a try when I release the software (trial) and let me know.
  4. It looks nice. Are you going to do something about the eyes? They seem a little vacant.
  5. Hi again. I'm still working on some improvements, but I'm almost there (I think). I have some good news for those who like to work with the squetch rig: David Simmons has added a BVH function to the squetch rig, so animations recorded with Zign Track are now very easy to use. Here's an example video with David's Squetchy Sam. I forgot to store the audio when saving my captured video file, but the movements are pretty accurate.
  6. Yes, the face needs to be rigged with a rig that matches the BVH file. For now it's just a face tracker; we'll see what the feature will bring, but don't expect you'll be doing full-body tracking with just one camera. There is, however, a 2D part in the program that allows you to track anything you like with any number of features. It is an experimental part that isn't completely finished yet, and I'm not sure I'll have it finished for the first release. If not, I'll finish it in a later (free) upgrade. What you can do with that feature depends entirely on your imagination... I understand you want it now, but it isn't ready to be released yet. I'm working very hard to fix the last things that don't work smoothly yet, and I decided to improve the tracking algorithm. Just be patient for a few weeks.
  7. It says "Translate = ..|..|CP #1.Translate.X". That's wrong: the left and right sides have to refer to the same level of the property. "Translate.X = ..|..|CP #1.Translate.X" or "Translate = ..|..|CP #1.Translate" would be OK.
  8. The other expressions you showed had three "..|" steps; that could be the problem. (Translate.X is a child of Translate, so that's one step more.) The best way to avoid mistakes is to select the variables with your mouse instead of typing them.
  9. "Syntax error" means that your expression is not OK. Try this for the Translate.X expression: (..|..|..|Spline #164|CP #169.Translate.X + ..|..|..|Spline #164|CP #170.Translate.X) / 2; that should give the average X of CP #169 and #170 (there's a rough sketch of this averaging after this list). If this doesn't work I'll try it in A:M, but I think this is correct.
  10. I think the way to do this is to use an expression for each axis, like Translate.X = (x1 + x2) / 2.
  11. I will add more marker positions in a later update; first things first... Fewer is possible: you don't need to set them all. Only the neck, forehead, nose and chin are always required, because they are the base for the calculations. Not every dot. The dots on the neck, forehead and eye corners are only used as a reference to calculate the motion.
  12. Yes, you can use it for your monsters. The interface looks like this for now (pictures are a bit blurred because of the low-quality compression). In reply to: "EXAGGERATE! Sweeeet! Does your system track the eyes aiming? I don't think it's too big a deal if it doesn't. It would be SOMETHING if you actually brought this to market for us all. The 'animator' in me says 'It's too much of a cheat!' but the filmmaker in me says 'Long-form conversation-driven animation is possible!' The results are quite compelling!" It doesn't have eye aiming yet; I'll experiment with that later.
  13. Yes, I will definitely make a tutorial on how to work with it. I'll write one when everything is working like it should, and I will probably make a video tutorial too.
  14. Thanks for all your thoughts about this. I haven't decided yet, but we'll see. Yes, I'll use PayPal's services for a start. That seems the best solution for now, because it's the first time I'll be selling software online.
  15. Just to show off again, here's a rendered animation with the smoothing I mentioned earlier. See this video. This animation was tracked from the same video as the previous tests, but this time the jitter is gone. By the way, did I mention I also added exaggerate options for each bone?
  16. It isn't really sub-pixel accurate. There's no need for that, because all values are converted to floating-point variables and then smoothed. You're right about using smaller stickers, but those are hard to find and I didn't feel like making them myself. That's no problem at all, though: in the configuration you can specify the size of the stickers used, and the algorithm will track to their center. If you want to edit the motions manually you can do that in the choreography, or add a second action and blend them. The marker stickers I have used can be any normal sticker with a basic color. The brands I used are Herma and Avery; they should be available in any office supply store.
  17. I don't know about Mac issues, but it could be your compression. Which compression type are you using?
  18. I'm really not sure about the price yet, so don't hold me to it (it might also end up lower), but I'm currently thinking about something like 129 euros, keeping in mind that the value of the US dollar is very low at the moment and I don't want it to get too expensive for American customers. Minus VAT and multiplied by the current exchange rate, that would be about 150 USD. I think that's a reasonable price considering what you get and the amount of work involved, especially if you compare it to the price of other motion tracking solutions. I hope to be able to release the first version at the end of November.
  19. I was waiting for someone to ask that. I won't include it in the first release, but when I'm done getting everything to work for version 1, I'll start experimenting with this. I do think it would be possible to get seemingly good results, but that will greatly depend on the quality of the video file. Yes, that will absolutely improve the results: the movement can be tracked more accurately, and because of the higher resolution the calculated angles will also be more accurate. And you'll need less or no smoothing to prevent jitter.
  20. I just finished the after-smooth algorithm that smooths the BVH rig (the general idea is sketched after this list). It is sooooo smooth! I like it! I'll post an example later.
  21. If I make enough money with this I will definitely make a Mac version, but you'll have to be patient. Exaggeration would be a very useful feature. For some characters the movement made by the face won't be enough, and A:M can only reduce the enforcement; it won't go over 100%, which is why I want to add it (see the exaggeration sketch after this list). It can always be reduced later in A:M if it turns out to be too much. The jittering is just because the export isn't 100% finished yet. I'm using a smoothing algorithm that smooths the feature movements, but if the resolution of the video isn't high enough there will still be jittering in the movements of the rig. That's why I will make an after-smooth function that smooths the rig after the 3D motion is calculated. That's one of the first things I will add to the program. I hope it works out like I think it will... By the way, I changed a few things and made another test video. Yes, this one still jitters...
  22. I just did another mocap test: see this video. Sorry for my crappy English... With this test I found some more things I have to adjust, like reducing the head rotation and linking the 'sneer' movement to the rotation. This was also a good test to see how the lip sync is working out. It looks like I have smoothed the mouth bones too much. That's easy to fix.
  23. When you import a BVH file in A:M, you do that in an action, so it creates keyframes on every frame in that action only. You can still manipulate the movements in the chor or combine it with other actions. You can apply the action to every model that has the same face rig.
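
The averaging in posts 9 and 10 can be sketched in Python. This is only an illustration of the (x1 + x2) / 2 idea with made-up control point positions; in A:M the expression itself evaluates this per frame.

    def average_translate(cp_a, cp_b):
        # Midpoint of two control point translations, computed per axis.
        return tuple((a + b) / 2.0 for a, b in zip(cp_a, cp_b))

    # Hypothetical positions for CP #169 and CP #170 on one frame.
    cp_169 = (4.0, 1.5, -2.0)
    cp_170 = (6.0, 0.5, -1.0)
    print(average_translate(cp_169, cp_170))  # (5.0, 1.0, -1.5)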
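Posts 16 and 20 mention smoothing the tracked values and after-smoothing the BVH rig. Zign Track's actual algorithm isn't described in these posts, so this is only a minimal sketch of one common approach, a centered moving average over a per-frame channel; the window size and sample values are assumptions.

    def moving_average(values, window=5):
        # Smooth a per-frame channel with a centered moving average.
        half = window // 2
        smoothed = []
        for i in range(len(values)):
            lo = max(0, i - half)
            hi = min(len(values), i + half + 1)
            segment = values[lo:hi]
            smoothed.append(sum(segment) / len(segment))
        return smoothed

    # Example: a jittery head-rotation channel in degrees, one value per frame.
    rot_x = [0.0, 2.1, 1.6, 2.5, 2.0, 2.7, 2.2]
    print(moving_average(rot_x, window=3))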
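Posts 12, 15 and 21 talk about exaggerating the captured motion beyond 100%, which A:M's enforcement alone can't do. Here is a rough sketch of that idea: scale a channel away from its rest value by a factor greater than 1. The channel name, rest value and factor are made up for illustration.

    def exaggerate(frames, rest=0.0, factor=1.5):
        # Scale each per-frame value away from the rest value by 'factor'.
        # factor > 1.0 exaggerates the motion; factor < 1.0 tones it down.
        return [rest + (v - rest) * factor for v in frames]

    # Hypothetical jaw-open channel from a tracked take.
    jaw_open = [0.0, 0.4, 0.9, 0.6, 0.1]
    print(exaggerate(jaw_open, factor=1.5))  # 150% of the original motion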