Hash, Inc. - Animation:Master

Luuk Steitner

Hash Fellow
  • Posts: 623
  • Joined

  • Last visited

Posts posted by Luuk Steitner

  1. I would like to know how to swap a BVH file in an Action. Does anyone know how to do this? I fixed the errant markers for the bottom lip and exported a new BVH file which imported into A:M14 but when I attempted to swap shortcuts in the Action A:M seemed to freeze and I had to quit via the task manager. Any ideas? :)

     

    Just use the "import sequence" function of the biovision object. Once you have set up the constraints you can use the same action for any BVH file that is set up the same way. A good idea is to save your action to a separate action file; if you want to use multiple BVH actions, just import the same action a few times, rename them, and capture the sequence for each one.

     

    Now here's an idea for a useful plugin, or a post process for Zign Track:

     

    BVH2AMAct (BVH to A:M Action)

     

    The idea would be to run a BVH file through this plugin, which would analyze the data for each bone and create an Action file with keyframes, filtering out any movements that fall within user-defined limits for each marker/bone. This would generate a smaller file and would make post-editing much easier and native to A:M.

     

    In the meantime BVH is good! :)

     

    You mean a file with fewer keyframes? That's not a bad idea and could be useful for any BVH file. Maybe I'll make that later. Or some plugin writer with some spare time might do it :)
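The keyframe-reduction idea described above can be sketched roughly as follows. This is only an illustration of the filtering concept, not Zign Track's or A:M's actual code; the channel format and threshold are assumptions.

```python
def reduce_keyframes(values, threshold=0.01):
    """Keep only the keyframes that matter: a frame is kept when its value
    deviates from a straight line between the last kept key and the next
    sample by more than `threshold`. `values` is one motion channel
    (e.g. a bone rotation) sampled once per frame."""
    if len(values) < 3:
        return list(range(len(values)))
    kept = [0]  # always keep the first frame
    for i in range(1, len(values) - 1):
        a = kept[-1]
        b = i + 1
        # linear interpolation between the last kept key and the next frame
        t = (i - a) / (b - a)
        predicted = values[a] + t * (values[b] - values[a])
        if abs(values[i] - predicted) > threshold:
            kept.append(i)
    kept.append(len(values) - 1)  # always keep the last frame
    return kept
```

A flat channel collapses to just its two endpoints, while a sudden movement (like the lip spike discussed later in the thread) keeps its keyframes.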

  2. What is the best approach to cleaning up animation like this? I suppose I bake the action and then just tweak the bones on the problematic frames?

     

    Well done Paul. At the point where the upper lip is going a bit too high you might want to reduce the enforcement of the constraints. Where the lower lip is making that sudden movement there are probably one or two frames where the tracking lost the marker. Did you save the project in Zign Track? If you reopen it you can take a look at those frames to see what happened during tracking.

    It's great to see such a nice video!

     

    Cheers.

  3. I would love to see some examples of exaggeration, if anyone has time to post those experiments... I just picked up a little mini-DV camera and this program was one of the main reasons for getting one.

     

    Looking forward to its release.

     

    You can already give it a try with the new squetch rig. David uploaded it today. He also included exaggeration poses in his rig and an example BVH file.

    But it's better to track your own video so you know what you're going for.

     

    The release is getting very close. I have beta 5 ready, but this one only works with an installer, so I'm making one now. I hope this will be the last beta. When it's done we'll do some more testing, and I'll finish the manual and update my website. I expect this will take one or two weeks.

  4. Hey Luuk,

     

    I was wondering if you think your software will run under Parallels or VMware Fusion (virtual machine software). I would like to have this working on my Mac... even through a VM...

     

    I'm not sure, but I think it should work. You should give it a try when I release the software (trial) and let me know.

  5. Hi again,

     

    I'm still working on some improvements, but I'm almost there (I think).

    I have some good news for those who like to work with the squetch rig; David Simmons has added a BVH function to the squetch rig, so animations recorded with Zign Track are now very easy to use. Here's an example video with David's Squetchy Sam.

    I forgot to store the audio when saving my captured video file, but the movements are pretty accurate.

  6. Me Want now!! :D

     

    Ok... so, would someone need to set up the bones in the face before applying the BVH-generated file? Or does this require a specially rigged face or skeleton?

     

    Is it just a facial tracker for now? Or will it grow into a full body tracking program?

     

    and as a side note.... me want now!! :D

     

    Any ideas on a release date?

    Yes, the face needs to be rigged with a rig that matches the BVH file. For now it's just a face tracker. We'll see what the future will bring, but don't expect you'll be doing full-body tracking with just one camera. There is, however, a 2D part in the program that allows you to track anything you like with any number of features. It is an experimental part that isn't completely finished yet. I'm not sure I'll have it finished for the first release; if not, I'll finish it in a future (free) upgrade. What you can do with that feature depends entirely on your imagination...

     

    I understand you want it now, but it isn't ready to be released yet. I'm working very hard to fix the last things that don't work smoothly yet, and I decided to improve the tracking algorithm. Just be patient for a few weeks ;)

  7. But in this case it shouldn't happen, should it? I mean, what could be wrong in a simple expression like "..|..|CP #1.Translate.X"? That's because I thought it could be a bug that should be reported, unless I'm missing something...

    The other expressions you showed had "..|" three times. That could be the problem (Translate.X is a child of Translate, so that's one step more). The best way to avoid mistakes is to select the variables with your mouse rather than typing them.

  8. Thanks for the images, I like the interface :lol:

     

    Is it possible to add more or fewer motion capture control points?

     

    I will add more marker positions in a later update. First things first...

    Fewer is possible; you don't need to set them all. Only the neck, forehead, nose and chin are always required, because they are the base for the calculations.

     

    Looks like every dot on the face would be associated with a bone in the model's face?

     

    Not every dot. The dots on the neck, forehead and eye corners are only used as references to calculate the motion.
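The role of those reference dots can be illustrated with a toy sketch. This is not Zign Track's actual algorithm, just a common way to separate head motion from facial motion: express each facial marker relative to a stable point (the nose) and scale by a stable distance (forehead to chin), so the result doesn't change when the whole head moves or comes closer to the camera.

```python
import math

def normalize_marker(marker, nose, forehead, chin):
    """Express a facial marker relative to the nose, scaled by the
    forehead-chin distance, making it invariant to head translation
    and distance to the camera. All inputs are (x, y) pixel tuples."""
    scale = math.dist(forehead, chin)  # head size in pixels this frame
    return ((marker[0] - nose[0]) / scale, (marker[1] - nose[1]) / scale)
```

With this normalization, a lip marker in the same facial pose yields the same coordinates even if the head has moved or the face appears twice as large in a later frame.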

  9. Really nice tool!!! Some images of the app interface please :P

     

    Is it possible to animate creature or monster faces, not just human faces?

     

    Yes, you can use it for your monsters ;)

    The interface looks like this for now (the pictures are a bit blurred because of the low-quality compression):

    Interface1.jpg

    Interface2.jpg

     

    Btw, did I mention I also added exaggerate options for each bone? :rolleyes:

     

     

    EXAGGERATE! Sweeeet!

     

    Does your system track the eyes aiming? I don't think it's too big a deal if it doesn't. It would be SOMETHING if you actually brought this to market for us all. The 'animator' in me says "It's too much of a cheat!" but the filmmaker in me says "Long form conversation-driven animation is possible!" The results are quite compelling!

    It doesn't have eye aiming yet. I'll experiment with that later.

  10. I think a video tutorial should be included or available for download with the purchase of the app, explaining how to set up the rig and how to get it to work with bvh data. Then I'm in...

     

    Yes, I will definitely make a tutorial on how to work with it. I'll write one when everything is working like it should, and I'll probably make a video tutorial too.

  11. Thanks for all your thoughts about this. I haven't decided yet but we'll see.

     

    Luuk,

     

    Thanks for the info, I'll start saving up. I'm guessing you'll be going through a PayPal-type service, or some way one can use a credit card? Keep up the good work!

     

    Yes, I'll use PayPal for a start. That seems the best solution for now, because it's the first time I'll be selling software online.

  12. Just to show off again, here's a rendered animation with the smoothing I mentioned earlier. See this video.

     

    This animation was tracked from the same video as the previous tests, but this time the jitter is gone :lol:

    Btw, did I mention I also added exaggerate options for each bone? :rolleyes:
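The jitter removal mentioned above is typically some form of low-pass filtering over each marker channel. A minimal moving-average sketch of the idea follows; the actual filter Zign Track uses is not stated in the thread, so treat this as an assumption.

```python
def smooth(channel, radius=2):
    """Moving-average filter: each frame becomes the mean of its
    neighbors within `radius` frames, which suppresses the
    single-frame tracking noise that shows up as jitter."""
    out = []
    for i in range(len(channel)):
        lo = max(0, i - radius)
        hi = min(len(channel), i + radius + 1)
        window = channel[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

A one-frame spike of 10 units gets spread and flattened, while a constant channel passes through unchanged. Stronger smoothing trades jitter for sluggish response, which is why the later posts note that HD video needs less of it.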

  13. Hi Luuk,

     

    That is sooo impressive. I can't wait to see your future work.

     

    I only have one question. How sub-pixel accurate is your tracker (1/2 pixel , 1/4, etc.) ??

     

    As I'm sure you already know, your sub-pixel accuracy will help a lot with lower resolution video captures.

     

    Also, you might want to consider using small square stickers instead of dots. Or if you want to use dots, draw a black 90 degree corner in the middle of your dots or use smaller dots. This will also help the accuracy of your tracks.

     

    I wish you all the best with your project,

    Greg Rostami

    It isn't really sub-pixel accurate. There's no need for that because all values are converted to floating-point variables and then smoothed.

    You're right about using smaller stickers, but those are hard to find and I didn't feel like making them myself. But that's no problem at all: in the configuration you can specify the size of the stickers used, and the algorithm will track to the marker's center.

     

     

    I am interested in this motion tracking software package. I hope it can work OK with my webcam. But can we edit the facial animation after we apply it? If it is a keyframe per frame, I assume you can't really edit it much. And what are the requirements for the markers? Where do you even get something like that?

    If you want to edit the motions manually you can do that in the choreography, or add a second action and blend them.

    The marker stickers I have used can be any normal sticker with a basic color. The brands I used are Herma and Avery; they should be available in any office supply store.
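"Tracking to the marker's center" usually means computing the centroid of the pixels that match the sticker color, which is also why sub-pixel precision comes out naturally. A toy sketch under that assumption (not the actual tracker):

```python
def marker_centroid(mask):
    """Centroid of a binary mask (list of rows of 0/1) where 1 marks
    pixels matching the sticker color. Returns (x, y) with sub-pixel
    precision, or None if the marker was lost this frame."""
    sx = sy = n = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                sx += x
                sy += y
                n += 1
    if n == 0:
        return None  # marker lost this frame
    return (sx / n, sy / n)
```

Averaging over all matching pixels is what lets a marker of any configured size land on fractional pixel coordinates, so the sticker size itself barely limits accuracy.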

  14. Luuk, first off, I enjoy your accent and your English is just fine. I would like to ask for a ballpark figure on what you think you may charge for the app? I like to put aside money for these purposes, like I do with my A:M subscriptions; it's a must-have app for me. Also, a possible release date will help in that matter too. I have already started a list of other possible uses for such an app.

    Mike

     

    I'm really not sure about the price yet, so don't hold me to it (it might also get lower), but I'm currently thinking about something like 129 euro. Keeping in mind that the value of the US dollar is very low at the moment, I don't want it to get too expensive for the American customers. Minus VAT and multiplied by the current rate, that would be about 150 USD. I think that's a reasonable price considering what you get and the amount of work involved, especially if you compare it to the price of other motion tracking solutions.

    I hope to be able to release the first version at the end of November.
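The arithmetic behind the ~150 USD figure works out as below. The 19% Dutch VAT rate and the exchange rate are assumptions for illustration; neither is stated in the post.

```python
price_eur = 129.0    # proposed list price, VAT included
vat_rate = 0.19      # Dutch VAT in 2007 (assumption)
eur_to_usd = 1.38    # approximate late-2007 EUR/USD rate (assumption)

net_eur = price_eur / (1 + vat_rate)   # strip VAT for non-EU customers
price_usd = net_eur * eur_to_usd       # comes out near 150 USD
```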

  15. Hey that looks promising. What will it take to track the eyelids and eye direction?

     

    I was waiting for someone to ask that :) I won't include it in the first release, but when I'm done getting everything to work for version 1 I'll start experimenting with this. I do think it would be possible to get reasonably good results, but that will greatly depend on the quality of the video file.

     

     

    I have a HDV cam (hi-def). Would that actually improve things or is it more of a matter of good lighting and contrast?

     

    Yes. That will absolutely improve the results, because it can track the movement more accurately, and because of the higher resolution the calculated angles will also be more accurate. You'll also need less or no smoothing to prevent jitter.
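Why resolution improves the calculated angles can be shown with a rough sketch (the pixel figures are illustrative, not measured): a one-pixel tracking error subtends a smaller angle when the tracked markers are more pixels apart.

```python
import math

def angle_error_deg(pixel_error, marker_span_px):
    """Angle subtended by a given pixel error across the distance
    (in pixels) between two tracked markers; smaller means less
    jitter in the rotations computed from those markers."""
    return math.degrees(math.atan2(pixel_error, marker_span_px))

sd_error = angle_error_deg(1, 100)   # markers ~100 px apart at SD (assumption)
hd_error = angle_error_deg(1, 250)   # same markers ~250 px apart in HD (assumption)
```

The same one-pixel error costs roughly 0.57 degrees at the SD spacing but only about 0.23 degrees at the HD spacing, so less smoothing is needed to hide it.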
