Hash, Inc. - Animation:Master

Luuk Steitner

Hash Fellow
  • Posts: 623

Posts posted by Luuk Steitner

  1. Really nice tool!!! Some images of the app interface, please :P

     

     Is it possible to animate creature or monster faces, not just human faces???

     

    Yes, you can use it for your monsters ;)

     The interface looks like this for now (the pictures are a bit blurry because of the low-quality compression):

    Interface1.jpg

    Interface2.jpg

     

    Btw, did I mention I also added exaggerate options for each bone? :rolleyes:

     

     

    EXAGGERATE! Sweeeet!

     

     Does your system track the eyes' aiming? I don't think it's too big a deal if it doesn't. It would be SOMETHING if you actually brought this to market for us all. The 'animator' in me says "It's too much of a cheat!" but the filmmaker in me says "Long-form conversation-driven animation is possible!" The results are quite compelling!

    It doesn't have eye aiming yet. I'll experiment with that later.

  2. I think a video tutorial should be included or available for download with the purchase of the app, explaining how to set up the rig and how to get it to work with bvh data. Then I'm in...

     

     Yes, I will definitely make a tutorial on how to work with it. I'll write one when everything is working as it should, and I'll probably make a video tutorial too.

  3. Thanks for all your thoughts about this. I haven't decided yet but we'll see.

     

    Luuk,

     

     Thanks for the info. I'll start saving up. I'm guessing you'll be going through a PayPal-type service or some other way one can use a credit card? Keep up the good work!

     

     Yes, I'll use PayPal for a start. That seems the best solution for now because this is the first time I'll be selling software online.

  4. Just to show off again, here is a rendered animation with the smoothing I mentioned earlier. See this video

     

    This animation was tracked from the same video as the previous tests, but this time the jitter is gone :lol:

    Btw, did I mention I also added exaggerate options for each bone? :rolleyes:

  5. Hi Luuk,

     

    That is sooo impressive. I can't wait to see your future work.

     

    I only have one question. How sub-pixel accurate is your tracker (1/2 pixel , 1/4, etc.) ??

     

    As I'm sure you already know, your sub-pixel accuracy will help a lot with lower resolution video captures.

     

    Also, you might want to consider using small square stickers instead of dots. Or if you want to use dots, draw a black 90 degree corner in the middle of your dots or use smaller dots. This will also help the accuracy of your tracks.

     

    I wish you all the best with your project,

    Greg Rostami

    It isn't really sub-pixel accurate. There's no need for that because all values are converted to floating-point variables and then smoothed.

     You're right about using smaller stickers, but those are hard to find and I didn't feel like making them myself. That's no problem at all, though: in the configuration you can specify the size of the stickers used, and the algorithm will track to their center.

     

     

     I am interested in this motion tracking software package. I hope it can work OK with my webcam. But can we edit the facial animation after we apply it? If there is a keyframe on every frame, I assume you can't really edit it much. And what are the requirements for the markers? Where do you even get something like that?

     If you want to edit the motions manually you can do that in the choreography, or add a second action and blend them.

    The marker stickers I have used can be any normal sticker with a basic color. The brands I used are Herma and Avery. They should be available in any office supply store.
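
     As a rough illustration of tracking to a sticker's center: a thresholded, intensity-weighted centroid over a small search window returns floating-point coordinates, which fits the "converted to floating-point and then smoothed" description above. This is only a sketch of one plausible approach, not the program's actual algorithm; the function and parameter names are made up.

```python
import numpy as np

def track_marker(frame, approx_x, approx_y, marker_size):
    """Estimate a marker's center with finer-than-pixel precision.

    frame       -- 2D numpy array of grayscale intensities (marker brighter
                   than its surroundings)
    approx_x/y  -- rough integer marker position (e.g. last frame's result)
    marker_size -- sticker diameter in pixels, used to size the search window
    """
    r = marker_size  # search radius around the previous position
    y0, y1 = max(0, approx_y - r), min(frame.shape[0], approx_y + r + 1)
    x0, x1 = max(0, approx_x - r), min(frame.shape[1], approx_x + r + 1)
    window = frame[y0:y1, x0:x1].astype(float)

    # Keep only pixels that look like the marker (simple threshold halfway
    # between the window's darkest and brightest pixel).
    thresh = (window.min() + window.max()) / 2.0
    mask = np.where(window > thresh, window, 0.0)

    # Intensity-weighted centroid: a floating-point position, so the result
    # is naturally finer than one pixel without an explicit sub-pixel search.
    total = mask.sum()
    ys, xs = np.mgrid[y0:y1, x0:x1]
    cy = (ys * mask).sum() / total
    cx = (xs * mask).sum() / total
    return cx, cy
```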

  6. Luuk, first off, I enjoy your accent and your English is just fine. I would like to ask for a ballpark figure on what you think you may charge for the app? I like to put aside money for these purposes, like I do with my A:M subscriptions; it's a must-have app for me. Also, a possible release date would help in that matter too. I have already started a list of other possible uses for such an app.

    Mike

     

     I'm really not sure about the price yet, so don't hold me to it (it might also get lower), but I'm currently thinking about something like 129 euro, keeping in mind that the value of the US dollar is very low at the moment and I don't want it to get too expensive for American customers. Minus VAT and multiplied by the current exchange rate, that would be about 150 USD. I think that's a reasonable price considering what you get and the amount of work involved, especially if you compare it to the price of other motion tracking solutions.

    I hope to be able to release the first version at the end of November.

  7. Hey that looks promising. What will it take to track the eyelids and eye direction?

     

     I was waiting for someone to ask that :) I won't include it in the first release, but when I'm done getting everything to work for version 1 I'll start experimenting with this. I do think it would be possible to get reasonably good results, but that will greatly depend on the quality of the video file.

     

     

    I have a HDV cam (hi-def). Would that actually improve things or is it more of a matter of good lighting and contrast?

     

     Yes. That will absolutely improve the results, because the movement can be tracked more accurately, and because of the higher resolution the calculated angles will also be more accurate. And you'll need less or no smoothing to prevent jitter.
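
     A quick sketch of why resolution helps the calculated angles: if head yaw is recovered from the apparent (foreshortened) distance between two markers, the same one-pixel tracking slip costs much less angular accuracy when the marker pair spans more pixels. The formula and numbers here are illustrative assumptions, not the program's actual math.

```python
import math

def yaw_from_marker_distance(observed_px, rest_px):
    """Recover head yaw from the apparent distance between two face markers.

    The horizontal distance between, say, the two eye-corner markers shrinks
    with cos(yaw) as the head turns; inverting that gives the angle.
    """
    ratio = max(-1.0, min(1.0, observed_px / rest_px))
    return math.degrees(math.acos(ratio))

def yaw_error_from_one_pixel(rest_px, true_yaw_deg=20.0):
    """Angle error caused by a one-pixel tracking slip at a given scale."""
    true_obs = rest_px * math.cos(math.radians(true_yaw_deg))
    return abs(yaw_from_marker_distance(true_obs + 1.0, rest_px) - true_yaw_deg)

# The same one-pixel slip costs far less angular accuracy when the marker
# pair spans more pixels, i.e. at a higher capture resolution.
print(yaw_error_from_one_pixel(80.0))   # marker span on a ~VGA capture
print(yaw_error_from_one_pixel(240.0))  # marker span on an HD capture
```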

  8. Is there any hope for a Mac version?? This looks very cool!!

     If I make enough money with this I will definitely make a Mac version, but you'll have to be patient.

     

     

     On a side note... I always wondered about certain complaints regarding mo-capped facial movements, specifically "exaggeration". Every time I've seen this sort of thing done in big productions, the main complaint I and others have is how the "regular" motion of a "real" face is lacking in emotion and subdued, and I wondered if there wouldn't be a way to just "add in" exaggeration to the rig... like using a slider of some kind to increase the motion on extremes? Maybe you are already planning for this.

     

    As for the jittering... couldn't that be fixed in AM? Reducing channels by baking the action or something?

     

    -vern

     

     Exaggeration would be a very useful feature. For some characters the movement made by the face won't be enough, and A:M can only reduce the enforcement; it won't go over 100%. That's why I want to build it in. It can always be reduced later in A:M if it turns out to be too much.

     

     The jittering is just because the export isn't 100% finished yet. I'm using a smoothing algorithm that smooths the feature movements, but if the resolution of the video isn't high enough there will still be jitter in the movements of the rig. That's why I will add an after-smooth function that smooths the rig after the 3D motion is calculated. It's one of the first things I will add to the program. I hope it works out the way I think it will...

     

    Btw, I changed a few things and made another test video. Yes, this one still jitters...
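
     The after-smooth idea described above could, for example, be a centered moving average run over each rig channel after the 3D motion is calculated. This is just a minimal sketch of such a filter under that assumption; the actual program may use something different.

```python
def after_smooth(values, window=5):
    """Centered moving average over one motion channel.

    values -- per-frame samples of a single bone channel (e.g. Z rotation)
    window -- odd number of frames to average; larger = smoother but laggier
    """
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        # Clamp the averaging window at the clip's start and end so the
        # first and last frames still get a valid (shorter) average.
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed
```

     A constant channel passes through unchanged, while frame-to-frame jitter is pulled toward the local mean.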

  9. I just did another MoCap test: see this video

     Sorry for my crappy English...

     

     With this test I found some more things I have to adjust, like reducing the head rotation and linking the 'sneer' movement to the rotation.

    This was also a good test to see how the lip sync is working out. It looks like I have smoothed the mouth bones too much. That's easy to fix :)

  10. Does it create a keyframe in all channels on every frame? Can it be added to separate body animation done in AM?

     

     When you import a BVH file in A:M you do that in an action, so it creates keyframes on every frame in that action only. You can still manipulate the movements in the chor / combine it with other actions.

    You can apply the action to every model that has the same face rig.

  11. Just wanted to add that you have my interest also. I have a Digital8 camcorder; I'm guessing that will do? I'm very interested in the process that's involved too. This program runs as a separate app? Then you load the BVH, set up a rig and mesh?

     

    Thanks, so cool..

     

    Any camera with decent quality will do. Even a web cam can be used if it has a good resolution and FPS.

    There is no minimum. For satisfying results you should at least have VGA / NTSC / PAL resolution with 25-30 FPS.

    If you're using a High Definition camera the result will be smoother, if your camera has a higher FPS you can track faster movements.

     

     And yes, it's a separate app. Because it exports BVH files it can be used with any 3D animation program that supports BVH. I will of course promote A:M on my website to get users of other apps to do it with A:M ;)

  12. This looks very cool. How complex is the facial rig needed? Will the facial rig be supplied with it?

     

     The facial rig is pretty simple. I will supply one, but it is also possible to constrain the BVH to a Squetch rig.

  13. Thanks for all the compliments :)

     

    I don't know what your plans are.... But I would Buy that in a friggin heartbeat.

    Oh really? What would you pay for it? ^_^

     

    That depends on how good/easy it is to set up.

    Only you know how difficult/time consuming this project has been for you.

    Seems like most 3rd party things in the A:M field sell between $30 and $100

     

    I certainly have no problems in that area, but again, only you can figure how much it costs...

     

    Mike Fitz

    www.3dartz.com

     

     I'm not sure yet what the price will be, but I'll make sure it won't be too much. Actually, the most valuable thing about my program is that it's really easy to set up. Just manually specify the marker positions on the first frame and click the track button. The program exports a BVH file that can be imported into A:M. Of course you'll need to rig the face, and smart skinning is also possible. Once the face is rigged, if you need a different animation it's just a few mouse clicks and it's done.

     When I release it I'll make a trial version, so you can try before you buy.

     

     

     Wow. REALLY cool! I'd love to learn more about the 'how it works'... you're really on to something there. Say, would it be able to over-exaggerate?

     Yes, it will be "over-exaggeratable". The current test version isn't, but someone else already asked me for that, and because I think it's really a good idea I'm getting on it this week.
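
     The export step described above ("the program exports a BVH file that can be imported into A:M") can be pictured with a minimal BVH writer: a HIERARCHY section listing the face bones and their channels, then a MOTION section with one line of values per tracked frame. Only the BVH file structure itself is standard; the flat rig layout and the function below are hypothetical.

```python
def write_bvh(path, bone_names, frames, frame_time=1.0 / 25.0):
    """Write a minimal BVH file: a flat set of face bones under one root,
    then one line of channel values per tracked frame.

    bone_names -- e.g. ["jaw", "brow_left"] (hypothetical face rig)
    frames     -- list of frames; each frame is a flat list of 6 floats
                  per bone (X/Y/Z position, Z/X/Y rotation), root first
    """
    channels = ("CHANNELS 6 Xposition Yposition Zposition "
                "Zrotation Xrotation Yrotation")
    lines = ["HIERARCHY", "ROOT head", "{",
             "\tOFFSET 0.0 0.0 0.0", "\t" + channels]
    for name in bone_names:
        # Each face bone is a direct child of the root, ended by an End Site.
        lines += [f"\tJOINT {name}", "\t{",
                  "\t\tOFFSET 0.0 0.0 0.0", "\t\t" + channels,
                  "\t\tEnd Site", "\t\t{",
                  "\t\t\tOFFSET 0.0 1.0 0.0", "\t\t}", "\t}"]
    lines += ["}", "MOTION", f"Frames: {len(frames)}",
              f"Frame Time: {frame_time:.6f}"]
    for frame in frames:
        lines.append(" ".join(f"{v:.4f}" for v in frame))
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

     Because every frame carries a full set of channel values, importing such a file into an action produces a keyframe on every frame, as described earlier in the thread.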

  14. Hello all,

     

    As some of you know I have been working on a facial motion tracking program. At this moment most of it works and I'm doing some tests.

    So, here is my first satisfying experiment: test 01

    (Don't be scared, I'm just making faces for test purposes only :rolleyes: )

     

    It still needs some more tweaking, but I'm very happy with the results for now. Btw, the MoCap is done with just one regular DV camcorder.

     Some others from the A:M community will also do some tests with it soon; they can post their results here too if they like.

    I'll keep you guys posted about the progress. I think I'm getting close to a version that's ready for release.
