Hash, Inc. - Animation:Master

Facial MoCap experiments


Luuk Steitner


Hello all,

 

As some of you know I have been working on a facial motion tracking program. At this moment most of it works and I'm doing some tests.

So, here is my first satisfying experiment: test 01

(Don't be scared, I'm just making faces for test purposes only :rolleyes: )

 

It still needs some more tweaking, but I'm very happy with the results for now. Btw, the MoCap is done with just one regular DV camcorder.

Some others from the A:M community will also do some tests with it soon; they can post their results here too if they like.

I'll keep you guys posted about the progress. I think I'm getting close to a version that's ready for release.


Thanks for all the compliments :)

 

I don't know what your plans are.... But I would Buy that in a friggin heartbeat.

Oh really? What would you pay for it? ^_^

 

That depends on how good/easy it is to set up.

Only you know how difficult/time consuming this project has been for you.

Seems like most 3rd party things in the A:M field sell between $30 and $100.

 

I certainly have no problems in that area, but again, only you can figure out how much it should cost...

 

Mike Fitz

www.3dartz.com


 

I'm not sure yet what the price will be, but I'll make sure it won't be too much. Actually, the most valuable thing about my program is that it's really easy to set up. Just manually specify the marker positions on the first frame and click the track button. The program exports a BVH file that can be imported into A:M. Of course you'll need to rig the face, and smart skinning is also possible. Once the face is rigged and you need a different animation, it's just a few mouse clicks and it's done.

When I release it I'll make a trial version so you can try before you buy.
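For anyone curious how a marker tracker like this works in principle, here is a rough, hypothetical sketch in Python using OpenCV template matching. This is not Zign Track's actual code or algorithm (Luuk hasn't published it), and the function name and parameters are made up for illustration; it only shows the basic idea of clicking the markers once on the first frame and letting the software follow them from frame to frame.

```python
import cv2

def track_markers(video_path, first_frame_markers, patch=15, search=40):
    """first_frame_markers: list of (x, y) pixel positions clicked on frame 1."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Cut a small template image around each user-clicked marker.
    templates = [gray[y - patch:y + patch, x - patch:x + patch]
                 for (x, y) in first_frame_markers]
    positions = [list(first_frame_markers)]

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        current = []
        for (x, y), tmpl in zip(positions[-1], templates):
            # Only search a small window around the previous position.
            x0, y0 = max(x - search, 0), max(y - search, 0)
            window = gray[y0:y + search, x0:x + search]
            result = cv2.matchTemplate(window, tmpl, cv2.TM_CCOEFF_NORMED)
            _, _, _, (bx, by) = cv2.minMaxLoc(result)
            current.append((x0 + bx + patch, y0 + by + patch))
        positions.append(current)
    cap.release()
    return positions  # per-frame 2D marker positions, ready for the BVH step
```

The real product presumably does considerably more (sub-pixel accuracy, solving the 3D motion, smoothing), but the "mark once, then track" workflow is the same.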

 

 

Wow. REALLY cool! I'd love to learn more about the 'how it works'... you're really on to something there. Say, would it be able to overexaggerate?

Yes, it will be "overexaggeratable". The current test version isn't, but someone else already asked me for that, and because I think it's a really good idea I'm getting on it this week.


Just wanted to add, you have my interest also. I have a Digital 8 camcorder; I'm guessing that will do? I'm very interested in the process that's involved too. Does this program run as a separate app? Then you load the BVH, set up a rig and mesh?

 

Thanks, so cool..


 

Any camera with decent quality will do. Even a web cam can be used if it has a good resolution and FPS.

There is no hard minimum, but for satisfying results you should have at least VGA / NTSC / PAL resolution at 25-30 FPS.

If you're using a high definition camera the result will be smoother, and if your camera has a higher FPS you can track faster movements.

 

And yes, it's a separate app. Because it exports BVH files it can be used with any 3D animation program that supports BVH. I will of course promote A:M on my website to get users of other apps to do it with A:M ;)
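To make the "it exports BVH" part concrete, here is a hypothetical sketch of the kind of file such a tracker could write. The joint names, offsets and channel layout below are invented for illustration; the only real requirement is that the hierarchy matches the bones in your face rig so A:M can map the motion onto it.

```python
def write_face_bvh(path, frames, fps=25):
    """frames: list of (head_rx, head_ry, head_rz, jaw_rx) tuples in degrees."""
    header = """HIERARCHY
ROOT Head
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Jaw
  {
    OFFSET 0.0 -3.0 1.0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0.0 -2.0 1.0
    }
  }
}
MOTION
Frames: %d
Frame Time: %f
""" % (len(frames), 1.0 / fps)

    with open(path, "w") as f:
        f.write(header)
        for head_rx, head_ry, head_rz, jaw_rx in frames:
            # One line per frame: 6 head channels followed by 3 jaw channels.
            f.write("0.0 0.0 0.0 %.4f %.4f %.4f 0.0 %.4f 0.0\n"
                    % (head_rz, head_rx, head_ry, jaw_rx))
```

A real face export would of course carry many more joints (brows, lips, cheeks), but the file structure stays this simple, which is why any app with BVH import can read it.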


Does it create a keyframe in all channels on every frame? Can it be added to separate body animation done in AM?

 

When you import a BVH file in A:M you do that in an action, so it creates keyframes on every frame in that action only. You can still manipulate the movements in the chor and combine it with other actions.

You can apply the action to every model that has the same face rig.


I just did another MoCap test: see this video

Sorry for my crappy English...

 

With this test I found some more things I have to adjust, like reducing the head rotation and linking the 'sneer' movement to the rotation.

This was also a good test to see how the lip sync is working out. It looks like I have smoothed the mouth bones too much. That's easy to fix :)


That is really quite amazing! I noticed the head seems to have a bit of the jitters in this demo, though.

 

If you need an alpha or beta tester, feel free to contact me. I have been a software developer for over 20 years now with a strong ability to document problems and how to duplicate them.

 

Thanks...

Al


This is pretty cool.

 

I want to get it just so I can forget to take the dots off my face when I go to the local WAWA for coffee. I know I would forget. I could see this turning into some new kind of fashion statement: people wearing white dots in public, in different colors.

 

Seriously though, if this turns out as kick arse as the potential demonstrated so far, I can see low budget productions snapping up A:M just to have access to it. I know that I bought A:M originally for the features available at such a low price. Keep the price low and people will be all over it like flies on... uh... a glazed donut in the parking lot of the WAWA.

 

On a side note... I always wondered about certain complaints regarding mo-capped facial movements, specifically "exaggeration". Every time I've seen this sort of thing done in big productions, the main complaint I and others have is that the "regular" motion of a "real" face is lacking in emotion and subdued, and I wondered if there wouldn't be a way to just "add" in exaggeration to the rig... like using a slider of some kind to increase the motion on extremes? Maybe you are already planning for this.

 

As for the jittering... couldn't that be fixed in A:M? Reducing channels by baking the action or something?

 

-vern


Every time I've seen this sort of thing done in big productions, the main complaint I and others have is that the "regular" motion of a "real" face is lacking in emotion and subdued

 

I personally am not fond of facial mocap specifically for that reason. But for major body parts (weight shifts, follow throughs) I wouldn't mind using it.


Luuk, have you considered using a kid in their mid teens? The cost benefit would be great - after all, you can't guarantee to get the dots in the same place every time, so you might as well use acne... It's free. :P


Is there any hope for a Mac version?? This looks very cool!!

If I make enough money with this I will definitely make a Mac version, but you'll have to be patient.

 

 


Exaggeration would be a very useful feature. For some characters the movement made by the face won't be enough, and A:M can only reduce the enforcement; it won't go over 100%, so that's why I want to make it. It can always be reduced later in A:M if it appears to be too much.
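As an illustration of what "overexaggeratable" could mean in practice, here is a hypothetical sketch that scales every channel away from a rest pose by a user factor before export. This is not Luuk's actual implementation, and taking the rest pose from the first frame is purely an assumption for the example.

```python
def exaggerate(channels, factor=1.5):
    """channels: per-frame lists of values; factor 1.0 = unchanged, 1.5 = 150%."""
    rest = channels[0]  # assume frame 1 is the neutral face
    # Push every value further away from its rest value by the same factor.
    return [[r + (v - r) * factor for v, r in zip(frame, rest)]
            for frame in channels]
```

Scaling before export means the motion can go beyond the 100% that the enforcement slider in A:M allows, and you can still dial it back down in the chor afterwards.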

 

The jittering is just because the export isn't 100% finished yet. I'm using a smoothing algorithm that smooths the feature movements, but if the resolution of the video isn't high enough there will still be jittering in the movements of the rig. That's why I will make an 'after smooth' function that smooths the rig after the 3D motion is calculated. That's one of the first things I will add to the program. I hope it works out like I think it will...
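Luuk hasn't described the smoothing algorithm itself, but the general idea of an "after smooth" pass can be sketched as something as simple as a centered moving average over each channel of the calculated motion; the window size and the implementation below are assumptions, not his code.

```python
def smooth_channels(channels, radius=2):
    """channels: per-frame lists of values; radius = half the window size."""
    n = len(channels)
    smoothed = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = channels[lo:hi]
        # Average each channel over the frames surrounding frame i.
        smoothed.append([sum(col) / len(window) for col in zip(*window)])
    return smoothed
```

A bigger radius means less jitter but also softer extremes, which is presumably why smoothing the mouth bones too much hurt the lip sync in the earlier test.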

 

Btw, I changed a few things and made another test video. Yes, this one still jitters...


Setting up BVH for body motion was a pain for me. I forget who it was, but someone did a really excellent tutorial on that a while back.

 

I would think that with this system, setting up the "BVH rig" should be a bit easier. We won't have to deal with IK and differences in limb length, etc.

 

-vern


Luuk, first off I enjoy your accent, and your English is just fine. I would like to ask for a ballpark figure on what you think you may charge for the app? I like to put aside money for these purposes, like I do with my A:M subscriptions; it's a must-have app for me. Also, a possible release date would help in that matter too. I have already started a list of other possible uses for such an app.

Mike


As BVH files produce keyframes on every frame, and because any slight deviations in the marker detection would result in jittery motion, I think it will always be better to add eye motion and blinks after importing the motion data into A:M. Having said that, I would be interested in seeing what results Zign Track produces.

 

I am fortunate enough to have been chosen as one of the testers for Luuk's motion tracker and, although I have had a few initial problems with beta 1, I am confident that I can achieve good results with better video and better facial markers. I look forward to giving beta 2 a thorough workout later today and will post my results when I have something to show.

 

Zign Track looks like it will be an excellent tool for speeding up facial animation and great for other tracking tasks too. Luuk seems to have a good grip on the subject and the application already has a professional and intuitive feel to it.


Hey that looks promising. What will it take to track the eyelids and eye direction?

I can't speak for Zign Track, but my face capture experiments included both eye and blink tracking. Eye tracking can be done by tracking the iris as a marker. The results are mediocre at best because the marker is very large in relation to the movement, but you do get some of the natural eye jitter. I tracked blinks by putting a small piece of colored tape on my eyelash and measuring its relationship to a marker under my eye. Again, the track is kind of rough, but better than nothing.
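A hypothetical sketch of that blink measurement, assuming the open-eye distance is taken from the first frame and used as the reference:

```python
def blink_amount(eyelash_xy, under_eye_xy, open_distance):
    """0.0 = eye fully open, approaching 1.0 as the eyelash marker closes the gap."""
    dx = eyelash_xy[0] - under_eye_xy[0]
    dy = eyelash_xy[1] - under_eye_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Normalize against the open-eye distance and clamp to the 0-1 range.
    return max(0.0, min(1.0, 1.0 - distance / open_distance))
```

The resulting value could then drive an eyelid bone or blink pose directly.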


Hey that looks promising. What will it take to track the eyelids and eye direction?

 

I was waiting for someone to ask that :) I won't include it in the first release, but when I'm done getting everything to work for version 1 I'll start experimenting with this. I do think it would be possible to get seemingly good results, but that will greatly depend on the quality of the video file.

 

 

I have a HDV cam (hi-def). Would that actually improve things or is it more of a matter of good lighting and contrast?

 

Yes, that will absolutely improve the results, because the movement can be tracked more accurately, and because of the higher resolution the calculated angles will also be more accurate. You'll also need less or no smoothing to prevent jitter.



I'm really not sure about the price yet, so don't hold me to it (it might also get lower), but I'm currently thinking about something like 129 euro, keeping in mind that the value of the US dollar is very low at the moment and I don't want it to get too expensive for American customers. Minus VAT and multiplied by the current rate, that would be about 150 USD. I think that's a reasonable price considering what you get and the amount of work involved, especially if you compare it to the price of other motion tracking solutions.

I hope to be able to release the first version at the end of November.

