
Bug speaks



So tell us more about how this is done! It has a more keyframed look than motion capture.

It is fundamentally motion capture, and I'm driving his face with mine, but it works more like a puppet because the mocap data is driving a pose-based animation setup rather than directly controlling muscle animation.
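Roughly, the idea is that each tracked dot drives a pose slider instead of the muscles directly. Here's a toy sketch in Python (made-up names and numbers, not my actual rig):

```python
# Hypothetical sketch of a mocap-to-pose-slider mapping: each tracked
# marker drives a pose channel, and the rig decides what that pose does.

def marker_to_pose_percent(y, y_rest, y_max):
    """Map a marker's vertical offset to a 0-100 pose slider value.

    y      -- current tracked marker position (e.g., a jaw dot's height)
    y_rest -- marker position with the face at rest (pose = 0%)
    y_max  -- marker position at the pose extreme (pose = 100%)
    """
    t = (y - y_rest) / (y_max - y_rest)
    return max(0.0, min(1.0, t)) * 100.0

# Example: a jaw marker halfway between rest and fully open
# drives a "MouthOpen" pose to 50%.
print(marker_to_pose_percent(y=120.0, y_rest=100.0, y_max=140.0))  # -> 50.0
```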

 

Here is a split screen with my face: splitscreen.mov


Sounds cool!

I am not able to view the splitscreen.mov file though. I get an error message: "error 2048: the file is not a movie file".

I'll try the first one that you posted. Maybe I need another codec or something.

---------------------------------------

 

Nope. I get the same error for both movies.

What compression codec did you use?


Lots of potential (TWO 2, anyone?). I notice the eyelids aren't working. I don't know how the eyes are tracked... maybe the pupils are markers. But put some teeth in there and it would probably look better.


...fundamentally motion capture, and I'm driving his face with mine, but it works more like a puppet because the mocap data is driving a pose-based animation setup rather than directly controlling muscle animation.

Here is a split screen with my face: splitscreen.mov

 

Wow. Very nice. A bit "jittery"; I assume that is due to inconsistencies in tracking those oh-so-stylish dots on your face. :) I watched the split screen several times in succession, very promising, very weird to see the bug mimic your face.

Can't wait for more details on "how it was done".


Nice work. Before we get too excited, is this likely to end up being a product (software only I imagine) that us amateurs can afford or are you aiming at the commercial/pro market?

 

I must say that it certainly has the potential to speed up facial animation and lip synch work for the average animator. How dense are the keyframes?

 

Cheers


Lots of potential (TWO 2, anyone?). I notice the eyelids aren't working. I don't know how the eyes are tracked... maybe the pupils are markers. But put some teeth in there and it would probably look better.

I think I've figured out how to do blink tracking and eyelid tracking in the next rev.
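For the curious, one way it could work (a rough sketch, not necessarily what the next rev will actually do): compare the gap between upper- and lower-lid markers to the eye width each frame, and treat a small ratio as a closed eye.

```python
# One possible blink-tracking approach: derive eyelid closure from two
# lid markers. The open_ratio calibration value here is hypothetical.

def blink_amount(upper_lid_y, lower_lid_y, eye_width, open_ratio=0.30):
    """Return 0.0 (eye open) .. 1.0 (eye closed) from two lid markers.

    open_ratio is the lid-gap/eye-width ratio of a fully open eye,
    measured once from a calibration frame.
    """
    gap = abs(upper_lid_y - lower_lid_y)
    openness = min(gap / (eye_width * open_ratio), 1.0)
    return 1.0 - openness

# A 2-pixel lid gap on a 40-pixel-wide eye reads as mostly closed.
print(blink_amount(upper_lid_y=52.0, lower_lid_y=50.0, eye_width=40.0))
```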

 

As far as teeth go, the Bug and I are having ongoing negotiations.

 

Very nice. Are you planning on expanding it to the whole body?

 

Paul - It looks like he's using Sorenson V3 for video compression (uncompressed PCM audio - no sound recorded as far as I can hear though).

 

Glenn

I might try some body capture, especially upper body, but that is a lot more complicated and really requires more cameras than I have at the moment.
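For anyone wondering why the camera count matters: one camera only gives you a 2D ray per marker, so recovering actual 3D positions takes at least two calibrated views. A bare-bones sketch of the standard linear (DLT-style) triangulation, where P1 and P2 are assumed to be 3x4 projection matrices from a prior calibration step:

```python
import numpy as np

def triangulate(P1, P2, xy1, xy2):
    """Recover a 3D point from its pixel positions in two calibrated views."""
    x1, y1 = xy1
    x2, y2 = xy2
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

# Two toy cameras one unit apart along X, both looking down +Z:
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
print(triangulate(P1, P2, (0.0, 0.0), (-0.25, 0.0)))  # ~[0, 0, 4]
```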

 

And I did use Sorenson 3.

 

More questions: How much do those dots on your face cost? I assume it's a digital camera? What MP?

The dots were made with MAC liquid eyeliner (which makes them surprisingly expensive); coulda used a Sharpie, but I had a lunch date.

The cameras involved are a Canon GL2, and a little Sony handycam, both of which are DV video.

 

...fundamentally motion capture, and I'm driving his face with mine, but it works more like a puppet because the mocap data is driving a pose-based animation setup rather than directly controlling muscle animation.

Here is a split screen with my face: splitscreen.mov

 

Wow. Very nice. A bit "jittery"; I assume that is due to inconsistencies in tracking those oh-so-stylish dots on your face. :) I watched the split screen several times in succession, very promising, very weird to see the bug mimic your face.

Can't wait for more details on "how it was done".

Yeah, the jitter is video interlacing, basically. If I had me a couple o' them HD cams, things 'uld be different, oh yes they would.
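Short of better cameras, the crude fix is filtering the tracked curves before they drive the rig; interlaced DV alternates fields 1/60th of a second apart, which shows up as frame-to-frame wobble. Even a small moving average knocks it down (a toy sketch, not my exact filter):

```python
# Smooth a per-frame channel of tracked marker values with a
# moving-average window to suppress field-rate jitter.

def smooth(values, radius=1):
    """Moving average over a list of per-frame samples."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - radius)
        hi = min(len(values), i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

print(smooth([10.0, 12.0, 10.0, 12.0, 10.0]))  # wobble flattened toward 11.0
```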

 

Nice work. Before we get too excited, is this likely to end up being a product (software only I imagine) that us amateurs can afford or are you aiming at the commercial/pro market?

 

I must say that it certainly has the potential to speed up facial animation and lip synch work for the average animator. How dense are the keyframes?

 

Cheers

I am at the moment unsure of where this will go outside of my productions. I don't see it becoming an easy-to-use tool any time soon. The technologies are all off the shelf, but the process takes a lot of know-how in a bunch of areas. I used to work for a company that tried to build "easy to use" facial tracking software. It was a disaster, because you just can't make things simple enough for the casual user. However, a dedicated amateur could do this basically the same way I did for a total cost of no more than $1,500, not including the computer.

 

Keyframe density is thirty per second, basically one per video frame (A:M didn't like the data at 29.97). But the beauty of using A:M is that you can just drop extra layers on top of the data to adjust or correct; you don't have to worry about the keyframe density.
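Conceptually the layering works like this (illustrative Python, not A:M's actual API): the dense capture channel stays untouched, and a sparse hand-keyed correction curve gets interpolated and added at evaluation time.

```python
# Dense mocap channel plus a sparse correction layer, summed per frame.

def eval_sparse(keys, frame):
    """Linearly interpolate a sparse {frame: value} correction layer."""
    frames = sorted(keys)
    if frame <= frames[0]:
        return keys[frames[0]]
    if frame >= frames[-1]:
        return keys[frames[-1]]
    for a, b in zip(frames, frames[1:]):
        if a <= frame <= b:
            t = (frame - a) / (b - a)
            return keys[a] + t * (keys[b] - keys[a])

def eval_channel(dense, corrections, frame):
    """Dense capture value plus the interpolated correction on top."""
    return dense[frame] + eval_sparse(corrections, frame)

dense = [5.0] * 60                   # one key per frame from capture
fix = {0: 0.0, 30: 2.0, 59: 0.0}     # three hand-set correction keys
print(eval_channel(dense, fix, 15))  # 6.0: halfway toward the +2 fix
```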

 

I hope it speeds up facial animation for me, right now. Then we'll see about this mythical average animator.


Hooray! Downloaded today without a hitch. For some reason the files were only halfway downloaded before, though they looked like complete QT .mov files. I think it was my security software. Anyway, downloads are working again today.

 

This is such an interesting and exciting experiment!

 

I'd love to know what the tracking software is.

 

Looking forward to seeing your first short with kitchen table face tracking. :)


VERY interesting!

 

This spurs the imagination with possibilities. Animated spokespersons... what about this: feed 'live' facial data and audio through HAMR to control a character that can be seen by millions instantly...

 

Anything you could divulge on your process? Do you HAVE to look like a Ninja to use it?


VERY interesting!

 

This spurs the imagination with possibilities. Animated spokespersons... what about this: feed 'live' facial data and audio through HAMR to control a character that can be seen by millions instantly...

 

Anything you could divulge on your process? Do you HAVE to look like a Ninja to use it?

Yeah, so as mentioned in a previous post, I used to work for a company that tried to do a live face-capture product for the masses. The company was called Eyematic, and if you were at Siggraph 2001 you might have seen us performing our celebrity theater. The practicality of making such a realtime system work well consistently was a bear and a half. The technology certainly exists now to make a practical realtime system, and if you've got 20 or 30k to drop, you can probably get one; hey, they made Polar Distress.

But I don't see a personal studio option coming too soon. All the people who were working on such things are now working on how to identify your face at the Super Bowl and other "government" work. And the market is still too small for a low-cost option to make money, I suspect. But I'm with you all the way, Matt. I've always wanted to just plug in my avatar and go.

 

As far as process goes, I used SynthEyes for the initial motion capture and A:M for puppet building; everything else is highly classified voodoo. And no, you don't HAVE to look like a ninja, but it makes it easier to track the top of your head.
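For the SynthEyes end of it: it can export 2D tracker paths as plain text, but the exact columns depend on which export script you pick. Here's a parser for a simple hypothetical "name frame x y" layout; check your own export's columns before leaning on it.

```python
# Load exported 2D tracker paths into a dict keyed by tracker name.
# The "name frame x y" column layout is an assumption, not a fixed
# SynthEyes format.

def load_trackers(path):
    """Parse 'name frame x y' lines into {name: {frame: (x, y)}}."""
    trackers = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4:
                continue  # skip headers and blank lines
            name, frame, x, y = parts
            trackers.setdefault(name, {})[int(frame)] = (float(x), float(y))
    return trackers
```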


The practicality of making such a realtime system work well consistently was a bear and a half.

 

You can't go cutting up bears. They're on the endangered species list!

 

I recently saw another company working on capturing facial animation and it was the best I've seen it. Unfortunately I've lost the link. Maybe I got it from the forum.


Nice test, thanks for sharing! You're saying the jittering is caused by interlacing? Do you need some 24p or 30p footage to test it out with? I've always wanted to get SynthEyes... this would be yet one more reason to add to the pile...

 

Grr... if only money grew on trees...

 

-Ethan

