Bendytoons Posted February 23, 2007
Here is the first camera test of my new puppeting system. Sorry for the lame delivery, but this was just a proof of concept. I know there are a heap of problems with the performance, but please let me know what you think.
talk6_.mov
KenH Posted February 23, 2007
By puppeting system, I'm assuming it's some sort of external control of the character. If that's so, then there's certainly potential there.
robcat2075 (Hash Fellow) Posted February 23, 2007
So tell us more about how this is done! It has a more keyframed look than motion capture.
Bendytoons Posted February 23, 2007 (Author)
> So tell us more about how this is done! It has a more keyframed look than motion capture.
It is fundamentally motion capture, and I'm driving his face with mine, but it works more like a puppet because the mocap data is driving a pose-based animation setup rather than directly controlling muscle animation. Here is a split screen with my face.
splitscreen.mov
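To make that distinction concrete: rather than wiring tracked markers straight to muscle or CP motion, each marker's displacement can set the percentage of a pre-built pose slider. Below is a minimal sketch of such a mapping in Python; the function, the calibration scheme, and the pixel coordinates are illustrative assumptions, not Bendytoons' actual code.

```python
# Hypothetical sketch of "mocap drives poses" -- not the author's actual code.
# Idea: each tracked marker's displacement along a calibrated rest->extreme
# axis sets the percentage of a pose slider, instead of moving CPs directly.

def pose_percentage(live, rest, extreme):
    """Project the live marker position onto the rest->extreme axis
    and return it as a 0-100 pose-slider percentage."""
    axis = (extreme[0] - rest[0], extreme[1] - rest[1])
    disp = (live[0] - rest[0], live[1] - rest[1])
    denom = axis[0] ** 2 + axis[1] ** 2
    if denom == 0:
        return 0.0
    t = (disp[0] * axis[0] + disp[1] * axis[1]) / denom
    return max(0.0, min(100.0, t * 100.0))  # clamp to the slider's range

# Example: a mouth-corner marker, in pixels of the tracked video.
rest = (312.0, 240.0)      # neutral face
extreme = (330.0, 228.0)   # marker position with the "smile" pose fully on
live = (321.0, 234.0)      # current frame
print(pose_percentage(live, rest, extreme))  # 50.0 -> smile at half strength
```

One nice property of this scheme is that the pose itself stays hand-built, so the tracked data can be noisy or sparse and the character still only ever hits shapes the animator authored.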
Paul Forwood Posted February 23, 2007
Sounds cool! I am not able to view the splitscreen.mov file though. I get an error message: "error 2048: the file is not a movie file". I'll try the first one that you posted. Maybe I need another codec or something.
Edit: Nope. I get the same error for both movies. What compression codec did you use?
nimblepix Posted February 23, 2007
Wow, looking forward to hearing more about this system.
KenH Posted February 23, 2007
Lots of potential (TWO 2, anyone?). I notice the eyelids aren't working. I don't know how the eyes are tracked... maybe the pupils are markers. But put some teeth in there and it would probably look better.
Ganthofer Posted February 23, 2007
Very nice. Are you planning on expanding it to the whole body?
Paul - It looks like he's using Sorenson V3 for video compression (uncompressed PCM audio - no sound recorded as far as I can hear, though).
Glenn
Paul Forwood Posted February 24, 2007
Thanks, Glenn. I've tried viewing on another PC but I get the same error message, so it looks like the file is getting damaged in the download.
KenH Posted February 24, 2007
More questions: How much do those dots on your face cost? I assume it's a digital camera? What MP?
frosteternal Posted February 24, 2007
> ...fundamentally motion capture, and I'm driving his face with mine, but it works more like a puppet because the mocap data is driving a pose-based animation setup rather than directly controlling muscle animation.
Wow. Very nice. A bit "jittery"; I assume that is due to inconsistencies in tracking those oh-so-stylish dots on your face. I watched the split screen several times in succession. Very promising, and very weird to see the bug mimic your face. Can't wait for more details on "how it was done".
higginsdj Posted February 24, 2007
Nice work. Before we get too excited, is this likely to end up being a product (software only, I imagine) that us amateurs can afford, or are you aiming at the commercial/pro market? I must say that it certainly has the potential to speed up facial animation and lip synch work for the average animator. How dense are the keyframes?
Cheers
Rodney (Admin) Posted February 24, 2007
Impressive!
strohbehn Posted February 24, 2007
That's cool! I'm very interested to see how you did this. Thanks for posting your tests!
Bendytoons Posted February 24, 2007 (Author)
> Lots of potential (TWO 2, anyone?). I notice the eyelids aren't working. I don't know how the eyes are tracked... maybe the pupils are markers. But put some teeth in there and it would probably look better.
I think I've figured out how to do blink tracking and eyelid tracking in the next rev. As far as teeth go, the Bug and I are having ongoing negotiations.
> Very nice. Are you planning on expanding it to the whole body?
I might try some body capture, especially upper body, but that is a lot more complicated and really requires more cameras than I have at the moment. And I did use Sorenson3.
> More questions: How much do those dots on your face cost? I assume it's a digital camera? What MP?
Dots were made with a MAC liquid eyeliner (which makes them surprisingly expensive); coulda used a Sharpie, but I had a lunch date. The cameras involved are a Canon GL2 and a little Sony Handycam, both of which are DV video.
> Wow. Very nice. A bit "jittery"; I assume that is due to inconsistencies in tracking those oh-so-stylish dots on your face.
Yeah, the jitter is video interlacing, basically. If I had me a couple o' them HD cams, things 'uld be different, oh yes they would.
> Nice work. Before we get too excited, is this likely to end up being a product (software only, I imagine) that us amateurs can afford, or are you aiming at the commercial/pro market? I must say that it certainly has the potential to speed up facial animation and lip synch work for the average animator. How dense are the keyframes?
I am at the moment unsure of where this will go outside of my productions. I don't see it being an easy-to-use tool any time soon. The technologies are all off the shelf, but the process takes a lot of know-how in a bunch of areas. I used to work for a company that tried to build "easy to use" facial tracking software. It was a disaster because you just can't make things simple enough for the casual user. However, a dedicated amateur could do this basically the same way I did, with a total cost of not more than $1500, not including the computer. Keyframe density is thirty per second, basically one per video frame (A:M didn't like the data at 29.97). But the beauty of using A:M is that you just drop extra layers on top of the data to adjust or correct; you don't have to worry about the keyframe density. I hope it speeds up facial animation for me, right now. Then we'll see about this mythical average animator.
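On the interlacing point: a common way to tame field-rate jitter in tracked 2D data, before it drives the rig, is a light temporal smoothing pass over each marker channel. Here is a hedged sketch of that idea; the filter choice and the sample values are assumptions about the general technique, not necessarily what Bendytoons did.

```python
# Generic fix for interlace "buzz" in per-frame tracker data: a small
# centered moving average over each channel. Wider radii smooth more
# but start to soften legitimate fast motion.

def smooth_channel(values, radius=1):
    """Centered moving average over a list of per-frame values.
    radius=1 averages each frame with its neighbors (a 3-tap filter)."""
    out = []
    n = len(values)
    for i in range(n):
        lo = max(0, i - radius)
        hi = min(n, i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Example on a jittery vertical-position track (pixels): the alternating
# field offset flattens toward ~241 while the overall motion is preserved.
y = [240.0, 242.5, 239.8, 242.3, 240.1, 242.4]
print(smooth_channel(y))
```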
cfree68f Posted February 24, 2007
Very, very cool!
Paul Forwood Posted February 24, 2007
Hooray! Downloaded today without a hitch. For some reason the files were only halfway downloaded before, though they looked like complete QuickTime .mov files. I think it was my security software. Anyway, downloads are working again today. This is such an interesting and exciting experiment! I'd love to know what the tracking software is. Looking forward to seeing your first short with kitchen-table face tracking.
John Bigboote Posted February 24, 2007
VERY interesting! This spurs the imagination with possibilities. Animated spokespersons... what about this: feed 'live' facial data and audio through HAMR to control a character that can be seen by millions instantly... Anything you could divulge about your process? Do you HAVE to look like a ninja to use it?
goodguy20k Posted February 24, 2007
Ok... That's just... Awesome! I didn't see the video until today, but I'd seen all the comments about the lovely dots. Amazingly, those are a lot better than I thought they'd look! Well done! Keep us posted!
Bendytoons Posted February 24, 2007 (Author)
> VERY interesting! This spurs the imagination with possibilities. Animated spokespersons... what about this: feed 'live' facial data and audio through HAMR to control a character that can be seen by millions instantly... Anything you could divulge about your process? Do you HAVE to look like a ninja to use it?
Yeah, so as mentioned in a previous post, I used to work for a company that tried to do a live face-capture product for the masses. The company was called Eyematic, and if you were at Siggraph 2001 you might have seen us performing our celebrity theater. The practicality of making such a realtime system work well consistently was a bear and a half. The technology certainly exists now to make a practical realtime system, and if you've got 20 or 30k to drop you can probably get one - hey, they made Polar Distress. But I don't see a personal studio option coming too soon. All the people who were working on such things are now working on how to identify your face at the Super Bowl and other "government" work. And the market is still too small for a low-cost option to make money, I suspect. But I'm with you all the way, Matt. I've always wanted to just plug in my avatar and go. As far as process goes, I used Syntheyes for the initial motion capture and A:M for puppet building - everything else is highly classified voodoo. And no, you don't HAVE to look like a ninja, but it makes it easier to track the top of your head.
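For anyone curious about the glue between the tracker and the puppet: 2D trackers like Syntheyes can export per-frame marker positions as text, which a small script can reorganize into per-marker channels before they are mapped to poses. The sketch below assumes a simple whitespace-separated layout of frame, marker name, x, y; that column layout is hypothetical, not Syntheyes' actual export format.

```python
# Hypothetical glue step: read a plain-text 2D tracker export into
# {marker_name: [(frame, x, y), ...]} channels that a pose-mapping
# script can consume. Column layout is an assumption, not a real format.

from collections import defaultdict

def load_tracks(path):
    """Return {marker_name: [(frame, x, y), ...]}, sorted by frame."""
    tracks = defaultdict(list)
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4:
                continue  # skip blank lines and malformed rows
            try:
                frame, name = int(parts[0]), parts[1]
                x, y = float(parts[2]), float(parts[3])
            except ValueError:
                continue  # skip header rows
            tracks[name].append((frame, x, y))
    for name in tracks:
        tracks[name].sort()
    return dict(tracks)

# Usage (with a hypothetical export file):
# tracks = load_tracks("face_markers.txt")
# tracks["mouth_L"] -> [(0, 312.0, 240.0), (1, 312.4, 239.6), ...]
```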
KenH Posted February 24, 2007
> The practicality of making such a realtime system work well consistently was a bear and a half.
You can't go cutting up bears. They're on the endangered species list! I recently saw another company working on capturing facial animation, and it was the best I've seen. Unfortunately I've lost the link. Maybe I got it from the forum.
dre4mer Posted February 25, 2007
Nice test, thanks for sharing! You're saying the jittering is caused by interlacing? Do you need some 24p or 30p footage to test it out with? I've always wanted to get Syntheyes... this would be yet one more reason to add to the pile... Grr... if only money grew on trees...
-Ethan