Hash, Inc. - Animation:Master

DrPhibes

*A:M User*
  • Posts: 105
  • Days Won: 7

Everything posted by DrPhibes

  1. Everything discussed above was created in late 2019 through the summer of 2020, which is to say that it is out of date. The way we do this on current characters is far more streamlined. We have been working with an automation software design group that has created a system built around a fully simulated version of the character. All the animation data sent to the control system is fully visualized, and the system simulates the full actuator capabilities of the animatronic in real time. That means we can program in all those speed limits I keep talking about, but it also allows for collision control, actuator behavior based on inertia, and more. And of course, I can still use Animation:Master to create the performance.
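As an aside, the core of that kind of limit enforcement is simple to sketch. Below is a minimal Python illustration of per-frame speed and acceleration clamping; the function and limit names are hypothetical, not from our actual control software, which also models inertia and collisions as described above.

    def limit_motion(targets, max_speed, max_accel, dt=1.0 / 30):
        """Clamp a stream of per-frame position targets so that speed
        and acceleration never exceed the actuator's rated limits.
        Hypothetical sketch, not production control code."""
        out = [targets[0]]
        vel = 0.0
        for target in targets[1:]:
            desired_vel = (target - out[-1]) / dt
            # Acceleration clamp: limit how fast velocity may change
            dv = desired_vel - vel
            dv = max(-max_accel * dt, min(max_accel * dt, dv))
            vel += dv
            # Speed clamp: limit the velocity itself
            vel = max(-max_speed, min(max_speed, vel))
            out.append(out[-1] + vel * dt)
        return out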
  2. The final step in the process is to export this animation data to the character. First I bake the animation in the choreography, one keyframe per frame, to maintain accuracy at this step. Then I export the choreography action file. Next, I needed a way to convert this data to a CSV file containing just the position-over-time data for the one axis of each bone I want to pass on to the animatronics. Fortunately, at the time we did this project I had someone on staff who knew a little C++, and I had him code me an app. It imports A:M .act files and lets me select the bones I want to translate, define the axis and ranges, and even remap the values, finally saving a CSV that I can bring into the animatronic controller animation software.

On this show we were using a program called Conductor. It is the same software we used in the early days, and as I mentioned, it does not visualize any animation; it just stores position data on a timeline. I can import the CSV here, and this tool does a great job of cleaning up the extra frames, reducing them to a manageable number of points without altering the shape of the curve or affecting the acceleration. This application becomes my second verification that I have stayed under my speed and acceleration limits, because it has built in a meter and warning function similar to the one I created in A:M. In the end, the file from this app is what is saved to the character on an SD card. A master show control system for the venue simply sends a trigger signal to the character to run the animation.
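Our converter was a C++ app, but the core transform it performs, taking one baked value per frame for a chosen bone axis, remapping the rig's range to the actuator's range, and writing frame/position rows, can be sketched in a few lines of Python. The names and the pre-parsed channel structure here are assumptions for illustration, not the actual tool:

    import csv

    def remap(value, src_min, src_max, dst_min, dst_max):
        """Linearly remap a value from the rig's range to actuator units."""
        t = (value - src_min) / (src_max - src_min)
        return dst_min + t * (dst_max - dst_min)

    def export_csv(channels, bone, src_range, dst_range, path):
        """channels: {bone_name: [(frame, value), ...]} assumed already
        parsed from a baked .act file (one key per frame). Writes
        frame/position rows for one bone, remapped to actuator units."""
        with open(path, "w", newline="") as f:
            w = csv.writer(f)
            w.writerow(["frame", "position"])
            for frame, value in channels[bone]:
                w.writerow([frame, remap(value, *src_range, *dst_range)])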
  3. As I mentioned previously, speed and acceleration/deceleration are critical data streams to monitor. So that we don't break the robot, there are many safety measures built throughout the system. For example, on many of the actuators we program those speed limits into the hardware, so that no matter what the animation data tells it to do, it will not go past a certain limit. So that I can animate to those limits with confidence, without exceeding them, and know that what I see on screen is what I am going to get, I designed a speed/acceleration meter that I can have open in A:M to monitor the bones I am using as my data-gathering proxies. Each meter uses a series of expressions to do the math on the speed and acceleration of the bone it is constrained to. After spending the time to figure out how to make these meters, I find it amusing that I don't really use them anymore: I know the characters so well that I can see when I am over the limits by eye.
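The math behind those meters boils down to finite differences over the frame time. A rough Python equivalent of what the expressions compute (bone positions assumed already sampled once per frame; the limits are placeholders):

    def check_limits(positions, max_speed, max_accel, fps=30):
        """Flag the frames where a bone's speed or acceleration exceeds
        the actuator's rated limits. positions: one value per frame."""
        dt = 1.0 / fps
        speeds = [(b - a) / dt for a, b in zip(positions, positions[1:])]
        accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
        over_speed = [i + 1 for i, s in enumerate(speeds) if abs(s) > max_speed]
        over_accel = [i + 2 for i, a in enumerate(accels) if abs(a) > max_accel]
        return over_speed, over_accel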
  4. The other place I used IK-type control was with the head. In real life this head is driven by two linear actuators that move it in all directions. For the rig, it was relatively easy to place a single bone at the u-joint the head pivots on and constrain it to aim at a null. By moving the null you can make the head look up, down, left, and right. However, this all feeds back to a series of bones that represent the actuators, and it is from these two "actuator" bones that I gather the position data to pass on to the real actuators.
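In A:M the aim-at-null constraint and the actuator bones do this work implicitly, but the underlying geometry, rotating the head-side attachment points around the u-joint and measuring back to the fixed base points, can be sketched like this (the points, rotation order, and axis conventions are all hypothetical):

    import math

    def actuator_lengths(pitch, yaw, head_pts, base_pts):
        """Given head pitch/yaw (radians) about a u-joint at the origin,
        return the lengths the two linear actuators must extend to.
        head_pts: attachment points on the head, in head space;
        base_pts: the fixed ends. Illustrative geometry only."""
        cp, sp = math.cos(pitch), math.sin(pitch)
        cy, sy = math.cos(yaw), math.sin(yaw)
        lengths = []
        for (hx, hy, hz), (bx, by, bz) in zip(head_pts, base_pts):
            # Rotate the attachment point: pitch about X...
            y1 = hy * cp - hz * sp
            z1 = hy * sp + hz * cp
            # ...then yaw about Y
            x2 = hx * cy + z1 * sy
            z2 = -hx * sy + z1 * cy
            lengths.append(math.dist((x2, y1, z2), (bx, by, bz)))
        return lengths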
  5. The frog has the most features of all the characters. It is also the only character that used any IK-type rigging; all the other characters use FK through pose sliders. For the eyes, I wanted the environment to do some of the animating for me. I built a representation of the entire location in the project at full scale, and in my rig for the eyes I created a null that is the focal point of the eyes. Using poses, I could place this null at the various locations around the lake where the audience is present. Then, by fading between pose sliders, I can direct his gaze around the lake regardless of where the head is. This adds a complexity to the animation that nobody is even consciously aware of. With the head moving, rotating, and bobbing to the music, the eyes can stay fixed on your position while you are sitting in the Steakhouse at the edge of the lake. Good restaurant, by the way.
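The fade between those location poses amounts to a normalized weighted average of the stored targets. This is not how A:M evaluates poses internally, just a sketch of the arithmetic the crossfade works out to (names hypothetical):

    def gaze_target(audience_spots, weights):
        """Blend between preset audience positions the way fading pose
        sliders does: a normalized weighted average of the targets.
        audience_spots: {name: (x, y, z)}; weights: {name: slider 0..1}."""
        total = sum(weights.values())
        if total == 0:
            raise ValueError("at least one slider must be non-zero")
        return tuple(
            sum(weights[n] * audience_spots[n][i] for n in weights) / total
            for i in range(3)
        )

    # e.g. halfway through a fade from the Steakhouse to the south shore:
    # gaze_target(spots, {"steakhouse": 0.5, "south_shore": 0.5})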
  6. The first character we looked at was a giant toucan. We knew we wanted this to be an extremely dynamic figure, so to determine what kind of speed and acceleration limits we would need to design within, I made a quick animation using a model of mostly 2D cut-outs. Even though it was not necessary for the info we were gathering at this step, I still modeled at full scale. All the axes of motion were placed at the approximate real locations we had determined in our napkin sketches during the initial conversations. This same project is what everything else was built on top of, and all of it was created in a few hours. The great thing about the workflow in A:M is how easy it is to build a model that is almost nothing but the bones, animate it, come back and refine the sculpted model, and even move the axes around later while utilizing the same animation data. I can almost work backward if needed. There is very little time spent on the front end before I can work on movement. Lady Birds_test-01-sml.mp4 Tucan-Demo04_sml.mp4
  7. Hello, As some may already know, I have been a user of A:M for more than 30 years. I walked right into the front office of Hash HQ to buy my first copy in 1993, and I still have the disks. I am not usually able to show much of my work outside of the studio; however, at the request of Robert Holmén I have assembled a brief look at how I use A:M in my design work.

Even though my job requires me to know a large library of applications, I still like using A:M when I can because of how quickly I can get from an idea to something moving on the screen. Often I am using A:M to create previz animatics just to show conceptually how something will likely move and to establish some parameters for fabrication: things like locating the axis of movement, how many degrees it may travel, or how fast it might move. The goal is to develop a starting point for engineering data and specifying actuators, or sometimes simply to give the client and my team clarity on our direction. There is rarely any kind of finished, detailed rendering of these animations; everything is relatively primitive. There are occasions, though, where I can take the animation a little further. Over the past few years, we have used the previz animation to drive the programming of a finished animatronic. You can see examples of this in the current shows running at the Wynn in Las Vegas. https://press.wynnlasvegas.com/press-releases/wynn-las-vegas-brings-entertainment-back-to-the-strip-with-debut-of-the-new-lake-of-dreams/s/62fa1fa1-64dc-4009-a73b-4fc7099eb279?cultureSeoName=GLOBAL

When we were tasked with creating all these new shows and updating our frog to a more modern control system (the frog had been in show since 2005, and a lot has changed since then), I was finally able to update the way we animate the shows. To animate the original version of the frog, someone would need to sit at the edge of the lake in Las Vegas in the middle of the night with a mixer board, adjusting one axis at a time. It would take several nights to get a song dialed in. Even though the animation was being recorded, the recording tool was designed for automation control, so the only animation interface is a timeline and spline features; you have no visualized double of your character to monitor. With our new process, I can animate a new song in A:M in an afternoon from the comfort of my desk and upload it directly to the frog from almost 1000 miles away. The following posts are a brief look at the A:M projects I have created and the process for this set of shows.
  8. Nice work! I was searching for something completely unrelated and came across this post. Excellent results. I was thinking "I wonder if I can emulate the slit scan process in 3d" as I wandered out of the theater from seeing 2001 in 70mm on Monday. Now I know the answer.
  9. Robert, This solution works in the choreography also! After I created an animation through some poses, I baked the animation. Then I followed your steps above and was able to change the driver. Worked perfectly. Thanks so much for the help! Charles
  10. Thanks, Robert, for this solution! I will try this out. Yes, you are correct, I meant Euler. I am not sure this will fix the problem if I am baking out an animation in the choreography; won't that just revert to quaternion? I have an animation for a character done entirely through pose sliders. This character has a few constraints that use a null target to direct the z-roll of a couple of bones. The animation then gets baked out in a chor, I export that action to a file, and I use a tool to convert it to a CSV to extract the specific z-roll data and apply it to a real-world actuator. This was all working fine two years ago when I set it up, and now that I need to create a new animation, this forced-quaternion driver issue has popped up. It means, of course, that the z-rotate data is not accurate anymore. I was going to look back and try to figure out what version of the software I would have been using in the summer of 2020. Maybe it would work OK there.
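For what it's worth, if the baked data does come out as quaternion channels, a z-rotation can still be recovered after the fact. Below is a sketch of the standard conversion; A:M's actual rotation order and axis conventions would need to be verified before trusting the sign and range of the output:

    import math

    def quat_to_z_rotation(w, x, y, z):
        """Recover an Euler z-rotation (degrees) from a quaternion key,
        assuming a conventional z-yaw extraction. Verify against A:M's
        rotation order before relying on this."""
        return math.degrees(math.atan2(2 * (w * z + x * y),
                                       1 - 2 * (y * y + z * z)))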
  11. I would like to use the Vector interpolation driver on bones in an action file. I used to be able to select it in the options menu and set it as the project default, but now everything is defaulting to quaternion. Even an older project is defaulting to quat. I change it in the options to Vector, and it just keeps reverting to quaternion. I don't want the rotate.w channel for this project, just XYZ absolute values. How do I fix this? Thanks all, Charles
  12. Robert, This is really helpful! Thank you very much for the deep dive! I will have more questions as I go here. I am hoping to create a more direct translation of a spline from A:M to Rhino and back. The polygon formats that I can use to get models to and from it are OK, but I think there has to be a better way.
  13. I have been working with A:M to create motion profiles for animatronics. Last year I successfully programmed the animatronics for some pieces in Las Vegas, writing our own app to help translate to our controllers. You can see some of our process here. I am still working on refining this process, and I was hoping someone could point me to some documentation of what all the switches are in an .ACT file or a .MDL file so I can better create some of the tools I need. I have looked on the FTP and at the SDK materials, but I could not find any plain-language material describing the syntax of the files. I am not a programmer by nature and there is a lot I don't know, so perhaps it is there, but I need a little more guidance to sort it out. Everything I have done so far is simply by trial and error. For example, I can open an action file in a text editor and understand which values are the axis, time, and position of a bone, and this made it easy to convert to what I needed. However, there are other tags in a model file that I don't quite understand, and I would like to know them more clearly to leverage for other conversions. For example, the following creates a two-point spline in a model file:

<SPLINE>
262145 0 1 -18.4294033 -2.04771137 0 . .
262145 0 9 24.2136593 -1.03441095 0 . .
</SPLINE>

What does the "262145" indicate? What does the "0" following it indicate? What does the ". ." at the end indicate? Thanks for any help in parsing this out. Charles
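While waiting on real documentation, a tolerant parser can at least extract the parts that are recognizable. Here is a Python sketch that pulls the rows out of <SPLINE> blocks; the field interpretations in the comments (bit-flag field, control-point ID, "." as default-value placeholders) are guesses, not documented facts:

    def parse_splines(mdl_text):
        """Extract control-point rows from <SPLINE>...</SPLINE> blocks
        in an A:M .mdl file. Field meanings are partly assumed."""
        splines, current = [], None
        for line in mdl_text.splitlines():
            line = line.strip()
            if line == "<SPLINE>":
                current = []
            elif line == "</SPLINE>":
                splines.append(current)
                current = None
            elif current is not None and line:
                tok = line.split()
                current.append({
                    "flags": int(tok[0]),   # 262145 == 0x40001, looks like a bit field
                    "unknown": int(tok[1]), # meaning not yet identified
                    "cp_id": int(tok[2]),   # appears to be a control-point ID
                    "pos": tuple(map(float, tok[3:6])),  # x, y, z
                    "extra": tok[6:],       # '.' tokens, possibly default placeholders
                })
        return splines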
  14. I have designed many 3D-printable models in A:M, but as Robert mentioned, it is not really intended as a CAD program. The kind of Boolean functions you want are not really practical in A:M, but there are best practices and techniques within A:M that can get you there. This is a functional "steam" engine I did entirely in A:M: https://www.thingiverse.com/thing:25624 Charles
  15. Solved the problem! It has everything to do with the file names! I shortened all the file names and it worked fine! I don't know why the longer file names work on other PCs; all of my machines are Windows 10. Thanks for all your help in trying to sort this out, Robert! CB
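One guess at the mechanism, since MCI is a legacy Windows API, is path-length sensitivity around the classic 260-character MAX_PATH limit, though that never explained why the other Windows 10 machines coped. If anyone wants to rule it out quickly, a check like this (Python, hypothetical helper) takes seconds:

    import os

    def path_too_long(path, limit=260):
        """Report whether a file's absolute path approaches the legacy
        Windows MAX_PATH limit that older APIs such as MCI can choke on."""
        full = os.path.abspath(path)
        return len(full) >= limit, len(full)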
  16. Recorded a new WAV - works!
      Split original and saved - did not work
      Merged to mono and saved - did not work
  17. That worked! Did you simply merge the tracks in Audacity and export at 44.1 kHz? I tried to do that just now and my file did not work, but yours worked fine.
  18. Yeah, and it works fine for me on other machines, but not on this one. All the files you sent me DO work fine here, so this is a mystery. I can't tell what the difference is in the files. I thought about saving out as an entirely different format and then back to a WAV. I said Audacity above when I meant Adobe Audition. I should try Audacity.
  19. Both work equally well. Import and playback without error.
  20. I can't post publicly, but I sent you a link via a PM. I thought it was because of the different sample rates. Yours was 32 kHz and my original was 48 kHz, but after saving several versions out of Audacity at various sample rates, none of them worked. So there is something else different about your WAV file than mine that I am missing. What software are you using when you edit or save WAV files? CB
  21. That worked! What is the difference there?
  22. Thanks for at least something to try. It did not work, but it gave me hope for at least a minute. CB
  23. [Solution: shorten filename] Has anyone else ever had an error dropping a WAV file into a project? I have several PCs I work on, and everything is great when importing an audio file, but on one Windows 10 machine I get "trouble opening mci device error: 304". The WAV file imports, but there is no audio playback. So I assumed it was that machine, made sure all the drivers were current, and could not replicate the problem in any app other than A:M. I can import an MP3, and that plays back no problem; unfortunately, I am noticing sync problems, and it does not display a waveform in the timeline, probably because MP3 is not fully supported yet. These kinds of problems are maddening. I can't find any info on an MCI error 304 to help troubleshoot this, and I don't know what A:M is specifically doing when it loads the audio file or how it accesses libraries and such. Probably on my own here, but I thought I would ask. Charles