
Animation:Master in Las Vegas


DrPhibes


Hello,

As some may already know, I have been a user of A:M for more than 30 years. I walked right into the front office of Hash HQ to buy my first copy in 1993, and I still have the disks.

I am not usually able to show much of my work outside of the studio; however, at the request of Robert Holmén, I have assembled a brief look at how I use A:M in my design work.

Even though my job requires me to know a large library of applications, I still like using A:M when I can because of how quickly I can get from an idea to something moving on the screen. Often I am using A:M to create previz animatics just to show conceptually how something will likely move and to establish some parameters for fabrication: things like locating the axis of movement, how many degrees it may rotate, or how fast it might move. The goal is to develop a starting point for engineering data and specifying actuators, or sometimes simply to give the client and my team clarity on our direction. There is rarely any kind of finished, detailed rendering of these animations; everything is relatively primitive.

There are occasions though where I can take the animation a little further. Over the past few years, we have used the previz animation to drive the programming of a finished animatronic.

You can see examples of this in the current shows running at the Wynn in Las Vegas.

https://press.wynnlasvegas.com/press-releases/wynn-las-vegas-brings-entertainment-back-to-the-strip-with-debut-of-the-new-lake-of-dreams/s/62fa1fa1-64dc-4009-a73b-4fc7099eb279?cultureSeoName=GLOBAL

When we were tasked with creating all these new shows and updating our frog to a more modern control system (the frog had been in show since 2005, and a lot has changed since then), I was finally able to update the way we animate the shows.

To animate the original version of the frog, someone would need to sit at the edge of the lake in Las Vegas in the middle of the night with a mixer board, adjusting one axis at a time. It would take several nights to get a song dialed in. Even though the animation was being recorded to an animation tool, that tool was designed for automation control, so the only animation interface is a timeline with spline features. You have no visualized double of your character to monitor.

With our new process, I can now animate a new song in A:M in an afternoon from the comfort of my desk and upload it directly to the frog from almost 1,000 miles away.

The following posts are a brief look at the A:M projects I have created and the process for this set of shows.


The first character we looked at was a giant toucan. We knew we wanted this to be an extremely dynamic figure, so to determine what kind of speed and acceleration limits we would need to design within, I made a quick animation using a model made mostly of 2D cut-outs. Even though it was not necessary for the info we were gathering at this step, I still modeled at full scale. All the axes of motion were placed at the approximate real locations we had determined in our napkin sketches during the initial conversations. This same project then became the foundation everything else was built on top of. All of it was created in a few hours.

The great thing about the workflow in A:M is how easy it is to build a model made of almost nothing but the bones, animate it, then come back to refine the sculpted model and even move the axes around later while utilizing the same animation data. I can almost work backward if needed. There is very little time spent on the front end before I can work on movement.


The frog has the most features of all the characters. It is also the only character that used any IK-type rigging; all the other characters use FK through pose sliders.

For the eyes, I wanted the environment to do some of the animating for me. I built a representation of the entire location in the project at full scale, and in my rig for the eyes, I created a null that serves as the focal point of the eyes. Using poses, I could place this null at the various locations around the lake where the audience is present. Then, by fading between pose sliders, I can direct his gaze around the lake regardless of where the head is positioned. This adds a complexity to the animation that nobody is even consciously aware of. With the head moving, rotating, and bobbing to the music, the eyes can stay fixed on your position while you are sitting in the Steakhouse at the edge of the lake. Good restaurant, by the way.
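For illustration only (this is not A:M's internals, and every position below is made up): treating each pose slider as a weight on a preset audience location, the gaze null ends up at the normalized weighted blend of those spots, which is what lets a fade between sliders sweep the gaze smoothly from one part of the lake to another.

```cpp
// Minimal sketch: fading a gaze-target null between preset audience
// positions with pose-slider-style weights. Locations are hypothetical.
#include <array>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Hypothetical audience locations around the lake, full-scale units.
constexpr std::array<Vec3, 3> kAudienceSpots = {{
    { 1200.0, 150.0,  400.0},   // e.g. the Steakhouse terrace
    { -800.0, 150.0,  900.0},   // opposite shore
    {    0.0, 150.0, 1500.0},   // main viewing area
}};

// Blend the spots by slider weights; the result drives the eye-target null.
Vec3 BlendGazeTarget(const std::array<double, 3>& w) {
    Vec3 out{0, 0, 0};
    double total = 0;
    for (size_t i = 0; i < kAudienceSpots.size(); ++i) {
        out.x += kAudienceSpots[i].x * w[i];
        out.y += kAudienceSpots[i].y * w[i];
        out.z += kAudienceSpots[i].z * w[i];
        total += w[i];
    }
    if (total > 0) { out.x /= total; out.y /= total; out.z /= total; }
    return out;
}

int main() {
    // Halfway through a fade from the Steakhouse to the opposite shore.
    Vec3 t = BlendGazeTarget({0.5, 0.5, 0.0});
    std::printf("gaze null at (%.1f, %.1f, %.1f)\n", t.x, t.y, t.z);
}
```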

Frog_AM.png

Frog_AM-lake.png


The other place I used IK-type control was the head. In real life, the head is driven by two linear actuators that move it in all directions. For the rig, it was relatively easy to place a single bone at the u-joint the head pivots on and constrain it to aim at a null. By moving the null, you can make the head look up, down, left, and right. However, this all feeds back to a series of bones that represent the actuators, and it is from these two "actuator" bones that I gather position data to pass on to the real actuators.
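As a rough sketch of the underlying geometry (not the actual rig, and all dimensions are invented): the aim-at-null constraint effectively resolves to yaw and pitch angles around the u-joint, those angles rotate the points where the actuators attach to the head, and each actuator length falls out as the distance from the rotated attachment point to a fixed anchor on the base.

```cpp
// Geometry-only sketch: derive two linear-actuator lengths from the
// head's aim angles. Anchor positions and angles are hypothetical.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double Len(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Rotate a point (relative to the u-joint) by yaw about Y, then pitch about X.
Vec3 RotateYawPitch(Vec3 p, double yaw, double pitch) {
    Vec3 a{ p.x * std::cos(yaw) + p.z * std::sin(yaw),
            p.y,
           -p.x * std::sin(yaw) + p.z * std::cos(yaw) };
    return { a.x,
             a.y * std::cos(pitch) - a.z * std::sin(pitch),
             a.y * std::sin(pitch) + a.z * std::cos(pitch) };
}

int main() {
    // Made-up geometry: actuator attachment points on the head and
    // anchors on the base, both relative to the u-joint pivot.
    Vec3 headAttachL{-10.0, -5.0, 12.0}, headAttachR{10.0, -5.0, 12.0};
    Vec3 baseAnchorL{-10.0, -30.0, 6.0}, baseAnchorR{10.0, -30.0, 6.0};

    // Aim angles an aim-at-null constraint might produce for some pose.
    double yaw = 0.3, pitch = -0.15;  // radians

    double lenL = Len(Sub(RotateYawPitch(headAttachL, yaw, pitch), baseAnchorL));
    double lenR = Len(Sub(RotateYawPitch(headAttachR, yaw, pitch), baseAnchorR));
    std::printf("actuator lengths: L=%.2f  R=%.2f\n", lenL, lenR);
}
```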

Frog_AM-neck.png


As I mentioned previously, speed and acceleration/deceleration are critical data streams to monitor. So that we don't break the robot, there are many safety measures built into the system. For example, on many of the actuators we program those speed limits into the hardware so that no matter what the animation data tells it to do, it will not go past a certain limit.

So that I know with confidence that I am animating within those limits, and that what I see on screen is what I am going to get, I designed a Speed/Acceleration meter that I can have open in A:M to monitor the bones I am using as my data-gathering proxies.

These use a series of expressions to do the math on the speed and acceleration of the bone each meter is constrained to.
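The expressions themselves aren't shown here, but the underlying math is just finite differences: the first difference of position per frame gives speed, and the difference of successive speeds gives acceleration. A standalone sketch (frame rate, limits, and sample positions all made up):

```cpp
// Sketch of the meter math: per-frame finite differences of a proxy
// bone's position, checked against speed and acceleration limits.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

double Dist(Vec3 a, Vec3 b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

int main() {
    const double fps = 30.0;        // assumed frame rate
    const double maxSpeed = 50.0;   // units/second, made-up limit
    const double maxAccel = 200.0;  // units/second^2, made-up limit

    // Positions of the proxy bone sampled once per frame.
    std::vector<Vec3> pos = {
        {0, 0, 0}, {0.5, 0, 0}, {1.6, 0, 0}, {3.4, 0, 0}, {5.0, 0, 0}};

    double prevSpeed = 0.0;
    for (size_t f = 1; f < pos.size(); ++f) {
        double speed = Dist(pos[f], pos[f - 1]) * fps;  // first difference
        double accel = (speed - prevSpeed) * fps;       // second difference
        std::printf("frame %zu: speed=%.1f accel=%.1f%s\n", f, speed, accel,
                    (speed > maxSpeed || std::fabs(accel) > maxAccel)
                        ? "  <-- OVER LIMIT" : "");
        prevSpeed = speed;
    }
}
```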

After spending some time figuring out how to make these meters, I find it amusing that I don't really use them anymore; I know the characters so well that I can see when I am over the limits by eye.

Frog_AM-Meter.png


The final step in the process is to export this animation data to the character.

I first bake the animation in the choreography, one keyframe per frame, to maintain accuracy at this step. Then I export the choreography action file.

Next, I needed a way to convert this data to a CSV file containing just the position-over-time data for the one axis of each bone I want to pass on to the animatronics.

Fortunately, at the time we did this project, I had someone on staff who knew a little C++, and I had him code me an app. It imports A:M .act files and allows me to select the bones I want to translate, define the axes and ranges, and even remap the values, finally saving a CSV that I can bring into the animatronic controller's animation software.
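The app's source isn't shown here, and the .act parsing is omitted; this is just a sketch of the core remap-and-export step once per-frame channel values are already in hand. Channel names, ranges, and sample values are all hypothetical.

```cpp
// Sketch: remap each bone channel from its animation range to the
// controller's expected range and write one CSV column per channel.
#include <cstdio>
#include <vector>

struct Channel {
    const char* name;
    double inMin, inMax;          // range of the A:M channel
    double outMin, outMax;        // range the controller expects
    std::vector<double> samples;  // one value per baked frame
};

double Remap(double v, const Channel& c) {
    double t = (v - c.inMin) / (c.inMax - c.inMin);  // normalize to 0..1
    return c.outMin + t * (c.outMax - c.outMin);     // scale to output range
}

int main() {
    std::vector<Channel> channels = {
        {"head_pitch", -30.0, 30.0, 0.0, 255.0, {-30, -12, 4, 19, 30}},
        {"head_yaw",   -45.0, 45.0, 0.0, 255.0, {  0,  10, 18, 22, 25}},
    };

    FILE* out = std::fopen("frog_song.csv", "w");  // hypothetical file name
    if (!out) return 1;

    std::fprintf(out, "frame");
    for (const auto& c : channels) std::fprintf(out, ",%s", c.name);
    std::fprintf(out, "\n");

    for (size_t f = 0; f < channels[0].samples.size(); ++f) {
        std::fprintf(out, "%zu", f);
        for (const auto& c : channels)
            std::fprintf(out, ",%.2f", Remap(c.samples[f], c));
        std::fprintf(out, "\n");
    }
    std::fclose(out);
}
```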

On this show, we were using a program called Conductor. It is the same software we used in the early days, and as I mentioned, it does not visualize any animation; it just stores position data on a timeline. I can import the CSV here, and this tool does a great job of cleaning up the extra frames, reducing them to a manageable number of points without altering the shape of the curve or affecting the acceleration.
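Conductor's exact reduction algorithm isn't documented here, but a tolerance-based simplification in the spirit of Ramer-Douglas-Peucker produces the same kind of result: keep only the keys needed so that interpolating between them stays within a small tolerance of the original baked curve. A sketch:

```cpp
// Sketch of tolerance-based keyframe reduction on one baked channel.
#include <cmath>
#include <cstdio>
#include <vector>

struct Key { double t, v; };

// Recursively keep the point deviating most from the chord if it exceeds tol.
void Rdp(const std::vector<Key>& pts, size_t lo, size_t hi, double tol,
         std::vector<bool>& keep) {
    double maxDev = 0;
    size_t idx = lo;
    for (size_t i = lo + 1; i < hi; ++i) {
        // Deviation of pts[i] from the line through pts[lo] and pts[hi].
        double u = (pts[i].t - pts[lo].t) / (pts[hi].t - pts[lo].t);
        double interp = pts[lo].v + u * (pts[hi].v - pts[lo].v);
        double dev = std::fabs(pts[i].v - interp);
        if (dev > maxDev) { maxDev = dev; idx = i; }
    }
    if (maxDev > tol) {
        keep[idx] = true;
        Rdp(pts, lo, idx, tol, keep);
        Rdp(pts, idx, hi, tol, keep);
    }
}

int main() {
    // One baked channel: 31 frames of an ease-in/ease-out move.
    std::vector<Key> pts;
    for (int f = 0; f <= 30; ++f) {
        double u = f / 30.0;
        pts.push_back({(double)f, 100.0 * u * u * (3 - 2 * u)});  // smoothstep
    }

    std::vector<bool> keep(pts.size(), false);
    keep.front() = keep.back() = true;
    Rdp(pts, 0, pts.size() - 1, /*tol=*/0.5, keep);

    for (size_t i = 0; i < pts.size(); ++i)
        if (keep[i]) std::printf("keep frame %.0f -> %.2f\n", pts[i].t, pts[i].v);
}
```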

Conductor also becomes my second verification that I have stayed under my speed and acceleration limits, because it has a built-in meter and warning function similar to the one I created in A:M.

In the end, the file from this app is what is saved to the character on an SD card. There is a master show control system for the venue that simply sends a trigger signal to the character to run the animation.

Baked.png

ACT-CSV.png

Conductor.png


Everything discussed above was created between late 2019 and the summer of 2020, which is to say it is out of date. The way we do this on current characters is far more streamlined. We have been working with an automation software design group that has created a system using a fully simulated version of the character. All the animation data sent to the control system is fully visualized and simulates the full actuator capabilities of the animatronic in real time. That means we can program in all those speed limits I keep talking about, but it also allows for collision control, actuator behavior based on inertia, and more.

Of course, I can still use Animation:Master to create the performance.


Hash Fellow

Thank you for posting this very detailed behind-the-scenes peek, Charles!

All most of us know about animatronics is that last chapter in "The Illusion of Life".

It is very cool to see what's been happening in modern times and that A:M has been part of it!

