Hash, Inc. Forums

Everything posted by DrPhibes

  1. Robert, This solution works in the choreography also! After I created an animation through some poses, I baked the animation. Then I followed your steps above and was able to change the driver. It worked perfectly. Thanks so much for the help! Charles
  2. Thanks Robert for this solution! I will try this out. Yes, you are correct, I meant Euler. I am not sure this will fix the problem if I am baking out an animation in the choreography. Won't that just revert to quaternion? I have an animation for a character done entirely through pose sliders. This character has a few constraints that use a null target to direct the z-roll of a couple of bones. The animation then gets baked out in a chor, and I export that action to a file that I run through a tool to convert to a CSV, extracting the specific z-roll data to apply to a real-world actuator. This was all working fine two years ago when I set it up, and now that I need to create a new animation, this forced-quaternion driver issue has popped up. This of course means that the z-rotate data is no longer accurate. I was going to look back and try to figure out what version of the software I would have been using in the summer of 2020. Maybe it would work OK there.
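If the baked file insists on quaternion keys, the z-roll can in principle be recovered by converting each quaternion key back to an Euler angle. A minimal sketch, using the standard ZYX-convention yaw formula; note that A:M's actual axis conventions and component order would need to be verified against known poses, so treat this as an assumption rather than a confirmed mapping:

```python
import math

def quat_to_z_roll_degrees(w, x, y, z):
    """Rotation about Z (yaw), standard ZYX Euler convention."""
    return math.degrees(math.atan2(2 * (w * z + x * y),
                                   1 - 2 * (y * y + z * z)))

# Sanity check: a 90-degree rotation about Z is the quaternion
# (w, x, y, z) = (cos 45, 0, 0, sin 45).
s = math.sin(math.radians(45))
c = math.cos(math.radians(45))
print(round(quat_to_z_roll_degrees(c, 0, 0, s), 3))  # 90.0
```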
  3. I would like to use the Vector Interpolation driver on bones in an action file. I used to be able to select it in the options menu and set it as the project default, but now everything is defaulting to quaternion. Even an older project defaults to quaternion. I change it in the options to vector and it just keeps reverting to quaternion. I don't want the rotate.w channel for this project, just xyz absolute values. How do I fix this? Thanks all, Charles
  4. Robert, This is really helpful! Thank you very much for the deep dive! I will have more questions as I go here. I am hoping to create a more direct translation of a spline from A:M to Rhino and back. The polygon formats I can use to move models back and forth are OK, but I think there has to be a better way.
  5. I have been working with A:M to create motion profiles for animatronics. Last year I successfully programmed the animatronics for some pieces in Las Vegas, writing our own app to help translate to our controllers. You can see some of our process here. I am still working on refining this process, and I was hoping someone could point me to documentation of what all the switches are in an .ACT file or a .MDL file so I can better create the tools I need. I have looked on the FTP and at the SDK materials, but I could not find plain-language material describing the syntax of the files. I am not a programmer by nature and there is a lot I don't know, so perhaps it is there, but I need a little more guidance to sort it out. Everything I have done so far is simply by trial and error. For example, I can open an action file in a text editor and understand which values are the axis, time, and position of a bone, and this made it easy to convert to what I needed. However, there are other tags in a model file I don't quite understand, and I would like to know them more clearly to leverage for other conversions. For example, the following creates a two-point spline in a model file:

<SPLINE>
262145 0 1 -18.4294033 -2.04771137 0 . .
262145 0 9 24.2136593 -1.03441095 0 . .
</SPLINE>

What does the "262145" indicate? What does the "0" following it indicate? What does the ". ." at the end indicate? Thanks for any help in parsing this out. Charles
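For trial-and-error exploration of fragments like the `<SPLINE>` block above, a small tokenizer can help. The field meanings below are assumptions for illustration only (the leading integer looks like a packed bit field, since 262145 = 2^18 + 1, i.e. bits 0 and 18 set; the `0`, the ". ." tokens, and the true column layout are undocumented here):

```python
# Hypothetical sketch: tokenize a <SPLINE> block from an A:M .MDL file.
mdl_fragment = """\
<SPLINE>
262145 0 1 -18.4294033 -2.04771137 0 . .
262145 0 9 24.2136593 -1.03441095 0 . .
</SPLINE>
"""

def parse_spline_block(text):
    points = []
    inside = False
    for line in text.splitlines():
        line = line.strip()
        if line == "<SPLINE>":
            inside = True
        elif line == "</SPLINE>":
            inside = False
        elif inside and line:
            tokens = line.split()
            # Column names are guesses: flags, unknown, control-point id, x/y/z.
            points.append({
                "flags": int(tokens[0]),
                "unknown": int(tokens[1]),
                "cp_id": int(tokens[2]),
                "pos": tuple(float(t) for t in tokens[3:6]),
                "trailing": tokens[6:],   # the ". ." tokens, meaning unknown
            })
    return points

for p in parse_spline_block(mdl_fragment):
    # bin(262145) shows which bits are set, for comparing variant splines.
    print(p["cp_id"], bin(p["flags"]), p["pos"])
```

Saving the same spline with different options toggled and diffing the decoded bits is one way to reverse-engineer what each flag bit controls.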
  6. I have designed many 3d printable models in A:M, but as Robert mentioned it is not really intended as a CAD program. The kind of Boolean functions you want are not really practical in A:M, but there are best practices and techniques within A:M that can get you there. This is a functional "steam" engine I did all in A:M https://www.thingiverse.com/thing:25624 Charles
  7. Solved the problem! It has everything to do with the file names! I shortened all the file names and it worked fine! I don't know why the longer file names work on other PCs. All of my machines are Windows 10. Thanks for all your help in trying to sort this out, Robert! CB
  8. Recorded a new WAV - works! Split the original and saved - did not work. Merged to mono and saved - did not work.
  9. That worked! Did you simply merge the tracks in Audacity and export at 44.1 kHz? I tried to do that just now and my file did not work, but yours worked fine.
  10. Yeah, and it works fine for me on other machines, but not on this one. All the files you sent me DO work fine here, so this is a mystery. I can't tell what the difference is in the files. I thought maybe about saving out as an entirely different format and then back to a WAV. I said Audacity above when I meant Adobe Audition. I should try Audacity.
  11. Both work equally well. Import and playback without error.
  12. I can't post publicly, but I sent you a link via a PM. I thought it was because of the different sample rates. Yours was 32 kHz and my original was 48 kHz, but after saving out several versions from Audacity at various sample rates, none of them worked. So there is something else different about your WAV file that I am missing. What software are you using when you edit or save WAV files? CB
  13. That worked! What is the difference there?
  14. Thanks for at least something to try. It did not work, but it gave me hope for at least a minute CB
  15. [Solution: shorten filename] Has anyone else ever had an error dropping a wave file into a project? I have several PCs I work on and everything is great when importing an audio file, but on one Windows 10 machine I get "trouble opening mci device error: 304". The wave file imports, but there is no audio playback. So I assumed it was that machine, made sure all drivers were current, and could not replicate the problem in any app other than A:M. I can import an MP3 and that plays back no problem, but I am noticing sync problems and it does not display a waveform in the timeline, probably because MP3 is not fully supported yet. These kinds of problems are maddening. I can't find any info on MCI error 304 to help troubleshoot this, and I don't know what A:M is specifically doing when it loads the audio file or how it accesses libraries and such. Probably on my own here, but I thought I would ask. Charles
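When comparing WAV files that do and don't import, it can help to dump the header fields side by side and check the path length, since the filename-length fix hints that the legacy MCI layer may be hitting Windows' classic 260-character MAX_PATH limit. That cause is an assumption, not confirmed behavior; this sketch just gathers the facts using only the standard library:

```python
import os
import struct
import wave

def describe_wav(path):
    """Report header fields and path length for a WAV file."""
    with wave.open(path, "rb") as w:
        info = {
            "channels": w.getnchannels(),
            "sample_rate": w.getframerate(),
            "bits": w.getsampwidth() * 8,
            "frames": w.getnframes(),
        }
    info["path_len"] = len(os.path.abspath(path))
    info["long_path"] = info["path_len"] >= 260  # classic Windows MAX_PATH
    return info

# Build a tiny 44.1 kHz 16-bit mono file so the sketch is self-contained.
with wave.open("probe.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)       # 2 bytes = 16-bit samples
    w.setframerate(44100)
    w.writeframes(struct.pack("<100h", *([0] * 100)))

print(describe_wav("probe.wav"))
```

Running this over a working file and a failing file would show whether sample rate, bit depth, channel count, or path length is the variable that actually differs.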
  16. I have played with this a bit, but I think I am missing something. I am getting a syntax error. Have you tried it?
  17. Hmm, I will need to research this more. This sounds promising. So what you are saying is that on an x-scale channel, for example, an expression is linked to it to force the spline in the timeline to display the acceleration curve?
  18. Thank you for all your efforts, Robert! To address your comment in the video about using the translate channel (or any other channel) as the source of our motion data in the conversion to the controller: this is exactly what I planned to do if there was not an easy automated way to get there with pose sliders. The 0 to 100 values and the small size of an action file, with the ability to parse out the channels from logically named pose sliders, were nice. I can get to the 0 to 100 values in other ways with constraints, but parsing the data then requires more effort due to the duplicate naming. Search a project file for "matchname =Z" and see how many hits you get. I had played with some of the text-editing solutions to transfer the channels as well. I think you have definitively shown that I need to stick with our first version of the tool, which pulled from a position channel.
  19. It is a tool we wrote internally: a separate app that is a basic converter to translate to the controller hardware we use, with some added features to help with the real-world acceleration and deceleration limits of the actuators. I hope at some point to develop this into an A:M plugin (or plugins) that lets me see, in real time on a timeline, an overlay of an acceleration curve on the current position-over-time spline. This would allow me to animate to limits preset in the plugin and make sure nothing moves in a way it can't in reality, without having to save out and bring back in. When we did this during our first tests, we just created an Excel project that did all of this conversion.
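The acceleration/deceleration filtering described above can be sketched in a few lines. This is not the actual internal tool, just a minimal illustrative pass: given position samples at a fixed frame interval, clamp how fast the velocity may change so the profile stays within an assumed actuator limit. A real filter would also need look-ahead so the motion decelerates before reaching a target:

```python
def limit_acceleration(positions, dt, max_accel):
    """Naive forward pass: re-integrate positions so |dv/dt| <= max_accel.

    positions: channel samples (e.g. pose-slider values 0-100) at interval dt.
    max_accel: allowed acceleration in units per second squared (assumed spec).
    """
    out = [positions[0]]
    v = 0.0
    for target in positions[1:]:
        desired_v = (target - out[-1]) / dt   # velocity needed to hit the key
        max_dv = max_accel * dt
        dv = max(-max_dv, min(max_dv, desired_v - v))  # clamp velocity change
        v += dv
        out.append(out[-1] + v * dt)
    return out

# An abrupt 0 -> 100 jump in pose-slider units gets spread over frames.
raw = [0, 0, 50, 100, 100, 100]
smooth = limit_acceleration(raw, dt=1 / 30, max_accel=9000.0)
```

After the pass, `smooth` ramps up gradually instead of jumping, which is the shape the round trip back into A:M would let you inspect on the timeline.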
  20. Robert, This is getting so close to where I need to be. But yes, getting the pose sliders to record keyframes would be ideal. I came across another issue I had overlooked as I was working through this with you. In my other projects, because of their simplicity, I can just use keyframed pose sliders in an action, export that action, run it through the filtering software to refine any acceleration or deceleration issues, then import the edited action file back into A:M, drop it on the model, and watch the playback to visualize the changes. This ability goes away to some degree with the expressions. What I think I need to do is build an animation rig and a separate playback rig if I continue down this path.
  21. So to answer the first question: yes, expressions could probably be made to do what I want in this case. The example you provided in the video was based on the "demo02" action I set up, where I hand-animated the actuator to match the gimbal motion. In this case, if I hand-animate the poses, then I get what I need in the pose sliders and there is no need for a baking step. I placed that action there to demonstrate the end result I was looking for. Look at the "demo01" action: you will see that the actuators are not moving. This is where I would like to find a rig setup to automatically move the actuators to match the gimbal motion. Also, yes, it is true I can pull the translate value out of the file and run that to the controller. It does create an extra step, because I will need to process that separately from the other motions pulled from pose sliders. In fact, our first version of the translation tool worked specifically on bone rotation values and not pose-slider positions.
  22. I am going to go into more detail about my goal and issue here in the hope that it sparks some ideas. Normally the animatronics we build have a single actuator tied to a single point of motion, usually a rotational axis, stacked on another point of motion. I build the animation model and rig as accurately to the real-world item as possible. This usually results in a single bone with a certain range of rotation applied through a pose slider. Then, in an action file, I animate to the prerecorded audio, simply adjusting the pose sliders per axis of motion. This FK animation process works very well for the majority of the characters we have built. I animate a neck rotation, then the head tilt, for example, building each layer individually. All of these keyframed pose sliders are exported in an action file to some custom software we created, which runs a filter through the channels to adjust the acceleration/deceleration profiles to match what the real-world actuators can do based on load specs and so on. Then this is exported to the controller hardware. I can send this data both ways as well: once the adjustments are made to the channels, I can bring them back into A:M to see how they have changed and adjust if necessary. One day I hope to have this function in A:M as a plugin instead of having to export and re-import. Now, this is where my current problem lies. We have a mechanical design that uses two actuators to control a gimbal, which means some tricky rigs to emulate accurately. In most cases I would like to use a pose slider to control each actuator independently, so I have recorded keyframes on the pose channel. I can make this work now, and it is acceptable. However, this is a good case for a more IK-style workflow where I move a single null and the pair of actuators follow, automatically recording keyframes to their respective channels.
I don't spend quite enough time building complex rigs in A:M to instantly know the cleanest approach. I am attaching a stripped-down model of what I am doing so you can look at it; it is to scale as well. There is a single null target that is essentially the nose of the character. This null is followed by a single bone using an aim-at constraint. The base of this bone represents the gimbal. There is a triangular frame (the gimbal plate) with two nulls used as targets to aim the actuators at. Each actuator has a pose slider that moves it 0 to 100% through its range of extension and retraction. In the Demo02 action I animated the head aim target, then hand-animated the pose sliders to line up with their respective targets on the gimbal plate. This gives me the ability to very simply perform the head, and then also generate the keyframed poses specific to the actuators needed for export. So, this last step of hand-positioning the pose sliders and generating those pose keyframes is what I was hoping to find a way to automate. Am I dreaming? I have another element that would operate in a similar fashion but has 4 actuators; a second pass by hand to set those keyframes could be time consuming. Charles Gimbal Demo.prj
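Outside of a rig-based answer, the hand-positioning step could in principle be automated numerically: if each actuator's pose slider maps 0 to 100% onto a known stroke (retracted and extended lengths), then the slider value per frame falls out of the distance between the actuator's base and its target null on the gimbal plate. A minimal sketch of that idea; the function name, the sample coordinates, and the assumption that the slider maps linearly to stroke length are all illustrative:

```python
import math

def slider_from_distance(base, target, retracted_len, extended_len):
    """Map base->target distance onto a 0-100 pose-slider value.

    Assumes the slider moves the actuator linearly from retracted_len
    (slider 0) to extended_len (slider 100); clamps out-of-range frames.
    """
    d = math.dist(base, target)
    t = (d - retracted_len) / (extended_len - retracted_len)
    return max(0.0, min(100.0, t * 100.0))

# One frame: actuator base at the origin, gimbal-plate null 12.5 units
# away, actuator stroke running from 10 (slider 0) to 15 (slider 100).
value = slider_from_distance((0, 0, 0), (0, 12.5, 0), 10.0, 15.0)
print(round(value, 1))  # 50.0
```

Run per frame against baked null positions (for the 4-actuator element too, one call per actuator), this would produce the keyframe values that are currently set by eye.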
  23. Hello Robert, This works with translate, do you think it can be made to work with the longitude or latitude motion of a bone? Charles