Everything posted by Rodney
-
It can be a good place to quickly share information and resources. It's basically an updated chatroom, and definitely a considerable upgrade from the IRC of back in the day. Some people who might not drop in to a forum may stop by there, and when they arrive they'll find links to popular topics here in the forum. In addition to voice it also has (limited) video and screen-sharing capability for those that desire such. There is a Help Desk, so that'd be a good place to start that kind of session.
-
For what it may be worth, I would not want a USB key solution. Having said that, I suppose that would depend on what came with the license.
-
For those interested, there is now an Animation:Master Discord server. Visit the Animation:Master Discord to:
- Collaborate
- Chat and share files
- Share news and information
- Search for topics of interest
- Edit documentation in real time
- Initiate group or private chats (voice or video)
- Develop your personal projects and explore your ideas

These channels are meant to supplement the A:M forum with capabilities we don't have.
-
Here's a list of all the Object types we can filter for. First, the filter's own fields as they appear:

Name=Filter for All
SearchString=Object names containing this text
SearchString1=And
LinkString1=0
SearchString2=Or but not
LinkString2=1

And the Object types themselves:

Choreography Model Segment Action Group Decal Rotoscope Camera Light Null Object Project Muscle Channel Container Objects Actions Materials Aim At Constraint Aim Roll At Constraint Translate To Constraint Orient Like Constraint Kinematic Constraint Aim Like Two Constraint Aim Roll Like Two Constraint Material Attribute Gradient Spherical Checker Distort Turbulence Blobby Emitter Path Constraint Spherical Limits Euler Limits Pose Pose Container Dope Sheet Dope Frame Stamp Path Model Motion Capture Device Sensor Container Action Object Container Roll Like Constraint Action Object Action Force Particle Material Effector Volumetric Effect Hair Emitter Sounds Sound Images Clip Still Animation Decal Image Decals Decal Images Stamps Sprite Emitter Sprite Relation Container Relation Surface Constraint Scale Like Constraint Groups Translate Limits Flock Effect Light List Node Light List Choreography Action User Container Bones Spring Spring Container Spring System Mass Mass Container Prop Phoneme Text Dialog CP To Mass Constraint Action DopeSheet Spline Container Channel Relation Storage Driver Action Container Object Shortcut Channel Driver Euler Rotate Driver Vector Rotate Driver Quaternion Rotate Driver Pose Channel Container Empty Relation Mass To CP Constraint Mass To Bone Constraint Layer CP Shortcut Spline Shortcut Flock Surface Constraint Translate Driver No SubProperties Shag Emitter Animatable Driver Animatable Driver Keyframe Constant Driver Mass Shortcut Action Object Placeholder Rigid Body Empty Driver Model Bone Folder Expression BoneToSpringConstraint Post Container Post Effect Post Effects Dynamic Constraint Spline Container Shortcut Hair Guide Hair Guide Shortcut Hair Guide CP Shortcut Dynamic Results Driver Bullet Rigid Body Bullet Soft Body Buffer Shortcut Composite Light Buffer Lights Translate Driver Time Channel Choreographies Guide CP Hair System Light Buffers Rotoscope Container Streak Emitter Bias Driver Guide Hair Info Tree Object List Action Object Light List Action Object Light Lists Dummy Buffer Shortcut Camera Dummy Buffer Shortcut Sound Info Node Fluid Emitter Group Constraint

Off the top of my head I can't say I know what a 'Sensor' is, although it likely has something to do with Motion Capture. Come to think of it... there are more than a few things in that list I should investigate.
-
I'm adding this topic because I'm sure to forget to explore it further... Filters are very useful for quickly finding resources in the Project Workspace (PWS). Also very useful is that we can save those Filters in the Project file. Inside the file itself (the text of the file, that is) a filter appears like this:

Name=Filter for Models
Model

At least initially the list is ordered by when each resource was opened in the file, so it can be handy to have some filters open as assets are added to a Project. In this way they can be more easily managed, renamed (by saving), etc. A downside of using Filters (besides the fact that they are hard to find and therefore easily forgotten) is that crashes can ensue while using them. We'll need to investigate that.
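To make that concrete, here's a rough sketch of how a saved filter that combines the fields from the earlier post might sit in the Project file's text. The filter name and the search term 'Hero' are my own made-up examples, and the exact field order hasn't been checked against a real saved file; only the Name=/SearchString=/LinkString= fields and the trailing object type line come from what the posts above quote.

Name=Filter for Hero Models
SearchString=Hero
SearchString1=
LinkString1=0
SearchString2=
LinkString2=1
Model

In other words, the search fields narrow the filter by name while the trailing type lines (here just Model) restrict which object types show up in the PWS.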
-
I need to delve more deeply into the versioning aspect of Dat, because I can sense some usefulness and power in its ability to move through time, restoring to any published point. This makes me wonder how applications process Dat files, because if Animation:Master could directly access those revisions... good grief, that'd be awesome. I'd have a mini test for that, but I know for a fact that A:M's internal browser is not Dat compliant.
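As a rough illustration of what "directly accessing those revisions" could look like, here's a small sketch using the DatArchive API that the Beaker browser exposes to pages it loads (so it runs inside Beaker, not inside A:M). The dat:// address, the version number, and the model file path are hypothetical placeholders, and the exact API surface should be checked against Beaker's documentation rather than taken from me.

// Sketch: read an older revision of a model file from a Dat archive.
// Assumes Beaker's in-browser DatArchive API; the URL, path, and
// version number below are made-up placeholders.
async function readOldRevision() {
  const archive = new DatArchive('dat://<archive-key>')

  // history() lists changes to the archive; useful for picking a version.
  const changes = await archive.history({ reverse: true })
  console.log(changes.slice(0, 10))

  // checkout() is assumed here to return a read-only view of the
  // archive pinned at that version.
  const atVersion12 = archive.checkout(12)
  const oldModel = await atVersion12.readFile('models/character.mdl', 'utf8')
  console.log('version 12 copy is', oldModel.length, 'characters long')
}

The interesting part is that nothing special has to be stored to make this possible: every published change is already part of the archive's append-only log, which is what gives Dat its built-in versioning.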
-
I'm not entirely sure what the difference between a fake webserver and a real one is, outside of the fact that a real webserver will likely be considered one that is dedicated to that job. Purists will likely go a step further and say the server needs to reside exclusively on a computer set aside for that purpose. As far as "very limited ability" goes... if the local (fake) webserver has everything required for two or more people to share data directly anywhere in the world at no additional cost, that limited capability may suffice.

Humorous aside: I recall many years ago going into a Best Buy computer store (or similar franchise) and asking a store assistant if they had any computers set up as servers, as I wanted to buy one. The guy looked at me like I was from another planet. His take was that all the computers in the store could be set up as servers. I didn't want a computer that *could* be set up as a server. I just wanted to buy a server. Needless to say, I left that day without buying one. (In case it isn't obvious, I'm laughing at myself and my naivety and not at that poor computer guy.)
-
Third party servers can certainly be useful, but in P2P systems a primary strength is not in storage but in the transfer of bits and bytes between the peers/clients. And in distributed systems permanency may not be required nor desired. We may want to simply share a file once, especially if it is a work in progress. Permanency can easily equate to outdated data.

It is interesting to note that in distributed systems permanency is a byproduct. This relates to an observation made years ago when people were losing their A:M models and someone would step out of the shadows and say... "Here it is. A couple years ago you gave me a copy." That's one example of (unintentional) permanency achieved through peer-to-peer distribution, although in that particular scenario not something to rely on heavily. That might be a good test case, however: setting up a P2P system of P2P backup. This might be a source of income for the local user, who would be incrementally reimbursed for maintaining distributed data. In turn they would hold a copy of the encrypted data (which not even they could access). But I digress... (My point there is that permanency through distributed backups is entirely viable.)

I would have to look to see in what capacity XAMPP might be viable. The downside, I must assume, is that (as originally designed for local testing) it intentionally turns off security features. That might not be a problem but needs to be considered. At a quick glance I'd say that wherever FTP-style file sharing/serving were required XAMPP might be ideal because... as far as I can tell... that is how peers would access each other's local servers: via an FTP client. If serving up HTML and apps, I would think that better suited to a browser, and I don't see where XAMPP can be leveraged by a browser. I really don't know enough about XAMPP to speculate further.
-
Fuchur, I will have to assume you are not interested in assisting with this particular test. I do thank you for your interest and for the related information regarding WebRTC. I very much enjoyed looking into WebRTC and especially how Dat.js interfaces with it. If you are in fact interested in testing the Dat protocol, I will be most appreciative.
-
Here's a useful video that covers some of the capabilities that other approaches don't have: xhttps://www.youtube.com/watch?v=Bem9nRpyPEs Added: I hadn't confirmed it before, but I have it from the source now that the Beaker Browser is a Chromium fork, so that is where it gets the majority of its browser code.
-
Continuing on... relative to security and Dat: (Disclaimer: this is very likely going too far into the weeds for most folks, but it is interesting nevertheless.) A response addressing that very question suggests the following: This addresses several areas of security, as 'secure' means different things to different people.
-
Additionally, WebRTC still requires a server, and even the browsers that might lean toward being that server don't provide it out of the box. Chrome appears to be a good example of this, where an extension must be added in order to get that functionality and operate via WebRTC. As far as I can tell the Beaker browser is the only browser built from the core with this functionality in mind. It is built from the core of Chromium, so its next of kin is Chrome and similarly coded browsers. Security-wise, the code of the Beaker Browser can be readily examined, whereas the extensions required for WebRTC more often than not... cannot. From an interview with the author of the Beaker browser:
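To illustrate the "still requires a server" point above, here's a minimal browser-side sketch. The WebRTC calls are the standard ones; the sendToOtherPeer() function is a hypothetical stand-in for whatever signaling server you'd have to run, which is exactly the piece the browser does not provide on its own.

// Minimal sketch: even a direct WebRTC data channel needs an
// out-of-band signaling step to exchange the offer/answer.
function sendToOtherPeer(desc: RTCSessionDescription | null): void {
  // Hypothetical: e.g. POST the description to a small signaling server.
  console.log('would send via signaling server:', desc)
}

const pc = new RTCPeerConnection()
const channel = pc.createDataChannel('files')
channel.onopen = () => channel.send('hello, peer')

pc.createOffer()
  .then(offer => pc.setLocalDescription(offer))
  .then(() => sendToOtherPeer(pc.localDescription))

Once the answer comes back (again through that signaling channel) the peers can talk directly, which is the part that feels serverless.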
-
Here's a near equivalent that uses WebRTC. Interestingly, after being awarded some recognition it hasn't seen development in approximately 4 years. That might be largely because it does exactly what it advertises to do. http://www.peer-server.com/ A relevant part of the description applies to the Dat approach as well: *emphasis added
-
It wouldn't make much sense not to be online. I can't think of any cases where P2P would not be online. BUT this approach does account for being offline as well. It just waits until the next time it can be compared to the other seeds and then updates to get back in sync (as allowed). The immediate benefit of this is something of a built-in versioning system. This is a requirement because, technically, every offline server might have a different version of the same data. The resync then gets everything back up to date. Again, all this assumes the user actually wants to be up to date.

I fear your hopes are dashed with this technology, but you would also likely assess technologies such as BitTorrent as risky (and you would be right). The promise (which must be tested/proven) is that security must be built into the system. A risk here is not unlike risks with any data that traverses the internet. P2P is highly secure in the transfer of data, as that is encrypted. The area of concern is then while browsing the data, because anyone with the address can access the information associated with that Dat, and even those who cannot can root out other information. The security is found in that the address provides a key to encrypt and decrypt the data, and that address is not necessarily public. This is why I can access/read the data if I have the address, but neither I nor anyone else can where we do not. (A small sketch of this address-as-key idea follows at the end of this post.) This is not unlike email in a sense, in that I might launch an email to one or more persons and they might in turn launch it to others. There is always some risk that the ISPs used might try to read/interpret that data, but their primary incentive to do so would be based on what they believe they can do with that data.

In theory, at the small scale, no data need be sent beyond that of the address itself, which is then used to encode and decipher all subsequent traffic. That data stream could then be limited to a one byte data stream. There... intercept and decipher that! The smaller that footprint, the more secure the data. Additionally, the speed of that transfer is largely controlled by bandwidth (although other factors weigh in, such as compression algorithms that take advantage of previous and anticipated patterns). So, in theory again, if all that encryption were not enough, bad data could be mixed in with good data and those with the key would simply access the good data. The middleman would just have that mess of undecipherable data. This is the useful element related to blockchains, without the global overhead. But all of that is neither here nor there...

Regarding WebRTC, the developers of Dat respond thusly: So, Dat-js does use WebRTC, but it is deemed too limited at present for the purpose under consideration of having the browser be the server of local P2P distributed data. Regarding areas of security... any and all P2P activity can be deemed high risk, but risks should always be calculated and mitigated. Here's what the folks behind Dat are looking at with regard to security: https://docs.datproject.org/security Additional information can be found here: https://docs.datproject.org/ecosystem

Added: Should we need to standardize on a more highly supported browser such as Chrome, WebRTC would be a likely alternative. The downside: this would appear to be primarily used at the local network level, which defeats the broader goal of distributed P2P.
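Here's that small sketch. It is only meant to illustrate the point that the dat:// address doubles as the read key while the network only ever sees a one-way derivative of it. Dat itself derives its "discovery key" with a keyed BLAKE2b hash; the SHA-256 in this Node-flavored TypeScript snippet is just a stand-in for illustration, not the real algorithm.

// Conceptual sketch: peers announce a hash of the key, never the key itself.
import { createHash, randomBytes } from 'crypto'

const readKey = randomBytes(32)          // stands in for the dat:// address you share privately
const discoveryKey = createHash('sha256')
  .update(readKey)
  .digest('hex')                         // safe to announce to the swarm; cannot be reversed

console.log('share privately:', readKey.toString('hex'))
console.log('announce publicly:', discoveryKey)

So someone watching the network can see that two peers are interested in the same archive, but without the address itself they cannot decrypt, or even request, the readable content.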
-
One reason that many won't be jumping on the peer-to-peer bandwagon too soon is that it is hard to make money when people cut out the middleman. If I can trade goods and services directly with you, I don't need to contract with someone in the middle.

On the local level, a different set of technologies is on approach and you may have seen or even used it: projecting from one device to another. Some older computers aren't equipped with sufficient hardware, although the software (like Win 10) running on them likely is. Specifically, what I'm talking about here is how any device within range can be given access to my computer, and data can be shared back and forth as needed. This is especially useful with smart phones and tablets, where all I might need to do to sync some data on each of my devices is to be within close proximity.

The peer-to-peer model expands this basic idea across the internet, where the browser is the server. I can serve up (publish) any content I desire and others can access it as if they were getting it directly from me (because they are). I get a sense that the ISPs are very much in the know on where this may be heading, and that is why they seek more control in the realm of 'net neutrality': in a peer-to-peer world they are no longer dealing with the middlemen either, and their ability to control the flow of data from every computer in the world to every other computer in the world is constrained by a very large dose of freedom. In the interest of self-preservation, then, they have been moving to where they can legally meter and gauge the flow of information where costs are higher. Some of this makes sense while some of it does not, but that is largely because they have a clearer view of what is coming.
-
I'll play your game. Sure, of course it'd be worth it. Actually, the only way we will know if the powers that be estimate it to be worth it is if/when they implement it. For what it is worth, in my estimation moving to the next tier in licensing activation has better than a 50/50 chance. Someone behind the scenes with an eye on the bottom line would have to crunch the numbers to make that determination.
-
For my part, and in my testing, I'm trying to work in a way that supports the creation of storyboards, comics and film, and so am trying to adapt Script Lining a little to facilitate that. This is not entirely unlike the method Miyazaki used for his storyboards... and keep in mind that those storyboards WERE his script. Unlike most American storyboards, Japanese storyboards tend to be vertical in orientation. As such they follow more of a traditional film and script approach rather than the timeline popular in most digital software. This equates more than a little to why I think we like the Project Workspace listing as much as we do in A:M. In its vertical orientation we can easily see what resources are in our 'working script'.

In filmmaking, the process of 'breaking down the script' is different from Script Lining in that the lining of the script focuses on camera shots and what will be in front of the camera. Breaking down the script is where the resources needed in each of those shots are categorized so that production can proceed with the confidence that everything that is needed will be there. In collaborative efforts that breakdown is essential in passing the torch to the subject matter experts who will work their magic on costuming, props, stunt work, fx, sound fx, etc. The person or persons breaking everything down might use a different color to underline each element in the script. With Script Lining the standard color is said to be red... although I could see where multi-color setups could be useful, especially in the digital realm where we might want to specify different cameras. In the real world we might only have access to one camera... because renting or purchasing those isn't cheap... but with virtual cameras that's not a problem. An added benefit might be that by setting up different cameras for color-coded 'linings' we might be able to optimize use of Netrender by sending it Choreographies that use those dedicated cameras. That's all a bit too far afield for the subject matter at hand, but the possibilities are intriguing.
-
Sometimes you outsmart yourself... I was having a persistent error rendering out to a shared folder. Tried a bunch of different things. Most attempts would get 15 frames into a 30 frame sequence and then fail with an error in A:M. PNG, TGA, etc., same error. Dropped the numbering... Tried EXR... the first frame failed. Gah! Lo and behold... that external drive was full. Changed to a different partition that has lots of space... no issues.
-
Of interest, although perhaps a bit too obscure for most people, will be the following... The P2P browsing doesn't seem to fully support Windows shortcuts, which is very unfortunate. However the better path... and something that works great... is to use Symbolic Links (and/or Junctions) rather than shortcuts, as those do work within Beaker. What I'm working on at present is a test of a production pipeline that can be easily navigated through, and Symlinks are just the ticket for such a thing. And... even if not using any of this, it's good to know how Symlinks work in Windows so they can be used for other things. With Symlinks we can create substitutions so that multiple directories can all point to the same (or a similar) location, or we can use a short name for a very very VERY long one.

Added: I just did a test and this appears to work great for use with my RenderFolder concept, where A:M always points to the same location. I can then simply change the SymLink to point to the flavor of the day (i.e. proxy, lowrez, highrez).

Update: More testing. Works great! SymLinks can't be directly copied/pasted, but they can be renamed.
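For anyone who wants to try the RenderFolder idea, here's roughly what it looks like from a Windows command prompt. The folder names and paths are made-up examples; mklink /D creates a directory symbolic link (usually needs an elevated prompt) and mklink /J creates a junction, which doesn't.

:: Point a stable name at today's flavor of renders (example paths only).
mklink /D C:\AM\RenderFolder D:\Projects\MyFilm\renders\lowrez

:: To repoint it later, remove just the link (the target is untouched)
:: and recreate it aimed at the high-res output.
rmdir C:\AM\RenderFolder
mklink /D C:\AM\RenderFolder D:\Projects\MyFilm\renders\highrez

A:M only ever sees C:\AM\RenderFolder, so nothing inside the Project has to change when the link is repointed.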
-
Here's some related information from the Director's perspective: http://timidmonster.com/what-every-director-needs-to-know-before-shooting/ Translating this into our world (the world of digital animation where one person may wear all of the hats) the requirements don't really change although the shortcuts very often do. The bottom line here might be one of efficiency and breaking down our ideas into workable plans that get us from the idea to the finished presentation.
-
and here's a video walking through the process. You may want to watch this with the speed turned up: xhttps://www.youtube.com/watch?v=8n59e287Ix0
-
All you guys and gals knew this stuff already, but I'm slow and only recently read the memo... Script Lining is something I have long known needed to be present in the filmmaking pipeline, but I had never really seen it in the wild. Having now seen it I gain that sense of satisfaction that I knew it was there, but renewed frustration in that digital solutions still fall short of capturing its potential. For those that share my lack of knowledge concerning Script Lining, here's a good overview of the process and how it helps us break down a script to the point where we know where our cameras will be and what should be seen in the lens/renders. It can be a very useful method to move into the stage of Storyboarding, where boards will be used to explore the script visually to find the best possible way to tell the story. https://www.amyclarkefilms.com/blog/how-to-line-a-film-script

From the article:

Notes
- You draw your lines left to right in shooting order.
- Red is the standard colour used for a lined script.
- Sometimes multiple colours are used to indicate different shots, i.e. blue ink for single shots, green ink for cutaways, wild tracks taken by sound, etc.
- If a shot continues to another page an arrow is placed below the line and continues onto the next page.
- Lined scripts are also called Marked Up Scripts or MUS.
-
Perhaps for $99 that would include a second seat. The gain then would be that activation and management of licenses would be cheaper, because the users would do that themselves. This does beg a question, however. Let's say I have one seat... Hash Inc goes to the proposed scheme and for $99 offers a second seat and the ability to move the license to any two machines. All well and good thus far. But now I think... I sure would like to put A:M on that other laptop so my kid would stop interrupting me and could just use that laptop to create their own A:M things. Do I purchase a second $99 set of seats? I don't think there is currently an option (via the Hash Inc store) for quickly adding seats in this way. As 'access is in the license' I think the solution is fairly straightforward. Having Steve at Graphic Powers communicate to Jason how they get it done would help immensely. And I think they are both geographically close to each other in the Portland area.

So what we have in the wishing well at this moment in time might be:
$79 one seat locked to a single computer
$99 two seats with license managed via login
Additional seats to be negotiated.

This would get us back very close to how we operated back in the CD days, but without being reliant on the CD for authentication of licensing. Am I keeping up thus far?
-
As I was looking at it I began to think... maybe... But then... no. Very close in many respects. The primary difference is that with Beaker the browser is the server. Additionally there is no cloud... everyone is (or can be) a server. It is more like BitTorrent in this respect, but without some of its drawbacks. With cloud approaches (like ownCloud) a server needs to be set up (and this appears to be limited to Linux at present for ownCloud, although the clients can be Linux, Mac, Windows). I can see where something like ownCloud would be advantageous.

This makes me want to list some of the obstacles to using Dat systems... things its supporters are working to address. One is that it can be very hard to find things in a distributed system because there is no aggregation in the traditional sense. Perhaps an example of some early forays into that area would be DatProject and the curated list of Dat projects at AwesomeDat. But curation (as the name implies) takes time and much effort. I suspect a mechanism needs to be promoted that indexes a resource when it is first published. There is a basic mechanism there, but I'm not familiar enough with it to understand how it does or does not measure up.
-
There is another relative newcomer on the scene similar to the Beaker browser called Brave. (It's easy to find on the internet, though.) Brave is very similar to Beaker but more polished, and it is one of the first to incorporate monetization features in the form of a wallet. They are promoting that aspect through a Creators Referral Program that reportedly will be paying out 1 million tokens (for whatever that is worth) to get others to switch to the Brave browser. No, I am not signed up for the referral program. The main difference between Brave and Beaker is that Brave is blockchain enabled whereas Beaker is not (and does not plan to use it). And this may be where Beaker and Brave part company. The folks behind Beaker think that blockchain is fine, but Proof of Work carries with it too much unnecessary baggage.