Everything posted by Rodney
-
I'm not entirely sure what the difference between a fake webserver and a real one is, outside of the fact that a real webserver will likely be considered one that is dedicated to that job. Purists will likely go a step further and say the server needs to reside exclusively on a computer set aside for that purpose. As for very limited ability... if the local (fake) webserver has everything required for two or more people to share data directly, anywhere in the world, at no additional cost, that limited capability may suffice. Humorous aside: I recall many years ago going into a Best Buy computer store (or similar franchise) and asking a store assistant if they had any computers set up as servers, as I wanted to buy one. The guy looked at me like I was from another planet. His take was that all the computers in the store could be set up as servers. I didn't want a computer that *could* be set up as a server. I just wanted to buy a server. Needless to say, I left that day without buying one. (In case it isn't obvious, I'm laughing at myself and my naivety, not at that poor computer guy.)
-
Third party servers can certainly be useful, but in P2P systems a primary strength is not in storage but in the transfer of bits and bytes between the peers/clients. And in distributed systems permanency may not be required nor desired. We may want to simply share a file once, especially if it is a work in progress. Permanency can easily equate to outdated data.

It is interesting to note that in distributed systems permanency is a byproduct. This relates to an observation made years ago when people were losing their A:M models and someone would step out of the shadows and say, "Here it is. A couple years ago you gave me a copy." That's one example of (unintentional) permanency achieved through peer-to-peer distribution, although in that particular scenario not something to rely on heavily. That might be a good test case, however: to set up a P2P system of P2P backup. This might be a source of income for the local user, who would be incrementally reimbursed for maintaining distributed data. In turn they would hold a copy of the encrypted data (which not even they could access). But I digress... (My point there is that permanency through distributed backups is entirely viable.)

I would have to look to see in what capacity XAMPP might be viable. The downside, I must assume, is that (as originally designed for local testing) it intentionally turns off security features. That might not be a problem but needs to be considered. At a quick glance I'd say that wherever FTP-style file sharing/serving were required, XAMPP might be ideal because... as far as I can tell... that is how peers would access each other's local servers: via an FTP client. If serving up HTML and apps, I would think that better suited to a browser, and I don't see where XAMPP can be leveraged by a browser. I really don't know enough about XAMPP to speculate further.
-
Fuchur, I will have to assume you are not interested in assisting with this particular test. I do thank you for your interest and for the related information regarding WebRTC. I very much enjoyed looking into WebRTC and especially how Dat.js interfaces with it. If you are in fact interested in testing the Dat protocol I will be most appreciative.
-
Here's a useful video that covers some of the capabilities that other approaches don't have: https://www.youtube.com/watch?v=Bem9nRpyPEs Added: I hadn't confirmed it before but I have it from the source now that the Beaker Browser is a Chrome fork, so that is where it gets the majority of its browser code.
-
Continuing on... relative to security and Dat: (Disclaimer: This is very likely going too far into the weeds for most folks but is interesting nevertheless.) A response addressing that very question suggests the following: . This addresses several areas of security, as 'secure' means different things to different people.
-
Additionally, WebRTC still requires a server (at minimum a signaling server to broker the initial connection between peers), and even the browsers that might lean toward being that server don't provide that out of the box. Chrome appears to be a good example of this, where an extension must be added in order to get that functionality and operate via WebRTC. As far as I can tell the Beaker browser is the only browser built from the core with this functionality in mind. It is built from the core of Chromium, so its next of kin is Chrome and similarly coded browsers. Security-wise, the code of the Beaker Browser can be readily examined, whereas the extensions required for WebRTC more often than not... cannot. From an interview with the author of the Beaker browser:
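To illustrate the point about WebRTC needing a server: even the standard browser API cannot connect two peers by itself. The offer/answer handshake has to travel over some prearranged channel. A minimal TypeScript sketch, where sendToPeerViaServer is a hypothetical stand-in for that channel (a WebSocket, an HTTP endpoint, etc.):

```typescript
// Minimal sketch of why WebRTC needs a signaling server.
// `sendToPeerViaServer` is a hypothetical stand-in for whatever
// channel (WebSocket, HTTP, etc.) actually relays the handshake.
declare function sendToPeerViaServer(msg: object): void;

async function startCall(): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // ICE candidates describe how to reach this machine; they still
  // have to travel to the other peer through the signaling server.
  pc.onicecandidate = (event) => {
    if (event.candidate) {
      sendToPeerViaServer({ type: "candidate", candidate: event.candidate });
    }
  };

  // The offer (session description) also goes through the server.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeerViaServer({ type: "offer", sdp: offer.sdp });

  return pc;
}
```

Until something fills the role of that relay, the two browsers simply have no way to find each other, which is the gap the extensions (or Beaker's built-in approach) are filling.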
-
Here's a near equivalent that uses WebRTC. Interestingly, after being awarded some recognition it hasn't seen development in approx. 4 years. That might be largely because it does exactly what it advertises to do. http://www.peer-server.com/ A relevant part of the description applies to the Dat approach as well: *emphasis added
-
It wouldn't make much sense not to be online. I can't think of any cases where P2P would not be online. BUT... this approach does account for being offline as well. It just awaits the next time it can be compared to the other seeds and then updates to get back in sync (as allowed). The immediate benefit of this is something of a built-in versioning system. This is a requirement where, technically, every offline server might have a different version of the same data. The resync then gets everything back up to date. Again, all this assuming the user actually wants to be up to date.

I fear your hopes are dashed with this technology, but you also likely would assess technologies such as BitTorrent as risky (and you would be right). The promise (which must be tested/proven) is that security must be built into the system. A risk here is not unlike risks with any data that traverses the internet. P2P is highly secure in the transfer of data, as that is encrypted. The area of concern is then while browsing the data, because anyone with the address can access the information associated with that Dat, and even those that cannot can root out other information. The security is found in that the address provides a key to encrypt and decrypt the data, and that address is not necessarily public. This is why I can access/read the data if I have the address but neither I nor anyone else can where we do not. This is not unlike email in a sense, in that I might launch an email to one or more persons and they might in turn launch it to others. There is always some risk that the ISPs used might try to read/interpret that data, but their primary incentive to do so would be based on what they believe they can do with that data.

In theory, at the small scale, no data need be sent beyond that of the address itself, which is then used to encode and decipher all subsequent traffic. That data stream could then be limited to a one byte data stream. There... intercept and decipher that! The smaller that footprint the more secure the data. Additionally, the speed of that transfer is largely controlled by bandwidth (although other factors weigh in, such as compression algorithms that take advantage of previous and anticipated patterns). So, in theory again, if all that encryption were not enough, bad data could be mixed in with good data and those with the key would simply access the good data. The middleman would just have that mess of undecipherable data. This is the useful element related to blockchains without the global overhead. But all of that is neither here nor there...

Regarding WebRTC, the developers of Dat respond thusly: So, Dat-js does use WebRTC but it is deemed too limited at present for the purpose under consideration of having the browser be the server of local P2P distributed data.

Regarding areas of security... any and all P2P activity can be deemed high risk, but risks should always be calculated and mitigated. Here's what the folks behind Dat are staring at with regard to security: https://docs.datproject.org/security Additional information can be found here: https://docs.datproject.org/ecosystem

Added: Should we need to standardize on a more highly supported browser such as Chrome, WebRTC would be a likely alternative. The downside: this would appear to be primarily used at the local network level, which defeats the broader goal of distributed P2P.
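As an aside, the principle I keep describing (the address doubles as the key, so only those holding it can read the traffic) can be sketched in a few lines. To be clear, this is a toy illustration of the idea, not Dat's actual scheme; the address, salt, and key derivation below are all hypothetical:

```typescript
// Toy illustration (NOT Dat's actual scheme): whoever holds the
// shared address/key can decrypt the stream; a middleman sees noise.
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "crypto";

// Hypothetical: derive a 256-bit key from the shared address string.
const address = "dat://8ddfac39..."; // only ever exchanged privately
const key = scryptSync(address, "demo-salt", 32);

function encrypt(plaintext: string): { iv: Buffer; data: Buffer; tag: Buffer } {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, data, tag: cipher.getAuthTag() };
}

function decrypt(msg: { iv: Buffer; data: Buffer; tag: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, msg.iv);
  decipher.setAuthTag(msg.tag); // tampered "bad data" fails this check
  return Buffer.concat([decipher.update(msg.data), decipher.final()]).toString("utf8");
}

const wire = encrypt("shared A:M model data");
console.log(wire.data.toString("hex")); // what a middleman would see
console.log(decrypt(wire));             // what a key-holder recovers
```

The authentication tag is also where the "mix bad data with good" idea shows up: anything a middleman injects or alters simply fails verification for the key-holders.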
-
One reason that many won't be jumping on the Peer to Peer bandwagon too soon is that it is hard to make money when people cut out the middle man. If I can trade goods and services directly with you, I don't need to contract with someone in the middle. On the local level, a different set of technologies is approaching and you may have seen or even used it: projecting from one device to another. Some older computers aren't equipped with sufficient hardware, although the software (like Win 10) running on them likely is. Specifically, what I'm talking about here is how any device within range can be given access to my computer, and data can be shared back and forth as needed. This is especially useful with smart phones and tablets, where all I might need to do to sync some data on each of my devices is to be within close proximity. The Peer to Peer model expands this basic idea across the internet, where the browser is the server. I can serve up (publish) any content I desire and others can access it as if they were getting it directly from me (because they are). I get a sense that the ISPs are very much in the know on where this may be heading, and that is why they seek more control in the realm of 'net neutrality': in a Peer to Peer world they are no longer dealing with the middle men either, and their ability to control the flow of data from every computer in the world to every other computer in the world is constrained by a very large dose of freedom. In the interest of self preservation, then, they have been moving to where they can legally meter and gauge the flow of information where costs are higher. Some of this makes sense while some of it does not, but that is largely because they have a clearer view of what is coming.
-
I'll play your game. Sure, of course it'd be worth it. Actually, the only way we will know if the powers that be estimate it to be worth it is if/when they implement it. For what it is worth, in my estimation moving to the next tier in licensing activation has better than a 50/50 chance. Someone behind the scenes with an eye on the bottom line would have to crunch the numbers to make that determination.
-
For my part, and in my testing, I'm trying to work in a way that supports the creation of storyboards, comics and film, and so am trying to adapt Script Lining a little to facilitate that. This is not entirely unlike the method Miyazaki used for his storyboards... and keep in mind that those storyboards WERE his script. Unlike most American storyboards, Japanese storyboards tend to be vertical in orientation. As such they follow more of a traditional film and script approach rather than the timeline popular in most digital software. This equates more than a little to why I think we like the Project Workspace listing as much as we do in A:M. In its vertical orientation we can easily see what resources are in our 'working script'.

In filmmaking, the process of 'breaking down the script' is different from Script Lining, in that the lining of the script focuses on camera shots and what will be in front of the camera. Breaking down the script is where the resources needed in each of those shots are categorized so that production can proceed with the confidence that everything that is needed will be there. In collaborative efforts that breakdown is essential in passing the torch to the subject matter experts who will work their magic on costuming, props, stunt work, fx, sound fx, etc. The person/persons breaking everything down might use a different color to underline each element in the script.

With Script Lining the standard color is said to be red... although I could see where multi-color setups could be useful, especially in the digital realm where we might want to specify different cameras. In the real world we might only have access to one camera... because renting or purchasing those isn't cheap... but with virtual cameras that's not a problem. An added benefit there might also be that by setting up different cameras for color coded 'linings' we might be able to optimize use of Netrender by sending it Choreographies that use those dedicated cameras. That's all a bit too far afield for the subject matter at hand but the possibilities are intriguing.
-
Sometimes you outsmart yourself... I was having a persistent error rendering out to a shared folder. Tried a bunch of different things. Most attempts would get 15 frames into a 30 frame sequence and then fail with an error in A:M. PNG, TGA, etc. same error. Drop the numbering... Tried EXR... the first frame failed. Gah! Lo and behold... that external drive was full. Changed to a different partition that has lots of space... no issues.
-
Of interest, although perhaps a bit too obscure for most people, will be the following... The P2P browsing doesn't seem to fully support Windows shortcuts, which is very unfortunate. However, the better path... and something that works great... is to use Symbolic Links (and/or Junctions) rather than shortcuts, as those do work within Beaker. What I'm working on at present is a test of a production pipeline that can be easily navigated through, and Symlinks are just the ticket for such a thing. And... even if not using any of this, it's good to know how Symlinks work in Windows so they can be used for other things. With Symlinks we can create substitutions so that multiple directories can all point to the same (or a similar) location, or we can use a short name in place of a very very VERY long path. Added: I just did a test and this appears to work great for use with my RenderFolder concept where A:M always points to the same location. I can then simply change the SymLink to point to the flavor of the day (i.e. proxy, lowrez, highrez). Update: More testing. Works great! SymLinks can't be directly copied/pasted but they can be renamed.
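For anyone wanting to script that "flavor of the day" switch rather than redoing it by hand (the manual equivalent is `mklink /J Link Target` at a command prompt), here's a sketch in TypeScript/Node. The folder names are hypothetical; adjust to your own layout:

```typescript
// Sketch: repoint a "RenderFolder" junction at a chosen flavor.
// Folder names are hypothetical; adjust to your own layout.
import * as fs from "fs";
import * as path from "path";

const LINK = "C:\\AM\\RenderFolder"; // the stable path A:M points at
const FLAVORS: Record<string, string> = {
  proxy: "C:\\AM\\Renders\\proxy",
  lowrez: "C:\\AM\\Renders\\lowrez",
  highrez: "C:\\AM\\Renders\\highrez",
};

function repoint(flavor: string): void {
  const target = FLAVORS[flavor];
  if (!target) throw new Error(`Unknown flavor: ${flavor}`);

  // Remove the old junction if present (this deletes only the link,
  // not the contents of the directory it points to).
  try { fs.rmdirSync(LINK); } catch { /* no existing link */ }

  // "junction" works on Windows without administrator rights.
  fs.symlinkSync(target, LINK, "junction");
  console.log(`RenderFolder -> ${path.normalize(target)}`);
}

repoint(process.argv[2] ?? "proxy");
```

Junctions were the safer choice here because (unlike `mklink /D` symbolic links) they don't require elevated privileges to create.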
-
Here's some related information from the Director's perspective: http://timidmonster.com/what-every-director-needs-to-know-before-shooting/ Translating this into our world (the world of digital animation where one person may wear all of the hats) the requirements don't really change although the shortcuts very often do. The bottom line here might be one of efficiency and breaking down our ideas into workable plans that get us from the idea to the finished presentation.
-
and here's a video walking through the process. You may want to watch this with the speed turned up: https://www.youtube.com/watch?v=8n59e287Ix0
-
All you guys and gals knew this stuff already but I'm slow and only recently read the memo... Script Lining is something I have long known needed to be present in the filmmaking pipeline, but I had never really seen it in the wild. Having now seen it, I gain that sense of satisfaction that I knew it was there, but renewed frustration in that digital solutions still fall short of capturing its potential. For those that share my lack of knowledge concerning Script Lining, here's a good overview of the process and how it helps us break down a script to the point where we know where our cameras will be and what should be seen in the lens/renders. It can be a very useful method to move into the stage of Storyboarding, where boards will be used to explore the script visually to find the best possible way to tell the story. https://www.amyclarkefilms.com/blog/how-to-line-a-film-script From the article:

Notes
- You draw your lines left to right in shooting order.
- Red is the standard colour used for a lined script.
- Sometimes multiple colours are used to indicate different shots, i.e. blue ink for single shots, green ink for cutaways, wild tracks taken by sound, etc.
- If a shot continues to another page an arrow is placed below the line and continues onto the next page.
- Lined scripts are also called Marked Up Scripts or MUS.
-
Perhaps for $99 that would include a second seat. The gain then would be that activation and management of licenses would be cheaper because the users would do that. This does beg the question however... let's say I have one seat. Hash Inc goes to the proposed scheme and for $99 offers a second seat and the ability to move the license to any two machines. All well and good thus far. But now I think... I sure would like to put A:M on that other laptop so my kid would stop interrupting me and they could just use that laptop to create their own A:M things. Do I purchase a second $99 set of seats? I don't think there is currently an option (via the Hash Inc store) for quickly adding seats in this way. As 'access is in the license' I think the solution is fairly straightforward. Having Steve at Graphic Powers communicate how they get it done to Jason will help immensely. And I think they are both geographically close to each other in the Portland area. So what we have in the wishing well at this moment in time might be:

$79 one seat locked to a single computer
$99 two seats with license managed via login
Additional seats to be negotiated.

This would get us back very close to how we operated back in the CD days but without being reliant on the CD for authentication of licensing. Am I keeping up thus far?
-
As I was looking at it I began to think... maybe... But then... No. Very close in many respects. The primary difference is that with Beaker the browser is the server. Additionally there is no cloud... everyone is (or can be) a server. It is more like BitTorrent in this respect but without some of its drawbacks. With cloud approaches (like ownCloud) a server needs to be set up (and this appears to be limited to Linux at present for ownCloud, although the clients can be Linux, Mac, Windows). I can see where something like ownCloud would be advantageous. This makes me want to list some of the obstacles to using Dat systems... things its supporters are working to address. One is that it can be very hard to find things in a distributed system because there is no aggregation in the traditional sense. Perhaps an example of some early forays into that area would be DatProject and the curated list of Dat projects at AwesomeDat. But curation (as the name implies) takes time and much effort. I suspect a mechanism needs to be promoted that indexes a resource when it is first published. There is a basic mechanism there but I'm not familiar enough with it to understand how it does or does not measure up.
-
There is another relative newcomer on the scene, similar to the Beaker browser, called Brave. (It's easy to find on the internet though.) Brave is very similar to Beaker but more polished, and it is one of the first to incorporate monetization features in the form of a wallet. They are promoting that aspect through a Creators Referral Program that reportedly will be paying out 1 million tokens (for whatever that is worth) to get others to switch to the Brave browser. No, I am not signed up for the referral program. The main difference between Brave and Beaker is that Brave is blockchain enabled whereas Beaker is not (and does not plan to use it). And this may be where Beaker and Brave depart company. The folks behind Beaker think that blockchain is fine but Proof of Work carries with it too much unnecessary baggage.
-
Two (or more) A:M users can collaborate without concern for their files being compromised. This is because the URL (Dat address) is said to be unguessable and the data itself is encrypted during transfer. So, only those who have access to the address can access the data stream. https://beakerbrowser.com/docs/tutorials/share-files-secretly.html
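A rough sketch of what that looks like using Beaker's built-in DatArchive API, based on my reading of the tutorial above (the file name and titles here are just placeholders):

```typescript
// Type stub: inside Beaker, DatArchive is a built-in global.
declare const DatArchive: any;

async function shareSecretly(): Promise<void> {
  // Creating an archive generates a new unguessable dat:// URL.
  const archive = await DatArchive.create({
    title: "A:M shared project",
    description: "Files shared between two A:M users",
  });

  // Write a file into the archive; any peer holding the URL can read it.
  await archive.writeFile("/models/readme.txt", "work in progress");

  // The URL is the secret: share it privately and it acts as the
  // read capability for the encrypted data stream.
  console.log(archive.url); // dat://<64 hex characters>
}
```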
-
Files can be kept in sync by 'listening' for changes to directories or files. This notifies everyone that they need to refresh their content after a file is updated. https://beakerbrowser.com/docs/tutorials/listen-for-file-changes.html
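Again a sketch based on my reading of that tutorial; the method names are from the Beaker docs as I understand them, so double-check them there, and the watched path is hypothetical:

```typescript
// Type stub: inside Beaker, DatArchive is a built-in global.
declare const DatArchive: any;

async function followChanges(datUrl: string): Promise<void> {
  const archive = new DatArchive(datUrl);

  // Watch a path (or pattern) for changes. Each 'changed' event is
  // the cue to re-fetch the file and refresh whatever displays it.
  const events = archive.watch("/renders/*.png");
  events.addEventListener("changed", async ({ path }: { path: string }) => {
    console.log(`${path} was updated; refreshing...`);
    const latest = await archive.readFile(path, "binary");
    // ...hand `latest` to whatever is consuming the renders.
  });
}
```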
-
I'm currently researching how detrimental it is to serve DAT from external drives. I know network drives take a hit because the data has to be written twice (presumably once to the main computer and then once to the remote client). It seems to me that external drives might take this hit also.
-
Wondering out loud... How many folks would pay $99 a year for the ability to move Animation:Master around from computer to computer (i.e. activate and deactivate A:M as the need arises, moving the license just by logging in to a website and adjusting it)? I'm not suggesting this would be feasible (or even particularly possible), and very likely it would not be, but I still cannot help but wonder... would that be worth an additional $20 per year?
-
That would depend on what software provider they use. If they are using what I think they are, that might not be the case. Here are some general guidelines: While there are certainly no guarantees in this guessing game, that second bullet point would likely have us covered. There are a lot of assumptions here... such as sticking with the same company. Moving to a different license manager could be expected to break stuff. But that's where you maintain the old, add the new, and wait until all the users under the old scheme expire.
-
Rusty, By all means join in the fun. If you download the Beaker Browser (link above) and enter the following address you should be able to see two models and one action: dat://8ddfac3988fd0bd8c654af3785951f174d15c9baadb60e241a67590c698ed642

As soon as someone launches the Beaker browser they can also set up distributed secure servers of their own (and remove them when no longer needed). What I would very likely do as a Proof of Concept would be to publish the data from the A:M DVD for others to Fork to their own DAT servers, and somewhere in that mix there would be a place where we would mix all of our files. For instance, I might render something out of A:M to a DAT Render Folder that other folks could immediately use in A:M. Likewise, we could have a Master Project that everyone would add resources into... a kind of 'one Project to rule them all' approach. For testing purposes of course.

I was also thinking of running a challenge of sorts called '48 frames' where a group of A:M users all contribute in DAT time to fill those 2 seconds worth of frames with (theoretically) high quality content. The underlying premise being that if 48 frames can be filled then productions of larger size can be completed in similar ways.

I've experimented with a number of other things but right now I'm mostly interested in just making the first connection. If there are problems with that then it doesn't matter much what we can dream up... it won't work anyway. DAT servers can serve up secure and secret transfers, so don't release any DAT address out in public that you want to keep private! Who knows... perhaps this could be the resurgence of that ages old plan to create an Animation:Master digital magazine. That'd be sweet. Those involved in creating the issues would create in private and then, when ready to publish, the proper DAT address would be released.
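If you want to try the forking step I mentioned, Beaker's DatArchive API has a fork call. A sketch (API names from the Beaker docs as I understand them; the source URL is the test address above, and the file path is just a placeholder):

```typescript
// Type stub: inside Beaker, DatArchive is a built-in global.
declare const DatArchive: any;

async function forkSharedData(): Promise<void> {
  const source =
    "dat://8ddfac3988fd0bd8c654af3785951f174d15c9baadb60e241a67590c698ed642";

  // Fork copies the archive into a new one that you own (and can
  // write to), with its own brand-new dat:// address.
  const myCopy = await DatArchive.fork(source, {
    title: "My fork of the test archive",
  });

  // List what came across, then add your own contribution.
  console.log(await myCopy.readdir("/"));
  await myCopy.writeFile("/contributions/readme.txt", "my renders go here");
  console.log(myCopy.url); // share this new address with the group
}
```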