Hash, Inc. - Animation:Master

Fuchur

*A:M User*
  • Posts

    5,369
  • Days Won

    84

Everything posted by Fuchur

  1. Third-party servers can certainly be useful, but in P2P systems the primary strength is not storage but the transfer of bits and bytes between the peers/clients. And in distributed systems permanency may not be required nor desired: we may want to share a file only once, especially if it is a work in progress, and permanency can easily equate to outdated data. It is interesting to note that in distributed systems permanency is a byproduct. This relates to an observation made years ago when people were losing their A:M models and someone would step out of the shadows and say, "Here it is. A couple of years ago you gave me a copy." That is one example of (unintentional) permanency achieved through peer-to-peer distribution, although in that particular scenario not something to rely on heavily. It might be a good test case, however, to try to set up a P2P system of P2P backup. This could be a source of income for the local user, who would be incrementally reimbursed for maintaining distributed data; in turn they would hold a copy of the encrypted data (which not even they could access). But I digress... (my point is that permanency through distributed backups is entirely viable). I would have to look to see in what capacity XAMPP might be viable. The downside, I must assume, is that (as originally designed for local testing) it intentionally turns off security features. That might not be a problem, but it needs to be considered. At a quick glance I'd say that wherever FTP-style file sharing/serving were required, XAMPP might be ideal because, as far as I can tell, that is how peers would access each other's local servers: via an FTP client. If serving up HTML and apps, I would think a browser better suited, and I don't see where XAMPP can be leveraged by a browser. I really don't know enough about XAMPP to speculate further.

     People often mistake a "webserver" for a whole different kind of machine. Actually a webserver is nothing more than a "normal" computer (even a Raspberry Pi, if you wish) which runs a certain combination of software on it (see the first sketch after this list of posts). You can and should use hardware that is very durable, especially for permanent 24/7 usage, but in the end it can be anything. XAMPP is a suite of software which includes very powerful components such as the Apache webserver, MySQL databases, FTP software, a PHP interpreter, Perl support and more, which can be activated and deactivated as you wish. XAMPP itself should not be used (as provided) for permanent hosting. That changes, however, if you change the default passwords for the software/services you want to use: XAMPP is mainly insecure because it ships with default passwords so everybody can get started without setting anything up. Change those and you should be fine (more or less). So what you get with XAMPP is a full-blown webserver, which of course is by default built to share data with people all over the world. A browser which can be used as a server too is like trying to build a digger by starting with a spoon and attaching everything else needed to make it a digger to the spoon. Everything is possible and of course you can do that, but I still do not see why someone should. Best regards *Fuchur*
  2. I can understand how it works, but I still don't really get what you want to achieve with it... why not use a server to do all that for you and make it permanent that way? In short, Beaker, at least in this video, is doing not much more than creating a fake webserver on your machine with very limited abilities, as far as I can see... Would that be the same as, for instance, XAMPP, but by installing/using Beaker instead of something from Apache? Best regards *Fuchur*
  3. Did you copy that from somewhere? I cannot write in there in a reasonable way...? But let's try it without that: I am talking about two (or more) persons communicating with each other. Of course at least one needs to be online* (the one who wants to download something), but do you want to be online all the time because someone in, let's say, Germany (a couple of hours of time difference, in general something like 9 hours) wants to download or sync something? That is very unhandy, don't you think? And the more interesting part is: to gain what exactly? Waiting two days until someone is online again with the computer hosting the needed files? What are you trying to achieve with this technology that cannot be done more easily with an FTP server? * Not sure if this is a language thing... by "needs to be online" I mean being connected to the internet at that very moment, not that an internet connection is available to that person at all.

     No, this is something very different. A browser needs to let a website (quite uncontrolled input until it is downloaded and run) communicate with the user through very different mechanisms, including scripting, and every day something new is created with the technology of "tomorrow", etc. That is the reason why EVERY browser and app (!= software/program) out there runs in a sandbox, to secure the computer (whatever device) it runs on. Standard software on Windows does not use a sandbox and thus has access to nearly everything on the computer. This is all about security. I am not talking about a user saying "hey, use this folder" but about security structures built into the operating system, which is the most important part... we are not talking about sniffing data but about controlling the other person's computer as a whole (of course including encrypting the data on your computer and blackmailing you to have it decrypted again, putting a lot of trojans on there, using your computer as a crypto-mining machine, using a bug to make your computer burn and so on... there is a lot of malware out there you really do not want to catch just by visiting a website). Doing only data transfer, like BitTorrent, is something different, because there the software does it all by itself and in general does not run any scripts from unknown sources. To make it short: you don't want that... this is really not about encryption or using SSL or some other technology to transfer data to someone else... this is extremely crucial to everything you do on the internet, and in your whole digital life.

     ////////////////////////////////////////////////////////////////////////////// Saying WebRTC is not P2P is really just wrong, as long as you know who you want to talk to (some kind of possibility to connect to the other party, i.e. IP addresses or whatever; you will always need that in some way or another). There is the possibility to use it as P2P or as P2S2P, depending on what you want (see the second sketch after this list of posts). I am not saying WebRTC is already perfectly developed, but FF and Chrome have been able to do Skype-like communication with each other for quite a long time now. It is an open standard, and at least FF does not need any kind of plugin or extension to do WebRTC. It would be new to me that Chrome needs one, but who knows... maybe Google did something of their own again...

     ////////////////////////////////////////////////////////////////////////////// I have to admit that I cannot read much more of what you wrote (you have a tendency to write very extensively, my friend) and I need to work tomorrow, but I will surely go on reading it over the next days. Best regards *Fuchur*
  4. The problem with P2P is most often that people need to be online for it. As with any of the old file-sharing systems (eMule, BitTorrent, etc.), this is a problem which cannot be overcome easily. That is what makes server-based systems "better": they are really always available. Nobody needs to charge them, restart them, etc. They just run, at least until some kind of maintenance is necessary, and even then they often do not need to be restarted. What you are trying to do with the browser (which is already possible using WebRTC) is possible, but if you want to sync folders, for instance: that will not be possible for security reasons, and I hope that will not change anytime soon. It would be a major problem if the browser were broken open then, because an attacker could do close to anything, including listing the files on your computer (a very bad thing). Best regards *Fuchur*
  5. Not even close to being equal... Dropbox is a cloud-based service (which in this case does not mean much more than uploading something to a server and making a download link available somewhere else) where you really do not know where the data ends up and who has access to it. P2P would mean a more direct approach, transferring data directly to the recipient (technically not exactly true, since a transfer via the internet always means there are other servers/routers involved in the process). Before you use Dropbox, use at least OneDrive (they at least have something to lose if they give out your data, because they have a lot of companies as customers who would not be amused at all if the data were leaked), or better an FTP server at your favourite hoster (GoDaddy, blue or whatever is good in your country... for Germany it would be 1&1, Strato or netcup, for instance). Best regards *Fuchur*
  6. There are many things involved, like soft shadows, which are distributed across the passes and so on. If you do not use those either and really get rid of all the more advanced things, it might be the same, but I am really not sure about that. Is it just out of interest, or is there a specific reason you are asking about that? Best regards *Fuchur*
  7. Sounds like you are blocked by the Hash spam protection or something. You could try to send the information via the contact form at hash.com. Best regards *Fuchur*
  8. As you are changing your hardware, it will (very likely) no longer work and you will need to get the licence transferred by Jason. Did you change your laptop too (more RAM, an SSD or something like that)? Best regards *Fuchur*
  9. Are you using 3D models with thickness, or is it only a flat patch that is opening there? Best regards *Fuchur*
  10. Have fun and party well. Best regards *Fuchur*
  11. Cool start. You may want to have a look at my tutorials and prints, just in case you need some info somewhere: https://www.patchwork3d.de/3d-print-177-en Best regards *Fuchur*
  12. Just great. Best regards *Fuchur*
  13. Works for me too... Have a look into the HXT folder in your A:M installation folder. Are there any plugin files in there? Do they end in "_64.hxt" (if you are using the 64-bit version)? Best regards *Fuchur*
  14. As written above, he does not own AE. Best regards *Fuchur*
  15. Looks really great. Best regards *Fuchur*
  16. There we go... it really is that easy to do. Rodney explained it really nicely. Best regards *Fuchur*
  17. You can attach your master0.lic file to your request if you still have it. If not, you can use the download from the trial request page to create the host ID of your new computer, which may make it easier for Jason. https://www.hash.com/try-it-16-en Best regards *Fuchur*
  18. Great stuff, Kevin, very well done: it looks very nice and has great production quality. Best regards *Fuchur*
  19. Could you check whether you are using the SSE3 or the SSE4 version? It should be visible at Help > About A:M. Best regards *Fuchur*
  20. How much RAM is being used there? Is one of your RAM modules maybe failing?
  21. You will always need an internet connection for the activation. That is just how it works. You are getting your licence from the server in the end, and the activation code is just like a password to access it (technically not totally correct, but close enough to understand it). Best regards *Fuchur*
  22. Very well done, Robert. Congratulations everybody, and very nice entries. Best regards *Fuchur*
  23. Very cool! Congratulations! Best regards *Fuchur*
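
A minimal sketch to go with post 1 above, illustrating the point that a "webserver" is just ordinary software listening on a port of an ordinary computer. Node.js and TypeScript are used here purely for illustration (XAMPP bundles Apache, MySQL, PHP and an FTP server instead); the "./shared" folder name is a made-up example.

```typescript
// Hypothetical example: serve the contents of a local folder over HTTP.
// Run with Node.js (e.g. via ts-node); "./shared" is an arbitrary folder name.
import * as http from "http";
import * as fs from "fs";
import * as path from "path";

const ROOT = path.resolve("./shared"); // the folder you want to make reachable

const server = http.createServer((req, res) => {
  // Map the requested URL onto a file inside ROOT and refuse anything outside it.
  const file = path.join(ROOT, path.normalize(decodeURIComponent(req.url ?? "/")));
  if (!file.startsWith(ROOT) || !fs.existsSync(file) || fs.statSync(file).isDirectory()) {
    res.writeHead(404);
    res.end("Not found");
    return;
  }
  res.writeHead(200);
  fs.createReadStream(file).pipe(res); // stream the file to whoever asked for it
});

server.listen(8080, () => {
  console.log("Sharing ./shared at http://localhost:8080/");
});
```

The point is only that nothing here is exotic: XAMPP on a spare PC, a hosted FTP account, or a Raspberry Pi in the corner all come down to a machine that is switched on, reachable, and properly secured.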
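And a browser-side sketch to go with posts 3 and 4, showing why WebRTC is peer-to-peer for the data itself but still needs both peers online at the same time, plus some signalling path to exchange the offer/answer and ICE candidates. `sendSignal` is a stand-in for whatever that path is (a small server, a chat message, copy and paste); the STUN server URL is just a commonly used public one.

```typescript
// Sketch of the browser side of a WebRTC data connection (standard web APIs).
// sendSignal() is hypothetical: it represents however the two peers exchange
// the offer/answer and ICE candidates - WebRTC itself does not solve "how do I find you".
async function connectToPeer(sendSignal: (msg: string) => void): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // public STUN server for NAT traversal
  });

  // Once the connection is up, this channel can carry chat messages or file chunks.
  const channel = pc.createDataChannel("files");
  channel.onopen = () => channel.send("hello, peer");

  // Every ICE candidate also has to reach the other peer via the signalling path.
  pc.onicecandidate = (ev) => {
    if (ev.candidate) sendSignal(JSON.stringify(ev.candidate));
  };

  // Create and send the offer; the other peer must answer while it is online.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendSignal(JSON.stringify(offer));

  return pc;
}
```

The answering peer does the mirror image (setRemoteDescription, createAnswer), which only works if it is connected at that very moment - exactly the "someone has to be online" limitation discussed in posts 3 and 4.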