Hash, Inc. Forums

Peer to Peer Resource Sharing? I need some testers.


Rodney


  • Admin

Are there any brave souls out there willing to help me test some peer to peer file sharing?

 

I'm looking for one or two people to help me test the basic process.

 

The underlying idea is that of distributed computing, where files are shared from my computer directly to anyone with the address, without the need for a middleman server. If they so desire, they can also share files as if they have their own server.

 

I'm leaning toward this approach for the next round of A:M Extras and other productions, but it really needs to be shown to work well first. ;)

 

I think I can only use Windows users at this time.

Later testing should expand to Mac and Linux.

 

For those too shy to post their interest here, please feel free to email me at: rodney.baker@gmail.com

  • Admin
Wouldn't a Dropbox location be simpler?

 

No sir, not unless Dropbox has changed significantly of late.

I don't have a point-by-point comparison to provide, but perhaps the root difference is that Dropbox is maintained on a third-party server.

This would provide a means to securely host and share with only two computers and, I suppose, technically at least one ISP (because the computers need to connect via the internet).

There are some other technologies coming online that will handle local networks, but I want something that I can host on my computer and that others can access directly.

Other parties don't have to do the same thing, although they can.

 

 

A key element is Distributed Dataset Synchronization and Versioning (DAT)

Here's a whitepaper on it from May 2017: https://github.com/datproject/docs/blob/master/papers/dat-paper.pdf

 

Having said this, there are also ways to publish this data in much the same way HTTP does, and files can always be mirrored on sites like Dropbox.

Third-party servers would be the fallback but not the environment for the creative side of things.

 

For those who want to investigate, the easiest way to get up to speed is to download the Beaker Browser and follow the instructions on how to set up a DAT server.

 

https://beakerbrowser.com/

 

For more info see: https://beakerbrowser.com/docs/


Any info on your p2p resource sharing you care to share in this thread would be of interest to me.

 

I planned on sharing projects and resources in this way or something similar. I have a TB of every A:M (or related) resource I could find, download, purchase, create or somehow obtain between 2002 and 2015 (i.e. models, materials, images, tools, plugins, tutorials, mocap files, reference material/videos/images, etc.)... all organized and sorted in folders. This is the 'resource pool' for my... well, kind-of want-to-be company Virtualmedia Studios (www.virtualmediastudios.com). This effort went 'on hold' several years ago as I slowly and painfully concluded that I am not really an 'entrepreneur'... no, having to think about stuff like 'accounting' does not appeal much to me. I'm just restarting the effort now.

 

The animation assembly line I built (all based around A:M) was, IMHO, pretty cool. For instance, all the resources mentioned above are reflected in A:M's library (a program runs each night to update A:M's library files with any changes made to the resource pool). Hairdos are separate from 'hair color' and it is all interchangeable and 'plug and play'. Face maps (old, young, tan, pale, etc.) are interchangeable among character models. Everything below the neck is 'wardrobe': photo-realistic* human characters are only heads and are interchangeable with wardrobe outfits in a standardized, 'plug and play' (action-invoked) manner.

 

Whoops, sorry…I got myself started on a subject that I can talk about for days!

 

Cheers,
Rusty


  • Admin

Rusty,

By all means join in the fun.

If you download the Beaker Browser (link above) and enter the following address you should be able to see two models and one action:

 

dat://8ddfac3988fd0bd8c654af3785951f174d15c9baadb60e241a67590c698ed642

 

As soon as someone launches the Beaker browser they can also set up distributed secure servers of their own (and remove them when no longer needed).

 

What I would very likely do as a proof of concept is publish the data from the A:M DVD for others to fork to their own DAT servers, and somewhere in that mix there would be a place where we would mix all of our files.

For instance, I might render something out of A:M to a DAT Render Folder that other folks could immediately use in A:M.

Likewise, we could have a Master Project that everyone would add resources into.... a kind of 'one Project to rule them all' approach.

For testing purposes of course.

 

I was also thinking of running a challenge of sorts called '48 frames' where a group of A:M users all contribute in DAT time to fill those 2 seconds' worth of frames with (theoretically) high-quality content.

The underlying premise being that if 48 frames can be filled, then productions of larger size can be completed in similar ways.

 

I've experimented with a number of other things but right now I'm mostly interested in just making the first connection.

If there are problems with that then it doesn't matter much what we can dream up... it won't work anyway. ;)

 

DAT servers can serve up secure and secret transfers, so don't release in public any DAT address that you want to keep private!

 

Who knows... perhaps this could be the resurgence of that ages old plan to create an Animation:Master digital magazine.

That'd be sweet.

Those involved in creating the issues would create in private and then, when ready to publish, release the proper DAT address.


  • Admin

I'm currently researching how detrimental it is to serve DAT from external drives.

I know network drives take a hit because the data has to be written twice (presumably once to the main computer and then once to the remote client).

It seems to me that external drives might take this hit also.


  • Admin

Two (or more) A:M Users can collaborate without concern for their files being compromised.

This is because the URL (DAT Address) is said to be unguessable and the data itself is encrypted during transfer.

So, only those who have access to the address can access the data stream.

 

https://beakerbrowser.com/docs/tutorials/share-files-secretly.html


  • Admin

There is another relative newcomer on the scene, similar to the Beaker browser, called Brave.


(It's easy to find on the internet though)

Brave is very similar to Beaker but more polished, and it is one of the first to incorporate monetization features in the form of a wallet.
They are promoting that aspect through a Creators Referral Program that reportedly will be paying out 1 million tokens (for whatever that is worth) to get others to switch to the Brave browser.

No, I am not signed up for the referral program.

The main difference between Brave and Beaker is that Brave is blockchain-enabled whereas Beaker is not (and does not plan to be).
And this may be where Beaker and Brave part company. The folks behind Beaker think that blockchain is fine but that Proof of Work carries too much unnecessary baggage with it.


  • Admin
Maybe something like ownCloud? https://owncloud.org/

 

 

As I was looking at it I began to think.... maybe...

 

But then...

 

No.

 

Very close in many respects.

The primary difference is that with Beaker the browser is the server. Additionally, there is no cloud... everyone is (or can be) a server.

It is more like BitTorrent in this respect but without some of its drawbacks.

 

With cloud approaches (like ownCloud) a server needs to be set up (and this appears to be limited to Linux at present for ownCloud, although the clients can be Linux, Mac, or Windows).

 

I can see where something like ownCloud would be advantageous.

 

This makes me want to list some of the obstacles to using Dat systems... things its supporters are working to address.

One is that it can be very hard to find things in a distributed system because there is no aggregation in the traditional sense.

Perhaps an example of some early forays into that area would be DatProject and the curated list of DatProjects at AwesomeDat.

But curation (as the name implies) takes time and much effort.

I suspect a mechanism needs to be promoted that indexes a resource when it is first published.

There is a basic mechanism there but I'm not familiar enough with it to understand how it does or does not measure up.


  • Admin

Of interest, although perhaps a bit too obscure for most people, will be the following...

 

The P2P browsing doesn't seem to fully support Windows shortcuts, which is very unfortunate.

However, the better path... and something that works great... is to use symbolic links (and/or junctions) rather than shortcuts, as those do work within Beaker.

 

What I'm working on at present is a test of a production pipeline that can be easily navigated, and symlinks are just the ticket for such a thing.

 

And... even if not using any of this it's good to know how Symlinks work in Windows so they can be used for other things. :)

 

With symlinks we can create substitutions so that multiple directories can all point to the same (or a similar) location, or we can use a short name in place of a very, very, VERY long one.

 

 

Added: I just did a test and this appears to work great for use with my RenderFolder concept, where A:M always points to the same location. I can then simply change the symlink to point to the flavor of the day (i.e. proxy, lowrez, highrez). Update: More testing. Works great! Symlinks can't be directly copied/pasted, but they can be renamed.


  • Admin

Sometimes you outsmart yourself...

 

I was having a persistent error rendering out to a shared folder.

Tried a bunch of different things.

Most attempts would get 15 frames into a 30 frame sequence and then fail with an error in A:M.

PNG, TGA, etc. same error.

Drop the numbering...

Tried EXR... the first frame failed.

Gah!

 

Lo and behold... that external drive was full.

Changed to a different partition that has lots of space.... no issues.

 


Wouldn't a Dropbox location be simpler?

 

Not close to being equal... Dropbox is a cloud-based service (which in this case does not mean much more than uploading something to a server and making a download link available somewhere else) in which you really do not know where the data is going to be and who has access to it. P2P would mean a more direct approach, transferring data directly to the recipient. (That is technically not exactly true, since a transfer via the internet always means that there are other servers/routers involved in the process.)

 

Before you use Dropbox, at least use OneDrive (they at least have something to lose if they give out your data, because they have a lot of companies as customers who would not be amused at all if the data were leaked), or better, an FTP server at your favorite hoster (GoDaddy, Blue or whatever is good in your country... for Germany it would be 1&1, Strato or Netcup, for instance).

 

Best regards

*Fuchur*


  • Admin

One reason that many won't be jumping on the peer-to-peer bandwagon too soon is that it is hard to make money when people cut out the middleman.

If I can trade goods and services directly with you I don't need to contract with someone in the middle.

 

On the local level, a different set of technologies is approaching, and you may have seen or even used it: projecting from one device to another.

Some older computers aren't equipped with sufficient hardware, although the software (like Windows 10) running on them likely is.

Specifically, what I'm talking about here is how any device within range can be given access to my computer and data can be shared back and forth as needed.

This is especially useful with smartphones and tablets, where all I might need to do to sync some data on each of my devices is to be within close proximity.

 

The Peer to Peer model expands this basic idea across the internet where the browser is the server.

I can serve up (publish) any content I desire and others can access it as if they were getting it directly from me (because they are).

 

I get a sense that the ISPs are very much in the know on where this may be heading, and that is why they seek more control in the realm of 'net neutrality': in a peer-to-peer world they are no longer dealing with the middlemen either, and their ability to control the flow of data from every computer in the world to every other computer in the world is constrained by a very large dose of freedom. In the interest of self-preservation, then, they have been moving to where they can legally meter and gauge the flow of information where costs are higher. Some of this makes sense while some of it does not, but that is largely because they have a clearer view of what is coming.


The problem with P2P is most often that people need to be online for it. As with any of the old file-sharing systems (eMule, BitTorrent, etc.), this is a problem which cannot be overcome easily.

That is the thing that makes server-based systems "better". They are really always available. Nobody needs to power them on, restart them, etc.

They are just there to run, at least until some kind of maintenance is necessary, and even then they often do not need to be restarted.

 

What you are trying to do with the browser (which is already possible using WebRTC) is possible, but if you want to sync folders, for instance: that will not be possible for security reasons, and I hope that will not change anytime soon.

It would be a major problem if the browser were broken open then, because attackers could do close to anything, including listing files on your computer (=> a very bad thing).

 

Best regards

*Fuchur*


  • Admin
people need to be online for that

 

It wouldn't make much sense not to be online.

I can't think of any cases where P2P would not be online.

 

BUT

 

This approach does account for being offline as well.

It just waits for the next time it can be compared to the other seeds and then updates to get back in sync (as allowed).

The immediate benefit of this is something of a built-in versioning system.

This is a requirement where, technically, every offline server might have a different version of the same data.

The resync then gets everything back up to date.

Again, all this assumes the user actually wants to be up to date.

 

 

 

if you want to sync folders, for instance: that will not be possible for security reasons, and I hope that will not change anytime soon.

 

 

I fear your hopes are dashed with this technology, but you would also likely assess technologies such as BitTorrent as risky (and you would be right).

The promise (which must be tested/proven) is that security is built into the system.

A risk here is not unlike the risks with any data that traverses the internet.

P2P is highly secure in the transfer of data, as that is encrypted.

The area of concern is then while browsing the data, because anyone with the address can access the information associated with that Dat, and even those who cannot may still root out other information.

The security is found in that the address provides a key to encrypt and decrypt the data, and that address is not necessarily public.

This is why I can access/read the data if I have the address, but neither I nor anyone else can where we do not.

 

This is not unlike email, in a sense, in that I might launch an email to one or more persons and they might in turn launch it to others.

There is always some risk that the ISPs used might try to read/interpret that data, but their primary incentive to do so would be based on what they believe they can do with that data.

In theory, at the small scale, no data need be sent beyond that of the address itself, which is then used to encode and decipher all subsequent traffic.

That data stream could then be limited to a one-byte data stream. There... intercept and decipher that!

The smaller that footprint, the more secure the data.

Additionally, the speed of that transfer is largely controlled by bandwidth (although other factors weigh in, such as compression algorithms that take advantage of previous and anticipated patterns).

So, in theory again, if all that encryption were not enough, bad data could be mixed in with good data and those with the key would simply access the good data.

The middleman would just have that mess of undecipherable data.

This is the useful element related to blockchains, without the global overhead.

 

But all of that is neither here nor there...

 

 

Regarding WebRTC, the developers of Dat respond thusly:

 

WebRTC Usage Notes

Important: dat-js uses WebRTC, so it can only connect to other WebRTC clients. It is not possible for the dat-js library to connect directly to clients using other protocols. All other Dat applications use non-WebRTC protocols (see this FAQ for more info). Non-browser clients can connect to dats peer-to-peer via WebRTC modules, such as electron-webrtc, or use proxies via websockets, http, or other client-server protocols.

Due to WebRTC's less-than-stellar performance, Dat has focused on creating solid networking using other protocols. We may integrate WebRTC if performance improves and it becomes easier to run in non-browser interfaces (though we'd prefer using more performant options in the browser, if they develop).

 

 

So, dat-js does use WebRTC, but it is deemed too limited at present for the purpose under consideration of having the browser be the server of local P2P distributed data.

 

Regarding areas of security... any and all P2P activity can be deemed high-risk, but risks should always be calculated and mitigated.

Here's what the folks behind Dat are staring at with regard to security:

 

https://docs.datproject.org/security

 

Additional information can be found here:

 

https://docs.datproject.org/ecosystem

 

 

Added: Should we need to standardize on a more widely supported browser such as Chrome, WebRTC would be a likely alternative.

The downside: this appears to be used primarily at the local network level, which defeats the broader goal of distributed P2P.


  • Admin

Here's a near equivalent that uses WebRTC.
Interestingly, after being awarded some recognition, it hasn't seen development in approximately four years.
That might be largely because it does exactly what it advertises.

http://www.peer-server.com/

A relevant part of the description applies to the Dat approach as well:

We built PeerServer in 8 weeks for our Stanford senior project in Spring 2013. PeerServer is a peer-to-peer client server using WebRTC, where your browser acts as a server for other browsers across WebRTC peer-to-peer data channels. You can create a client-server within your browser tab, upload content, and generate dynamic content using a mock-database, templating system, and sessions. Any client browser that connects to your client server will behave as if it is talking to a traditional server while in fact exclusively hitting your server. This system allows you to quickly create a decentralized, short-lived web application where all the content lives within your browser. The traditional server only performs the initial handshake between the client-browsers and the client-server; your browser serves all other content peer-to-peer.




  • Admin

Additionally, WebRTC still requires a server and even the browsers that might lean toward being that server don't provide that out of the box.

Chrome appears to be a good example of this where an extension must be added in order to get that functionality and operate via WebRTC.

 

As far as I can tell the Beaker browser is the only browser built from the core with this functionality in mind.

It is built from the core of Chromium, so its next of kin is Chrome and similarly coded browsers.

 

Security-wise, the code of the Beaker Browser can be readily examined whereas the extensions required for WebRTC more often than not... cannot.

 

From an interview with the author of the Beaker browser:

 

Beaker is a participatory browser. It's a browser for indie hackers.

The Web is closed source. If you want to influence how social media works, you have to work at Facebook or Twitter. For search, Google. Control is in the hands of companies, rather than the users themselves.

With Beaker, we have a new Web protocol: the Decentralized Archive Transport. "Dat." It creates sites on demand, for free, and then shares them from the device. No servers required. That's our innovation.

 


  • Admin

Continuing on... relative to security and Dat:

 

(Disclaimer: This is very likely going too far into the weeds for most folks, but it is interesting nevertheless.)

 

A response addressing that very question suggests the following:

 

- Content integrity: The core data structure in Dat is an append-only ledger. The items in the ledger are stored in a Merkle tree using strong cryptographic hashes. This guarantees that data is not tampered with, and you can easily verify the data you're loading is exactly as intended by the author. The root hashes are signed by the private key of the author so that you can get future updates without having to manually update your root hashes through a side channel. Many technologies use similar content-addressable techniques (BitTorrent, IPFS, Git version control, blockchain ledgers, etc.).

- Semi-private datasets: The Dat URL *is* the public key. In Dat's network protocols, the public key is never sent across the wire (even in the DHT, only a hash of the public key is used for discovery) and all communication is strongly encrypted using a key derived from the public key. Therefore a client needs to get the public key via some other means (either DNS-over-HTTPS, as Beaker does, or through someone messaging you a Dat URL). In a way, it's what some services call a private resource. I call it semi-private because once the public key is shared, anyone with a copy *can* share it with other people and there is no way to control or stop such sharing.

- Pseudo-anonymity: Publishing a dataset only needs to expose your IP address if you seed the dataset directly. It's quite possible to seed the Dat to other servers and then have them keep the dataset available while your original machine goes offline completely. The only identifying information for the dataset is the public key, which has no way to track back to the person or machine that originally created the dataset. Typically this isn't a big concern, and most people share directly from their own machines as well as use services like hashbase.io to provide highly available datasets and HTTPS-to-Dat bridging.

 

This addresses several areas of security, as 'secure' means different things to different people.


Did you copy that from somewhere? I cannot write in there in a reasonable way...?

 

But let's try it without that:

I am talking about two persons (or more) communicating with each other. Of course at least one needs to be online* (the one who wants to download something), but do you want to be online all the time because someone in, let's say, Germany (a time difference of something like 9 hours) wants to download something, sync something, etc.? That is very unhandy, don't you think?

 

And the more interesting part is: to gain what, exactly? Waiting two days till someone is online again with the computer hosting the files you need? What are you trying to achieve with this technology that cannot be done more easily with an FTP server?

* Not sure if this is a language thing... by "needs to be online" I mean being connected to the internet at that very moment, not that there is an internet connection available at all to that person.

 

I fear your hopes are dashed with this technology, but you would also likely assess technologies such as BitTorrent as risky (and you would be right).

No, this is something very different. A browser needs to let a website (quite uncontrolled input until it is done and run) communicate with the user in very different ways, including scripting, and every day something is newly created with the technology of "tomorrow", etc.
That is the reason why EVERY browser and app (!= software/program) out there runs in a sandbox to secure the computer (whichever device) it runs on.

 

Standard software on Windows does not use a sandbox and thus allows access to nearly everywhere on the computer, etc.

This is all about security. I am not talking about a user saying "hey, use this folder" but about security structures built into the operating system, which is the most important one... we are not talking about sniffing data but about controlling the other person's computer as a whole (of course including encrypting the data on your computer and blackmailing you to decrypt it again, putting a lot of trojans on there, using your computer as a crypto-mining machine, using a bug to make your computer burn, and so on)... there is a lot of malware out there that you really do not want to catch just by visiting a website.

Doing only data transfer, like BitTorrent, is something different, because there is software that does it all by itself and in general does not run any scripts from unknown sources.

 

To make it short: you don't want that... this is really not about encrypting or using SSL or some other encryption technology to transfer data to someone else... this is extremely crucial to everything you do on the internet, and to your whole digital life.

 

//////////////////////////////////////////////////////////////////////////////

 

Saying WebRTC is not P2P is really just wrong, as long as you know whom you want to talk to (some kind of possibility to connect to the other party => IP addresses or whatever; but you will always need that in some way or other).

There is the possibility to use it as P2P or as P2S2P, depending on what you want.

 

I am not saying WebRTC is already perfectly developed, but FF and Chrome were used for Skype-like communication technology with each other quite a long time ago. It is an open standard, and at least FF does not need any kind of plugin or extension to do WebRTC. It would be news to me that Chrome needs one, but who knows... maybe Google did something on their own again...

 

//////////////////////////////////////////////////////////////////////////////

 

I have to admit that I cannot read much more of what you wrote (you have a tendency to write very extensively, my friend ;) ) and I need to work tomorrow, but I will surely go on reading it over the next few days.

 

Best regards

*Fuchur*


  • Admin

Here's a useful video that covers some of the capabilities that other approaches don't have:

 

https://www.youtube.com/watch?v=Bem9nRpyPEs

 

 

Added: I hadn't confirmed it before, but I have it from the source now that the Beaker Browser is a Chrome fork, so that is where it gets the majority of its browser code.


  • Admin

Fuchur,

I will have to assume you are not interested in assisting with this particular test.

I do thank you for your interest and for the related information regarding WebRTC.

I very much enjoyed looking into WebRTC, and especially how dat-js interfaces with it.

 

If you are in fact interested in testing the Dat protocol I will be most appreciative.


I can understand how it works, but I still don't really get what you want to achieve with it... why not use a server to do all that for you and make it permanent like that?

In short, Beaker, at least in this video, is doing not much more than creating a fake webserver on your machine with very limited abilities, as far as I can see...

 

Would that be the same as, for instance, XAMPP, but installing/using Beaker instead of something from Apache?

 

Best regards

*Fuchur*


  • Admin
why not use a server to do all that for you and make it permanent like that?

 

Third-party servers can certainly be useful, but in P2P systems a primary strength is not storage but the transfer of bits and bytes between the peers/clients.

And in distributed systems permanency may not be required nor desired.

We may want to simply share a file once especially if it is a work in progress.

Permanency can easily equate to outdated data.

 

It is interesting to note that in distributed systems permanency is a byproduct.

This relates to an observation made years ago when people were losing their A:M models and someone would step out of the shadows and say... "Here it is. A couple years ago you gave me a copy."

That's one example of (unintentional) permanency achieved through peer-to-peer distribution, although in that particular scenario not something to rely on heavily.

That might be a good test case, however: trying to set up a P2P system of P2P backup.

This might be a source of income for the local user who would be incrementally reimbursed for maintaining distributed data.

In turn they would hold a copy of the encrypted data (which not even they could access).

But I digress... (My point there is that permanency through distributed backups is entirely viable.)

 

Would that be the same as, for instance, XAMPP, but installing/using Beaker instead of something from Apache?

 

 

I would have to look to see in what capacity XAMPP might be viable.

The downside I must assume is that (as originally designed for local testing) it intentionally turns off security features.

That might not be a problem but needs to be considered.

 

At a quick glance I'd say that wherever FTP-style file sharing/serving were required, XAMPP might be ideal because... as far as I can tell... that is how peers would access each other's local servers: via an FTP client. If serving up HTML and apps, I would think that better suited to a browser, and I don't see where XAMPP can be leveraged by a browser.

 

I really don't know enough about XAMPP to speculate further.


  • Admin
beaker... is doing not much more than creating a fake webserver on your machine with very limited abilities

 

I'm not entirely sure what the difference between a fake webserver and a real one is, outside of the fact that a real webserver will likely be considered one that is dedicated to that job. Purists will likely see that going a step further, where the server needs to reside exclusively on a computer set aside for that purpose.

 

As far as very limited ability... if the local (fake) webserver has everything required for two or more people to share data directly anywhere in the world at no additional cost, that limited capability may suffice.

 

Humorous aside: I recall many years ago going into a Best Buy computer store (or similar franchise) and asking a store assistant if they had any computers set up as servers as I wanted to buy one. The guy looked at me like I was from another planet. His take was that all the computers in the store could be set up as servers. I didn't want a computer that *could* be set up as a server. I just wanted to buy a server. Needless to say, I left that day without buying one. (In case it isn't obvious I'm laughing at myself and my naivety and not that poor computer guy)


  • Admin

I need to delve more deeply into the versioning aspect of Dat because I can sense some usefulness and power in its ability to move through time, restoring to any published point.

 

This makes me wonder how applications process dat files because if Animation:Master could directly access those revisions... good grief that'd be awesome.

I'd have a mini test for that but I know for a fact that A:M's internal browser is not Dat compliant ;).


 


 

 

People often mistake a "webserver" for a whole different kind of machine than anything else.

Actually, a webserver is nothing other than a "normal" computer (a Raspberry Pi even, if you wish) which runs a certain combination of software. You can and should use hardware that is very durable, especially for permanent 24/7 usage, but in the end it can be anything.

 

XAMPP is a suite of software which includes very powerful stuff like the Apache webserver, MySQL databases, FTP software, a PHP interpreter, Perl support and more, which can be activated and deactivated as you wish.

XAMPP itself should not be used (as provided) for permanent hosting. That changes, however, if you change the default passwords for the software/services you want to use. XAMPP is mainly insecure because it uses default passwords so everybody can easily get started without setting it up. Change those and you should be fine (more or less).

 

So what you get with XAMPP is a full-blown webserver, which of course is by default created to share data with people all over the world. A browser which can be used as a server too is like trying to build a digger by starting with a spoon and attaching all the other stuff needed to make it a digger. Everything is possible, and of course you can do that, but I still do not see why someone should.

 

Best regards

*Fuchur*

