Everything posted by Fuchur
-
I have to admit that I never used Win10 Home at all, but as far as I can see you really do not need Pro at the moment, as long as you do not need to join a domain group (in general those are functions only needed in businesses to easily maintain larger numbers of desktop computers) or want to use heavily encrypted file systems, etc. In short: I think you will be fine with Windows 10 Home edition. Best regards *Fuchur*
-
Hey Silvien, if you have any questions or need advice, let us know. And of course: keep us updated on your progress. Best regards *Fuchur*
-
AntiVir doesn't have a problem with that either... it is Norton. They often create trouble like that with their suite, since they use a whitelist for a lot of things, and if they do not know something, it is automatically considered bad. Best regards *Fuchur*
-
I am just guessing here, but this is what I would try: first re-download the file and try again. If that does not work, shut down Norton, install, and then restart your computer (to start Norton again). Best regards *Fuchur*
-
Using Terrain plug-in as a pose?
Fuchur replied to R Reynolds's topic in Work In Progress / Sweatbox
You can give any material a Displacement Map or Bump Map value. If you use it (located in the material's own properties), the whole material will be used as a bump/displacement map value, no longer for color. You can then just copy that material (save as > "name", embed it again, then import the saved one and embed it under a different name) and get rid of the displacement value there again. See the screen capture I made about it below. That should be what you are after, right? Best regards *Fuchur* projection_map_displacement.mp4 -
I just tried it the other way around (first putting on the path constraint, then the AimAt) and it worked. But what might get in your way is the "Compensate Mode". You have to turn that off before applying the AimAt constraint if you do not want the current rotation offset to be kept. You can find it in the top toolbar next to the magnet mode button. So when you apply the AimAt constraint, have a look at that button before picking the aim target. It needs to be off. Best regards *Fuchur*
-
You might want to have a look at this: - https://www.patchwork3d.de/am-to-directx-63-en - https://www.patchwork3d.de/am-to-unity3d-64-en And of course nemyax's very nice Blender plugin is a great way too. Best regards *Fuchur*
-
As far as I know, OBJ does not support bones. If you want to export as much as you can from A:M, use DirectX (*.x) files. OBJ is more commonly used, but it will not hold bone animations, etc. Best regards *Fuchur*
-
On the fence, and a couple questions about A:M
Fuchur replied to TwoCatsYelling's topic in New Users
Welcome to the show Twocats . Best regards *Fuchur* -
Don't know exactly, but can't you open the model in v18? Anyway: it might have to do with missing A:M Stuff. You can download it here: ftp://ftp.hash.com/pub/updates/windows/Am2006/ Best regards *Fuchur*
-
Here is a very fun and quite astonishing project by Microsoft: https://www.microsoft.com/en-us/research/blog/project-zanzibar-blurring-distinction-digital-physical-worlds-via-tangible-interaction-portable-implementation/
-
A character I started working on yesterday
Fuchur replied to jirard's topic in Work In Progress / Sweatbox
I just like the style of your models... can't help myself . Best regards *Fuchur* -
I'd say it is not a bug but intentional. By default it uses the most likely thing you would want to do in a given situation. In modelling mode a rotoscope or a decal is very likely. In a chor a layer or a rotoscope is likely. And in an action it is only a rotoscope. This is especially true since the decal is always saved with the model, not with the action or pose itself. Very likely the ability to add a stamp in an action (which is very useful of course) was added later on. I'd say a feature request would be good, but it is likely not a bug. And I definitely do not see a reason to remove one of the methods. I use both, and I never stumbled over that, since in each situation I intuitively do what is already programmed, which tells me someone thought about it very well. Best regards *Fuchur*
-
I stumbled over that a few times already... I think it has to do with the mode you are in. Modelling mode seems to ignore the setting altogether, in chors it is taken into account sometimes, etc. I'd say we should post it as a feature request to really take it into account as the button would suggest (maybe with the default set to off, so it behaves the way most people are already used to in A:M)? Best regards *Fuchur*
-
Especially for displacement maps I like EXR (with 32 bits per channel) very much, because there are just "more" color steps in between, which makes it less "steppy". I am not sure about the glitch... maybe it would be worth a try to use a transparency gradient at the start and end of the image? Best regards *Fuchur*
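To illustrate the "more color steps" point, here is a minimal sketch in plain TypeScript (nothing A:M-specific; the 10 cm displacement range is just an assumed example value) showing why an 8-bit map snaps heights to visible steps while a 32-bit float EXR channel does not:

```ts
// Why an 8-bit displacement map looks "steppy" compared to a 32-bit float EXR channel.
// The 10 cm range below is an assumed example value, not anything taken from A:M.
const displacementRange = 10.0; // total displacement covered by the map, in cm (assumption)

// An 8-bit channel only has 256 levels, so the smallest height step it can store is:
const step8bit = displacementRange / 255;
console.log(`8-bit step size: ${step8bit.toFixed(4)} cm`); // ~0.0392 cm between neighbouring levels

// A 32-bit float channel is effectively continuous for this purpose:
const height = 3.14159; // some in-between height in cm
const quantized8 = (Math.round((height / displacementRange) * 255) / 255) * displacementRange;
console.log(`stored in 8 bit: ${quantized8.toFixed(5)} cm`); // snapped to the nearest level
console.log(`stored as float: ${height.toFixed(5)} cm`);     // kept as-is
```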
-
Are you using an EXR there? Best regards *Fuchur*
-
Great stuff, really great stuff . Best regards *Fuchur*
-
I talked about that with Steffen about 3 years ago. I am not sure if something has changed, but I doubt it. Long story short: it is quite hard to change that, especially to a vector-based system, just because the UI system used (to keep it compatible between the Mac and Windows versions, which is a big hassle anyway) does not support anything but bitmaps. Of course Steffen would have to say something about it himself... Best regards *Fuchur*
-
Actually, a GPU has a lot of cores (up to a few thousand), but they are very different, and one GPU core is a lot less powerful than a CPU core. For instance, in general a GPU core is not able to use the same instruction sets as CPU cores, so it is not possible to do it "just like that". If GPU rendering were included, it would need its own rendering procedure which takes such things into account. Best regards *Fuchur*
-
Third-party servers can certainly be useful, but in P2P systems a primary strength is not in storage but in the transfer of bits and bytes between the peers/clients. And in distributed systems permanency may not be required or even desired. We may want to simply share a file once, especially if it is a work in progress. Permanency can easily equate to outdated data.

It is interesting to note that in distributed systems permanency is a byproduct. This relates to an observation made years ago when people were losing their A:M models and someone would step out of the shadows and say... "Here it is. A couple of years ago you gave me a copy." That's one example of (unintentional) permanency achieved through peer-to-peer distribution, although in that particular scenario not something to rely on heavily. That might be a good test case, however: to try to set up a P2P system of P2P backup. This might be a source of income for the local user, who would be incrementally reimbursed for maintaining distributed data. In turn they would hold a copy of the encrypted data (which not even they could access). But I digress... (My point there is that permanency through distributed backups is entirely viable.)

I would have to look to see in what capacity XAMPP might be viable. The downside, I must assume, is that (as originally designed for local testing) it intentionally turns off security features. That might not be a problem but needs to be considered. At a quick glance I'd say that wherever FTP-style file sharing/serving were required, XAMPP might be ideal because... as far as I can tell... that is how peers would access each other's local servers: via an FTP client. If serving up HTML and apps, I would think that better suited to a browser, and I don't see where XAMPP can be leveraged by a browser. I really don't know enough about XAMPP to speculate further.

People often mistake a "webserver" for a whole different kind of machine than anything else. Actually a webserver is nothing else than a "normal" computer (a Raspberry Pi even, if you wish) which runs a certain combination of software. You can and should use hardware that is very durable, especially for permanent 24/7 usage, but in the end it can be anything.

XAMPP is a suite of software which includes very powerful stuff like the Apache webserver, MySQL databases, FTP software, a PHP interpreter, Perl support and more, which can be activated and deactivated as you wish. XAMPP itself should not be used (as provided) for permanent hosting. That changes, however, if you change the default passwords for the software/services you want to use. XAMPP is mainly insecure because it ships with default passwords so everybody can easily get started without setting it up. Change those and you should be fine (more or less). So what you get with XAMPP is a full-blown webserver, which of course is by default made to share data with people all over the world.

A browser which can be used as a server too is like trying to build a digger by starting with a spoon and attaching to the spoon all the other parts needed to make it a digger. Everything is possible, and of course you can do that, but I still do not see why someone should. Best regards *Fuchur*
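To make the "a webserver is just ordinary software on an ordinary machine" point concrete, here is a minimal sketch using Node's built-in http module (my choice for illustration; the same idea applies to XAMPP's Apache), showing that any reachable machine can answer requests:

```ts
// Minimal sketch: a "webserver" is just ordinary software listening on an ordinary machine.
// Uses only Node's built-in http module; no XAMPP/Apache involved. Not a production setup.
import * as http from "http";

const server = http.createServer((req, res) => {
  // Answer every request that reaches this machine with a plain-text response.
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(`You requested ${req.url} from a plain desktop PC acting as a server.\n`);
});

// Any reachable machine (a Raspberry Pi, an old desktop) can listen like this.
server.listen(8080, () => console.log("Serving on http://localhost:8080"));
```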
-
I can understand how it works, but I still don't really get what you want to achieve with it... why not use a server to do all that for you and make it permanent that way? In short, Beaker, at least in this video, is not doing much more than creating a fake webserver on your machine with very limited abilities, as far as I can see... Would that be the same as, for instance, XAMPP, but installing/using Beaker instead of something from Apache? Best regards *Fuchur*
-
Did you copy that from somewhere? I cannot write in there in a reasonable way...? But let's try it without that:

I am talking about 2 persons (or more) communicating with each other. Of course at least one needs to be online* (the one who wants to download something), but do you want to be online all the time because someone in, let's say, Germany (a time difference of, in general, something like 9 hours) wants to download something/sync something, etc.? That is very inconvenient, don't you think? And the more interesting part is: to gain what exactly? Waiting two days until the computer hosting the needed files is online again? What are you trying to achieve with this technology that cannot be done more easily with an FTP server?

* Not sure if this is a language thing... with "needs to be online" I am talking about being connected to the internet at that very moment, not about whether an internet connection is available to that person at all.

No, this is something very different. A browser needs to let a website (quite uncontrolled input until it is loaded and run) communicate with the user through very different things, including scripting, and every day something is newly created with the technology of "tomorrow", etc. That is the reason why EVERY browser and app (!= software/program) out there runs in a sandbox to secure the computer (whichever device) it runs on. Standard software on Windows does not use a sandbox and therefore allows access to pretty much everywhere on the computer, etc.

This is all about security. I am not talking about a user saying "hey, use this folder", but about security structures built into the operating system, which is the most important one... we are not talking about sniffing data but about controlling the other person's computer as a whole (of course including encrypting the data on your computer and blackmailing you to decrypt it again, putting a lot of trojans on there, using your computer as a crypto-mining machine, using a bug to make your computer burn, and so on... there is a lot of malware out there you really do not want to catch just by visiting a website). Doing only data transfer, like BitTorrent, is something different, because there the software does it all by itself and in general does not run any scripts from unknown sources. To make it short: you don't want that... this is really not about encryption or using SSL or some other technology to transfer data to someone else... this is extremely crucial to everything you do on the internet and in your whole digital life.

//////////////////////////////////////////////////////////////////////////////

Saying WebRTC is not P2P is really just wrong, as long as you know whom you want to talk to (some kind of possibility to connect to the other party => IP addresses or whatever, but you will always need that in some way or other). There is the possibility to use it as P2P or as P2S2P, depending on what you want. I am not saying WebRTC is already fully and perfectly developed, but FF and Chrome were used for a Skype-like technology to communicate with each other quite a long time ago. It is an open standard, and at least FF does not need any kind of plugin or extension to do WebRTC. It would be new to me that Chrome needs one, but who knows... maybe Google did something on their own again...

//////////////////////////////////////////////////////////////////////////////

I have to admit that I cannot read much more of what you wrote (you have a tendency to write very extensively, my friend) and I need to work tomorrow, but I surely will go on reading it in the next days. Best regards *Fuchur*
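For reference, a minimal sketch of what browser-to-browser WebRTC looks like in plain TypeScript against the standard browser API; sendToOtherPeer() is a hypothetical placeholder for whatever signalling channel you use to reach the other party, which is exactly the "you need to know whom you want to talk to" part:

```ts
// Sketch only: open a direct WebRTC data channel between two browser peers.
// sendToOtherPeer() is a hypothetical stand-in for your signalling channel
// (a tiny server, copy/paste, e-mail...) used to exchange the offer/answer and
// ICE candidates; the payload afterwards flows peer to peer.
declare function sendToOtherPeer(msg: unknown): void; // hypothetical signalling hook

const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });
const channel = pc.createDataChannel("files");

channel.onopen = () => channel.send("hello, peer");                 // direct P2P payload
pc.onicecandidate = (e) => { if (e.candidate) sendToOtherPeer(e.candidate); };

pc.createOffer()
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => sendToOtherPeer(pc.localDescription));                // hand the offer to the other side
```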
-
The problem with P2P is most often that people need to be online for it. As with any of the old file-sharing systems (eMule, BitTorrent, etc.), this is a problem which cannot be overcome easily. That is the thing that makes server-based systems "better": they are really always available. Nobody needs to charge them, restart them, etc. They are just there to run, at least until some kind of maintenance is necessary, and even then they often do not need to be restarted. What you are trying to do with the browser (which is already possible using WebRTC) is possible, but if you want to sync folders, for instance: that will not be possible for security reasons, and I hope that will not change anytime soon. It would be a major problem if the browser were broken open then, because attackers could do close to anything and list files on your computer (=> very bad thing). Best regards *Fuchur*
-
Not close to being equal... Dropbox is a cloud-based service (which in this case does not mean much more than uploading something to a server and making a download link available somewhere else) in which you really do not know where the data is going to be and who has access to it. P2P would mean a more direct approach, transferring data directly to the recipient (that is technically not exactly true, since a transfer via the internet always means that there are other servers/routers involved in the process). Before you use Dropbox, use at least OneDrive (they at least have something to lose if they give out your data, because they have a lot of companies as customers who would not be amused at all if the data were leaked) or, better, an FTP server at your favourite hoster (GoDaddy, blue or whatever is good in your country... for Germany it would be 1&1, Strato or netcup, for instance). Best regards *Fuchur*