Hash, Inc. - Animation:Master

This looked interesting: massively parallel computer


Roger

Recommended Posts

  • Replies 17
  • *A:M User*

It probably won't be able to run Windows natively, but they're talking about it running Ubuntu 12.04, so I imagine you could run Windows in a VM. It currently only has 1GB of RAM onboard, and I'm not sure if that is expandable. That might be the only stumbling block to running Windows.

 

I may plunk down for one, you could probably do some interesting stuff with it even if it wouldn't make a good desktop system.

I imagine this might be more useful as a coprocessor in a larger system, if you could work that out somehow. Kinda like GPU computing. Maybe you could accelerate radiosity rendering with this? I imagine anyone who does any kind of DSP work would also be able to use it.
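The coprocessor idea is basically what GPU computing does: split embarrassingly parallel work (like per-pixel shading) into independent chunks and farm them out. Here's a toy Python sketch of that pattern — the shading formula, sizes, and worker counts are all made up, and a real Epiphany offload would use its C SDK and DMA transfers rather than threads:

```python
# Toy "split the render into chunks" sketch, in the spirit of GPU computing.
from concurrent.futures import ThreadPoolExecutor

def shade_row(y, width=64):
    # Stand-in for real per-pixel work (e.g. a radiosity gather); toy formula only.
    return [(x * y) % 255 for x in range(width)]

def render(height=64, workers=4):
    # Each worker plays the role of a coprocessor core chewing on its own rows.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade_row, range(height)))
```

Radiosity is a decent fit for this shape of parallelism because each gather is independent once the scene data is resident on every core.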


  • *A:M User*

As far as the costs go, I'm really not sure how that works; I don't have any experience in the electronics or semiconductor industry.

From watching their Kickstarter videos, it sounds like the $750k is going towards design and fabrication of the actual board the chip sits on; apparently the processor itself is already a done deal. The prototype board costs $10k, though, because they can only order in such limited quantities. So I imagine most of the money will go to a board redesign and then an initial production run of maybe 5,000 boards?
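Taking those guesses at face value, the implied per-board budget is easy to sanity-check. Both numbers below are speculation from this post, not figures from the campaign:

```python
# Speculative figures from the post above: ~$750k raised, ~5,000-board run.
budget_usd = 750_000
run_size = 5_000

per_board_usd = budget_usd / run_size  # $150 per board, before redesign costs
```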


  • *A:M User*
The SDK is in C/C++. Isn't that what A:M is written in? If so, the best use of this is a solid, super-fast, small-footprint render farm.

 

I'm not a programmer, but yeah, I think so. I don't want to speculate on the ease of something I know relatively little about. But even if you couldn't compile a native version of A:M for it, you should still be able to pass A:M data to a different renderer if you wanted to use it as a render farm. I'm not sure how much faster this would be than current systems; I think most mid-range to high-end GPUs are currently in the teraflop range for single-precision floating point. Not sure about CPUs, but those must surely be in the 100-gigaflop range by now.
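Those throughput guesses can be sanity-checked with the standard peak-throughput formula: cores × clock × FLOPs per cycle. The per-chip numbers below are rough assumptions for illustration, not vendor specs:

```python
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    # Theoretical single-precision peak; real workloads reach only a fraction.
    return cores * clock_ghz * flops_per_cycle

# Epiphany-16: assuming one fused multiply-add (2 FLOPs) per core per cycle at 1 GHz.
epiphany_16 = peak_gflops(16, 1.0, 2)   # 32 GFLOPS

# 2012-era quad-core i7: assuming 8-wide AVX add + multiply (16 FLOPs/cycle) at 3.4 GHz.
quad_i7 = peak_gflops(4, 3.4, 16)       # ~218 GFLOPS
```

Under these assumptions a desktop i7 would be comfortably in the "100-gigaflop range" guessed above, which suggests the board's draw is power efficiency per watt rather than raw speed.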

 

Worst case, you could probably run A:M in a virtual machine on it, but I don't know if you could access all the hardware that way.

 

Someone else posted a link to Corel's subscription thing. I see interesting things happening with this. Given that Steam runs on Linux now, and is also now open to non-gaming apps, you could get a huge new user base for A:M along with an easy way to handle DRM and subscription management. I don't know how trivial it would be to get it running on Linux, but it's interesting to think about.


  • Hash Fellow

From past discussions with Steffen, there's a lot of Windows that is needed for A:M to work... to even show up on the screen. (Mac works because Hash wrote a lot of code to replace the Windows stuff that isn't in the Mac OS.)

 

To run NetRender, maybe not so much of Windows is needed, but it's all written for Intel CPUs, so a lot of rewriting would be needed to port it to Linux and the ARM CPU this board has.

 

Those 64 cores aren't full CPUs that you can send regular code to, they sound more like the things in a graphics card.

 

And then you're still talking about NetRender which requires more RAM to run each instance. The board only has 1GB.

 

I'm guessing it would be a monumental hassle to get A:M to something like this.

 

Liquid Helium-cooled overclocking of a conventional CPU might be more practical. :o


Is it possible to get A:M to NetRender over the internet? I remember way, way back Electric Image had that ability. You needed to copy all the assets to each machine.

Seeing that NetRender assigns one frame per core, it might be possible to have users band together and do some sort of virtual render farm, with each person sharing even just 1 or 2 cores. Imagine 100 users collaborating?
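The "100 users sharing cores" idea boils down to a scheduling problem: hand each volunteered core slot the next unrendered frame. A toy sketch of one policy (the names and the simple round-robin are made up; a real NetRender-style setup would hand out frames dynamically as cores finish):

```python
def assign_frames(num_frames, volunteers):
    """volunteers: {user: cores offered}. One frame per core slot, round-robin."""
    slots = [user for user, cores in volunteers.items() for _ in range(cores)]
    schedule = {user: [] for user in volunteers}
    for frame in range(num_frames):
        schedule[slots[frame % len(slots)]].append(frame)
    return schedule
```

In practice you'd also have to ship the project's assets to every participant first, just like the old Electric Image setup required.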


  • Admin

Just got an email... looks like they made their quota.

 

That was a lot of pennies to raise in the last few hours.

Some folks must be really interested in parallel computing. ;)

 

 

...and still 17 hours to go to reach their 'whatchamacallit'... stretch goal... and if they do, I get the 64core bonus board. Go team!

 

Kickstarter... what a concept...


  • *A:M User*
Just got an email... looks like they made their quota.

 

I honestly wasn't sure they were going to make it. They were only at 570k or so last night. I haven't pulled the trigger yet, I'm trying to decide how badly I want an experimental computer. I'd just wait, but there is the possibility that I may not get one at all if they can't turn it into a retail product.

 

What do you plan on using yours for, Rodney?


  • Admin
What do you plan on using yours for, Rodney?

 

Hopefully more than a paperweight. ;)

 

Right now I'd be happy to have an extra computer to work on (all the standard stuff... browse, edit movies, draw, etc.) while this one renders stuff from A:M. I have a TV sitting here doing nothing that will make a great/huge monitor.

 

This is assuming I can get mine to run.

 

It'd be cool to set it up as a server.


  • *A:M User*
This is assuming I can get mine to run.

 

I don't think you'll have too much trouble; the prototype looked a bit rough, but it did look fully functional. Assuming they weren't doing the "hook up the prototype but the thing running the demo is in another box" deal :(

I don't think that's the case, though. Shame it doesn't support more than 1GB of RAM. Funny how that seems like such a small amount...


  • *A:M User*

I wish I were having an easier time deciding whether to pull the trigger on this. The 16-core version is about as fast as the i7 I have now, assuming I'd be able to use it in the same way (unlikely). It might be fun to play with, but I'm not a programmer or an EE, so I'm not sure I'd make use of it other than as a really low-power server or desktop. And there is the risk (however small) that I might not get a finished board.


  • *A:M User*

Well, I ended up getting one of these. Hopefully it will be fun to fool with. I am concerned that it seems to lack any kind of GPU, but hopefully the 16-core chip can function as one? Not sure how else they could be doing the video.

Guess I'll see what happens.


  • Admin
I am concerned that it seems to lack any kind of GPU

 

There was a short write-up that seemed to suggest that's related to how they get the parallel computing and speed they do from their chip: they bypass the GPU and instead work everything directly in parallel on their chip. My naive understanding is that their chips simply take on the role of the GPU instead of farming instructions out to one. If that is indeed the case, it would make sense to me, because GPUs have increasingly just been specialized CPUs that focus solely on processing graphics, the downside being that they can't/won't process anything else. Again, this is pure speculation on my part, but it's the only angle that makes sense to me given this computer's capability of pushing graphics/video to a TV screen at such a high rate.

