Hash, Inc. - Animation:Master

This looked interesting: massively parallel computer



  • Hash Fellow
Posted

If it ran Windows it would be ready to go, but it looks like it doesn't.

 

$750,000 doesn't sound like enough to get a hardware project like this going, but I really have no idea.

  • *A:M User*
Posted

It probably won't be able to run Windows natively, but they're talking about it running Ubuntu 12.04, so I imagine you could run Windows in a VM. It currently has only 1 GB of RAM onboard, and I'm not sure if that's expandable. That might be the only stumbling block to running Windows.

 

I may plunk down for one; you could probably do some interesting stuff with it even if it wouldn't make a good desktop system.

I imagine this might be more useful as a coprocessor in a system, if you could work that out somehow, kind of like GPU computing. Maybe you could accelerate radiosity rendering with it? I imagine anyone who does any kind of DSP work would also be able to use this.
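To make the coprocessing idea concrete, here's a minimal sketch of fanning render work out to many small cores, using pthreads on an ordinary PC as a stand-in for the board's cores. The real Epiphany SDK works differently; render_tile and the tile counts here are hypothetical.

```c
/* Illustrative only: split a radiosity-style render into tiles and
 * fan them out to worker threads, the way you might fan work out to
 * coprocessor cores. pthreads stand in for the board's SDK here.   */
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 16          /* pretend each worker is one core  */
#define NUM_TILES   64          /* hypothetical tile count          */

static int next_tile = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void render_tile(int tile)   /* stand-in for the real kernel */
{
    printf("tile %d done\n", tile);
}

static void *worker(void *arg)
{
    (void)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        int tile = next_tile < NUM_TILES ? next_tile++ : -1;
        pthread_mutex_unlock(&lock);
        if (tile < 0) return NULL;  /* no tiles left, worker exits   */
        render_tile(tile);
    }
}

int main(void)
{
    pthread_t t[NUM_WORKERS];
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_join(t[i], NULL);
    return 0;
}
```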

  • *A:M User*
Posted

As far as the costs go, I'm really not sure how that works; I don't have any experience in the electronics or semiconductor industry.

From watching their Kickstarter videos, it sounds like the $750k is going toward design and fabrication of the actual board the chip sits on; apparently the processor itself is already a done deal. The prototype board costs $10k, though, because they can only order in such limited quantities. So I imagine most of the money will go to a board redesign and then an initial production run of maybe 5,000 boards.
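As a pure back-of-envelope check (the redesign figure here is a guess I made up, not anything from the campaign), the per-board budget would work out roughly like this:

```c
/* Back-of-envelope only; the split between redesign and production
 * is a made-up assumption, not a number from the campaign.        */
#include <stdio.h>

int main(void)
{
    double raised   = 750000.0;  /* campaign funding goal            */
    double redesign = 250000.0;  /* hypothetical board-redesign cost */
    int    boards   = 5000;      /* guessed initial production run   */

    printf("per-board budget: $%.2f\n", (raised - redesign) / boards);
    return 0;   /* prints $100.00 with these assumed inputs */
}
```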

Posted

The SDK is in C/C++. Isn't that what A:M is written in? If so, the best use of this is a solid, super-fast, small-footprint render farm.

  • *A:M User*
Posted
The SDK is in C/C++. Isn't that what A:M is written in? If so, the best use of this is a solid, super-fast, small-footprint render farm.

 

I'm not a programmer, but yeah, I think so. I don't want to speculate on the ease of something I know relatively little about. But even if you couldn't compile a native version of A:M for it, you should still be able to pass A:M data to a different renderer if you wanted to use it as a render farm. I'm not sure how much faster this would be than current systems; I think most mid-range to high-end GPUs are currently in the teraflop range for single-precision floating point. Not sure about CPUs, but those must surely be in the 100-gigaflop range by now.
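For a rough sense of where a chip like this would land, theoretical peak is just cores × clock × FLOPs per cycle. The clock and per-cycle figures below are assumptions for illustration, not the board's published specs:

```c
/* Theoretical peak = cores x clock x FLOPs issued per cycle.
 * All three inputs below are assumptions for illustration.   */
#include <stdio.h>

int main(void)
{
    int    cores         = 64;    /* the stretch-goal chip          */
    double clock_ghz     = 0.7;   /* assumed clock speed            */
    double flops_per_cyc = 2.0;   /* assumed (e.g. fused mul-add)   */

    /* GHz x FLOPs/cycle x cores gives GFLOPS */
    printf("peak: %.1f GFLOPS single-precision\n",
           cores * clock_ghz * flops_per_cyc);
    return 0;
}
```

Even with generous assumptions that lands well under one teraflop, which fits the point above: a single modern GPU would still be faster in raw throughput, so the draw here would be power and cost, not peak speed.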

 

Worst case, you could probably run A:M in a virtual machine on it, but I don't know if you could access all the hardware that way.

 

Someone else posted a link to Corel's subscription thing. I see interesting things happening with this. Given that Steam runs on Linux now and is also open to non-gaming apps, you could get a huge new user base for A:M along with an easy way to handle DRM and subscription management. I don't know how trivial it would be to get it running on Linux, but it's interesting to think about.

  • Admin
Posted

Hmmm.... interesting. Okay, I'm in. :)

 

With 24 hours to go and a lot of funds left to collect, though...

 

Thanks for the heads-up, Roger.

 

 

(Since I've been wanting a good solid Linux machine, this would do it.)

  • Hash Fellow
Posted

From past discussions with Steffen, there's a lot of Windows© that A:M needs just to work... even to show up on the screen. (The Mac version works because Hash wrote a lot of code to replace the Windows stuff that isn't in MacOS.)

 

To run NetRender, maybe not as much of Windows is needed, but it's all written for Intel CPUs, so a lot of rewriting would be needed to port it to Linux and the ARM CPU this board has.

 

Those 64 cores aren't full CPUs that you can send regular code to; they sound more like the shader units in a graphics card.

 

And then you're still talking about NetRender, which needs more RAM for each instance it runs. The board only has 1 GB.

 

I'm guessing it would be a monumental hassle to get A:M onto something like this.

 

Liquid-helium-cooled overclocking of a conventional CPU might be more practical. :o

Posted

Is it possible to get A:M to NetRender over the net? I remember way, way back Electric Image had that ability. You needed to copy all the assets to each machine.

Seeing that NetRender does one frame per core, it might be possible to have users band together and virtually share even one or two cores each. Imagine 100 users collaborating?
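Here's a toy sketch of that "one frame per core" scheduling across a pool of volunteers. It's just an illustration of the arithmetic; it is not NetRender's actual protocol, and the user and frame counts are made up:

```c
/* Toy illustration of one-frame-per-core scheduling across many
 * volunteers; not NetRender's real protocol.                    */
#include <stdio.h>

#define USERS          100   /* imagined collaborators            */
#define CORES_PER_USER 2     /* each volunteer shares two cores   */
#define FRAMES         240   /* a 10-second shot at 24 fps        */

int main(void)
{
    int slots = USERS * CORES_PER_USER;   /* 200 render slots */
    for (int f = 0; f < FRAMES; f++)
        printf("frame %3d -> user %2d, core %d\n",
               f, (f % slots) / CORES_PER_USER, f % CORES_PER_USER);
    /* 240 frames over 200 slots: just over one frame per slot,
     * so the whole shot clears in about two frame-render times. */
    return 0;
}
```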

  • Hash Fellow
Posted
Is it possible to get A:M to NetRender over the net?

 

During TWO we had something like that, but very few people got on board.

  • Admin
Posted

Just got an email... looks like they made their quota.

 

That was a lot of pennies to raise in the last few hours.

Some folks must be really interested in parallel computing. ;)

 

 

...and still 17 hours to go to reach their 'whatchamacallit'... stretch goal... and if they do, I get the 64-core bonus board. Go team!

 

Kickstarter... what a concept...

  • *A:M User*
Posted
Just got an email... looks like they made their quota.

 

I honestly wasn't sure they were going to make it; they were only at $570k or so last night. I haven't pulled the trigger yet, as I'm trying to decide how badly I want an experimental computer. I'd just wait, but there's the possibility that I may not get one at all if they can't turn it into a retail product.

 

What do you plan on using yours for, Rodney?

  • Admin
Posted
What do you plan on using yours for, Rodney?

 

Hopefully more than a paperweight. ;)

 

Right now I'd be happy to have an extra computer to work on (all the standard stuff... browse, edit movies, draw, etc.) while this one renders stuff from A:M. I have a TV sitting here doing nothing that will make a great/huge monitor.

 

This is assuming I can get mine to run.

 

It'd be cool to set it up as a server.

  • *A:M User*
Posted
This is assuming I can get mine to run.

 

I don't think you'll have too much trouble; the prototype looked a bit rough, but it did look fully functional. That's assuming they weren't doing the old "hook up the prototype, but the thing actually running the demo is in another box" trick. :(

I don't think that's the case, though. It's a shame it doesn't support more than 1 GB of RAM. Funny how that seems like such a small amount now...

  • *A:M User*
Posted

I wish I were having an easier time deciding whether to pull the trigger on this. The 16-core version is about as fast as the i7 I have now, assuming I'd be able to use it the same way (unlikely). It might be fun to play with, but I'm not a programmer or an EE, so I'm not sure I'd use it as anything other than a really low-power server or desktop. And there's the risk (however small) that I might never get a finished board.

  • *A:M User*
Posted

Well, I ended up getting one of these. Hopefully it will be fun to fool around with. I am concerned that it seems to lack any kind of GPU, but hopefully the 16-core chip can function as some kind of GPU? I'm not sure how else they could be driving the video.

Guess I'll see what happens.

  • Admin
Posted
I am concerned that it seems to lack any kind of GPU

 

There was a short write-up that seemed to suggest that's related to how they get the parallel computing and speed they do from their chip: they bypass the GPU and instead work everything directly, in parallel, on their chip. My naive understanding is that their chip simply takes on the role of the GPU instead of farming instructions out to one.

If that is indeed the case, it would make sense to me, because GPUs have increasingly just been specialized CPUs that focus solely on processing graphics, the downside being that they can't/won't process anything else. Again, this is pure speculation on my part, but it's the only angle that makes sense to me given this computer's ability to process graphics/video at such a high rate directly to a TV screen.
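If the cores really are doing the GPU's job in software, the basic pattern would look something like this sketch, where each worker fills one horizontal band of a framebuffer. pthreads stand in for the on-chip cores here; a real port would use the board's own SDK, and the "shading" is just a toy pattern:

```c
/* Sketch of "the chip is the GPU": each worker thread fills one
 * horizontal band of a software framebuffer. pthreads stand in
 * for the parallel cores; a real port would use the board's SDK. */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define W 640
#define H 480
#define BANDS 16                 /* one band per imagined core */

static uint32_t framebuffer[W * H];

static void *fill_band(void *arg)
{
    int band = (int)(intptr_t)arg;
    int y0 = band * (H / BANDS), y1 = y0 + (H / BANDS);
    for (int y = y0; y < y1; y++)
        for (int x = 0; x < W; x++)
            framebuffer[y * W + x] = (x ^ y) & 0xFF;  /* toy shading */
    return NULL;
}

int main(void)
{
    pthread_t t[BANDS];
    for (int i = 0; i < BANDS; i++)
        pthread_create(&t[i], NULL, fill_band, (void *)(intptr_t)i);
    for (int i = 0; i < BANDS; i++)
        pthread_join(t[i], NULL);
    printf("rendered %dx%d in %d bands\n", W, H, BANDS);
    return 0;
}
```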

  • *A:M User*
Posted

Yeah, that's the only thing that makes sense. It's a shame they didn't make the stretch goal; the 64-core one would have been pretty nice.
