Hash, Inc. - Animation:Master

Shuttle Discovery cockpit VR


pixelplucker

Very neat to see. Brings back memories, as I was the software Project Lead for real-time image generation on the very first Space Shuttle Simulators.

 

The simulators were located at NASA Houston, but we did most of our work from Sunnyvale, CA (Link Flight Simulators). We did the digital generation of the real-time imagery for the forward windows, the overhead and aft windows, and the CCTV systems. The forward windows were used for navigation and takeoff/landing training; the overhead windows were used for navigation-by-stars training (don't ask me how); and the aft windows were used for training with the RMS (Remote Manipulator System), i.e., grappling objects (payloads) in the cargo bay using the giant articulated arms, which also had cameras at the end.

 

At that time, digital image generation was in its infancy, and the imagery for any one window was very low detail (4-8,000 faces/window) and displayed at low resolution (1024? 2048? scanlines, with a maximum of 512 edge intersections per scanline), 256 colors, 256 intensities, all smooth shaded, antialiased, no textures, with the beginnings of atmospheric attenuation (fog), updated at 30 Hz.
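That 512-intersections-per-scanline figure is a per-row budget on how many polygon edges the rasterizer may cross. A toy illustration of what such a count means (the polygon data and function names here are invented, not the DIG's actual representation):

```python
# Count how many polygon edges a given scanline crosses. A scanline
# rasterizer of that era had a hard per-row budget on these crossings.
MAX_INTERSECTIONS = 512

def crossings_on_scanline(edges, y):
    """Count edges (each a ((x0, y0), (x1, y1)) pair) that span scanline y."""
    count = 0
    for (x0, y0), (x1, y1) in edges:
        lo, hi = min(y0, y1), max(y0, y1)
        if lo <= y < hi:  # half-open span avoids double-counting shared vertices
            count += 1
    return count

# Three illustrative edges: two vertical ones spanning y=0..10, one y=20..30.
edges = [((0, 0), (0, 10)), ((5, 0), (5, 10)), ((0, 20), (3, 30))]
assert crossings_on_scanline(edges, 5) <= MAX_INTERSECTIONS
```

If a frame's geometry pushed any scanline past that budget, the software side had to shed detail, which is the overload correction described later in the thread.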

 

There was no z-buffer then, and the occulting (hidden-surface) algorithms required the real-time software to generate a priority list of all the fixed and moving objects in the scene (along with other things) to drive the special-purpose image generation hardware, which did the rest of the frame computation.
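The priority-list idea can be sketched as a painter's-algorithm-style sort: with no z-buffer, software orders whole objects back-to-front each frame and the hardware draws them in that order, so nearer objects overwrite farther ones. This is a minimal sketch with a simple distance sort key; the DIG's actual priority scheme is not described in the post:

```python
# Painter's-algorithm-style priority list: sort objects farthest-first so
# drawing them in list order resolves occlusion without a z-buffer.
# Object names and the distance-based key are illustrative only.
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance: float  # eyepoint-to-object distance this frame

def build_priority_list(objects):
    """Return objects ordered farthest-first for back-to-front drawing."""
    return sorted(objects, key=lambda o: o.distance, reverse=True)

scene = [
    SceneObject("payload", 12.0),
    SceneObject("rms_arm", 8.5),
    SceneObject("earth_horizon", 500.0),
]
for obj in build_priority_list(scene):
    pass  # the image generation hardware would rasterize obj at this point
```

Object-level sorting like this breaks down when objects interpenetrate or cyclically overlap, which is one reason the real system also tracked occulting levels and other constraints per frame.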

 

I had been to NASA Houston and had seen the simulators (when we were installing them), but this was especially neat to see the real thing.


  • Hash Fellow

Fascinating to hear that. Any recollection as to what the "specs" on the computer running it were?


  • Hash Fellow
Looks like they bought one of everything from the Radio Shack catalogue... does not have a look of quality or confidence. NASA is not big on design, are they...? Does what it needs to tho...

 

Maybe they could get Apple to design the next one.


Fascinating to hear that. Any recollection as to what the "specs" on the computer running it were?

 

The computers used at that time for the real-time (RT) software were no big whoop, just plain vanilla Interdata 8/32s (formerly Perkin-Elmer) minicomputers. We programmed the real-time code in assembler because speed was essential, and we were concerned with shaving off milliseconds for certain operations that had to be performed at 30 Hz. That's only 33 ms to compute each frame.
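The 33 ms figure follows directly from the 30 Hz rate; a trivial check (illustrative only):

```python
# At a fixed update rate, the per-frame compute budget is simply 1000/rate ms.
def frame_budget_ms(rate_hz):
    return 1000.0 / rate_hz

print(round(frame_budget_ms(30), 1))  # 30 Hz leaves about 33.3 ms per frame
```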

 

The big honking special-purpose hardware, called the DIG (Digital Image Generator), was designed by Link and was driven by the puny off-the-shelf Interdata computers (running the real-time).

 

The special-purpose DIG hardware, however, occupied floor space probably about 8-10 feet in length, 3 feet wide, and approximately 8 feet tall (going on memory). There were about 100-200 distinctly designed circuit boards, each giant board about 1 x 2 feet. Maybe there were 300 circuit boards total per system? My eyes/brain would glaze over whenever maintenance logistics was mentioned for these systems.

 

The primary function of the real-time was to interface with the flight control computer (receiving position and orientation data) and to drive the DIG hardware. The real-time would update the DIG (synchronously, at 30 Hz) with position, orientation, and the object occulting list. Other non-critical processes would retrieve and update the DIG active database from the "model" database, based on position and visibility (usually 20 miles maximum visibility when landing). This was done asynchronously.
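The split described above can be sketched as two paths: a hard-real-time per-frame update of pose and occulting list, and a background pass that pages models in and out of the active database by range. All class and field names here are invented stand-ins; the real interfaces ran in assembler against custom DIG hardware:

```python
# Sketch of the synchronous (30 Hz, frame-critical) and asynchronous
# (background database paging) update paths. Illustrative names only.

class FakeDIG:
    """Stand-in for the image-generator interface (not the real DIG API)."""
    def __init__(self):
        self.pose = None
        self.priority_list = []

    def update(self, pose, priority_list):
        self.pose = pose
        self.priority_list = priority_list

VISIBILITY_MILES = 20  # landing-visibility cull range mentioned above

def synchronous_update(dig, pose, priority_list):
    # Runs every frame: position/orientation plus the occulting priority list.
    dig.update(pose, priority_list)

def asynchronous_db_update(model_ranges):
    # Runs in the background: activate only models within visibility range.
    return {name for name, rng in model_ranges.items()
            if rng <= VISIBILITY_MILES}
```

The key design point is that only the pose and priority-list update sits on the 33 ms critical path; database paging can lag a frame or two without affecting training.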

 

The most complicated part for the real-time was overload correction, as there were many processing limitations in the DIG: number of objects allowed, number of occulting levels, number of faces, number of scanline intersections, etc. The real-time had to decide how to compensate for overload (eliminate levels of detail, reduce visibility, etc.) with minimum impact on training.
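A crude sketch of that graceful-degradation idea: when the frame's face count exceeds a hardware budget, drop level of detail first, then pull in the visibility range, preferring the fixes with the least training impact. The limit, the strategy ordering, and the halving assumption are all invented for illustration:

```python
# Overload correction sketch: shed detail until the frame fits the budget.
FACE_LIMIT = 8000  # upper end of the 4-8,000 faces/window figure in the post

def correct_overload(face_count, lod, visibility_miles):
    """Return adjusted (lod, visibility_miles); assumes each LOD step halves faces."""
    while face_count > FACE_LIMIT:
        if lod > 0:
            lod -= 1                           # cheapest fix: coarser models
            face_count //= 2                   # assumed halving per LOD step
        elif visibility_miles > 1:
            visibility_miles -= 5              # next: pull in visibility range
            face_count = int(face_count * 0.8)
        else:
            break                              # nothing left to shed
    return lod, visibility_miles
```

A real system would weight these choices per training task (e.g. never degrade the runway model during a landing run), which is why the post calls this the most complicated part.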

 

This shuttle project was the one in which I received 3 Employee (aka Chump) of the Month awards.


Looks like they bought one of everything from the Radio Shack catalogue... does not have a look of quality or confidence. NASA is not big on design, are they...? Does what it needs to tho...

 

NASA is big on redundancy and fail-safe systems - especially when human safety is involved.


The "stone knives and bear skins" age of computer science.

 

AND we were running UNIX as the OS on the minis. It amazes me that UNIX is still around, in slightly more advanced form? At least I would hope so, not having direct experience with Linux and the Mac command-line OS thingies. MS-DOS was also similar to UNIX.

