Report of the IPS Technical Committee
Full-Dome Video Systems
IPS Technical Committee Chair
The Renaissance Center
Dickson, Tennessee, USA
Chapel Hill, North Carolina, USA
[reprinted from the Planetarian, March 1999]
One of the primary charters for the IPS Technical Committee is to review the range of competing full-dome video systems that have recently become available, develop some sort of evaluation metric, and attempt to define a set of standards that would help manufacturers address compatibility issues for content production and presentation. To that end, we are beginning the process of looking at the major systems, each in detail. In this quarter's column we'll give a general overview of the major technologies involved, discuss two prevailing architectures, and review a few of the major systems.
Let us begin by saying that one could easily write much more on this subject than we have space for here, and that this overview is simply a brief introduction to some of the technologies and terms that may appear in a discussion of full-dome video systems. Also please keep in mind that we, the members of the IPS Technical Committee, are not equipment vendors. We are looking at these systems as potential customers in light of our collective expertise. Furthermore, the IPS Technical Committee will not attempt to recommend one system or another. Our main focus is to disseminate information and to encourage the major vendors to create interoperable systems.
There are several companies that provide full-dome video systems in a variety of formats. Some of the major vendors include:
· ElectricSky™ (Spitz, Inc.)
· Virtuarium (GOTO Optical Mfg., Co.)
· V-Dome (Trimension, Inc.)
· VisionDome™ (Alternate Realities Co.)
· StarRider™ (Evans & Sutherland)
· SkyVision (Sky-Skan, Inc.)
The actual physical setup varies from system to system, although there are some similarities. Most of the options use multiple video projectors and some form of edge-blending technology to create a seamless video image over the entire surface of a planetarium dome. Only Alternate Realities offers a single-lens system for smaller theaters. To generate images, some systems use a graphics supercomputer and others use off-the-shelf hardware and software solutions. Finally, there is a wide range of control and automation mechanisms, audience response sub-systems, and production philosophies.
There are two primary architectures in all-dome video systems: real-time and offline (also known as "pre-rendered"). Real-time systems use massive amounts of processing power to generate every image "on-the-fly." Offline systems render video out to a storage medium (hard disk, tape, laserdisc) and then play back as needed. Each architecture has its own merits, but the larger question of which type of system a theater might choose is probably more philosophical or financial in nature, rather than technical.
Real-time architectures have their roots in high-end flight simulator displays. Historically, these Image Generators (IGs) were specifically designed to recreate out-of-cockpit views for pilots, ground warfare, and other military training scenarios. Modern IGs provide a more general-purpose approach to graphics and can now reproduce a wider range of content.
Production with real-time systems involves creating 3D graphics models for every object in your "show." These models are then given texture and color, and are placed in a three-dimensional space, the "world." Over time, objects can move from one place to another, change in size and shape, and fly in and out of the audience's view. In the spirit of flight simulators, the audience's viewpoint can also change over time, allowing for tremendous production freedom and graphic realism.
Real-time systems compute images as fast as they can, hopefully producing images at more than 30 frames-per-second (fps). Depending on the complexity of the show sequence, real-time frame rates may vary, resulting in motion that can be very smooth in some places and jerky in others. With careful production, though, these systems can produce consistent, smooth motion.
Since real-time architectures generate images on-the-fly, they work very well in interactive environments. For example, with StarRider™ from Evans & Sutherland, it is possible to "fly" the theater with a single joystick, much like one would fly a flight simulator. Another important feature of real-time image generation is the ability to manipulate program content on the dome without having to refer to some sort of "preview" or having to wait for animation sequences to render on a separate computer. On the other hand, real-time systems are somewhat limited in the complexity of the scenes that they can produce, and they require very technically skilled modelers to create objects that will be shown in a program. Further discussion of the merits and challenges of various systems will be addressed within individual product evaluations.
Offline (pre-rendered) architectures stem from recent advancements in digital video production and non-linear editing systems. Desktop video production and animation have become very popular in the last several years. Today's systems can provide full professional level capability at a fraction of the cost of yesterday's studio gear. Witness television programs like Star Trek and Babylon 5, along with blockbuster movies like Armageddon and Independence Day; each of these productions used PC-based animation and video editing systems to create visual effects.
Production with offline systems is similar to real-time. Objects are modeled and textures are applied. One tremendous difference, however, is the complexity of the models that can be used, at the expense of time. An offline model can be as detailed as you like, but you have to wait for the computer to render each image. One advantage is that offline animation systems tend to be more advanced (both in terms of interface and features) than the current crop of real-time production software.
After selecting models and designing the animated sequences that will make up your program, an additional step must be employed to generate images for the dome. In a multi-projector situation, frames of animation must be divided up, directed to the appropriate projector, and synchronized with all of the other content. This process is handled differently in each of the primarily offline solutions evaluated here.
Near-real-time is another term that may be applied to some pre-rendered systems. Given that all of your show content is prepared and placed in random access storage (e.g. hard disk or laserdisc), it is then considered to be online content. From there, individual frames can be displayed at will, or in sequence, at almost any frame rate. In this sense, pre-rendered content can mimic some of the functionality of a real-time system.
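In code, the near-real-time idea reduces to indexed frames plus a pacing loop. The sketch below is purely illustrative: the FrameStore class, the file naming, and the in-memory "decode" are invented for this example and do not reflect any vendor's actual playback engine.

```python
# Illustrative sketch of near-real-time playback from pre-rendered frames.
# Because every frame already exists on random-access storage, playback can
# start at any frame and run at almost any rate.

class FrameStore:
    """Pre-rendered frames indexed by number (hypothetical stand-in for
    a hard disk or laserdisc full of numbered frames)."""
    def __init__(self, frame_count):
        self.frame_count = frame_count

    def load(self, n):
        # A real system would seek to and decode frame n from storage;
        # here we just return a placeholder name.
        return f"frame-{n:06d}"

def play(store, start, end, fps):
    """Show frames [start, end) at the requested rate. Any fps works,
    since no rendering happens at playback time."""
    shown = []
    period = 1.0 / fps  # seconds per frame; pacing sleep omitted in sketch
    for n in range(start, end):
        shown.append(store.load(n))
    return shown

frames = play(FrameStore(100), 10, 14, fps=30)
```

The same loop could just as easily jump to an arbitrary frame on cue, which is what lets pre-rendered content mimic real-time behavior.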
Finally, there is a need to address standards between systems. Currently, each vendor has a unique projector configuration, development platform and imaging hardware. Some vendors support industry standard tools like 3D Studio Max for modeling and animation, and After Effects for compositing, although the final media format is different for each system. That is, content created for one system can't easily be used by another. This is especially true when moving from real-time to offline or vice versa.
Perhaps a first step is to encourage vendors to agree on a common projector configuration. Then we can concentrate on common media formats and production standards. One example of vendors working together was demonstrated at the most recent IPS conference in London where Sky-Skan and Evans & Sutherland used the same projectors to showcase SkyVision and StarRider. While each vendor's content was very different, at least they were somewhat compatible at the projector level. This kind of cooperation is beneficial to both vendors and planetariums by expanding the library of available content that can be presented in a theater.
Once there is a potential for projection compatibility it is necessary to address the source material and production differences between real-time and pre-rendered systems. As an example, we'll work through a prototypical visual sequence and highlight a few of the production considerations for both architectures. Our storyboard snippet begins with the planet Saturn appearing on the limb of our dome and zooming up to rest at front and center. After pausing for a moment, we move towards the planet, dip through its rings and fly on to Titan.
In a real-time environment, one would start by creating a sphere to represent the planet Saturn and a disk for its rings. This combination would probably be modeled several times, each with a different level of detail (LOD). Because real-time systems are limited in the amount of detail that can be displayed in any one channel (one channel = one projector), it is often necessary to create simplified models to represent the object when viewed from a distance. While zooming in towards the planet, we'd start with the simplest model and transition between the others as the viewpoint got closer. The key is to develop models with the minimum number of polygons necessary to achieve the desired effect.
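The LOD switching described above can be sketched in a few lines. The distance thresholds and model names here are hypothetical values chosen for illustration, not taken from any real image generator.

```python
# Hedged sketch of level-of-detail (LOD) selection by viewing distance.
# Thresholds and model names are illustrative assumptions.

def select_lod(distance, lods):
    """lods: list of (max_distance, model_name) sorted nearest-first.
    Returns the simplest model adequate for the given distance."""
    for max_dist, model in lods:
        if distance <= max_dist:
            return model
    return lods[-1][1]  # beyond every threshold: coarsest model

SATURN_LODS = [
    (1_000, "saturn_hi"),          # close fly-by: full-detail sphere and rings
    (10_000, "saturn_med"),        # mid-range approach
    (float("inf"), "saturn_lo"),   # distant speck: a handful of polygons
]
```

During the zoom, the renderer would call `select_lod` each frame and swap models as the thresholds are crossed.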
Texturing the planet's surface isn't much of a challenge: one can find very accurate texture maps that will work nicely. The rings are a bit more difficult. Designing a 2D texture for the rings as viewed at a distance is not trivial, but designing a series of textures to make the rings three-dimensional when we move through them is downright hard. Unfortunately, a real-time system could not possibly handle a model of every individual clump of material in the rings, and most real-time systems don't support particle animation (an algorithmic method for generating lots of tiny objects without having to explicitly model every single one). In this case you'll most likely use a collection of flat polygons with custom texture and transparency maps.
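For readers unfamiliar with particle animation, here is a minimal sketch of the algorithmic idea: generate many tiny ring clumps procedurally instead of modeling each by hand. The particle count, radii (loosely based on Saturn's ring dimensions in kilometers), and thickness are illustrative values only.

```python
import math
import random

# Toy particle generator for a planetary ring: clumps are scattered
# uniformly in an annulus, with a small Gaussian spread in height.

def ring_particles(n, r_inner, r_outer, thickness, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    particles = []
    for _ in range(n):
        r = rng.uniform(r_inner, r_outer)      # radial position
        theta = rng.uniform(0.0, 2.0 * math.pi)  # angle around the planet
        z = rng.gauss(0.0, thickness)          # slight vertical scatter
        particles.append((r * math.cos(theta), r * math.sin(theta), z))
    return particles

ring = ring_particles(10_000, r_inner=74_500, r_outer=136_800, thickness=0.5)
```

This is exactly the kind of workload that favored offline rendering in 1999: trivial to generate, but far too many objects for a real-time IG to draw.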
Once all the models are complete and textured, they must be translated into the desired image generator format and downloaded to the IG and to a show-control workstation. Once everything is "installed", then comes the task of positioning models and preparing flight paths for both the objects and/or the view camera. Those details are very system-specific and are beyond the scope of this article. In any case, once everything is roughly positioned and timed, you're ready to finalize the sequence and move on.
To replicate this same scene in an offline environment, you again start with basic models and textures. This time you don't have to worry about level-of-detail models and polygon counts (though these techniques can save you some rendering time). There are also a number of "special effects" that you can add in the offline system not currently available in real-time systems. For example, you might develop a complex particle system to represent the rings where all of the individual particles are moving independently and realistically with the proper gravitational effects. You might also add layering effects to the planetary surface to simulate cloud layers. All these effects add rendering time, but they also add a stunning amount of realism to the finished sequence. Finally, the tools for creating object paths and camera paths are far superior to most real-time show production software. Furthermore, in most cases you can do all of your modeling, animation, and rendering in one software package, on one computer. Real-time often requires you to work with separate modeling, animation, and control packages, and several different computer systems.
As you read through the following product reviews, keep in mind that the technological issues of all-dome video are just one small part of the equation.
Regardless of which system you may prefer from a technical standpoint, there may be larger, more difficult questions to pose:
· Can I afford it?
· Will it help further the goals of my planetarium?
· Will it help me reach my audience more effectively?
You should also consider the time and money you'll spend on maintenance and production. Another important consideration is whether you have the creative and technical talent on your staff to effectively use the system, or if you prefer to use external production houses and private consultants. Any full-dome video system will more than likely require at least two, perhaps three full-time employees with very specific skill sets. The potential for these systems is great, but they require a significant resource commitment.
SkyVision Product Review
This review does not constitute a recommendation nor endorsement for any product or company.
Sky-Skan, Inc.
51 Lake Street
Nashua, New Hampshire 03060-4513 USA
Contact: Steve Savage
+1 800 880 8500
Cabletron, a large hardware vendor in the networking business, hired Sky-Skan, through a series of subcontractors, to produce and operate a demonstration program for Cabletron at the recent Networld & Interop show in Atlanta, Georgia USA. The program was given in a small 27-ft (8.2m) vacuum dome produced by ProDome (Antti Jannes & Co. in Finland). In addition, there was a Digistar II instrument, several moving-mirror incandescent fixtures, and an infra-red sound system. As an aside, the seats were from an automobile manufacturer and were very comfortable! Sky-Skan also demonstrates SkyVision in their 30-ft (9.1m) dome in Nashua.
SkyVision currently consists of six Barco video projectors (with outboard line quadruplers) for imaging on the dome. Five projectors form a continuous horizon image, and the sixth projector forms a "cap" that fills in the zenith. Each projector was mounted underneath the springline of the dome. SkyVision, as assembled in Atlanta, used Barco 801s projectors running at approximately 80% brightness. Sky-Skan also offers a high definition system called SkyVision HR. Using the same configuration, this system makes use of six Barco 1209s projectors. The digital video workstations that feed the projectors have an initial on-line video playback capacity of approximately 25 minutes and the native resolution for each frame of video is 1726 x 1296 pixels. This yields an image resolution approaching that of an IMAX frame across the dome.
All of the projectors are accessed through the SPICE automation system and each can function as a stand-alone unit along with providing SkyVision output. This makes the system extremely flexible when it comes time to incorporate more traditional video sources (e.g. laserdisc, DVD, and tape formats) into a program.
Producing the images were six PC-compatible computers, each with a compressed video output card. In addition, the "zenith" computer also had a SMPTE timecode generator, and an additional card that delivered eight channels of digital audio for the show's soundtrack. Each machine also had a removable storage unit with a 9GB hard disk.
SPICE automation controls the SkyVision system, allowing programmers to search for and play from individual frames of video, and to play segments by name. The six-computer configuration is fairly flexible, and may change in subsequent revisions of the system. One advantage of keeping all six computers is that when it comes time to render new footage, you can have six processors working on the job.
In its current configuration, SkyVision supports two hours of online storage. That's two hours of full-dome imagery without changing disk drives. There are other storage options that can provide up to eight hours of online video if needed. Since the system uses removable technology, it is just as easy to swap in a new set of drives for additional content. In any case, SkyVision content storage is very flexible.
SkyVision supports interactivity through Sky-Skan's proprietary hardware/software combination, along with some creative pre-production. Since all of the content is pre-rendered and stored to hard disk, multi-path programs (where the audience chooses various topical segments during the course of the show) are quite simple to execute. In fact, when compared to laserdisc, hard disk based video can offer faster search times and more flexible control over playback. More advanced forms of interactivity are also possible, though it may require some extra effort during pre-production to assemble all the content in a meaningful way.
The production process for SkyVision is relatively straightforward; the magic is in the software. Sky-Skan has produced a clever production tool that can take a computer image file and dice it up such that it can be displayed as a whole by the SkyVision projectors. You can use almost any image, though to achieve full-dome video you will probably use a fish-eye lens (either real or virtual) to generate an appropriate hemispherical representation of the subject.
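The dicing step can be illustrated with a toy example. A real tool would also warp each region for the dome geometry and apply soft edges; the sketch below shows only the core crop-per-channel idea, and the region table is entirely hypothetical.

```python
# Illustrative "dicing" of a rendered master frame into one sub-image
# per projector channel. Only axis-aligned crops are shown; warping and
# edge feathering are omitted.

def dice(frame, regions):
    """frame: 2D list of pixels; regions: {channel: (x, y, w, h)}.
    Returns a cropped sub-frame for each projector channel."""
    out = {}
    for channel, (x, y, w, h) in regions.items():
        out[channel] = [row[x:x + w] for row in frame[y:y + h]]
    return out

# A tiny 4 x 4 "frame" split into two 2-pixel-wide channels.
frame = [[(r, c) for c in range(4)] for r in range(4)]
channels = dice(frame, {"left": (0, 0, 2, 4), "right": (2, 0, 2, 4)})
```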
Perhaps the two most common ways of producing SkyVision images are via animation and compositing. When creating animated sequences, the renderer is set up with a virtual camera that mimics a fish-eye lens. This compensates for the distortion that occurs when projecting onto a hemispherical screen. The detail of an animated sequence is limited only by time and the sophistication of your rendering software. It is also possible to use video and film footage shot in more familiar rectangular formats. Using a compositing tool (e.g. Adobe After Effects) one can stretch and position footage for use with SkyVision. Note that since the date of this review, Sky-Skan has made progress on streamlining the SkyVision production process and has integrated additional content development tools.
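The virtual fish-eye camera corresponds to a simple pixel-to-direction mapping, sketched below under the common "dome master" convention (image center = zenith, image edge = horizon, radius proportional to zenith angle). This is a general illustration of the geometry, not Sky-Skan's actual implementation.

```python
import math

# Map a pixel in a square dome-master image to a unit view direction
# on the hemisphere (azimuthal equidistant / fish-eye convention).

def pixel_to_direction(px, py, size):
    """Returns a unit (x, y, z) vector for pixel (px, py) in a
    size x size image, or None for pixels outside the image circle."""
    # Normalize to [-1, 1] with the image center at the origin
    x = 2.0 * px / size - 1.0
    y = 2.0 * py / size - 1.0
    r = math.hypot(x, y)
    if r > 1.0:
        return None  # corner pixels fall outside the dome
    zenith = r * (math.pi / 2.0)   # image edge = 90 degrees from zenith
    azimuth = math.atan2(y, x)
    return (math.sin(zenith) * math.cos(azimuth),
            math.sin(zenith) * math.sin(azimuth),
            math.cos(zenith))
```

A renderer's fish-eye camera evaluates essentially this mapping in reverse: for each output pixel it casts a ray along the corresponding direction.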
SkyVision Strengths & Criticisms
Probably the most important technical considerations to address are image quality, content production, and maintenance. The full-dome image generated by SkyVision is surprisingly good, but varies with content. With proper alignment, the seams between projectors are nearly invisible; depending on the image being projected, sometimes the seams are not detectable at all. Projector alignment will drift with time, though, and will likely require regular adjustments to maintain the best image. High-detail, natural footage such as Earth-bound panoramas seems to be more forgiving than some animated sequences when it comes to detecting misalignments.
Edge blending between the five horizon projectors is excellent. Edge blending inconsistencies between the zenith projector and the others are much more noticeable. As with any multi-projector system, the "soccer ball" effect is unavoidable when viewing large, bright, low-detail areas such as a daytime summer sky. (Keep in mind that this review was conducted in an inflatable dome, and it is nearly impossible to accurately align multiple projectors in such an environment - actual installations provide much better results.) Given a bit more time, the engineers at Sky-Skan say they can tune the image blending algorithms to minimize the visual impact.
Content is perhaps the largest factor in evaluating image quality. Some material looks absolutely wonderful on SkyVision, while other sources highlight its weak points. Our guess is that this effect has as much to do with psychology and the human visual system as it does with the technical aspects of multi-image projection.
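For readers curious what edge blending amounts to, here is a toy blend ramp of the kind applied in the overlap zone between two projectors: each projector's intensity is attenuated so the two contributions sum to full brightness. The gamma pre-correction and its value are simplifying assumptions, not any vendor's algorithm.

```python
# Toy edge-blending ramp. In the overlap zone, projector A fades out as
# projector B fades in; their light outputs should sum to unity.

def blend_weight(t, gamma=2.2):
    """t in [0, 1]: position across the overlap zone, 0 at this
    projector's full-image side, 1 at its far edge. The ramp is linear
    in light output, pre-corrected for an assumed display gamma."""
    linear = max(0.0, min(1.0, 1.0 - t))
    return linear ** (1.0 / gamma)
```

The key invariant is that, in linear light, the two projectors' weights at mirrored positions add to one, which is what hides the seam.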
Some planetarians who have used large format video projectors may feel a bit underwhelmed by the brightness offered by CRT-based systems, especially in larger domes. Thankfully, this isn't so much of a problem when using an all-dome video system by itself. That is, when the eye can't compare between a smaller, brighter projector and a larger, dimmer image, the perceived contrast ratio is very high and the image appears to be quite acceptable. For mission-critical applications, the image brightness issue can be overcome by doubling the number of projectors, effectively having two projectors per frame and having an instant backup for the theater.
It may occur that while producing an animated sequence for SkyVision, you spend all night rendering only to put the result up on the dome and find that it's not acceptable, for whatever reason. Careful planning and pre-production can minimize these troubles, but it's still a fact of life. To help alleviate this problem, one might do some production work in the dome itself, using one of the SkyVision projectors as a preview monitor. Then it is possible to adjust colors, intensity, detail, alignment, etc. such that it looks best when viewed on the dome, rather than on a computer screen.
Perhaps the greatest strength of SkyVision is the ability to produce detailed, Hollywood-style imagery with well-known tools. Granted, the time required to render complex scenes is substantial, but it's the realism that modern audiences demand. Another strength of SkyVision is that it takes a software approach to solving projection geometry and overlap issues. This lowers the cost to the end user because software is easy to reproduce and upgrade and does not rely on more expensive proprietary "black box" hardware.
SkyVision is offered in full and partial dome configurations.
The first SkyVision installation was unveiled at the Houston Museum of Natural History's Burke Baker Planetarium on December 11, 1998.
StarRider Product Review
This review does not constitute a recommendation nor endorsement for any product or company.
Evans & Sutherland
600 Komas Dr.
Salt Lake City, Utah 84108 USA
Contact: Jeri Panek
+1 801 588 1000
The Evans & Sutherland Digital Theater division has constructed a demonstration and development theater at their headquarters in Salt Lake City, Utah. This theater features a 36-ft (11m) variable tilt Astro-Tec dome, Sky-Skan automation and sound reinforcement, a Digistar II digital planetarium, and a full dome StarRider projection system.
StarRider is currently based on the ESIG (Evans & Sutherland Image Generator) and the PRODAS display system from SEOS. PRODAS consists of six specially modified Barco CRT video projectors and a proprietary edge blending system to create a seamless dome image. The projectors are arrayed in a five-segment panorama with a sixth projector filling in at the zenith. (This configuration is very similar to SkyVision, with some differences in projector placement.) StarRider projectors normally reside in a cove space or projection gallery such that the front lenses sit just beneath the dome springline. PRODAS comes with a rather elaborate remote control panel that is used to administer all aspects of projector setup and operation. Unfortunately, the unit is not immediately compatible with any automation system; that functionality may arrive shortly.
With the addition of a video source switcher, StarRider can accommodate other input sources (e.g. laserdisc, DVD, SkyVision, and tape formats). Using these alternate sources and some creative animation techniques, it is theoretically possible to create non-real-time content for playback on StarRider. It is also possible to turn off the edge-blending hardware. In effect, this makes each projector behave as a "normal" Barco and provides six discrete channels of video.
The graphics muscle behind StarRider is the Evans & Sutherland line of image generators. As previously mentioned, the current version of StarRider ships with the ESIG - a proven technology that is used extensively in other E&S simulator product lines. StarRider is also available with the new Harmony and Ensemble image generators, both from E&S. Ensemble will come in at the lowest price point, using custom PC-based graphics technology. Harmony will offer the highest performance and image quality. Harmony uses several proprietary graphics engines to generate the six simultaneous video streams that drive StarRider. The IG is based on a number of custom chip designs and runs under a specially designed real-time operating system which results in unmatched performance. Harmony supports a number of breakthrough graphics technologies such as texture sharpening, real-time Phong shading, and a multisample depth buffer. Suffice it to say that Harmony is a very complex piece of engineering that is still in its infancy. I strongly recommend that you explore the E&S website if you're interested in these and other technical details of Harmony.
FuseBox is the software product that controls the Harmony IG and integrates the entire StarRider system. FuseBox is a show production and show control tool that brings together models, textures, and other assets, into a visual scripting environment. Show elements respond to system and user definable events (e.g. time cues), and "paths" help define object motion. In addition, FuseBox is the hub of StarRider's interactive capabilities. StarRider uses flight sticks and an armrest keypad for audience participation. FuseBox is a rapidly evolving tool that is being tuned to the program development needs of StarRider. Its learning curve is steep, but therein lies its power.
StarRider audio is handled by SawPro, which is a commercial multitrack audio editor and playback system. SawPro can support up to 32 tracks of simultaneous audio playback provided that you have enough sound cards in your host computer (a Pentium-class system with at least 128Mb RAM). SawPro is the SMPTE show source for StarRider and is triggered by FuseBox via MIDI. In case you're wondering, a 20-minute show with six audio channels requires about 650Mb of disk space if the material is stored at CD quality (44.1 kHz sample rate, 16-bit resolution). As with most hard disk based audio systems, there is no wait for tape rewinding and the system can instantaneously jump to any place in the soundtrack, quite a boon during production.
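The quoted storage figure is easy to verify from the stated sample rate, bit depth, channel count, and running time:

```python
# Checking the soundtrack storage estimate: six channels of
# 44.1 kHz / 16-bit audio for a 20-minute show.

SAMPLE_RATE = 44_100      # samples per second per channel
BYTES_PER_SAMPLE = 2      # 16-bit resolution
CHANNELS = 6
MINUTES = 20

bytes_total = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * MINUTES * 60
megabytes = bytes_total / 1_000_000  # about 635 MB, matching "about 650Mb"
```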
The production process for StarRider is somewhat complex. Each task, in and of itself, is not overly difficult, but each has its own separate challenges. To begin, all of the visuals in a show must be modeled and textured. The modeling process can be done with tools like 3D Studio Max and MultiGen. The challenge is to create models that will work well in a real-time architecture. Perhaps the most important point is that models should have low polygon counts. In the process of creating textures, one must consider how colors will change when viewed on a large screen display, the effects of transparency, along with the physical size and image complexity of the texture. There are substantial differences when modeling for offline or real-time systems.
Once all the assets are generated, the next step is to begin organizing and developing the show in FuseBox. Models are positioned and oriented in the virtual world, given flight paths and other attributes, and events are timed to match appropriate script and score cues. During the course of production, one must keep in mind the capabilities of the real-time IG. In order to maintain image frame rates, a limited amount of detail may be present in each of StarRider's six video channels.
StarRider Strengths & Criticisms
StarRider shares most of the basic technical challenges found in multi-projector all-dome video systems: image quality, content production, and maintenance. StarRider's image quality is a function of content, production technique, and projector tuning. Visuals must be well modeled, strategically placed, and motion must be scripted with care. Harmony and the other E&S IGs are relatively forgiving technologies, but they do have limits when it comes to the complexity and placement of StarRider visuals. Specifically, there is a limit on the amount of detail that can be displayed in any one channel of the IG (recall that StarRider is a six-channel system, one channel per projector). Furthermore, aligning and color matching the StarRider projectors is a challenging process. In order to maintain the best image, the projectors will most likely require bi-weekly adjustments. It is important to note that when the system is properly tuned, the resulting image is seamless and very pleasing to the eye.
Developing StarRider content requires a staff of creative and skilled professionals. The terminology and technical challenges of real-time are daunting. There's also a steep learning curve when it comes to the highly specialized software used to create and control models. Thankfully, one can use popular software like 3D Studio Max to create models, but one must use FuseBox to manipulate them within the context of a show. Still, the very best StarRider shows will be produced by those who have a firm grasp of real-time modeling concepts.
Perhaps StarRider's greatest technical achievement is truly interactive production and presentation. Interactively placing and moving visual elements on the dome is a relatively new production model and one that offers a tremendous amount of creative freedom. Furthermore, the real-time processing of StarRider allows one to develop audience interfaces that are unique and robust.
StarRider is normally sold as a complete package with dome, projectors, Digistar II digital planetarium, sound system, interactive hardware, effects, automation, and software. StarRider and Digistar II work well together on the dome, but they are wholly separate development environments. E&S currently offers full and partial dome StarRider systems.
The first StarRider installation was unveiled at Chicago's Adler Planetarium on December 4, 1998.
ElectricSky Product Review
This review does not constitute a recommendation nor endorsement for any product or company.
Spitz, Inc.
PO Box 198, Route 1
Chadds Ford, PA 19317 USA
Contact: Jon Shaw
+1 610 459 5200
Spitz has several demonstration domes at its headquarters in Chadds Ford, Pennsylvania. ElectricSky™ is currently housed in their 40-ft (12.2m) dome (10 degree tilt). The theater also showcases a Spitz planetarium instrument, Spitz's new ATM-4 automation system, and a full complement of all-sky and special effects projectors.
ElectricSky is offered in several configurations. To be more correct, ElectricSky is a member of a family of products called ImmersaVision™, an immersive multimedia theater system developed by Spitz. All-dome (immersive) video is just one aspect of ImmersaVision. Currently, the most extensively supported suite of products includes ElectricHorizon and ElectricSky. For this review, we are taking a look at ElectricSky as configured with three projectors providing a 200 x 60 degree field-of-view. Spitz also offers a four-projector system (panorama with top-cap) and a seven-projector full-dome array. ElectricSky uses newly-developed Electrohome dome projectors with advanced geometry correction and edge blending technologies from Panoram.
The entire system is integrated within the ATM-4 automation software, allowing random video source selection, routing, and output format. ElectricSky provides support for CRV, laserdisc, DVD, tape, digital disk recorders, and workstation source material. ATM-4 also automates the edge blending hardware such that blended and non-blended source can be displayed within the same program.
Spitz developed a 10-minute demonstration program to showcase the ImmersaVision format. The first performance was delivered from a trio of CRV discs, and the second from DVDs. Without a side-by-side comparison, it's difficult to see any differences between the two source formats; both were of excellent quality. Spitz is also exploring hard disk based storage options with an eye toward an integrated media server. Recording source material to a CRV disc is relatively simple, but the discs hold less than a half hour of video per side. On the other hand, DVDs hold much more content but are currently somewhat expensive to create. Keep in mind that almost any video playback format can be used and the folks at Spitz seem to be generally flexible in supporting customer-preferred equipment.
Audio can originate directly from the playback devices or from a separate digital tape or disk recorder. ElectricSky uses the 5.1 surround sound standard from either encoded source or discrete channels. The ElectricSky specification outlines a complete theater treatment for sound reproduction, including speaker types, placement, and reinforcement hardware.
ATM-4 automation controls all aspects of ElectricSky through a new Windows interface. ImmersaVision content is treated as a single playback system with a standard set of control options. In addition, ATM-4 supports interactivity via proprietary hardware (audience responders) and integrated software control. Like any other pre-rendered architecture, interactive and multi-path programs require a bit of pre-production effort. Any time the audience is given a choice, two or more separate bits of content must be generated and stored for real-time retrieval during the program.
The production process for the ImmersaVision format is greatly simplified through a number of custom utilities and plug-ins that work with off-the-shelf production tools like After Effects and Photoshop. In addition, Spitz has developed a special plug-in for the popular program 3D Studio Max, called ImmersaMax, used to generate CG content for ImmersaVision.
ImmersaVision content can originate from a number of different source material formats including film, video (HD and NTSC), panoramic and hemispherical video and film, computer graphics, and still images. In each case, a producer can choose the form of spherical correction, if any, that needs to be applied to the source material to ensure that it is displayed correctly on the dome. Spitz is the only manufacturer that offers the ability to set an eyepoint when correcting materials for display on a dome. That is, every other system assumes that the viewer is seated in the very center of the theater, which is usually the location of the planetarium instrument. With Spitz's utilities, you can create a view that is better suited to your particular theater layout.
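To see why the eyepoint matters, consider a small geometric sketch (our own illustration, not Spitz's software; the function names and coordinate conventions are hypothetical). The direction to a point on the dome surface depends on where the viewer sits, so a correction computed for the dome center is slightly wrong for every off-center seat:

```python
import math

def dome_point(az_deg, alt_deg, radius=1.0):
    """Cartesian coordinates of a point on a hemispherical dome
    (origin at dome center, y toward front of theater, z up)."""
    az, alt = math.radians(az_deg), math.radians(alt_deg)
    return (radius * math.cos(alt) * math.sin(az),
            radius * math.cos(alt) * math.cos(az),
            radius * math.sin(alt))

def apparent_direction(az_deg, alt_deg, eyepoint=(0.0, 0.0, 0.0), radius=1.0):
    """Azimuth/altitude of a dome point as seen from an arbitrary eyepoint.
    With the eyepoint at the dome center this returns the input angles;
    an off-center eyepoint shifts the apparent position, which is exactly
    what an eyepoint-aware spherical correction compensates for."""
    px, py, pz = dome_point(az_deg, alt_deg, radius)
    ex, ey, ez = eyepoint
    dx, dy, dz = px - ex, py - ey, pz - ez
    az = math.degrees(math.atan2(dx, dy))
    alt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return az, alt
```

For a viewer seated toward the rear of the theater, for example, features ahead of them appear lower on the dome than they would from the center seat; a correction utility with a settable eyepoint can pre-warp the imagery accordingly.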
Because ElectricSky uses hardware edge blending there are a number of other image sources that can be considered. For example, you can connect a desktop PC/Macintosh to the system, displaying the computer desktop across three full projectors. ElectricSky can also be driven by multi-channel visualization systems (from Silicon Graphics, Intergraph, HP, etc.), and other real-time image sources. This is a tremendous advantage during production because you can test source material without having to split it up into three separate frames and then apply soft edges for display. In the case of ElectricSky, just open a window containing an image with the correct aspect ratio and you're done! One might also imagine playing video games on this enormous display, or perhaps seeing every cell in a large spreadsheet. The possibilities are quite exciting.
ElectricSky Strengths & Criticisms
Spitz's video panorama and ImmersaVision projection format are more than a collection of software and hardware. In developing these technologies, Spitz spent a great deal of time researching large-format immersive displays. What they've come up with is an extremely flexible system that can accommodate a diverse range of source material and a production and presentation philosophy that is based on the science of visualization. Of all the systems reviewed thus far, Spitz has demonstrated the greatest amount of technical flexibility and product forethought.
Spitz blends their video projectors with a 25% overlap, which is a bit more than the other manufacturers use. This larger overlap seems to have a positive effect on the resulting image, giving the very best color blending, and absolutely seamless geometry blending. Spitz also uses a circular top-cap, reducing the edge-blend artifacts that can be quite harsh in a pentagonal cap (a la SkyVision and StarRider).
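The point of the overlap region is that, within the seam, each projector's output is attenuated by a ramp complementary to its neighbor's, so the summed intensity stays constant across the blend. As a rough illustration only (this is a generic raised-cosine blend, not Panoram's actual hardware algorithm; the function name and the normalized image coordinate are our own assumptions):

```python
import math

def blend_weight(x, overlap=0.25):
    """Edge-blend attenuation for a projector channel whose image spans
    x in [0, 1], with an `overlap` fraction shared with the neighbor on
    each side.  A raised-cosine ramp is used because complementary
    weights from adjacent channels sum to exactly 1 across the seam."""
    def ramp(t):  # smooth ramp: 0 at t=0, 1 at t=1
        return 0.5 - 0.5 * math.cos(math.pi * min(max(t, 0.0), 1.0))
    w = 1.0
    if x < overlap:            # left seam: fade in
        w *= ramp(x / overlap)
    if x > 1.0 - overlap:      # right seam: fade out
        w *= ramp((1.0 - x) / overlap)
    return w
```

A wider overlap, like the 25% Spitz uses, gives the ramp more room, so small errors in projector alignment or gamma matching produce a gentler, less visible artifact at the seam.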
Like SkyVision, ElectricSky production uses mostly off-the-shelf tools and popular software packages for the manipulation and generation of content. Spitz, however, has developed additional custom utilities that allow an illustrator or animator to use virtually any software package for content creation, even if that software doesn't support spherical rendering or custom image warping! The other tremendous advantage to ElectricSky is the ability to preview content on the dome without having to split images and pre-blend. In fact, you can use ElectricSky as a working desktop and produce images right on the dome.
Spitz is currently focused on the three-projector ImmersaVision format, with a sound philosophy and research to back up their development efforts. They are working to build more support for their full-dome video product, though Spitz did not demonstrate full-dome capability during the review. There's no doubt that a tremendous amount of content can be effectively displayed within the ImmersaVision format. A planetarium, though, implies a complete hemisphere and sometimes it's necessary to exploit the full dome for maximum effect. Bear in mind that full-dome configurations can be much more expensive, and they require more complex production techniques. There is a clear trade-off and a planetarium's choice may depend on cost, support, production and maintenance issues. Formats like ImmersaVision provide a cleaner, more uniform image than full-dome, are easier to maintain and operate, and provide a very dramatic effect when used well. It's not an easy decision.
The first ElectricSky theater was unveiled at the Northern Lights Centre in April of 1997. The Northern Lights Centre is located in Watson Lake, Yukon Territory, Canada.
VisionDome Product Overview
This overview does not constitute a recommendation nor endorsement for any product or company.
Alternate Realities Corp.
Durham, NC USA
Contact: Kenneth Galluppi
+1 919 217 1497
VisionDome is a system for projecting full-color, full-motion graphics, created and manipulated in a 3-D computer environment. The technology is most similar to that of StarRider, but instead of using several video projectors to cover the dome, VisionDome uses a single projector and fish-eye lens to achieve a full-dome image. VisionDome is a real-time architecture that shares many of StarRider's strengths and content development challenges.
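For readers unfamiliar with fisheye projection, the mapping for an idealized equidistant ("f-theta") 180-degree lens is simple: radial distance from the center of the frame is proportional to the angle from the zenith. The sketch below is a generic illustration under that assumption, not a model of ARC's actual optics, and all names are hypothetical:

```python
import math

def fisheye_pixel(az_deg, alt_deg, size=1024.0):
    """Map a dome direction (azimuth, altitude in degrees) to a pixel in a
    square fisheye frame, assuming an equidistant 180-degree lens with the
    zenith at frame center and the springline at the inscribed circle."""
    zenith_angle = math.radians(90.0 - alt_deg)        # 0 at the zenith
    r = (zenith_angle / (math.pi / 2)) * (size / 2.0)  # equidistant: r ∝ angle
    az = math.radians(az_deg)
    cx = cy = size / 2.0
    return cx + r * math.sin(az), cy - r * math.cos(az)
```

One consequence of a single-lens design is that all of the frame's pixels must cover the whole hemisphere, so per-pixel resolution on the dome is lower than in a comparable multi-projector array; the trade is simplicity and the complete absence of seams.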
Alternate Realities Corporation is located in North Carolina's Research Triangle Park, nestled between the cities of Raleigh, Durham, and Chapel Hill. Morehead Planetarium proved to be a convenient, yet challenging test for their system in an actual, working planetarium theater. (Until that time they had limited their activities to using the technology in a small, demonstration dome as a stand-alone system.) Besides the challenges of positioning the projector off-center, such a demonstration would test the ability of the system to project images over a much greater distance. Both the VisionDome and Morehead staffs were initially skeptical about how well the system's images would hold up projecting onto a 20.7-meter (68-foot) dome, but felt, nonetheless, that the challenge would be informative in evaluating VisionDome's capabilities and limitations.
After a couple of preliminary visits to evaluate the Morehead theater environment, and to arrange for an adequate electrical power feed, the VisionDome team arrived to conduct their test. Their equipment included a 3-D graphics workstation and image processor; a high-intensity, high-resolution video/graphics projector; a specially-designed optical assembly for the 180-degree projection; and a large, makeshift wooden stand for the projector.
On "test day", equipment setup was completed within only a couple of hours of arrival. The large projector had been placed on its stand, the long optical pipe - complete with integral fisheye lens - was mated and aligned to the projector, and the graphics workstation and processor was up and running. A few moments later, the first VisionDome images were being drawn. A variety of different images were displayed during the test, including fractal-style images, a graphical Space Shuttle launch, and a DNA double helix, among others.
The initial results were encouraging with a number of images that showed a surprising degree of sharpness and clarity. Motion of the manipulated "objects" was relatively smooth, with very little jerkiness evident. Objects were projected with a variety of background colors, but the best results were obtained when objects were placed against a black background.
Of course, it was assumed that there would be difficulties associated with the Morehead test. Some of the images displayed during the test were quite "soft" in appearance. The VisionDome people said this was because they were testing image-sequences of a variety of resolutions. It was obvious that only the higher-resolution images would be applicable for all-dome use. There was some distortion visible in the images, taking the form of the image appearing to rest atop a curved void of black extending about 75 degrees in azimuth and about 10-15 degrees in altitude at the void's apex. The VisionDome folks attributed this effect to an incorrect mechanical adjustment between the projector and the lens pipe. They explained that this would be easily correctable by re-machining the shim-plates between the two components. However, the good news was that the distortion that would normally be encountered by projecting images off-center is easily corrected by loading a computer algorithm into the graphics processor.
The main limitation seen during the test was Morehead's large dome size, which lowered the brightness, contrast, and overall color saturation of the images. In addition, Morehead's white, high-reflectance dome further reduced the overall contrast of many images - particularly those incorporating non-black backgrounds - because of "cross-bounce". (This is a phenomenon familiar to all-dome film people, and is why such theaters have gray domes to reduce the overall reflectance, and thus, the cross-bounce effect.) Both the VisionDome and Morehead personnel suspected that lowered brightness, contrast, and color would be negative factors in the test, but were, nonetheless, pleasantly surprised that the images "held up" as well as they did. However, because of these limitations, VisionDome, as currently configured, is not optimized for large-dome applications. And given the need for lowered dome reflectance, the system is probably best suited for domes roughly 12 meters (40 feet) and smaller.
VisionDome Strengths & Criticisms
As with most of the systems under review, content for VisionDome is a primary concern. VisionDome's graphics workstation and application software appeared to be quite functional. However, since there is currently little in the way of appropriate ready-to-go graphical sequences for the system - particularly those which are astronomical in nature - the burden appears to rest primarily on the shoulders of the end-user. Facilities considering VisionDome or any other similar graphical system for the planetarium must consider the issue of content availability. With a trend toward smaller staffs in planetariums these days, many facilities may be hard-pressed to create original images for use in programs, given the staff time and expertise needed to generate even the simplest 3-D objects and manipulated sequences. To that end, Mr. Galluppi and the engineers at ARC are interested in approaching the planetarium community as a potential market and are looking for artists/designers to develop visual content.
The Morehead demonstration should be looked at as a worst-case scenario. Not only was the dome extremely large, but the system was tested with an older generation of video projector. Newer projectors from the same manufacturer can produce brighter, sharper images at higher resolutions. The greatest potential for VisionDome is in smaller theaters where the image can be most effectively used.
Alignment and color balancing with VisionDome are greatly simplified since there is only one projector and one lens. There are no bright spots, overlap areas, or other alignment headaches to deal with. While the system is not maintenance-free, it is much less expensive to own and operate.
An ideal partnership would probably be to install a VisionDome system into a college or university planetarium where staff and students could make use of it as a visualization platform and showcase for student graphics work. VisionDome is available in a number of configurations and price-points.
Questions and comments regarding these reviews should be made directly to the respective vendor or to the IPS Technical Committee:
IPS Technical Committee Chair
The Renaissance Center
719 East College Street
Dickson, TN 37055
+1 615 446 1985
Evans & Sutherland: www.es.com/Products/Edutain/starrider.html
Alternate Realities: www.virtual-reality.com/products.html
3D Studio Max: www.ktx.com
Electric Image Animation System: www.electricimage.com
MultiGen real-time modeling tool: www.multigen.com
Barco Projection Systems: www.barco.com/projecti/index.htm
PRODAS multi-projector displays: www.seos.co.uk
Panoram Technologies: www.panoramtech.com
Reprinted from the Planetarian, Vol 28, #1, March 1999. Copyright 1999 International Planetarium Society. For permission to reproduce please contact Executive Editor, Sharon Shanks.