VISFILES

Vol.32 No.4 November 1998
ACM SIGGRAPH



A Perspective from Industry



T. Todd Elvins
San Diego Supercomputer Center


Mike Krogh contributes this issue’s VisFiles. This — our first perspective from industry — examines a problem well known to many of you: how does a virtual reality (VR) software developer support the wide assortment of VR peripherals for which there are no standard APIs? As always, you are welcome to contact our guest author or me if you have an opinion on this subject. If we receive some spicy viewpoints, perhaps we can convince Gordon to publish them in future issues. (Editor/Gordon’s note: the more the merrier - bring them on!)

— T. Todd Elvins

Interfacing Commercial Applications to Virtual Reality Environments



Michael Krogh and Anders Grimsrud
Computational Engineering Intl.

Abstract

Virtual reality is in the process of transitioning from a research discipline to an application area for both engineers and scientists. For VR to be truly successful for these users, compelling applications must be made available that support the full range of VR user interface devices. The variety of such devices, and the lack of standards for interfacing with them, creates a challenge: how can an application be provided in executable form while still allowing user-customizable software interfaces to the user’s VR devices? Dynamic shared libraries provide an answer.

Introduction

VR had its origins in the late 1960s with Ivan Sutherland’s work [1], but it wasn’t until the late ‘80s that it came into fashion with the advent of goggles, datagloves and datasuits. At that time virtual reality was a novel research area, with researchers busy experimenting and fantasizing about VR’s potential while hardware, software, simulations and environment prototypes were being built. Additionally, physicists, chemists, astronomers and other scientists started to apply this technology to their own research. Since then, virtual reality has transformed from being primarily a research area into a set of technologies for applied research, design and development that aid analysis, communication and marketing.

VR is finding applicability in fields as diverse as art, tele-collaboration, gaming, sales and finance. Current scientific and engineering VR applications include brain surgery planning, the NASA Mars Pathfinder mission analysis [2], virtual windtunnel exploration of computational fluid dynamics data [3], ocean modeling [4] and collaborative vehicle design [5].

While VR has unquestionably matured, one could nonetheless argue that it is still in an adolescent stage. It still predominantly exists in the research labs of the government, universities and some large corporations — however, this too is changing. High performance graphics hardware has become ubiquitous and relatively inexpensive. Furthermore, virtual reality “gear,” the high tech displays and input devices, is now readily available with a wide range of capabilities, and cost is no longer such a barrier to implementation. Missing, however, are the applications for end users, in particular engineers and scientists. Most available software is still written for VR researchers and developers. Very little exists for virtual reality gear that works “out-of-the-box.” One explanation is the classic “chicken/egg” dilemma: applications tend not to be available until there is a significant user base, while the user base doesn’t grow until there are compelling applications. An additional problem lies in interfacing applications to the numerous VR peripherals available today. Standard application programming interfaces (APIs) do not exist for these peripherals.

The challenge for application developers becomes one of how to support the multitude of VR devices to which the developer does not have access. One answer is through dynamic shared libraries.

Previous Work

Early virtual reality simulations and environments were written as custom applications for a variety of reasons: applicable software did not yet exist, performance requirements were demanding, and interfacing applications to the variety of novel I/O devices was difficult, among others.

As various I/O devices became more prevalent, de facto software standards, such as the Cave library [6], emerged. Libraries such as these eased new simulation development by providing the core of the rendering “loop” for supported VR devices. Developers needed only write subroutines for drawing the geometry for display on the VR devices. While this simplified some of the software development, developers were still left with the task of writing the remainder of the simulation and visualization algorithms.
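
The division of labor this describes can be sketched as follows. The vr_run_loop routine below is a stand-in defined here so the example is self-contained; it is not the Cave library API, and a real VR library’s loop would also poll trackers, set per-display projections and swap buffers each frame.

  /* Sketch of the "draw callback" model: the library owns the rendering
   * loop, the developer supplies only the geometry-drawing subroutine.
   * vr_run_loop is a stand-in, not the actual Cave library API.          */
  #include <stdio.h>

  typedef void (*draw_fn)(void);

  /* Stand-in for the library-provided loop.                              */
  static void vr_run_loop(draw_fn draw, int frames)
  {
      for (int i = 0; i < frames; i++)
          draw();
  }

  /* The only routine the simulation developer needs to write.            */
  static void draw_scene(void)
  {
      printf("drawing one frame of simulation geometry\n");
  }

  int main(void)
  {
      vr_run_loop(draw_scene, 3);
      return 0;
  }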

To further ease visualization development, researchers explored integrating VR support into visualization development environments such as AVS [7,8]. Again, end users were mostly freed from the details of interfacing to supported VR devices. They needed only write their VR simulation within the commercial visualization environment, which usually consisted of “wiring-up” networks of interconnected subroutine-level modules. Additionally, new modules could be written in common programming languages when existing modules did not provide the needed functionality.

De facto VR libraries and development environments are wonderful tools for building prototypes and doing VR research. However, many end users (engineers, physicists, chemists, etc.) do not possess the skills or the interest to write and/or assemble their own visualization tool; nor do they necessarily have access to a visualization person to do it for them. A complete tool that works “out-of-the-box” is much more desirable for this user base.

However, such a complete tool is not that simple to develop and support. Standards for VR devices are still sparse, making it difficult for application developers to provide adequate support for the variety of VR devices and the numerous ways that these may be configured on the end-user’s computer.

Compelling Applications

To be compelling to scientific and engineering end users, a VR application must possess several characteristics.

  • The application must provide the scientific or analysis capabilities that the user desires (e.g. isosurfaces, clip planes, data calculation, statistics).
  • The application needs to deliver sufficient rendering performance to preserve the illusion of reality as well as sufficient processor performance for feature calculations (e.g. isosurface calculation).
  • The application must also support an interface mechanism to the particular VR devices for which the user has access.
  • The application must be easy to use and must not require significant learning time (both in terms of learning how to navigate within the VR environment and how to operate the application).

The last item is important because many users are initially skeptical of VR’s benefits. In other words, for VR to be a viable approach, the user must see a benefit in a short period of time and with little effort. This can be achieved with a well-designed user interface that functions similarly both on a desktop workstation and within a virtual environment.

A common challenge for software vendors is deciding how to provide and support high-quality applications that interface to a wide variety of VR devices. Highly desirable is an extensible mechanism that supports current, as well as future, user-configurable VR devices without requiring source or object code access or modification. The approach implemented by Computational Engineering Intl. in their EnSight product is called User Defined Functions (UDFs).

Dynamic Shared Libraries

UDFs are based on dynamic shared libraries. While they may go by different names, such as DSOs and DLLs, these libraries exist on most flavors of Unix and on Windows 95 and NT. Three interacting components are required: an application, a separate library of subroutines that are called by the application, and operating system support (typically provided by the run-time loader on Unix systems). The application calls subroutines in the library as it would call any other subroutine. The subroutines are written no differently than if they had been linked directly with the application. The run-time loader provides the mechanism by which unresolved references within the application are matched to the subroutines within the library.
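
As a concrete illustration, the following minimal sketch shows the three components working together on a Unix system, assuming the POSIX dlopen interface; the library name libudf.so and the subroutine udf_version are made up for this example and are not part of any product’s API.

  /* udf.c -- the user-supplied library side: an ordinary C subroutine,
   * built as a shared library, for example with:
   *     cc -fPIC -shared -o libudf.so udf.c                              */
  int udf_version(void) { return 1; }

  /* app.c -- the application side, built with:  cc -o app app.c -ldl    */
  #include <stdio.h>
  #include <dlfcn.h>

  int main(void)
  {
      /* Ask the run-time loader to map the shared library into the process. */
      void *handle = dlopen("./libudf.so", RTLD_NOW);
      if (!handle) {
          fprintf(stderr, "dlopen failed: %s\n", dlerror());
          return 1;
      }

      /* Resolve a subroutine by name, then call it exactly as if it had
       * been linked directly with the application.                       */
      int (*udf_version)(void) = (int (*)(void)) dlsym(handle, "udf_version");
      if (!udf_version) {
          fprintf(stderr, "dlsym failed: %s\n", dlerror());
          dlclose(handle);
          return 1;
      }
      printf("library reports version %d\n", udf_version());

      dlclose(handle);
      return 0;
  }

At run time only the library’s pathname ties the application to the library, which is what allows an end user to substitute a different libudf.so without relinking the application.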

Nearly all applications these days use dynamic shared libraries to link to system libraries. For example, most applications built with X11 or Motif are linked against a shared object version of the library. There are at least three advantages to using DSOs in an application.

  • The on-disk size of a DSO-linked application can be significantly smaller than that of one linked with static libraries.
  • The DSO can be optimized for a particular hardware configuration, transparently to the application.
  • The library can be supplied by the end user, changing the functionality of the application without requiring a relink and/or recompile.

A minor disadvantage of DSO-linked applications compared to statically linked applications is that they tend to start up slightly slower, since the operating system must search the known libraries to resolve the unresolved symbols. The good news is that following start-up, a DSO-linked application performs comparably to one that has been statically linked.

Figure 1


Michael F. Krogh is a Senior Developer at Computational Engineering Intl. Prior to CEI he worked at Los Alamos National Laboratory as a Visualization Researcher. He has an M.S. in computer science from the University of Illinois at Urbana-Champaign. His interests include visualization, parallel and distributed computing and software engineering.

Anders Grimsrud is Vice President for Research and Development at Computational Engineering Intl. Prior to CEI he worked for Cray Research, Inc. as the Group Leader for Engineering Graphics. He has a Ph.D. in engineering from Brigham Young University.


T. Todd Elvins is a Staff Scientist at the San Diego Supercomputer Center. He recently finished his Computer Engineering Ph.D. at the University of California, San Diego, and his research interests include perceptually based user interface design, data and information visualization, Web-based imaging and computer graphics. He can be contacted at:

T. Todd Elvins
San Diego Supercomputer Center
University of California, San Diego
MC 0505
La Jolla, CA 92093-0505, USA

The copyright of articles and images printed remains with the author unless otherwise indicated.

User Defined Functions

User defined functions, popularized by applications such as Netscape Navigator and Adobe Photoshop with their plug-in architectures, were first included in EnSight a year ago for defining custom data readers. EnSight, a commercial engineering and scientific visualization application, reads data supplied in many of the popular commercial computational fluid dynamics and finite element analysis output formats. While many customers utilize these common data formats, many others have their own proprietary or non-standard formats. Custom data readers are UDFs whereby users may write a few subroutines that allow EnSight to read their files, thus avoiding file translation or waiting for the vendor to provide a reader. EnSight prescribes the API that a UDF must implement. The user writes the needed functions, in either C or FORTRAN, using a template outlining the required subroutines along with their input and output parameters. Following testing and debugging, the custom data reader can easily be invoked by EnSight at run time to read the new data type.
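
To make the template idea concrete, the sketch below shows roughly what a custom reader could look like; the entry-point names, signatures and the libmyreader.so name are illustrative assumptions only and do not reproduce EnSight’s actual reader API.

  /* myreader.c -- hypothetical custom data-reader UDF, for illustration
   * only; these entry points are assumptions, not EnSight's reader API.
   * Built as a shared library, e.g.:
   *     cc -fPIC -shared -o libmyreader.so myreader.c                    */
  #include <stdio.h>

  /* Report the size of the data set so the application can allocate storage. */
  int myreader_get_sizes(const char *filename, int *num_nodes, int *num_elems)
  {
      FILE *fp = fopen(filename, "r");
      if (!fp)
          return -1;                 /* signal failure to the application */
      /* ... parse the proprietary header here ... */
      *num_nodes = 0;
      *num_elems = 0;
      fclose(fp);
      return 0;
  }

  /* Fill application-owned arrays with node coordinates from the file.   */
  int myreader_get_coords(const char *filename, float *x, float *y, float *z)
  {
      /* ... read the proprietary coordinate records here ... */
      (void)filename; (void)x; (void)y; (void)z;
      return 0;
  }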

Following the same model, EnSight has been modified to allow users to write their own interfaces to input devices. This is especially useful since VR input devices are varied and often unique. Further, the standard mouse is difficult to use as an input device since it is often several feet away from the person using the VR environment (see Figure 1). EnSight’s requirements for the input UDFs are simple — the routine returns a normalized position, orientation and button state which, when combined, are used to perform transformations, picking and menu operations in the VR environment. This allows a user to, for example, interactively position particle traces or move objects within the virtual environment.
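
A sketch of such an input routine is shown below. The function name, signature and the choice of Euler angles are assumptions made for illustration and differ from the actual prototypes given in Insert 2; the point is only that the routine maps raw tracker readings to the normalized position, orientation and button state described above.

  /* mytracker.c -- hypothetical input-device UDF, for illustration only.  */
  int mytracker_get_state(float pos[3],     /* normalized x, y, z position */
                          float orient[3],  /* orientation, e.g. Euler angles */
                          int   *buttons)   /* bitmask of pressed buttons  */
  {
      /* A real UDF would query the tracker vendor's library or a serial
       * port here; this stub simply reports a centered, idle device.      */
      pos[0] = pos[1] = pos[2] = 0.5f;
      orient[0] = orient[1] = orient[2] = 0.0f;
      *buttons = 0;
      return 0;                             /* 0 indicates success         */
  }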

Insert 1 shows a very simple example of an application linked with a DSO, using actual C code together with a description of the necessary steps to build and execute the application on an SGI workstation. Other hardware platforms support DSOs in a similar manner. Insert 2 shows the UDF prototypes for the virtual reality input device support in EnSight.

Future Directions

To fully support virtual reality, two additional features are needed: stereographic output and three-dimensional camera tracking. Stereographic output, already supported by EnSight, does not warrant UDFs at this time since most VR systems utilize “plug-compatible” stereo glasses from either StereoGraphics Corporation or NuVision Technologies, Inc. Three-dimensional tracking makes use of input devices such as those from Polhemus Inc. or Ascension Technology Corporation and is already supported by EnSight through UDFs. Three-dimensional camera tracking, typically used by Cave systems and various head-mounted displays, differs from other tracking only in that the input is used to control the graphics viewing transformation. Future work will look at coupling such input UDFs to the camera model used within EnSight.

As shown with EnSight and its support for virtual reality devices, user defined functions based on dynamic shared libraries provide an important mechanism. Users benefit by being able to supply or modify such routines at execution time and still obtain performance comparable to that of directly compiled-in subroutines. Application providers benefit from extensibility without having to provide either source or object code to the entire application. Finally, everyone benefits by having compelling, supported applications available for promising but nascent technologies such as virtual reality.

Further information on EnSight can be found at http://www.ceintl.com/.

Acknowledgments

CEI would like to thank the Army Research Laboratory for their interest and funding of EnSight functionality that supports VR environments.

References

  1.  I. Sutherland, “SketchPad: A Man-Machine Graphical Communication System,” AFIPS Spring Joint Computer Conference, 1963, pp. 329-346.
  2.  R. Baldwin, “VR: Friend or Foe?,” IEEE Computer Graphics and Applications, Vol. 17, No. 6, November-December 1997, p. 102.
  3.  S. Bryson, “The Virtual Windtunnel on the Virtual Workbench,” IEEE Computer Graphics and Applications, Vol. 17, No. 4, July-August 1997, p. 15.
  4.  K. Gaither, R. Moorhead, S. Nations, D. Fox, “Visualizing Ocean Circulation Models Through Virtual Environments,” IEEE Computer Graphics and Applications, Vol. 17, No. 1, January-February 1997, pp. 16-19.
  5.  V. Lehner, T. DeFanti, “Distributed Virtual Reality: Supporting Remote Collaboration in Vehicle Design,” IEEE Computer Graphics and Applications, Vol. 17, No. 2, March-April 1997, pp. 13-17.
  6.  C. Cruz-Neira, D. Sandin, T. DeFanti, “Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE,” SIGGRAPH 1993 Conference Proceedings, pp. 135-142.
  7.  W. Sherman, “Integrating Virtual Environments into the Dataflow Paradigm,” Fourth Eurographics Workshop in ViSC, Workshop Papers, April 1993.
  8.  W. Bethel, “Chemical Flooding in a Virtual Environment – A Survivor’s Guide to VR Development,” 1994 International AVS Users Group Conference Proceedings, May 1994, pp. 299-309.