PMEL's Virtual Reality Lab began experimenting with visualization as a science tool for oceanography and atmospheric science with the help of NOAA's Office of High Performance Computing and Communications. The tools we use include basic analysis software such as Matlab and IDL; visualization software and libraries such as Iris Explorer, VTK, and the CAVE library; and web-based visualization languages such as Java 3D and VRML. This poster shows our latest efforts using VRML (left) and the ImmersaDesk (right).

We have been invited to present this work at:
NOAATech2000 (Outstanding Technology Achievement Award)
SuperComputing '99
Canadian Statistical Society Meeting

Other presentations include:
AMS 1999 - Dallas
GOIN 1999 - Honolulu
AMS 2000 - Long Beach
Supercomputing '99, Nov 13-19, 1999
AGU 2000, Jan 24-28, 2000
PMEL Open House, March 16-17, 2000
Tsunami Hazard Mitigation Meeting, May 9-11, 2000

The Evolution of Visualization

Our binocular vision and the tools we've created to visualize and analyze environmental data sets have evolved along parallel lines. Early graphing tools allowed the user simply to plot variables in two dimensions. Oceanographic and atmospheric data, being inherently three-dimensional, lend themselves to the next generation of visualization applications: those that can easily render three-dimensional perspective plots, either on a computer screen or on paper. As more of these applications have added three-dimensional visualization, the evolution of these tools has come to mirror the evolution of our own binocular vision. Depth perception is the key to viewing our world, and we have developed many ways of perceiving depth, distinguishing objects from their background, and estimating their size. Our binocular vision has evolved into a highly refined instrument for determining distance and size.
Perhaps the first "depth cue" at our disposal is object recognition, in which we perceive an object as separate from its background simply because we recognize it in its entirety. The next step in the evolution of full stereo vision is motion parallax, in which we perceive an object's distance and size from relative motion. If the object is moving, we distinguish it from the background more easily. If it is not, the observer may move slightly and judge the distance to the object by how much it appears to shift relative to its background. Related to motion parallax is the concept of perspective: how an object's appearance changes when viewed from different angles. An object appears smaller when it is farther away, and the parallel edges of a box, when viewed from an angle, seem to converge in the distance. All of these depth cues are uniocular - they rely on only one eye to view three-dimensional objects. Binocular vision uses both convergence/divergence of the eyes and stereopsis (the difference between the images seen by the right and left eyes) to give the observer cues to an object's distance and size. Research has shown that our binocular vision controls the visual-motor system by which we estimate distance and size. The brain takes in recognition cues, relative-motion cues, and parallax cues, but it depends on stereopsis for accurate interactions with three-dimensional objects. This pattern of evolution is reflected in the way we view oceanographic data: our tools have evolved from simple 2-D plots, to surface plots utilizing perspective, to adding user interaction and object motion (motion parallax) as with VRML, to full stereoscopic 3-D animation as with stereo VRML and the ImmersaDesk.
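The perspective and stereopsis cues above can be illustrated with a little pinhole-camera arithmetic. The sketch below is our own illustration, not part of the original poster, and the focal length and eye-separation values are assumed round numbers: it shows that an object's projected size falls off inversely with distance (the perspective cue), and that the left/right image offset exploited by stereopsis falls off the same way.

```python
# Pinhole-camera sketch of two depth cues: perspective and stereopsis.
# FOCAL and BASELINE are illustrative values, not measured ones.

FOCAL = 0.017     # metres, roughly the focal length of a human eye
BASELINE = 0.065  # metres, a typical separation between the two eyes

def apparent_size(true_size, distance):
    """Perspective cue: projected size is inversely proportional to distance."""
    return FOCAL * true_size / distance

def disparity(distance):
    """Stereopsis cue: horizontal offset between the left- and right-eye
    images of a point straight ahead at the given distance."""
    return FOCAL * BASELINE / distance

for d in (1.0, 2.0, 4.0):
    print(f"{d:4.1f} m: 1 m object projects to {apparent_size(1.0, d) * 1000:.2f} mm, "
          f"disparity = {disparity(d) * 1000:.3f} mm")
```

Doubling the distance halves both the projected size and the disparity, which is one reason all of these cues weaken rapidly for distant objects.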