ARL-Devised Data Assessment System Receives U.S. Patent

Simulations based on high performance computing (HPC) physics are used by Army programs to evaluate the feasibility of various designs and to focus the physical experiments that need to be performed. The time and cost savings of using HPC simulations have been widely recognized by the Army. The results of these simulations are typically interpreted through scientific visualization, the process of producing images and animations that reflect the quantities being calculated.

Members of the U.S. Army Research Laboratory's (ARL) Computational and Information Sciences Directorate have devised entirely new ways to explore these complex datasets and have implemented an augmented virtuality scientific data assessment system. The focus of this work is to add aural and tactile dimensions to the data and to use physical and virtual objects together to provide additional detail and context to the simulation. This system, called the Multi-Sensory Environment for Scientific Data Assessment (MESDA), recently received a U.S. patent and is currently in use at ARL at Aberdeen Proving Ground, Md.

MESDA allows additional computed values to be represented by positional sound that varies in pitch and volume with the data. Low-frequency vibrations, which fall below the audible range, are used to represent further dimensions of the data. To give context to the simulation and to show detail that may not be present in the virtual representations, physical objects are added and "mixed" with the computer-generated images. Together, these channels allow a greater number of calculated values to be analyzed simultaneously.

For example, effective stress on a projectile can be represented as time-varying color overlaid on the actual projectile, pressure can be represented as a sound that varies in pitch, and the amplitude of a tactile vibration can vary with the energy. An observer can intuitively understand how these quantities correlate with one another.
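Conceptually, this kind of multi-sensory mapping amounts to scaling each computed quantity onto a perceptual range. The sketch below is not drawn from the patented system; it is a minimal Python illustration in which the pressure and energy ranges, the frequency bounds, and the function names are all illustrative assumptions.

```python
import numpy as np

# Illustrative linear mappings from simulation quantities to sensory
# parameters. The ranges below are placeholders, not values from MESDA.

def map_to_pitch(pressure, p_min=0.0, p_max=1.0e9, f_min=220.0, f_max=880.0):
    """Map a pressure value (Pa) onto an audible frequency range (Hz)."""
    t = np.clip((pressure - p_min) / (p_max - p_min), 0.0, 1.0)
    return f_min + t * (f_max - f_min)

def map_to_vibration(energy, e_min=0.0, e_max=1.0e6):
    """Map an energy value (J) onto a normalized tactile-vibration amplitude."""
    return np.clip((energy - e_min) / (e_max - e_min), 0.0, 1.0)

def sonify_step(pressure, energy, duration=0.1, sample_rate=44100):
    """Synthesize one time step: an audible tone whose pitch tracks pressure,
    plus a sub-audible 40 Hz carrier whose amplitude tracks energy."""
    t = np.linspace(0.0, duration, int(sample_rate * duration), endpoint=False)
    tone = 0.5 * np.sin(2.0 * np.pi * map_to_pitch(pressure) * t)
    vibration = map_to_vibration(energy) * np.sin(2.0 * np.pi * 40.0 * t)
    return tone, vibration

# Example: one simulation time step with made-up values.
tone, vibration = sonify_step(pressure=2.5e8, energy=4.0e5)
```

In such a scheme, each sensory channel carries one calculated quantity per time step, which is what lets an observer attend to color, pitch, and vibration simultaneously rather than reading them from separate plots.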

These three-dimensional images are projected onto a standard rear-projection screen as well as a unique transparent screening material. This material, known as "transfilm," preserves the polarization of the projected images. Physical objects are placed between the rear-projection screen and the transfilm to produce a "reality sandwich" of virtual, physical, and virtual objects. This technique allows virtual objects to be placed in front of physical objects, a feat impossible to achieve with standard projection.