The Need for Verifiable Visualization

R.M. Kirby, C.T. Silva
2008 IEEE Computer Graphics and Applications  
The use of simulation science as a means of scientific inquiry is increasing at a tremendous rate. The process of mathematically modeling physical phenomena, experimentally estimating key modeling parameters, numerically approximating the solution of the mathematical model, and computationally solving the resulting algorithm has inundated the scientific and engineering worlds. As increasingly more science and engineering practitioners advocate the use of computer simulation for analyzing and predicting physical and biological phenomena, the computational science and engineering community has started asking introspective questions, such as:1

- Can computer-based predictions be used as a reliable basis for making crucial decisions?
- How can you assess a computer-based prediction's accuracy or validity?
- What confidence (or error measures) can be assigned to a computer-based prediction of a complex event?

Researchers outside traditional computational engineering and science areas (traditional areas such as computational fluid dynamics [CFD] and computational solid mechanics [CSM]) are sometimes shocked to hear these questions being asked, as they often assume that such issues were settled long ago, at the inception of computing and computational modeling. A study of the computational science and engineering (CS&E) literature from the past 40 years clearly shows that these questions have not been ignored. Scientists who employ computing to solve problems have always been concerned with accuracy, reliability, and robustness. It was not until the past 10 years, however, that the CS&E community joined together in an attempt to generate a unified perspective from which to evaluate these questions. These efforts have led to what some are calling a new CS&E discipline: validation and verification, or V&V, which seeks to articulate processes by which we can obtain answers to these questions. Let us take a closer look.

Figure 1 shows the common simulation science pipeline, consisting of the physical phenomena of interest, mathematical modeling of the phenomena, simulation, and evaluation (often through a combination of postprocessing and visualization). It also identifies where validation and verification fit into this process (we will further explain these terms later in the article).
Validation and verification

Scientists frequently use visualization techniques to help them assess their simulation results. Visualization is the lens through which scientists often view modeling and discretization interactions; hence, visualization itself must be explicitly considered as part of the V&V process. Simulation researchers often create their own visualization tools, claiming that they "don't trust" visualization techniques that they themselves have not implemented. CFD researchers creating visualizations of their own data joke that they are experts in the presentation of their own brand of CFD: colorful faulty dynamics. Such a statement can only be truly understood in light of the V&V process; that process is the means by which simulation scientists gain confidence in their own algorithms and implementations, as well as in those of others within their community. To gain the simulation community's confidence, the visualization process must come under this same umbrella.

Visualization techniques have lagged behind in error and uncertainty analysis of their methodology as a component of a larger scientific pipeline. Few systematic research efforts have addressed quantifying and minimizing the visualization error budget (a concept we will discuss later in the article). Furthermore, there is a real need to examine this visualization error budget in the context of the error generated by the rest of the computational pipeline, and to understand how that error impacts visualization algorithms (note that this is distinct from the area of "error and uncertainty visualization," which is concerned with visualizing errors and uncertainties).
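To make the idea of an error budget concrete, the following is a minimal illustrative sketch (not a method from the article): a known "exact" field stands in for the physical truth, a coarse grid of samples stands in for the simulation output, and linear interpolation stands in for the reconstruction a rendering pipeline performs. Measuring the difference between what the viewer would see and the exact field gives one term of such a budget. All function names and grid sizes here are assumptions chosen for illustration.

```python
import numpy as np

# Stand-in for the physical truth: the field we would like to visualize.
def exact(x):
    return np.sin(2 * np.pi * x)

# "Simulation" output: the solver only delivers samples on a coarse grid
# (10 elements), so discretization error is already baked in.
x_sim = np.linspace(0.0, 1.0, 11)
u_sim = exact(x_sim)

# "Visualization" step: the renderer reconstructs the field between the
# samples, here by linear interpolation, as many contouring and
# rendering pipelines effectively do.
x_fine = np.linspace(0.0, 1.0, 1001)
u_vis = np.interp(x_fine, x_sim, u_sim)

# One entry in the error budget: the worst-case difference between the
# displayed reconstruction and the exact field.
vis_error = np.max(np.abs(u_vis - exact(x_fine)))
print(f"max reconstruction error seen by the viewer: {vis_error:.4f}")
```

Refining the simulation grid shrinks the discretization term, but the reconstruction choice in the visualization step contributes its own error; the point of a verifiable pipeline is that both contributions are quantified rather than conflated.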
doi:10.1109/mcg.2008.103 pmid:18753037