This skeleton forms the basis for visualizing data in a very generic and reusable way (Benger, 2008). ... Interdisciplinary approaches face various difficulties, starting with incompatibilities of the data format and data model (Benger, 2009). ...
doi:10.5772/32203
Science and Art Symposium 2000
We present visualizations of recent supercomputer simulations from numerical relativity, exploiting the progress in visualization techniques and numerical methods also from an artistic point of view. The sequences have been compiled into a video tape, showing colliding black holes, orbiting and merging neutron stars as well as collapsing gravitational waves. In this paper we give some background information and provide a glance at the presented sequences.
doi:10.1007/978-94-011-4177-2_30
The Tenth Marcel Grossmann Meeting
A new visualization technique for visualizing three-dimensional symmetric positive definite tensor fields of rank two is described. It supports studying the spatial projection of a spacetime metric. The rendering technique is demonstrated upon the Schwarzschild metric of a static black hole, the Kerr metric of a rotating black hole in two different coordinate systems, and a numerically computed dataset describing the metric of two colliding black holes.
doi:10.1142/9789812704030_0164
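For reference (the abstract does not spell it out), the spatial projection of the Schwarzschild metric rendered by such a technique is, in Schwarzschild coordinates with geometric units G = c = 1 and mass M, the standard spatial line element:

```latex
ds^2_{\mathrm{spatial}} = \left(1 - \frac{2M}{r}\right)^{-1} dr^2
                        + r^2\, d\theta^2
                        + r^2 \sin^2\!\theta \, d\varphi^2
```

At each sample point this yields a symmetric positive definite 3x3 tensor (diagonal in these coordinates), which is exactly the rank-two tensor field the visualization technique takes as input.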
Journal of WSCG
Airborne light detection and ranging (LIDAR)-based bathymetry is a highly specialized field within the widely known and used geoscientific surveying technology, based on green-spectrum lasers. Green light can penetrate shallow water bodies such that river and lake beds can be surveyed. The result of such observations are point clouds, from which geometries are extracted, such as digital elevation maps for lake, river or sea floors. The quality of those maps depends crucially on the amount of reliable information that can be extracted from the noisy LIDAR signals. The primary LIDAR data consist of amplitude response curves for each emitted laser signal. A direct visualization of these "raw", pristine data is crucial to verify, assess and optimize subsequent data processing and reduction methods. In this article we present a method for scientific visualization of these amplitude response curves en masse, i.e. for millions and tens of millions of them simultaneously. As part of this direct visualization, preliminary analysis and data reduction operations can also be performed interactively. This primary and direct inspection allows studying and evaluating the full potential of acquired data sets, so that data processing methods can be fine-tuned to squeeze out all needed information of interest. Ideally, such improved data processing avoids subsequent surveying when output from already measured data sets is improved, resulting in reduced economic and environmental costs.
doi:10.24132/jwscg.2020.28.22
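One plausible way to inspect millions of amplitude response curves "en masse" (the paper does not publish its implementation, so this is only a sketch under our own assumptions) is to accumulate all waveforms into a single 2D density image of sample index versus amplitude, which can then be color-mapped:

```python
import numpy as np

def waveform_density(waveforms, amp_bins=256, amp_range=(0.0, 1.0)):
    """Accumulate many amplitude response curves into a density image.

    waveforms: (n_pulses, n_samples) array, one row per emitted laser pulse.
    Returns an (amp_bins, n_samples) count matrix: entry [a, t] is the number
    of pulses whose amplitude at sample t falls into amplitude bin a.
    Array shapes are illustrative; real full-waveform records vary in length.
    """
    n_pulses, n_samples = waveforms.shape
    lo, hi = amp_range
    # Quantize amplitudes into integer bins, clipping outliers into range.
    q = np.clip(((waveforms - lo) / (hi - lo) * amp_bins).astype(int),
                0, amp_bins - 1)
    density = np.zeros((amp_bins, n_samples), dtype=np.int64)
    for t in range(n_samples):
        density[:, t] = np.bincount(q[:, t], minlength=amp_bins)
    return density  # render e.g. with a log-scaled color map
```

Rendering the counts instead of individual curves keeps the cost of display independent of the number of pulses, which is what makes tens of millions of curves tractable.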
Big data in observational and computational sciences impose increasing challenges on data analysis. In particular, data from light detection and ranging (LIDAR) measurements are questioning conventional CPU-based algorithms due to the sheer size and complexity needed for decent accuracy. These data describing terrains are natively given as big point clouds consisting of millions of independent coordinate locations, from which meaningful geometrical information content needs to be extracted. The method of computing the point distribution tensor is a very promising approach, yielding good results for classifying domains in a point cloud according to local neighborhood information. However, an existing KD-tree parallel approach, provided by the VISH visualization framework, may well take several days to deliver meaningful results on a real-world dataset. Here we present an optimized version based on uniform grids, implemented in OpenCL, that delivers results of equal accuracy up to 24 times faster on the same hardware. The OpenCL version is also able to benefit from a heterogeneous environment, and we analyzed and compared its performance on various CPU, GPU and accelerator hardware platforms. Finally, aware of the heterogeneous computing trend, we propose two low-complexity dynamic heuristics for scheduling independent dataset fragments in multi-device heterogeneous systems.
doi:10.1016/j.procs.2015.05.217
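The point distribution tensor mentioned above can be sketched as follows: for each point, sum the outer products of the offset vectors to its neighbors within a radius. This is a minimal NumPy sketch under our own assumptions (brute-force neighbor search; the VISH and OpenCL versions use KD-trees and uniform grids precisely to avoid this O(n²) step, and some formulations normalize by the neighbor count):

```python
import numpy as np

def point_distribution_tensors(points, radius):
    """Per-point 3x3 symmetric tensor from local neighborhood geometry.

    points: (n, 3) array of coordinates. For each point i, accumulates
    the outer products of offsets to all points within `radius`.
    """
    n = len(points)
    # Brute-force pairwise distances: fine for a sketch, not for millions
    # of points -- that is what the uniform-grid acceleration is for.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    tensors = np.zeros((n, 3, 3))
    for i in range(n):
        offsets = points[d[i] <= radius] - points[i]
        tensors[i] = offsets.T @ offsets  # symmetric positive semidefinite
    return tensors
```

The eigenvalues of each tensor then classify the local shape of the cloud: one dominant eigenvalue indicates a linear structure, two a planar one, and three comparable eigenvalues a volumetric one.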
We used the Vish Visualization Shell (Benger et al. 2007) to numerically sample (20) and (21) on a uniform grid for analyzing scalar fields (Fig. 2, Fig. 3) and a radial sampling distribution ...
arXiv:1304.6029v2
A new general-purpose technique for the visualization of time-dependent symmetric positive definite tensor fields of rank two is described. It is based on a splatting technique built from tiny transparent glyph primitives that are capable of incorporating the full orientational information content of a tensor. The result is an information-rich image from which the preferred orientations in a tensor field can be read off. It is useful for analyzing slices or volumes of a three-dimensional tensor field and can be overlaid with standard volume rendering or color mapping. The application of the rendering technique is demonstrated on numerically simulated general relativistic data and a measured diffusion tensor field of a human brain. A problem common to all visualization techniques using icons is visual clutter, which is tackled in vector field visualization by displaying low-dimensional field characteristics such as critical points and integral lines, e.g. stream lines, streak lines and path lines. Hyperstream lines [4] are stream lines of the maximum (or minimum) eigenvector "field".
doi:10.1117/12.549300 dblp:conf/vda/BengerH04
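The orientational content a glyph primitive encodes comes from the eigen-decomposition of the tensor. As a minimal sketch (our own helper, not the paper's splatting shader), the principal axes of a glyph for a symmetric positive definite 3x3 tensor can be obtained like this:

```python
import numpy as np

def glyph_axes(tensor):
    """Principal glyph axes for a symmetric positive definite 3x3 tensor.

    Returns a 3x3 matrix whose columns are the unit eigenvectors scaled by
    the square root of their eigenvalues, major axis first. sqrt scaling is
    one common convention for mapping a metric-like tensor to an ellipsoid.
    """
    vals, vecs = np.linalg.eigh(tensor)   # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]        # reorder: major axis first
    return vecs[:, order] * np.sqrt(vals[order])
```

A hyperstream line, as mentioned above, would then be integrated along the first column (the major eigenvector field), analogously to a stream line in a vector field.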
Journal of WSCG
Virtual surgery planning and pre-operative design and fabrication of plastic surgical guides and titanium plates for bone fixation have proven valuable for improving ...
dblp:journals/jwscg/StamatakisBS17
is a visualization research scientist at Louisiana State University, specializing in astrophysics and computational fluid dynamics. CHRISTOPH BEST (email@example.com) is project leader at the European Bioinformatics Institute, specializing in electron microscopy image informatics. THE BIOLOGICAL SCIENCES need a generic image format suitable for long-term storage and capable of handling very large images. Images convey profound ideas in biology, bridging across disciplines. Digital imagery began 50 years ago as an obscure technical phenomenon. Now it is an indispensable computational tool. It has produced a variety of incompatible image file formats, most of which are already obsolete. Several factors are forcing the obsolescence: rapid increases in the number of pixels per image; acceleration in the rate at which images are produced; changes in image designs to cope with new scientific instrumentation and concepts; collaborative requirements for interoperability of images collected in different labs on different instruments; and research ...
doi:10.1145/1562764.1562781 pmid:21218176 pmcid:PMC3016045
Lecture Notes in Computational Science and Engineering
Large scale simulations running in metacomputing environments face the problem of efficient file I/O. For efficiency it is desirable to write data locally, distributed across the computing environment, and then to minimize data transfer, i.e. reduce remote file access. Both aspects require I/O approaches which differ from existing paradigms. For the data output of distributed simulations, one wants to use fast local parallel I/O for all participating nodes, producing a single distributed file, while keeping changes to the simulation code as small as possible. For reading the data file, as in postprocessing and file-based visualization, one wants efficient partial access to remote and distributed files, using a global naming scheme and efficient data caching, again keeping the changes to the postprocessing code small. However, all available software solutions require the entire data to be staged locally (involving possible data recombination and conversion), or suffer from the performance problems of remote or distributed file systems. In this paper we show how to interface the HDF5 I/O library via its flexible Virtual File Driver layer to the Globus Data Grid. We show that combining these two toolkits in a suitable way provides a new I/O framework which allows efficient, secure, distributed and parallel file I/O in a metacomputing environment.
doi:10.1007/978-3-642-57313-2_1
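An HDF5 Virtual File Driver supplies low-level byte-range callbacks (open, close, read, write at offsets) that the library invokes instead of POSIX I/O; the paper plugs Globus remote access into exactly this layer. The Globus-backed driver itself is not publicly packaged, so the following is only a stand-in sketch of that callback surface over a local file, with a trivial block cache playing the role of the paper's data caching (class and method names are ours):

```python
class CachingDriver:
    """Byte-range read driver with a whole-block read cache.

    Mimics the shape of an HDF5 VFD's read callback: the consumer asks
    for (offset, size) ranges and never sees the whole file, so a remote
    backend only has to transfer the blocks actually touched.
    """
    BLOCK = 4096  # cache granularity, an assumption of this sketch

    def __init__(self, path):
        self._f = open(path, "rb")
        self._cache = {}  # block index -> bytes

    def read(self, offset, size):
        """Return `size` bytes starting at `offset`, via the block cache."""
        out = bytearray()
        end = offset + size
        for b in range(offset // self.BLOCK, (end - 1) // self.BLOCK + 1):
            if b not in self._cache:
                self._f.seek(b * self.BLOCK)      # a remote fetch in the
                self._cache[b] = self._f.read(self.BLOCK)  # Globus case
            out += self._cache[b]
        start = offset - (offset // self.BLOCK) * self.BLOCK
        return bytes(out[start:start + size])

    def close(self):
        self._f.close()
```

Repeated partial reads of nearby ranges then hit the cache instead of the backend, which is the property that makes postprocessing and file-based visualization of remote distributed files efficient.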
One such specific technique was presented in (Benger and Hege 2004) and more thoroughly described in (Benger et al 2006) by means of medical datasets. ... Visualization of tensor fields is mostly investigated in medical imaging based on magnetoresonance data acquisition methods (Benger et al 2006). ...
doi:10.1088/1367-2630/10/12/125004
We used the Vish Visualization Shell (Benger et al. 2007) to numerically sample (20) and (21) on a uniform grid for analyzing scalar fields (Figures 2 and 3) and a radial sampling distribution (Figure ...
doi:10.1186/s40668-014-0002-6
We describe a new graduate course in scientific computing that was taught during Fall 2010 at Louisiana State University. The course was designed to provide students with a broad and practical introduction to scientific computing which would give them the basic skills and experience to very quickly get involved in research projects involving modern cyberinfrastructure and complex real world scientific problems. The course, which was taken by thirteen graduate students, covered basic skills, networks and data, simulations and application frameworks, scientific visualization, and distributed scientific computing. Notable features of the course include a modularized team-teaching approach and the integration of national cyberinfrastructure with teaching.
doi:10.1016/j.procs.2011.04.210
The details of the classification scheme and implementation through grouping into five major hierarchy levels are described elsewhere [Benger 2004]. ... Future investigation will go into applying techniques suitable for derived quantities, such as the Green-Cauchy tensor of the wind vector field, drawing upon experience with visualizing tensor fields [Benger ...
doi:10.1145/1174429.1174465 dblp:conf/graphite/VenkataramanBLJR06
doi:10.1145/3095140.3097284 dblp:conf/cgi/BreuilsNFHBS17
Showing results 1–15 out of 165 results.