A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2005; you can also visit <a rel="external noopener" href="http://grids.ucs.indiana.edu/ptliupages/publications/gempapermarch00.pdf">the original URL</a>. The file type is <code>application/pdf</code>.
<i title="American Geophysical Union">Geocomplexity and the Physics of Earthquakes</i>
Computer simulations will be key to substantial gains in understanding the earthquake process. Emerging information technologies make possible a major change in the way computers are used and data are accessed. An outline of a realizable computational infrastructure includes standardization of data accessibility, harnessing of high-performance computing algorithms, and packaging of simulation elements as distributed objects across wide networks. These advances promise to dramatically reduce the duration and cost of doing earthquake science as they transform the fragmentary nature of the field into one of integration and community. <span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1029/gm120p0219">doi:10.1029/gm120p0219</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/ezuf2iy7ezb4roaaauplcefnze">fatcat:ezuf2iy7ezb4roaaauplcefnze</a> </span>

1.2: Computational Overview

There is substantial international interest in the use of large-scale computation in the earthquake field, including an activity in Japan, where major computational resources are being deployed, and an effort among several Asia-Pacific nations including the USA (the so-called APEC initiative). Here we will focus on an American activity known as GEM, for its goal of producing a "General Earthquake Model" (http://www.npac.syr.edu/projects/gem, http://milhouse.jp-l.nasa.gov/gem).

There are currently no uniformly reliable approaches to earthquake forecasting. The field uses phenomenological approaches, which attempt to forecast individual events, or, more reliably, statistical analyses giving probabilistic predictions. The development of these methods has been complicated by the fact that the large events responsible for the greatest damage repeat at irregular intervals of hundreds to thousands of years, so the limited historical record has frustrated phenomenological studies. Direct numerical simulation has not been extensively pursued, owing to the complexity of the problem and the (presumed) sensitivity of the occurrence of large events to the detailed makeup of the Earth's constituents, to myriad initial conditions, and to the micro-scale physics that determines the underlying friction laws. This field differs from many other physical sciences, such as climate and weather, in that it has so far made little use of parallel computing and is only now starting its own "Grand Challenges". It is thus not known how important large-scale simulations will be in earthquake science. Nevertheless, it is essentially certain that they can provide a numerical laboratory of semi-
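The statistical analyses mentioned above yield probabilistic predictions for events that recur at irregular intervals of hundreds to thousands of years. A minimal sketch of one common model of this kind, a Poisson occurrence model (the model choice and the recurrence and window values here are illustrative assumptions, not taken from the text):

```python
import math

def poisson_event_probability(mean_recurrence_years, window_years):
    """Probability of at least one event in the exposure window,
    assuming a Poisson process whose rate is the reciprocal of the
    mean recurrence interval."""
    rate = 1.0 / mean_recurrence_years
    return 1.0 - math.exp(-rate * window_years)

# Illustrative values: a 500-year mean recurrence, 50-year window.
p = poisson_event_probability(500, 50)
print(f"P(at least one event in 50 yr) = {p:.3f}")  # → 0.095
```

A Poisson model deliberately ignores the time elapsed since the last event; time-dependent renewal models refine this, but the memoryless version shows why a short historical record is such a handicap: the estimate rests entirely on a mean interval that few catalogs sample more than once or twice.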
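The abstract's notion of packaging simulation elements as distributed objects across wide networks can be sketched minimally; the `FaultSegmentModel` class, its method, and the use of Python's standard XML-RPC server are illustrative assumptions here, not GEM's actual middleware:

```python
# Hedged sketch: a simulation element exposed as a network-callable object.
# All names and the RPC mechanism are hypothetical stand-ins.
from xmlrpc.server import SimpleXMLRPCServer

class FaultSegmentModel:
    """Toy stand-in for one simulation element in a distributed model."""

    def slip_deficit(self, loading_rate_mm_yr, years_since_last_event):
        # Slip deficit accumulated under steady tectonic loading (mm).
        return loading_rate_mm_yr * years_since_last_event

def serve(model, host="localhost", port=8000):
    """Register the element's methods so remote components can call them."""
    server = SimpleXMLRPCServer((host, port), allow_none=True)
    server.register_instance(model)
    server.serve_forever()  # blocks; run in its own process in practice

# Local check of the element's computation (no server started here):
print(FaultSegmentModel().slip_deficit(35, 100))  # → 3500
```

The design point the abstract gestures at is separation of concerns: each element publishes a narrow interface, so data providers, solvers, and analysis tools can interoperate across wide networks without sharing an address space.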