146 Hits in 0.79 sec

Datenkompetenz – Data Literacy

Thomas Ludwig, Hannes Thiemann
2020 Informatik-Spektrum  
Abstract: Our era of rapidly growing data volumes in all areas requires the development of data literacy as a key competence for the 21st century. Knowledge of data collection, data management, data evaluation, and data application forms the basis for a competent handling of data in science, business, and society. Today, extensive volumes of data are embedded in value chains in all areas of life, which need to be shaped and evaluated. In the field of research data in particular, this is also supported institutionally in order to generate new knowledge and new insights from data.
doi:10.1007/s00287-020-01320-0 fatcat:n7gaybm3svfkdbfigdmpicqusi

FAIR Practices in Europe

Peter Wittenburg, Michael Lautenschlager, Hannes Thiemann, Carsten Baldauf, Paul Trilsbeek
2019 Data Intelligence  
Institutions driving fundamental research at the cutting edge, such as those of the Max Planck Society (MPS), took steps to optimize data management and stewardship in order to address new scientific questions. In this paper we selected three institutes of the MPS, from the humanities, the environmental sciences and the natural sciences, as examples to indicate the efforts to integrate large amounts of data from collaborators worldwide and to create a data space that is ready to be used to gain new insights based on data-intensive science methods. For this integration, the typical challenges of fragmentation, poor quality and also social differences had to be overcome. In all three cases, the core pillars were well-managed repositories driven by scientific needs, together with harmonization principles agreed upon in the community. It is not surprising that these principles are very much aligned with what have now become the FAIR principles. The FAIR principles confirm the correctness of earlier decisions, and their clear formulation identified the gaps which the projects need to address.
doi:10.1162/dint_a_00048 fatcat:gykh3mqvuvhqpfveb3nvjrii74

Canonical Workflows in Simulation-based Climate Sciences

Ivonne Anders, Karsten Peters-von Gehlen, Hannes Thiemann
2022 Data Intelligence  
doi:10.1162/dint_a_00127 fatcat:xx7jtn5nyjdjrotllen3qevqbm

Long-term archival and global dissemination of climate data at DKRZ

Hannes Thiemann, Karsten Peters, Stephan Kindermann
2019 Zenodo  
Advances in the field of climate science rely on the global findability and low-barrier retrievability of very large datasets containing either observational data or numerical model output. Both aspects have to be assured for long-term preserved and newly created datasets alike. DKRZ provides services tailored to the needs of the climate science community that address these issues. On the one hand, DKRZ hosts the World Data Center for Climate (WDCC), an externally certified long-term archive for climate data. Data preservation in the WDCC supports the FAIR data principles, data depositors obtain dedicated support during the archiving process, and DOI assignment to preserved datasets is also possible. Data access is open for the majority of datasets preserved in the WDCC. On the other hand, as one of the core members of the ESGF (Earth System Grid Federation), DKRZ provides user-oriented services facilitating the efficient publication, global dissemination and analysis of heavily used, large-volume climate data. Publication in the ESGF thus lends itself to datasets that are expected to spark great interest in the global climate science community. DKRZ therefore provides data and repository services essential for the day-to-day work of the global climate science community.
doi:10.5281/zenodo.3554138 fatcat:bhwdecutxzh75py67izbk3ir2m

NFDI4Earth – addressing the digital needs of Earth System Sciences - A + B

Peter Braesicke, Hannes Thiemann, Jörg Seegert, Lars Bernard
2022 Zenodo  
Presentation at GeoKarlsruhe, September 2021: Hannes Thiemann (DKRZ) on behalf of Lars Bernard and Jörg Seegert (TU Dresden). Earth System Sciences (ESS) cover the geosphere, atmosphere, biosphere, hydrosphere, cryosphere, anthroposphere and all their interactions, from local processes to global challenges.
doi:10.5281/zenodo.6325314 fatcat:aa5qx3qmobbftk524pxqfolqzi

Standing out in the data crowd: EASYDAB (Earth System Data Branding)

Anette Ganske, Amandine Kaiser, Angelika Heil, Angelina Kraft, Andrea Lammert, Hannes Thiemann
2022 Zenodo  
doi:10.5281/zenodo.6255772 fatcat:3j333s73jree3d2lwlg2xjvpvu

NFDI4Earth: current state and goals for the future

Roland Bertelmann, Hildegard Gödde, Gunnar Lischeid, Sören Lorenz, Adrian Krolczyk, Karsten Peters, Martin Schultz, Hannes Thiemann, Gauvin Wiemer
2019 Zenodo  
The national research data infrastructure (NFDI) in Germany [1] aims to systematically manage scientific and research data, provide long-term data storage, backup and accessibility, and network the data both nationally and internationally. Within this framework, NFDI4Earth [2], an emerging consortium with many participating institutions, strives to serve researchers within Earth system research. The strategy of NFDI4Earth is to identify the demands for digital change in the German Earth System Science community in a structured community consultation process, to establish a set of common principles, rules and standards, to establish experimental prototype platforms operating on distributed resources, and to provide tools and mechanisms for data integration and analysis. NFDI4Earth is aware that the objectives of the NFDI, and in particular of NFDI4Earth, can only be achieved on a consortium basis anchored in the broader scientific community. For this reason, NFDI4Earth aims to create awareness right from the beginning and to initiate collaborations beyond Germany's borders as well. This presentation introduces the current state of consolidation of NFDI4Earth and its goals to the scientific community.
doi:10.5281/zenodo.3549613 fatcat:abr7djwxcjdf7bkvv7yzbdpezm

Kindly bent to free us

Gabriel Radanne, Hannes Saffrich, Peter Thiemann
2020 Proceedings of the ACM on Programming Languages (PACMPL)  
doi:10.1145/3408985 fatcat:fb32mo42fzc5thggvtxd4sjkoi

Kindly Bent to Free Us [article]

Gabriel Radanne, Hannes Saffrich, Peter Thiemann
2020 arXiv   pre-print
Systems programming often requires the manipulation of resources like file handles, network connections, or dynamically allocated memory. Programmers need to follow certain protocols to handle these resources correctly. Violating these protocols causes bugs ranging from type mismatches over data races to use-after-free errors and memory leaks. These bugs often lead to security vulnerabilities. While statically typed programming languages guarantee type soundness and memory safety by design, most of them do not address issues arising from improper handling of resources. An important step towards handling resources is the adoption of linear and affine types that enforce single-threaded resource usage. However, the few languages supporting such types require heavy type annotations. We present Affe, an extension of ML that manages linearity and affinity properties using kinds and constrained types. In addition, Affe supports the exclusive and shared borrowing of affine resources, inspired by features of Rust. Moreover, Affe retains the defining features of the ML family: it is an impure, strict, functional expression language with complete principal type inference and type abstraction. Affe does not require any linearity annotations in expressions and supports common functional programming idioms.
arXiv:1908.09681v4 fatcat:vhnbmc6vmbhtlaoafyrvghr3ja
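The resource-handling discipline this abstract describes can be illustrated in Rust, whose ownership model realizes a closely related affine discipline with shared and exclusive borrows. This is a minimal sketch, not Affe code; the `Connection` type and its functions are invented stand-ins for a real resource such as a socket or file handle.

```rust
// Sketch of affine resource handling with shared vs. exclusive borrowing.
// `Connection` is a hypothetical resource used only for illustration.

struct Connection {
    id: u32,
    open: bool,
}

// Exclusive (mutable) borrow: at most one writer at a time.
fn send(conn: &mut Connection, _msg: &str) {
    assert!(conn.open, "protocol violation: send on closed connection");
}

// Shared borrow: any number of simultaneous readers.
fn status(conn: &Connection) -> u32 {
    conn.id
}

// Consuming use: the connection is moved in and dropped afterwards,
// so the type checker rejects any later use of the handle.
fn close(mut conn: Connection) {
    conn.open = false;
}

fn demo() -> u32 {
    let mut conn = Connection { id: 7, open: true };
    send(&mut conn, "hello"); // exclusive borrow, released afterwards
    let id = status(&conn);   // shared borrow
    close(conn);              // ownership transferred: conn is consumed
    // send(&mut conn, "again"); // rejected at compile time: use after move
    id
}

fn main() {
    println!("last id seen: {}", demo());
}
```

The contrast the abstract draws is that Rust requires the explicit `&` / `&mut` borrow markers shown here, whereas Affe infers the linearity and affinity constraints without annotations in expressions.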

Towards increasing the reusability of atmospheric model data: adapting metadata standards and introducing quality criteria

Karsten Peters, Daniel Neumann, Hannes Thiemann
2020 Zenodo  
We present current and future work at the German Climate Computing Center (DKRZ) aimed at increasing the long-term reusability of atmospheric and climate model data. Specifically, the recently funded project AtMoDat (project start: June 2019, duration: 3 years) will be the focus of the presentation. DKRZ plays a central role for the German and international Earth System Science (ESS) community by hosting the IPCC (Intergovernmental Panel on Climate Change) Data Distribution Center (IPCC DDC), the Reference Data Archive for global climate model output (World Data Center for Climate, WDCC), as well as a Tier 1 data node of the ESGF (Earth System Grid Federation, an infrastructure enabling the global dissemination of large-volume climate model data). Further, DKRZ offers a variety of data management services covering all aspects of the research data life cycle. In particular, DKRZ services play a vital role in the long-term curation and reusability of numerical model output gathered in the context of large, internationally coordinated coupled model intercomparison projects (CMIPs). In order for CMIP to work, data and metadata standards as well as long-term data curation mechanisms have been thoroughly defined and are continuously adapted to allow for effective data sharing, reuse and preservation of the scientific record. Once a new CMIP standard has been established by the international community in an iterative process, the post-processing of numerical model output is often very cumbersome because i) the process behind the definition of the standard appears opaque, ii) the standard is elaborate and requires meticulous attention to detail, e.g. in the creation of metadata, and iii) CMIP compliance of model output is often not taken into account from the beginning of data production. Once the model output data fulfill the standard, their further processing and usage is simplified. This high degree of standardisation and long-term reusability of climate model data is desirable, but the application of the CMIP-sta [...]
doi:10.5281/zenodo.3667634 fatcat:cwp74awypbdbbaq5i5ag5c2cie

Recommendations for discipline-specific FAIRness evaluation derived from applying an ensemble of evaluation tools

Karsten Peters-von Gehlen, Heinke Höck, Andrej Fast, Daniel Heydebreck, Andrea Lammert, Hannes Thiemann
2022 Zenodo  
In earlier work, a self-assessment of the WDCC along the FAIR principles (Peters, Höck & Thiemann, 2020) indicated a high level of FAIRness (0.9 of 1). Finally, we build on this earlier in-house work to evaluate WDCC's FAIRness and perform a self-assessment using the metric collection presented in Bahim et al. (2020).
doi:10.5281/zenodo.5879457 fatcat:5m45x7udcnbltk553kmlb7wrya

Fitness for Use of Data Objects Described with Quality Maturity Matrix at Different Phases of Data Production

Heinke Höck, Frank Toussaint, Hannes Thiemann
2020 Data Science Journal  
Fitness-for-use information should be stored to enable easy identification of data objects that are suitable for re-use, a feature which can only be assessed by the data user. With the described Quality Maturity Matrix (QMM), we want to provide a metric for a discrete measurement of the fitness for use of data objects. We use data maturity to describe the degree of formalization and standardization of the data with respect to the quality of data and metadata. The data objects mature as they pass through the different post-production steps, where they undergo different curation measures. The higher the maturity and the level in the QMM, the easier it is for the user to judge the appropriateness of the data for a possible re-use. For our development of the Quality Maturity Matrix we link the maturity levels to the five phases: concept, production/processing, project collaboration/intended use, long-term archiving, and impact/re-use. Each of the five levels is measured with regard to the four criteria of consistency, completeness, accessibility, and accuracy. For the description we use the terms of the Open Archival Information System (OAIS). We relate our data-focused QMM to some existing maturity matrices which put the focus on the maturity of the curation process rather than on the data objects themselves. In addition, we make an attempt to establish a connection between the QMM criteria of data assessment and the FAIR Data principles.
doi:10.5334/dsj-2020-045 fatcat:cm5cbfvxnfbo3d6uxsszb74tgi

FAIR@HPC – Improving HPC usage in ESS by FAIR data and compute services

Stephan Frickenhaus, Bernadette Fritzsch, Stephan Hachinger, Angelika Humbert, Nikolay V. Koldunov, Ralph Müller-Pfefferkorn, Johannes Munke, Noah Trumpik, Hannes Thiemann
2022 Zenodo  
There is a continuously increasing demand for using High-Performance Computing (HPC) infrastructure for solving geoscientific questions in different domains. Fostering the "FAIRness" (referring to the FAIR principles of Wilkinson et al. 2016: Findable, Accessible, Interoperable, Reusable) of the data generated in HPC calculations is therefore a necessity; however, the sheer size of such data poses a big challenge. In particular, approaches to FAIR data and computing which do not require the complete movement of extremely big data sets have to be developed. Here, the Interest Group (IG) "High-Performance Computing in Earth System Sciences" within NFDI4Earth (the German National Research Data Infrastructure consortium for Earth System Sciences) presents a concept/working paper, which will receive versioned updates as needed. This is intended as an encouragement to both researchers and infrastructure providers to engage in joint discussions on the further design of work processes, future service offerings and their interaction against the background of the FAIR principles for HPC data. The NFDI4Earth IG High-Performance Computing in Earth System Sciences will further promote these discussions in NFDI4Earth and bring them into the framework of the entire NFDI.
doi:10.5281/zenodo.6565404 fatcat:r7rytjzbhbe5fmbfuhhnioenqa

Recommendations for Discipline-Specific FAIRness Evaluation Derived from Applying an Ensemble of Evaluation Tools

Karsten Peters-von Gehlen, Heinke Höck, Andrej Fast, Daniel Heydebreck, Andrea Lammert, Hannes Thiemann
2022 Data Science Journal  
In earlier work, a self-assessment of the WDCC along the FAIR principles (Peters, Höck & Thiemann, 2020, and references therein) indicated a high level of FAIRness (0.9 of 1).
doi:10.5334/dsj-2022-007 fatcat:5xutdzyy5fg7bp4azwa4mk37z4

Semantic oriented data access and storage at MPIM/DKRZ

Michael Lautenschlager, Hannes Thiemann
2001 Mass Storage Systems and Technologies(MSS), Proceedings of the NASA Goddard Conference on  
doi:10.1109/mss.2001.10013 fatcat:b3goyae3ebfn3fhgecpnq5tclu
Showing results 1 — 15 out of 146 results