
Representing Interoperable Provenance Descriptions for ETL Workflows [chapter]

André Freitas, Benedikt Kämpgen, João Gabriel Oliveira, Seán O'Riain, Edward Curry
2015 Lecture Notes in Computer Science  
(OPM) standard and focusing on the provision of an interoperable provenance model for Web-based ETL environments. ... This paper proposes the convergence of these two Web data management concerns, introducing a principled provenance model for ETL processes in the form of a vocabulary based on the Open Provenance Model ... OPM provides a basic description of provenance which allows interoperability on the level of workflow structure. ...
doi:10.1007/978-3-662-46641-4_4 fatcat:o7d5txmqa5ai7iiohmxwezr7qm

Implementation-independent Knowledge Graph Construction Workflows using FnO Composition

Gertjan De Mulder, Ben De Meester
2022 Extended Semantic Web Conference  
The semantic descriptions allow for interoperable workflows, the alignment with P-PLAN and PROV-O allows for reproducibility, and the mapping to concrete implementations allows for automatic execution. ... We describe how a data flow workflow can be described in an interoperable (i.e., independent from the underlying technology stack) and reproducible (i.e., with detailed provenance) way by composing semantic abstract ... Language (CWL), and Workflow Description Language (WDL); and descriptive specifications such as P-PLAN and the Open Provenance Model for Workflows (OPMW). ...
dblp:conf/esws/MulderM22 fatcat:psmnyeegpnh5tb5z45pwji5rem
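The entry above aligns workflow descriptions with P-PLAN and PROV-O so that provenance becomes interoperable and queryable. As an illustration only, the following sketch uses Python's rdflib to record one hypothetical ETL step as a PROV-O description; the `ex:` resource names are invented for the example and are not taken from the paper.

```python
# Minimal PROV-O provenance record for one hypothetical ETL step, using rdflib.
# Only the prov: terms are standard; the ex: resources are illustrative.
from rdflib import Graph, Namespace, RDF, Literal
from rdflib.namespace import XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/etl/")

g = Graph()
g.bind("prov", PROV)
g.bind("ex", EX)

# The ETL step is a prov:Activity; its input and output are prov:Entity resources.
g.add((EX.cleanStep, RDF.type, PROV.Activity))
g.add((EX.rawCsv, RDF.type, PROV.Entity))
g.add((EX.cleanedTable, RDF.type, PROV.Entity))

# Usage, generation, and derivation links capture the workflow structure.
g.add((EX.cleanStep, PROV.used, EX.rawCsv))
g.add((EX.cleanedTable, PROV.wasGeneratedBy, EX.cleanStep))
g.add((EX.cleanedTable, PROV.wasDerivedFrom, EX.rawCsv))
g.add((EX.cleanStep, PROV.endedAtTime,
       Literal("2022-01-01T12:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```

Because the description is plain RDF, the same graph can later be queried with SPARQL or merged with provenance produced by other tools, which is the interoperability point the entry makes.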

Capturing Interactive Data Transformation Operations Using Provenance Workflows [chapter]

Tope Omitola, André Freitas, Edward Curry, Seán O'Riain, Nicholas Gibbins, Nigel Shadbolt
2015 Lecture Notes in Computer Science  
The proposed model showed a high level of coverage against a set of requirements used for evaluating systems that provide provenance management solutions. ... The ready availability of data is leading to increased opportunities for its re-use in new applications and analyses. ... This work targets interoperable provenance representations of IDT (interactive data transformation) and ETL workflows using a three-layered approach to represent provenance. ...
doi:10.1007/978-3-662-46641-4_3 fatcat:f5zzbssxzbbjrhwugelc7rxqju

Implementing interoperable provenance in biomedical research

V. Curcin, S. Miles, R. Danger, Y. Chen, R. Bache, A. Taweel
2014 Future Generation Computer Systems
Provenance traces of these software systems need to be integrated in a consistent and meaningful manner, but since these systems rarely share a common platform, the provenance interoperability between ... projects: Translational Research and Patient Safety in Europe (TRANSFoRm) and Electronic Health Records for Clinical Research (EHR4CR). ... The provenance data shown was created during their user tests. We would also like to thank the reviewers of an earlier version of this paper for helpful comments and constructive suggestions. ...
doi:10.1016/j.future.2013.12.001 fatcat:tnl3l5s7tnhdlfcfejgvec5gfe

A semantic-based workflow for biomedical literature annotation

Pedro Sernadela, José Luís Oliveira
2017 Database: The Journal of Biological Databases and Curation  
A semantic-based workflow for biomedical literature annotation. ... Taking up this challenge, we present a modular architecture for textual information integration using semantic web features and services. ... To represent the processed data, our architecture model is based on the Annotation Ontology (AO) (30), an open representation model for representing interoperable annotations in RDF (Resource Description ...
doi:10.1093/database/bax088 pmid:29220478 pmcid:PMC5691355 fatcat:tbt65j3vwfbefhhhpdx6lvapu4

EHDEN - D4.5 - Roadmap for interoperability solutions

Kees Van Bochove, Emma Vos, Maxim Moinat, Sebastiaan Van Sandijk, Tess Korthout, Peyman Mohtashani
2020 Zenodo  
The EHDEN project has explicitly chosen to standardize healthcare data for analysis using the OMOP Common Data Model, but other interoperability solutions, such as i2b2, CDISC, FHIR, openEHR, GA4GH, etc. ... This document addresses the relevance of those standards for the data (re)use goals of EHDEN, and points out how OMOP and OHDSI relate to them. ... Would it be possible to improve data provenance tracking and documentation during an ETL process by making use of the FHIR Provenance resource and operations? ...
doi:10.5281/zenodo.4474373 fatcat:3zw4vbrqyvhitc4pgqc2n2wiou
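The question raised above, about tracking ETL provenance with the FHIR Provenance resource, can be made concrete with a small sketch. The field names below follow FHIR R4; the referenced resource IDs and the pipeline name are hypothetical and only illustrate the shape of such a record.

```python
# Illustrative only: a minimal FHIR R4 Provenance resource describing one ETL load.
# The referenced IDs (Observation/123, DocumentReference/src-1) are hypothetical.
import json

provenance = {
    "resourceType": "Provenance",
    # What the ETL step produced in the target system.
    "target": [{"reference": "Observation/123"}],
    # When the record was written.
    "recorded": "2020-06-01T10:15:00+00:00",
    # Who or what performed the transformation (here, a named ETL pipeline).
    "agent": [{
        "who": {"display": "omop-to-fhir ETL pipeline"}
    }],
    # Where the data came from ("source" is one of the FHIR provenance entity roles).
    "entity": [{
        "role": "source",
        "what": {"reference": "DocumentReference/src-1"}
    }]
}

# In practice this payload would be POSTed to a FHIR server's Provenance endpoint;
# here we just print the JSON for inspection.
print(json.dumps(provenance, indent=2))
```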

Provenance Solutions for Medical Research in Heterogeneous IT-Infrastructure: An Implementation Roadmap

Marcel Parciak, Christian Bauer, Theresa Bender, Robert Lodahl, Björn Schreiweis, Erik Tute, Ulrich Sax
2019 Studies in Health Technology and Informatics  
We propose an implementation roadmap to extract, store, and utilize provenance records in order to make provenance available to data analysts, research subjects, privacy officers, and machines (machine  ...  Each aspect is tackled separately, resulting in the implementation of a provenance toolbox.  ...  Workflow and information system provenance refer to more specific types of machine-usable data, for example, a definition and execution of a workflow defined in a workflow management system.  ... 
doi:10.3233/shti190231 pmid:31437933 fatcat:mhuuxuqk4zefrco5miboqyc7cq

Piveau: A Large-scale Open Data Management Platform based on Semantic Web Technologies [article]

Fabian Kirstein, Kyriakos Stefanidis, Benjamin Dittwald, Simon Dutkowski, Sebastian Urbanek, Manfred Hauswirth
2020 arXiv pre-print
However, no existing solution for managing Open Data takes full advantage of these possibilities and benefits. ... We give a detailed description of the underlying, highly scalable, service-oriented architecture, how we integrated the aforementioned standards, and used a triplestore as our primary database. ... The PPL has been proven to be a fitting middle ground between ETL approaches and workflow engines. ...
arXiv:2005.02614v1 fatcat:sx4bz6pyu5grxifsvnphnvbafi

Towards FAIR protocols and workflows: the OpenPREDICT use case

Remzi Celebi, Joao Rebelo Moreira, Ahmed A. Hassan, Sandeep Ayyar, Lars Ridder, Tobias Kuhn, Michel Dumontier
2020 PeerJ Computer Science  
It is essential for the advancement of science that researchers share, reuse and reproduce each other's workflows and protocols. ... This includes FAIRification of the involved datasets, as well as applying semantic technologies to represent and store data about the detailed versions of the general protocol, of the concrete workflow ... the Common Workflow Language (CWL) (https://www.commonwl.org/) and the Workflow Description Language (WDL) (https://openwdl.org/), which have become de facto standards for syntactic interoperability ...
doi:10.7717/peerj-cs.281 pmid:33816932 pmcid:PMC7924452 doaj:e3339c7f439d4843822785f5308945c9 fatcat:pxn7sw7mfreevex4thf3pnus4a
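The entry above points to CWL as one of the de facto standards for syntactically interoperable workflow descriptions. As a minimal sketch, assuming cwltool (the CWL reference runner) is installed, the following Python writes a trivial CWL tool description and validates and runs it; the echo tool itself is hypothetical and just stands in for a real workflow step.

```python
# Illustrative only: write a trivial CWL CommandLineTool and run it with cwltool.
import subprocess
import textwrap

cwl_tool = textwrap.dedent("""\
    cwlVersion: v1.2
    class: CommandLineTool
    baseCommand: echo
    inputs:
      message:
        type: string
        inputBinding:
          position: 1
    outputs: []
    """)

with open("echo.cwl", "w") as f:
    f.write(cwl_tool)

# --validate checks the description; the second call actually executes it.
subprocess.run(["cwltool", "--validate", "echo.cwl"], check=True)
subprocess.run(["cwltool", "echo.cwl", "--message", "hello provenance"], check=True)
```

Because the tool description is declarative and engine-independent, the same file could be executed by any CWL-conformant runner, which is what "syntactic interoperability" refers to here.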

Personalized Biomedical Data Integration [chapter]

Xiaoming Wang, Olufunmilayo Olopade, Ian Foster
2011 Biomedical Engineering, Trends in Electronics, Communications and Software  
The majority of clinical data sources are neither programmatically accessible (syntactic interoperability) nor supplied with metadata describing the source data (semantic interoperability). ... The fourth challenge is due to low data source interoperability. ... Some are inherently brittle, often without a clear measure of success for an ETL workflow in the real world. ...
doi:10.5772/13017 fatcat:2f5aywboufcfveou6qbf3y2oam

Realising Data-Centric Scientific Workflows with Provenance-Capturing on Data Lakes

Hendrik Nolte, Philipp Wieder
2022 Data Intelligence  
Although the necessity for a logical and a physical organisation of data lakes in order to meet those requirements is widely recognized, no concrete guidelines have yet been provided. ... Building on this idea, several additional requirements have been discussed in the literature to improve the general usability of the concept, such as a central metadata catalog including all provenance information ... Aggregating them into a single wrapper entity represents the exact state of a workflow. ...
doi:10.1162/dint_a_00141 fatcat:t2w5qaoobfh6rnnofzr65lng4a

Approaches and Criteria for Provenance in Biomedical Data Sets and Workflows: a Scoping Review Protocol (Preprint)

Kerstin Gierend, Frank Krüger, Dagmar Waltemath, Maximilian Fünfgeld, Atinkut Alamirrew Zeleke, Thomas Ganslandt
2021 JMIR Research Protocols  
This encompasses defining comprehensive concepts and standards for transparency and traceability, reproducibility, validity, and quality assurance during clinical and scientific data workflows and research  ...  This protocol outlines plans for a scoping review that will provide information about current approaches, challenges, or knowledge gaps with provenance tracking in biomedical sciences.  ...  Informatics in Research and Care in University Medicine), by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) SFB 1270/1-99150580, and by the National Research Data Infrastructure for  ... 
doi:10.2196/31750 pmid:34813494 pmcid:PMC8663663 fatcat:ntt4bv6szrfdzclp6wfktp2p6e

Leveraging Knowledge Representation to Maintain Immunization Clinical Decision Support

Janos L Mathe, Scott D Nelson, Stuart T Weinberg, Christoph U Lehmann, Andras Nadas, Asli O Weitkamp
2018 AMIA Annual Symposium Proceedings  
We use the Centers for Disease Control and Prevention (CDC) as a source for immunization knowledge for our Clinical Information Systems (CIS).  ...  Immunizations are one of the most cost-effective interventions for preventing morbidity and mortality.  ...  Figure 1 represents the initial partial order for the immunization domain via the arrows across the ETL jobs of the "(ETL.2) D2D Converter".  ... 
pmid:30815121 pmcid:PMC6371352 fatcat:6bzgltbl7rcypjguruszja4f5u

Piveau: A Large-Scale Open Data Management Platform Based on Semantic Web Technologies [chapter]

Fabian Kirstein, Kyriakos Stefanidis, Benjamin Dittwald, Simon Dutkowski, Sebastian Urbanek, Manfred Hauswirth
2020 Lecture Notes in Computer Science  
However, no existing solution for managing Open Data takes full advantage of these possibilities and benefits.  ...  We give a detailed description of the underlying, highly scalable, service-oriented architecture, how we integrated the aforementioned standards, and used a triplestore as our primary database.  ...  The PPL has been proven to be a fitting middle ground between ETL approaches and workflow engines.  ... 
doi:10.1007/978-3-030-49461-2_38 fatcat:pvi5zp7yenfx5js7soy4ynagom

D5.1 Interoperability Implementation Regulations

Carol Goble, Chris Evelo, Helen Parkinson, ELIXIR Interoperability Community
2018 Zenodo  
The main aim of this deliverable is to present our progress on the implementation of our interoperability framework for EXCELERATE Use Cases WP6-9 and the ELIXIR Data Platform (WP3). ... During the execution of EXCELERATE we have significantly refined and adapted this vision to offer a flexible and practical open framework for developing and deploying interoperability across the WPs, working ... Workflows are increasingly used to describe and link reusable tools, capture reproducible multi-step processes and represent the step-by-step know-how and provenance of processes. ...
doi:10.5281/zenodo.1452414 fatcat:j6bod5nlinadhl623uez7x46m4
Showing results 1–15 of 388.