
Grids and Clouds: Making Workflow Applications Work in Heterogeneous Distributed Environments

Ewa Deelman
2009 The International Journal of High Performance Computing Applications  
Scientific workflows are frequently being used to model complex phenomena, to analyze instrumental data, to tie together information from distributed sources, and to pursue other scientific endeavors.  ...  Out of necessity, these complex applications need to execute in distributed environments and make use of a number of heterogeneous resources.  ...  At the ISI, she is leading the Pegasus project, which designs and implements workflow mapping techniques for large-scale workflows running in distributed environments.  ... 
doi:10.1177/1094342009356432

Asterism: Pegasus and Dispel4py Hybrid Workflows for Data-Intensive Science

Rosa Filgueira, Rafael Ferreira da Silva, Amrey Krause, Ewa Deelman, Malcolm Atkinson
2016 Seventh International Workshop on Data-Intensive Computing in the Clouds (DataCloud)  
Keywords: data-intensive science, scientific workflows, stream-based systems, deployment and reusability of execution environments  ...  Experimental results show how Asterism successfully and efficiently exploits combinations of diverse computational platforms, whereas DIaaS delivers specialized software to execute data-intensive applications  ...  We thank the NSF Chameleon Cloud for providing time grants to access their resources.  ... 
doi:10.1109/datacloud.2016.004 dblp:conf/sc/FilgueiraSKDA16

Pegasus, a workflow management system for science automation

Ewa Deelman, Karan Vahi, Gideon Juve, Mats Rynge, Scott Callaghan, Philip J. Maechling, Rajiv Mayani, Weiwei Chen, Rafael Ferreira da Silva, Miron Livny, Kent Wenger
2015 Future Generation Computer Systems  
This paper describes the design, development and evolution of the Pegasus Workflow Management System, which maps abstract workflow descriptions onto distributed computing infrastructures.  ...  This paper provides an integrated view of the Pegasus system, showing its capabilities that have been developed over time in response to application needs and to the evolution of the scientific computing  ...  Pegasus has been in development since 2001 and has benefited greatly from the expertise and efforts of people who worked on it over the years.  ... 
doi:10.1016/j.future.2014.10.008
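At its core, the abstract-to-executable mapping described in this paper turns a declarative task DAG into an ordered execution plan. A minimal sketch of that idea in plain Python (illustrative only: the task names and the `run` placeholder are assumptions, not the Pegasus API):

```python
from graphlib import TopologicalSorter

# Hypothetical abstract workflow: task -> set of predecessor tasks.
# A real Pegasus workflow is declared through its own workflow API.
dag = {
    "preprocess": set(),
    "analyze_a": {"preprocess"},
    "analyze_b": {"preprocess"},
    "merge": {"analyze_a", "analyze_b"},
}

def run(task):
    # Placeholder for submitting the task to a compute resource.
    print(f"running {task}")

# Execute tasks in an order that respects every dependency edge.
order = list(TopologicalSorter(dag).static_order())
for task in order:
    run(task)
```

A workflow management system layers resource selection, data staging, and fault tolerance on top of this basic topological traversal.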

Optimization Patterns for the Decentralized Orchestration of Parameter-Sweep Workflows

Selim Kalayci, S. Masoud Sadjadi
2014 International Conference on Cloud and Autonomic Computing  
In our previous studies, we have designed and developed techniques to orchestrate the execution of large-scale workflows in a decentralized and adaptive manner.  ...  Due to their computation- and data-intensive nature, resources that span multiple domains may be needed for the timely and efficient execution of these workflows.  ...  During the mapping of workflow tasks onto physical resources, the main goal is to achieve the minimum possible makespan for the execution of the whole workflow.  ... 
doi:10.1109/iccac.2014.28 dblp:conf/iccac/KalayciS14
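The makespan goal mentioned in the abstract has a simple lower bound: with unbounded resources, a workflow cannot finish faster than its critical path. A small sketch for a parameter-sweep-shaped DAG (the task names and durations are made-up numbers, not data from the paper):

```python
from functools import lru_cache

# Hypothetical parameter-sweep DAG: each sweep task depends on "setup",
# and "collect" depends on every sweep task.
deps = {
    "setup": [],
    "sweep_1": ["setup"],
    "sweep_2": ["setup"],
    "sweep_3": ["setup"],
    "collect": ["sweep_1", "sweep_2", "sweep_3"],
}
duration = {"setup": 2, "sweep_1": 5, "sweep_2": 7, "sweep_3": 4, "collect": 1}

@lru_cache(maxsize=None)
def finish(task):
    # Earliest finish time = longest predecessor chain + own duration.
    return duration[task] + max((finish(p) for p in deps[task]), default=0)

# With unbounded resources, the makespan is the latest finish time,
# i.e. the length of the critical path (setup -> sweep_2 -> collect).
makespan = max(finish(t) for t in deps)
print(makespan)  # 2 + 7 + 1 = 10
```

Real schedulers work against this bound under resource limits; decentralized orchestration additionally avoids routing every scheduling decision through a single engine.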

A characterization of workflow management systems for extreme-scale applications

Rafael Ferreira da Silva, Rosa Filgueira, Ilia Pietri, Ming Jiang, Rizos Sakellariou, Ewa Deelman
2017 Future Generation Computer Systems  
By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution  ...  The paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.  ...  Pegasus maps an application onto available resources, optimizing the execution in terms of performance and data management.  ... 
doi:10.1016/j.future.2017.02.026

Scientific Workflows: Business as Usual? [chapter]

Bertram Ludäscher, Mathias Weske, Timothy McPhillips, Shawn Bowers
2009 Lecture Notes in Computer Science  
Business workflow management and business process modeling are mature research areas, whose roots go far back to the early days of office automation systems.  ...  Scientific workflow management, on the other hand, is a much more recent phenomenon, triggered by (i) a shift towards data-intensive and computational methods in the natural sciences, and (ii) the resulting  ...  Work supported through NSF grants IIS-0630033, OCI-0722079, IIS-0612326, DBI-0533368, ATM-0619139, and DOE grant DE-FC02-01ER25486.  ... 
doi:10.1007/978-3-642-03848-8_4

Auspice: Automatic Service Planning in Cloud/Grid Environments [chapter]

David Chiu, Gagan Agrawal
2011 Grids, Clouds and Virtualization  
The development of such a system, which abstracts the complex task of scientific workflow planning and execution from the user, is reported herein.  ...  Understanding how to design, manage, and execute such data-intensive workflows has become increasingly esoteric, confined to a few scientific experts in the field.  ...  Pegasus [52] creates workflow plans in the form of Condor DAGMan files, which are then executed by the DAGMan scheduler.  ... 
doi:10.1007/978-0-85729-049-6_5
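The Condor DAGMan input files referenced in the abstract are plain-text descriptions: each `JOB` line names a node and its submit file, and `PARENT ... CHILD ...` lines declare dependency edges. An illustrative fragment (the node and file names here are hypothetical, not taken from the paper):

```
# diamond.dag -- hypothetical DAGMan input file
JOB  preprocess  preprocess.sub
JOB  analyze     analyze.sub
JOB  merge       merge.sub
PARENT preprocess CHILD analyze
PARENT analyze CHILD merge
```

DAGMan reads this file and submits each job to HTCondor only after all of its parents have completed, which is what makes it a convenient execution backend for workflow planners.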

The Fourth Paradigm – Data-Intensive Scientific Discovery [chapter]

Tony Hey
2012 Communications in Computer and Information Science  
...very helpful discussion on a draft of this material.  ...  Thanks to all the contributors to this book for sharing their visions within the Fourth Paradigm. And, of course, we thank Jim Gray, for inspiring us. (Acknowledgments)  ...  Sedna is one of the few to use the industry-standard Business Process Execution Language (BPEL) for scientific workflows [10].  ... 
doi:10.1007/978-3-642-33299-9_1