9,262 Hits in 6.9 sec

Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

Zhenlong Li, Chaowei Yang, Baoxuan Jin, Manzhu Yu, Kai Liu, Min Sun, Matthew Zhan, Moncho Gomez-Gesteira
2015 PLoS ONE  
A service-oriented workflow architecture is built to support on-demand, complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework.  ...  In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA).  ...  Cloud-based Workflow Execution Environment: scientific workflows normally require a collection of software components such as tools, libraries, programs, models, and applications, and these components are  ... 
doi:10.1371/journal.pone.0116781 pmid:25742012 pmcid:PMC4351198 fatcat:qwlkh75jabcplpjs24emnfixyi
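The MapReduce technique this framework leverages can be illustrated with a minimal, library-free sketch of the map, shuffle, and reduce phases. This is a generic word-count illustration of the pattern, not code from the paper; all function names are invented for the sketch.

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, value) pairs: here, one (word, 1) per word in each record.
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle(pairs):
    # Group values by key, as a MapReduce runtime does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values: here, a per-word sum.
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["cloud data", "cloud workflow"])))
# counts == {"cloud": 2, "data": 1, "workflow": 1}
```

In a real deployment, the map and reduce phases run on separate cloud nodes and the shuffle moves data across the network; the logical contract is the same.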

COMP Superscalar, an interoperable programming framework

Rosa M. Badia, Javier Conejero, Carlos Diaz, Jorge Ejarque, Daniele Lezzi, Francesc Lordan, Cristian Ramon-Cortes, Raul Sirvent
2015 SoftwareX  
COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++, and Python scripts.  ...  For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel  ...  Related work includes (a) programming models for compute-intensive workloads, (b) workflow managers, and (c) programming models for big data.  ... 
doi:10.1016/j.softx.2015.10.004 fatcat:wdihysbvvzgythxojl4uu7fbg4
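The COMPSs idea of writing sequential-looking code whose designated functions run as asynchronous parallel tasks can be approximated with Python's standard library. This is an illustrative analogue of the model, not the actual COMPSs/PyCOMPSs API:

```python
from concurrent.futures import ThreadPoolExecutor

# The user writes an ordinary function; the runtime is responsible for
# executing marked calls asynchronously, as COMPSs does for annotated tasks.
def process(x):
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    # Each submitted call returns immediately with a future (an async task).
    futures = [pool.submit(process, x) for x in range(8)]
    # Synchronization happens only when a result is actually needed.
    results = [f.result() for f in futures]

# results == [0, 1, 4, 9, 16, 25, 36, 49]
```

The key design point mirrored here is that the source code stays sequential in appearance; asynchrony and data-dependency tracking are delegated to the runtime.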

Review on the Cloud Computing Programming Model

Chao Shen, Weiqin Tong
2014 International Journal of Advanced Science and Technology  
A cloud computing programming model urgently needs to be designed to help users use cloud resources without concern for implementation details.  ...  This paper first analyzes the problems that a cloud computing programming model needs to solve, and then analyzes the characteristics of such a model.  ...  Acknowledgements: this work is supported by the Innovation Action Plan of the Science and Technology Commission of Shanghai Municipality (No. 11511500200).  ... 
doi:10.14257/ijast.2014.70.02 fatcat:etcuk75oczgabewmzo62sokii4

Special Issue on Science-Driven Cloud Computing

Ivona Brandic, Ioan Raicu
2011 Scientific Programming  
The first trend is optimization of Cloud resources based on the available programming models for HPC (such as workflows), where the control mechanisms for application execution are external to the Cloud  ...  The second trend is research issues in applying currently available public Clouds to HPC, resulting in adaptation of data structures and programming models, e.g., data format adaptation, performance  ... 
doi:10.1155/2011/362493 fatcat:lknlfqvmevc3xovxzjd5jep6kq

Cloud Computing: Goals, Issues, SOA, Integrated Technologies and Future-scope

V. Madhu Viswanatham
2016 AMBIENT SCIENCE  
Cloud Computing comprises a service-oriented architecture rather than an application-oriented architecture. Distributed computing has laid the foundation for this advancement in the cloud [11].  ...  The hardware-aware WFS deals with the multiple-core awareness models of WFS, whereas the system-intensive models concentrate on the object, data, communication, and the ability to run multiple workflows. The  ... 
doi:10.21276/ambi.2016.03.2.rv01 fatcat:2rzkvzztyffyrg5rs3cfjktse4

Asterism: Pegasus and Dispel4py Hybrid Workflows for Data-Intensive Science

Rosa Filgueira, Rafael Ferreira da Silva, Amrey Krause, Ewa Deelman, Malcolm Atkinson
2016 2016 Seventh International Workshop on Data-Intensive Computing in the Clouds (DataCloud)  
We also present the Data-Intensive workflows as a Service (DIaaS) model, which enables easy data-intensive workflow composition and deployment on clouds using containers.  ...  The feasibility of Asterism and the DIaaS model has been evaluated using a real domain application on the NSF-Chameleon cloud.  ...  We thank the NSF Chameleon Cloud for providing time grants to access their resources.  ... 
doi:10.1109/datacloud.2016.004 dblp:conf/sc/FilgueiraSKDA16 fatcat:efpt66w6tnho3p7a3j6eoqejva

Scientific workflows management and scheduling in cloud computing: Taxonomy, prospects, and challenges

Zulfiqar Ahmad, Ali Imran Jehangiri, Mohammed Alaa Ala'anzy, Mohamed Othman, Rohaya Latip, Sardar Khaliq uz Zaman, Arif Iqbal Umar
2021 IEEE Access  
Scientific applications, by contrast, are data-oriented, compute-intensive applications structured into scientific workflows.  ...  The services provided by cloud computing are broadly used for business and scientific applications. Business applications are task-oriented applications structured into business workflows.  ...  ACKNOWLEDGMENT: the authors would like to thank Universiti Putra Malaysia and the Ministry of Education Malaysia for the financial support and facilities provided for the execution, completion, and publication  ... 
doi:10.1109/access.2021.3070785 fatcat:nbrkndnhzra57lpbhzpyeht3wq

pipsCloud: High performance cloud computing for remote sensing big data management and processing

Lizhe Wang, Yan Ma, Jining Yan, Victor Chang, Albert Y. Zomaya
2018 Future Generation Computer Systems  
In general, RS data processing for regional environmental and disaster monitoring is recognized as a typical application that is both compute-intensive and data-intensive.  ...  Benefiting from the ubiquity, elasticity, and high level of transparency of the cloud computing model, the massive RS data management and processing for dynamic environmental monitoring are all encapsulated  ...  Efficient and productive programming for these systems will be a challenge, especially in the context of data-intensive RS data processing applications.  ... 
doi:10.1016/j.future.2016.06.009 fatcat:gnkwaeggozcenkbuw7vdbmhe7i

A view of programming scalable data analysis: from clouds to exascale

Domenico Talia
2019 Journal of Cloud Computing: Advances, Systems and Applications  
Scalability is a key feature for big data analysis and machine learning frameworks and for applications that need to analyze very large and real-time data available from data repositories, social media  ...  Scalable big data analysis today can be achieved by parallel implementations that are able to exploit the computing and storage facilities of high performance computing (HPC) systems and clouds, whereas  ...  Availability of data and materials Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.  ... 
doi:10.1186/s13677-019-0127-x fatcat:l5mimqzwibh7fn4fedlsz4jkji

Towards Reliable, Performant Workflows for Streaming-Applications on Cloud Platforms

Daniel Zinn, Quinn Hart, Timothy McPhillips, Bertram Ludäscher, Yogesh Simmhan, Michail Giakkoupis, Viktor K. Prasanna
2011 2011 11th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing  
Yet, the lack of integrated support for data models, including streaming data, structured collections and files, is limiting the ability of workflows to support emerging applications in energy informatics  ...  Scientific workflows are commonplace in eScience applications.  ...  The authors would like to thank the Los Angeles Department of Water and Power (LDWP) for discussions on the Smart Grid domain challenges.  ... 
doi:10.1109/ccgrid.2011.74 dblp:conf/ccgrid/ZinnHMLSGP11 fatcat:bfeylzpl4zb6xf7dry37rfsvwy

Decentralized orchestration of data-centric workflows in Cloud environments

Bahman Javadi, Martin Tomko, Richard O. Sinnott
2013 Future Generation Computer Systems  
In this paper, we propose a flexible and lightweight workflow framework based on the Object Modeling Systems (OMS).  ...  Although there are a plethora of orchestration frameworks to implement workflows, most of them are unsuitable for executing (enacting) data-centric workflows since they are based on a centralized orchestration  ...  Acknowledgments We would like to thank the AURIN architecture group for their support. The AURIN project is funded through the Australian Education Investment Fund SuperScience initiative.  ... 
doi:10.1016/j.future.2013.01.008 fatcat:tpmgbwshajc6dfbhl7h73qlc5i

Distributed Software Development Tools for Distributed Scientific Applications [chapter]

Vaidas Giedrimas, Leonidas Sakalauskas, Anatoly Petrenko
2017 Recent Progress in Parallel and Distributed Computing  
This chapter provides a new methodology and two tools for user-driven, Wikinomics-oriented scientific application development.  ...  A service-oriented architecture is used for such applications, where the entire research-supporting computing or simulation process is broken down into a set of loosely coupled stages in the form of interoperating  ...  The power of Wiki technologies, software services, and clouds will ensure the ability to collaborate interactively on software development using the terms of a particular domain.  ... 
doi:10.5772/intechopen.68334 fatcat:37iacnfqyngaxlcrz3cxzlx4fy

A Survey of Data-Intensive Scientific Workflow Management

Ji Liu, Esther Pacitti, Patrick Valduriez, Marta Mattoso
2015 Journal of Grid Computing  
A data-intensive scientific workflow is useful for modeling such a process.  ...  Finally, we identify research issues for improving the execution of data-intensive scientific workflows in a multisite cloud.  ...  In the astronomy domain, Montage 1 is a compute- and data-intensive application that can be modeled as a scientific workflow, initially defined for the Pegasus SWfMS.  ... 
doi:10.1007/s10723-015-9329-8 fatcat:5urst5aphjftbli3pukmnbutri

Challenges of Translating HPC Codes to Workflows for Heterogeneous and Dynamic Environments

Fayssal Benkhaldoun, Christophe Cerin, Imad Kissami, Walid Saad
2017 2017 International Conference on High Performance Computing & Simulation (HPCS)  
In this paper we share our experience of transforming a parallel code for a Computational Fluid Dynamics (CFD) problem into a parallel version for the RedisDG workflow engine.  ...  This paper argues that new models for High Performance Computing are possible, provided we reconsider the potential of new paradigms such as cloud and edge computing.  ...  These two papers are more related to cloud computing and data-intensive scientific workflows, placing an emphasis on data management.  ... 
doi:10.1109/hpcs.2017.130 dblp:conf/ieeehpcs/BenkhaldounCKS17 fatcat:cxr7q7s46jg4zbf4etyky6llue

A Framework for Distributed Data-Parallel Execution in the Kepler Scientific Workflow System

Jianwu Wang, Daniel Crawl, Ilkay Altintas
2012 Procedia Computer Science  
Distributed Data-Parallel (DDP) patterns such as MapReduce have become increasingly popular as solutions to facilitate data-intensive applications, resulting in a number of systems supporting DDP workflows  ...  We describe a set of DDP actors based on DDP patterns and directors for DDP workflow executions within the presented framework.  ...  facilitate data-intensive applications in scientific workflow systems.  ... 
doi:10.1016/j.procs.2012.04.178 fatcat:3cwhymxw2zbjlgdmub5t6w7vya
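A Distributed Data-Parallel (DDP) pattern such as the MapReduce pair described in this entry operates over explicit data partitions, each of which a real engine would dispatch to a separate worker. The sketch below illustrates that partition-level contract; `ddp_map` and `ddp_reduce` are hypothetical names for illustration, not Kepler's actual actor API:

```python
from functools import reduce

def ddp_map(func, partitions):
    # Apply the user's map function independently to each data partition;
    # in a real DDP engine each partition runs on a separate worker.
    return [[func(item) for item in part] for part in partitions]

def ddp_reduce(func, partitions, initial):
    # Fold all partition results into a single value.
    return reduce(func, (item for part in partitions for item in part), initial)

partitions = [[1, 2], [3, 4]]          # data already split by the engine
squared = ddp_map(lambda x: x * x, partitions)
total = ddp_reduce(lambda a, b: a + b, squared, 0)
# total == 30
```

Separating the pattern (map over partitions, then reduce) from the execution engine is what lets such frameworks retarget the same workflow to different DDP backends.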