
Big Data Mining Using Public Distributed Computing

Albertas Jurgelevičius, Leonidas Sakalauskas
2018 Information Technology and Control  
challenges as big data mining.  ...  Public distributed computing is a type of distributed computing in which so-called volunteers provide computing resources to projects.  ...  The focus of our further research will be to tackle public distributed computing reliability and data security issues.  ... 
doi:10.5755/j01.itc.47.2.19738 fatcat:3t42ava6izat3bcialy6rfhbre

The Optimization Strategies on Clarification of the Misconceptions of Big Data Processing in Dynamic and Opportunistic Environments

Wei Li, Maolin Tang
2021 Big Data and Cognitive Computing  
This paper identifies four common misconceptions about the scalability of volunteer computing on big data problems.  ...  Finding an optimal use of volunteers is possible for the given big data problems, even under the dynamics and opportunism of volunteers.  ...  Data Availability Statement: Not applicable. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/bdcc5030038 fatcat:mqej6pasrvcrheiipnirvzzmba

Big Data Processing by Volunteer Computing Supported by Intelligent Agents [chapter]

Jerzy Balicki, Waldemar Korłub, Jacek Paluszak
2015 Lecture Notes in Computer Science  
In this paper, volunteer computing systems have been proposed for big data processing. Moreover, intelligent agents have been developed to improve the efficiency of a grid middleware layer.  ...  Finally, two agents based on genetic programming as well as harmony search have been applied to optimize big data processing.  ...  This research is supported by the Department of Computer Architecture, Faculty of Electronics, Telecommunications and Informatics, Gdańsk University of Technology under a statutory activity grant.  ... 
doi:10.1007/978-3-319-19941-2_26 fatcat:ui2yx3ualfb4hbwlehy7skolr4
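The snippet above only names the two metaheuristics used by the agents. As a rough, hedged illustration of what a harmony-search agent could optimize in a middleware layer, the sketch below minimizes a made-up scheduling cost over a small parameter vector; the function name, objective, and parameter ranges are hypothetical and not taken from the paper.

```python
import random

def harmony_search(objective, dim, bounds, hm_size=10, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=2000):
    """Generic harmony search minimizer (illustrative, not the paper's agent)."""
    lo, hi = bounds
    # Harmony memory: a pool of candidate parameter vectors.
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hm_size)]
    scores = [objective(h) for h in memory]

    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                  # reuse a remembered value
                value = random.choice(memory)[d]
                if random.random() < par:               # small pitch adjustment
                    value += random.uniform(-1, 1) * bandwidth * (hi - lo)
            else:                                       # otherwise pick at random
                value = random.uniform(lo, hi)
            new.append(min(max(value, lo), hi))

        new_score = objective(new)
        worst = max(range(hm_size), key=lambda i: scores[i])
        if new_score < scores[worst]:                   # replace the worst harmony
            memory[worst], scores[worst] = new, new_score

    best = min(range(hm_size), key=lambda i: scores[i])
    return memory[best], scores[best]

# Hypothetical use: tune two scheduling parameters to minimize a simulated makespan.
if __name__ == "__main__":
    simulated_makespan = lambda x: (x[0] - 0.7) ** 2 + (x[1] - 0.3) ** 2
    params, cost = harmony_search(simulated_makespan, dim=2, bounds=(0.0, 1.0))
    print(params, cost)
```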

The Optimization Potential of Volunteer Computing for Compute or Data Intensive Applications

Wei Li (School of Engineering & Technology, Central Queensland University, Australia), William W. Guo
2019 Journal of Communications  
Second, virtual tasks are composed to apply a certain compute- or data-intensity on the running MapReduce.  ...  The poor scalability of Volunteer Computing (VC) hinders its application because a tremendous number of volunteers are needed in order to achieve the same performance as that of a traditional  ...  Big data is so coined because it cannot be processed by a single commodity computer in a reasonable amount of time. Consequently, an HPC is necessary for big data processing.  ... 
doi:10.12720/jcm.14.10.971-979 fatcat:hcisqwu7kjaotklbu7lbb7xybq
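The abstract mentions composing virtual tasks that impose a chosen compute- or data-intensity on a running MapReduce job. Below is a minimal sketch of that idea, assuming a synthetic map task whose CPU work and emitted bytes are tunable per record; the function name and parameters (compute_intensity, data_intensity) are illustrative, not taken from the paper.

```python
import hashlib
import os
import time

def virtual_map_task(records, compute_intensity=1000, data_intensity=4096):
    """Synthetic map task: per input record, burn `compute_intensity` hash
    rounds (CPU load) and emit `data_intensity` bytes (I/O load).
    Illustrative only; the paper's actual virtual-task composition may differ."""
    emitted = []
    for rec in records:
        digest = rec.encode()
        for _ in range(compute_intensity):          # compute-intensive part
            digest = hashlib.sha256(digest).digest()
        payload = os.urandom(data_intensity)        # data-intensive part
        emitted.append((rec, digest[:8], len(payload)))
    return emitted

if __name__ == "__main__":
    start = time.time()
    out = virtual_map_task([f"key{i}" for i in range(100)],
                           compute_intensity=5000, data_intensity=65536)
    print(f"{len(out)} records in {time.time() - start:.2f}s")
```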

Implementing a Volunteer Notification System into a Scalable, Analytical Realtime Data Processing Environment [chapter]

Jesko Elsner, Tomas Sivicki, Philipp Meisen, Tobias Meisen, Sabina Jeschke
2016 Automation, Communication and Cybernetics in Science and Engineering 2015/2016  
Within the given context, this work furthermore gives insight into state-of-the-art proprietary solutions for Big Data processing that are currently available.  ...  This work concentrates on leveraging open-source Big Data technologies with the aim of delivering a robust, secure and highly available enterprise-class Big Data platform.  ...  ACKNOWLEDGMENT This paper is based on work done in the INTERREG IVa project EMuRgency (www.emurgency.eu).  ... 
doi:10.1007/978-3-319-42620-4_64 fatcat:pd2nu6qe5ffb3m76hoctaimfse

A note on new trends in data-aware scheduling and resource provisioning in modern HPC systems

Jie Tao, Joanna Kolodziej, Rajiv Ranjan, Prem Prakash Jayaraman, Rajkumar Buyya
2015 Future Generation Computer Systems  
(RS) big data, challenges, current techniques, and existing works for processing big data.  ...  The paper ''Cloud-aware data intensive workflow scheduling on volunteer computing systems'' [6] proposes a partitioning and data-centric approach to schedule workflows on both volunteer and Cloud resources  ... 
doi:10.1016/j.future.2015.04.016 fatcat:z2254f74bjdhtgoxus44ogqkyq
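The cited workflow-scheduling paper is described as partitioning and data-centric. One plausible reading, sketched here under that assumption, is to place each workflow task on the resource that already holds the largest share of its input data, so the least data has to move between volunteer and Cloud resources; the function name, data structures, and sizes are invented for illustration.

```python
def place_task(task_inputs_mb, replicas):
    """Toy data-centric placement (an assumed reading of the cited approach).

    task_inputs_mb: {dataset: size_mb} needed by the task.
    replicas:       {dataset: set(resources that already hold a copy)}."""
    resources = {r for sites in replicas.values() for r in sites}

    def local_mb(resource):
        # How much of the task's input is already local to this resource.
        return sum(size for ds, size in task_inputs_mb.items()
                   if resource in replicas.get(ds, set()))

    return max(resources, key=local_mb)

inputs = {"chunk-a": 512, "chunk-b": 128}
replicas = {"chunk-a": {"cloud-vm-1"}, "chunk-b": {"volunteer-7", "cloud-vm-1"}}
print(place_task(inputs, replicas))   # -> cloud-vm-1 (holds 640 MB locally)
```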

Crisis Analytics: Big Data Driven Crisis Response [article]

Junaid Qadir, Anwaar Ali, Raihan ur Rasool, Andrej Zwitter, Arjuna Sathiaseelan, Jon Crowcroft
2016 arXiv pre-print
With the advances in technology (in terms of computing, communications, and the ability to process and analyze big data), our ability to respond to disasters is at an inflection point.  ...  This article introduces the history and the future of big crisis data analytics, along with a discussion on its promise, challenges, and pitfalls.  ...  This highlights the dangers of siloed big crisis data analytics on the fragmented data.  ... 
arXiv:1602.07813v1 fatcat:zkhoipkhyzfbzphjdrho4m3eq4

Data Processing Model to Perform Big Data Analytics in Hybrid Infrastructures

Julio C. S. Anjos, Kassiano J. Matteussi, Paulo R. R. De Souza, Gabriel J. A. Grabher, Guilherme A. Borges, Jorge L. V. Barbosa, Gabriel V. Gonzalez, Valderi R. Q. Leithardt, Claudio F. R. Geyer
2020 IEEE Access  
Finally, the data processed in each Big Data engine need to be integrated as if they came from a single computation.  ...  Following this, sensors send data, which are preprocessed for Big Data processing in these environments.  ... 
doi:10.1109/access.2020.3023344 fatcat:dmqifexpivhnld75k3uwbmyaui
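The snippet notes that results produced by each Big Data engine must be integrated as if a single computation had produced them. Below is a minimal sketch of such an integration step, assuming the engines emit commutative partial aggregates; the engine names, counts, and function name are hypothetical, not the authors' model.

```python
from collections import Counter

def integrate_partials(partial_counts):
    """Toy integration step (assumed, not the authors' exact model): partial
    aggregates computed independently by each Big Data engine are merged
    into one result, as if a single computation had produced them."""
    total = Counter()
    for engine, counts in partial_counts.items():
        total.update(counts)          # commutative merge, so engine order is irrelevant
    return dict(total)

partials = {
    "cloud-spark":      {"sensor_ok": 9120, "sensor_fault": 37},
    "volunteer-hadoop": {"sensor_ok": 4411, "sensor_fault": 12},
}
print(integrate_partials(partials))
```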

Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response

Ferda Ofli, Patrick Meier, Muhammad Imran, Carlos Castillo, Devis Tuia, Nicolas Rey, Julien Briant, Pauline Millet, Friedrich Reinhard, Matthew Parkan, Stéphane Joost
2016 Big Data  
Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge.  ...  The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster  ...  (For a detailed survey on different techniques used to process social media data, please see Imran et al. [18].) Crowdsourcing alone is insufficient to make sense of this user-generated source of big data  ... 
doi:10.1089/big.2014.0064 pmid:27441584 fatcat:mrk6ixmpyfhz3d4x7isvq7os6e
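The article proposes a hybrid crowdsourcing and real-time machine learning pipeline for aerial imagery. As a toy sketch of one way such a hybrid could be wired, the code below trusts a binary damage classifier when it is confident and otherwise falls back to a majority vote over crowd labels; the function name, threshold, and labels are assumptions, not the authors' design.

```python
from collections import Counter

def label_image(model_prob, crowd_votes=None, confidence_threshold=0.9):
    """Toy hybrid pipeline (assumed workflow, not the authors' exact system):
    trust the classifier when it is confident, otherwise resolve the image by
    majority vote over crowd labels."""
    predicted_damage = model_prob >= 0.5
    if max(model_prob, 1 - model_prob) >= confidence_threshold:
        return predicted_damage, "model"
    if crowd_votes:                                  # e.g. ["damage", "no_damage", ...]
        majority, _ = Counter(crowd_votes).most_common(1)[0]
        return majority == "damage", "crowd"
    return predicted_damage, "model-low-confidence"  # no crowd answer yet

# Example: an uncertain model prediction is resolved by three crowd votes.
print(label_image(0.55, ["damage", "damage", "no_damage"]))
```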

Enabling Strategies for Big Data Analytics in Hybrid Infrastructures

Julio C. S. Anjos, Kassiano J. Matteussi, Paulo R. R. De Souza, Alexandre da Silva Veith, Gilles Fedak, Jorge Luis Victoria Barbosa, Claudio R. Geyer
2018 2018 International Conference on High Performance Computing & Simulation (HPCS)  
This work proposes the use of hybrid infrastructures such as Cloud and Volunteer Computing for Big Data processing and analysis.  ...  Although the Cloud computing scenario has grown rapidly in recent years, it still suffers from a lack of standardization around resource management for Big Data applications, such  ... 
doi:10.1109/hpcs.2018.00140 dblp:conf/ieeehpcs/AnjosMSVFBG18 fatcat:dycbqxcqdbfgtcl7jllj4wzeyi
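This entry proposes combining Cloud and Volunteer Computing for Big Data workloads. As a hedged sketch of one possible placement policy in such a hybrid, the code below replicates small, deadline-tolerant tasks across volunteers and keeps large or latency-sensitive work on Cloud nodes; the Task fields, thresholds, and replication factor are invented for illustration and are not the paper's scheduler.

```python
import random
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    input_mb: int
    deadline_s: int

def dispatch(task, volunteer_pool, cloud_pool,
             max_volunteer_input_mb=256, replication=2):
    """Toy placement policy for a hybrid Cloud + Volunteer setup (illustrative
    assumption): small, deadline-tolerant tasks are replicated across
    volunteers to mask churn; the rest go to reliable Cloud nodes."""
    if task.input_mb <= max_volunteer_input_mb and task.deadline_s > 3600:
        return [(random.choice(volunteer_pool), task) for _ in range(replication)]
    return [(random.choice(cloud_pool), task)]

volunteers = ["vol-01", "vol-02", "vol-03"]
cloud = ["vm-a", "vm-b"]
print(dispatch(Task("wordcount-chunk-17", input_mb=64, deadline_s=86400), volunteers, cloud))
print(dispatch(Task("stream-join", input_mb=2048, deadline_s=300), volunteers, cloud))
```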

ATLAS@Home: Harnessing Volunteer Computing for HEP

C Adam-Bourdarios, D Cameron, A Filipčič, E Lancon, W Wu
2015 Journal of Physics: Conference Series  
So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing  ...  Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams.  ...  The first big volunteer computing project was SETI@Home [1], where a program installed on volunteers' computers searched for evidence of extra-terrestrial life in radio signals from telescopes.  ... 
doi:10.1088/1742-6596/664/2/022009 fatcat:5vly2j55hzhdhjzu7v4wecjeqa

A Dynamic Task Allocation Algorithm Based on Weighted Velocity

2017 Computer Engineering and Intelligent Systems  
Volunteer computing is a way for people around the world to provide free computer resources and to participate in scientific calculation or data analysis over the Internet.  ...  To make full use of idle computer resources, a dynamic task allocation algorithm (TAA) based on weighted velocity was proposed in this work.  ...  Experimental data explanation: HCEP is an experimental project run on a volunteer computing platform. It cracks passwords by using a dictionary (Zhang & Meng, 2009).  ... 
doi:10.7176/ceis/8-8-1 fatcat:q2lzuhu445dbzgkzcsk32bof4m
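The abstract names a dynamic task allocation algorithm based on weighted velocity but does not define it in this snippet. Under the assumption that each volunteer's recent speed is weighted by a reliability factor and tasks are handed out proportionally, a minimal sketch would be as follows; the function name and the velocity/reliability fields are hypothetical.

```python
def allocate_tasks(num_tasks, volunteers):
    """Illustrative interpretation of weighted-velocity allocation (the exact
    weighting in the paper may differ): each volunteer gets a share of tasks
    proportional to its recent speed scaled by a reliability weight."""
    weighted = {vid: v["velocity"] * v["reliability"] for vid, v in volunteers.items()}
    total = sum(weighted.values())
    shares = {vid: int(num_tasks * w / total) for vid, w in weighted.items()}
    # Hand out any remainder to the fastest weighted volunteers first.
    remainder = num_tasks - sum(shares.values())
    for vid in sorted(weighted, key=weighted.get, reverse=True)[:remainder]:
        shares[vid] += 1
    return shares

volunteers = {
    "v1": {"velocity": 120.0, "reliability": 0.95},   # work units per hour, uptime ratio
    "v2": {"velocity": 80.0,  "reliability": 0.60},
    "v3": {"velocity": 40.0,  "reliability": 0.99},
}
print(allocate_tasks(100, volunteers))
```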

Message from the Program Chairs

2019 2019 IEEE 43rd Annual Computer Software and Applications Conference (COMPSAC)  
Extracting meaning and knowledge from big data is crucial for governments and businesses to support their decision-making processes.  ...  Artificial Intelligence (AI) makes it possible for machines to learn and accomplish tasks by processing large amounts of data. Advances in big data technologies trigger the evolution of AI.  ...  Sheikh Iqbal Ahamed has worked tirelessly and diligently for many months to support our efforts and to keep all our activities on track and on schedule as Standing Committee Vice Chair.  ... 
doi:10.1109/compsac.2019.10172 fatcat:x35mmpnd7nekzgayheylbcnogi

Citizen science, computing, and conservation: How can "Crowd AI" change the way we tackle large-scale ecological challenges?

Meredith S. Palmer, Sarah E. Huebner, Marco Willi, Lucy Fortson, Craig Packer
2021 Human Computation  
Systematic camera trap surveys generate 'Big Data' across broad spatial and temporal scales, providing valuable information on environmental and anthropogenic factors affecting vulnerable wildlife populations  ...  Using Crowd AI to quickly and accurately 'unlock' ecological Big Data for use in science and conservation is revolutionizing the way we take on critical environmental issues in the Anthropocene era.  ...  We acknowledge the Minnesota Supercomputing Institute (www.msi.umn.org) for providing resources contributing to data storage and processing.  ... 
doi:10.15346/hc.v8i2.123 fatcat:feeakprednhepnhrvq46j33h4y