On the core affinity and file upload performance of Hadoop

Joong-Yeon Cho, Hyun-Wook Jin, Min Lee, Karsten Schwan
Proceedings of the 2013 International Workshop on Data-Intensive Scalable Computing Systems (DISCS-2013)
The MapReduce programming model was introduced for big-data processing, where data nodes perform both data storage and computation. Thus, we need to understand the different resource requirements of data storage and computation tasks and schedule them efficiently over multi-core processors. Core affinity defines the mapping between a given task and a set of cores. Core affinity can be decided based on the resource requirements of a task, because this mapping largely affects the efficiency of computation, memory, and I/O resource utilization. In this paper, we analyze the impact of core affinity on the file upload performance of the Hadoop Distributed File System (HDFS). Our study can provide insight into process scheduling issues on big-data processing systems. We also suggest a framework for dynamic core affinity based on our observations and show that a preliminary implementation can improve throughput by more than 40% compared with the default Linux system.
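The abstract describes core affinity as mapping a task onto a set of cores. As a minimal sketch of what such a mapping looks like on Linux, the example below pins the calling process to two cores via the standard sched_setaffinity(2) call; this is only an illustration of the underlying OS mechanism, not the dynamic core-affinity framework proposed in the paper.

```c
/*
 * Illustrative sketch: restrict the calling process to cores 0 and 1.
 * Uses the standard Linux sched_setaffinity(2) interface; the paper's
 * framework for HDFS would choose the core set dynamically based on
 * the task's resource requirements.
 */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);   /* allow execution on core 0 */
    CPU_SET(1, &set);   /* ...and on core 1 */

    /* pid 0 means "the calling process" */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }

    printf("pid %d pinned to cores {0, 1}\n", getpid());
    return 0;
}
```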
doi:10.1145/2534645.2534651 dblp:conf/sc/ChoJLS13 fatcat:vozfjbkh6zbnvme3rykqjeyemy