Orchestration from the Cloud to the Edge [chapter]

Sergej Svorobej, Malika Bendechache, Frank Griesinger, Jörg Domaschka
2020, The Cloud-to-Thing Continuum
The effective management of complex and heterogeneous computing environments is one of the biggest challenges that service and infrastructure providers face in the cloud-to-thing continuum era. Advanced orchestration systems are required to support the resource management of large-scale cloud data centres integrated with the big-data generation of IoT devices. The orchestration system should be aware of all available resources and their current status in order to perform dynamic allocations and enable short deployment times for applications. This chapter reviews the state of the art in orchestration along the cloud-to-thing continuum, with a specific emphasis on container-based orchestration (e.g. Docker Swarm and Kubernetes) and fog-specific orchestration architectures (e.g. SORTS, SOAFI, ETSI ISG MEC, and CONCERT).

The inarguable success of cloud computing, combined with the rapid growth in adoption of Internet services, has resulted in an unprecedented demand for computing resources. However, cloud computing performance for many applications depends closely on network latency. In particular, the strength of network connectivity is crucial for large data sets. As more and more data is generated by enterprises and consumers, particularly with the adoption of the Internet of Things (IoT), traditional cloud connectivity may not be sufficient (Carnevale et al. 2018). To make up for the lack of speed and connectivity of conventional cloud computing, processing for mission-critical applications will need to occur closer to the data source. Processing data close to where it originates is referred to as edge computing and fog computing.

Edge computing pushes computing applications, data, and services away from centralised cloud data centre architectures to the edges of the underlying network (Barika et al. 2019). NIST (Iorga et al. 2018) defines it "as a local computing at the network layer encompassing the smart end-devices and their users. It runs specific applications in a fixed logic location and provides a direct transmission service." It promises to reduce the amount of data pushed to centralised cloud data centres, avoiding load on the network, and is therefore beneficial for analytics and knowledge-based services. Edge computing also leads to lower latencies, hence increasing communication velocity, reducing the wider network footprint and avoiding congestion.
As it reduces the distance that data must travel, it boosts the performance and reliability of latency-critical applications and services.

Service orchestration is an arrangement of auxiliary system components that cloud providers can use for the coordination and management of computing resources to ensure service provision to cloud consumers (Bohn et al. 2011). Orchestration can also be defined as the use of programming technology to manage the interconnections and interactions among
doi:10.1007/978-3-030-41110-7_4