AMBLE: A Context-Aware Mobile Learning Framework

Brita Curum, Kavi Khedo
2018 EAI Endorsed Transactions on Context-aware Systems and Applications  
INTRODUCTION: Mobile learning has become a pivotal learning trend. With the increasing use of sophisticated smartphones equipped with sensors and augmented-reality tools, mobile learning platforms are expected to deliver tailor-made, customized learning elements to learners. Context-awareness is regarded as the fundamental approach for elevating this learning style so that adaptive, personalized learning elements can be delivered on mobile devices.

OBJECTIVES: The main objective in mobile learning is to make learning elements as flexible as possible, using different forms of context data to extend the natural adaptation capabilities of mobile devices and engage learners in extremely rich environments.

METHODS: In this paper, a context-aware MoBile LEarning framework, namely the AMBLE framework, is proposed. It processes contextual data at four distinct layers, namely the Sensing Layer, Adaptation Layer, Context Processing Layer, and Application Layer, to adapt learning content to the learner's actual environment and conditions.

RESULTS: A partial implementation of the proposed framework shows the potential to capture and represent physical context information that can drive dynamic adaptation of learning content and thus significantly improve mobile learning experiences. Further work is expected on the implementation of the remaining layers and components of the framework, including the user model, the context manager, and the adaptation engine.

CONCLUSION: The AMBLE framework performs some relevant content adaptations with positive results. As future work, new forms of user-context adaptation, synthesized with other extracted sets of contextual information, will be used to establish and align relevant dynamic adaptation and personalization of learning content.
doi:10.4108/eai.13-7-2018.162824