
Unobtrusive and Multimodal Approach for Behavioral Engagement Detection of Students [article]

Nese Alyuz, Eda Okur, Utku Genc, Sinem Aslan, Cagri Tanriover, Asli Arslan Esme
2019 arXiv   pre-print
We propose a multimodal approach for detection of students' behavioral engagement states (i.e., On-Task vs.  ...  Off-Task), based on three unobtrusive modalities: Appearance, Context-Performance, and Mouse.  ... 
arXiv:1901.05835v1 fatcat:73njb4q24vd4zlbhesyjxsbziq

Detecting Drowsy Learners at the Wheel of e-Learning Platforms with Multimodal Learning Analytics

Ryosuke Kawamura, Shizuka Shirai, Noriko Takemura, Mehrasa Alizadeh, Mutlu Cukurova, Haruo Takemura, Hajime Nagahara
2021 IEEE Access  
Although wakefulness of learners strongly relates to educational outcomes, detecting drowsy learning behaviors only from log data is not an easy task.  ...  We collected multimodal data from learners in a blended course of informatics and conducted two types of analysis on them.  ...  Those studies show the benefits of hybrid over single-mode measures and provide evidence for the potential of multimodal data to gain insights into detecting drowsy behaviors automatically.  ... 
doi:10.1109/access.2021.3104805 fatcat:nkmz62lle5de5ph3mlnnn7dmne

Multimodal Affect Detection in the Wild

Nigel Bosch
2015 Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI '15  
Affect detection is an important component of computerized learning environments that adapt the interface and materials to students' affect.  ...  Results are presented for completed work evaluating the accuracy of individual-modality face- and interaction-based detectors, accuracy and availability of a multimodal combination of these modalities, and  ...  Any opinions, findings and conclusions, or recommendations expressed in this paper do not necessarily reflect the views of the NSF or the Bill & Melinda Gates Foundation.  ... 
doi:10.1145/2818346.2823316 dblp:conf/icmi/Bosch15 fatcat:szsiuif5hzc4feoi74jdf6hstq

Relevance of learning analytics to measure and support students' learning in adaptive educational technologies

Maria Bannert, Inge Molenaar, Roger Azevedo, Sanna Järvelä, Dragan Gašević
2017 Proceedings of the Seventh International Learning Analytics & Knowledge Conference on - LAK '17  
The aim is to develop our understanding of multimodal data that unobtrusively capture cognitive, meta-cognitive, affective and motivational states of learners over time.  ...  In this poster, we describe the aim and current activities of the EARLI-Centre for Innovative Research (E-CIR) "Measuring and Supporting Student's Self-Regulated Learning in Adaptive Educational Technologies  ...  More specifically, combining multimodal data can reveal both cognitive and affective states of the learner and can detect arousal levels and the valence of emotional reactions.  ... 
doi:10.1145/3027385.3029463 fatcat:zkzkgb6lxbbetlbebrigbv2umi

Accuracy vs. Availability Heuristic in Multimodal Affect Detection in the Wild

Nigel Bosch, Huili Chen, Sidney D'Mello, Ryan Baker, Valerie Shute
2015 Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI '15  
This paper discusses multimodal affect detection from a fusion of facial expressions and interaction features derived from students' interactions with an educational game in the noisy real-world context  ...  Balancing the accuracy vs. applicability tradeoff appears to be an important feature of multimodal affect detection.  ...  Separate strategies will be used for each affective state and off-task behavior.  ... 
doi:10.1145/2818346.2820739 dblp:conf/icmi/BoschCDBS15 fatcat:6g6ms4hk2rarxdwy5gfm6wzg2i

Multimodal data capabilities for learning: What can multimodal data tell us about learning?

Kshitij Sharma, Michail Giannakos
2020 British Journal of Educational Technology  
In particular, we identified the six main objectives (ie, behavioral trajectories, learning outcome, learning-task performance, teacher support, engagement and student feedback) that the MMLA research  ...  This study presents the outcomes of a systematic literature review of empirical evidence on the capabilities of multimodal data (MMD) for human learning.  ...  We would like to thank Manolis Mavrikis, the editor-in-chief of BJET, for his comments and suggestions.  ... 
doi:10.1111/bjet.12993 fatcat:klpx2g4clbdxviv2jmwypii2eq

Human-centered collaborative interaction

Paulo Barthelmess, Edward Kaiser, Rebecca Lunsford, David McGee, Philip Cohen, Sharon Oviatt
2006 Proceedings of the 1st ACM international workshop on Human-centered multimedia - HCM '06  
In this paper we examine the problem of providing effective support for collaboration, focusing on the role of human-centered approaches that take advantage of multimodality and multimedia.  ...  Human-centered multimedia and multimodal approaches hold a promise of providing substantially enhanced user experiences by focusing attention on human perceptual and motor capabilities, and on actual user  ...  Natural multimodal language reveals a variety of semantic cues that can be exploited as mechanisms for automatic detection of intentions and meaning without requiring direct user intervention.  ... 
doi:10.1145/1178745.1178747 fatcat:r2lhqi25xzh5rif7l7yjdgznyy

ICMI 2013 grand challenge workshop on multimodal learning analytics

Louis-Philippe Morency, Sharon Oviatt, Stefan Scherer, Nadir Weibel, Marcelo Worsley
2013 Proceedings of the 15th ACM on International conference on multimodal interaction - ICMI '13  
It also contributes to improving pedagogical support for students' learning through assessment of new digital tools, teaching strategies, and curricula.  ...  Advances in learning analytics are contributing new empirical findings, theories, methods, and metrics for understanding how students learn.  ...  We anticipate that embarking on multimodal learning analytics research will create new technical challenges for machine learning and multimodal researchers, and also provide advances for an important application  ... 
doi:10.1145/2522848.2534669 dblp:conf/icmi/MorencyOSWW13 fatcat:aclgiabsandu7hvg3zenl55nom

Realizing a Mobile Multimodal Platform for Serious Games Analytics

Laila Shoukry, Stefan Göbel
2019 International Journal of Serious Games  
This paper presents the design and development of "StoryPlay Multimodal", a mobile multimodal analytics platform for the evaluation of Serious Games.  ...  This is done by capturing, pre-processing, synchronizing and visualizing multimodal serious games analytics and mobile sensor data from playtesting sessions.  ...  The feature of recording and replaying face expressions of game players is presented as having many advantages for evaluation: It is an unobtrusive way of observing players' engagement and involuntary  ... 
doi:10.17083/ijsg.v6i4.323 fatcat:c6yr6hfnlvegtldq7yscxuqoya

Holistic Analysis of the Classroom

Mirko Raca, Pierre Dillenbourg
2014 Proceedings of the 2014 ACM workshop on Multimodal Learning Analytics Workshop and Grand Challenge - MLA '14  
For this purpose, we developed a multi-camera system for observing teacher actions and students' reactions throughout the class.  ...  Up until recently, the latter approach was all too inaccessible due to the complexity and time needed to evaluate every class.  ...  for face detection.  ... 
doi:10.1145/2666633.2666636 dblp:conf/icmi/RacaD14 fatcat:ksyg5bmqdndy5ojkwz7nrxeujq

A Survey on Human Emotion Recognition Approaches, Databases and Applications

C. Vinola, K. Vimaladevi
2015 ELCVIA Electronic Letters on Computer Vision and Image Analysis  
The modalities and approaches used for affect detection vary and contribute to accuracy and efficacy in detecting emotions of human beings.  ...  Thus an integrated discussion of the methods, databases, and applications pertaining to the emerging field of Affective Computing (AC) is presented.  ...  In [74], an attempt at the development of real-time automated recognition of engagement from students' facial expressions is again explored.  ... 
doi:10.5565/rev/elcvia.795 fatcat:wlaxn2xgczhm7dhkekk2atfcuq

A Data-Driven Approach to Quantify and Measure Students' Engagement in Synchronous Virtual Learning Environments

Xavier Solé-Beteta, Joan Navarro, Brigita Gajšek, Alessandro Guadagni, Agustín Zaballos
2022 Sensors  
To validate the feasibility of this approach, a software prototype has been implemented to measure student engagement in two different learning activities in a synchronous learning session: a masterclass  ...  The purpose of this paper is to propose a methodology and its associated model to measure student engagement in VLEs that can be obtained from the systematic analysis of more than 30 types of digital interactions  ...  Acknowledgments: The authors would like to acknowledge Edmon Bosch and Artur Alcoverro for their commitment and support on this research.  ... 
doi:10.3390/s22093294 fatcat:dyqidokcn5ep7azcd4jlv34poq

Exploring Emergent Features of Student Interaction within an Embodied Science Learning Simulation

Jina Kang, Robb Lindgren, James Planey
2018 Multimodal Technologies and Interaction  
Implications for the design of multimodal interaction technologies and the metrics that were used to investigate different types of students' interactions while learning are discussed.  ...  Recent research suggests that these theories can be successfully applied to the design of learning environments, and new technologies enable multimodal platforms that respond to students' natural physical  ...  Acknowledgments: We thank the entire ELASTIC 3 S team for their help with design, development, and data collections.  ... 
doi:10.3390/mti2030039 fatcat:7rwqw5ybuzdb5jffrq3ijpvaqy

Teaching analytics

Luis P. Prieto, Kshitij Sharma, Pierre Dillenbourg, María Jesús
2016 Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK '16  
and models for teaching activity extraction, as well as the collection of a larger multimodal dataset to improve the accuracy and generalizability of these methods.  ...  Reliably detecting concrete teaching activities (e.g., explanation vs. questioning) still remains challenging (67%, κ=0.56), a fact that will prompt further research on multimodal features  ...  of university lectures, Raca and Dillenbourg [35] take an unobtrusive computer vision approach to assess student attention from their posture and other behavioral cues.  ... 
doi:10.1145/2883851.2883927 dblp:conf/lak/PrietoSDJ16 fatcat:iwq6b7hcr5e4hisemgorf3fyuy

A Taxonomy in Robot-Assisted Training: Current Trends, Needs and Challenges

Konstantinos Tsiakas, Maria Kyrarini, Vangelis Karkaletsis, Fillia Makedon, Oliver Korn
2018 Technologies  
The proposed taxonomy suggests a set of categories and parameters that can be used to characterize such systems, considering the current research trends and needs for the design, development and evaluation  ...  The goal is to identify and discuss open challenges, highlighting the different aspects of a Robot-Assisted Training system, considering both robot perception and behavior control.  ...  for multimodal data collection and analysis for human behavior modeling.  ... 
doi:10.3390/technologies6040119 fatcat:o43ockt4inbstojrq5wxb62pia
Showing results 1 — 15 of 562