558 Hits in 3.8 sec

Unobtrusive and Multimodal Approach for Behavioral Engagement Detection of Students [article]

Nese Alyuz, Eda Okur, Utku Genc, Sinem Aslan, Cagri Tanriover, Asli Arslan Esme
2019 arXiv   pre-print
We propose a multimodal approach for detection of students' behavioral engagement states (i.e., On-Task vs. Off-Task), based on three unobtrusive modalities: Appearance, Context-Performance, and Mouse. ...
arXiv:1901.05835v1 fatcat:73njb4q24vd4zlbhesyjxsbziq

Detecting Drowsy Learners at the Wheel of e-Learning Platforms with Multimodal Learning Analytics

Ryosuke Kawamura, Shizuka Shirai, Noriko Takemura, Mehrasa Alizadeh, Mutlu Cukurova, Haruo Takemura, Hajime Nagahara
2021 IEEE Access  
Although wakefulness of learners strongly relates to educational outcomes, detecting drowsy learning behaviors only from log data is not an easy task. ... We collected multimodal data from learners in a blended informatics course and conducted two types of analysis on them. ... There was no robust evaluation of the results; however, an average accuracy of 75.2% was reported for the student engagement measure. ...
doi:10.1109/access.2021.3104805 fatcat:nkmz62lle5de5ph3mlnnn7dmne
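The entry above combines facial cues with learning-platform log data to flag drowsy learning behavior. As a rough illustration of that kind of rule-based multimodal check, here is a minimal Python sketch; the window fields, thresholds, and decision rule are assumptions for the example, not the detectors or values from the paper.

# Hypothetical sketch: flag possible drowsiness by combining a face-based
# eye-closure ratio with inactivity gaps from clickstream logs.
# Thresholds and field names are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Window:
    eye_closure_ratio: float          # fraction of frames with eyes closed in the window
    seconds_since_last_event: float   # idle time in the learning platform log

def is_drowsy(w: Window,
              closure_threshold: float = 0.3,
              idle_threshold: float = 120.0) -> bool:
    """Flag a time window as 'drowsy' only when both modalities agree."""
    return (w.eye_closure_ratio >= closure_threshold
            and w.seconds_since_last_event >= idle_threshold)

def drowsy_windows(windows: List[Window]) -> List[int]:
    """Return the indices of windows flagged as drowsy."""
    return [i for i, w in enumerate(windows) if is_drowsy(w)]

if __name__ == "__main__":
    sample = [Window(0.05, 10), Window(0.45, 300), Window(0.35, 20)]
    print(drowsy_windows(sample))  # -> [1]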

Multimodal Affect Detection in the Wild

Nigel Bosch
2015 Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI '15  
Affect detection is an important component of computerized learning environments that adapt the interface and materials to students' affect. ... Results are presented for completed work evaluating the accuracy of individual-modality face- and interaction-based detectors, the accuracy and availability of a multimodal combination of these modalities, and ... Any opinions, findings and conclusions, or recommendations expressed in this paper do not necessarily reflect the views of the NSF or the Bill & Melinda Gates Foundation. ...
doi:10.1145/2818346.2823316 dblp:conf/icmi/Bosch15 fatcat:szsiuif5hzc4feoi74jdf6hstq

Accuracy vs. Availability Heuristic in Multimodal Affect Detection in the Wild

Nigel Bosch, Huili Chen, Sidney D'Mello, Ryan Baker, Valerie Shute
2015 Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI '15  
This paper discusses multimodal affect detection from a fusion of facial expressions and interaction features derived from students' interactions with an educational game in the noisy real-world context  ...  Balancing the accuracy vs. applicability tradeoff appears to be an important feature of multimodal affect detection.  ...  For example, a bored student might not interact very frequently, or might aimlessly repeat an action. An engaged student might be more likely to try various strategies in the game and succeed more.  ... 
doi:10.1145/2818346.2820739 dblp:conf/icmi/BoschCDBS15 fatcat:6g6ms4hk2rarxdwy5gfm6wzg2i
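The accuracy vs. availability trade-off discussed in this entry can be illustrated with a minimal late-fusion sketch: each modality-specific detector produces a probability only when its input is usable (e.g., the face may be occluded), and the fused detector falls back to whichever modality is present. The function below is a toy example under those assumptions, not the detectors or fusion scheme from the paper.

# Minimal late-fusion sketch of the accuracy-vs-availability idea.
# Detector outputs here are placeholders, not the models from the paper.
from typing import Optional

def fuse_affect(p_face: Optional[float],
                p_interaction: Optional[float]) -> Optional[float]:
    """Return a fused probability of the target affective state.

    None means the corresponding modality (or the fused detector) is
    unavailable for this time window.
    """
    available = [p for p in (p_face, p_interaction) if p is not None]
    if not available:
        return None                             # neither modality usable: no prediction
    return sum(available) / len(available)      # simple unweighted average

# Coverage increases because a prediction is made whenever at least one
# modality is present, typically at the cost of lower accuracy in fallback cases.
print(fuse_affect(0.8, 0.6))    # both modalities -> 0.7
print(fuse_affect(None, 0.6))   # face occluded   -> 0.6
print(fuse_affect(None, None))  # no data         -> None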

Multimodal data capabilities for learning: What can multimodal data tell us about learning?

Kshitij Sharma, Michail Giannakos
2020 British Journal of Educational Technology  
In particular, we identified the six main objectives (i.e., behavioral trajectories, learning outcomes, learning-task performance, teacher support, engagement, and student feedback) that the MMLA research ... This study presents the outcomes of a systematic literature review of empirical evidence on the capabilities of multimodal data (MMD) for human learning. ... We would like to thank Manolis Mavrikis, the editor-in-chief of BJET, for his comments and suggestions. ...
doi:10.1111/bjet.12993 fatcat:klpx2g4clbdxviv2jmwypii2eq

Human-centered collaborative interaction

Paulo Barthelmess, Edward Kaiser, Rebecca Lunsford, David McGee, Philip Cohen, Sharon Oviatt
2006 Proceedings of the 1st ACM international workshop on Human-centered multimedia - HCM '06  
In this paper we examine the problem of providing effective support for collaboration, focusing on the role of human-centered approaches that take advantage of multimodality and multimedia. ... Recent years have witnessed an increasing shift in interest from single-user multimedia/multimodal interfaces towards support for interaction among groups of people working closely together, e.g. during ... Natural multimodal language reveals a variety of semantic cues that can be exploited as mechanisms for automatic detection of intentions and meaning without requiring direct user intervention. ...
doi:10.1145/1178745.1178747 fatcat:r2lhqi25xzh5rif7l7yjdgznyy

ICMI 2013 grand challenge workshop on multimodal learning analytics

Louis-Philippe Morency, Sharon Oviatt, Stefan Scherer, Nadir Weibel, Marcelo Worsley
2013 Proceedings of the 15th ACM on International conference on multimodal interaction - ICMI '13  
Multimodal learning analytics (MMLA) [1] is an extension of learning analytics and emphasizes the analysis of natural rich modalities of communication across a variety of learning contexts.  ...  It also contributes to improving pedagogical support for students' learning through assessment of new digital tools, teaching strategies, and curricula.  ...  We anticipate that embarking on multimodal learning analytics research will create new technical challenges for machine learning and multimodal researchers, and also provide advances for an important application  ... 
doi:10.1145/2522848.2534669 dblp:conf/icmi/MorencyOSWW13 fatcat:aclgiabsandu7hvg3zenl55nom

Realizing a Mobile Multimodal Platform for Serious Games Analytics

Laila Shoukry, Stefan Göbel
2019 International Journal of Serious Games  
This paper presents the design and development of "StoryPlay Multimodal", a mobile multimodal analytics platform for the evaluation of Serious Games. ... This is done by capturing, pre-processing, synchronizing and visualizing multimodal serious games analytics and mobile sensor data from playtesting sessions. ... The feature of recording and replaying facial expressions of game players is presented as having many advantages for evaluation: it is an unobtrusive way of observing players' engagement and involuntary ...
doi:10.17083/ijsg.v6i4.323 fatcat:c6yr6hfnlvegtldq7yscxuqoya

A Survey on Human Emotion Recognition Approaches, Databases and Applications

C. Vinola, K. Vimaladevi
2015 ELCVIA Electronic Letters on Computer Vision and Image Analysis  
The modalities and approaches used for affect detection vary and contribute to the accuracy and efficacy of detecting human emotions. ... Thus, an integrated discussion and survey of the methods, databases, and applications pertaining to the emerging field of Affective Computing (AC) is presented. ... In [74], an attempt at real-time automated recognition of engagement from students' facial expressions is explored. ...
doi:10.5565/rev/elcvia.795 fatcat:wlaxn2xgczhm7dhkekk2atfcuq

Holistic Analysis of the Classroom

Mirko Raca, Pierre Dillenbourg
2014 Proceedings of the 2014 ACM workshop on Multimodal Learning Analytics Workshop and Grand Challenge - MLA '14  
For this purpose, we developed a multi-camera system for observing teacher actions and students' reactions throughout the class. ... Up until recently, the latter approach was largely inaccessible due to the complexity and time needed to evaluate every class. ... for face detection. ...
doi:10.1145/2666633.2666636 dblp:conf/icmi/RacaD14 fatcat:ksyg5bmqdndy5ojkwz7nrxeujq

A Data-Driven Approach to Quantify and Measure Students' Engagement in Synchronous Virtual Learning Environments

Xavier Solé-Beteta, Joan Navarro, Brigita Gajšek, Alessandro Guadagni, Agustín Zaballos
2022 Sensors  
To validate the feasibility of this approach, a software prototype has been implemented to measure student engagement in two different learning activities in a synchronous learning session: a masterclass  ...  The purpose of this paper is to propose a methodology and its associated model to measure student engagement in VLEs that can be obtained from the systematic analysis of more than 30 types of digital interactions  ...  Acknowledgments: Authors would like to acknowledge Edmon Bosch and Artur Alcoverro for their commitment and support on this research.  ... 
doi:10.3390/s22093294 fatcat:dyqidokcn5ep7azcd4jlv34poq
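The entry above derives student engagement from more than 30 types of digital interactions in a virtual learning environment. As a rough illustration of turning interaction counts into a normalized engagement score, the sketch below uses interaction types, weights, and a normalization constant that are assumptions chosen for the example, not the paper's model.

# Illustrative sketch (not the paper's model): weighted interaction counts
# observed during a synchronous session, normalized to a [0, 1] score.
from typing import Dict

WEIGHTS: Dict[str, float] = {
    "chat_message": 1.0,
    "poll_answer": 2.0,
    "hand_raise": 1.5,
    "mic_activation": 2.5,
    "camera_on_minutes": 0.1,
}

def engagement_score(counts: Dict[str, int], max_expected: float = 50.0) -> float:
    """Weighted sum of interaction counts, clipped to [0, 1]."""
    raw = sum(WEIGHTS.get(kind, 0.0) * n for kind, n in counts.items())
    return min(raw / max_expected, 1.0)

print(engagement_score({"chat_message": 12, "poll_answer": 3, "hand_raise": 2}))  # 0.42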

Exploring Emergent Features of Student Interaction within an Embodied Science Learning Simulation

Jina Kang, Robb Lindgren, James Planey
2018 Multimodal Technologies and Interaction  
Implications for the design of multimodal interaction technologies and the metrics that were used to investigate different types of students' interactions while learning are discussed. ... This study examines how students engaged with an embodied mixed-reality science learning simulation using advanced gesture recognition techniques to support full-body interaction. ... Acknowledgments: We thank the entire ELASTIC3S team for their help with design, development, and data collection. ...
doi:10.3390/mti2030039 fatcat:7rwqw5ybuzdb5jffrq3ijpvaqy

A Taxonomy in Robot-Assisted Training: Current Trends, Needs and Challenges

Konstantinos Tsiakas, Maria Kyrarini, Vangelis Karkaletsis, Fillia Makedon, Oliver Korn
2018 Technologies  
The proposed taxonomy suggests a set of categories and parameters that can be used to characterize such systems, considering the current research trends and needs for the design, development and evaluation  ...  The goal is to identify and discuss open challenges, highlighting the different aspects of a Robot-Assisted Training system, considering both robot perception and behavior control.  ...  for multimodal data collection and analysis for human behavior modeling.  ... 
doi:10.3390/technologies6040119 fatcat:o43ockt4inbstojrq5wxb62pia

Teaching analytics

Luis P. Prieto, Kshitij Sharma, Pierre Dillenbourg, María Jesús Rodríguez-Triana
2016 Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK '16  
... and models for teaching activity extraction, as well as the collection of a larger multimodal dataset to improve the accuracy and generalizability of these methods. ... However, in the case of (often, half-improvised) teaching in face-to-face classrooms, such interventions would first require an understanding of what the teacher actually did, as the starting point for ... of university lectures, Raca and Dillenbourg [35] take an unobtrusive computer vision approach to assess student attention from their posture and other behavioral cues. ...
doi:10.1145/2883851.2883927 dblp:conf/lak/PrietoSDJ16 fatcat:iwq6b7hcr5e4hisemgorf3fyuy

LECTOR: Towards Reengaging Students in the Educational Process Inside Smart Classrooms [chapter]

Maria Korozi, Asterios Leonidis, Margherita Antona, Constantine Stephanidis
2017 Lecture Notes in Computer Science  
... activity recognition techniques) in order to identify inattentive behaviors and (ii) recommending interventions for motivating distracted students when deemed necessary. ... This paper presents the rationale behind the design of LECTOR and outlines its key features and facilities. ... to assist educators in monitoring students' behavior in the classroom and maximizing students' engagement in the task at hand. ...
doi:10.1007/978-3-319-72038-8_11 fatcat:qg46qtxp4rct7cjnhoa5wcfkhu
Showing results 1 — 15 out of 558 results