A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022.
The file type is application/pdf.
Multimodal Interaction Recognition Mechanism by Using Midas Featured By Data-Level and Decision-Level Fusion
2017
Lahore Garrison University Research Journal of Computer Science and Information Technology
Natural User Interfaces (NUIs) that deal with gestures are an alternative to traditional input devices on multi-touch panels. Growth in sensor technology has increased the use of multiple sensors to deal with various monitoring and compatibility issues of machines. Research on data-level fusion models requires more focus on the fusion of multiple degradation-based sensor data. Midas, a novel declarative language for expressing multimodal interaction patterns, has come up with the idea of …
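The abstract contrasts data-level and decision-level fusion but is truncated before explaining either. The following minimal Python sketch only illustrates that general distinction; the sensor names, sample values, and threshold rule are assumptions made for illustration and do not reproduce Midas or the paper's actual recognition pipeline.

```python
# Illustrative sketch only: contrasts data-level vs. decision-level fusion with
# made-up sensor readings and a simple threshold "classifier" (both assumptions).

from statistics import mean


def data_level_fusion(readings_a, readings_b, threshold=0.5):
    """Data-level fusion: merge the raw samples from both sensors first,
    then apply a single decision rule to the combined signal."""
    combined = readings_a + readings_b          # raw streams merged before any decision
    return mean(combined) > threshold           # one classifier sees all raw data


def decision_level_fusion(readings_a, readings_b, threshold=0.5):
    """Decision-level fusion: each sensor is classified independently,
    and only the per-sensor decisions are combined (here by an AND vote)."""
    decision_a = mean(readings_a) > threshold   # sensor A decides on its own data
    decision_b = mean(readings_b) > threshold   # sensor B decides on its own data
    return decision_a and decision_b            # decisions, not raw data, are fused


if __name__ == "__main__":
    touch_sensor = [0.9, 0.8, 0.7]   # hypothetical normalized touch-pressure samples
    motion_sensor = [0.4, 0.3, 0.2]  # hypothetical normalized motion samples

    print("data-level    :", data_level_fusion(touch_sensor, motion_sensor))
    print("decision-level:", decision_level_fusion(touch_sensor, motion_sensor))
```

With these toy values the two strategies disagree (the pooled raw data crosses the threshold while the per-sensor vote does not), which is the kind of behavioral difference between fusion levels the paper's title refers to.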
doi:10.54692/lgurjcsit.2017.010227
fatcat:cqvkqnfafzf6fkz2bkmjcfybwq