Multimodal Interaction Recognition Mechanism by Using Midas Featured By Data-Level and Decision-Level Fusion

Muhammad Habib, Noor ul Qamar
2017, Lahore Garrison University Research Journal of Computer Science and Information Technology
Natural User Interfaces (NUIs) based on gestures are an alternative to traditional input devices on multi-touch panels. The rapid growth of sensor technology has increased the use of multiple sensors to address various monitoring and compatibility issues of machines. Research on data-level fusion models requires more focus on the fusion of multiple degradation-based sensor data. Midas, a novel declarative language for expressing multimodal interaction patterns, lets developers describe the required patterns through a multimodal interaction mechanism. As a base interface, the language minimizes complexity issues such as inversion of control and intermediary states by means of data fusion, data processing, and data selection, providing high-level programming abstractions.
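The title contrasts data-level and decision-level fusion. As a rough illustration only (this is not Midas syntax, and the sensor names, thresholds, and agreement rule are assumptions), the Python sketch below shows the basic difference: data-level fusion combines raw sensor readings before classification, while decision-level fusion classifies each sensor stream independently and then combines the per-sensor decisions.

```python
# Illustrative sketch only: NOT Midas syntax. A minimal contrast between
# data-level and decision-level fusion for two hypothetical gesture sensors.
from statistics import mean

def data_level_fusion(touch_samples, depth_samples):
    """Fuse raw readings first, then classify the combined signal."""
    fused_signal = [mean(pair) for pair in zip(touch_samples, depth_samples)]
    return "swipe" if mean(fused_signal) > 0.5 else "idle"

def decision_level_fusion(touch_samples, depth_samples):
    """Let each sensor classify independently, then combine the decisions."""
    touch_vote = "swipe" if mean(touch_samples) > 0.5 else "idle"
    depth_vote = "swipe" if mean(depth_samples) > 0.5 else "idle"
    # Simple agreement rule; a real system would weight per-sensor confidence.
    return touch_vote if touch_vote == depth_vote else "uncertain"

if __name__ == "__main__":
    touch = [0.7, 0.8, 0.6]   # normalized readings from a touch panel (assumed)
    depth = [0.4, 0.5, 0.6]   # normalized readings from a depth camera (assumed)
    print(data_level_fusion(touch, depth))      # classifies the fused signal
    print(decision_level_fusion(touch, depth))  # combines per-sensor decisions
```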
doi:10.54692/lgurjcsit.2017.010227