A quantum-inspired multimodal sentiment analysis framework

Yazhou Zhang, Dawei Song, Peng Zhang, Panpan Wang, Jingfei Li, Xiang Li, Benyou Wang
2018 Theoretical Computer Science  
Abstract
Multimodal sentiment analysis aims to capture the diversified sentiment information implied in data of different modalities (e.g., an image associated with a textual description or a set of textual labels). The key challenge is rooted in the "semantic gap" between low-level content features and high-level semantic information. Existing approaches generally combine multimodal features in a somewhat heuristic way, and how to employ and combine information from different sources effectively remains an important yet largely unsolved problem. To address this problem, we propose a Quantum-inspired Multimodal Sentiment Analysis (QMSA) framework. The framework consists of a Quantum-inspired Multimodal Representation (QMR) model, which aims to fill the "semantic gap" and model the correlations between different modalities via density matrices, and a Multimodal decision Fusion strategy inspired by Quantum Interference (QIMF) in the double-slit experiment, in which the sentiment label is analogous to a photon and the data modalities are analogous to slits. Extensive experiments are conducted on two large-scale datasets collected from the Getty Images and Flickr photo-sharing platforms. The experimental results show that our approach significantly outperforms a wide range of baselines and state-of-the-art methods.
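The double-slit analogy above implies a fused label probability that is a classical weighted mixture of the per-modality predictions plus an interference cross term. The following is a minimal sketch of that idea only; the function name `interference_fusion`, the modality weights, and the relative `phase` parameter are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def interference_fusion(p_text, p_image, w_text, w_image, phase):
    """Fuse two per-label probability vectors with a quantum-style
    interference term, in analogy to two slits. p_text/p_image are
    classifier outputs per label; w_text + w_image should equal 1;
    phase is the relative phase between the modalities (a hypothetical
    fitted parameter)."""
    p_text = np.asarray(p_text, dtype=float)
    p_image = np.asarray(p_image, dtype=float)
    # classical mixture plus the interference cross term
    # |sqrt(w1 p1) + e^{i*phase} sqrt(w2 p2)|^2 per label
    fused = (w_text * p_text + w_image * p_image
             + 2.0 * np.sqrt(w_text * w_image * p_text * p_image)
                   * np.cos(phase))
    fused = np.clip(fused, 0.0, None)  # guard against negative values
    return fused / fused.sum()         # renormalize to a distribution

# toy usage: text and image classifiers disagree on a 2-label task
probs = interference_fusion([0.7, 0.3], [0.4, 0.6], 0.6, 0.4, np.pi / 3)
```

When `phase = pi/2` the cross term vanishes and the fusion reduces to an ordinary weighted average, so the phase controls how strongly the modalities reinforce or cancel each other.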
doi:10.1016/j.tcs.2018.04.029