4,040 Hits in 3.1 sec

Probabilistic collaborative filtering with negative cross entropy

Alejandro Bellogin, Javier Parapar, Pablo Castells
2013 Proceedings of the 7th ACM conference on Recommender systems - RecSys '13  
Then we replace the rating prediction by weighted average in CF with the negative cross entropy (from IR) to incorporate the information learnt from the RM: r(u, i) = H(p(·|R_u); p(·|i) | N_k(u)) = Σ_{v∈ ...}  Further improvements are achieved when we use a complete probabilistic representation of the problem.  ...  • The complete probabilistic model (RMCE) achieves even larger improvements. • Improvements in performance are consistent across different datasets. We have also produced different mappings of the involved  ... 
doi:10.1145/2507157.2507191 dblp:conf/recsys/BelloginPC13 fatcat:txhoxfpkwjfwngmungvfqp7hzu
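The entry above scores items with a negative cross entropy between two probability distributions restricted to a user's neighbourhood. A minimal sketch of that idea, assuming generic discrete distributions; the function name, the toy vectors, and the way the neighbourhood restriction is applied are illustrative, not the paper's exact formulation:

```python
import numpy as np

def neg_cross_entropy(p, q, eps=1e-12):
    """Negative cross entropy sum_x p(x) * log q(x) between two discrete
    distributions; higher is better, so it can serve as a ranking score."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(q + eps)))

# Toy example: both distributions are restricted to the k neighbours of u.
p_user = np.array([0.5, 0.3, 0.2])   # stand-in for p(. | R_u) over N_k(u)
p_item = np.array([0.4, 0.4, 0.2])   # stand-in for p(. | i)  over N_k(u)
score = neg_cross_entropy(p_user, p_item)
```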

Variational Autoencoders for Sparse and Overdispersed Discrete Data [article]

He Zhao, Piyush Rai, Lan Du, Wray Buntine, Mingyuan Zhou
2019 arXiv   pre-print
We conduct extensive experiments on three important problems from discrete data analysis: text analysis, collaborative filtering, and multi-label learning.  ...  Compared with several state-of-the-art baselines, the proposed models achieve significantly better performance on the above problems.  ...  With extensive experiments on both large-scale bag-of-words corpora and collaborative filtering datasets, we demonstrate that the negative-binomial distribution and its binary variant are better choices  ... 
arXiv:1905.00616v2 fatcat:nemcgjoaszfzjb7ufw7xffvuiu
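The entry above argues for negative-binomial likelihoods on sparse, overdispersed counts. A minimal sketch of evaluating such a likelihood on a count vector, using SciPy's standard (r, p) parameterisation; the shared toy parameters and variable names are assumptions, not the paper's decoder:

```python
import numpy as np
from scipy.stats import nbinom

def nb_log_likelihood(counts, r, p):
    """Sum of independent negative-binomial log-pmfs, one per word/item.
    r: dispersion ("number of failures"), p: success probability; both
    broadcast against counts."""
    return float(np.sum(nbinom.logpmf(counts, r, p)))

counts = np.array([0, 3, 0, 12, 1])   # e.g. one bag-of-words row
r, p = 2.0, 0.4                       # shared toy parameters
ll = nb_log_likelihood(counts, r, p)
```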

Enhancing VAEs for collaborative filtering

Daeryong Kim, Bongwon Suh
2019 Proceedings of the 13th ACM Conference on Recommender Systems - RecSys '19  
Neural network based models for collaborative filtering have started to gain attention recently.  ...  We also show that VampPriors coupled with gating mechanisms outperform SOTA results including the Variational Autoencoder for Collaborative Filtering by meaningful margins on 2 popular benchmark datasets  ...  Revisiting equation (3), we can see that only the cross-entropy term is associated with the prior.  ... 
doi:10.1145/3298689.3347015 dblp:conf/recsys/KimS19 fatcat:evkubhaurjhhnlcqr22psgcjcy
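The entry above credits gating mechanisms (together with VampPriors) for the improvement. A minimal sketch of a gated dense layer of the GLU kind often used in such encoders; the weight shapes and activation choices are illustrative assumptions, not the paper's exact layer:

```python
import numpy as np

def gated_dense(x, W_h, b_h, W_g, b_g):
    """Gated dense layer: tanh(x W_h + b_h) multiplied element-wise by a
    sigmoid gate sigmoid(x W_g + b_g), so the gate controls information flow."""
    h = np.tanh(x @ W_h + b_h)
    gate = 1.0 / (1.0 + np.exp(-(x @ W_g + b_g)))
    return h * gate

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                                # batch of 4, dim 8
W_h, W_g = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
b_h, b_g = np.zeros(16), np.zeros(16)
out = gated_dense(x, W_h, b_h, W_g, b_g)                   # shape (4, 16)
```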

Neural Collaborative Ranking [article]

Bo Song, Xin Yang, Yi Cao, Congfu Xu
2018 arXiv   pre-print
We combine our classification strategy with the recently proposed neural collaborative filtering framework, and propose a general collaborative ranking framework called Neural Network based Collaborative  ...  NeuMF assumes that the non-interacted items are inherently negative and uses negative sampling to relax this assumption.  ...  filtering method with binary cross-entropy loss.  ... 
arXiv:1808.04957v1 fatcat:f7rn4jtn3rbzlmkka5ixamntbi
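The entry above contrasts pairwise ranking with pointwise collaborative filtering trained with a binary cross-entropy loss over observed positives and sampled negatives. A minimal sketch of that loss, with illustrative score arrays rather than any particular model:

```python
import numpy as np

def bce_implicit_loss(pos_scores, neg_scores, eps=1e-12):
    """Pointwise binary cross-entropy: observed interactions are labelled 1,
    sampled non-interacted items are labelled 0."""
    p_pos = 1.0 / (1.0 + np.exp(-np.asarray(pos_scores, dtype=float)))
    p_neg = 1.0 / (1.0 + np.exp(-np.asarray(neg_scores, dtype=float)))
    total = np.log(p_pos + eps).sum() + np.log(1.0 - p_neg + eps).sum()
    return float(-total / (len(pos_scores) + len(neg_scores)))

loss = bce_implicit_loss(pos_scores=[2.1, 0.3], neg_scores=[-1.0, 0.4, -0.2])
```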

DeepCF: A Unified Framework of Representation Learning and Matching Function Learning in Recommender System [article]

Zhi-Hong Deng, Ling Huang, Chang-Dong Wang, Jian-Huang Lai, Philip S. Yu
2019 arXiv   pre-print
To this end, we propose a general framework named DeepCF, short for Deep Collaborative Filtering, to combine the strengths of the two types of methods and overcome such flaws.  ...  performs deep matrix factorization with a normalized cross-entropy loss as the loss function.  ...  Performing collaborative filtering on implicit data, which lacks real negative feedback, is also known as the One-Class Collaborative Filtering (OCCF) problem (Pan et al. 2008).  ... 
arXiv:1901.04704v1 fatcat:tbwqzkjoqfgt7ciidwathwcy5m
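The entry above combines representation learning and matching function learning in one framework. A minimal sketch of that general idea, with one branch based on an element-wise product of embeddings and one MLP branch on their concatenation, fused before a sigmoid; the shapes, activation, and fusion are rough assumptions, not the authors' exact architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fused_score(p_u, q_i, W_mlp, b_mlp, w_out):
    """Fuse a representation-learning branch (element-wise product) with a
    matching-function branch (MLP over the concatenation), then squash."""
    rep_branch = p_u * q_i                                   # shape (d,)
    match_branch = np.tanh(np.concatenate([p_u, q_i]) @ W_mlp + b_mlp)
    fused = np.concatenate([rep_branch, match_branch])
    return float(sigmoid(fused @ w_out))

rng = np.random.default_rng(1)
d, h = 8, 16
p_u, q_i = rng.normal(size=d), rng.normal(size=d)
W_mlp, b_mlp = rng.normal(size=(2 * d, h)), np.zeros(h)
w_out = rng.normal(size=d + h)
y_hat = fused_score(p_u, q_i, W_mlp, b_mlp, w_out)           # predicted interaction
```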

Implicit Feedback Deep Collaborative Filtering Product Recommendation System [article]

Karthik Raja Kalaiselvi Bhaskar, Deepa Kundur, Yuri Lawryshyn
2020 arXiv   pre-print
In this paper, several Collaborative Filtering (CF) approaches with latent variable methods were studied using user-item interactions to capture important hidden variations of the sparse customer purchasing  ...  CF with Neural Collaborative Filtering (NCF) was shown to produce the highest Normalized Discounted Cumulative Gain (NDCG) performance on the real-world proprietary dataset provided by a large parts supply  ...  Matrix Factorization with Alternating Least Squares (ALS) [10], Bayesian Personalized Ranking (BPR) [12], Neural Collaborative Filtering (NCF) [11], and Autoencoder for Collaborative Filtering (ACF  ... 
arXiv:2009.08950v2 fatcat:h6jysprlhzckjoo2kfccs72rxi
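The entry above reports ranking quality as Normalized Discounted Cumulative Gain (NDCG). A minimal sketch of NDCG@k for one user's ranked list of relevance labels; the function name and toy data are illustrative:

```python
import numpy as np

def ndcg_at_k(relevances_in_ranked_order, k):
    """NDCG@k = DCG@k of the produced ranking / DCG@k of the ideal ranking."""
    rel = np.asarray(relevances_in_ranked_order, dtype=float)
    gains = rel[:k] / np.log2(np.arange(2, min(k, rel.size) + 2))
    ideal = np.sort(rel)[::-1][:k]
    ideal_gains = ideal / np.log2(np.arange(2, ideal.size + 2))
    idcg = ideal_gains.sum()
    return float(gains.sum() / idcg) if idcg > 0 else 0.0

print(ndcg_at_k([1, 0, 1, 0, 0], k=3))   # relevance labels in ranked order
```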

DeepCF: A Unified Framework of Representation Learning and Matching Function Learning in Recommender System

Zhi-Hong Deng, Ling Huang, Chang-Dong Wang, Jian-Huang Lai, Philip S. Yu
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence and the Thirty-First Innovative Applications of Artificial Intelligence Conference  
To this end, we propose a general framework named DeepCF, short for Deep Collaborative Filtering, to combine the strengths of the two types of methods and overcome such flaws.  ...  performs deep matrix factorization with a normalized cross-entropy loss as the loss function.  ...  Performing collaborative filtering on implicit data, which lacks real negative feedback, is also known as the One-Class Collaborative Filtering (OCCF) problem (Pan et al. 2008).  ... 
doi:10.1609/aaai.v33i01.330161 fatcat:wgz74gy6unb2rfjojl4fwvgj5u

Deep Exercise Recommendation Model

Tuanji Gong, Xuanxia Yao
2019 International Journal of Modeling and Optimization  
In this paper, we propose a new hybrid recommendation model that combines a deep collaborative filtering (DeepCF) component with a wide linear component.  ...  We use a tightly coupled model to combine the SDAE model and the collaborative filtering model.  ...  We use the cross-entropy loss as the loss function: C = −Σ [ y log ŷ + (1 − y) log(1 − ŷ) ].  ... 
doi:10.7763/ijmo.2019.v9.677 fatcat:zlghod2ugjetlghcod3onoewgq

Is Simple Better? Revisiting Non-Linear Matrix Factorization for Learning Incomplete Ratings

Vaibhav Krishna, Tian Guo, Nino Antulov-Fantulin
2018 2018 IEEE International Conference on Data Mining Workshops (ICDMW)  
Matrix factorization techniques have been widely used as a method for collaborative filtering for recommender systems.  ...  Secondly, the architecture built is compared with deep-learning algorithms like the Restricted Boltzmann Machine and state-of-the-art deep matrix factorization techniques.  ...  The model was trained through a cross-entropy loss function designed to incorporate the explicit ratings into the cross entropy and to use both implicit and explicit ratings for optimization.  ... 
doi:10.1109/icdmw.2018.00183 dblp:conf/icdm/KrishnaGA18 fatcat:fpimimjpufa33bpyywkf33moxu
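The entry above trains with a cross-entropy loss that folds explicit ratings into an otherwise implicit objective. A minimal sketch of one way to do that, weighting the positive term by the normalised rating; this is an assumption about the general idea, not the paper's exact loss:

```python
import numpy as np

def rating_weighted_cross_entropy(y_hat, observed, ratings, max_rating=5.0,
                                  eps=1e-12):
    """Binary cross-entropy over user-item cells, where observed cells are
    labelled 1 and their positive term is weighted by the explicit rating."""
    y_hat = np.asarray(y_hat, dtype=float)
    observed = np.asarray(observed, dtype=float)
    weights = np.asarray(ratings, dtype=float) / max_rating
    pos = weights * observed * np.log(y_hat + eps)
    neg = (1.0 - observed) * np.log(1.0 - y_hat + eps)
    return float(-np.mean(pos + neg))

loss = rating_weighted_cross_entropy(y_hat=[0.9, 0.2, 0.6],
                                     observed=[1, 0, 1],
                                     ratings=[5, 0, 3])
```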

Collaborative Distillation for Top-N Recommendation [article]

Jae-woong Lee, Minjin Choi, Jongwuk Lee, Hyunjung Shim
2019 arXiv   pre-print
To address the issues, we propose a new knowledge distillation (KD) model for the collaborative filtering approach, namely collaborative distillation (CD).  ...  Specifically, (1) we reformulate a loss function to deal with the ambiguity of missing feedback. (2) We exploit probabilistic rank-aware sampling for the top-N recommendation. (3) To train the proposed  ...  Collaborative Filtering Loss: We design an improved collaborative filtering loss function to overcome the uncertainty of implicit data representation in CF.  ... 
arXiv:1911.05276v1 fatcat:bymyvyvsnvgo3ckeqcczgdw3ji
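The entry above relies on probabilistic rank-aware sampling of items for the top-N setting. A minimal sketch of one common form of such sampling, with the selection probability decaying in the teacher's rank; the exponential decay and the parameter names are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def rank_aware_sample(teacher_scores, n_samples, temperature=10.0, seed=0):
    """Sample item indices with probability decreasing in the teacher's rank,
    so highly ranked (likely relevant) unobserved items are drawn more often."""
    scores = np.asarray(teacher_scores, dtype=float)
    order = np.argsort(-scores)                  # best item first
    ranks = np.empty_like(order)
    ranks[order] = np.arange(scores.size)        # rank 0 = best
    probs = np.exp(-ranks / temperature)
    probs /= probs.sum()
    rng = np.random.default_rng(seed)
    return rng.choice(scores.size, size=n_samples, replace=False, p=probs)

sampled = rank_aware_sample(teacher_scores=np.random.rand(100), n_samples=10)
```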

Online learning for collaborative filtering

Guang Ling, Haiqin Yang, Irwin King, Michael R. Lyu
2012 The 2012 International Joint Conference on Neural Networks (IJCNN)  
To capture these changes, in this paper, we develop an online learning framework for collaborative filtering.  ...  Collaborative filtering (CF), aiming at predicting users' unknown preferences based on observational preferences from some users, has become one of the most successful methods for building recommender systems  ...  Probabilistic Matrix Factorization: Probabilistic Matrix Factorization (PMF) adopts a probabilistic linear model with Gaussian observation noise [11].  ... 
doi:10.1109/ijcnn.2012.6252670 dblp:conf/ijcnn/LingYKL12 fatcat:lmeqdxo47nbo5ai6mspo6wa4dm
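The entry above builds on Probabilistic Matrix Factorization, whose Gaussian observation model reduces to squared error with L2 regularisation and is therefore convenient for online, one-rating-at-a-time updates. A minimal sketch of a single stochastic gradient step under assumed hyperparameter names; the online framework in the paper may differ:

```python
import numpy as np

def pmf_sgd_step(U, V, u, i, rating, lr=0.01, reg=0.02):
    """One online SGD step on (user u, item i, rating): the Gaussian likelihood
    gives a squared-error gradient, the Gaussian priors give L2 shrinkage."""
    err = rating - U[u] @ V[i]
    grad_u = err * V[i] - reg * U[u]
    grad_v = err * U[u] - reg * V[i]
    U[u] += lr * grad_u
    V[i] += lr * grad_v

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(50, 8))   # 50 users, 8 latent factors
V = rng.normal(scale=0.1, size=(40, 8))   # 40 items
pmf_sgd_step(U, V, u=3, i=7, rating=4.0)
```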

An approach to the extraction of preference-related information from design team language

Haifeng Ji, Maria C. Yang, Tomonori Honda
2011 Research in Engineering Design  
are applied to convert these ratings into values that may be compared to the results of transcript analysis: the application of a modified Logit model and simulation based on the principle of maximum entropy  ...  The probabilistic approach proposed in the paper represents how likely a choice is to be "most preferred" by a design team over a given period of time.  ...  Collaborative filtering assumes that individuals with similar profiles gravitate toward the same choices.  ... 
doi:10.1007/s00163-011-0116-7 fatcat:fs2d3dahlrd5rcp4ieg3lji6eu
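The entry above converts ratings with a modified logit model and a maximum-entropy simulation. A minimal sketch of plain multinomial logit choice probabilities, which is the maximum-entropy distribution under a mean-utility constraint; the scale parameter and the toy utilities are illustrative, and the paper's modification is not reproduced here:

```python
import numpy as np

def logit_choice_probabilities(utilities, scale=1.0):
    """Multinomial logit: P(option j) proportional to exp(utility_j / scale)."""
    z = np.asarray(utilities, dtype=float) / scale
    z -= z.max()                       # shift for numerical stability
    expz = np.exp(z)
    return expz / expz.sum()

print(logit_choice_probabilities([1.2, 0.4, 0.9]))
```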

Variational Autoencoders for Collaborative Filtering

Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara
2018 Proceedings of the 2018 World Wide Web Conference on World Wide Web - WWW '18  
This non-linear probabilistic model enables us to go beyond the limited modeling capacity of linear factor models which still largely dominate collaborative filtering research.  ...  We extend variational autoencoders (VAEs) to collaborative filtering for implicit feedback.  ...  It is also used in the cross-entropy loss for multi-class classification.  ... 
doi:10.1145/3178876.3186150 dblp:conf/www/LiangKHJ18 fatcat:baidkwo2kvaldh3mr4meqlbxaa
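The entry above uses a multinomial likelihood over the item set, whose log-likelihood coincides, up to a constant, with the multi-class cross-entropy mentioned in the snippet. A minimal sketch of that reconstruction term for one user's click vector; the variable names are illustrative:

```python
import numpy as np

def multinomial_log_likelihood(x, logits):
    """sum_i x_i * log softmax(logits)_i: the reconstruction term when a
    softmax over all items models a user's bag of clicks."""
    logits = np.asarray(logits, dtype=float)
    logits = logits - logits.max()                       # stabilise
    log_softmax = logits - np.log(np.exp(logits).sum())
    return float(np.asarray(x, dtype=float) @ log_softmax)

x = np.array([0, 1, 0, 1, 1])                 # user clicked items 1, 3, 4
logits = np.array([0.2, 1.5, -0.3, 0.8, 0.1]) # decoder outputs for this user
ll = multinomial_log_likelihood(x, logits)
```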

Variational Autoencoders for Collaborative Filtering [article]

Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara
2018 arXiv   pre-print
This non-linear probabilistic model enables us to go beyond the limited modeling capacity of linear factor models which still largely dominate collaborative filtering research. We introduce a generative  ...  We extend variational autoencoders (VAEs) to collaborative filtering for implicit feedback.  ...  It is also used in the cross-entropy loss for multi-class classification.  ... 
arXiv:1802.05814v1 fatcat:qtdx2jcdfvdbjmfdtprcjxwasi

Epileptic Seizure Detection using Deep Learning Approach

Sirwan Tofiq Jaafar, Mokhtar Mohammadi
2019 UHD Journal of Science and Technology  
Many methods have been developed to help neurophysiologists detect seizure activity with high accuracy.  ...  5-fold cross-validation is selected for evaluating the performance of the proposed method, and an accuracy of about 97.75% is achieved.  ...  Accuracies of 92.80% with fuzzy entropy and 95.33% with distribution entropy were achieved.  ... 
doi:10.21928/uhdjst.v3n2y2019.pp41-50 fatcat:rupuw2sufbehncguhyp4aacvfe
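The entry above evaluates with 5-fold cross-validation and reports accuracy. A minimal sketch of that protocol using scikit-learn; the logistic-regression classifier and the synthetic arrays are placeholders standing in for the paper's deep model and EEG features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))              # placeholder feature matrix
y = rng.integers(0, 2, size=500)            # placeholder seizure / non-seizure labels

# 5-fold cross-validated accuracy of a simple stand-in classifier.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         cv=5, scoring="accuracy")
print(scores.mean())
```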
Showing results 1 — 15 out of 4,040 results