2019
R. Cañamares, M. Redondo, P. Castells.
Multi-Armed Recommender System Bandit Ensembles.
13th ACM Conference on Recommender Systems (RecSys 2019). Copenhagen, Denmark, September 2019, pp. 432-436. Poster.