A parameter-free KNN method for predicting user ratings
Abstract
Among the most popular collaborative filtering algorithms are methods based on nearest
neighbors. In their basic operation, KNN methods use a fixed number of neighbors to make
recommendations. However, choosing an appropriate number of neighbors is not easy: it is
generally fixed by calibration, since inappropriate values negatively affect the accuracy
of the recommendations.
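To make the calibration problem concrete, the following is a minimal sketch of the baseline described above: user-based KNN rating prediction with a fixed k, using cosine similarity and a similarity-weighted average. This is a generic illustration of standard KNN collaborative filtering, not the paper's method; the function name and the choice of cosine similarity are assumptions for the example.

```python
import numpy as np

def predict_rating(ratings, user, item, k=10):
    """Baseline user-based KNN prediction with a fixed k (illustration only,
    not the paper's method). ratings: users x items array, 0 = unrated."""
    # candidate neighbors: users who rated the target item
    raters = np.where(ratings[:, item] > 0)[0]
    raters = raters[raters != user]
    if raters.size == 0:
        return 0.0
    # cosine similarity between the target user and each candidate
    u = ratings[user]
    sims = np.array([
        np.dot(u, ratings[v])
        / (np.linalg.norm(u) * np.linalg.norm(ratings[v]) + 1e-12)
        for v in raters
    ])
    # keep the k most similar neighbors -- k is the parameter that
    # must be calibrated, which is the problem the paper addresses
    top = np.argsort(sims)[::-1][:k]
    neigh, w = raters[top], sims[top]
    if w.sum() <= 0:
        return 0.0
    # similarity-weighted average of the neighbors' ratings
    return float(np.dot(w, ratings[neigh, item]) / w.sum())
```

On a toy matrix, the predicted rating can change substantially as k varies, which is exactly why a fixed k requires calibration.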
In the literature, some authors have addressed the problem of dynamically finding an
appropriate number of neighbors, but their methods introduce additional parameters that
in turn require calibration, which limits their applicability.
In this paper, we propose a parameter-free KNN method for rating prediction that
dynamically selects an appropriate number of neighbors. Experiments on three publicly
available datasets demonstrate the effectiveness of our proposal: its accuracy rivals
that of state-of-the-art methods in their best configurations.