Abstract
K-Nearest Neighbours (k-NN) is a popular classification and regression algorithm, yet one of its main limitations is the difficulty of choosing the number of neighbours. We present a Bayesian algorithm to compute the posterior probability distribution for k given a target point within a data-set, efficiently and without the use of Markov Chain Monte Carlo (MCMC) methods or simulation—alongside an exact solution for distributions within the exponential family. The central idea is that data points around our target are generated by the same probability distribution, extending outwards over the appropriate, though unknown, number of neighbours. Once the data is projected onto a distance metric of choice, the choice of k becomes a change-point detection problem, for which there is an efficient solution: we recursively compute the probability of the last change-point as we move towards our target, and thus de facto compute the posterior probability distribution over k. Applying this approach to classification and regression UCI data-sets, our method compares favourably with existing approaches and, most importantly, by removing the need for simulation, we are able to compute the posterior probability of k exactly and rapidly. As an example, the computational time for the Ripley data-set is a few milliseconds, compared to a few hours when using an MCMC approach.
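The change-point formulation in the abstract can be illustrated with a simplified sketch for binary classification labels. Here the points are assumed to be sorted by distance from the target, a single change-point is assumed at position k (the k nearest labels share one Bernoulli parameter, the remaining labels another), and conjugate Beta-Bernoulli marginal likelihoods are used so the posterior over k is exact. The function names, the uniform prior over k, and the single-change-point restriction are illustrative assumptions; the paper's actual algorithm computes the posterior recursively over the last change-point and handles general exponential-family likelihoods.

```python
import math
from typing import List


def beta_bernoulli_log_marginal(labels: List[int], a: float = 1.0, b: float = 1.0) -> float:
    """Exact log marginal likelihood of binary labels under a Beta(a, b)-Bernoulli
    conjugate model (returns 0.0 for an empty segment)."""
    s = sum(labels)          # number of ones in the segment
    n = len(labels)          # segment length
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + math.lgamma(a + s) + math.lgamma(b + n - s)
            - math.lgamma(a + b + n))


def posterior_over_k(sorted_labels: List[int]) -> List[float]:
    """Posterior P(k | labels) under a uniform prior over k = 1..n, assuming one
    change-point: the k nearest labels come from one Bernoulli regime, the rest
    from another. Entry i of the result corresponds to k = i + 1."""
    n = len(sorted_labels)
    log_post = [beta_bernoulli_log_marginal(sorted_labels[:k])
                + beta_bernoulli_log_marginal(sorted_labels[k:])
                for k in range(1, n + 1)]
    # Normalise in log space for numerical stability.
    m = max(log_post)
    weights = [math.exp(lp - m) for lp in log_post]
    z = sum(weights)
    return [w / z for w in weights]
```

With labels sorted by distance as `[1, 1, 1, 1, 1, 0, 0, 0, 0, 0]`, the posterior mode falls at k = 5, matching the intuition that the five nearest points form a homogeneous neighbourhood.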
Additional information
Primarily at UBS Securities LLC, 1285 Ave. of the Americas, New York, NY 10019
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Nuti, G. An Efficient Algorithm for Bayesian Nearest Neighbours. Methodol Comput Appl Probab 21, 1251–1258 (2019). https://doi.org/10.1007/s11009-018-9670-z