Abstract:
The edited k-nearest neighbor rule (k-NNR) consists of 1) eliminating those samples from the data which are not classified correctly by the k-NNR and the remainder of the data, and 2) using the NNR with the samples which remain from 1) to classify new observations. Wilson has shown that this rule has an asymptotic probability of error which is better than that of the k-NNR. A key step in his development is showing the convergence of the edited nearest neighbor. His lengthy argument is replaced here by a somewhat simpler one which uses an intuitive fact about the editing procedure.
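The two-step procedure described above can be illustrated concretely. The following is a minimal sketch, not taken from the paper: it assumes NumPy arrays X (features) and y (labels), and the function names edit_with_knn and nn_classify are illustrative only. Step 1 discards every sample that the k-NNR, applied to the rest of the data (leave-one-out), misclassifies; step 2 classifies a new observation by its single nearest neighbor among the retained samples.

```python
import numpy as np
from collections import Counter

def edit_with_knn(X, y, k=3):
    """Keep only the samples that the k-NN rule, applied to the
    remainder of the data (leave-one-out), classifies correctly."""
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)  # distances to all samples
        d[i] = np.inf                         # exclude the sample itself
        nn = np.argsort(d)[:k]                # its k nearest neighbors
        vote = Counter(y[nn]).most_common(1)[0][0]
        if vote == y[i]:                      # retained only if correct
            keep.append(i)
    return X[keep], y[keep]

def nn_classify(X_edit, y_edit, x_new):
    """Classify a new observation by its nearest neighbor
    among the edited (retained) samples."""
    d = np.linalg.norm(X_edit - x_new, axis=1)
    return y_edit[np.argmin(d)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
    X_e, y_e = edit_with_knn(X, y, k=3)
    print("retained", len(X_e), "of", len(X), "samples")
    print("label of new point:", nn_classify(X_e, y_e, np.array([0.5, -0.2])))
```

The intuition behind the editing step is that samples misclassified by their neighbors tend to lie on the wrong side of the decision boundary, so removing them leaves a cleaner set for the subsequent NN rule.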
Published in: IEEE Transactions on Information Theory (Volume: 19, Issue: 5, September 1973)