A novel version of k nearest neighbor: Dependent nearest neighbor

dc.authorid0000-0003-0710-0867en_US
dc.authorid0000-0001-7789-6376en_US
dc.contributor.authorErtuğrul, Ömer Faruk
dc.contributor.authorTağluk, Mehmet Emin
dc.date.accessioned2019-07-04T13:12:26Z
dc.date.available2019-07-04T13:12:26Z
dc.date.issued2017-06en_US
dc.departmentBatman Üniversitesi Mühendislik - Mimarlık Fakültesi Elektrik-Elektronik Mühendisliği Bölümüen_US
dc.description.abstractk nearest neighbor (kNN) is one of the basic processes behind various machine learning methods. In kNN, the relation of a query to a neighboring sample is basically measured by a similarity metric, such as Euclidean distance. This process starts with mapping the training dataset onto a one-dimensional distance space based on the calculated similarities, and then labeling the query with the most dominant label (in classification) or the mean of the labels (in regression) of the k nearest neighbors. The number of nearest neighbors (k) is chosen according to the desired level of success. Nonetheless, two distinct samples may have equal distances to the query but different angles in the feature space. The similarity of the query to these two samples needs to be weighted in accordance with the angle between the query and each sample, so that the two distances are differentiated by angular information. This notion can be analyzed in the context of dependency and utilized to increase the precision of the classifier. From this point of view, instead of kNN, the query is labeled according to its nearest dependent neighbors, which are determined by a joint function built on both similarity and dependency. This method may therefore be called dependent NN (d-NN). To demonstrate d-NN, it is applied to synthetic datasets with different statistical distributions and to 4 benchmark datasets: Pima Indian, Hepatitis, approximate Sinc and CASP. Results showed the superiority of d-NN in terms of accuracy and computational cost compared to other popular machine learning methods employed.en_US
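The abstract describes weighting the query-to-sample similarity by the angle between the query and each sample. The paper's exact joint function is not given in this record, so the following is only a minimal toy sketch of that idea: a kNN classifier whose ranking score scales Euclidean distance by an angular-dissimilarity factor derived from cosine similarity. All names (`dnn_classify`, the `d * (1 + ang)` joint score) are illustrative assumptions, not the authors' formulation.

```python
import math

def euclidean(a, b):
    # Standard Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine(a, b):
    # Cosine of the angle between two vectors (0.0 if either is zero).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na > 0 and nb > 0 else 0.0

def dnn_classify(query, X, y, k=3):
    # Hypothetical joint score: distance scaled by angular dissimilarity,
    # so samples at equal distance but larger angle rank as less similar.
    scores = []
    for xi, yi in zip(X, y):
        d = euclidean(query, xi)
        ang = 1.0 - cosine(query, xi)  # 0 when aligned, 2 when opposite
        scores.append((d * (1.0 + ang), yi))
    scores.sort(key=lambda t: t[0])
    votes = [label for _, label in scores[:k]]
    return max(set(votes), key=votes.count)  # majority vote among top k
```

With two clusters separated mainly by direction from the origin, e.g. `X = [(1, 1), (2, 2), (-1, 1), (-2, 1)]` and `y = [0, 0, 1, 1]`, a query such as `(1.5, 1.5)` is labeled 0 because the aligned samples receive no angular penalty.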
dc.identifier.citationErtuğrul, Ö. F., Tağluk, M. E. (2017). A novel version of k nearest neighbor: Dependent nearest neighbor. Applied Soft Computing, 55, pp. 480-490. https://doi.org/10.1016/j.asoc.2017.02.020en_US
dc.identifier.endpage490en_US
dc.identifier.issn1568-4946
dc.identifier.scopusqualityQ1en_US
dc.identifier.startpage480en_US
dc.identifier.urihttps://doi.org/10.1016/j.asoc.2017.02.020
dc.identifier.urihttps://hdl.handle.net/20.500.12402/2188
dc.identifier.volume55en_US
dc.identifier.wosqualityQ1en_US
dc.indekslendigikaynakWeb of Scienceen_US
dc.indekslendigikaynakScopusen_US
dc.language.isoenen_US
dc.publisherElsevieren_US
dc.relation.isversionof10.1016/j.asoc.2017.02.020en_US
dc.relation.journalApplied Soft Computing Journalen_US
dc.relation.publicationcategoryMakale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanıen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.rightsAttribution-NonCommercial-ShareAlike 3.0 United States*
dc.rights.urihttp://creativecommons.org/licenses/by-nc-sa/3.0/us/*
dc.subjectDependencyen_US
dc.subjectDependent Nearest Neighboren_US
dc.subjectk Nearest Neighboren_US
dc.subjectSimilarityen_US
dc.titleA novel version of k nearest neighbor: Dependent nearest neighboren_US
dc.typeArticleen_US

Files

Original bundle
Listing 1 - 1 of 1
Name:
1-s2.0-S1568494617300984-main.pdf
Size:
2.8 MB
Format:
Adobe Portable Document Format
Description:
Full Text

License bundle
Listing 1 - 1 of 1
Name:
license.txt
Size:
1.44 KB
Format:
Plain Text
Description:
Item-specific license agreed upon to submission