Abbas, Muhammad and Memon, Kamran Ali and Ain, Noor ul and Ajebesone, Ekang Francis and Usaid, Muhammad and Bhutto, Zulfiqar Ali (2020) An Improved Weighted Base Classification for Optimum Weighted Nearest Neighbor Classifiers. EAI Endorsed Transactions on Scalable Information Systems, 7 (27): e1. ISSN 2032-9407
Text: eai.13-7-2018.163339.pdf - Published Version, available under a Creative Commons Attribution No Derivatives license.
Abstract
Existing classification studies use two non-parametric classifiers, k-nearest neighbours (kNN) and decision trees, and one parametric classifier, logistic regression, generating high accuracies. Previous research has compared the results of these classifiers across training sets of different sizes to study alcohol tests. In this paper, an Improved Version of the kNN (IVkNN) algorithm is presented that overcomes the limitation of the conventional kNN algorithm in classifying wine quality. The proposed method identifies the same number of nearest neighbours for each test example. Results indicate a higher Overall Accuracy (OA) that ranges between 67% and 76%. Among the three classifiers, kNN was the least sensitive to training sample size and produced the highest OA, followed by decision trees and logistic regression. Depending on the sample size, the proposed IVkNN model achieved 80% accuracy and a root mean square error (RMSE) of 0.375.
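The details of the IVkNN neighbour-selection scheme are given in the full text; the sketch below only illustrates the conventional kNN baseline against which it is compared, reporting the same two metrics mentioned in the abstract (overall accuracy and RMSE). It uses scikit-learn's bundled wine dataset as a stand-in for the UCI wine-quality data, and k=5 is an assumed, illustrative choice rather than the value tuned in the paper.

```python
# Minimal sketch of a conventional kNN baseline (NOT the paper's IVkNN).
# sklearn's bundled wine dataset is used here as a stand-in for wine-quality data.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, mean_squared_error

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# kNN is distance-based and scale-sensitive, so standardise features first.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# k=5 is an illustrative assumption; the paper selects k via cross-validation.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

# Report the two metrics used in the abstract: overall accuracy and RMSE.
print("Overall Accuracy:", accuracy_score(y_test, y_pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
```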
Item Type: | Article |
---|---|
Uncontrolled Keywords: | Classification, k-Nearest Neighbor (kNN), Logistic Regression, Decision Trees, Cross-Validation, Machine Learning (ML), SVM, Random Forest, Improved Version of k-Nearest Neighbor (IVkNN), Python |
Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
Depositing User: | EAI Editor II. |
Date Deposited: | 22 Oct 2020 12:32 |
Last Modified: | 22 Oct 2020 12:32 |
URI: | https://eprints.eudl.eu/id/eprint/721 |