AWAIS achieved a classification accuracy of 75.87% on the diabetes dataset using 10-fold cross-validation. Polat and Güneş [15] studied diabetes disease using principal component analysis (PCA) and an adaptive neuro-fuzzy inference system (ANFIS); the test classification accuracy obtained with 10-fold cross-validation was 89.47%. Polat et al. [16] proposed a new cascade learning system based on generalized discriminant analysis and least squares support vector machine, and the classification accuracy obtained was 82.05%. Kahramanli and Allahverdi [17] presented a hybrid neural network combining an artificial neural network (ANN) and a fuzzy neural network (FNN), which achieved an accuracy of 84.24%. Patil et al. [18] proposed a hybrid prediction model (HPM) that uses the simple k-means clustering algorithm to verify the chosen class labels and then applies a classification algorithm to the resulting set; the accuracy of HPM was 92.38%. Isa and Mamat [19] presented a modified hybrid multilayer perceptron (HMLP) network that improves on the conventional one, and the average correct classification rate of the proposed system was 80.59%. Aibinu et al. [20] proposed a new biomedical signal classification method using a complex-valued pseudo autoregressive (CAR) modeling approach; the presented technique obtained a classification accuracy of 81.28%.

2. Preliminaries

2.1. Feature Selection

Feature selection provides a smaller but more distinguishing subset compared to the starting data, selecting the distinguishing features from a set of features and eliminating the irrelevant ones.

The aim is to reduce the dimension of the data by finding a small set of important features, which results in both reduced processing time and increased classification accuracy. The algorithm developed in this study is based on the sequential forward selection (SFS) algorithm, one of the most popular of these algorithms. SFS is a feature selection method proposed by Whitney [21]. Sequential forward selection is the simplest greedy search algorithm: it starts from the empty set and sequentially adds the feature x+ that gives the highest objective function value J(Yk + x+) when combined with the features Yk that have already been selected. Pseudo code for SFS is given in Pseudocode 1 [22].

Pseudocode 1: Pseudo code for SFS [22].

In summary, SFS begins with zero attributes, evaluates all feature subsets containing only one feature, and selects the best performing one; it then adds to this subset the feature that performs best among subsets of the next larger size. This cycle repeats until there is no improvement in the current subset [23]. The objective function is critical for this algorithm.
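A minimal sketch of the SFS loop described above is given below, assuming a user-supplied objective function J (for example, cross-validated classification accuracy); the function name sfs, the parameter max_features, and the stopping rule shown here are illustrative assumptions, not details taken from the original paper.

def sfs(features, J, max_features=None):
    """Sequential forward selection: greedily grow a feature subset.

    features: iterable of candidate feature indices or names
    J: objective function mapping a tuple of selected features to a score
       (e.g., cross-validated classification accuracy; assumed here)
    """
    remaining = set(features)
    selected = []
    best_score = float("-inf")

    while remaining and (max_features is None or len(selected) < max_features):
        # Evaluate adding each remaining feature to the current subset Yk
        scores = {x: J(tuple(selected) + (x,)) for x in remaining}
        x_best = max(scores, key=scores.get)

        # Stop when no candidate x+ improves the current objective value
        if scores[x_best] <= best_score:
            break

        selected.append(x_best)
        remaining.remove(x_best)
        best_score = scores[x_best]

    return selected, best_score

In practice, J could wrap a classifier evaluated with 10-fold cross-validation on the selected columns, matching the evaluation protocol used in the studies cited above.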
