Optimization Technique For Feature Selection And Classification Using Support Vector Machine

Abstract— Classification problems often involve data sets with a large number of features, but only some of them are useful for classification. Irrelevant and redundant features reduce data mining performance. Feature selection aims to choose a small number of relevant features that achieve similar or even better classification performance than using all features. It has two main objectives: maximizing the classification performance and minimizing the number of features. Most existing feature selection algorithms, however, treat the task as a single-objective problem. Attribute selection is performed by combining an attribute evaluator with a search method in the WEKA machine learning tool. We then apply the SVM classification algorithm to automatically classify the data using the selected features on different standard datasets.
Index Terms— Data Mining, Kernel methods, Support Vector Machine, WEKA, Classification.
——————————  ——————————
1 INTRODUCTION
Support Vector Machine (SVM) was first described in 1992, introduced by Boser, Guyon, and Vapnik. Support vector machines (SVMs) are a set of related supervised learning methods used for classification and regression [1]. They belong to a family of linear classifiers. In other words, the Support Vector Machine (SVM) is a tool for classification and…
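
As a rough illustration of the workflow outlined in the abstract, the following sketch combines a WEKA attribute evaluator and search method with an SMO-based SVM classifier. The dataset path, the choice of CfsSubsetEval with BestFirst search, and the 10-fold cross-validation setup are illustrative assumptions, not the exact configuration used in this work.

import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.BestFirst;
import weka.attributeSelection.CfsSubsetEval;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

import java.util.Random;

public class FeatureSelectionSvmDemo {
    public static void main(String[] args) throws Exception {
        // Load a standard ARFF dataset (path is illustrative).
        Instances data = new DataSource("iris.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1);

        // Attribute selection: combine an attribute evaluator with a search method,
        // mirroring WEKA's "Select attributes" panel.
        AttributeSelection selector = new AttributeSelection();
        selector.setEvaluator(new CfsSubsetEval());   // subset evaluator (assumed choice)
        selector.setSearch(new BestFirst());          // search method (assumed choice)
        selector.SelectAttributes(data);
        Instances reduced = selector.reduceDimensionality(data);

        // Classify the reduced data with an SVM (SMO) using 10-fold cross-validation.
        SMO svm = new SMO();
        Evaluation eval = new Evaluation(reduced);
        eval.crossValidateModel(svm, reduced, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}

In WEKA's Explorer, the same combination corresponds to picking an evaluator and a search method on the Select attributes tab and then running SMO from the Classify tab on the reduced data.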