Example usage for weka.classifiers Classifier subclass-usage

Introduction

This page lists example usages of classes that subclass weka.classifiers.Classifier, excerpted from various source files.
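
Before the concrete excerpts below, here is a minimal sketch of what such a subclass looks like, assuming the pre-3.8 Weka API in which weka.classifiers.Classifier is an abstract class. The MajorityClassifier name and its majority-class logic are purely illustrative, not taken from any of the files listed on this page.

import weka.classifiers.Classifier;
import weka.core.Instance;
import weka.core.Instances;

// Illustrative skeleton only: always predicts the majority class seen at training time.
public class MajorityClassifier extends Classifier {

    private double majorityClass;

    @Override
    public void buildClassifier(Instances data) throws Exception {
        int[] counts = new int[data.numClasses()];
        for (int i = 0; i < data.numInstances(); i++) {
            if (!data.instance(i).classIsMissing()) {
                counts[(int) data.instance(i).classValue()]++;
            }
        }
        int best = 0;
        for (int c = 1; c < counts.length; c++) {
            if (counts[c] > counts[best]) {
                best = c;
            }
        }
        majorityClass = best;
    }

    @Override
    public double classifyInstance(Instance instance) {
        return majorityClass;
    }
}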

Usage

From source file SMO.java

/**
 <!-- globalinfo-start -->
 * Implements John Platt's sequential minimal optimization algorithm for training a support vector classifier.<br/>
 * <br/>
 * This implementation globally replaces all missing values and transforms nominal attributes into binary ones. It also normalizes all attributes by default. (In that case the coefficients in the output are based on the normalized data, not the original data --- this is important for interpreting the classifier.)<br/>
 * <br/>
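
The excerpt above is the class header of a local copy of Weka's SMO. Assuming it behaves like the stock weka.classifiers.functions.SMO it mirrors, a minimal training sketch looks like this; the ARFF file name is a placeholder.

import weka.classifiers.functions.SMO;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SMOExample {
    public static void main(String[] args) throws Exception {
        // Placeholder ARFF path; any dataset with a nominal class attribute works.
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1);

        SMO smo = new SMO();        // missing values are replaced and nominal attributes binarized internally
        smo.buildClassifier(data);  // attributes are normalized by default, so coefficients refer to normalized data
        System.out.println(smo);
    }
}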

From source file ID3Chi.java

/**
 * <!-- globalinfo-start --> Class for constructing an unpruned decision tree
 * based on the ID3 algorithm. Can only deal with nominal attributes. No missing
 * values allowed. Empty leaves may result in unclassified instances. For more
 * information see: <br/>
 * <br/>
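
Because ID3Chi extends Classifier, it plugs into Weka's generic evaluation machinery. A minimal cross-validation sketch follows, assuming ID3Chi has a no-argument constructor and is on the classpath (in the default package or imported); the ARFF file name is a placeholder and the data must be purely nominal with no missing values.

import java.util.Random;

import weka.classifiers.Evaluation;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ID3ChiExample {
    public static void main(String[] args) throws Exception {
        // Placeholder ARFF path; nominal attributes only, no missing values.
        Instances data = DataSource.read("weather.nominal.arff");
        data.setClassIndex(data.numAttributes() - 1);

        ID3Chi tree = new ID3Chi();             // assumes a no-argument constructor
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(tree, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}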

From source file ann.MyANN.java

/**
 *
 * @author YusufR
 */
public class MyANN extends Classifier {

From source file ann.SingleLayerPerceptron.java

/**
 *
 * @author gifarikautsar
 */
public class SingleLayerPerceptron extends Classifier
        implements TechnicalInformationHandler, Sourcable, Serializable {

From source file assign00.DecisionTreeClassifier.java

/**
 *
 * @author Iceman
 */
public class DecisionTreeClassifier extends Classifier {

From source file assign00.HardCodedClassifier.java

/**
 *
 * @author Iceman
 */
public class HardCodedClassifier extends Classifier {

From source file assign00.KNNClassifier.java

/**
 *
 * @author Iceman
 */
public class KNNClassifier extends Classifier {

From source file assign00.NeuralNetworkClassifier.java

/**
 *
 * @author Iceman
 */
public class NeuralNetworkClassifier extends Classifier {
    Network network;

From source file br.com.ufu.lsi.rebfnetwork.RBFNetwork.java

/**
 <!-- globalinfo-start -->
 * Class that implements a normalized Gaussian radial basis function network.<br/>
 * It uses the k-means clustering algorithm to provide the basis functions and learns either a logistic regression (discrete class problems) or linear regression (numeric class problems) on top of that. Symmetric multivariate Gaussians are fit to the data from each cluster. If the class is nominal it uses the given number of clusters per class. It standardizes all numeric attributes to zero mean and unit variance.
 * <p/>
 <!-- globalinfo-end -->
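
Assuming this copy behaves like the stock weka.classifiers.functions.RBFNetwork it is derived from, training works like any other Weka classifier; the setNumClusters call below is taken from the stock class and controls how many k-means centres supply the Gaussian basis functions. The dataset path is a placeholder.

import weka.classifiers.functions.RBFNetwork;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RBFNetworkExample {
    public static void main(String[] args) throws Exception {
        // Placeholder ARFF path.
        Instances data = DataSource.read("diabetes.arff");
        data.setClassIndex(data.numAttributes() - 1);

        RBFNetwork rbf = new RBFNetwork();
        rbf.setNumClusters(4);      // number of k-means clusters providing the basis functions
        rbf.buildClassifier(data);  // numeric attributes are standardized to zero mean and unit variance internally
        System.out.println(rbf);
    }
}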

From source file cerebro.Id3.java

/**
 <!-- globalinfo-start -->
 * Class for constructing an unpruned decision tree based on the ID3 algorithm. Can only deal with nominal attributes. No missing values allowed. Empty leaves may result in unclassified instances. For more information see: <br/>
 * <br/>
 * R. Quinlan (1986). Induction of decision trees. Machine Learning. 1(1):81-106.
 * <p/>
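
As with the other tree learners above, the built model answers classifyInstance. A small sketch, assuming cerebro.Id3 has a no-argument constructor and given a nominal-only ARFF file with no missing values; the file path is a placeholder.

import cerebro.Id3;  // the class from the excerpt above
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class Id3Example {
    public static void main(String[] args) throws Exception {
        // Placeholder ARFF path; nominal attributes only, no missing values.
        Instances train = DataSource.read("weather.nominal.arff");
        train.setClassIndex(train.numAttributes() - 1);

        Id3 tree = new Id3();        // assumes a no-argument constructor
        tree.buildClassifier(train);

        // Classify the first training instance and map the numeric prediction back to its label.
        Instance first = train.instance(0);
        double pred = tree.classifyInstance(first);
        System.out.println("Predicted: " + train.classAttribute().value((int) pred));
    }
}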