Modifier and Type | Class and Description |
---|---|
class | RandomizableClassifier: Abstract utility class for handling settings common to randomizable classifiers. |
class | RandomizableIteratedSingleClassifierEnhancer: Abstract utility class for handling settings common to randomizable meta classifiers that build an ensemble from a single base learner. |
class | RandomizableMultipleClassifiersCombiner: Abstract utility class for handling settings common to randomizable meta classifiers that build an ensemble from multiple classifiers based on a given random number seed. |
class | RandomizableSingleClassifierEnhancer: Abstract utility class for handling settings common to randomizable meta classifiers that build an ensemble from a single base learner. |
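Each of the abstract classes above exposes a random seed through the Randomizable interface (setSeed(int)/getSeed()), which concrete subclasses inherit. Below is a minimal sketch, assuming a concrete subclass such as Bagging (a RandomizableIteratedSingleClassifierEnhancer) and a placeholder ARFF file; it is illustrative only and not part of this listing.

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.meta.Bagging;
import weka.classifiers.trees.REPTree;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SeededBaggingDemo {
    public static void main(String[] args) throws Exception {
        // "iris.arff" is a placeholder path; any nominal-class dataset will do.
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1);

        Bagging bagger = new Bagging();        // extends RandomizableIteratedSingleClassifierEnhancer
        bagger.setClassifier(new REPTree());   // the single base learner for the ensemble
        bagger.setNumIterations(25);           // ensemble size
        bagger.setSeed(42);                    // seed handled by the abstract superclass

        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(bagger, data, 10, new Random(42));
        System.out.println(eval.toSummaryString());
    }
}
```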
Modifier and Type | Class and Description |
---|---|
class | LibSVM: A wrapper class for the libsvm tools (the libsvm classes, typically the jar file, need to be on the classpath to use this classifier). LibSVM runs faster than SMO since it uses LibSVM to build the SVM classifier. LibSVM allows users to experiment with one-class SVM, regression SVM, and nu-SVM, as supported by the LibSVM tool. |
class | MultilayerPerceptron: A Classifier that uses backpropagation to classify instances. This network can be built by hand, created by an algorithm, or both. |
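A hedged usage sketch for MultilayerPerceptron follows; the dataset path is a placeholder and the parameter values are arbitrary choices for illustration.

```java
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MLPDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff");   // placeholder dataset path
        data.setClassIndex(data.numAttributes() - 1);

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setHiddenLayers("a");      // "a" = (attributes + classes) / 2 hidden units
        mlp.setLearningRate(0.3);
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(500);      // number of backpropagation epochs
        mlp.setSeed(1);                // weight initialisation is randomized, hence the seed

        mlp.buildClassifier(data);
        System.out.println(mlp);
    }
}
```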
Modifier and Type | Class and Description |
---|---|
class | AdaBoostM1: Class for boosting a nominal class classifier using the AdaBoost M1 method. |
class | Bagging: Class for bagging a classifier to reduce variance. |
class | CostSensitiveClassifier: A metaclassifier that makes its base classifier cost-sensitive. |
class | CVParameterSelection: Class for performing parameter selection by cross-validation for any classifier. For more information, see: R. |
class | Dagging: This meta classifier creates a number of disjoint, stratified folds out of the data and feeds each chunk of data to a copy of the supplied base classifier. |
class | Decorate: DECORATE is a meta-learner for building diverse ensembles of classifiers by using specially constructed artificial training examples. |
class | END: A meta classifier for handling multi-class datasets with 2-class classifiers by building an ensemble of nested dichotomies. For more info, check Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems. |
class | Grading: Implements Grading. |
class | GridSearch: Performs a grid search of parameter pairs for a classifier (Y-axis, default is LinearRegression with the "Ridge" parameter) and the PLSFilter (X-axis, "# of Components") and chooses the best pair found for the actual prediction. The initial grid is worked on with 2-fold CV to determine the values of the parameter pairs for the selected type of evaluation (e.g., accuracy). |
class | LogitBoost: Class for performing additive logistic regression. |
class | MetaCost: This metaclassifier makes its base classifier cost-sensitive using the method specified in Pedro Domingos: MetaCost: A general method for making classifiers cost-sensitive. |
class | MultiBoostAB: Class for boosting a classifier using the MultiBoosting method. MultiBoosting is an extension to the highly successful AdaBoost technique for forming decision committees. |
class | MultiClassClassifier: A metaclassifier for handling multi-class datasets with 2-class classifiers. |
class | MultiScheme: Class for selecting a classifier from among several using cross validation on the training data or the performance on the training data. |
class | RacedIncrementalLogitBoost: Classifier for incremental learning of large datasets by way of racing logit-boosted committees. For more information see: Eibe Frank, Geoffrey Holmes, Richard Kirkby, Mark Hall: Racing committees for large datasets. |
class | RandomCommittee: Class for building an ensemble of randomizable base classifiers. |
class | RandomSubSpace: This method constructs a decision tree based classifier that maintains highest accuracy on training data and improves on generalization accuracy as it grows in complexity. |
class | RotationForest: Class for constructing a Rotation Forest. |
class | Stacking: Combines several classifiers using the stacking method. |
class | StackingC: Implements StackingC (more efficient version of stacking). For more information, see A.K. |
class | ThresholdSelector: A metaclassifier that selects a mid-point threshold on the probability output by a Classifier. |
class | Vote: Class for combining classifiers. |
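The meta classifiers above share the same wiring pattern: set one (or several) base classifiers, configure the ensemble options, then call buildClassifier. A sketch of that pattern with AdaBoostM1 and Vote, assuming a placeholder dataset:

```java
import weka.classifiers.Classifier;
import weka.classifiers.meta.AdaBoostM1;
import weka.classifiers.meta.Vote;
import weka.classifiers.trees.DecisionStump;
import weka.classifiers.trees.REPTree;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MetaClassifierDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("credit-g.arff");   // placeholder nominal-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        // Boosting: a single weak base learner, iterated with instance reweighting.
        AdaBoostM1 booster = new AdaBoostM1();
        booster.setClassifier(new DecisionStump());
        booster.setNumIterations(50);
        booster.setSeed(1);                       // only used when resampling is enabled
        booster.buildClassifier(data);

        // Voting: several base classifiers combined (by default, probabilities are averaged).
        Vote vote = new Vote();
        vote.setClassifiers(new Classifier[]{ new DecisionStump(), new REPTree(), booster });
        vote.buildClassifier(data);

        double firstPrediction = vote.classifyInstance(data.instance(0));
        System.out.println("Predicted class index for first instance: " + firstPrediction);
    }
}
```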
Modifier and Type | Class and Description |
---|---|
class | ClassBalancedND: A meta classifier for handling multi-class datasets with 2-class classifiers by building a random class-balanced tree structure. For more info, check Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems. |
class | DataNearBalancedND: A meta classifier for handling multi-class datasets with 2-class classifiers by building a random data-balanced tree structure. For more info, check Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems. |
class | ND: A meta classifier for handling multi-class datasets with 2-class classifiers by building a random tree structure. For more info, check Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems. |
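The nested-dichotomy classes are meta classifiers that wrap a 2-class base learner; END (listed above) in turn ensembles several dichotomy trees. The sketch below assumes that ND takes its 2-class learner via setClassifier and that END takes the ND via setClassifier, consistent with the randomizable enhancer superclasses listed earlier; treat the exact wiring as an assumption.

```java
import weka.classifiers.meta.END;
import weka.classifiers.meta.nestedDichotomies.ND;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class NestedDichotomyDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("segment.arff");   // placeholder multi-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        ND dichotomy = new ND();             // one random nested-dichotomy tree
        dichotomy.setClassifier(new J48());  // 2-class learner used at each internal node

        END ensemble = new END();            // ensemble of nested dichotomies
        ensemble.setClassifier(dichotomy);
        ensemble.setNumIterations(10);       // number of dichotomy trees in the ensemble
        ensemble.setSeed(1);                 // controls the random tree structures

        ensemble.buildClassifier(data);
        System.out.println(ensemble);
    }
}
```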
Modifier and Type | Class and Description |
---|---|
class | MIEMDD: The EMDD model builds heavily upon Dietterich's Diverse Density (DD) algorithm. It is a general framework for MI learning that converts the MI problem to a single-instance setting using EM. |
Modifier and Type | Class and Description |
---|---|
class | BFTree: Class for building a best-first decision tree classifier. |
class | RandomForest: Class for constructing a forest of random trees. For more information see: Leo Breiman (2001). |
class | RandomTree: Class for constructing a tree that considers K randomly chosen attributes at each node. |
class | REPTree: Fast decision tree learner. |
class | SimpleCart: Class implementing minimal cost-complexity pruning. Note: when dealing with missing values, the "fractional instances" method is used instead of the surrogate split method. For more information, see: Leo Breiman, Jerome H. |
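RandomForest, RandomTree, and REPTree implement Randomizable, so a seed fixes their attribute sampling and data shuffling. A minimal sketch follows; the dataset path is a placeholder, and the ensemble size is passed via the generic option interface since the dedicated setter name differs between WEKA releases.

```java
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RandomForestDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("glass.arff");   // placeholder dataset path
        data.setClassIndex(data.numAttributes() - 1);

        RandomForest forest = new RandomForest();
        forest.setOptions(new String[]{"-I", "100"});  // -I: number of trees (iterations)
        forest.setSeed(7);                             // fixes per-tree bootstrap/attribute randomness
        forest.buildClassifier(data);

        System.out.println(forest);
        System.out.println("Prediction for first instance: "
                + data.classAttribute().value((int) forest.classifyInstance(data.instance(0))));
    }
}
```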
Modifier and Type | Class and Description |
---|---|
class | Cobweb: Class implementing the Cobweb and Classit clustering algorithms. Note: the application of node operators (merging, splitting, etc.) in terms of ordering and priority differs (and is somewhat ambiguous) between the original Cobweb and Classit papers. |
class | EM: Simple EM (expectation maximisation) class. EM assigns a probability distribution to each instance which indicates the probability of it belonging to each of the clusters. |
class | FarthestFirst: Cluster data using the FarthestFirst algorithm. For more information see: Hochbaum, Shmoys (1985). |
class | RandomizableClusterer: Abstract utility class for handling settings common to randomizable clusterers. |
class | RandomizableDensityBasedClusterer: Abstract utility class for handling settings common to randomizable clusterers. |
class | RandomizableSingleClustererEnhancer: Abstract utility class for handling settings common to randomizable clusterers. |
class | sIB: Cluster data using the sequential information bottleneck algorithm. Note: only the hard clustering scheme is supported. |
class | SimpleKMeans: Cluster data using the k-means algorithm. |
class | XMeans: Cluster data using the X-means algorithm. X-Means is K-Means extended by an Improve-Structure step, in which the algorithm attempts to split each center within its region. |
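The randomizable clusterers are driven the same way: remove any class attribute, set a seed and the number of clusters, then call buildClusterer. A sketch with SimpleKMeans on a placeholder dataset:

```java
import weka.clusterers.SimpleKMeans;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Remove;

public class KMeansDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff");   // placeholder dataset path

        // Clusterers expect no class attribute; drop the last attribute here as an example.
        Remove remove = new Remove();
        remove.setAttributeIndices("last");
        remove.setInputFormat(data);
        Instances clusterData = Filter.useFilter(data, remove);

        SimpleKMeans kmeans = new SimpleKMeans();
        kmeans.setNumClusters(3);
        kmeans.setSeed(10);                 // seed for the initial centroid selection
        kmeans.buildClusterer(clusterData);

        System.out.println(kmeans);
        System.out.println("Cluster of first instance: "
                + kmeans.clusterInstance(clusterData.instance(0)));
    }
}
```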
Modifier and Type | Class and Description |
---|---|
class | MiddleOutConstructor: The class that builds a BallTree middle out. For more information see also: Andrew W. |
Modifier and Type | Class and Description |
---|---|
class | ClassificationGenerator: Abstract class for data generators for classifiers. |
class | ClusterGenerator: Abstract class for cluster data generators. |
class | DataGenerator: Abstract superclass for data generators that generate data for classifiers and clusterers. |
class | RegressionGenerator: Abstract class for data generators for regression classifiers. |
Modifier and Type | Class and Description |
---|---|
class | Agrawal: Generates a people database and is based on the paper by Agrawal et al.: R. |
class | BayesNet: Generates random instances based on a Bayes network. |
class | LED24: This generator produces data for a display with 7 LEDs. |
class | RandomRBF: RandomRBF data is generated by first creating a random set of centers for each class. |
class | RDG1: A data generator that produces data randomly by producing a decision list. The decision list consists of rules. Instances are generated randomly one by one. |
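The data generators follow a common lifecycle: configure the generator, define the data format, then generate the instances. A sketch with RDG1; the assumption that defineDataFormat() should be called before generateExamples() is mine and not stated in this listing.

```java
import weka.core.Instances;
import weka.datagenerators.classifiers.classification.RDG1;

public class DataGeneratorDemo {
    public static void main(String[] args) throws Exception {
        RDG1 generator = new RDG1();
        generator.setNumExamples(100);   // number of instances to generate
        generator.setSeed(1);            // makes the generated decision list reproducible

        // Assumption: defineDataFormat() initialises the header before generateExamples().
        generator.defineDataFormat();
        Instances data = generator.generateExamples();

        System.out.println(data.numInstances() + " instances, "
                + data.numAttributes() + " attributes");
    }
}
```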
Modifier and Type | Class and Description |
---|---|
class | Expression: A data generator for generating y according to a given expression out of randomly generated x. E.g., the Mexican hat can be generated like this: sin(abs(a1)) / abs(a1). In addition to this function, the amplitude can be changed and Gaussian noise can be added. |
class | MexicanHat: A data generator for the simple 'Mexican Hat' function: y = sin|x| / |x|. In addition to this simple function, the amplitude can be changed and Gaussian noise can be added. |
Modifier and Type | Class and Description |
---|---|
class | BIRCHCluster: Cluster data generator designed for the BIRCH system. The dataset is generated with instances in K clusters. Instances are 2-d data points. Each cluster is characterized by the number of data points in it, its radius, and its center. |
class | SubspaceCluster: A data generator that produces data points in hyperrectangular subspace clusters. |
Copyright © 2019 University of Waikato, Hamilton, NZ. All rights reserved.