Features

Hidden Markov Models

  • Baum-Welch reestimation algorithm, scaled forward-backward algorithm, Viterbi algorithm
  • Support for Input-Output Hidden Markov Models
  • Write your own output or transition probability distributions or use the provided ones, including neural-network-based conditional probability distributions
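
To make the list above concrete, here is a minimal, self-contained sketch of the Viterbi algorithm for a discrete-output HMM. It is illustrative only; the class and parameter names are invented for the example and are not the library's API.

    // Minimal Viterbi sketch for a discrete-output HMM (illustrative only).
    public final class ViterbiSketch {
        /**
         * @param pi  initial state probabilities, pi[i]
         * @param a   transition probabilities, a[i][j] = P(next state j | state i)
         * @param b   output probabilities, b[i][k] = P(symbol k | state i)
         * @param obs observed symbol indices
         * @return most likely hidden state sequence
         */
        public static int[] viterbi(double[] pi, double[][] a, double[][] b, int[] obs) {
            int n = pi.length, t = obs.length;
            double[][] logProb = new double[t][n];
            int[][] back = new int[t][n];
            for (int i = 0; i < n; i++) {
                logProb[0][i] = Math.log(pi[i]) + Math.log(b[i][obs[0]]);
            }
            for (int step = 1; step < t; step++) {
                for (int j = 0; j < n; j++) {
                    double best = Double.NEGATIVE_INFINITY;
                    int arg = 0;
                    for (int i = 0; i < n; i++) {
                        double p = logProb[step - 1][i] + Math.log(a[i][j]);
                        if (p > best) { best = p; arg = i; }
                    }
                    logProb[step][j] = best + Math.log(b[j][obs[step]]);
                    back[step][j] = arg;
                }
            }
            // Trace back from the most likely final state.
            int[] path = new int[t];
            for (int j = 1; j < n; j++) {
                if (logProb[t - 1][j] > logProb[t - 1][path[t - 1]]) path[t - 1] = j;
            }
            for (int step = t - 1; step > 0; step--) {
                path[step - 1] = back[step][path[step]];
            }
            return path;
        }
    }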

Neural Networks

Feed-forward backpropagation neural networks of arbitrary topology

  • Configurable error functions, including sum of squares and weighted sum of squares
  • Multiple activation functions, including logistic sigmoid, linear, tanh, and softmax
  • Choose your weight update rule: standard update, standard update with momentum, Quickprop, or RPROP
  • Online and batch training
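
As a rough illustration of the forward pass such a network computes, here is a self-contained sketch with one logistic-sigmoid hidden layer and a linear output layer. The names and layer layout are assumptions for the example, not the library's classes.

    // Forward pass of a small fully connected network: one logistic-sigmoid
    // hidden layer and a linear output layer (illustrative, not the library's API).
    public final class FeedForwardSketch {
        static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

        /** wHidden[h][i] weights input i -> hidden h; wOut[o][h] weights hidden h -> output o. */
        static double[] forward(double[] input, double[][] wHidden, double[] bHidden,
                                double[][] wOut, double[] bOut) {
            double[] hidden = new double[wHidden.length];
            for (int h = 0; h < hidden.length; h++) {
                double sum = bHidden[h];
                for (int i = 0; i < input.length; i++) sum += wHidden[h][i] * input[i];
                hidden[h] = sigmoid(sum);
            }
            double[] out = new double[wOut.length];
            for (int o = 0; o < out.length; o++) {
                double sum = bOut[o];
                for (int h = 0; h < hidden.length; h++) sum += wOut[o][h] * hidden[h];
                out[o] = sum; // linear output unit
            }
            return out;
        }
    }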

Support Vector Machines

Fast training with the sequential minimal optimization algorithm

  • Support for linear, polynomial, tanh, and radial basis function kernels
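
The kernels listed above reduce to simple functions of two vectors. The sketch below is illustrative, with arbitrary parameter names, and is not the library's kernel interface.

    // The four kernel families mentioned above, as plain functions (illustrative).
    public final class KernelSketch {
        static double dot(double[] x, double[] y) {
            double s = 0;
            for (int i = 0; i < x.length; i++) s += x[i] * y[i];
            return s;
        }
        static double linear(double[] x, double[] y) { return dot(x, y); }
        static double polynomial(double[] x, double[] y, int degree, double c) {
            return Math.pow(dot(x, y) + c, degree);
        }
        static double tanhKernel(double[] x, double[] y, double scale, double offset) {
            return Math.tanh(scale * dot(x, y) + offset);
        }
        static double rbf(double[] x, double[] y, double gamma) {
            double d2 = 0;
            for (int i = 0; i < x.length; i++) {
                double d = x[i] - y[i];
                d2 += d * d;
            }
            return Math.exp(-gamma * d2);
        }
    }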

Decision Trees

Information gain or Gini index split criteria

  • Binary or all attribute value splitting
  • Chi-square significance test pruning with configurable confidence levels
  • Boosted decision stumps with AdaBoost
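
For reference, the two split criteria come down to simple formulas over class counts. The sketch below is an illustrative standalone version, not the library's implementation.

    // Entropy, Gini impurity, and information gain from class counts (illustrative).
    public final class SplitCriteriaSketch {
        static double entropy(int[] counts) {
            int total = 0;
            for (int c : counts) total += c;
            double h = 0;
            for (int c : counts) {
                if (c == 0) continue;
                double p = (double) c / total;
                h -= p * (Math.log(p) / Math.log(2));
            }
            return h;
        }
        static double gini(int[] counts) {
            int total = 0;
            for (int c : counts) total += c;
            double sumSq = 0;
            for (int c : counts) {
                double p = (double) c / total;
                sumSq += p * p;
            }
            return 1.0 - sumSq;
        }
        /** Information gain of splitting a parent node into the given children. */
        static double informationGain(int[] parentCounts, int[][] childCounts) {
            int parentTotal = 0;
            for (int c : parentCounts) parentTotal += c;
            double weighted = 0;
            for (int[] child : childCounts) {
                int childTotal = 0;
                for (int c : child) childTotal += c;
                weighted += (double) childTotal / parentTotal * entropy(child);
            }
            return entropy(parentCounts) - weighted;
        }
    }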

K Nearest Neighbors

Fast kd-tree implementation for instance-based algorithms of all kinds

  • KNN classifier with weighted or unweighted classification and a customizable distance function
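
A distance-weighted vote looks roughly like the sketch below, which uses a plain linear scan in place of the kd-tree and a pluggable distance function. It is illustrative only, not the library's classifier.

    // Distance-weighted k-nearest-neighbor vote with a pluggable distance function.
    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.function.ToDoubleBiFunction;

    public final class KnnSketch {
        static int classify(double[][] train, int[] labels, double[] query, int k,
                            int numClasses, ToDoubleBiFunction<double[], double[]> distance) {
            Integer[] order = new Integer[train.length];
            for (int i = 0; i < order.length; i++) order[i] = i;
            // Sort training indices by distance to the query point.
            Arrays.sort(order, Comparator.comparingDouble(i -> distance.applyAsDouble(train[i], query)));
            double[] votes = new double[numClasses];
            for (int n = 0; n < Math.min(k, train.length); n++) {
                int idx = order[n];
                double d = distance.applyAsDouble(train[idx], query);
                votes[labels[idx]] += 1.0 / (d + 1e-9); // weight each vote by inverse distance
            }
            int best = 0;
            for (int c = 1; c < numClasses; c++) if (votes[c] > votes[best]) best = c;
            return best;
        }
    }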

Linear Algebra Algorithms

Basic matrix and vector math, plus a variety of matrix decompositions based on the standard algorithms

  • Solve square systems, upper triangular systems, lower triangular systems, and least squares problems
  • Singular Value Decomposition, QR Decomposition, LU Decomposition, Schur Decomposition, Symmetric Eigenvalue Decomposition, Cholesky Factorization
  • Make your own matrix decomposition with the easy-to-use Householder Reflection and Givens Rotation classes
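
As one example of the solvers listed, back substitution for an upper triangular system fits in a few lines. The sketch below is illustrative, not the library's solver.

    // Back substitution for an upper triangular system Ux = b (illustrative).
    public final class BackSubstitutionSketch {
        static double[] solveUpperTriangular(double[][] u, double[] b) {
            int n = b.length;
            double[] x = new double[n];
            for (int i = n - 1; i >= 0; i--) {
                double sum = b[i];
                // Subtract the contribution of the already-solved variables.
                for (int j = i + 1; j < n; j++) sum -= u[i][j] * x[j];
                x[i] = sum / u[i][i];
            }
            return x;
        }
    }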

Optimization Algorithms

Randomized hill climbing, simulated annealing, genetic algorithms, and discrete dependency tree MIMIC

  • Make your own crossover functions, mutation functions, neighbor functions, and probability distributions, or use the provided ones
  • Optimize the weights of neural networks and solve travelling salesman problems
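
The simplest of these, randomized hill climbing, can be sketched over a bit string with a pluggable fitness function, here the toy "one max" problem. This is an illustrative sketch, not the library's optimizer classes.

    // Randomized hill climbing over a bit string (illustrative).
    import java.util.Random;
    import java.util.function.ToDoubleFunction;

    public final class HillClimbingSketch {
        static boolean[] climb(int length, int iterations, ToDoubleFunction<boolean[]> fitness, Random rng) {
            boolean[] current = new boolean[length];
            for (int i = 0; i < length; i++) current[i] = rng.nextBoolean();
            double currentFit = fitness.applyAsDouble(current);
            for (int it = 0; it < iterations; it++) {
                boolean[] neighbor = current.clone();
                int flip = rng.nextInt(length);
                neighbor[flip] = !neighbor[flip];        // random single-bit neighbor
                double neighborFit = fitness.applyAsDouble(neighbor);
                if (neighborFit >= currentFit) {         // accept non-worsening moves
                    current = neighbor;
                    currentFit = neighborFit;
                }
            }
            return current;
        }

        public static void main(String[] args) {
            // Example: maximize the number of ones in the bit string.
            boolean[] best = climb(20, 1000, bits -> {
                int ones = 0;
                for (boolean b : bits) if (b) ones++;
                return ones;
            }, new Random(0));
            System.out.println(java.util.Arrays.toString(best));
        }
    }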

Graph Algorithms

Kruskal's MST and DFS
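
An illustrative standalone version of Kruskal's algorithm, using a union-find over vertex ids, is sketched below; it is not the library's graph API.

    // Kruskal's minimum spanning tree with union-find (illustrative).
    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    public final class KruskalSketch {
        record Edge(int u, int v, double weight) {}

        static List<Edge> minimumSpanningTree(int vertexCount, List<Edge> edges) {
            int[] parent = new int[vertexCount];
            for (int i = 0; i < vertexCount; i++) parent[i] = i;
            List<Edge> sorted = new ArrayList<>(edges);
            sorted.sort(Comparator.comparingDouble(Edge::weight));
            List<Edge> tree = new ArrayList<>();
            for (Edge e : sorted) {
                int ru = find(parent, e.u()), rv = find(parent, e.v());
                if (ru != rv) {                     // adding this edge creates no cycle
                    parent[ru] = rv;
                    tree.add(e);
                }
            }
            return tree;
        }

        static int find(int[] parent, int x) {
            while (parent[x] != x) {
                parent[x] = parent[parent[x]];      // path halving
                x = parent[x];
            }
            return x;
        }
    }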

Clustering Algorithms

EM with Gaussian mixtures, K-means
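
A plain k-means loop alternates between recomputing centers and reassigning points; the EM / Gaussian mixture version replaces the hard assignments with per-cluster responsibilities. The sketch below is an illustrative simplification (random initial assignments, fixed iteration count), not the library's clusterer.

    // Plain k-means with random initial assignments (illustrative).
    import java.util.Random;

    public final class KMeansSketch {
        static int[] cluster(double[][] points, int k, int iterations, Random rng) {
            int n = points.length, dim = points[0].length;
            int[] assignment = new int[n];
            for (int i = 0; i < n; i++) assignment[i] = rng.nextInt(k);
            double[][] centers = new double[k][dim];
            for (int it = 0; it < iterations; it++) {
                // Recompute each center as the mean of its assigned points.
                int[] counts = new int[k];
                for (double[] row : centers) java.util.Arrays.fill(row, 0.0);
                for (int i = 0; i < n; i++) {
                    counts[assignment[i]]++;
                    for (int d = 0; d < dim; d++) centers[assignment[i]][d] += points[i][d];
                }
                for (int c = 0; c < k; c++) {
                    if (counts[c] == 0) continue;   // an empty cluster keeps a zero center in this sketch
                    for (int d = 0; d < dim; d++) centers[c][d] /= counts[c];
                }
                // Reassign each point to its nearest center (squared Euclidean distance).
                for (int i = 0; i < n; i++) {
                    int best = 0;
                    double bestDist = Double.MAX_VALUE;
                    for (int c = 0; c < k; c++) {
                        double dist = 0;
                        for (int d = 0; d < dim; d++) {
                            double diff = points[i][d] - centers[c][d];
                            dist += diff * diff;
                        }
                        if (dist < bestDist) { bestDist = dist; best = c; }
                    }
                    assignment[i] = best;
                }
            }
            return assignment;
        }
    }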

Data Preprocessing

PCA, ICA, LDA, Randomized Projections

  • Convert from continuous to discrete and from discrete to binary
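
Of these, randomized projection is the simplest to sketch: multiply the data by a random Gaussian matrix to reduce its dimensionality. The code below is illustrative only and not the library's preprocessing filter.

    // Random projection onto a lower-dimensional space (illustrative).
    import java.util.Random;

    public final class RandomProjectionSketch {
        static double[][] project(double[][] data, int targetDim, Random rng) {
            int sourceDim = data[0].length;
            // Random Gaussian projection matrix, scaled by 1/sqrt(targetDim).
            double[][] projection = new double[sourceDim][targetDim];
            double scale = 1.0 / Math.sqrt(targetDim);
            for (int i = 0; i < sourceDim; i++)
                for (int j = 0; j < targetDim; j++)
                    projection[i][j] = rng.nextGaussian() * scale;
            // Multiply each data row by the projection matrix.
            double[][] out = new double[data.length][targetDim];
            for (int r = 0; r < data.length; r++)
                for (int j = 0; j < targetDim; j++)
                    for (int i = 0; i < sourceDim; i++)
                        out[r][j] += data[r][i] * projection[i][j];
            return out;
        }
    }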

Reinforcement Learning