Name        : shogun                       Relocations: (not relocatable)
Version     : 3.2.0                             Vendor: ALT Linux Team
Release     : alt1                          Build Date: Tue Jun  3 20:12:51 2014
Install date: (not installed)               Build Host: real-sisyphus.hasher.altlinux.org
Group       : Sciences/Mathematics          Source RPM: (none)
Size        : 22491035                         License: GPL v3 or later
Packager    : Eugeny A. Rostovtsev (REAL) <real at altlinux.org>
URL         : http://www.shogun-toolbox.org/
Summary     : A Large Scale Machine Learning Toolbox
Description :
The machine learning toolbox's focus is on large-scale kernel methods and
especially on Support Vector Machines (SVM). It provides a generic SVM
object interfacing to several different SVM implementations, among them the
state-of-the-art LibSVM and SVMlight. Each of the SVMs can be
combined with a variety of kernels. The toolbox not only provides efficient
implementations of the most common kernels, such as the Linear, Polynomial,
Gaussian and Sigmoid kernels, but also comes with a number of recent string
kernels, e.g. the Locality Improved, Fisher, TOP, Spectrum and
Weighted Degree kernels (with shifts). For the latter the efficient
LINADD optimizations are implemented. SHOGUN also offers the freedom of
working with custom pre-computed kernels. One of its key features is the
``combined kernel'', which is constructed as a weighted linear combination
of a number of sub-kernels, each of which need not operate on the same
domain. An optimal sub-kernel weighting can be learned using Multiple Kernel
Learning.
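
As a minimal sketch of the generic SVM object described above, the following
example trains a two-class LibSVM with a Gaussian kernel through SHOGUN's
Python modular interface. It assumes the Python bindings (the modshogun
module, typically packaged separately) are installed; the class names
(RealFeatures, BinaryLabels, GaussianKernel, LibSVM) follow the 3.x modular
API, and the toy data is made up purely for illustration.

  # Sketch: two-class SVM training via SHOGUN's Python modular interface
  # (modshogun). Assumes the Python bindings are installed; the random toy
  # data below only serves to show the call sequence.
  import numpy as np
  from modshogun import RealFeatures, BinaryLabels, GaussianKernel, LibSVM

  # Two Gaussian blobs as a toy two-class problem (examples are columns).
  X = np.hstack([np.random.randn(2, 50) - 1.0, np.random.randn(2, 50) + 1.0])
  y = np.concatenate([-np.ones(50), np.ones(50)])

  feats = RealFeatures(X)                      # dense double features
  labels = BinaryLabels(y)                     # +1/-1 labels
  kernel = GaussianKernel(feats, feats, 2.0)   # Gaussian kernel, width 2.0

  svm = LibSVM(1.0, kernel, labels)            # C=1.0; LibSVM behind the generic SVM object
  svm.train()
  predictions = svm.apply(feats)               # BinaryLabels with predicted signs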

Currently, SVM two-class classification and regression problems can be
handled. SHOGUN also implements a number of linear methods such as Linear
Discriminant Analysis (LDA), the Linear Programming Machine (LPM) and (Kernel)
Perceptrons, and features algorithms to train Hidden Markov Models.
The input feature objects can be dense, sparse or string-based, of type
int/short/double/char, and can be converted into different feature types.
Chains of ``preprocessors'' (e.g. subtracting the mean) can be attached to
each feature object, allowing for on-the-fly pre-processing.
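
As a sketch of the preprocessor chaining mentioned above, the example below
attaches a mean-removing preprocessor to a dense feature object. It again
assumes the Python modular bindings (modshogun) and uses the PruneVarSubMean
preprocessor from the 3.x API, which subtracts the per-dimension mean and
normalizes the variance; the data is arbitrary toy input.

  # Sketch: attaching a preprocessor to a feature object (modshogun).
  # PruneVarSubMean subtracts the per-dimension mean and normalizes the
  # variance; names assume the SHOGUN 3.x Python modular bindings.
  import numpy as np
  from modshogun import RealFeatures, PruneVarSubMean

  X = np.random.randn(3, 20)           # 3-dimensional toy data, 20 examples
  feats = RealFeatures(X)

  preproc = PruneVarSubMean()
  preproc.init(feats)                  # estimate mean/variance from the data
  feats.add_preprocessor(preproc)      # attach to the feature object
  feats.apply_preprocessor()           # pre-process in place

  print(feats.get_feature_matrix())    # centered, variance-normalized data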