justinlovinger / optimal-py-learning

Python machine learning library using powerful numerical optimization methods.

License: MIT

Learning (beta)

A Python machine learning library with powerful customization for advanced users and robust default options for quick implementation.

Warning: Learning is in beta. The API may change. Breaking changes may be noted in this README, but no guarantee is given.

Currently supported models:

  • Multilayer perceptron (MLP), commonly known as a neural network
  • Linear and logistic regression, including Ridge and Lasso
  • Radial basis function network (RBF)
  • Probabilistic neural network (PBNN)
  • Self-organizing map (SOM)
  • Bagger ensemble

Numerical optimization strategies are also implemented to optimize models (a conceptual sketch of steepest descent with a backtracking line search follows this list):

  • Steepest descent
  • Steepest descent with momentum
  • Broyden–Fletcher–Goldfarb–Shanno (BFGS)
  • Limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS)
  • Backtracking line search
  • Wolfe line search
  • First order change initial step
  • Quadratic initial step
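
To illustrate the ideas behind the first strategies listed above, here is a small, self-contained sketch of steepest descent with a backtracking (Armijo) line search in plain NumPy. This is only a conceptual illustration; the function names and structure are illustrative, not the library's implementation.

import numpy as np

def backtracking_line_search(f, x, direction, grad, step=1.0, shrink=0.5, c=1e-4):
    # Shrink the step until the Armijo sufficient-decrease condition holds
    while f(x + step * direction) > f(x) + c * step * grad.dot(direction):
        step *= shrink
    return step

def steepest_descent(f, grad_f, x0, iterations=100):
    # Step along the negative gradient, choosing each step size by line search
    x = np.asarray(x0, dtype=float)
    for _ in range(iterations):
        grad = grad_f(x)
        direction = -grad
        x = x + backtracking_line_search(f, x, direction, grad) * direction
    return x

# Minimize a simple quadratic bowl; the result approaches [3., 3.]
f = lambda x: np.sum((x - 3.0) ** 2)
grad_f = lambda x: 2.0 * (x - 3.0)
print(steepest_descent(f, grad_f, np.zeros(2)))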

Installation

Copy the "learning" folder to [python-path]/lib/site-packages

Usage

from learning import datasets, validation, MLP
from learning import SoftmaxTransfer  # To further customize our MLP
from learning import CrossEntropyError  # To customize the error function of our MLP
from learning import optimize  # To customize the training of our MLP

# Grab the popular iris dataset, from our library of datasets
dataset = datasets.get_iris()

# Make a multilayer perceptron to classify the iris dataset
model = MLP(
    # The MLP will take 4 attributes, have 1 hidden layer with 2 neurons,
    # and output one of 3 classes
    (4, 2, 3),

    # We will use a softmax output layer for this classification problem
    # Because we are only changing the output transfer, we pass a single
    # Transfer object. We could customize all transfer layers by passing
    # a list of Transfer objects.
    transfers=SoftmaxTransfer(),

    # Cross entropy error will pair nicely with our softmax output.
    error_func=CrossEntropyError(),

    # Let's use the quasi-Newton BFGS optimizer for this problem.
    # BFGS requires an O(n^2) operation, where n is the number of weights,
    # but this isn't a problem for our relatively small MLP.
    # If we don't want to deal with optimizers, the default
    # option will select an appropriate optimizer for us.
    optimizer=optimize.BFGS(
        # We can even customize the line search method
        step_size_getter=optimize.WolfeLineSearch(
            # And the initial step size for our line search
            initial_step_getter=optimize.FOChangeInitialStep())))

# NOTE: For rapid prototyping, we could quickly implement an MLP as follows
# model = MLP((4, 2, 3))

# Let's train our MLP
# First, we'll split our dataset into training and testing sets
# Our training set will contain 30 samples from each class
training_set, testing_set = validation.make_train_test_sets(
    *dataset, train_per_class=30)

# We could customize training and stopping criteria through
# the arguments of train, but the defaults should be sufficient here
model.train(*training_set)

# Our MLP should converge in a couple of seconds
# Let's see how our MLP does on the testing set
print('Testing accuracy:', validation.get_accuracy(model, *testing_set))

For further usage details, see the comprehensive docstrings of public functions and classes.
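
For comparison, the same workflow condensed to rely on the default options mentioned above (default transfers, error function, and optimizer):

from learning import datasets, validation, MLP

dataset = datasets.get_iris()
training_set, testing_set = validation.make_train_test_sets(
    *dataset, train_per_class=30)

model = MLP((4, 2, 3))  # defaults choose the transfers, error function, and optimizer
model.train(*training_set)
print('Testing accuracy:', validation.get_accuracy(model, *testing_set))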

Breaking Changes

03/28/2018

In RBF, replace pre_train_clusters with cluster_incrementally. When True, clusters are trained once before the output is trained. When False, clusters are trained incrementally alongside the output. cluster_incrementally defaults to True, whereas pre_train_clusters previously defaulted to False.

03/28/2018

RBF now takes a clustering_model argument instead of SOM hyperparameters.

03/28/2018

Change the _pre_train and _post_train callbacks to take the training dataset.

10/27/2017

Move the Model.train pattern_select_func functionality to the new Model.stochastic_train method. This improves compatibility with optimizers by ensuring the optimizer is reset before pattern selection changes the optimization problem.

Also remove the base.select_iterative function, because it no longer serves a purpose.
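
A minimal sketch of the new method, assuming stochastic_train accepts the same basic (input, target) arguments as train; pattern-selection options are not shown because their exact keywords are not documented here:

from learning import datasets, validation, MLP

dataset = datasets.get_iris()
training_set, _ = validation.make_train_test_sets(*dataset, train_per_class=30)

model = MLP((4, 2, 3))
# Assumption: stochastic_train mirrors train's basic call signature
model.stochastic_train(*training_set)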

10/16/2017

Rename error functions so that they end with Error, for greater clarity.

10/16/2017

Rename Model.test -> Model.print_results
