
sklearn

Trying to implement Scikit Learn for Python in C++

PREPROCESSING:

  1. Standardization
  2. Normalization
  3. Label Encoding
  4. Label Binarization

REGRESSION:

  1. Least Squares Regression
  2. Multiple Linear Regression

CLASSIFICATION:

  1. Gaussian Naive Bayes
  2. Logistic Regression

STANDARDIZATION

SOURCE NEEDED: preprocessing.h, preprocessing.cpp and statx.h

StandardScaler standardizes features by removing the mean and scaling to unit variance (ref: Scikit-Learn docs).

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include <vector>
#include "preprocessing.h"

int main()
{
	StandardScaler scaler({0, 0, 1, 1});
	std::vector<double> scaled = scaler.scale();
	// Scaled value and inverse scaling
	for (double i : scaled)
	{
		std::cout << i << " " << scaler.inverse_scale(i) << "\n";
	}
}
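
If, like scikit-learn, the scaler uses the population standard deviation, the arithmetic for {0, 0, 1, 1} is easy to check by hand: the mean is 0.5, the standard deviation is 0.5, and each value maps to z = (x - mean) / std, i.e. -1, -1, 1, 1. A minimal sketch of that computation without the library (the standard z-score formula, not necessarily StandardScaler's exact internals):

// Hand-rolled z-score standardization for comparison (sketch)

#include <cmath>
#include <iostream>
#include <vector>

int main()
{
    std::vector<double> x = { 0, 0, 1, 1 };
    double mean = 0;
    for (double v : x) mean += v;
    mean /= x.size();
    double variance = 0;
    for (double v : x) variance += (v - mean) * (v - mean);
    variance /= x.size();                          // population variance, as in scikit-learn
    double stddev = std::sqrt(variance);
    for (double v : x)
        std::cout << (v - mean) / stddev << " ";   // prints -1 -1 1 1
}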

NORMALIZATION:

SOURCE NEEDED: preprocessing.h, preprocessing.cpp and statx.h

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include <vector>
#include "preprocessing.h"

int main()
{
	std::vector<double> normalized_vec = preprocessing::normalize({ 800, 10, 12, 78, 56, 49, 7, 1200, 1500 });
	for (double i : normalized_vec) std::cout << i << " ";
}

LABEL ENCODING:

SOURCE NEEDED: preprocessing.h and preprocessing.cpp

Label encoding is the process of converting categorical data into numerical data. For example, if a column in the dataset contains country values such as GERMANY, FRANCE and ITALY, the label encoder will convert this categorical data into numerical data as follows:

country (categorical)    country (numerical)
GERMANY                  1
FRANCE                   0
ITALY                    2

Example code:

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include <string>
#include <vector>
#include "preprocessing.h"

int main()
{
	std::vector<std::string> categorical_data = { "GERMANY", "FRANCE", "ITALY" };
	LabelEncoder<std::string> encoder(categorical_data);
	std::vector<unsigned long int> numerical_data = encoder.fit_transorm();
	for (std::size_t i = 0; i < categorical_data.size(); i++)
	{
		std::cout << categorical_data[i] << " - " << numerical_data[i] << "\n";
	}
}
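
The encoding itself is straightforward: assign each distinct label an integer index. A minimal sketch of the idea, using std::map so the labels are kept in sorted order, which matches the FRANCE = 0, GERMANY = 1, ITALY = 2 ordering in the table above (a sketch of the technique, not the library's implementation):

// Sketch of label encoding: map each distinct string to an integer index

#include <iostream>
#include <map>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> data = { "GERMANY", "FRANCE", "ITALY", "FRANCE" };
    std::map<std::string, unsigned long> index;
    for (const std::string& label : data) index[label] = 0;    // collect distinct labels
    unsigned long next = 0;
    for (auto& entry : index) entry.second = next++;           // sorted keys get 0, 1, 2, ...
    for (const std::string& label : data)
        std::cout << label << " - " << index[label] << "\n";
}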

LABEL BINARIZATION:

Label binarization converts each label into a one-hot binary vector with one position per distinct class.

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include <string>
#include <vector>
#include "preprocessing.h"

int main()
{
    std::vector<std::string> ip_addresses = { "A", "B", "A", "B", "C" };
    LabelBinarizer<std::string> binarize(ip_addresses);
    std::vector<std::vector<unsigned long int>> result = binarize.fit();
    for (std::vector<unsigned long int> i : result)
    {
        for (unsigned long int j : i) std::cout << j << " ";
        std::cout << "\n";
    }
    // Predict
    std::cout << "Prediction:\n-------------\n";
    std::string test = "D";
    std::vector<unsigned long int> prediction = binarize.predict(test);
    for (unsigned long int i : prediction) std::cout << i << " ";
}
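
A minimal sketch of the same idea without the library: collect the distinct classes, then emit a vector with a 1 in the position of the sample's class and 0 elsewhere; an unseen label such as "D" simply yields all zeros (a sketch of the technique, not the library's code):

// Sketch of label binarization (one-hot encoding) for string labels

#include <iostream>
#include <set>
#include <string>
#include <vector>

std::vector<unsigned long> binarize(const std::string& label, const std::set<std::string>& classes)
{
    std::vector<unsigned long> row;
    for (const std::string& c : classes) row.push_back(c == label ? 1 : 0);
    return row;
}

int main()
{
    std::vector<std::string> data = { "A", "B", "A", "B", "C" };
    std::set<std::string> classes(data.begin(), data.end());   // distinct classes: A, B, C
    for (const std::string& label : data)
    {
        for (unsigned long bit : binarize(label, classes)) std::cout << bit << " ";
        std::cout << "\n";
    }
    // An unseen label produces an all-zero row
    for (unsigned long bit : binarize("D", classes)) std::cout << bit << " ";
}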

LEAST SQUARES REGRESSION (SIMPLE LINEAR REGRESSION)

HEADERS NEEDED: lsr.h and lsr.cpp

Creating new model and saving it:

DATASET:

X    y
2    4
3    5
5    7
7    10
9    15

// SWAMI KARUPPASWAMI THUNNAI

#include "lsr.h"

int main()
{
	// X, y, print_debug messages
	simple_linear_regression slr({2, 3, 5, 7, 9}, {4, 5, 7, 10, 15}, DEBUG);
	slr.fit();
	std::cout << slr.predict(8);
	slr.save_model("model.txt");
}
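
Simple linear regression fits y = m*x + c by minimizing the squared error, which has the closed-form solution m = sum((x - mean_x) * (y - mean_y)) / sum((x - mean_x)^2) and c = mean_y - m * mean_x. A minimal sketch of that computation on the dataset above (a hand-rolled illustration of the formula, not the library's code):

// Closed-form least squares fit for y = m*x + c (sketch)

#include <iostream>
#include <vector>

int main()
{
    std::vector<double> X = { 2, 3, 5, 7, 9 };
    std::vector<double> y = { 4, 5, 7, 10, 15 };
    double x_mean = 0, y_mean = 0;
    for (std::size_t i = 0; i < X.size(); i++) { x_mean += X[i]; y_mean += y[i]; }
    x_mean /= X.size();
    y_mean /= y.size();
    double num = 0, den = 0;
    for (std::size_t i = 0; i < X.size(); i++)
    {
        num += (X[i] - x_mean) * (y[i] - y_mean);
        den += (X[i] - x_mean) * (X[i] - x_mean);
    }
    double m = num / den;            // slope
    double c = y_mean - m * x_mean;  // intercept
    std::cout << "y = " << m << " * x + " << c << "\n";
    std::cout << "prediction for x = 8: " << m * 8 + c << "\n";
}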

Loading an existing model:

// SWAMI KARUPPASWAMI THUNNAI

#include "lsr.h"

int main()
{
	// Load the previously saved model from file
	simple_linear_regression slr("model.txt");
	std::cout << slr.predict(8);
}

SAMPLE PREDICTION PLOTTED:

Multiple Linear Regression:

Training and saving the model

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include "mlr.h"

int main()
{
	LinearRegression mlr({ {110, 40}, {120, 30}, {100, 20}, {90, 0}, {80, 10} }, {100, 90, 80, 70, 60}, NODEBUG);
	mlr.fit();
	std::cout << mlr.predict({ 110, 40 });
	mlr.save_model("model.json");
}
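
A multiple linear regression prediction is just an intercept plus a weighted sum of the features, y = b0 + b1*x1 + b2*x2 + ... A minimal sketch of how such a prediction is formed (the coefficient and intercept values below are illustrative placeholders, not the ones this model actually learns):

// Sketch of how a multiple linear regression prediction is computed

#include <iostream>
#include <vector>

double predict(const std::vector<double>& features,
               const std::vector<double>& coefficients, double intercept)
{
    double y = intercept;
    for (std::size_t i = 0; i < features.size(); i++)
        y += coefficients[i] * features[i];
    return y;
}

int main()
{
    // Hypothetical coefficients for the two features used above
    std::vector<double> coefficients = { 0.5, 0.2 };
    double intercept = 10.0;
    std::cout << predict({ 110, 40 }, coefficients, intercept) << "\n";
}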

Loading the saved model

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include "mlr.h"

int main()
{
	// Don't call fit() here - the model is loaded from the file
	LinearRegression mlr("model.json");
	std::cout << mlr.predict({ 110, 40 });
}

Classification - Gaussian Naive Bayes

Classifying male/female using height, weight and foot size, and saving the model.

HEADERS / SOURCE NEEDED: naive_bayes.h, naive_bayes.cpp, json.h

// SWAMI KARUPPASWAMI THUNNAI

#include "naive_bayes.h"

int main()
{
	gaussian_naive_bayes nb({ { 6, 180, 12 }, { 5.92, 190, 11 }, { 5.58, 170, 12 },
		{ 5.92, 165, 10 }, { 5, 100, 6 }, { 5.5, 150, 8 }, { 5.42, 130, 7 }, { 5.75, 150, 9 } },
		{ 0, 0, 0, 0, 1, 1, 1, 1 }, DEBUG);
	nb.fit();
	nb.save_model("model.json");
	std::map<unsigned long int, double> probabilities = nb.predict({ 6, 130, 8 });
	double male = probabilities[0];
	double female = probabilities[1];
	if (male > female) std::cout << "MALE";
	else std::cout << "FEMALE";
}
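
Gaussian naive Bayes estimates a mean and variance for every feature of every class and scores a new sample by multiplying the per-feature Gaussian likelihoods with the class prior. A minimal sketch of the per-feature likelihood (the standard normal density formula; the statistics shown are the mean and sample variance of the four male heights above, used purely for illustration):

// Gaussian probability density used by Gaussian naive Bayes (sketch)

#include <cmath>
#include <iostream>

double gaussian_likelihood(double x, double mean, double variance)
{
    const double pi = 3.141592653589793;
    return std::exp(-(x - mean) * (x - mean) / (2 * variance))
           / std::sqrt(2 * pi * variance);
}

int main()
{
    // Mean and sample variance of the four male heights in the training data above
    double mean_height_male = 5.855;
    double variance_height_male = 0.035;
    // Likelihood of observing a height of 6 under the male class
    std::cout << gaussian_likelihood(6, mean_height_male, variance_height_male) << "\n";
}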

Loading a saved model:

// SWAMI KARUPPASWAMI THUNNAI

#include "naive_bayes.h"

int main()
{
	gaussian_naive_bayes nb(NODEBUG);
	nb.load_model("model.json");
	std::map<unsigned long int, double> probabilities = nb.predict({ 6, 130, 8 });
	double male = probabilities[0];
	double female = probabilities[1];
	if (male > female) std::cout << "MALE";
	else std::cout << "FEMALE";
}

Logistic Regression:

Please do not get confused by the word "regression" in logistic regression: it is generally used for classification problems. The heart of logistic regression is the sigmoid activation function. An activation function takes any input value and outputs a value within a certain range; in our case (sigmoid), it returns a value between 0 and 1.

In the image, you can see the output (y) of the sigmoid activation function for -3 <= x <= 3.

The idea behind logistic regression is to take the output of linear regression, i.e. y = mx + c, and apply the logistic function 1/(1 + e^-y), which outputs a value between 0 and 1. This makes it a binary classifier: for example, it can be used on binary datasets, such as predicting whether a person is male or female from certain parameters.
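
A minimal sketch of that composition, a linear combination of the inputs passed through the sigmoid (the weight and intercept values are illustrative placeholders, not learned ones):

// Sketch: linear combination followed by the sigmoid activation

#include <cmath>
#include <iostream>
#include <vector>

double sigmoid(double y)
{
    return 1.0 / (1.0 + std::exp(-y));
}

int main()
{
    std::vector<double> x = { 6, 130, 8 };        // height, weight, foot size
    std::vector<double> m = { 0.4, -0.01, 0.2 };  // illustrative weights
    double c = -1.0;                              // illustrative intercept
    double y = c;
    for (std::size_t i = 0; i < x.size(); i++) y += m[i] * x[i];
    std::cout << sigmoid(y) << "\n";              // always between 0 and 1
}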

But we can use logistic regression to classify multi-class problems too, with some modifications. Here we use the one-vs-rest principle: train one binary model per class. For example, if there are 10 classes, 10 models are trained, each time relabelling the samples of the target class as 1 and all other samples as 0, so each model predicts the probability of its own class. If you don't understand, here is a detailed explanation: https://prakhartechviz.blogspot.com/2019/02/multi-label-classification-python.html
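
A minimal sketch of the one-vs-rest relabelling step (only the label transformation is shown; the per-class training itself is left to the classifier):

// Sketch: build one binary target vector per class (one-vs-rest relabelling)

#include <iostream>
#include <set>
#include <vector>

int main()
{
    std::vector<unsigned long> y = { 0, 2, 1, 0, 2, 1 };   // multi-class labels
    std::set<unsigned long> classes(y.begin(), y.end());
    for (unsigned long c : classes)
    {
        std::cout << "class " << c << " vs rest: ";
        for (unsigned long label : y) std::cout << (label == c ? 1 : 0) << " ";
        std::cout << "\n";
        // A binary logistic model would be trained on this relabelled vector
    }
}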

We are going to take a simple classification problem: classifying whether a person is male or female.

Classifying male/female from height, weight and foot size, then saving the model. Here is our dataset (the same values are passed to the classifier in the code below):

Height    Weight    Foot size    Class
6.00      180       12           0 (male)
5.92      190       11           0 (male)
5.58      170       12           0 (male)
5.92      165       10           0 (male)
5.00      100       6            1 (female)
5.50      150       8            1 (female)
5.42      130       7            1 (female)
5.75      150       9            1 (female)

All we have to do is predict whether the person is male or female using height, weight and foot size.

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include "logistic_regression.h"

int main()
{
    logistic_regression lg({ { 6, 180, 12 },{ 5.92, 190, 11 },{ 5.58, 170, 12 },
        { 5.92, 165, 10 },{ 5, 100, 6 },{ 5.5, 150, 8 },{ 5.42, 130, 7 },{ 5.75, 150, 9 } },
        { 0, 0, 0, 0, 1, 1, 1, 1 }, NODEBUG);
    lg.fit();
    // Save the model
    lg.save_model("model.json");
    std::map<unsigned long int, double> probabilities = lg.predict({ 6, 130, 8 });
    double male = probabilities[0];
    double female = probabilities[1];
    if (male > female) std::cout << "MALE";
    else std::cout << "FEMALE";
}

and loading a saved model:

// SWAMI KARUPPASWAMI THUNNAI

#include <iostream>
#include "logistic_regression.h"

int main()
{
    logistic_regression lg("model.json");
    std::map<unsigned long int, double> probabilities = lg.predict({ 6, 130, 8 });
    double male = probabilities[0];
    double female = probabilities[1];
    if (male > female) std::cout << "MALE";
    else std::cout << "FEMALE";
}

