100 Days of ML Code Challenge

100 Days of Machine Learning Coding as proposed by Siraj Raval

Machine Learning

Day 0 :- Gather all the tools for Data Science.

Today's work:- I have installed all the tools and packages required for this challenge.

Day 1 :- Data Preprocessing

Check out the code from here.

Today's work:- I have completed the most crucial Data Preprocessing step on the following dataset.
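
For reference, a minimal sketch of such a preprocessing pipeline with scikit-learn; the file name Data.csv and the column layout (one categorical column, two numeric columns, a target) are assumptions, since the dataset itself isn't shown here.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, OneHotEncoder, LabelEncoder
from sklearn.model_selection import train_test_split

# Hypothetical layout: column 0 categorical, columns 1-2 numeric, last column = target.
df = pd.read_csv("Data.csv")
X = df.iloc[:, :-1].values
y = df.iloc[:, -1].values

# Fill missing numeric values with the column mean.
X[:, 1:3] = SimpleImputer(strategy="mean").fit_transform(X[:, 1:3])

# One-hot encode the categorical column and label-encode the target.
dummies = OneHotEncoder().fit_transform(X[:, [0]]).toarray()
X = np.hstack([dummies, X[:, 1:]]).astype(float)
y = LabelEncoder().fit_transform(y)

# Split first, then scale with statistics learned on the training set only.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
```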

Day 2 :- Simple Linear Regression

Check out the code from here.

Today's work:- I have applied Simple Linear Regression on the following dataset and obtained the following graphs for the training and test predictions.
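
A minimal sketch of the same steps with scikit-learn, assuming a hypothetical single-feature salary dataset:

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Hypothetical dataset: years of experience vs. salary.
df = pd.read_csv("Salary_Data.csv")
X = df.iloc[:, :-1].values   # feature column
y = df.iloc[:, -1].values    # target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=1/3, random_state=0)

reg = LinearRegression().fit(X_train, y_train)

# Plot the training points and the fitted line.
plt.scatter(X_train, y_train, color="red")
plt.plot(X_train, reg.predict(X_train), color="blue")
plt.title("Training set")
plt.show()
```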

Day 3 :- Multiple Linear Regression

Check out the code from here.

Today's work:- I have applied Multiple Linear Regression on the following dataset and also applied the Backward Elimination method to get the best model.
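
One common way to script backward elimination is with statsmodels: refit, drop the least significant predictor, and repeat until every p-value clears the threshold. A sketch (the 5% significance level is an assumption):

```python
import numpy as np
import statsmodels.api as sm

def backward_elimination(X, y, significance=0.05):
    """Iteratively drop the predictor with the highest p-value above the threshold."""
    X = sm.add_constant(X)                     # prepend the intercept column
    cols = list(range(X.shape[1]))
    while True:
        model = sm.OLS(y, X[:, cols]).fit()
        worst = int(np.argmax(model.pvalues))
        if model.pvalues[worst] > significance:
            cols.pop(worst)                    # remove the least significant predictor
        else:
            return model, cols

# Usage: model, kept = backward_elimination(X, y); print(model.summary())
```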

Day 4 :- Polynomial Regression

Check out the code from here.

Today's work:- I have applied Polynomial Regression on the following dataset to predict whether an employee is telling the truth or bluffing about their salary, and obtained the following graphs for Linear and Polynomial Regression.
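
A sketch of the approach with scikit-learn; the file name, column positions, polynomial degree, and the 6.5 query level are assumptions for illustration:

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical position/level/salary dataset, sorted by level.
df = pd.read_csv("Position_Salaries.csv")
X = df.iloc[:, 1:2].values   # the numeric "level" column
y = df.iloc[:, 2].values

# Expand the single feature into polynomial terms up to degree 4, then fit linearly.
poly = PolynomialFeatures(degree=4)
X_poly = poly.fit_transform(X)
reg = LinearRegression().fit(X_poly, y)

plt.scatter(X, y, color="red")
plt.plot(X, reg.predict(X_poly), color="blue")
plt.title("Polynomial Regression fit")
plt.show()

# Check the claimed salary for an intermediate level, e.g. 6.5.
print(reg.predict(poly.transform([[6.5]])))
```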

Day 5 :- Support Vector Regression(SVR)

Check out the code from here.

Today's work:- I have applied Support Vector Regression (SVR) on the following dataset to predict whether an employee is telling the truth or bluffing about their salary, and obtained the following graph for SVR.
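
A minimal SVR sketch with scikit-learn; unlike the linear models above, SVR is not scale-invariant, so both the feature and the target are standardized. The file name and columns are assumptions:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

df = pd.read_csv("Position_Salaries.csv")   # hypothetical file name
X = df.iloc[:, 1:2].values
y = df.iloc[:, 2].values.reshape(-1, 1)

# Standardize both X and y before fitting the RBF-kernel SVR.
sc_X, sc_y = StandardScaler(), StandardScaler()
X_s = sc_X.fit_transform(X)
y_s = sc_y.fit_transform(y).ravel()

reg = SVR(kernel="rbf").fit(X_s, y_s)

# Predict for level 6.5, then invert the scaling to get a salary back.
pred = sc_y.inverse_transform(reg.predict(sc_X.transform([[6.5]])).reshape(-1, 1))
print(pred)
```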

Day 6 :- Decision Tree Regression

Check out the code from here.

Today's work:- I have applied Decision Tree Regression on the following dataset to predict whether an employee is telling the truth or bluffing about their salary, obtained the following graph for Decision Tree Regression, and also plotted the tree.

Day 7 :- Random Forest Regression

Check out the code from here.

Today's work:- I have applied Random Forest Regression on the following dataset to predict whether an employee is telling the truth or bluffing about their salary, and obtained the following graphs for forests of 10, 100, and 500 decision trees.
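
A sketch of that comparison; the level/salary pairs below are hypothetical stand-ins for the dataset, and only n_estimators changes between runs:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestRegressor

# Hypothetical level/salary pairs (salaries in thousands).
X = np.arange(1, 11).reshape(-1, 1)
y = np.array([45, 50, 60, 80, 110, 150, 200, 300, 500, 1000], dtype=float)

# A fine grid shows the piecewise-constant shape of the tree-ensemble prediction.
grid = np.arange(1, 10, 0.01).reshape(-1, 1)
for n in (10, 100, 500):
    reg = RandomForestRegressor(n_estimators=n, random_state=0).fit(X, y)
    plt.plot(grid, reg.predict(grid), label=f"{n} trees")
plt.scatter(X, y, color="red")
plt.legend()
plt.show()
```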

Day 8 :- R Square and Adjusted R Square

Today's work:- I have learnt about R square and Adjusted R square and computed them on the following dataset for the various algorithms studied so far. I also studied the pros and cons of each of those algorithms.
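
For reference, R^2 measures the fraction of variance explained, and adjusted R^2 corrects it for the number of predictors p: adj = 1 - (1 - R^2)(n - 1)/(n - p - 1). A small helper:

```python
from sklearn.metrics import r2_score

def adjusted_r2(y_true, y_pred, n_features):
    """Adjusted R^2 penalizes adding predictors that don't genuinely improve the fit."""
    r2 = r2_score(y_true, y_pred)
    n = len(y_true)
    return 1 - (1 - r2) * (n - 1) / (n - n_features - 1)

# Toy check: a perfect fit gives R^2 = adjusted R^2 = 1.
print(adjusted_r2([1, 2, 3, 4, 5], [1, 2, 3, 4, 5], n_features=2))
```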

Project :- Board Game Review Predictions

Day 9 to 11 :-

Check out the Analysis from here

Project's work:- I have done a project on board game review prediction using regression on the following dataset, with all the knowledge of regression and data preprocessing gained so far. I also learnt and used new regression techniques in this project.

Day 12 :- Logistic Regression

Check out the code from here

Today's work:- I have applied Logistic Regression on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets.
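
A minimal sketch of the workflow with scikit-learn, assuming a hypothetical ads-style dataset (numeric features, binary "purchased" target). Days 13 to 18 reuse this exact skeleton with only the classifier swapped, as noted in the comments:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, accuracy_score

# Hypothetical dataset: e.g. age and salary columns, last column = purchased (0/1).
df = pd.read_csv("Social_Network_Ads.csv")
X = df.iloc[:, :-1].values
y = df.iloc[:, -1].values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
sc = StandardScaler()
X_train, X_test = sc.fit_transform(X_train), sc.transform(X_test)

clf = LogisticRegression(random_state=0).fit(X_train, y_train)
y_pred = clf.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print(accuracy_score(y_test, y_pred))

# Days 13-18 swap only the classifier, e.g.:
#   KNeighborsClassifier(n_neighbors=5), SVC(kernel="linear"), SVC(kernel="rbf"),
#   GaussianNB(), DecisionTreeClassifier(), RandomForestClassifier(n_estimators=10)
```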

Day 13 :- K Nearest Neighbours(K-NN)

Check out the code from here

Today's work:- I have applied the K-NN classifier on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets.

Day 14 :- Support Vector Machine(SVM)

Check out the code from here

Today's work:- I have applied a linear Support Vector Machine (SVM) on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets.

Day 15 :- Kernel Support Vector Machine (K-SVM)

Check out the code from here

Today's work:- I have applied a Kernel Support Vector Machine (K-SVM) on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets.

Day 16 :- Naive Bayes

Check out the code from here

Today's work:- Today I learnt about Bayes' Theorem and its applications. Then I applied the Naive Bayes algorithm on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets.

Day 17 :- Decision Tree Classification

Check out the code from here

Today's work:- I have applied Decision Tree Classification on the following dataset to predict whether a person will buy a car company's SUV, obtained the following graphs for the training and test sets, and also visualized the model by plotting the actual tree.

Day 18 :- Random Forest Classification

Check out the code from here

Today's work:- I have applied Random Forest Classification on the following dataset to predict whether a person will buy a car company's SUV, and obtained the following graphs for the training and test sets. I then compared the results while varying the number of trees and obtained the following graph.

Day 19 :- Accuracy Paradox and the Cumulative Accuracy Profile (CAP) Curve

Today's work:- Today I studied the accuracy paradox of the confusion matrix and used the CAP curve analysis method to pick the best model among the various classification algorithms. I also noted the pros and cons of each classification algorithm.

Project :- Credit Card Fraud Detection

Day 20 to 22 :-

Check out the Jupyter Notebook from here

Project's work:- I have done a project on credit card fraud detection using classification, with all the knowledge of classification and data preprocessing gained so far. I also learnt and used new classification techniques in this project.

Day 23 :- K-Means Clustering

Check out the code from here

Today's work:- I have applied the K-Means Clustering algorithm on the following dataset to cluster mall customers by spending score and annual income. I first applied the Elbow Method to find the optimal value of K, as shown in the following graph in Python and R, then built the clusters for that K and plotted the following graphs in Python and R.
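
A sketch of the elbow method plus the final fit with scikit-learn; the file and column names are assumptions:

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

# Hypothetical mall-customers dataset with income and spending-score columns.
df = pd.read_csv("Mall_Customers.csv")
X = df[["Annual Income (k$)", "Spending Score (1-100)"]].values

# Elbow method: plot within-cluster sum of squares (inertia) against K.
wcss = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
        for k in range(1, 11)]
plt.plot(range(1, 11), wcss)
plt.xlabel("K")
plt.ylabel("WCSS")
plt.show()

# Fit the final model at the elbow (K = 5 assumed here) and plot the clusters.
km = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = km.fit_predict(X)
plt.scatter(X[:, 0], X[:, 1], c=labels)
plt.scatter(*km.cluster_centers_.T, c="red", marker="x")
plt.show()
```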


Project :- K Means Clustering for Imagery Analysis

Day 24 to 26 :-

Check out the project here

Check out the Jupyter Notebook from here

Project's work:- In this project, we will use the K-means algorithm to perform image classification. Clustering isn't limited to consumer information and the population sciences; it can be used for imagery analysis as well. Leveraging scikit-learn and the MNIST dataset, we will investigate the use of K-means clustering for computer vision.

For this project, we will be using the MNIST dataset. It is available through Keras, a deep learning library we have used in previous tutorials. Although we won't be using other features of Keras today, importing mnist from it will save us time. It is also available through the TensorFlow library, or for download at http://yann.lecun.com/exdb/mnist/.

Day 27 :- Hierarchical Clustering

Check out the code from here

Today's work:- I have applied the Hierarchical Clustering algorithm on the following dataset to cluster mall customers by spending score and annual income. I first computed the dendrogram of the dataset to find the optimal number of clusters, as shown below in Python and R, then built the clusters for that number and plotted the following graphs in Python and R.
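
A sketch using SciPy for the dendrogram and scikit-learn for the clustering, with the same assumed file and column names as in the K-means sketch above:

```python
import pandas as pd
import matplotlib.pyplot as plt
import scipy.cluster.hierarchy as sch
from sklearn.cluster import AgglomerativeClustering

df = pd.read_csv("Mall_Customers.csv")   # hypothetical file/column names
X = df[["Annual Income (k$)", "Spending Score (1-100)"]].values

# Dendrogram with Ward linkage; the longest vertical gap suggests the cluster count.
sch.dendrogram(sch.linkage(X, method="ward"))
plt.show()

# Agglomerative clustering with the number of clusters read off the dendrogram (5 assumed).
hc = AgglomerativeClustering(n_clusters=5, linkage="ward")
labels = hc.fit_predict(X)
plt.scatter(X[:, 0], X[:, 1], c=labels)
plt.show()
```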


Day 28 :- Apriori Association Rule Learning Algorithm

Check out the code from here

Today's work:- I have applied the Apriori association rule learning algorithm on the following dataset to find the optimized placement of products in a supermarket for maximizing revenue.
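
A sketch using the third-party apyori package (one common choice for Apriori in Python); the file name and thresholds are assumptions:

```python
import pandas as pd
from apyori import apriori   # third-party: pip install apyori

# Hypothetical market-basket file: one transaction per row, items spread across columns.
df = pd.read_csv("Market_Basket_Optimisation.csv", header=None)
transactions = [[str(item) for item in row if pd.notna(item)]
                for row in df.values]

# Mine rules that are frequent (support), reliable (confidence), and non-trivial (lift).
rules = apriori(transactions, min_support=0.003, min_confidence=0.2, min_lift=3)
for rule in list(rules)[:5]:
    print(rule.items, rule.ordered_statistics[0].confidence)
```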

Day 29 :- Eclat Association Rule Learning Algorithm

Check out the code from here

Today's work:- I have applied the Eclat association rule learning algorithm on the following dataset to find the optimized placement of products in a supermarket for maximizing sales.

Day 30 :- Upper Confidence Bound (UCB)

Check out the code from here

Today's work:- I have applied the Upper Confidence Bound (UCB) reinforcement learning algorithm on the following dataset to find the best ad for a car company based on its performance on different social media platforms. This was done by assigning a reward of 1 if the user interacted with the ad and 0 if they didn't. I obtained the following histogram based on the prediction.
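
A minimal UCB implementation over a simulated click log; the file name and the exploration constant are assumptions:

```python
import math
import pandas as pd

# Hypothetical click log: rows are rounds, columns are the d ads, cells are 0/1 rewards.
df = pd.read_csv("Ads_CTR_Optimisation.csv")
N, d = df.shape

counts = [0] * d   # times each ad was selected
sums = [0] * d     # total reward per ad

for n in range(N):
    # Pick the ad with the highest upper confidence bound; try each ad once first.
    def ucb(i):
        if counts[i] == 0:
            return float("inf")
        mean = sums[i] / counts[i]
        return mean + math.sqrt(1.5 * math.log(n + 1) / counts[i])
    ad = max(range(d), key=ucb)
    counts[ad] += 1
    sums[ad] += df.values[n, ad]   # simulated reward: 1 if the user clicked

print("total reward:", sum(sums))
```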

Day 31 :- Thompson Sampling

Check out the code from here

Today's work:- I have applied the Thompson Sampling reinforcement learning algorithm on the following dataset to find the best ad for a car company based on its performance on different social media platforms. This was done by assigning a reward of 1 if the user interacted with the ad and 0 if they didn't. I obtained the following histogram based on the prediction.
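
The same setup with Thompson Sampling, keeping a Beta posterior per ad; the file name is again an assumption:

```python
import random
import pandas as pd

df = pd.read_csv("Ads_CTR_Optimisation.csv")   # hypothetical click log, as above
N, d = df.shape

wins = [0] * d     # times each ad got reward 1
losses = [0] * d   # times each ad got reward 0

for n in range(N):
    # Sample each ad's click-through rate from its Beta posterior; pick the best draw.
    ad = max(range(d), key=lambda i: random.betavariate(wins[i] + 1, losses[i] + 1))
    if df.values[n, ad] == 1:
        wins[ad] += 1
    else:
        losses[ad] += 1

print("total reward:", sum(wins))
```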

Day 32 :- Natural Language Processing

Check out the code from here

Today's work:- I have processed the following dataset to build a sparse matrix for predicting whether a customer liked the food at a restaurant based on their review. I first removed unnecessary words using NLP techniques, then built a sparse matrix with a maximum of 1500 features. I then applied the Naive Bayes and Random Forest classification algorithms to predict whether the customer liked the food or not.
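
A sketch of the bag-of-words pipeline with NLTK and scikit-learn, keeping the 1500-feature cap from the log; the file and column names are assumptions:

```python
import re
import pandas as pd
from nltk.corpus import stopwords
from nltk.stem.porter import PorterStemmer
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Hypothetical file: tab-separated review text and a 0/1 "Liked" label.
df = pd.read_csv("Restaurant_Reviews.tsv", sep="\t", quoting=3)

ps = PorterStemmer()
stops = set(stopwords.words("english"))   # requires nltk.download("stopwords")
corpus = []
for review in df["Review"]:
    words = re.sub("[^a-zA-Z]", " ", review).lower().split()
    corpus.append(" ".join(ps.stem(w) for w in words if w not in stops))

# Sparse bag-of-words matrix capped at 1500 features, as in the log above.
X = CountVectorizer(max_features=1500).fit_transform(corpus).toarray()
y = df["Liked"].values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = GaussianNB().fit(X_train, y_train)
print(clf.score(X_test, y_test))
```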

Project :- Text Classification with NLTK and Scikit-learn

Day 33 to 35 :-

Check out the Jupyter Notebook from here or deployment here

Project's work:- Text classification using a simple support vector classifier on a dataset of positive and negative movie reviews. The dataset we will be using comes from the UCI Machine Learning Repository. It contains over 5,000 labeled SMS messages that have been collected for mobile phone spam research. It can be downloaded from the following URL:

Project :- Natural Language Processing

Day 36 to 38 :-

Check out the Jupyter Notebook from here or deployment here

Project's work:- I have done a Natural Language Processing project on the movie reviews corpus from the nltk library, in which I classified whether a review is positive or negative using a Support Vector Classifier.

Day 39 :- Principal Component Analysis (PCA)

Check out the code from here

Today's work:- I have processed the following dataset, which has 12 features, to reduce it to fewer features using the Principal Component Analysis method, which works on the variance of the features. Then I applied the Support Vector Machine algorithm to predict the customer segment based on their wine selection.
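
A sketch with scikit-learn, assuming a hypothetical Wine.csv with the segment label in the last column; note that PCA is fit on scaled training data only:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

# Hypothetical wine dataset: numeric features, customer segment as the target.
df = pd.read_csv("Wine.csv")
X, y = df.iloc[:, :-1].values, df.iloc[:, -1].values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
sc = StandardScaler()
X_train, X_test = sc.fit_transform(X_train), sc.transform(X_test)

# Keep the two directions of highest variance so the result can be plotted in 2-D.
pca = PCA(n_components=2)
X_train = pca.fit_transform(X_train)
X_test = pca.transform(X_test)
print("explained variance:", pca.explained_variance_ratio_)

clf = SVC(kernel="linear", random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```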

Project :- Data Compression and Visualization using Principal Component Analysis (PCA)

Day 40 to 42 :-

Check out the Jupyter Notebook from here or deployment here

Project's work:- This project would focus on mapping high dimensional data to a lower dimensional space, a necessary step for projects that utilize data compression or data visualizations. As the ethical discussions surrounding AI continue to grow, scientists and businesses alike are using visualizations of high dimensional data to explain results.

During this project, we will perform K-Means clustering on the well-known Iris dataset, which contains 3 classes of 50 instances each, where each class refers to a type of iris plant. To visualize the clusters, we will use principal component analysis (PCA) to reduce the number of features in the dataset.

Day 43 :- Linear Discriminant Analysis (LDA)

Check out the code from here

Today's work:- I have processed the following dataset, which has 12 features, to reduce it to fewer features using the Linear Discriminant Analysis method. Then I applied the Support Vector Machine algorithm to predict the customer segment based on their wine selection.

Day 44 :- Kernel PCA

Check out the code from here

Today's work:- I have processed the following dataset to reduce it to fewer features using Kernel PCA with a Gaussian (RBF) kernel. Then I applied Logistic Regression to predict whether a person will buy a car company's SUV.

Day 45 :- K-Fold Cross Validation

Check out the code from here

Today's work:- I have applied the K-Fold Cross Validation technique on the following dataset, dividing it into 10 folds to evaluate the model, and got an average accuracy of 91 percent.
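
A minimal sketch with scikit-learn's cross_val_score; a synthetic dataset stands in for the real one so the snippet runs on its own:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the scaled training set used in the log above.
X, y = make_classification(n_samples=300, n_features=2, n_redundant=0, random_state=0)

# 10 folds: train on 9 parts, validate on the held-out part, rotate, then average.
scores = cross_val_score(SVC(kernel="rbf", random_state=0), X, y, cv=10)
print("accuracies:", scores.round(2))
print("mean:", scores.mean(), "std:", scores.std())
```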

Day 46 :- Grid Search

Check out the code from here

Today's work:- I have applied the Grid Search algorithm on the following dataset to find the best model. I first tried a linear kernel with penalty values 1, 10, 100, and 1000, then an RBF kernel with the same penalties, and got an average accuracy of 90 percent.
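
A sketch of that grid with scikit-learn's GridSearchCV; the gamma values for the RBF kernel are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in dataset so the snippet runs on its own.
X, y = make_classification(n_samples=300, n_features=2, n_redundant=0, random_state=0)

# The two grids described above: linear kernel, then RBF (gamma added for RBF).
param_grid = [
    {"C": [1, 10, 100, 1000], "kernel": ["linear"]},
    {"C": [1, 10, 100, 1000], "kernel": ["rbf"], "gamma": [0.5, 0.1, 0.01, 0.001]},
]
search = GridSearchCV(SVC(), param_grid, scoring="accuracy", cv=10)
search.fit(X, y)
print(search.best_score_, search.best_params_)
```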

Day 47 :- XGBoost

Check out the code from here

Today's work:- I have applied XGBoost on the following dataset to obtain the best model and got an accuracy of 87 percent.
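
A minimal sketch with the xgboost package; the hyperparameters shown are generic starting values, not the ones used in the log:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier   # third-party: pip install xgboost

# Synthetic stand-in dataset so the snippet runs on its own.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Gradient-boosted trees; these defaults are a reasonable baseline before tuning.
clf = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```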

Deep Learning

Day 48 :- Artificial Neural Networks

Check out the code from here

Today's work:- I have created an Artificial Neural Network on the following dataset for predicting whether a bank customer will exit, with one input layer of 6 nodes, one hidden layer of 6 nodes, and one output layer of 1 node. I obtained an accuracy of 86 percent, measured via the confusion matrix.
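
A sketch of that architecture in Keras; the 11 input features and the toy labels are stand-ins, since the real churn dataset isn't reproduced here:

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in for the scaled churn features (11 columns assumed).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 11)).astype("float32")
y = (X[:, 0] + X[:, 3] > 0).astype("float32")   # toy binary "exited" label

# The architecture described above: two layers of 6 units, one sigmoid output.
model = keras.Sequential([
    keras.Input(shape=(11,)),
    keras.layers.Dense(6, activation="relu"),
    keras.layers.Dense(6, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, batch_size=10, epochs=5, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, accuracy]
```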

Day 49 :- Convolutional Neural Networks(CNN)

Check out the code from here

Today's work:- I have created a Convolutional Neural Network for predicting whether an image is of a cat or a dog, using 5000 images of each (4000 training and 1000 test), and got an accuracy of 85 percent. I first applied the convolution operation with ReLU as the activation function on each image, then applied the pooling step, then flattened the result and fed it into the neural network, obtaining the above prediction successfully.
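
A sketch of that pipeline in Keras; the image size, layer widths, and directory layout are assumptions:

```python
from tensorflow import keras

# The steps described above: convolution + ReLU, pooling, flatten, dense head.
model = keras.Sequential([
    keras.Input(shape=(64, 64, 3)),
    keras.layers.Conv2D(32, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D(pool_size=(2, 2)),
    keras.layers.Conv2D(32, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D(pool_size=(2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),   # cat vs. dog
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stream training images from class-named folders, e.g. dataset/training_set/{cats,dogs}.
train = keras.utils.image_dataset_from_directory(
    "dataset/training_set", image_size=(64, 64), batch_size=32, label_mode="binary")
train = train.map(lambda x, y: (x / 255.0, y))   # rescale pixels to [0, 1]
model.fit(train, epochs=5)
```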

Day 50 :- Recurrent Neural Networks (RNN)

Check out the code from here

Today's work:- I have created a Recurrent Neural Network for predicting the Google stock price in 2017, using the previous 3 months of data to predict the next day's price. I built a sequential Keras model with an input layer of 50 neurons, then 3 hidden layers of 50 neurons each, and finally one output layer of a single neuron, and obtained the following graph.
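
A sketch of such a stacked LSTM in Keras; the 60-step window, dropout, and synthetic data are assumptions standing in for the real price series:

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in for 60-step windows of scaled prices (shape: samples, steps, 1).
rng = np.random.default_rng(0)
X = rng.random((500, 60, 1)).astype("float32")
y = X[:, -1, 0]   # toy target correlated with the last step

# The stacked-LSTM shape described above: 50 units per layer, one regression output.
model = keras.Sequential([
    keras.Input(shape=(60, 1)),
    keras.layers.LSTM(50, return_sequences=True),
    keras.layers.Dropout(0.2),
    keras.layers.LSTM(50, return_sequences=True),
    keras.layers.Dropout(0.2),
    keras.layers.LSTM(50),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mean_squared_error")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```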

Day 51 :- Self Organizing Maps (SOM)

Check out the code from here

Today's work:- I have created a Self Organizing Map (SOM) on a dataset for predicting whether a user is fraudulent when applying for a credit card. I used the third-party library MiniSom to obtain the following Self Organizing Map.
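
A sketch with the minisom package; the map size and the synthetic stand-in features are assumptions:

```python
import numpy as np
from minisom import MiniSom   # third-party: pip install minisom

# Synthetic stand-in for scaled credit-application features (15 columns assumed).
rng = np.random.default_rng(0)
X = rng.random((690, 15))

# A 10x10 map; each application is assigned to its best-matching unit.
som = MiniSom(x=10, y=10, input_len=15, sigma=1.0, learning_rate=0.5)
som.random_weights_init(X)
som.train_random(data=X, num_iteration=100)

# Cells with a high mean inter-neuron distance mark outliers: candidate frauds.
distances = som.distance_map()
for row in X[:5]:
    print("winning node:", som.winner(row))
```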

Day 52 :- Going from Unsupervised to Supervised Learning

Check out the code from here

Today's work:- Yesterday I created an unsupervised Self Organizing Map (SOM) on a dataset for predicting whether a user is fraudulent when applying for a credit card, using the third-party library MiniSom. Today I extended that work from unsupervised to supervised learning by feeding the SOM's results into a sequential Keras neural network with the Adam optimizer.

Day 53 :- Boltzmann Machine

Check out the code from here

Today's work:- I have created an unsupervised movie recommendation system for recommending movies to a new user using a Deep Boltzmann Machine.

Day 54 :- AutoEncoders

Check out the code from here

Today's work:- I have created an unsupervised movie recommendation system for recommending movies to a new user using Autoencoders.

Project :- Chatbot

Day 55 to 58 :-

Check out the code from here

Project's work:- Worked on a chatbot project based on a Sequence to Sequence (seq2seq) deep natural language processing model, using the Cornell Movie-Dialogs dataset, which consists of 220,579 conversational exchanges between 10,292 pairs of movie characters, involving 9,035 characters from 617 movies.

Computer Vision

Day 59:- Face Detection

Check out the code from here

Today's work:- I have used OpenCV to detect faces with a Cascade Classifier, using haarcascade_frontalface_default.xml as the detector.
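
A minimal sketch with OpenCV's bundled Haar cascade; the input file name is an assumption:

```python
import cv2   # third-party: pip install opencv-python

# Load the pretrained frontal-face Haar cascade shipped with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

img = cv2.imread("photo.jpg")   # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Detection runs on grayscale; scaleFactor/minNeighbors trade recall vs. false hits.
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)
cv2.imwrite("detected.jpg", img)
```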

Day 60:- Smile Detection

Check out the code from here

Today's work:- I have used OpenCV to detect smiles with a Cascade Classifier, using haarcascade_smile.xml as the detector.

Day 61:- Object Detection

Check out the code from here

Today's work:- I have applied the Single Shot Detector (SSD) algorithm on funny_dog.mp4 to detect objects in the video.

Project :- Object Recognition on the CIFAR-10 Image Dataset

Day 62 to 65 :-

Check out the project here

Check out the Jupyter Notebook from here

Project's work:- In this project, I have deployed a convolutional neural network (CNN) for object recognition. More specifically, I have used the All-CNN network published in the 2015 ICLR paper, "Striving For Simplicity: The All Convolutional Net". This paper can be found at the following link

Project :- The Super Resolution Convolutional Neural Network for Image Restoration.

Day 66 to 68 :-

Check out the project here

Check out the Jupyter Notebook from here

Project's work:- The goal of super-resolution (SR) is to recover a high resolution image from a low resolution input, or as they might say on any modern crime show, enhance!

To accomplish this goal, we will be deploying the super-resolution convolution neural network (SRCNN) using Keras. This network was published in the paper, "Image Super-Resolution Using Deep Convolutional Networks" by Chao Dong, et al. in 2014. You can read the full paper at https://arxiv.org/abs/1501.00092.

Day 69:- Generative Adversarial Net(GAN) Generator

Check out the code from here

Today's work:- I have created and defined the generator for the GAN, which produces images that the discriminator must then distinguish, as best it can, from real ones.

Day 70:- Generative Adversarial Net(GAN) Discriminator and Training

Check out the code from here

Today's work:- I have created and defined the discriminator for the GAN, which distinguishes between real and generated images as best it can when fed an image from the generator. Further, I trained the discriminator on real and generated images, backpropagating the total error and simultaneously updating the weights of the generator's network. I printed the losses and saved the real and generated images of the minibatch at every 100th step, and finally obtained the following result.
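
The adversarial loop described above, reduced to a toy PyTorch sketch on synthetic low-dimensional data (the actual project uses convolutional networks over images); losses are printed every 100th step, mirroring the log:

```python
import torch
import torch.nn as nn

latent_dim, data_dim, batch = 16, 2, 64

# Tiny stand-ins for the generator and discriminator networks.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(batch, data_dim) * 0.5 + 2.0   # stand-in "real" samples
    fake = G(torch.randn(batch, latent_dim))

    # Train the discriminator: real batches target 1, generated batches target 0.
    opt_D.zero_grad()
    d_loss = loss_fn(D(real), torch.ones(batch, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(batch, 1))
    d_loss.backward()
    opt_D.step()

    # Train the generator to make the discriminator output 1 on its fakes.
    opt_G.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_G.step()

    if step % 100 == 0:
        print(f"step {step}: D loss {d_loss.item():.3f}, G loss {g_loss.item():.3f}")
```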
