Sameer Bhosale's Projects
Contains submissions of all assignments for the Big Data Bootcamp 2.0
Bidirectional Encoder Representations from Transformers (BERT) is a technique for natural language processing (NLP) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google.
This repo contains supervised ML models for performing classification on the Iris dataset
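A minimal sketch of the kind of supervised classifier such a repo might contain, using scikit-learn's bundled Iris dataset (illustrative only; the repo's actual models and choice of estimator may differ):

```python
# Sketch: supervised classification on the Iris dataset with scikit-learn.
# The model choice (logistic regression) is an assumption for illustration.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

clf = LogisticRegression(max_iter=200)  # a few hundred iterations suffice on Iris
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
```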
Predicts credit card default risk
This is a demo repository for learning purposes
Demo repository
This repository contains a from-scratch implementation of the ID3 algorithm in Python.
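A compact from-scratch ID3 sketch on categorical data, selecting splits by information gain (illustrative; the repo's own implementation, data format, and stopping rules may differ):

```python
# Sketch: ID3 decision tree learning from scratch.
# Rows are dicts of categorical feature values; the toy data is hypothetical.
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def id3(rows, labels, features):
    # Leaf if labels are pure or no features remain
    if len(set(labels)) == 1:
        return labels[0]
    if not features:
        return Counter(labels).most_common(1)[0][0]

    def gain(f):
        # Information gain = parent entropy minus weighted child entropies
        g = entropy(labels)
        for v in set(r[f] for r in rows):
            subset = [l for r, l in zip(rows, labels) if r[f] == v]
            g -= len(subset) / len(labels) * entropy(subset)
        return g

    best = max(features, key=gain)
    branches = {}
    for v in set(r[best] for r in rows):
        sub_rows = [r for r in rows if r[best] == v]
        sub_labels = [l for r, l in zip(rows, labels) if r[best] == v]
        branches[v] = id3(sub_rows, sub_labels,
                          [f for f in features if f != best])
    return (best, branches)

def predict(tree, row):
    while isinstance(tree, tuple):
        feature, branches = tree
        tree = branches[row[feature]]
    return tree

# Toy example: outlook fully determines the label, so ID3 splits on it
rows = [
    {"outlook": "sunny", "windy": "no"},
    {"outlook": "sunny", "windy": "yes"},
    {"outlook": "rain", "windy": "no"},
    {"outlook": "rain", "windy": "yes"},
]
labels = ["yes", "yes", "no", "no"]
tree = id3(rows, labels, ["outlook", "windy"])
print(predict(tree, {"outlook": "sunny", "windy": "yes"}))
```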
This repo contains the source code for the MLH Localhost workshop "How to Collaborate on Code Projects with GitHub".
Implements the Naive Bayes algorithm for an NLP classification task
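A minimal sketch of Naive Bayes text classification; note this uses scikit-learn's MultinomialNB rather than a hand-rolled implementation, and the toy corpus and labels are invented for illustration:

```python
# Sketch: Naive Bayes for text classification with bag-of-words counts.
# Corpus and labels below are hypothetical examples.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [
    "great movie, loved it",
    "terrible plot, boring",
    "fantastic acting",
    "waste of time, awful",
]
labels = ["pos", "neg", "pos", "neg"]

# CountVectorizer turns text into word counts; MultinomialNB models
# per-class word frequencies with Laplace smoothing by default.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["loved the acting"])[0])  # → pos
```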
Config files for my GitHub profile.