
Pythonic Autograd

A Pythonic implementation of PyTorch's autograd.

Table of Contents

  • Introduction
  • Walkthrough
  • Autograd Implementation

Introduction

This repository contains a series of notebooks, each building an understanding of the PyTorch computation graph and working toward a Pythonic implementation of autograd.

Walkthrough

Notebook 1: Unsupervised Loss Implementation

This notebook tackles an unsupervised training problem, which breaks down as follows:

  • Problem: move an initially positioned point towards the centre of a distribution of randomly scattered points.
  • Goal: build an understanding of the core objective of gradient descent techniques: reaching the optimal minimum (see the sketch after this list).
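As a rough illustration of the idea (a minimal sketch under assumed settings, not the notebook's exact code), the point can be nudged towards the centre of the cloud by following a numerically estimated gradient of the mean distance:

```python
import torch

torch.manual_seed(0)

points = torch.rand(100, 2) * 10            # randomly scattered points
p = torch.tensor([9.0, 9.0])                # initial position of the movable point
lr, eps = 0.1, 1e-4                         # step size and finite-difference step

def loss(p):
    # mean Euclidean distance from p to every point in the cloud
    return torch.linalg.norm(points - p, dim=1).mean()

for _ in range(200):
    # numerical (finite-difference) gradient, one coordinate at a time
    grad = torch.tensor([
        (loss(p + torch.tensor([eps, 0.0])) - loss(p)) / eps,
        (loss(p + torch.tensor([0.0, eps])) - loss(p)) / eps,
    ])
    p = p - lr * grad                       # gradient-descent step

print(p)                                    # p ends up near the centre of the point cloud
```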

Notebook 2: Unsupervised Closed-Form Loss

Refining the loss from the previous notebook into a closed-form gradient expression, rather than a numerical (finite-difference) gradient, gives faster and more efficient computation.
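For example, assuming the loss is the mean Euclidean distance used above, its gradient has a closed form that avoids finite differences entirely (again a sketch, not necessarily the notebook's exact derivation):

```python
import torch

torch.manual_seed(0)
points = torch.rand(100, 2) * 10
p = torch.tensor([9.0, 9.0])
lr = 0.1

def grad_closed_form(p):
    # d/dp of mean_i ||p - x_i|| = mean_i (p - x_i) / ||p - x_i||
    diff = p - points                                     # (N, 2)
    dist = torch.linalg.norm(diff, dim=1, keepdim=True)   # (N, 1)
    return (diff / dist).mean(dim=0)                      # (2,)

for _ in range(200):
    p = p - lr * grad_closed_form(p)        # one analytic gradient step

print(p)
```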

Notebook 3: Different Gradient Descent Implementations

  • Implementation of full-batch gradient descent
  • Implementation of mini-batch gradient descent
  • Implementation of stochastic gradient descent (all three variants are sketched below)
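The three variants differ only in how many samples contribute to each gradient step. A rough sketch on an assumed toy linear-regression problem (the notebook's models and data will differ):

```python
import torch

torch.manual_seed(0)

# toy data: y = 3x + noise
X = torch.rand(256, 1)
y = 3 * X + 0.1 * torch.randn(256, 1)
lr = 0.1

def grad(w, xb, yb):
    # gradient of the mean squared error (1/n) * sum((x w - y)^2) w.r.t. w
    return 2 * xb.T @ (xb @ w - yb) / len(xb)

def train(batch_size, steps=500):
    w = torch.zeros(1, 1)
    for _ in range(steps):
        if batch_size >= len(X):                     # full-batch: use every sample
            idx = torch.arange(len(X))
        else:                                        # mini-batch / stochastic: random subset
            idx = torch.randint(0, len(X), (batch_size,))
        w = w - lr * grad(w, X[idx], y[idx])         # one gradient-descent update
    return w

print(train(len(X)))   # full-batch gradient descent
print(train(32))       # mini-batch gradient descent
print(train(1))        # stochastic gradient descent
```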

Notebook 4: Backpropagation Algorithm Explained

A simplified mathematical breakdown of the backpropagation algorithm, which serves as the fundamental engine behind autograd functionality.
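To give the flavour of the chain rule that backpropagation applies repeatedly (an illustrative example, not taken from the notebook), consider a tiny composite function with gradients derived by hand and then checked against PyTorch:

```python
import torch

# forward pass of f(x, w, b) = (w * x + b) ** 2 on plain Python scalars
x, w, b = 2.0, 3.0, 1.0
z = w * x + b            # z = 7
f = z ** 2               # f = 49

# backward pass: apply the chain rule from the output back to the inputs
df_dz = 2 * z            # d(z^2)/dz             = 14
df_dw = df_dz * x        # dz/dw = x, so 14 * 2  = 28
df_db = df_dz * 1        # dz/db = 1, so 14      = 14

# the same gradients via PyTorch autograd, for comparison
w_t = torch.tensor(3.0, requires_grad=True)
b_t = torch.tensor(1.0, requires_grad=True)
((w_t * 2.0 + b_t) ** 2).backward()
print(df_dw, df_db)                       # 28.0 14.0
print(w_t.grad.item(), b_t.grad.item())   # 28.0 14.0
```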

Notebook 5: PyTorch Under the Hood

An explanation of some of PyTorch's built-in tensor-related functions.
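For instance, these are the kinds of tensor attributes and built-ins involved in graph construction (assumed examples of what the notebook touches; the actual selection may differ):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x * 2).sum()

print(x.requires_grad)            # True: gradients will be tracked for this leaf tensor
print(x.is_leaf)                  # True: created by the user, not by an operation
print(y.grad_fn)                  # <SumBackward0 ...>: node that computes y's backward
print(y.grad_fn.next_functions)   # upstream nodes in the recorded graph

y.backward()                      # run autograd through the recorded graph
print(x.grad)                     # tensor([2., 2., 2.])
```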

Notebook 6: PyTorch vs. Numba

Creating the same function using both PyTorch and Numba, followed by a performance comparison between the two approaches.

Note: this notebook is computationally expensive to run and consumes a lot of disk space to store the comparison results.
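As a minimal illustration of such a comparison (an assumed example; the notebook's benchmarked function is likely more involved), the same computation can be written once with PyTorch tensor ops and once with a Numba-compiled loop:

```python
import time

import numpy as np
import torch
from numba import njit

def torch_sum_of_squares(x):
    # vectorised PyTorch implementation
    return (x * x).sum()

@njit
def numba_sum_of_squares(x):
    # explicit loop, JIT-compiled to machine code by Numba
    total = 0.0
    for i in range(x.shape[0]):
        total += x[i] * x[i]
    return total

x_np = np.random.rand(10_000_000)
x_t = torch.from_numpy(x_np)

numba_sum_of_squares(x_np[:10])     # warm-up call to trigger JIT compilation

t0 = time.perf_counter(); torch_sum_of_squares(x_t); t1 = time.perf_counter()
numba_sum_of_squares(x_np);         t2 = time.perf_counter()
print(f"PyTorch: {t1 - t0:.4f}s, Numba: {t2 - t1:.4f}s")
```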

Notebook 7: Backpropagation Details P1

A thorough walkthrough of implementing the backpropagation algorithm with raw Torch tensors, applied to the moons dataset.
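A condensed sketch of what such a manual forward/backward pass might look like for a one-hidden-layer network on the moons dataset (assumed structure; layer sizes, activations, and the use of scikit-learn's make_moons are illustrative, not the notebook's exact setup):

```python
import torch
from sklearn.datasets import make_moons

torch.manual_seed(0)
X_np, y_np = make_moons(n_samples=200, noise=0.1, random_state=0)
X = torch.tensor(X_np, dtype=torch.float32)                # (200, 2)
y = torch.tensor(y_np, dtype=torch.float32).view(-1, 1)    # (200, 1)

# parameters of a tiny 2-16-1 network, with no autograd tracking
W1 = torch.randn(2, 16) * 0.1; b1 = torch.zeros(16)
W2 = torch.randn(16, 1) * 0.1; b2 = torch.zeros(1)
lr = 0.5

for _ in range(500):
    # forward pass
    h = torch.tanh(X @ W1 + b1)                  # hidden activations
    p = torch.sigmoid(h @ W2 + b2)               # predicted probability
    loss = ((p - y) ** 2).mean()                 # mean squared error

    # backward pass, written out by hand with the chain rule
    dp = 2 * (p - y) / len(y)                    # dL/dp
    dz2 = dp * p * (1 - p)                       # through the sigmoid
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)                      # through the tanh
    dW1 = X.T @ dz1; db1 = dz1.sum(0)

    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(loss.item())
```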

Notebook 8: Backpropagation Details P2

Continuing from the previous notebook, this part digs deeper into the backpropagation algorithm. It walks through implementing the backward function, leading up to building and training a separate PyTorch model.
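One common way to hook a hand-written backward pass into PyTorch is a custom torch.autograd.Function; the sketch below (an assumed illustration, not necessarily the notebook's approach) does this for a squared-Euclidean-distance operation:

```python
import torch

class SquaredDistance(torch.autograd.Function):
    """Squared Euclidean distance between two points, with a hand-written backward."""

    @staticmethod
    def forward(ctx, a, b):
        diff = a - b
        ctx.save_for_backward(diff)          # stash what backward will need
        return (diff ** 2).sum()

    @staticmethod
    def backward(ctx, grad_output):
        (diff,) = ctx.saved_tensors
        # d/da sum((a-b)^2) = 2(a-b), d/db = -2(a-b), scaled by the upstream gradient
        return 2 * diff * grad_output, -2 * diff * grad_output

a = torch.tensor([1.0, 2.0], requires_grad=True)
b = torch.tensor([4.0, 6.0], requires_grad=True)
SquaredDistance.apply(a, b).backward()
print(a.grad, b.grad)                        # tensor([-6., -8.]) tensor([6., 8.])
```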

Notebook 9: Testing the Implemented Autograd

Assessing the implemented autograd feature using the Euclidean distance loss, comparing its results against both native PyTorch autograd and the previously developed loss and gradient calculation functions.
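A typical shape for such a check (an assumed sketch; the repository's own autograd object has its own API) is to compute the gradient of the Euclidean distance loss in more than one way and assert that the results agree:

```python
import torch

points = torch.rand(100, 2) * 10
p = torch.tensor([9.0, 9.0], requires_grad=True)

# 1) native PyTorch autograd
loss = torch.linalg.norm(points - p, dim=1).mean()
loss.backward()
autograd_grad = p.grad.clone()

# 2) the closed-form gradient derived earlier
diff = p.detach() - points
closed_form_grad = (diff / torch.linalg.norm(diff, dim=1, keepdim=True)).mean(dim=0)

# the implemented (custom) autograd would be the third reference point;
# here we just check that the two known answers agree
print(torch.allclose(autograd_grad, closed_form_grad, atol=1e-5))  # True
```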

Autograd Implementation
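The repository's own implementation is developed across the notebooks above. As a rough, self-contained sketch of the general idea (not the project's actual code), a Pythonic autograd can be built from a small value class that records the operations applied to it and replays them in reverse for the backward pass:

```python
class Value:
    """A scalar that records how it was produced, so gradients can flow back."""

    def __init__(self, data, parents=(), backward_fn=lambda: None):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = backward_fn

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad                # d(a+b)/da = 1
            other.grad += out.grad               # d(a+b)/db = 1
        out._backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # topologically order the graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for parent in v._parents:
                    visit(parent)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()

a, b = Value(2.0), Value(3.0)
c = a * b + a                                    # c = 8
c.backward()
print(a.grad, b.grad)                            # 4.0 (= b + 1), 2.0 (= a)
```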
