stat212b

Topics Course on Deep Learning for Spring 2016

UC Berkeley, Statistics Department

## Syllabus

1st part: Convolutional Neural Networks

  • Invariance, stability (see the numerical sketch after this list).
  • Variability models (deformation model, stochastic model).
  • Scattering.
  • Extensions.
  • Group Formalism.
  • Supervised Learning: classification.
  • Properties of CNN representations: invertibility, stability, invariance.
  • Covariance/invariance: capsules and related models.
  • Connections with other models: dictionary learning, LISTA, Random Forests.
  • Other tasks: localization, regression.
  • Embeddings (DrLim), inverse problems
  • Extensions to non-Euclidean domains.
  • Dynamical systems: RNNs and optimal control.
  • Guest Lecture: Wojciech Zaremba (OpenAI)
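
The invariance and stability properties listed above can be seen on a toy example. The following is a purely illustrative NumPy sketch, not taken from the course materials: it builds a single convolution / ReLU / average-pooling block (the filter, signal, and all names are made up for the example) and compares how much the pooled features move when the input is shifted by one sample.

```python
# Illustrative sketch (not from the course materials): one convolutional block
# -- convolution, ReLU, average pooling -- applied to a toy 1-D signal.
import numpy as np

def conv1d(x, w):
    """Valid 1-D convolution (correlation) of signal x with filter w."""
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(n - k + 1)])

def block(x, w, pool=4):
    """Conv -> ReLU -> non-overlapping average pooling."""
    h = np.maximum(conv1d(x, w), 0.0)        # ReLU nonlinearity
    h = h[: len(h) // pool * pool]           # trim to a multiple of the pool size
    return h.reshape(-1, pool).mean(axis=1)  # average pooling

rng = np.random.default_rng(0)
x = rng.standard_normal(64)                  # toy input signal
w = rng.standard_normal(5)                   # random filter (stand-in for a learned one)

y  = block(x, w)
y1 = block(np.roll(x, 1), w)                 # same signal shifted by one sample

# Compare how much the pooled features move under a one-sample shift.
print("input change  :", np.linalg.norm(np.roll(x, 1) - x))
print("feature change:", np.linalg.norm(y1 - y))
```

Averaging over pooling windows is what damps the effect of the small shift; stacking such blocks and pooling over larger regions pushes the representation toward local translation invariance.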

2nd part: Deep Unsupervised Learning

  • Autoencoders (standard, denoising, contractive, etc.)
  • Variational Autoencoders (see the ELBO sketch after this list)
  • Adversarial Generative Networks
  • Maximum Entropy Distributions
  • Open Problems
  • Guest Lecture: Ian Goodfellow (Google)
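
For the variational autoencoder topic above, the sketch below is an assumed, minimal formulation (not the lecturers' code): it writes out the two terms of the negative ELBO for a Gaussian encoder and a Bernoulli decoder, together with the reparameterization trick, on random placeholder arrays. Every variable name is hypothetical.

```python
# Illustrative sketch (not from the course materials): the two terms of the
# VAE objective (negative ELBO) for a Gaussian encoder and Bernoulli decoder.
import numpy as np

def gaussian_kl(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def bernoulli_nll(x, x_hat, eps=1e-7):
    """Reconstruction term: negative Bernoulli log-likelihood (cross-entropy)."""
    x_hat = np.clip(x_hat, eps, 1.0 - eps)
    return -np.sum(x * np.log(x_hat) + (1.0 - x) * np.log(1.0 - x_hat), axis=-1)

def reparameterize(mu, log_var, rng):
    """z = mu + sigma * eps, the reparameterization trick."""
    return mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=(8, 20)).astype(float)            # toy binary data
mu, log_var = rng.standard_normal((8, 4)), rng.standard_normal((8, 4))
z = reparameterize(mu, log_var, rng)                           # latent sample
x_hat = 1.0 / (1.0 + np.exp(-rng.standard_normal((8, 20))))    # stand-in decoder output

neg_elbo = bernoulli_nll(x, x_hat) + gaussian_kl(mu, log_var)
print("per-example -ELBO:", neg_elbo.round(2))
```

In an actual model, `mu`, `log_var`, and `x_hat` would be produced by encoder and decoder networks, and the sum of the two terms would be minimized by stochastic gradient descent.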

3rd part: Miscellaneous Topics

  • Non-convex optimization theory for deep networks
  • Stochastic Optimization (see the minibatch SGD sketch after this list)
  • Attention and Memory Models
  • Guest Lecture: Yann Dauphin (Facebook AI Research)
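
As a toy instance of the stochastic optimization topic above, here is a hedged NumPy sketch of minibatch SGD with heavy-ball momentum on an ordinary least-squares problem; the problem sizes, step size, and momentum value are arbitrary choices made for the example.

```python
# Illustrative sketch (not from the course materials): minibatch SGD with
# momentum on a least-squares objective.
import numpy as np

rng = np.random.default_rng(0)
n, d = 512, 10
A = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
b = A @ w_true + 0.1 * rng.standard_normal(n)      # noisy linear observations

w = np.zeros(d)
v = np.zeros(d)                                    # momentum buffer
lr, momentum, batch = 0.05, 0.9, 32

for step in range(200):
    idx = rng.choice(n, size=batch, replace=False)     # sample a minibatch
    grad = A[idx].T @ (A[idx] @ w - b[idx]) / batch    # stochastic gradient
    v = momentum * v + grad                            # heavy-ball update
    w = w - lr * v

print("distance to w_true:", np.linalg.norm(w - w_true))
```

Replacing the full gradient with the minibatch estimate `grad` is the only difference from plain gradient descent on this problem.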

Schedule

Lec1 Jan 19: Intro and Logistics

Lec2 Jan 21: Representations for Recognition: stability, variability. Kernel approaches / Feature extraction. Properties.

Lec3 Jan 26: Convnets / Scattering

Lec4 Jan 28: Scattering Properties

Lec5 Feb 2: Further Scattering

Lec6 Feb 4: Supervised Learning: classification. Properties of learnt representations.

Lec7 Feb 9: Properties of learnt representations. Covariance and invariance. Estimation properties: generalization error. Redundancy in parameter space.

Lec8 Feb 11: Connections with other models (dictionary learning, LISTA, Random Forests, CART)

Lec9 Feb 16: Representations of stationary processes. Properties.

Lec10 Feb 18: Other high level tasks: localization, regression, embedding, inverse problems.

Lec11 Feb 23: Extensions to non-Euclidean domains. Sequential data: RNNs.

Lec12 Feb 25: Guest Lecture ( W. Zaremba, OpenAI )

Lec13 Mar 1: Unsupervised Learning: autoencoders. Density estimation. Parzen estimators. Curse of dimensionality.

Lec14 Mar 3: Variational Autoencoders

Lec15 Mar 8: Adversarial Generative Networks

Lec16 Mar 10: Maximum Entropy Distributions

Lec17 Mar 29: Self-supervised models (analogies, video prediction, text, word2vec).

Lec18 Mar 31: Guest Lecture ( I. Goodfellow, Google Brain )

Lec19 Apr 5: Non-convex Optimization: parameter redundancy, spin glasses, optimality certificates, stability.

Lec20 Apr 7: Tensor Decompositions

Lec21 Apr 12: Stochastic Optimization, Batch Normalization, Dropout

Lec22 Apr 14: Reasoning, Attention and Memory: new trends and challenges in the field; limits of sequential representations (the need for attention and memory); modern enhancements (Neural Turing Machines, Memory Networks, Stack RNNs, etc.).
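
To make the attention-and-memory idea in this lecture concrete, here is a small illustrative sketch (not from the lecture slides) of a single soft, dot-product attention read over a memory matrix; the slot count, dimensions, and variable names are invented for the example.

```python
# Illustrative sketch (not from the course materials): one soft attention read
# over a memory matrix, the core operation behind attention/memory models.
import numpy as np

def softmax(s):
    s = s - s.max(axis=-1, keepdims=True)       # shift for numerical stability
    e = np.exp(s)
    return e / e.sum(axis=-1, keepdims=True)

def attend(query, keys, values):
    """Scaled dot-product attention: weights = softmax(q K^T / sqrt(d))."""
    scores = query @ keys.T / np.sqrt(keys.shape[-1])
    weights = softmax(scores)                    # soft addressing over memory slots
    return weights @ values, weights

rng = np.random.default_rng(0)
keys   = rng.standard_normal((6, 8))             # 6 memory slots, key dimension 8
values = rng.standard_normal((6, 16))            # content stored in each slot
query  = keys[2] + 0.1 * rng.standard_normal(8)  # query close to slot 2

read, weights = attend(query, keys, values)
print("attention weights:", weights.round(2))    # mass should concentrate on slot 2
```

Architectures such as Neural Turing Machines and Memory Networks build on reads of this kind (each with its own addressing scheme) so that a controller can store and retrieve information beyond what a fixed-size recurrent state holds.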

Lec23 Apr 19: Guest Lecture (Y. Dauphin, Facebook AI Research)

Lec24-26 Oral Presentations
