
DLPU

PyTorch model DLPU for phase unwrapping

This is a PyTorch implementation of the deep convolutional U-Net-like network described in article [1].

The original network was designed in the TensorFlow framework; this is its PyTorch version.

Usage

pip install -r example-requirements.txt

Changes

I've made the following changes to the structure:

  1. Replication padding mode in the conv3x3 blocks. Experiments have shown that it matters at the edges of the phase maps; otherwise the unwrapping quality near the borders is low.
  2. The article leaves some points unclear: the network structure consists of "five repeated uses of two 3×3 convolution operations (each followed by a BN and a ReLU), a residual block between the two convolution operations, ...". According to the article, each block should therefore be CONV3x3 -> BN -> ReLU -> Residual Block (?) -> CONV3x3 -> BN -> ReLU, which is ambiguous. In the contracting path (down) it is possible to make a "good" residual connection, as shown below, so I made residual connections only in the contracting path.

However, the authors write that the expansive path (up) has a similar structure, CONV3x3 -> BN -> ReLU -> Residual Block (?) -> CONV3x3 -> BN -> ReLU. There, a residual connection like the one in the contracting path is impossible (see the figure from the article): the first CONV3x3 reduces the number of channels by a factor of two and the second CONV3x3 reduces it by a factor of two again, so the channel counts don't match and such a connection makes no sense.

Nevertheless, I've tried the following residual connection, sketched in code below.
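A minimal PyTorch sketch of how I read the contracting-path block (replication padding plus a residual connection between the two convolutions). The class and argument names are illustrative, not the repository's actual code:

```python
import torch.nn as nn

class DownBlock(nn.Module):
    """Contracting-path block: CONV3x3 -> BN -> ReLU -> CONV3x3 -> BN -> ReLU,
    with a residual (identity) connection around the second convolution."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # Replication padding repeats edge values instead of padding with
        # zeros, which helps unwrapping quality at the phase-map borders.
        self.conv1 = nn.Conv2d(in_ch, out_ch, kernel_size=3,
                               padding=1, padding_mode='replicate')
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, kernel_size=3,
                               padding=1, padding_mode='replicate')
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.conv1(x)))
        # The residual addition is possible here because the input and
        # output of conv2 both have out_ch channels.
        return self.relu(x + self.bn2(self.conv2(x)))
```

In the expansive path this addition would fail, because the convolutions there change the channel count, which is exactly the ambiguity discussed above.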

Dataset

The dataset was generated synthetically according to articles [1, 2].

The data were generated using two methods (in equal proportions); a sketch of both follows the list:

  1. Interpolation of square matrices (with uniformly distributed elements) of various sizes (2x2 to 15x15) up to 256x256, multiplied by a random value so that the magnitude is between 0 and 22 rad
  2. Randomly generated Gaussians on a 256x256 field with a random number of functions, random means and standard deviations, multiplied by a random value so that the magnitude is between 2 and 20 rad
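A hedged sketch of both generation methods. Only the matrix sizes and magnitude ranges come from the description above; the interpolation mode, the number of Gaussians, and the width ranges are my own assumptions for illustration:

```python
import torch
import torch.nn.functional as F

def phase_from_interpolation(size=256):
    """Method 1: upsample a small uniform random matrix to size x size and
    rescale so the peak-to-peak magnitude lies in [0, 22] rad."""
    n = int(torch.randint(2, 16, (1,)))                  # 2x2 .. 15x15
    small = torch.rand(1, 1, n, n)
    big = F.interpolate(small, size=(size, size),
                        mode='bicubic', align_corners=True)[0, 0]
    big = (big - big.min()) / (big.max() - big.min() + 1e-12)
    return big * (22.0 * torch.rand(1))                  # magnitude in [0, 22] rad

def phase_from_gaussians(size=256, max_gaussians=10):
    """Method 2: sum a random number of 2-D Gaussians with random means and
    STDs, rescaled so the peak-to-peak magnitude lies in [2, 20] rad."""
    y, x = torch.meshgrid(torch.arange(size, dtype=torch.float32),
                          torch.arange(size, dtype=torch.float32),
                          indexing='ij')
    field = torch.zeros(size, size)
    for _ in range(int(torch.randint(1, max_gaussians + 1, (1,)))):
        mx, my = size * torch.rand(2)                    # random mean
        sx, sy = size / 16 + (size / 4) * torch.rand(2)  # random STD (assumed range)
        field += (2 * torch.rand(1) - 1) * torch.exp(
            -((x - mx) ** 2 / (2 * sx ** 2) + (y - my) ** 2 / (2 * sy ** 2)))
    field = (field - field.min()) / (field.max() - field.min() + 1e-12)
    return field * (2.0 + 18.0 * torch.rand(1))          # magnitude in [2, 20] rad

# The network input is then the wrapped version of the generated phase, e.g.
# wrapped = torch.angle(torch.exp(1j * true_phase))
```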

(Example 1 and Example 2: sample images of the generated data.)

Model

The model can be depicted as follows. The original paper has an unclear point: "residual block (see Ref. 20 for details) between the two convolution operations".
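Since the architecture diagram is an image in the original README, here is a minimal two-level encoder-decoder skeleton for orientation only. The depth, channel widths, and the omission of the residual blocks are simplifications of mine; the actual DLPU has five levels and 1824937 trainable parameters (see below):

```python
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch):
    # Two 3x3 convolutions, each followed by BN and ReLU, with the
    # replication padding discussed in the Changes section.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1, padding_mode='replicate'),
        nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1, padding_mode='replicate'),
        nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class MiniDLPU(nn.Module):
    # Two-level skeleton only: the real network has five levels and the
    # residual connections described in the Changes section.
    def __init__(self, ch=(1, 16, 32)):
        super().__init__()
        self.down1 = double_conv(ch[0], ch[1])
        self.down2 = double_conv(ch[1], ch[2])
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(ch[2], ch[1], kernel_size=2, stride=2)
        self.dec1 = double_conv(ch[1] * 2, ch[1])  # after skip concatenation
        self.head = nn.Conv2d(ch[1], 1, kernel_size=1)

    def forward(self, x):                       # x: (B, 1, 256, 256) wrapped phase
        d1 = self.down1(x)                      # (B, 16, 256, 256)
        d2 = self.down2(self.pool(d1))          # (B, 32, 128, 128)
        u1 = self.up(d2)                        # (B, 16, 256, 256)
        return self.head(self.dec1(torch.cat([u1, d1], dim=1)))
```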

Metrics

I've implemented the BEM (Binary Error Map) metric described in [3] with a 5% threshold, according to the following formula.

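A sketch of the metric as I read it (the exact definition, in particular what the 5% threshold is relative to, should be checked against [3]; here I assume it is 5% of the true phase range):

```latex
\[
  \mathrm{BEM}(i,j) =
  \begin{cases}
    1, & \bigl|\hat{\varphi}(i,j) - \varphi(i,j)\bigr| > t,\\
    0, & \text{otherwise},
  \end{cases}
  \qquad
  \text{error rate} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\mathrm{BEM}(i,j),
\]
% where \hat{\varphi} is the predicted unwrapped phase, \varphi the ground
% truth, and t = 0.05 (\max\varphi - \min\varphi) under the assumption above.
```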

Training info

In the original paper, the authors describe the training hyperparameters as follows:

loss: pixelwise MSE

optimizer: Adam

learning rate: 10e-3

My training observations: training with MSE converges ~10x faster than with MAE, and SGD with momentum=0.9 converges ~10x faster than Adam (in both cases the learning rate was 10e-4).

(!) Training successfully converged to a near-zero cost (0.025) with SGD, momentum=0.9, lr=0.0001 (really slow).
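For reference, a minimal training-loop sketch mirroring the configuration that worked above (pixel-wise MSE, SGD with momentum 0.9, lr=0.0001); the function and variable names are mine, not the repository's:

```python
import torch
import torch.nn as nn

def train(model, loader, epochs, device="cuda"):
    criterion = nn.MSELoss()                             # pixel-wise MSE
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
    model.to(device).train()
    for _ in range(epochs):
        for wrapped, true_phase in loader:               # (B, 1, 256, 256) tensors
            wrapped = wrapped.to(device)
            true_phase = true_phase.to(device)
            optimizer.zero_grad()
            loss = criterion(model(wrapped), true_phase)
            loss.backward()
            optimizer.step()
```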

Parameters counting

Total Trainable Params: 1824937
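A count like this can be reproduced with a generic PyTorch snippet (not necessarily the exact code used in this repo):

```python
def count_trainable_params(model):
    # Sum the element counts of all parameters that require gradients.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# count_trainable_params(model) is expected to return 1824937 for the full DLPU.
```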

TODOs

  1. code refactoring

References

  1. K. Wang, Y. Li, K. Qian, J. Di, and J. Zhao, "One-step robust deep learning phase unwrapping," Opt. Express 27, 15100–15115 (2019).
  2. G. E. Spoorthi et al., "PhaseNet 2.0: Phase unwrapping of noisy data based on deep learning approach," IEEE Trans. Image Process. 29, 4862–4872 (2020).
  3. Y. Qin, S. Wan, Y. Wan, J. Weng, W. Liu, and Q. Gong, "Direct and accurate phase unwrapping with deep neural network," Appl. Opt. 59(24), 7258–7267 (2020).
