deep_prior_interpolation

Unsupervised 3D Seismic Data Reconstruction Based On Deep Prior

This repository contains the code and examples of a data interpolation scheme that leverages the deep prior paradigm.

Authors

Fantong Kong1, Francesco Picetti2, Vincenzo Lipari2, Paolo Bestagini2, Xiaoming Tang1, and Stefano Tubaro2

1: School of Geosciences - China University of Petroleum (East), Qingdao, China
2: Dipartimento di Elettronica, Informazione e Bioingegneria - Politecnico di Milano, Italy

Abstract

Irregularity and coarse spatial sampling of seismic data strongly affect the performance of processing and imaging algorithms. Therefore, interpolation is a necessary pre-processing step in most processing workflows. In this work, we propose a seismic data interpolation method based on the deep prior paradigm: an ad-hoc Convolutional Neural Network is used as a prior to solve the interpolation inverse problem, avoiding any costly and prone-to-overfitting training stage. In particular, the proposed method leverages a multi-resolution U-Net with 3D convolution kernels, exploiting correlations in 3D seismic data at different scales in all directions. Numerical examples on different corrupted synthetic and field datasets show the effectiveness and promising features of the proposed approach.

The inverse problem is defined starting from the sampling equation:

y = M ⊙ x

where y is the observed (subsampled) data, M is the sampling (masking) operator, and x is the true, fully sampled model.

that is solved using the deep prior paradigm by

θ̂ = argmin_θ ‖ M ⊙ f_θ(z) − y ‖²

where f_θ is the CNN with weights θ and z is a fixed random input.

The estimate of the true model is then obtained as the output of the optimized network:

x̂ = f_θ̂(z)

The architecture we propose is the MultiResolution UNet:

MultiRes UNet
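To make the inverse-problem setup above concrete, here is a minimal NumPy sketch (not the repository's code; all names are illustrative) of the sampling equation and of the masked data-fit term that the deep prior scheme minimizes over the network weights:

```python
import numpy as np

# Illustrative sampling setup: M zeroes out whole missing traces.
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8, 8))    # true volume, dimensions (t, x, y)
trace_mask = rng.random((8, 8)) > 0.5  # decimate whole traces along time
M = np.broadcast_to(trace_mask, x.shape).astype(x.dtype)
y = M * x                              # observed, subsampled data

def data_fit(x_est, y, M):
    """Masked least-squares misfit ||M * x_est - y||^2: in the deep prior
    scheme, x_est is the network output f_theta(z) and this quantity is
    minimized over the network weights theta."""
    return float(np.sum((M * x_est - y) ** 2))

# The true model fits the observed traces exactly; a wrong model does not.
assert data_fit(x, y, M) == 0.0
assert data_fit(np.zeros_like(x), y, M) > 0.0
```

Note that the loss is evaluated only on the observed traces: the network is never told the missing samples, which is what makes the approach unsupervised.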

Setup and Code Organization

The code is mainly built on top of PyTorch. You can recreate our conda environment, named dpi (short for "deep prior interpolation"), through

conda env create -f environment.yml

Then, activate it with source activate dpi before running any example.
NOTE: if you have initialized conda through conda init, use conda activate dpi instead.

This python project is organized as follows:

  • main.py is the main script that actually performs the interpolation.
  • parameter.py contains the run options that main.py parses as shell arguments. Check it out!
  • architectures contains the PyTorch implementations of the networks and loss functions.
  • data.py contains data-management utilities, such as data patch extraction.
  • utils contains some general-purpose utilities.

Usage Examples

Here we report the example tests on the 3D hyperbolic data included in the paper.

# solve mask 1 saving the CNN weights

python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot1.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --savemodel --outdir TL/shot1
# solve mask 2 from scratch
python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot2.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --outdir TL/shot2_scratch
# solve mask 2 using as initial guess the CNN weights of mask 1
python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot2.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --outdir TL/shot2_transfer --net load --netdir TL/shot1/model.pth

Data preparation

We are glad you want to try our method on your data! To minimize the effort, keep in mind that:

  • The data dimensions are (t, x, y); the patch shape and stride (used during extraction) are defined in the same order. If your dataset is natively 2D, please add an extra axis.
  • If you process the data in a 2.5D fashion, the tensors are transposed so that the patches are tiled along the last dimension (treated as a channel). This procedure is automatic and should be reversed during patch assembly in data.reconstruct_patches.
  • The subsampling mask can be made of 0s and 1s; however, we prefer to store the "decimated" version of the data, with missing traces set to NaN. This removes the ambiguity between zeros in the data and zeros in the mask. Nonetheless, our code handles both conventions.
  • We study the behaviour of the network as a nonlinear prior for the decimated data. Therefore, we do not perform any preprocessing, apart from a scalar --gain applied to avoid numerical errors.
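The NaN convention above can be turned into a 0/1 mask with a few lines of NumPy. The helper below is a hypothetical illustration (not part of the repository), including a scalar gain like the --gain option:

```python
import numpy as np

# Hypothetical helper: derive a 0/1 mask from NaN-decimated data and
# return the zero-filled, gain-scaled volume.
def mask_from_nan(decimated, gain=1.0):
    mask = (~np.isnan(decimated)).astype(decimated.dtype)  # 1 = live trace
    filled = np.nan_to_num(decimated) * gain               # NaN -> 0, then scale
    return filled, mask

d = np.array([[1.0, np.nan],
              [2.0, 3.0]])
filled, mask = mask_from_nan(d, gain=40.0)
# mask is 1 on observed samples and 0 on missing ones;
# filled has the missing samples set to zero.
```

Storing the data this way keeps mask and data in a single file while remaining unambiguous, since genuine zero amplitudes are never confused with missing traces.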

Related Publications

  1. F. Kong, V. Lipari, F. Picetti, P. Bestagini, and S. Tubaro. "A Deep Prior Convolutional Autoencoder for Seismic Data Interpolation", in European Association of Geophysicists and Engineers (EAGE) Annual Meeting, 2020. DOI
  2. F. Kong, F. Picetti, V. Lipari, P. Bestagini, and S. Tubaro. "Deep prior-based seismic data interpolation via multi-res U-net", in Society of Exploration Geophysicists (SEG) Annual Meeting, 2020. DOI
  3. F. Kong, F. Picetti, V. Lipari, P. Bestagini, X. Tang, and S. Tubaro. "Deep Prior Based Unsupervised Reconstruction of Irregularly Sampled Seismic Data", in IEEE Geoscience and Remote Sensing Letters (GRSL), 2020. DOI
  4. F. Picetti, V. Lipari, P. Bestagini, and S. Tubaro. "Anti-Aliasing Add-On For Deep Prior Seismic Data Interpolation", in IEEE International Conference on Image Processing (ICIP), 2021. DOI
