sphere's People

Contributors: avigan, tirkarthi


sphere's Issues

Write complete documentation

Different levels of documentation:

  • documentation of each object/method
  • simple documentation for standard reduction
  • Jupyter notebook example reduction
  • first basic documentation on readthedocs
  • full detailed documentation on readthedocs

Improve cleaning of IRDIS images

Several possibilities:

  • implement a better bad pixel correction routine
  • estimate the bad pixels on the science data
  • mask values at the edge of the IRDIS FoV (see the sketch after this list)
  • add an additional step of bad pixel identification and correction?
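
A minimal sketch of the FoV-edge masking idea, assuming a circular field of view with a known centre and radius in pixels (the names and geometry here are hypothetical, not the final implementation):

import numpy as np

def mask_fov_edge(img, center, radius):
    # set pixels outside the assumed circular FoV to NaN so that edge
    # artifacts do not propagate into later processing steps
    y, x = np.indices(img.shape)
    r = np.hypot(y - center[0], x - center[1])
    out = img.copy()
    out[r > radius] = np.nan
    return out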

Add a method to combine and save the full science sequence

The method should handle:

  • PSF images
  • star center images
  • standard coronagraphic images
  • for each type, save parallactic angle values and other info
  • the files and frames information tables (pandas data frames)

Issue #6 must be solved first to provide a final, accurate wavelength solution.
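
A minimal sketch of the combine-and-save step, assuming cubes is a list of pre-processed science frames and parang the matching parallactic angles (all names here are hypothetical, not the final API):

from astropy.io import fits
import numpy as np

def combine_and_save(cubes, parang, filename):
    # stack the frames into a single cube and store the parallactic
    # angles in a dedicated extension alongside the data
    hdu_cube = fits.PrimaryHDU(np.stack(cubes))
    hdu_pa = fits.ImageHDU(np.asarray(parang), name='PARANG')
    fits.HDUList([hdu_cube, hdu_pa]).writeto(filename, overwrite=True)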

Inconsistent use of "ext" in IFU bad pixel correction

In sph_ifs_fix_badpix, there is some confusion over whether the region to be searched runs from -ext to +ext+1 or from -ext//2 to +ext//2+1. Most of the code uses the former, but the definitions of sub_low and sub_high use the latter, leading to inconsistent array lengths and an error.

Can be fixed by removing the "//2" on lines 217 and 218 of IFS.py, assuming that gives the desired definition.
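
A minimal illustration of the mismatch, using hypothetical names (the real code lives in sph_ifs_fix_badpix):

import numpy as np

img = np.arange(100.0).reshape(10, 10)
i, j, ext = 5, 5, 2

window = img[i-ext:i+ext+1, j-ext:j+ext+1]   # used by most of the routine
sub_low = img[i-ext//2:i+ext//2+1, j]        # definition before the fix
print(window.shape[0], sub_low.shape[0])     # 5 vs 3: inconsistent lengths

sub_low = img[i-ext:i+ext+1, j]              # after removing the "//2"
print(window.shape[0], sub_low.shape[0])     # 5 vs 5: consistent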

Add parallactic angle correction for data acquired before 2016-07-12

A correction is needed because of the derotator tracking issue identified in May/June 2016, which is documented in the user manual.

Correction factor (IDL):

;; derotator drift correction
alt_beg = sxpar_eso(hdr,'HIERARCH ESO TEL ALT')
drot2_beg = sxpar_eso(hdr,'HIERARCH ESO INS4 DROT2 BEGIN')
if jul_out lt date_conv('2016-07-12','J') then begin
    corr = atan(tan((alt_beg-2.*drot2_beg)*!pi/180.))*180./!pi
endif else begin
    corr = 0
endelse

The correction factor needs to be added to the parallactic angle value.
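
For reference, a hedged Python equivalent of the IDL snippet above (header-keyword handling omitted; alt and drot2 are the TEL ALT and INS4 DROT2 BEGIN values in degrees):

import numpy as np

def derotator_drift_correction(alt, drot2, before_2016_07_12):
    # correction in degrees; only applies to data taken before 2016-07-12
    if before_2016_07_12:
        return np.degrees(np.arctan(np.tan(np.radians(alt - 2.0*drot2))))
    return 0.0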

Add check on final image sizes

To make sure that the user does not request images larger than the physical size of the data. Applies to all types of data.
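
A minimal sketch of the proposed check, with hypothetical names (dim is the requested output size in pixels, data the science array):

def check_dim(dim, data):
    # reject output sizes larger than the physical size of the data
    max_dim = min(data.shape[-2:])
    if dim > max_dim:
        raise ValueError(f'requested size {dim} px exceeds data size {max_dim} px')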

Implementation of the IRDIS/LSS mode support

  • object definition
  • basic methods to sort files & frames
  • creation of calibrations
  • pre-processing of science frames
  • processing of science frames
    • star center
    • wavelength recalibration
    • combining all data
  • cleaning
  • small refinements specific to LSS
    • handling of neutral density in data combination
    • wavelength recalibration in LRS
    • chromatism in MRS

Bad pixel interpolation fails if too few good pixels within ddmax

vltpf.utils.imutils.fix_badpix throws an error if too few good pixels (i.e. fewer than npix) are found, specifically on line 1039 when it tries to index the first npix pixels. This can occur at the corners of IFS flat frames, where (0, 0) is more than ddmax (= 100) pixels from any good, illuminated pixel, so the array good_pix is empty.

One possible solution is simply to increase ddmax, with ~170 needed to avoid problems on IFS frames. Alternatively, a simple replacement value could be used in such cases: if a pixel is more than 100 px outside the illuminated area, its value really doesn't matter.

Edit: the value does matter slightly, actually, as the flat frame is normalised again after the step where the error occurred. Maybe fill with the frame mean, or fill with np.nan and use np.nanmedian to normalise, perhaps replacing NaNs with a nicer value later?
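
A minimal sketch of the NaN-fill fallback, with hypothetical names (good_found flags pixels for which a good neighbour was found within ddmax):

import numpy as np

def fill_isolated_badpix(img, good_found):
    # pixels with no good neighbour within ddmax are set to NaN instead
    # of raising an error
    out = img.copy()
    out[~good_found] = np.nan
    return out

flat = np.random.rand(64, 64)
good = np.ones_like(flat, dtype=bool)
good[:2, :2] = False                 # corner far from any illuminated pixel
flat = fill_isolated_badpix(flat, good)
flat /= np.nanmedian(flat)           # NaN-aware normalisation
flat[np.isnan(flat)] = 1.0           # replace NaNs with a neutral value later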

Possible optimisations for sigma_filter

Two possible optimisations for sigma_filter in imutils.py:

  1. scipy's uniform_filter() is about 4x faster than astropy's Box2DKernel and convolve(). Using uniform_filter's "constant" mode (i.e. zero padding), the two agree to ~1e-13. The replacement would be as follows:
# Current
    box2 = box**2

    kernel = Box2DKernel(box)
    img_clip = (convolve(img, kernel)*box2 - img) / (box2-1)

    imdev = (img - img_clip)**2
    fact = nsigma**2 / (box2-2)
    imvar = fact*(convolve(imdev, kernel)*box2 - imdev)

# Replacement
    from scipy.ndimage import uniform_filter

    box2 = box**2

    img_clip = (uniform_filter(img, box, mode='constant')*box2 - img) / (box2-1)

    imdev = (img - img_clip)**2
    fact = nsigma**2 / (box2-2)
    imvar = fact*(uniform_filter(imdev, box, mode='constant')*box2 - imdev)
  2. I think it's safe to assume that if one iteration of sigma clipping does not clip any values, then no future iteration ever will, because each iteration passes the same, unchanged input to the next. I've found this typically cuts the number of iterations in half. One can check whether any pixels were changed with:
    nchange = img.size - nok
    if iterate:
        _iters += 1
        if (_iters >= max_iter) or (nchange == 0):
            # return...

Implementation of the IRDIS/DPI mode support

  • object definition
  • basic methods to sort files & frames
  • creation of calibrations
  • pre-processing of science frames
  • processing of science frames
  • cleaning
  • small refinements specific to DPI

Add method for recalibration of wavelength

Similar to the version in the SPHERE-legacy IDL pipeline.

Necessary steps:

  • extract DRH-calibrated wavelength
  • add high-pass spatial filtering option
  • find star center files
    • if no OBJECT,CENTER found, need to save the DRH wavelength
  • find the centers to compute the rescaling factor
  • extract the mean flux from the wave cal file processed as science
  • fit the new wavelength law (see the sketch after this list)
  • save all products
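
A minimal sketch of the final fitting step, with hypothetical placeholder inputs (wave_drh is the DRH-calibrated wavelength per channel, scale the rescaling factor measured from the star-center frames):

import numpy as np

nlambda = 39                                    # IFS-like number of channels
wave_drh = np.linspace(0.95, 1.35, nlambda)     # microns, placeholder grid
scale = 1.0 + 1e-3*np.random.randn(nlambda)     # placeholder measurements

# fit a low-order polynomial wavelength law to the rescaled solution
coeffs = np.polyfit(np.arange(nlambda), wave_drh*scale, deg=2)
wave_new = np.polyval(coeffs, np.arange(nlambda))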

Add support for SPARTA data

Implement a method to extract the information available in the SPARTA files downloaded with the science data from the archive.

Find a way to handle errors when doing multiple reductions

The current implementation of SPHERE.Dataset will stop when an error occurs in one of the reductions:

  • could this be avoided?
  • do we need to implement a success flag in the reductions? (see the sketch after this list)
  • if yes, at which point is a reduction considered a success?
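
A minimal sketch of per-reduction error handling with a success flag, assuming the dataset holds a list of reduction objects with a process() method (names are hypothetical, not the actual SPHERE.Dataset API):

def run_all(reductions):
    failed = []
    for red in reductions:
        try:
            red.process()
            red.success = True         # reduction completed without error
        except Exception as exc:
            red.success = False
            failed.append((red, exc))  # record the failure and continue
    return failed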
