jmccormac01 / donuts
Donuts AG & image alignment code
This can be a shortcut for those who do not want to deal with the multiple lines of getting the shifts. For example instead of:
shifts = d.compute_offset(ref)
# do something with shifts.x and shifts.y
it could be:
x, y = d.compute_offset(ref).shifts
or even:
x, y = d.compute_shifts(ref)
# which calls compute_offset(ref).shifts under the hood
TODO: better naming
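A minimal sketch of what the proposed wrapper could look like (class and method names are taken from the snippets above, but the offset object is mocked here):

```python
from collections import namedtuple

# Mocked return type standing in for the real offset result
Offset = namedtuple("Offset", ["shifts"])

class Donuts(object):
    def compute_offset(self, ref):
        # placeholder: the real method cross-correlates against the reference
        return Offset(shifts=(1.5, -0.5))

    def compute_shifts(self, ref):
        # proposed shortcut: delegate to compute_offset and unpack in one call
        return self.compute_offset(ref).shifts

x, y = Donuts().compute_shifts("reference.fits")
print(x, y)  # 1.5 -0.5
```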
Hi,
The problems with conda last week highlighted that Continuum does not provide builds for a number of astropy/python/numpy combinations that are commonly used for testing in affiliated packages (and in the matrix in the package template).
As of earlier today, these astropy builds are now available on the conda channel astropy-ci-extras (most affiliated packages are probably already using the channel since it is in the template and ci-helpers):
python 2.7:
- Builds of the latest astropy LTS (1.0.10) for numpy 1.7 through 1.11
- Builds of the latest astropy release (1.2) for numpy 1.7 through 1.10 (continuum provides the numpy 1.11 build)
python 3.5:
- Builds of astropy LTS (1.0.10) for the versions of numpy provided by continuum (1.9, 1.10, 1.11)
- Builds of astropy stable (1.2) for numpy 1.9 and 1.10 (continuum provides the numpy 1.11 build).
To get a better handle on what other builds would be useful for package developers, please take a few minutes to fill out this survey if you work on an affiliated package or use the astropy-ci-extras channel as part of your continuous integration: http://goo.gl/forms/qJvQiVt0ynTfOCq22
Thanks,
Matt Craig
PS: It is a good sign of the state of the astropy community that in the ~20 hours since some of these builds became available they've been downloaded for CI tests 171 times!
It seems like all the builds are failing with a pytest version issue. I've run some of them locally and they seem to work fine with my environment. See below.
PythonScripts/astropy/donuts master ✔ 10m
▶ python setup.py test --coverage
Freezing version number to donuts/version.py
running test
running build
running build_py
copying donuts/donuts.py -> build/lib.macosx-10.10-x86_64-2.7/donuts
copying donuts/version.py -> build/lib.macosx-10.10-x86_64-2.7/donuts
copying donuts/tests/synthetic_data.py -> build/lib.macosx-10.10-x86_64-2.7/donuts/tests
copying donuts/tests/test_donuts_with_synthetic_data.py -> build/lib.macosx-10.10-x86_64-2.7/donuts/tests
copying donuts/tests/test_integration.py -> build/lib.macosx-10.10-x86_64-2.7/donuts/tests
copying donuts/tests/test_synthetic_data.py -> build/lib.macosx-10.10-x86_64-2.7/donuts/tests
creating build/lib.macosx-10.10-x86_64-2.7/donuts/data
copying donuts/data/IMAGE80520160114005507.fits -> build/lib.macosx-10.10-x86_64-2.7/donuts/data
copying donuts/data/IMAGE80520160114005520.fits -> build/lib.macosx-10.10-x86_64-2.7/donuts/data
copying donuts/data/IMAGE80520160114005533.fits -> build/lib.macosx-10.10-x86_64-2.7/donuts/data
======================================= test session starts ========================================
platform darwin -- Python 2.7.8 -- py-1.4.30 -- pytest-2.7.3
rootdir: /private/var/folders/dq/_bt7cxt952n_66n5ts2dc6340000gn/T/donuts-test-qbZnvp, inifile: setup.cfg
Running tests with Astropy version 1.1.2.
Running tests in lib.macosx-10.10-x86_64-2.7/donuts docs.
Date: 2016-03-28T15:15:30
Platform: Darwin-14.5.0-x86_64-i386-64bit
Executable: /usr/local/opt/python/bin/python2.7
Full Python Version:
2.7.8 (default, Oct 19 2014, 16:02:00)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.54)]
encodings: sys: ascii, locale: UTF-8, filesystem: utf-8, unicode bits: 15
byteorder: little
float info: dig: 15, mant_dig: 15
Numpy: 1.10.4
Scipy: 0.17.0
Matplotlib: 1.5.1
h5py: not available
Pandas: 0.16.2
collected 14 items
donuts/tests/test_donuts_with_synthetic_data.py .....
donuts/tests/test_integration.py .
donuts/tests/test_synthetic_data.py ........
==================================== 14 passed in 5.40 seconds =====================================
Saving coverage data in .coverage...
Saving HTML coverage report in htmlcov...
We've just had an email about an update to the package template:
I have now tagged a new 'release' of the package-template - I called
it v1.1 since the last one was v1.0 (the numbers don't have to be in
sync with the astropy package). I have written up a list of changes
here: https://github.com/astropy/package-template/blob/master/TEMPLATE_CHANGES.md#11-2016-06-01
In particular, note the summary at the bottom of the v1.1 section,
which will help you navigate the changes.
An example of a pull request to a package to update to the latest
version is here: https://github.com/astrofrog/reproject/pull/99
I would be happy to review any pull requests that carry out this
update, so please just ping me, either on github, or reply to this
email. I'm not going to update any other packages for now, so anyone
can feel free to prepare pull requests for other packages.
None of the changes are critical, except for the changes to
MANIFEST.in, which will make sure that you don't end up with temporary
files inside astropy-helpers when you release your package.
For reasons outlined in #2, we could synthesise test data.
We could use the package towncrier
where we document the issues/features we add based on their github issue id, and it automatically classifies the changes and generates a changelog.
Alternatively, we could keep a changelog ourselves, but this is tricky!
The docs are a little out of date; update them for the new changes in v0.1.0.
We are missing a test for the correction for odd-shaped CCD arrays. A NITES-like CCD image would hit this section (e.g. 1057x1030). The if statement in:
rx = self.dx % 16
ry = self.dy % 16
if rx != 0 or ry != 0:
self.dimx = int((self.dx / self.base) * self.base)
self.dimy = int((self.dy / self.base) * self.base)
else:
self.dimx = self.dx
self.dimy = self.dy
is not tested.
We are also missing several checks on the location of phi_max for the X and Y directions in measure_shift(). I can write some quick tests to cover this if I get a minute in the morning.
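A quick sketch of such a test, assuming a standalone helper that mirrors the quoted if-statement (the helper name is hypothetical; the real logic lives on the class):

```python
def trim_dims(dx, dy, base=16):
    # hypothetical helper mirroring the quoted logic: round each
    # dimension down to the nearest multiple of base
    if dx % base != 0 or dy % base != 0:
        return (dx // base) * base, (dy // base) * base
    return dx, dy

# a NITES-like odd-shaped CCD hits the trimming branch
assert trim_dims(1057, 1030) == (1056, 1024)
# a power-of-two chip passes through untouched
assert trim_dims(1024, 1024) == (1024, 1024)
```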
To whom it may concern,
If you are using https://github.com/astropy/ci-helpers in your appveyor.yml, please know that the Astropy project has dropped active development/support for Appveyor CI. If it still works, good for you, because we have not removed the relevant files (yet). But if it ever stops working, we have no plans to fix anything for Appveyor CI. Please consider using the native Windows support on another CI service, e.g., Travis CI (see https://docs.travis-ci.com/user/reference/windows/). We apologize for any inconvenience caused.
If this issue is opened in error or irrelevant to you, feel free to close. Thank you.
I thought we had to manually put versions on PyPI, but they are being uploaded automatically. Also, dev160 is failing to import, complaining about builtins:
>>> import donuts
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/donuts/__init__.py", line 10, in <module>
from ._astropy_init import *
File "/usr/local/lib/python2.7/site-packages/donuts/_astropy_init.py", line 13, in <module>
import builtins as builtins
ImportError: No module named builtins
>>> from donuts import Donuts
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/donuts/__init__.py", line 10, in <module>
from ._astropy_init import *
File "/usr/local/lib/python2.7/site-packages/donuts/_astropy_init.py", line 13, in <module>
import builtins as builtins
ImportError: No module named builtins
>>>
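The traceback shows the Python 3-only builtins module being imported on Python 2. A common compatibility guard (a sketch of the general pattern, not necessarily what _astropy_init.py should do) is:

```python
# Python 3 provides `builtins`; Python 2 names the same module `__builtin__`
try:
    import builtins
except ImportError:  # Python 2 fallback
    import __builtin__ as builtins

# the alias now works on either interpreter
builtins.print("ok")
```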
readthedocs are changing their hosting addresses to *.readthedocs.io instead of *.readthedocs.org:
Hello!
Starting today, Read the Docs will start hosting projects from subdomains on the domain readthedocs.io, instead of on readthedocs.org. This change addresses some security concerns around site cookies while hosting user generated data on the same domain as our dashboard. Changes to provide security against broader threats have been in place for a while, however there are still a few scenarios that can only be addressed by migrating to a separate domain.
We implemented session hijacking detection and took precautions to limit cookie usage, but there are still a number of scenarios utilizing XSS and CSRF attacks that we aren't able to protect against while hosting documentation from subdomains on the readthedocs.org domain. Moving documentation hosting to a separate domain will provide more complete isolation between the two user interfaces.
Projects will automatically be redirected, and this redirect will remain in place for the foreseeable future. Still, you should plan on updating links to your documentation after the new domain goes live.
We should change any urls pointing to our documentation to use the new domain.
The nearest mode will be deprecated at some point. Change it to edge:
/Users/James/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/skimage/_shared/utils.py:174: skimage_deprecation: Mode 'nearest' has been renamed to 'edge'. Mode 'nearest' will be removed in a future release.
"Mode 'nearest' has been renamed to 'edge'. Mode 'nearest' will be "
Currently a common process is applied to the reference image and to each "test" image. It would be great to extract this into a method/function/class, per DRY.
Paul Chote spotted that ndimage zoom doesn't quite get the resampling back to the full CCD size right and we end up with a slightly deformed image (although it is the right shape).
I found that we might be able to use skimage.transform.resize instead. I tested it very briefly tonight; it is faster than ndimage.zoom and it does not seem to skew the image. I need to do some more thorough testing in the devel version.
Example:
bkgmap = ndimage.zoom(coarse, (tilesizey, tilesizex), order=2)
becomes
bkgmap = resize(coarse, (tilesizey * tile_num, tilesizex * tile_num), mode='nearest')
coarse is created in the same way as before:
coarse = np.empty((tile_num, tile_num))
for i in range(0, tile_num):
for j in range(0, tile_num):
coarse[i][j] = np.median(data[(i * tilesizey):(i + 1) * tilesizey,
(j * tilesizex):(j + 1) * tilesizex])
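For reference, a self-contained sketch of the coarse median-map construction above (toy data; tile counts chosen for illustration):

```python
import numpy as np

# toy frame: 8x8 pixels split into 4x4 tiles of 2x2 pixels each
data = np.arange(64, dtype=float).reshape(8, 8)
tile_num = 4
tilesizey = data.shape[0] // tile_num
tilesizex = data.shape[1] // tile_num

coarse = np.empty((tile_num, tile_num))
for i in range(tile_num):
    for j in range(tile_num):
        # the median of each tile becomes one coarse background pixel
        coarse[i, j] = np.median(data[i * tilesizey:(i + 1) * tilesizey,
                                      j * tilesizex:(j + 1) * tilesizex])

print(coarse.shape)  # (4, 4)
```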
I've started this already but I'm not 100% sure about what to do when we catch an IOError etc.
Most of this functionality can be done through numpy or is relatively easy to implement.
The tests, however, do require scipy, but that's only for developers anyway.
I am trying to use donuts, but when I create my Donuts object I get a ZeroDivisionError. I am using the sample code:
d = Donuts(refimage=DIR + reference_image_name, image_ext=0,
           overscan_width=20, prescan_width=20, border=64,
           normalise=True, exposure='EXPOSURE',
           subtract_bkg=True, ntiles=32)
with the attached fits file (just remove the .txt only used for uploading here)
elp1m008-fl05-20160424-0173-e90_cropped.fits.txt
Currently quite a lot of variables are saved as instance state (with self.). If, for example, we want to make the shift computation thread-safe, we cannot use instance variables (or at least write to them; reading is fine, e.g. self.texp). We should really consider which variables need to be saved on the instance and which don't.
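A sketch of the direction this points in: keep per-call state in local variables and return a result object instead of writing to self (names here are illustrative, not the current API):

```python
from collections import namedtuple

Shift = namedtuple("Shift", ["x", "y"])

class Donuts(object):
    def __init__(self, texp):
        # read-only configuration is safe to share between threads
        self.texp = texp

    def measure_shift(self, image):
        # everything per-call stays in local variables; nothing is
        # written back to self, so concurrent calls cannot race
        x, y = 1.0, -2.0  # placeholder for the real cross-correlation
        return Shift(x, y)

result = Donuts(texp=30).measure_shift(None)
print(result.x, result.y)  # 1.0 -2.0
```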
Images taken with a DSLR or similar will unpack into a 3 x X x Y cube when the images are loaded. Donuts currently does not handle this correctly.
Either we can stack these three array elements, or guide on one of them?
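A sketch of the two options in numpy, using a toy (3, Y, X) cube:

```python
import numpy as np

# toy DSLR-style cube: 3 colour channels of a Y x X frame
cube = np.ones((3, 120, 160))

# option 1: stack (sum) the three channels into one guiding frame
stacked = cube.sum(axis=0)

# option 2: guide on a single channel (e.g. the green channel)
green = cube[1]

print(stacked.shape, green.shape)  # (120, 160) (120, 160)
```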
When running donuts with 1024x1024 images and overscan_width=20, prescan_width=20, a shape mismatch happens during subtraction:
self.backsub_region = self.raw_region - self.sky_background.
Swapping assignment order to:
dim_y, dim_x = self.raw_region.shape
fixes the problem.
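A minimal illustration of why the unpacking order matters: numpy shapes are (rows, cols), i.e. (y, x), so unpacking as (x, y) silently swaps the dimensions for any non-square region:

```python
import numpy as np

# e.g. a 1024x1024 frame after trimming 20+20 overscan/prescan columns
region = np.zeros((1024, 984))

dim_y, dim_x = region.shape  # correct: shape is (rows, cols) == (y, x)
assert (dim_y, dim_x) == (1024, 984)

# unpacking as (x, y) swaps the dimensions, producing the shape
# mismatch seen in the background subtraction
dim_x_wrong, dim_y_wrong = region.shape
assert (dim_x_wrong, dim_y_wrong) != (984, 1024)
```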
for image in science_image_names:
... x, y = d.measure_shift(checkimage=image)
... print(x, y)
to something like:
for image in science_image_names:
... res = d.measure_shift(checkimage_filename=image)
... print(res.x, res.y)
We expect the code to run on windows, so we should test on windows.
Currently there is no package at https://pypi.python.org/pypi/donuts, which - given how common the name is - is rather surprising! It might be worth registering it fairly soon.
We're adding some new features; update the docs to give examples.
Currently the package distributions weigh in at around 15MB each, which is quite a lot for a simple python script.
Astropy packages have a convention for hosting external datasets which may be useful for us as it will allow us to create a much smaller package which is less of a download for our users.
The other alternative is to not package the tests along with the package, which is fine for end users; developers/contributors will be cloning the git repo anyway.
All the astropy stuff seems superfluous now given we're not proceeding with being an affiliated package. Remove it and repackage the code as a simple python package.
It may be useful for users to have access to:
for both the reference image and each test image.
We could split out the process raw image -> corrected image -> projections into another class, e.g. an Image class. The Donuts.__init__ will store one of these objects as self.reference_image, and measure_shift will return a new one of these to the user. These objects contain the above properties.
Example possible design:
import fitsio  # third-party FITS reader assumed by this sketch

class Image(object):
    @classmethod
    def from_file(cls, filename):
        '''Extract the required information from a file on disk
        '''
        with fitsio.FITS(filename) as infile:
            self = cls(data=infile[0].read(), header=infile[0].read_header())
        self.filename = filename
        return self

    def __init__(self, data, header):
        self.raw_image = data
        self.header = header
        self.x = None
        self.y = None
        self.proj_x = None
        self.proj_y = None

    def compute_shifts_from(self, other):
        self.compute_projections()
        # Compute cross correlations
        self.x, self.y = 15, -20
        # Convention for pure mutation methods to return self
        # for daisy chaining
        return self

    def compute_projections(self):
        self.proj_x = 15
        self.proj_y = -20
        # Convention for pure mutation methods to return self
        # for daisy chaining
        return self


class Donuts(object):
    def __init__(self, reference_filename):
        self.reference_image = Image.from_file(
            reference_filename).compute_projections()

    def measure_shift(self, filename):
        return Image.from_file(
            filename).compute_shifts_from(self.reference_image)


# End user API remains the same
reference_image = 'reference.fits'
files = ['a.fits', 'b.fits', 'c.fits']

donuts = Donuts(reference_image)

# Access intermediate products of the reference image
background_image = donuts.reference_image.sky_background

for filename in files:
    # measure the shift (nearly) as before
    result = donuts.measure_shift(filename)
    x, y = result.x, result.y
    # get properties of the test image as well
    per_image_background = result.sky_background
Documentation builds are failing. Figure out what is wrong and update the build procedure.
If we cannot download a fits file from the default server, we want to be able to try a fallback.
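One possible shape for this, sketched with pluggable fetcher callables so the fallback logic is testable without a network (all names here are hypothetical):

```python
def fetch_with_fallback(filename, fetchers):
    """Try each fetcher in turn; raise only if every source fails."""
    errors = []
    for fetch in fetchers:
        try:
            return fetch(filename)
        except IOError as exc:
            errors.append(exc)
    raise IOError('all sources failed for %s: %r' % (filename, errors))

def primary(name):
    # stand-in for the default server being unreachable
    raise IOError('primary server down')

def mirror(name):
    # stand-in for a successful download from the fallback server
    return b'FITS bytes for ' + name.encode()

print(fetch_with_fallback('image.fits', [primary, mirror]))
```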
Just tested your example script and got the following error:
Traceback (most recent call last):
File ".\test.py", line 13, in <module>
ntiles=32)
File "C:\Users\totara\AppData\Local\Programs\Python\Python37-32\lib\site-packages\donuts\donuts.py", line 72, in __init__
self.reference_image = self.construct_object(self.refimage_filename)
File "C:\Users\totara\AppData\Local\Programs\Python\Python37-32\lib\site-packages\donuts\donuts.py", line 102, in construct_object
border=self.border
File "C:\Users\totara\AppData\Local\Programs\Python\Python37-32\lib\site-packages\donuts\image.py", line 120, in trim
dy, dx = image_section.shape
ValueError: too many values to unpack (expected 2)
Not too sure where it comes from, but the image_section.shape value for this image is:
(3, 579, 765)
Frames acquired by the Warwick 1m telescope can have arbitrary sizes and plate scales depending on the CCD windowing and binning.
The current hardcoded background tile size and trimming could be replaced by command-line arguments for the [x1:x2,y1:y2] guide window and (minimum) background tile size.
See warwick-one-metre/pixelshift/framedata.c#L119 for a reference implementation.
This is a CI site which checks for code quality (using pylint etc.). We could add integration pretty easily and fix the problems...
We are using a submodule astropy_helpers which came as part of the astropy package template. Do we have a policy on updating this repository? We could:
Already there have been many changes which will need reviewing.
CI is showing as failing. Update this and also check that test cases cover new code.
Do we want to package NGTS data along with the code, even in the tests? We may run into licensing/proprietary period related questions...
I have the idea of synthesising test data ourselves (e.g. making two identical images, where the deltas should be 0, and making offset images and asserting the deltas).
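A sketch of that idea: synthesise a shifted copy of a random reference and assert the recovered deltas (pure numpy; the 1-D projection cross-correlation below only stands in for the real donuts algorithm):

```python
import numpy as np

rng = np.random.default_rng(42)
ref = rng.random((128, 128))

# synthesise a test image circularly shifted by a known amount
shift_x, shift_y = 5, -3
test = np.roll(np.roll(ref, shift_y, axis=0), shift_x, axis=1)

def measure_1d_shift(a, b):
    # peak of the circular cross-correlation gives the lag of a vs b
    corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
    lag = int(np.argmax(corr))
    return lag - len(a) if lag > len(a) // 2 else lag

dx = measure_1d_shift(test.sum(axis=0), ref.sum(axis=0))
dy = measure_1d_shift(test.sum(axis=1), ref.sum(axis=1))

# identical images give zero; shifted images give back the known deltas
assert measure_1d_shift(ref.sum(axis=0), ref.sum(axis=0)) == 0
assert (dx, dy) == (shift_x, shift_y)
```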
This script currently checks that Donuts runs on real-world data but it doesn't check for the right answer. Assert the known shifts from Cyclic in this test. Generate a similar test using odd-shaped CCD data, addressing #9.
Clip out bad parts of the image from the correction.
Currently the doctests do not run because the data files used in the documentation do not exist. Try to get the doctests to see the test data.
Announcement: https://blog.travis-ci.com/2016-12-06-the-crons-are-here