kirxkirx / vast
Variability Search Toolkit (VaST)
Home Page: http://scan.sai.msu.ru/vast/
License: GNU General Public License v3.0
For some star fields I have 20k+ FITS files (many years of observations).
When I pass these as *.fits to vast, Linux complains that the expanded argument list is too long (the glob expansion exceeds the kernel's argument-size limit).
There seems to be no easy workaround for this.
Would it be possible to pass the list of FITS files in, e.g., a .txt file to circumvent this limitation?
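For context, a quick way to see the limit being hit and to estimate the size of the expanded glob (the per-path byte count below is a made-up illustrative figure, not a measurement):

```shell
# getconf reports the kernel's limit on the total size of argv + environment
ARG_MAX=$(getconf ARG_MAX)
echo "ARG_MAX: $ARG_MAX bytes"

# Rough estimate of the expanded glob size for 20k FITS files
# (40 bytes per path is an invented, illustrative figure)
NFILES=20000
BYTES_PER_PATH=40
TOTAL=$((NFILES * BYTES_PER_PATH))
echo "Estimated glob size: $TOTAL bytes"
```

If the estimate approaches ARG_MAX, the shell cannot hand all paths to a single vast invocation, which is why a file-based list would help.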
thx
Mike
Hi,
I am wondering if anyone can help me. Is it possible to use VaST to find exoplanets? I have used VaST to produce the light curve of an exoplanet transit, and the results compare well with those from AstroImageJ. But I could not find anything about exoplanets in the VaST paper or on its website. Is there anything that would prevent us from using VaST for exoplanet light curves in general? If so, what is it, and can it be improved?
Best Regards
Azim
Removing images, and every step that comes after it, is not multi-threaded and takes a very long time for 35k images.
version: vast-1.0rc84
Note: the reference frame in 'vast_summary.log' is processed via astrometry.net and saved as new-image.fits.
If I execute the following command:
./sextract_single_image ../../new-image.fits 886 1199
The red cross is over a star. When I click this star I get the following output:
When I look at this same star in either 'vast_list_of_all_stars.log' or in 'out01177.dat', the following coordinates are displayed:
So the question is: is this a bug, or is there another way to get the pixel position of a star on the reference frame?
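One thing worth ruling out (my assumption about the discrepancy, not a confirmed VaST behavior): tools disagree on whether the first pixel is (1, 1) (the FITS/SExtractor convention) or (0, 0) (most array libraries), which produces a constant one-pixel offset between reported positions. A minimal sketch with the coordinates from the command above:

```python
# 1-based FITS/SExtractor convention vs 0-based array convention.
# The star position is taken from the sextract_single_image call above;
# the conversion itself is the standard off-by-one shift.
x_fits, y_fits = 886.0, 1199.0            # 1-based (FITS)
x_arr, y_arr = x_fits - 1.0, y_fits - 1.0  # 0-based (array indexing)
```

If the logged coordinates differ by exactly one pixel, this convention mismatch is the likely cause; larger differences point elsewhere.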
In the description (at http://scan.sai.msu.ru/vast/) of the columns in the lightcurve files (out*.dat) there are 7 column names:
- 1st column - JD(TT) (default) or JD(UTC) (if VaST was started with "-u" flag)
- 2nd column - magnitude (with respect to the background level on the reference image if an absolute calibration was not done yet)
- 3rd column - estimated magnitude error
- 4th column - X position of the star on the current frame (in pixels)
- 5th column - Y position of the star on the current frame (in pixels)
- 6th column - diameter of the circular aperture used to measure the current frame (in pixels)
- 7th column - file path corresponding to the current frame
In my output file there are, however, 10 extra columns, probably 5 comparison stars with magnitude + magnitude error or something similar. Could you explain this, and where can I find which 5 stars are used?
I take the reference frame chosen by VaST in 'vast_summary.log' and perform plate solving using astrometry.net.
Until now, when I overlay this reference frame with the WCS-transformed x/y coordinates on the reference frame (given by VaST), they match perfectly.
Now I have a case where the rotation of the reference frame chosen by VaST is 180.3 degrees, as reported in 'vast_image_details.log'.
As a result, the WCS-transformed x/y coordinates no longer match the reference frame: neither when I plate-solve the original reference frame, nor when I rotate it by 180.3 degrees and plate-solve that. Is there some translation I need to take into account, or something else I have misunderstood?
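One possibility worth checking (my assumption, not confirmed VaST behavior): a frame rotated by roughly 180 degrees maps pixel coordinates approximately as (x, y) to (W - x, H - y), a point reflection about the frame center, before the small 0.3-degree residual. A toy sketch with invented frame dimensions:

```python
# Hypothetical frame dimensions and star position; illustrative only.
W, H = 2048, 2048
x, y = 886.0, 1199.0

# A pure 180-degree rotation about the frame center sends each pixel to
# its point reflection (ignoring the extra 0.3 deg and any 0- vs 1-based
# coordinate offset).
x_rot, y_rot = W - x, H - y
```

If the solved WCS describes the rotated orientation while VaST's x/y refer to the unrotated frame (or vice versa), this reflection would have to be applied before the WCS transform.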
PS: we found our first 17 new variable stars; I'll send you the link by mail.
Dear Kirill,
Thanks for such a fantastic code. I am trying to calibrate my images before running vast on them, and to do so I use the ms, md, and mk utilities in the util/ccd directory. There is just one problem with the calibration: sometimes there are no darks and flat-darks with exposure times matching the light and flat frames, respectively, and in these cases I need to account for the exposure-time differences between the light/flat and dark images to get the best calibration (see the formula here: http://www.bu.edu/astronomy/wp-assets/script-files/buas/oldsite/astrophotography/flat.htm).
There is no problem with subtracting bias frames (exposure time = 0) using the current ms function, but when subtracting dark frames one should account for the exposure times. Would it be possible to add a new function that reads the exposure times from the image headers and scales the darks accordingly (assuming the dark current is linear in exposure time, as in the formula at the link above)? Or to improve the current ms function so that it treats bias subtraction and non-bias subtraction differently? I think this could significantly improve the quality of the calibrated images.
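The exposure-scaling idea can be sketched like this (a minimal illustration of the linear dark-current assumption, not VaST's actual ms implementation; the function name and array values are invented):

```python
import numpy as np

def subtract_scaled_dark(light, dark, bias, t_light, t_dark):
    """Subtract bias plus an exposure-scaled dark current (linear assumption)."""
    # Bias-subtracted dark gives the dark current accumulated in t_dark;
    # scale it linearly to the light frame's exposure time.
    dark_current = (dark - bias) * (t_light / t_dark)
    return light - bias - dark_current

# Made-up uniform frames: 60 s light, 30 s dark, zero-exposure bias.
light = np.full((2, 2), 1000.0)
dark = np.full((2, 2), 160.0)
bias = np.full((2, 2), 100.0)

cal = subtract_scaled_dark(light, dark, bias, t_light=60.0, t_dark=30.0)
# dark current: (160-100) ADU in 30 s, scaled to 120 ADU in 60 s
```

The exposure times themselves would come from the EXPTIME (or equivalent) header keyword of each frame.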
Thanks
Azim