nkphysics / autonicer
A program that automates the data retrieval and data reduction of observational data gathered by NASA's NICER x-ray mission
License: Apache License 2.0
Currently, when running any of the autonicer.Reprocess() tools, there is no error handling for cases where the relevant metadata, or even the files themselves, cannot be identified when passing in an --inlist.
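A minimal sketch of what such handling could look like. The function and argument names here are illustrative, not autonicer's actual API: each inlist entry is checked and wrapped so one bad path reports a failure instead of aborting the whole batch.

```python
import os

def process_inlist(paths, reprocess_fn):
    """Reprocess each dataset path, collecting failures instead of crashing.

    paths: iterable of OBSID directory paths from --inlist
    reprocess_fn: callable that reprocesses one dataset (hypothetical hook)
    """
    failures = []
    for path in paths:
        if not os.path.isdir(path):
            failures.append((path, "directory not found"))
            continue
        try:
            reprocess_fn(path)
        except FileNotFoundError as exc:
            failures.append((path, f"missing files/metadata: {exc}"))
    return failures
```

Failures could then be summarized at the end of the run instead of killing it mid-batch.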
Title explains itself.
This will significantly speed up autonicer.
Please see the following error message for investigation. This happened after processing multiple datasets with autonicer --checkcal --reprocess --bc --compress -i *
Migrating to 1060010101
Latest NICER CALDB: xti20221001
######## Decompressing None ########
ni1060010101_0mpu7_cl.evt.tar.gz extracted
ni1060010101_0mpu2_ufa.evt.tar.gz extracted
ni1060010101_0mpu1_ufa.evt.tar.gz extracted
ni1060010101_0mpu6_ufa.evt.tar.gz extracted
ni1060010101_0mpu0_ufa.evt.tar.gz extracted
ni1060010101_0mpu4_ufa.evt.tar.gz extracted
ni1060010101_0mpu3_ufa.evt.tar.gz extracted
ni1060010101_0mpu5_ufa.evt.tar.gz extracted
ni1060010101_0mpu7_ufa.evt.tar.gz extracted
############## Auto NICER ##############
nicerl2 1.27
--------------------------------------------------------
ang_dist = 0.015
attfile = $INDIR/auxil/ni$OBSID.att
autoscreen = YES
br_earth = 30
calstation = FLIGHT
cldir = $INDIR/xti/event_cl
cleanup = 1
cleanup_ufa_files = 0
clfile = $CLDIR/ni$OBSID_0mpu7_cl.evt
clobber = 1
cor_range = *-*
dec = OBJ
detlist = launch
elv = 15
filtcolumns = NICERV4
geomag_columns = FILTCOLUMNS
geomag_path = DEFAULT
gtifiles = NONE
gzip_thresh = 50
history = 1
hkdir = $INDIR/xti/hk
hkpat = $HKDIR/ni??????????_?mpu?.hk{,.gz}
incremental = 0
indir = None/
issmanfile = CALDB
leapinit = AUTO
min_fpm = 7
mkfile = $INDIR/auxil/ni$OBSID.mkf
mpugtimerge = OR
mpugtiscr = DEFAULT
mpulist = 0-6
nicercal_filtexpr = EVENT_FLAGS=bxxxx00
nicerclean_args = NONE
nicersaafilt = YES
nimaketime_gtiexpr = NONE
niprefilter = 1
niprefilter2 = 1
niprefilter2_coltypes = FILTCOLUMNS
noise25scr = DEFAULT
noiseextscr = DEFAULT
orbfile = $INDIR/auxil/ni$OBSID.orb
overonly_expr = NONE
overonly_range = *-30
overonlyscr = DEFAULT
picalfile = CALDB
pifastcalfile = CALDB
pirange = 20:1500
prefilter_columns = FILTCOLUMNS
ra = OBJ
robofile = CALDB
roundrobbinscr = DEFAULT
saafilt = NO
saaregfile = NONE
st_valid = YES
tasks = ALL
timebiascalfile = CALDB
trackfilt = YES
trumpetfilt = YES
trumpetkeep = GOOD
ufafile = $CLDIR/ni$OBSID_0mpu7_ufa.evt
ufdir = $INDIR/xti/event_uf
ufpat = $UFDIR/ni??????????_?mpu?_uf.evt{,.gz}
underonly_range = 0-500
underonlyscr = DEFAULT
vehiclefile = CALDB
MPU List: 0-6
Requested task operations: CALMERGE,SCREEN,MKF
Processed prefilter_columns: NICERV4,-MPU_OVER_COUNT,-MPU_UNDER_COUNT,-MPU_XRAY_COUNT,-MPU_ALL_COUNT
Processed niprefilter2_coltypes: BASE,NICERV4
Processed geomag_columns: kp_potsdam.fits(KP),solarphi_oulu.fits(SOLAR_PHI),COR_NYM
ERROR: input directory None does not exist at /home/nick/heasoft-6.31/x86_64-pc-linux-gnu-libc2.35/bin/nicerl2 line 253.
Task nicerl2 1.27 terminating with status -1
Traceback (most recent call last):
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/reprocess.py", line 198, in inlist
raise FileNotFoundError
FileNotFoundError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/nick/.local/bin/autonicer", line 8, in <module>
sys.exit(run())
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/__init__.py", line 67, in run
inlist(argp)
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/reprocess.py", line 210, in inlist
reprocess_check(argp, curr_cals)
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/reprocess.py", line 175, in reprocess_check
check.reprocess(argp.bc, argp.compress)
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/reprocess.py", line 161, in reprocess
an.reduce(self.obsid)
File "/home/nick/.local/lib/python3.10/site-packages/autonicer/autonicer.py", line 326, in reduce
if self.bc_sel.lower() == "n":
AttributeError: 'bool' object has no attribute 'lower'
There is no error handling for --checkcal when establishing CALDB_VER from the metadata of cl.evt files.
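A sketch of a defensive lookup. The header is modeled here as a plain mapping; in autonicer it would come from an astropy.io.fits header, but the guard logic is the same, and the function name is hypothetical.

```python
def get_caldb_ver(header):
    """Return the CALDB version string, or None if the keyword is absent."""
    try:
        return header["CALDB_VER"]
    except KeyError:
        # Keyword missing from the cl.evt metadata: report it rather than crash
        print("Unable to identify CALDB_VER from cl.evt metadata")
        return None
```

Callers can then skip or flag the dataset when None comes back instead of raising an unhandled exception.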
The glob std library can replace autonicer.file_find()
Replace this:

```python
import subprocess as sp

def file_find(query):
    """
    Runs an ls command and returns the contents in a list
    """
    files = sp.run(f"ls {query}", shell=True, capture_output=True, encoding="utf-8")
    filelist = []
    for i in str(files.stdout).split("\n"):
        if i != "":
            filelist.append(i)
    return filelist
```

with this:

```python
import glob

glob.glob(query)
```

which returns the same file list.
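One caveat worth noting: ls sorts its output alphabetically, while glob.glob returns files in arbitrary filesystem order. A drop-in replacement that preserves the old ordering would be a one-liner like this sketch:

```python
import glob

def file_find(query):
    """Return files matching the query, sorted for stable, ls-like ordering."""
    return sorted(glob.glob(query))
```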
Title is self-explanatory. Poetry is becoming a pain to deal with, so before caching or standard product generation can be added, there needs to be a migration to setuptools.
It would be useful not to have to make an inlist file every time.
Instead, consider navigating to a dir containing all NICER OBSID datasets and then running either
--inlist=*
or
--inlist=/PATH-TO-DIR/*
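Since the shell expands * before the program ever sees it, one way to support this is to let -i/--inlist accept multiple values. This is a sketch under that assumption, not autonicer's current argument parsing:

```python
import argparse

parser = argparse.ArgumentParser()
# nargs="+" lets a shell-expanded "-i *" arrive as a list of OBSID
# directory paths instead of a single inlist filename
parser.add_argument("-i", "--inlist", nargs="+", default=None,
                    help="OBSID directories to reprocess")

# e.g. the shell turns "-i *" into "-i 1060010101 1013010133"
args = parser.parse_args(["-i", "1060010101", "1013010133"])
```

Each entry could then be checked to decide whether it is an inlist file or an OBSID directory.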
Users doing spectrum analysis should be able to select an option to auto-generate the following products for analysis through xspec:
The following is still a hindrance to reprocessing larger datasets...
Migrating to 1013010133
Unable to identify required metadata.
Consider Re-downloading and reducing this dataset
OR
Try nicerl2 manually
Unable to identify required metadata.
Consider Re-downloading and reducing this dataset
OR
Try nicerl2 manually
Latest NICER CALDB: xti20221001
CALDB for bc1013010133_0mpu7_cl.evt: xti20180711
NOT Up to date with latest NICER CALDB
CALDB for ni1013010133_0mpu7_cl.evt: xti20180711
NOT Up to date with latest NICER CALDB
There has to be some way of still obtaining the required metadata to automatically reprocess these types of datasets.
There are unit warnings coming up when using astropy v5.1. This indicates the use of deprecated functionality in astropy and as such autonicer should be brought up to date to work with astropy v5.1.
A temporary fix was applied with v1.0.2 of autonicer, but a more permanent fix should be accounted for in future versions.
Update dependencies to solve CVE-2022-23491 vulnerability with certifi
The following error occurred when trying to reprocess the dataset listed in the example.
Migrating to 1020420119
Latest NICER CALDB: xti20221001
######## Decompressing None ########
ni1020420119_0mpu7_cl.evt.tar.gz extracted
ni1020420119_0mpu0_ufa.evt.tar.gz extracted
ni1020420119_0mpu1_ufa.evt.tar.gz extracted
ni1020420119_0mpu6_ufa.evt.tar.gz extracted
ni1020420119_0mpu5_ufa.evt.tar.gz extracted
ni1020420119_0mpu2_ufa.evt.tar.gz extracted
ni1020420119_0mpu4_ufa.evt.tar.gz extracted
ni1020420119_0mpu3_ufa.evt.tar.gz extracted
ni1020420119_0mpu7_ufa.evt.tar.gz extracted
############## Auto NICER ##############
Target:
It appears the OBSID was not identified from the metadata, nor was the target.
When entering a single OBSID into the autonicer prompt, it isn't resolved against the HEASARC query table output.
For example, entering 3013010105 gives an "OBSID not found" message, while selecting cycle 3 will queue up the 3013010105 OBSID.
I know and generally like to follow the motto "If it ain't broke don't fix it", but I've been continually reminded of the benefits of minimizing dependencies. The heasoft dependency clearly cannot be dropped for obvious reasons, but autoNICER still relies on wget (not the wget python lib) to retrieve datasets from the heasarc.
See the first part of the pull_reduce function:

```python
def pull_reduce(self):
    """
    Downloads the NICER data
    Puts the retrieved data through a standardized data reduction scheme
    """
    downCommand = (
        "wget -q -nH --no-check-certificate --cut-dirs=5 -r -l0 -c -N -np -R "
        + "'index*'"
        + " -erobots=off --retr-symlinks https://heasarc.gsfc.nasa.gov/FTP/nicer/data/obs/"
    )
```
This could potentially be improved by using a conventional GET request with the requests lib; it could also lead to converting all of this to async, which could certainly improve the code.
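A rough sketch of the requests-based direction. The function names and the cycle-directory segment of the URL are illustrative assumptions, not the archive's confirmed layout, but the streaming download pattern itself is standard requests usage:

```python
BASE_URL = "https://heasarc.gsfc.nasa.gov/FTP/nicer/data/obs/"

def obsid_url(cycle_dir, obsid):
    """Build the archive URL for one OBSID (directory layout assumed)."""
    return f"{BASE_URL}{cycle_dir}/{obsid}/"

def download_file(url, dest, chunk_size=8192):
    """Stream one file to disk with requests instead of shelling out to wget."""
    import requests  # third-party dependency replacing the external wget binary
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(dest, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=chunk_size):
                fh.write(chunk)
```

Unlike the recursive wget call, this requires walking the remote directory listing explicitly, but it drops the external binary dependency and opens the door to async retrieval later.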
Currently autonicer.get_caldb_ver() pulls the latest NICER CALDB version at each new instance of autonicer.Reprocess() when passing in --inlist, which is overkill. It should be called at most 1-2 times per day.
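A minimal time-to-live cache sketch along those lines. The names are hypothetical; the fetch function stands in for whatever get_caldb_ver does today:

```python
import time

_cache = {"ver": None, "fetched": 0.0}
CALDB_TTL = 12 * 3600  # seconds: re-query at most twice per day

def cached_caldb_ver(fetch_fn, now=time.time):
    """Return the cached CALDB version, refreshing only after the TTL expires."""
    if _cache["ver"] is None or now() - _cache["fetched"] > CALDB_TTL:
        _cache["ver"] = fetch_fn()
        _cache["fetched"] = now()
    return _cache["ver"]
```

Every Reprocess() instance in a batch would then share one lookup instead of hitting the network per OBSID.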
There are two warnings that appear when running an autonicer locally. Both of the warnings are UnitWarnings from astropy.
One is: 'DEGREE' did not parse as fits unit: At col 0, Unit 'DEGREE' not supported by fits standard.
The other is: 'MJD' did not parse as fits unit: At col 0, Unit 'MJD' not supported by fits standard.
This is most likely due to changes with a new version of astropy, as this warning didn't occur as of May 23, 2022.
It can probably be fixed with some dependency constraints in the .toml until more work can be done to bring autonicer up to date with the new changes in astropy.
*Changes in astropy are assumed (as of right now), I haven't investigated changes on astropy's end yet.
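Until the dependency constraints land, a targeted filter could silence just these two messages without hiding other warnings. In autonicer the category would be astropy.units.UnitsWarning; this sketch shows the same mechanism with the generic stdlib machinery:

```python
import warnings

# Temporary workaround: ignore only the two known unit-parse messages
for unit in ("DEGREE", "MJD"):
    warnings.filterwarnings(
        "ignore", message=f"'{unit}' did not parse as fits unit.*"
    )
```

Because filterwarnings prepends, these ignore rules take precedence over broader filters, and everything else still surfaces normally.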
In the autoNICER command prompt, if an int that doesn't correspond to any OBSID is entered, it appears to be accepted, silently queueing a nonexistent OBSID for processing.
Example:
```
############## Auto NICER ##############
Target: crab pulsar
Apply Bary-Center Correction: [y]
Write Output Log: [n]
Compress XTI files (.tar.gz): [y]
autoNICER > 11111
autoNICER >
```
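A sketch of the validation that's missing. The names are hypothetical: the prompt entry is checked against the set of OBSIDs returned by the HEASARC query before it is queued.

```python
def queue_obsid(entry, known_obsids, queue):
    """Queue an OBSID only if the query results actually contain it."""
    if entry not in known_obsids:
        print(f"OBSID {entry} not found in query results")
        return False
    queue.append(entry)
    return True
```

The prompt loop would then re-prompt on False instead of silently accepting the bogus entry.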
The following error message gives contradictory information and needs to be clearer:
Unable to identify required metadata.
Consider Re-downloading and reducing this dataset
OR
Try nicerl2 manually
Unable to identify Object -> IS OK
Proceeding with Reduction...
Latest NICER CALDB: xti20221001
!!!!! CANNOT IDENTIFY CALDB !!!!!
After adding a few OBSIDs and entering the "done" command to start processing data, autonicer gives back an "IndexError: list index out of range" response.
Currently autoNICER can only automate the procedure of both retrieving and reducing NICER OBSIDs. In future versions there should be a way of reprocessing an existing dataset through the use of some kind of --reprocess or --update flag.
This will allow analysts to quickly and easily update existing datasets as new calibrations become available.
NICER OBSIDs from at least October 2021 to the present do not resolve in the code, giving a KeyError: 0.
Currently autoNICER forces a user to generate a barycenter corrected mpu7_cl.evt file for each observation.
If a user isn't doing any timing work then the barycenter correction is unnecessary so there needs to be some functionality to bypass the barycenter correction to just generate a normal mpu7_cl.evt file for an observation.
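One way to expose this is an opt-in command-line flag. This is a sketch of the idea, not autonicer's actual interface:

```python
import argparse

parser = argparse.ArgumentParser()
# Hypothetical flag: barycenter correction only happens when requested,
# so a plain mpu7_cl.evt is produced by default
parser.add_argument("--bc", action="store_true",
                    help="apply the barycenter correction (skipped by default)")

args = parser.parse_args([])  # no flag -> args.bc is False, correction skipped
```

The reduction step would then branch on args.bc when deciding whether to run barycorr.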
If an unknown command is entered in the prompt, the program ends with a "KeyError: 0" error message.
Any OBSIDs from November 2022-present currently cannot be queried on the testing host.
This is likely due to one of the following:
AutoNICER uses pytest as a development dependency and, per CVE-2022-42969, is thus vulnerable to the ReDoS vulnerability in svnurl.py. A restriction to pytest >7.2 needs to be introduced to the dev dependencies.
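A minimal sketch of the constraint, assuming a Poetry-style pyproject.toml with a dev dependency group (the exact section name depends on the Poetry version in use):

```toml
[tool.poetry.group.dev.dependencies]
# CVE-2022-42969 affects the py library used by older pytest releases
pytest = ">7.2"
```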