hapi's Issues

Cannot import HAPI v1.2.2.0 under Python 3.4

Hello,
Thanks for providing HAPI for scientific research!
I am new to HAPI and Python. I am using Python 3.4.1 with Anaconda 2.1.0 (64-bit), and numpy is installed. When I tried to import the newest version of HAPI (v1.2.2.0), I got the following error message (it can also be checked in the attachment):

from hapi import *
  File "C:\Users\xxx\Anaconda3\lib\hapi.py", line 35549
    return absorptionCoefficient_Generic(*args,**kwargs,
SyntaxError: invalid syntax


I have tested that HAPI v1.2.2.0 can be imported under Python 3.9+, while HAPI v1.1.0.7 can be imported under Python 3.4. But I have to use Python 3.4 for now; can you help me find out what the issue is?

Thank you!

Cannot connect to HITRAN

I have a problem connecting to the HITRAN database using the API. I use the following code:

import hapi
hapi.db_begin('data')
hapi.fetch('H2O',1,1,3400,4100)

and get an "SSLCertVerificationError". I have attached the full trace as a file (log.txt). I found this discussion online, which may be related: https://bugs.python.org/issue36866

I am on macOS 10.14.5, Python 3.7.3, hitran-api 1.1.0.9.6 (and also tried version 1.1.0.9.0).
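A minimal workaround sketch, assuming the certifi package is installed (the python.org macOS installers also ship an "Install Certificates.command" script that achieves something similar):

import os
import certifi

# Point the ssl module at certifi's CA bundle instead of the system
# certificates that fail to verify on some macOS setups; must run
# before the first HTTPS request is made.
os.environ['SSL_CERT_FILE'] = certifi.where()

import hapi
hapi.db_begin('data')
hapi.fetch('H2O', 1, 1, 3400, 4100)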

Publish on PyPI

Hello, good work on HAPI!

Would it be possible to provide this library on PyPI? This would make it much easier to download and maintain a local copy of the library.

[FEATURE REQUEST] Partial loading in storage2cache()?

Hi,

HAPI is great, but not for offline usage.

Offline, you typically want to work with an exhaustive .par file but only request the lines within a certain wavenumber range.

The problem is that, as far as I understand, the storage2cache() method loads the entire .par file.

(By the way, the pos=None parameter is not even used in this function, so you could at least add

if pos:
    for i in range(0, min([pos, line_count])):
        InfileData.readline()

to skip ahead.)

A lot of time would be saved by loading only the lines within a certain wavenumber range (except perhaps when the sought range is at the end of the .par file).

I am not sure this is possible with the current code as it stands, because all lines are read as strings and then converted to numbers.

For this reason, I continue to use our old Matlab code, which is faster because it has this functionality; that is a shame, because I prefer Python.

Am I the only one who needs this functionality?
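For illustration, a minimal standalone sketch of such a range filter (a hypothetical helper, not part of HAPI; it assumes the standard 160-character .par layout with the wavenumber stored as F12.6 right after the 2-char molecule and 1-char isotopologue fields):

def filter_par_lines(path, nu_min, nu_max):
    # Stream the .par file and keep only records whose transition wavenumber
    # falls inside [nu_min, nu_max], without parsing any other field.
    selected = []
    with open(path) as infile:
        for line in infile:
            try:
                nu = float(line[3:15])
            except ValueError:
                continue  # skip malformed records
            if nu > nu_max:
                break  # .par files are sorted by wavenumber, so stop early
            if nu >= nu_min:
                selected.append(line)
    return selected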

Thanks

Package/module structure, code style, performance - maybe go for version 2

Hello,

As a developer of infrared spectroscopy software, I really like working with the functionality that HITRAN offers (Perkin Elmer even built a patent on it for water-vapour and CO2 correction of IR spectra), but from time to time one has to look into the code, which tries to summarize everything in a single .py file (global constants, data-specific definitions, calculation routines, a 10-minute Python tutorial). This makes the code hard to understand and maintain, and greatly increases the likelihood of errors (there are still some open pull requests and issues regarding this).

Before tackling the issues, I think a first step could be a version 2.0 that actually makes more use of Python built-in functionalities like

  • using Python conventions for global constants (UPPER_CASE_CONSTANT)
  • making more use of Enums and dataclasses (or even pydantic models) rather than relying on global-constant/dict combinations that are hard to trace back through scopes
  • not checking types with type(var) == some_type but rather with isinstance (see the sketch after this list)
  • splitting the package up into modules, e.g. models for the global constants and data models involved, database_io for reading from the database, data for the hard-coded data in the module, lineshapes for the computation of lineshapes, environment for the environment specification part, misc for things like the tutorial (just a first example structure I could think of)
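A minimal sketch of the isinstance point from the list above:

value = {}
print(type(value) == dict)      # True here, but False for any dict subclass
print(isinstance(value, dict))  # True for dict and every subclass: the idiomatic check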

Besides, the code contains some parts that could be improved

  • black formatting
  • linting (ruff), which would uncover issues like #37 like a charm
  • adding type hints
    these 3 steps would not require a lot of effort, but would make the code much more of a joy to read and maintain
  • not using Python loops in heavy-duty numerics; e.g., switching to numba, cython, or even Rust with multiprocessing/threading could reduce the computation time quite a bit (this is currently a limiting factor for me)
  • in some parts, scipy could also be a beneficial dependency, one that most developers in this field have installed anyway
  • adding automated tests with pytest (how is the module currently tested for correctness?) that could then be run whenever somebody makes a push or pull request
  • ...

Then it would be far easier, and faster, to incorporate the changes required to resolve the open issues and pull requests.

Please don't see this as criticism; I would like to join as a contributor to this package and help 😄
However, I see a lot of open issues and pull requests and therefore wanted to ask if the project is still active and if there is interest in such big changes.

Thanks for your time.

Basic coding errors in HAPI.py: calculate_parameter_NuVC

A simple syntax analysis of hapi.py shows several coding errors: undefined variables and so on

For instance, in calculate_parameter_NuVC:

  • Shift0 is undefined
  • NuVC_species is undefined

I think (but I may be wrong), that this can be solved by:

  • replacing Shift0 by Delta0 (they are the same, just different notations)
  • initializing NuVC_species to 0, then incrementing it by NuVCDB*(Tref/T)**KappaDB*p and decrementing it by EtaDB*abun*(Gamma0T-1j*Delta0T) (see the sketch below)
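In code, that suggestion would look roughly like this (a sketch only; all names are HAPI internals from calculate_parameter_NuVC, and the logic is only as reliable as the description above):

# inside calculate_parameter_NuVC (sketch, not a drop-in patch)
Shift0 = Delta0                                           # same quantity, different notation
NuVC_species = 0.0                                        # initialize before accumulating
NuVC_species += NuVCDB * (Tref / T) ** KappaDB * p        # increment
NuVC_species -= EtaDB * abun * (Gamma0T - 1j * Delta0T)   # decrement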

Another error is in getLinelist(), which calls a nonexistent makeQuery.

In those cases, raise a NotImplementedError instead.

Bug in arange_

With Python 3.7.6 and numpy 1.18.1, the linspace function expects an integer type for the num (npnt) parameter.

Otherwise the error

object of type <class 'numpy.float64'> cannot be safely interpreted as an integer.

is given.

def arange_(lower,upper,step):
    npnt = floor((upper-lower)/step)+1   # numpy.floor returns a float64, so npnt is not a Python int
    upper_new = lower + step*(npnt-1)
    if abs((upper-upper_new)-step) < 1e-10:
        upper_new += step
        npnt += 1
    return linspace(lower,upper_new,npnt)   # newer numpy rejects a non-integer num here
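A minimal fix sketch, using the same int cast suggested in the "linspace needs an int" issue further down:

from numpy import floor, linspace

def arange_(lower, upper, step):
    npnt = int(floor((upper - lower) / step)) + 1  # cast so linspace accepts it
    upper_new = lower + step * (npnt - 1)
    if abs((upper - upper_new) - step) < 1e-10:
        upper_new += step
        npnt += 1
    return linspace(lower, upper_new, npnt)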

Uncertainty feature request

I have a feature request: I want to be able to obtain the uncertainty of the absorption coefficient spectrum.

My idea is to use a keyword in absorptionCoefficient_*(), for example uncertainty, allowing the user to choose whether to output uncertainty data from the function.

Further, I would like finer control over which uncertainty codes to use. This could be achieved by passing a list of positional ids (like indices in Python) to the uncertainty keyword to select the uncertainty for one or more of the uncertainty parameters:

0. Line position,
1. Air pressure-induced line shift,
2. Intensity,
3. Air-broadened half-width,
4. Self-broadened half-width,
5. Temperature dependence of the air-broadened half-width.

An example of such a list would be uncertainty=[0, 2] to use the uncertainties in line position and intensity. If the argument passed to uncertainty is True, all uncertainty parameters are used.

The output of absorptionCoefficient_*() with the uncertainty=True keyword would then have four outputs, nu, coef, nu_unc, coef_unc, where *_unc are the uncertainties in wavenumber and coefficient.
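Under the proposed interface, a call might look like this (hypothetical keyword, not implemented in HAPI):

nu, coef, nu_unc, coef_unc = absorptionCoefficient_Voigt(
    SourceTables='H2O',
    uncertainty=[0, 2])  # hypothetical: propagate uncertainties 0 (line position) and 2 (intensity)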

Error connection URL

When running fetch I get a URL connection error, and I am not sure what I am doing wrong. I also noticed that the HITRAN/HITEMP website (https://hitran.org/) is not working for me. I have tried different connections (a personal hotspot, too) and get the same result.

Dependency on Numpy not declared in setup.py

Hi,

thanks for developing HAPI 😄

I just tried to install HAPI into a fresh virtual environment, and it failed with the message:

ModuleNotFoundError: No module named 'numpy'

This is because Numpy is not declared in setup.py as a dependency.
You can use the install_requires keyword argument to include Numpy as an explicit dependency:

setup(
    name='hitran-api',
    version=HAPI_VERSION,
    packages=['hapi',],
    install_requires=["numpy",], # This is missing. Better specify a version, e.g. "numpy>=1.20" or whatever HAPI needs
    license='MIT',
)

The next problem is that setup.py imports HAPI_VERSION and HAPI_HISTORY from hapi.hapi.
This leads to setup.py trying to import Numpy, which then fails even though Numpy is now a declared dependency, because the dependencies are not installed yet when setup.py runs.
I solved this problem by moving the declarations of HAPI_VERSION and HAPI_HISTORY into a file _version.py in the root of the repository and then did

from _version import HAPI_VERSION, HAPI_HISTORY

in setup.py.
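For concreteness, a sketch of what _version.py would contain (values illustrative):

# _version.py, in the repository root: plain constants with no third-party
# imports, so setup.py can read them before Numpy is installed.
HAPI_VERSION = '1.2.2.0'  # must match the value defined in hapi.py
HAPI_HISTORY = []         # placeholder; the real release history lives in hapi.py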

If you want to keep these declarations inside the hapi folder, I guess you would need to remove

from .hapi import *

from __init__.py, because this triggers the import of Numpy.

You should be able to do this if you instead declare the exported functions with the __all__ variable, as described here:
https://docs.python.org/3/tutorial/modules.html#importing-from-a-package

This problem seems to have gone undetected for quite a while, I guess because most people have Numpy installed.
However, if you want to use a dedicated environment for your project (like I always do), Numpy will not be available by default.

Best regards
Nils

Unrealistic jump in computed absorption for main isotope of water near 2729.5 nm

Hi,

When computing the absorption spectrum of water in the 2725-2732 nm range, spectreatm.py displays a jump at 2729.5 nm:

  • The problem does not occur with a custom-made Matlab program reading .par files and computing Voigt profiles
  • I have checked that the HITRAN lines downloaded with HAPI are a 100% match with the ones I use in my other program
  • Increasing eps or the resolution does not solve the problem
  • This is not due to missing line data, because it works with my other program, and hey, we're dealing with the main isotope of water here

I think it may have something to do with the Voigt code not taking enough neighboring lines into account, i.e. as if the default values for 'OmegaWing' and/or 'IntensityThreshold' were incorrect.
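If so, passing larger values explicitly should change the result; a sketch (OmegaWing and IntensityThreshold are the HAPI parameter names, the values are illustrative):

# recompute with widened wings and no intensity cutoff to test the hypothesis
nu, coef = hapi.absorptionCoefficient_Voigt(
    SourceTables='H2O',
    WavenumberRange=(3660.0, 3670.0),  # roughly 2725-2732 nm
    OmegaWing=25.0,                    # absolute wing width in cm-1
    IntensityThreshold=0.0)            # keep even the weakest lines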

We were on the verge of migrating our Matlab code to Python, but this gave me a cold chill.

Please tell me if you can reproduce the issue. Thanks.

My code is the following:

import spectreatm
spectres = spectreatm.Spectra([H2O], [2725, 2732], zenith_angle=0,
                              layers_nb=1, temperature=300, pressure=1.01325,
                              target_dist=1, round_trip=False, eps=10)
spectres.compute_absorption()
spectres.plot_all()
spectres.save_all()

My colleague told me that the error happens within this piece of code:

nu, coef = hapi.absorptionCoefficient_Voigt(
    self.components, self.name, Environment=environment,
    WavenumberRange=self.wn_range,
    WavenumberStep=wn_delta, HITRAN_units=True,
    Diluent={'air': 1-vmr, 'self': vmr})

spectreatm / HAPI result: [plot attached]

Matlab (in-house code) result: [plot attached]

Scaling column values using select()

Hi,

With Python 3.8.8 and HAPI v1.2.2.0, I am reporting a minor bug that occurs when trying to scale intensities by a factor using the select() command (the same example as given on page 15 of the HAPI manual). See the traceback below:

Traceback (most recent call last):
  File "filepath\hapi.py", line 1902, in newRowObject
    par_format = expr[3]
IndexError: tuple index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "test.py", line 28, in <module>
    select('H2O',ParameterNames=('nu',('/','sw',0.99731)))
  File "filepath\hapi.py", line 2211, in select
    selectInto(DestinationTableName,TableName,ParameterNames,Conditions)
  File "filepath\hapi.py", line 2161, in selectInto
    RowObjectNew = newRowObject(ParameterNames,RowObject,VarDictionary,ContextFormat)
  File "filepath\hapi.py", line 1904, in newRowObject
    par_format = getDefaultFormat(type(par_value))
  File "filepath\hapi.py", line 1863, in getDefaultFormat
    raise Exception('Unknown type')
Exception: Unknown type

Bug in data selection?

As a new user of HAPI, I'm very happy that there's finally a convenient API for working with HITRAN data. Also as a new user, it's possible I'm not yet using it correctly, as I'm experiencing the following strange behavior: despite selecting CH4 as the constituent I want to plot, and seeing that its data was successfully downloaded via fetch(), I think I am getting a spectrum for O3 instead. See the attached screen capture.

I had indeed selected O3 in an earlier test, but I keep getting the same spectrum when I request absorption for CO2 or CH4, even after deleting the data folders and restarting the Jupyter kernel. Please let me know whether I'm overlooking something obvious.

[screen capture attached]

Diluent, Pressure, Temperature specifications

Hello, I'm using HAPI to compute transmission spectra of molecules in the atmospheres of possible planets.
I have some doubts about the input parameters for computing absorptionCoefficient_Lorentz and transmittanceSpectrum.

Let's say I want to recreate the atmosphere of the Earth and see what the O2 transmission spectrum looks like.
I use the following code:
I use the following code:

path = 8000e2  # cm
temperature = 288  # K
pressure = 1  # atm

environment = {'l': path, 'T': temperature}
nu, coef = absorptionCoefficient_Lorentz(SourceTables='O2',
    Environment={'p': pressure, 'T': temperature},
    Diluent={'air': 1.0}, HITRAN_units=False)
wavenumbers, transmittance = transmittanceSpectrum(nu, coef,
    Environment={'T': temperature, 'l': path})
I have some doubts... Do the pressure and temperature refer just to the O2, or to the diluent, i.e. air?
And does the diluent include the percentages of the Earth's composition (78% nitrogen, 21% oxygen, etc.)?
Last question: what exactly is 'l' (the path length)? Can I treat it as the height of the atmosphere (8 km)? With these values I obtain a weird spectrum, which has far too much absorption in the transmittance spectrum.
Thank you :)

Sensitivity issue with Diluent Parameter

This is regarding the Diluent parameter of the absorptionCoefficient_Lorentz function: putting in a volume mixing ratio for 'self' with very low values like 4e-05 gives me inaccurate values, and all the features of the spectrum are lost. Is there a lower bound below which the function doesn't work?
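For reference, a dilute absorber is normally specified with diluent fractions that sum to one; a sketch mirroring the value in the question (table name illustrative):

vmr = 4e-05
nu, coef = absorptionCoefficient_Lorentz(
    SourceTables='CH4',
    Diluent={'air': 1 - vmr, 'self': vmr})  # fractions must sum to 1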

Recent change in CO Einstein coefficients?

Hello @RomanKochanov and the HITRAN team,

I work on the Python code RADIS, which computes spectra from HITRAN/HITEMP/etc., with a focus on high-temperature spectra with millions of lines.

One of our validation tests uses HAPI as a reference and recently failed (radis/radis#160) because of an unexpected change in the CO Einstein coefficients downloaded from HITRAN, compared to hardcoded values. I'm surprised that the test did not fail before, given that our hardcoded values have not changed in the past 3 years.

  • Was there a recent change in the CO Einstein coefficients? The HITRAN news does not indicate any update to CO lines.

  • The changes are about ~2% (not a lot, but enough to trigger the error in RADIS). See below (a comparison with the "old" values from a year ago, 2019):

[screenshot of the compared values attached]

  • More importantly, there seem to be major changes in the last columns, "ierr", "iref", "line_mixing_flag", compared to the values from 2019. Could this be a problem?

[screenshot attached]

Python 3 Support

Thank you for putting the effort in building this great link to the HITRAN database 🥇

Does the library support Python 3, or only Python 2.7, as described in the manual?

Due to my inability to use HAPI for data downloads, I have modified some of its networking code to be usable in Python 3.10:

import urllib.request  
import ssl  
import json  

def queryHITRAN(TableName, iso_id_list, numin, numax, pargroups=[], params=[], dotpar=True, head=False):
    ParameterList = prepareParlist(pargroups=pargroups, params=params, dotpar=dotpar)
    TableHeader = prepareHeader(ParameterList)
    TableHeader['table_name'] = TableName
    DataFileName = VARIABLES['BACKEND_DATABASE_NAME'] + '/' + TableName + '.data'
    HeaderFileName = VARIABLES['BACKEND_DATABASE_NAME'] + '/' + TableName + '.header'
    
    iso_id_list_str = [str(iso_id) for iso_id in iso_id_list]
    iso_id_list_str = ','.join(iso_id_list_str)
    print('\nData is fetched from %s\n' % VARIABLES['GLOBAL_HOST'])
    
    if pargroups or params:  # custom par search
        url = VARIABLES['GLOBAL_HOST'] + '/lbl/api?' + \
            'iso_ids_list=' + iso_id_list_str + '&' + \
            'numin=' + str(numin) + '&' + \
            'numax=' + str(numax) + '&' + \
            'head=' + str(head) + '&' + \
            'fixwidth=0&sep=[comma]&' + \
            'request_params=' + ','.join(ParameterList)
    else:  # old-fashioned .par search
        url = VARIABLES['GLOBAL_HOST'] + '/lbl/api?' + \
            'iso_ids_list=' + iso_id_list_str + '&' + \
            'numin=' + str(numin) + '&' + \
            'numax=' + str(numax)
    
    if VARIABLES['DISPLAY_FETCH_URL']:
        print(url + '\n')
    
    try:
        
        context = ssl._create_unverified_context()  # NOTE: disables SSL certificate verification
        
        if VARIABLES['PROXY']:
            print('Using proxy ' + str(VARIABLES['PROXY']))
            proxy = urllib.request.ProxyHandler(VARIABLES['PROXY'])
            opener = urllib.request.build_opener(proxy)
            urllib.request.install_opener(opener)
        
        req = urllib.request.urlopen(url, context=context)
    except urllib.error.HTTPError:
        raise Exception('Failed to retrieve data for given parameters.')
    except urllib.error.URLError:
        raise Exception('Cannot connect to %s. Try again or edit GLOBAL_HOST variable.' % VARIABLES['GLOBAL_HOST'])
    
    CHUNK = 64 * 1024
    print('BEGIN DOWNLOAD: ' + TableName)
    
    with open(DataFileName, 'w') as fp:
        while True:
            chunk = req.read(CHUNK)
            if not chunk:
                break
            fp.write(chunk.decode('utf-8'))
            print('  %d bytes written to %s' % (len(chunk), DataFileName))  # report the actual number of bytes written
    
    with open(HeaderFileName, 'w') as fp:
        fp.write(json.dumps(TableHeader, indent=2))
        print('Header written to %s' % HeaderFileName)
    
    print('END DOWNLOAD')
    
    storage2cache(TableName)
    print('PROCESSED')

Querying any absorption coefficient always gives the same result

Hi,
I'm trying to extract the transmittance spectrum for several gases.
I followed the manual; however, I always get the same profile.
What am I doing wrong?
Thanks in advance!

import hapi
import matplotlib.pyplot as plt

hapi.db_begin('hitran_data')
for x in ['NO2', 'CO2']:
    hapi.fetch(x, 1, 1, 3000, 7000)  # fetch(TableName, M, I, numin, numax): M is the HITRAN molecule number
    nu, coef = hapi.absorptionCoefficient_Lorentz(SourceTables=x, HITRAN_units=False)
    nu, trans = hapi.transmittanceSpectrum(nu, coef)
    plt.figure()
    plt.plot(nu, trans)

linspace needs an int

On line 161, the linspace function needs to be called with an integer for the number of points. So,

return linspace(lower,upper_new,npnt)

needs to be

return linspace(lower,upper_new,int(npnt))

I'm using Python 3, so this may be a Py3 issue, or related to a newer version of numpy (I'm using 1.18.1).

@hitranonline Is this repository being actively maintained? If so, I can make a PR.

The error I was getting before this change was the following:

Traceback (most recent call last):
  File "/Users/danhickstein/Documents/NIST/2017-10-30 - Dual Comb/HITRAN/plot_HITRAN0.4.py", line 58, in <module>
    nu,c = hapi.absorptionCoefficient_Lorentz(((m, 1),), filename, HITRAN_units=False, Environment=environ)
  File "/Users/danhickstein/Documents/NIST/2017-10-30 - Dual Comb/HITRAN/hapi.py", line 19201, in absorptionCoefficient_Lorentz
    Omegas = arange_(OmegaRange[0],OmegaRange[1],OmegaStep) # fix
  File "/Users/danhickstein/Documents/NIST/2017-10-30 - Dual Comb/HITRAN/hapi.py", line 161, in arange_
    return linspace(lower,upper_new,npnt)
  File "<__array_function__ internals>", line 6, in linspace
  File "/Users/danhickstein/opt/anaconda3/lib/python3.7/site-packages/numpy/core/function_base.py", line 121, in linspace
    .format(type(num)))
TypeError: object of type <class 'numpy.float64'> cannot be safely interpreted as an integer.

Consult SDvoigt for questions related to parameters

When I use HAPI, I find that the fitted curves for SDVoigt and Voigt are identical. I want to ask whether there is a problem, and also where the speed-dependence parameter should be set for SDVoigt; I could not find it. Attached are the cases and effect charts I used.

case.docx

SSL/URL error when trying to connect to HITRAN via hapi

I have been trying to connect to HITRAN via HAPI and keep receiving an SSL/URL error.

I am using macOS 14.4.1 with Python 3.9.13. I am able to access HITRAN.org via any web browser, but am not able to fetch data using HAPI.

I have tried using this fix with a variety of HAPI versions, with no luck.

I have also seen some other people online who may be having this same issue.

One error:

HAPI version: 1.2.2.0
To get the most up-to-date version please check http://hitran.org/hapi
ATTENTION: Python versions of partition sums from TIPS-2021 are now available in HAPI code

           MIT license: Copyright 2021 HITRAN team, see more at http://hitran.org. 

           If you use HAPI in your research or software development,
           please cite it using the following reference:
           R.V. Kochanov, I.E. Gordon, L.S. Rothman, P. Wcislo, C. Hill, J.S. Wilzewski,
           HITRAN Application Programming Interface (HAPI): A comprehensive approach
           to working with spectroscopic data, J. Quant. Spectrosc. Radiat. Transfer 177, 15-30 (2016)
           DOI: 10.1016/j.jqsrt.2016.03.005

           ATTENTION: This is the core version of the HITRAN Application Programming Interface.
                      For more efficient implementation of the absorption coefficient routine, 
                      as well as for new profiles, parameters and other functional,
                      please consider using HAPI2 extension library.
                      HAPI2 package is available at http://github.com/hitranonline/hapi2

Using lahetra


Data is fetched from http://hitran.org

Traceback (most recent call last):

  File "/Users/erinmccaughey/Desktop/hapi_tests/hapi.py", line 3276, in queryHITRAN
    req = urllib2.urlopen(url)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 214, in urlopen
    return opener.open(url, data, timeout)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 523, in open
    response = meth(req, response)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 632, in http_response
    response = self.parent.error(

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 555, in error
    result = self._call_chain(*args)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 494, in _call_chain
    result = func(*args)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 747, in http_error_302
    return self.parent.open(new, timeout=req.timeout)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 517, in open
    response = self._open(req, data)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 539, in _open
    return self._call_chain(self.handle_open, 'unknown',

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 494, in _call_chain
    result = func(*args)

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/urllib/request.py", line 1417, in unknown_open
    raise URLError('unknown url type: %s' % type)

URLError: <urlopen error unknown url type: https>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "/Users/erinmccaughey/opt/anaconda3/lib/python3.9/site-packages/spyder_kernels/py3compat.py", line 356, in compat_exec
    exec(code, globals, locals)

  File "/Users/erinmccaughey/Desktop/hapi_tests/hapi_test.py", line 23, in <module>
    fetch_by_ids('co2_sdv_6056cm_primary',[7],6056,6064,ParameterGroups=['160-char','SDVoigt_Air'])

  File "/Users/erinmccaughey/Desktop/hapi_tests/hapi.py", line 5324, in fetch_by_ids
    queryHITRAN(

  File "/Users/erinmccaughey/Desktop/hapi_tests/hapi.py", line 3281, in queryHITRAN
    raise Exception(

Exception: Cannot connect to http://hitran.org. Try again or edit GLOBAL_HOST variable.

Error

Team:
I receive this error message when I start the HAPI package: "invalid syntax" at the comma after **kwargs.
Please help.

START COMPUTING...
Traceback (most recent call last):
File "C:\OLD-C-HP\Users\GOU\Desktop\KaCodes\HAPI-MOLECULAR.py", line 30, in
import hapi as hapi
File "C:\OLD-C-HP\Users\GOU\WinPython-64bit-3.4.4.1\python-3.4.4.amd64\lib\site-packages\hapi_init_.py", line 7, in
from .hapi import *
File "C:\OLD-C-HP\Users\GOU\WinPython-64bit-3.4.4.1\python-3.4.4.amd64\lib\site-packages\hapi\hapi.py", line 35549
return absorptionCoefficient_Generic(*args,**kwargs, profile=PROFILE_HT, calcpars=calculateProfileParametersFullPriority)
^
SyntaxError: invalid syntax

SDVoigt Absorption Coefficient Repeatable Errors

Hello! I think I've found two reproducible issues with absorptionCoefficient_SDVoigt. These might be user error, but I want to put them in front of the community just in case. Any guidance would be greatly appreciated.

In this first example, an extremely low pressure value (1e-5 atm) throws an IndexError, whereas a slightly higher pressure (1e-4 atm) doesn't. Newer versions of Numpy throw errors on shape/size mismatches when boolean-indexing arrays, which appears to be what's happening here; older versions didn't care. I don't know if that could be part of the issue. Also, if you use absorptionCoefficient_Voigt instead of absorptionCoefficient_SDVoigt, it appears to work just fine.

from hapi import *
db_begin('data')

fetch('CO2', 2, 1, 6350, 6375, ParameterGroups=[
      '160-char', 'Voigt', 'SDVoigt'])

nu, coef = absorptionCoefficient_SDVoigt(
    SourceTables='CO2',
    Environment={'T':117,'p':1e-05},# THIS PRESSURE VALUE THROWS THE ERROR BELOW
    #Environment={'T': 117, 'p': 1e-04},  # THIS PRESSURE VALUE WORKS
    WavenumberRange=[6350, 6375],
    WavenumberStep=0.001,
    HITRAN_units=True)

#   File "C:\ProgramData\Anaconda3\lib\site-packages\hapi\hapi.py", line 33898, in pcqsdhc
#     WR1_PART4[index_CPF] = WR1

# IndexError: boolean index did not match indexed array along dimension 0; dimension is 371 but corresponding boolean dimension is 370

User elizamanelli also posted an issue with this same pcqsdhc (partially-correlated quadratic-speed-dependent hard-collision) subroutine... I don't have the expertise to tell whether it's related.

In this second example, the absorption coefficients returned are all zeros. As in the previous example, if you use absorptionCoefficient_Voigt instead of absorptionCoefficient_SDVoigt, it works just fine.

from hapi import *
db_begin('data')

wnlowbound = 6045
wnhighbound = 6070
fetch('CH4',6,1,wnlowbound,wnhighbound,ParameterGroups=['160-char','Voigt','SDVoigt'])

nu, coef = absorptionCoefficient_SDVoigt( #this works fine if I just do Voigt instead of SDVoigt
    SourceTables='CH4',
    #Environment={'T':Tnow,'p':Pnow},
    WavenumberRange=[wnlowbound, wnhighbound],
    WavenumberStep=0.001,
    HITRAN_units=True)

# coef
# Out[2]: array([0., 0., 0., ..., 0., 0., 0.])
#sum(coef>0)
#Out[3]: 0

Again, any guidance would be greatly appreciated.

~Doug

Diluent

Hello,

I am a new user of HAPI, and I am having trouble understanding what the diluent parameter is.
I have read the HAPI manual and researched articles online, and I understand what this does to the line-broadening function in relation to gamma. However, I still don't understand what it actually means in relation to the species I am considering.

For example, if I am working with O2 (all isotopologues), how much O2 is there in relation to the broadening mixture?
In other words, what does it mean to set diluent = {'self': 1}, diluent = {'air': 1}, or diluent = {'O2': 1} in relation to the absorbing species?
Note that I get different results for each of these. What is the difference between 'O2' and 'self'?
If I just have a reference cell filled with O2, do I need a broadening mixture?
And since the manual says you can enter a volume mixing ratio as the gas mixture composition, I wonder whether the concentration is reflected in the result of the spectrum simulation.
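For reference, a sketch of the three calls being compared (usage as in the HAPI manual; whether a named diluent like 'O2' behaves the same as 'self' is exactly the question here):

nu, coef_self = absorptionCoefficient_Voigt(SourceTables='O2', Diluent={'self': 1.0})
nu, coef_air  = absorptionCoefficient_Voigt(SourceTables='O2', Diluent={'air': 1.0})
nu, coef_O2   = absorptionCoefficient_Voigt(SourceTables='O2', Diluent={'O2': 1.0})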

Diluent parameter understanding

Hello,

I am a new user of HAPI, and I am having trouble understanding what the diluent parameter is.
I have read the HAPI manual and researched articles online, and I understand what this does to the line-broadening function in relation to gamma. However, I still don't understand what it actually means in relation to the species I am considering.

For example, if I am working with H2O (all isotopologues), how much H2O is there in relation to the broadening mixture?
In other words, what does it mean to set diluent = {'self': 1}, diluent = {'air': 1}, or diluent = {'H2O': 1} in relation to the absorbing species? Note that I get different results for each of these. If I just have a reference cell filled with H2O, do I need a broadening mixture?

Partial Pressure

The documentation on the HITRAN website explains that it is possible to calculate the Lorentzian HWHM with partial pressures. I see that absorptionCoefficient_Lorentz and the other absorptionCoefficient_* functions do not support partial pressure.

Would it be possible to add this partial-pressure variable, please?
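For context, HITRAN's definition computes the Lorentzian HWHM from both pressures, gamma(p,T) = (Tref/T)**n_air * (gamma_air*(p - p_self) + gamma_self*p_self), so the request amounts to exposing p_self. A hypothetical call might look like this:

nu, coef = absorptionCoefficient_Lorentz(
    SourceTables='CO2',
    Environment={'p': 1.0, 'T': 296.0, 'p_self': 0.1})  # 'p_self' is hypothetical, not implemented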

Unphysical gap in CO2 absorption table for isotopologues 1 & 2?

When I plot the molecular cross-sections (scaled by relative abundance) for the first three isotopologues (see attached), absorption for the first two drops out completely between about 1200 and 1420 wavenumbers, leaving the third [(16O)(12C)(18O)] as the dominant isotopologue within that range, despite having an abundance of only 0.4% relative to (12C)(16O)2. Is this a real absorption gap or does it reflect an unphysical gap, either in the HITRAN database itself or in HAPI's transmittal of the data via fetch()? I notice that elsewhere, the three isotopologues show fairly similar behavior.

Each isotopologue line file was obtained with a single fetch statement, so this should not have been a problem with the wavenumber range specified. I tried fetching the data again, with the same result.

I spoke with an IR spectroscopy expert who could not think of a physical reason for such a gap.

CO2_2.pdf

403 Error in Line-by-Line Search 4. "Create New Output Format"

Dear @RomanKochanov @hitranonline:

[screenshot attached]

I am trying to use the Line-by-Line data from HITRAN to compute absorption coefficients via HAPI for a mixture of ambient gases. I cannot use the traditional .par format to retrieve the broadening parameters I require, so I must make a custom format.

In Line-by-Line data access step 4, I am trying to "Create New Output Format" as instructed on the website and in your video tutorial "Part 3: Parameters beyond the standard HITRAN format", but I keep getting a 403 (Forbidden) error. Is there a permission my user profile needs in order to use the custom output format feature of HITRAN? I am hoping to access more than just the "air" and "self" broadening coefficients.

*Update: you must be logged in to access this feature of the database.

Many thanks for your support,
Gibson

Problem connecting to the HITRAN website

Hello,

I am trying to run this simple code:

from hapi import *
db_begin('data')
fetch('H2O',1,1,3400,4100)

and I get the following errors. How can I connect to the web? Any idea how to solve this?
Traceback (most recent call last):
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 1346, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 1285, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 1331, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 1280, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 1040, in _send_output
    self.send(msg)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 980, in send
    self.connect()
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\http\client.py", line 1454, in connect
    self.sock = self._context.wrap_socket(self.sock,
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\ssl.py", line 501, in wrap_socket
    return self.sslsocket_class._create(
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\ssl.py", line 1041, in _create
    self.do_handshake()
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\ssl.py", line 1310, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1129)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\site-packages\hapi\hapi.py", line 2857, in queryHITRAN
    req = urllib2.urlopen(url)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 214, in urlopen
    return opener.open(url, data, timeout)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 523, in open
    response = meth(req, response)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 632, in http_response
    response = self.parent.error(
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 555, in error
    result = self._call_chain(*args)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 494, in _call_chain
    result = func(*args)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 747, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 517, in open
    response = self._open(req, data)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 534, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 494, in _call_chain
    result = func(*args)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 1389, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\urllib\request.py", line 1349, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1129)>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\SamiPython\Spectroscopy\test.py", line 8, in <module>
    fetch('H2O',1,1,3400,4100)
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\site-packages\hapi\hapi.py", line 4886, in fetch
    queryHITRAN(TableName,[ISO[(M,I)][ISO_INDEX['id']]],numin,numax,
  File "C:\Users\Sami\AppData\Local\Programs\Python\Python39\lib\site-packages\hapi\hapi.py", line 2861, in queryHITRAN
    raise Exception('Cannot connect to %s. Try again or edit GLOBAL_HOST variable.' % GLOBAL_HOST)
Exception: Cannot connect to http://hitran.org. Try again or edit GLOBAL_HOST variable.

csqrtY computation and meaning

Hi,

I'm currently trying to fully understand the pCqSDHC model and your implementation of it. While looking through the associated papers (http://dx.doi.org/10.1016/j.jqsrt.2013.06.015, http://dx.doi.org/10.1016/j.jqsrt.2013.05.034, http://dx.doi.org/10.1016/j.jqsrt.2013.10.015), I have been able to retrace almost every step of the computation. I'm only grappling with your variable csqrtY, which is defined on line 33861:

csqrtY = (Gam2 - iz * Shift2) / (2.0e0 * cte * (1.0e0-eta) * (Gam2 ** 2 + Shift2 ** 2))

I expect this variable csqrtY to be the square root of the variable Y, based on its name and its further usage. Y is defined just above, on line 33860:

Y = __ComplexType__(1.0e0 / ((2.0e0 * cte * c2t)) ** 2)

which would simply be

Y = 1 / (2 * cte * c2t)**2 = 1 / (2 * cte * (1-eta) * c2)**2 = 1 / (2 * cte * (1-eta) * (Gam2 + 1j*Shift2))**2
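(For what it's worth, taking the square root of that last expression and rationalizing the complex denominator does reproduce the code exactly:

sqrt(Y) = 1 / (2 * cte * (1-eta) * (Gam2 + 1j*Shift2))
        = (Gam2 - 1j*Shift2) / (2 * cte * (1-eta) * (Gam2**2 + Shift2**2))

assuming c2 = Gam2 + 1j*Shift2.)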

Could you please confirm whether that is the step I missed in your computation of csqrtY? Your help would be greatly appreciated, thanks!

DB stored incorrectly when data includes iso_ids >= 10

The issue prevents loading DBs, even when they were downloaded with HAPI directly, e.g.:

import hapi
import os

# start a fresh DB and download CO2=2, isotopologue=10
pathstr = './hapi/test'
hapi.db_begin(pathstr)
hapi.fetch('testme', 2, 10, 2250, 2251)
hapi.db_commit()

# open the DB path, loading the stored databases
hapi.db_begin(pathstr)    #  !this will cause an Exception

Viewing the data file, it seems HAPI saves the iso_ids in the HITRAN file in decimal, unlike the HITRAN web downloads:

#HAPI download
 210 2250.088119 1.280E-28 1.699E+02.06740.083 1682.20280.76-.002884       1 0 0 12       1 0 0 02                    R 36e     3777642231 5 4 5 7   150.0  146.0

#Hitran download
 20 2250.088119 1.280E-28 1.699e+02.06740.083 1682.20280.76-.002884       1 0 0 12       1 0 0 02                    R 36e     3777642231 5 4 5 7   150.0  146.0

During loads, HAPI expects the iso ids at a fixed location, one digit wide. HITRAN web downloads are loaded correctly, but files fetched with HAPI fail.
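One possible fix (a sketch, matching the two records shown above, where the web download writes '0' in the single isotopologue column for isotopologue 10):

def encode_local_iso_id(iso_id):
    # The 160-char .par format reserves one column for the local iso id,
    # so HITRAN writes isotopologue 10 as '0'; writing the decimal '10'
    # shifts every subsequent field by one column.
    return '0' if iso_id == 10 else str(iso_id)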

Two questions about applying instrumental functions

Thank you very much for your work! I have a few questions about the INSTRUMENTAL FUNCTIONS part.
The first question: should the Michelson instrumental function not weight the center point?
The code is shown below (with my modification):

def SLIT_MICHELSON(x,g):
    """
    Instrumental (slit) function.
    B(x) = 2/γ*sin(2pi*x/γ)/(2pi*x/γ) if x!=0 else 1,
    where 1/γ is the maximum optical path difference.
    """
    y = np.zeros(len(x))
    index_zero = x==0
    index_nonzero = ~index_zero
    dk_ = 2*np.pi/g
    x_ = dk_*x[index_nonzero]
    #Michelson:B(x)=\frac{2}{\gamma}\sin^2\left(\frac{\pi}{\gamma}x\right)/\left(\frac{\pi}{\gamma}x\right)^2
    #Original code:y[index_zero] = 1
    y[index_zero]= 2 / g
    y[index_nonzero] = 2/g*np.sin(x_)/x_
    return y

The second question concerns the wing-width sampling points: if the center point at 0 is not sampled, will this cause a wavenumber drift?
The code is shown below:

def arange_(lower,upper,step):
    npnt = floor((upper-lower)/step)+1
    npnt = int(npnt) # cast to integer to avoid type errors
    upper_new = lower + step*(npnt-1)
    if abs((upper-upper_new)-step) < 1e-10:
        upper_new += step
        npnt += 1    
    return linspace(lower,upper_new,npnt)

Thanks again for your contributions!

atmospheric mixing ratios or volume fractions for constituents

I'm aware that the relative abundances of the different isotopologues (within each chemical constituent) are part of the HAPI database and are specified in the ISO dictionaries, but I haven't found where atmospheric abundances (e.g., atmospheric mixing ratios or volume fractions) for each molecule are specified. Unless I'm misunderstanding something, these would seem to be necessary for getting transmittance through air over a specified path length, or an absorption coefficient in units of cm-1.

If the intent is to give transmittance only for a pure constituent, how is air broadening computed? If the intent is instead for the user to supply concentrations or mixing ratios and work with cross-sections per molecule, where can "standard" concentrations be found, especially for the more obscure trace constituents?

UPDATE: Because of continued uncertainty regarding the meaning of an absorption coefficient in cm-1 for mixtures of air and a given constituent (there doesn't appear to be a mechanism for specifying the mixing ratio), I am avoiding the HITRAN_units=False option and using my own routines to compute the absorption coefficient, transmittance, etc., with GammaL="gamma_air". I would still welcome clarification, including perhaps in the HAPI manual, if I'm misunderstanding how HAPI is intended to be used for air mixtures.
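For what it's worth, a sketch of the workaround described in the update (the mixing ratio is illustrative; volumeConcentration is the same number-density helper HAPI uses internally):

nu, xsect = hapi.absorptionCoefficient_Voigt(
    SourceTables='CO2', HITRAN_units=True, GammaL='gamma_air')  # cm2/molecule
n_air = hapi.volumeConcentration(1.0, 296.0)  # molecules/cm3 at p = 1 atm, T = 296 K
vmr = 420e-6                                  # illustrative CO2 mixing ratio
coef = xsect * n_air * vmr                    # absorption coefficient in cm-1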

Question about the partial volume concentration of gas used in calculating the absorption coefficient

Regarding NATURAL_ABUNDANCES & ABUNDANCES: the values in these two dictionaries are the same, so the expression at line 19116,

Xsect[BoundIndexLower:BoundIndexUpper] += factor / NATURAL_ABUNDANCES[(MoleculeNumberDB,IsoNumberDB)] * \
                                          ABUNDANCES[(MoleculeNumberDB,IsoNumberDB)] * \
                                          LineIntensity * lineshape_vals

is the same as

Xsect[BoundIndexLower:BoundIndexUpper] += factor * LineIntensity * lineshape_vals

I am not sure whether this is a bug or my misunderstanding.

Furthermore, when calculating the absorption coefficient with HITRAN_units = False, the code at line 18991,

if HITRAN_units:
    factor = __FloatType__(1.0)
else:
    factor = volumeConcentration(p,T)

uses the Environment parameter 'p' as the gas pressure. In that case, the parameter Diluent={'air': ..., 'self': ..., ...} seems to play no role in the calculation of the partial concentration.

By the way, I am new to HITRAN/HAPI, so please correct me if I have misunderstood the code. I look forward to your reply.

Basic functionality broken for Python 3 on Windows (LF end-of-line character issue?)

Hi,

Firstly, thanks for the useful API to the HITRAN database.

Unfortunately, it seems not to work with Python 3 on Windows machines; e.g., running a basic fetch command fails with the following error:

Command:

import hapi as h
h.fetch('CO2',2,1,2000,2100)

Result:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-6-12a3201e4aae> in <module>()
----> 1 h.fetch('CO2',2,1,2000,2100)

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in fetch(TableName, M, I, numin, numax, ParameterGroups, Parameters)
   5496     """
   5497     queryHITRAN(TableName,[ISO[(M,I)][ISO_INDEX['id']]],numin,numax,
-> 5498                 pargroups=ParameterGroups,params=Parameters)
   5499     iso_name = ISO[(M,I)][ISO_INDEX['iso_name']]
   5500     Comment = 'Contains lines for '+iso_name

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in queryHITRAN(TableName, iso_id_list, numin, numax, pargroups, params, dotpar, head)
   3344     # Set comment
   3345     # Get this table to LOCAL_TABLE_CACHE
-> 3346     storage2cache(TableName)
   3347     print('PROCESSED')
   3348 

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in storage2cache(TableName, cast, ext)
   1768             converters.append(cfunc)
   1769             #start = end
-> 1770         data_matrix = [[cvt(line) for cvt in converters] for line in InfileData]
   1771         data_columns = zip(*data_matrix)
   1772         for qnt, col in zip(quantities, data_columns):

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in <listcomp>(.0)
   1768             converters.append(cfunc)
   1769             #start = end
-> 1770         data_matrix = [[cvt(line) for cvt in converters] for line in InfileData]
   1771         data_columns = zip(*data_matrix)
   1772         for qnt, col in zip(quantities, data_columns):

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in <listcomp>(.0)
   1768             converters.append(cfunc)
   1769             #start = end
-> 1770         data_matrix = [[cvt(line) for cvt in converters] for line in InfileData]
   1771         data_columns = zip(*data_matrix)
   1772         for qnt, col in zip(quantities, data_columns):

C:\Dropbox\Work\Code\forked\hapi\hapi\hapi.py in cfunc(line, dtype, start, end)
   1764                                 raise Exception('PARSE ERROR: unknown format of the par value (%s)'%line[start:end])
   1765                 else:
-> 1766                     return dtype(line[start:end])
   1767             #cfunc.__doc__ = 'converter {} {}'.format(qnt, fmt) # doesn't work in earlier versions of Python
   1768             converters.append(cfunc)

ValueError: invalid literal for int() with base 10: '\n'

I did get this same code working on Python 2, and with Python 3 on an Ubuntu kernel, however.

Looking at the error message, I think this may arise from the different end-of-line characters on Windows and Unix. Since Python 3 changed the default file mode for open() compared to Python 2, this may explain why the error only happens with Python 3 on Windows.

I found a quick fix (after https://stackoverflow.com/questions/2536545/how-to-write-unix-end-of-line-characters-in-windows-using-python/23434608#23434608): adding the argument newline='\n' to each open() call forces Unix end-of-line characters on all operating systems. I have a fork of this repo including the fixes.
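The change is one argument per call; a sketch:

def write_unix(path, text):
    # newline='\n' forces Unix line endings on every platform, so the
    # fixed-width records parse identically on Windows and Unix.
    with open(path, 'w', newline='\n') as fp:
        fp.write(text)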

Basic features (data extraction and plotting, as per the HAPI manual) now work for me, although I haven't fully tested the extent of these changes.

If you want, I can submit these as a PR.

Cheers
