
hyp3-isce2's Introduction

HyP3 documentation


HyP3 documentation is built using MkDocs and the ASF Theme.

How to

Setting up a development environment

In order to automatically document some of our APIs, we use a conda environment with our APIs installed. You can get Miniconda (recommended) here:

https://docs.conda.io/en/latest/miniconda.html

Once conda is installed, from the repository root, you can create and activate a conda environment with all the necessary dependencies

conda env create -f environment.yml
conda activate hyp3-docs

Later, you can update the environment's dependencies with

conda env update -f environment.yml

Build and view the documentation site

With the hyp3-docs conda environment activated, run

mkdocs serve

to generate the documentation. This will allow you to view it at http://127.0.0.1:8000/. MkDocs will automatically watch for new/changed files in this directory and rebuild the website so you can see your changes live (just refresh the webpage!).

Note: mkdocs serve captures your terminal; use ctrl+c to exit. It is recommended that you use a second, dedicated terminal so you can keep this command running.

Deploy

This documentation site is deployed as a GitHub Organization website with a CNAME so that it's viewable at https://hyp3-docs.asf.alaska.edu/. The website is served out of the special https://github.com/ASFHyP3/ASFHyP3.github.io repository. Deployment is handled automatically with the .github/workflows/deploy_to_github_io.yml GitHub Action for any merge to main.

There is also a test site deployed to https://hyp3-docs.asf.alaska.edu/hyp3-docs, which tracks the develop branch of this repo and is served out of the gh-pages branch of this repo.

Enable or disable the announcement banner

We can display a site-wide banner for important announcements. The content of this banner is specified in overrides/main.html, which should contain the following placeholder text when the banner is not in use:

{% extends "partials/main.html" %}

{# Uncomment this block to enable the announcement banner:
{% block announce %}
<div id="announcement-content">
    ⚠️ TODO: Your announcement here.<br />
    <a class="announcement-link" href="TODO">Read the full announcement.</a>
</div>
{% endblock %}
#}

In order to enable the banner, uncomment the announce block and fill in the TODOs. Below is an example of an enabled announcement banner (taken from here):

{% extends "partials/main.html" %}

{% block announce %}
<div id="announcement-content">
    ⚠️ Monthly processing quotas were replaced by a credit system on April 1st.<br />
    <a class="announcement-link" href="/using/credits">Read the full announcement.</a>
</div>
{% endblock %}

When the announcement is no longer needed, restore the file to the placeholder text in order to disable the banner.

If you are building and viewing the site locally, you will need to exit with ctrl+c and then re-run mkdocs serve in order to re-render any changes you make to this file.

Markdown formatting

MkDocs and GitHub parse markdown documents slightly differently. Some compatibility tips:

  • Raw links should be wrapped in angle brackets: <https://example.com>

  • MkDocs is pickier about whitespace between block types (e.g., headers, paragraphs, lists) and seems to expect indents to be 4 spaces. So to get a representation like:


    • A list item

      A sub list heading
      • A sub-list item

    in MkDocs, you'll want to write it like:

    Good

    - A list item
    
        ##### A sub list heading
        - A sub-list item
    

    Bad

    - A list item
      ##### A sub list heading
      - A sub-list item
    
    - A list item
        ##### A sub list heading
        - A sub-list item
    
    - A list item
    
      ##### A sub list heading
      - A sub-list item
    

hyp3-isce2's People

Contributors

andrewplayer3, asjohnston-asf, cirrusasf, dependabot[bot], forrestfwilliams, github-actions[bot], hjkristenson, jacquelynsmale, jhkennedy, jtherrmann, mfangaritav, scottyhq


hyp3-isce2's Issues

Coverage workflow does not support PRs from Forks

Jira: https://asfdaac.atlassian.net/browse/TOOL-2507

Note: The above link is accessible only to members of ASF.


Our coverage-report.yml workflow doesn't support PRs from forks because it does not have write permissions to the user's fork.

Example: https://github.com/ASFHyP3/hyp3-isce2/actions/runs/7504780261/job/20441454780?pr=177

Ideally, we'd provide a coverage badge that doesn't depend on an image being committed to the repository. Alternatively, we could figure out how to give the bot maintainer-like permissions, or simply skip the job for PRs from forks.
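As a hedged sketch of the "skip on forks" option (the job name and step are illustrative, not our actual workflow), the coverage job could be guarded so it only runs for same-repo PRs:

```yaml
# Illustrative guard for coverage-report.yml (names are assumptions):
# only run the coverage job when the PR head branch lives in this
# repository, since the workflow token cannot write to a fork.
jobs:
  coverage:
    if: github.event.pull_request.head.repo.full_name == github.repository
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
```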

DEM output may not be correct

Initial explorations have shown that we might be outputting an incorrect DEM, which leads to strange DEM bounds and potentially incorrect data. Right now we are using the merged/z.rdr.full file to create the output geotiff. More exploration of this issue is needed.

Support Python 3.10

When I specify python=3.10 in the environment file, I get the following conflicts:

python-3-10-conflicts-hyp3-isce2.txt

I have not yet inspected this output carefully enough to identify the root of the conflicts. For now, it seems we cannot support Python 3.10.

GitHub Actions fails to push `hyp3-isce2` image to GHCR with `403 Forbidden`

Jira: https://asfdaac.atlassian.net/browse/TOOL-2099

The dockerize job for #119 is failing here with:

Error: buildx failed with: ERROR: failed to solve: failed to push ghcr.io/asfhyp3/hyp3-isce2:0.6.2.dev5_g3db5604: unexpected status: 403 Forbidden

Relevant links:

Control resolution of geocoded outputs

Right now it appears that the geocoded outputs match the resolution of the input DEM (30 x 30 m). We will want to customize the output pixel resolution for each multilook combination like this:

| Multilook | Geocoded resolution (m) |
|-----------|-------------------------|
| 20x4      | 80                      |
| 10x2      | 40                      |
| 5x1       | 20                      |

This will likely need to be accomplished by creating a DEM with the target resolution specifically for geocoding.
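The mapping above can be sketched as a small lookup (the function and constant names are illustrative, not part of hyp3-isce2):

```python
# Target geocoded pixel size in meters for each supported multilook
# setting (values from the table above). Names are hypothetical.
LOOKS_TO_RESOLUTION_M = {'20x4': 80, '10x2': 40, '5x1': 20}


def geocode_resolution(looks: str) -> int:
    """Return the target geocoded resolution (m) for a 'range x azimuth' looks string."""
    if looks not in LOOKS_TO_RESOLUTION_M:
        raise ValueError(f'Unsupported looks: {looks}')
    return LOOKS_TO_RESOLUTION_M[looks]
```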

SSLCertVerificationError for sentinel1-burst.asf.alaska.edu

The bug

  File "/Users/scott/miniforge3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='sentinel1-burst.asf.alaska.edu', port=443): Max retries exceeded with url: /S1A_IW_SLC__1SDV_20200604T022251_20200604T022318_032861_03CE65_7C85/IW2/VV/7.xml (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1006)')))

To Reproduce

Ran into this around 4 PM Pacific this afternoon. It seems to be a server-side issue affecting any pair (the traceback above is from running the example in the README).

Matching Hosted Hyp3 Outputs Locally

Background

Running hyp3-isce2 (0.9.2) via ASF's hosted HyP3 service is fantastic. I also like that there is an option to run the workflow locally via conda or Docker. But I'm not sure how to obtain an identical output when running locally. In particular, it seems the final formatting step comes from a separate workflow?
translate_outputs: Convert the outputs of hyp3-isce2 to hyp3-gamma formatted geotiff files

Describe the solution you'd like

Document someplace how to get the same output locally

Provide an amplitude image with burst InSAR products

Jira: https://asfdaac.atlassian.net/browse/TOOL-2776

Note: The above link is accessible only to members of ASF.


We recently received a request for burst InSAR products to include an amplitude image like we provide in our GAMMA-based InSAR products with this justification:

In the context of time series InSAR processing, the inclusion of amplitude information holds paramount importance. It serves as a fundamental resource for various critical tasks, including the identification of stable targets and the validation of statistically homogeneous families, thereby contributing substantially to the overall efficacy and accuracy of the analysis.

`ValueError: There should only be 2 VRT files` error for `insar_isce_burst` workflow during preprocessing

The bug

The insar_isce_burst workflow (and INSAR_ISCE_TEST jobs run via HyP3) potentially fail with this error during ISCE preprocessing:
ValueError: There should only be 2 VRT files in the reference and secondary directories, this indicates there is likely a bug in the region of interest generation.

To Reproduce

I've only observed this error for this pair of bursts (so far):

S1_165626_IW2_20230708T005451_VV_00E5-BURST
S1_165626_IW2_20230720T005452_VV_CD3B-BURST

The workflow fails when run at any of 20x4, 10x2, or 5x1 looks.
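A minimal sketch of the kind of check behind that error (the helper is illustrative, not the actual hyp3-isce2 code):

```python
from pathlib import Path


def count_vrt_files(*directories: str) -> int:
    """Count .vrt files across directories; the preprocessing check
    expects exactly 2 total (one reference, one secondary)."""
    return sum(len(list(Path(d).glob('*.vrt'))) for d in directories)
```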

Example failed jobs:

hyp3-isce2 cannot process a pair that crosses multiple partitions of the GSHHG parquet

The bug

hyp3-isce2 currently creates the water mask from a GSHHG parquet file (asf-dem-west/WATER_MASK/GSHHG/hyp3_water_mask_20220912). To speed up reads, the GSHHG data is stored in 8 partitions. When a pair crosses multiple partitions, hyp3-isce2 cannot produce the correct water mask and the program stops with an error.
As an example, take this pair over Spain:
S1_062521_IW1_20230622T175457_VV_91B3-BURST
S1_062521_IW1_20230704T175458_VV_C6F8-BURST
which crosses two partitions (000_-0090 and 000_0000). hyp3-isce2 gives the following error when processing this pair:
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/jiangzhu/projects/work/hyp3/hyp3-isce2/src/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/home/jiangzhu/projects/work/hyp3/hyp3-isce2/src/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jiangzhu/projects/work/hyp3/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 439, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/home/jiangzhu/projects/work/hyp3/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 136, in insar_tops_burst
    resample_to_radar_io(water_mask_path, 'merged/lat.rdr', 'merged/lon.rdr', 'merged/water_mask.rdr')
  File "/home/jiangzhu/projects/work/hyp3/hyp3-isce2/src/hyp3_isce2/utils.py", line 189, in resample_to_radar_io
    mask = np.reshape(mask, [maskim.coord2.coordSize, maskim.coord1.coordSize])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jiangzhu/apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/numpy/core/fromnumeric.py", line 285, in reshape
    return _wrapfunc(a, 'reshape', newshape, order=order)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jiangzhu/apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/numpy/core/fromnumeric.py", line 59, in _wrapfunc
    return bound(*args, **kwds)
           ^^^^^^^^^^^^^^^^^^^^
ValueError: cannot reshape array of size 2 into shape (3600,7200)

The problem is that the code cannot produce the correct water mask.
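The reshape failure at the bottom of the traceback can be reproduced in isolation; this is only an illustration of the failure mode, not the hyp3-isce2 code:

```python
import numpy as np

# When the mask read from the wrong parquet partition is nearly empty
# (here, 2 values), it cannot be reshaped to the full radar grid, which
# produces the ValueError seen in the traceback above.
mask = np.array([0, 1])
try:
    np.reshape(mask, (3600, 7200))
except ValueError as err:
    message = str(err)
```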

The `isce2` package is incompatible with newer versions of `numpy`

I get the following error:

  File "/home/jth/code/hyp3-isce2/src/hyp3_isce2/topsapp.py", line 175, in run_topsapp_burst
    insar.run()
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/iscesys/Component/Application.py", line 142, in run
    exitStatus = self._processSteps()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/iscesys/Component/Application.py", line 405, in _processSteps
    result = func(*pargs, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/TopsProc/Factories.py", line 40, in __call__
    return self.method(self.other, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/TopsProc/runFineResamp.py", line 387, in runFineResamp
    azCarrPoly, dpoly = secondary.estimateAzimuthCarrierPolynomials(secondaryBurst, offset = -1.0 * offset)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/Sensor/TOPS/TOPSSwathSLCProduct.py", line 285, in estimateAzimuthCarrierPolynomials
    x = np.arange(0, burst.numberOfSamples,xstep,dtype=np.int)
                                                       ^^^^^^
  File "/home/jth/jth-apps/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/numpy/__init__.py", line 305, in __getattr__
    raise AttributeError(__former_attrs__[attr])
AttributeError: module 'numpy' has no attribute 'int'.
`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
    https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations. Did you mean: 'inf'?

The issue is resolved by including the following lines in environment.yml:

- python>=3.8,<=3.10
- numpy<1.20

NumPy must be pinned to <1.20 because of the isce2 incompatibility, and Python must be pinned to <=3.10 because the older version of NumPy is incompatible with Python 3.11.

However, I have isce2 version 2.6.1 installed, and I just noticed that 2.6.2 is available from https://anaconda.org/conda-forge/isce2. So perhaps the newer version of isce2 resolves the numpy issue.

TODO:

  • Determine if installing isce2 version 2.6.2 solves the issue.
  • If not, pin python and numpy as shown above.
  • Unpin those packages when the issue is fixed in isce2
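For reference, the failing ISCE2 line can also be fixed by swapping the removed `np.int` alias for the builtin `int` (behavior-preserving per the NumPy 1.20 release notes; the stand-in values below are illustrative):

```python
import numpy as np

# Original failing line in TOPSSwathSLCProduct.py:
#   x = np.arange(0, burst.numberOfSamples, xstep, dtype=np.int)
# np.int was a deprecated alias for the builtin int, removed in NumPy 1.24.
number_of_samples, xstep = 1000, 10  # stand-in values, not real burst metadata
x = np.arange(0, number_of_samples, xstep, dtype=int)
```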

ValueError: ASF Search failed to find S1_023790_IW1_20230621T121426_VV_BAD8-BURST.

The bug

When running the workflow locally, I encountered this error a few times in a row. 15 minutes later I tried again and the workflow ran as expected. Now the error is back.

To Reproduce

python -m hyp3_isce2 ++process insar_tops_burst S1_023790_IW1_20231218T121430_VV_B9A6-BURST S1_023790_IW1_20230621T121426_VV_BAD8-BURST --looks 5x1

Additional context

2024-01-24 14:20:57,426 - hyp3_isce2.insar_tops_burst - INFO - Begin ISCE2 TopsApp run
2024-01-24 14:21:27,543 - root - ERROR - Connection Error (Timeout): CMR took too long to respond. Set asf constant "CMR_TIMEOUT" to increase. (url='https://cmr.earthdata.nasa.gov/search/granules.umm_json', timeout=30)
2024-01-24 14:21:28,000 - root - ERROR - The asf-search module ecountered an error with CMR, and the following message was automatically reported to ASF:

"
Error Message: Connection Error (Timeout): CMR took too long to respond. Set asf constant "CMR_TIMEOUT" to increase. (url='https://cmr.earthdata.nasa.gov/search/granules.umm_json', timeout=30)
User Agent: Python/3.11.6; requests/2.31.0; asf_search/6.7.3
Search Options: {
    product_list: ['S1_023790_IW1_20230621T121426_VV_BAD8-BURST']
}
"
If you have any questions email [email protected]

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/mnt/c/Users/qbren/Desktop/taco/projects/hyp3-isce2/src/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/mnt/c/Users/qbren/Desktop/taco/projects/hyp3-isce2/src/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/c/Users/qbren/Desktop/taco/projects/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 508, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/mnt/c/Users/qbren/Desktop/taco/projects/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 87, in insar_tops_burst
    ref_params = get_burst_params(reference_scene)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/c/Users/qbren/Desktop/taco/projects/hyp3-isce2/src/hyp3_isce2/burst.py", line 373, in get_burst_params
    raise ValueError(f'ASF Search failed to find {scene_name}.')
ValueError: ASF Search failed to find S1_023790_IW1_20230621T121426_VV_BAD8-BURST.

Fix merge permissions and `CODEOWNERS`

After I approved #22, I was able to merge but @forrestfwilliams was not.

Also, https://github.com/ASFHyP3/hyp3-isce2/blob/develop/.github/CODEOWNERS contains errors but https://github.com/ASFHyP3/hyp3-gamma/blob/develop/.github/CODEOWNERS does not, even though they appear to be identical.

I don't know if either of these two issues is related, or if the merge permissions issue has anything to do with this setting (which is enabled) from the main branch protection rule:

Restrict who can push to matching branches

Ensure common pixel grid

  • ensure common pixel grid for geotiffs within a single job
  • ensure common pixel grid for geotiffs across multiple jobs
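One common way to guarantee a shared grid, both within and across jobs, is to snap output bounds onto a grid defined by a fixed origin and pixel size. A minimal sketch (names are illustrative, not hyp3-isce2 API):

```python
import math


def snap_to_grid(origin: float, pixel_size: float, coord: float) -> float:
    """Snap a coordinate down onto the pixel grid defined by origin and
    pixel_size, so geotiffs produced in different jobs share pixel edges."""
    return origin + math.floor((coord - origin) / pixel_size) * pixel_size
```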

Name at burst processing

The news that hyp3-sdk supports burst processing with ISCE2 in the new release is encouraging. However, the only burst data currently searchable on the ASF website is from June 2023 onward, with fewer burst sequences, which makes it difficult to support time-series InSAR processing (only some of the test regions have long time-series bursts).
I would like to know when the full burst catalog will be available, or whether another method could be added in the meantime to replace the current burst numbering: submitting a job by the full SLC granule name, the swath number, and the burst sequence number, as was possible in a previous version of hyp3-isce2.
As far as I know, determining the granule name, swath number, and burst sequence number for a burst at the same location is complicated. I found a method and tested it; it had some errors, which I fixed, and after running it many times on my local computer the modified method has produced no errors so far. Hopefully this can serve as an alternative until the full burst numbering is complete. Unfortunately, at the moment I cannot submit jobs using this granule/swath/burst-number method.

https://nbviewer.org/github/relativeorbit/hyp3bursts/blob/main/workflow-ascending.ipynb

modified:
https://github.com/ZGHHGZ/Single-Burst-Processing-Flow/blob/main/SBPF/code/hyp3_isce2_start.ipynb

insar_tops workflow fails to geocode z.rdr.full.vrt

z.rdr.full.vrt was added to the list of images to geocode at https://github.com/ASFHyP3/hyp3-isce2/blob/develop/src/hyp3_isce2/topsapp.py#L38

The geocoding code expects filename + '.xml' to exist in the same directory. For the DEM, the only files in the merged directory are:

z.rdr.full.vrt
z.rdr.full.xml

So the program fails with a No such file or directory: 'merged/z.rdr.full.vrt.xml' exception:

Traceback (most recent call last):
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/bin/insar_tops", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/insar_tops.py", line 105, in main
    product_dir = insar_tops(
                  ^^^^^^^^^^^
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/insar_tops.py", line 81, in insar_tops
    topsapp.run_topsapp_burst(start='startup', end='geocode', config_xml=config_path)
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/topsapp.py", line 172, in run_topsapp_burst
    insar.run()
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/iscesys/Component/Application.py", line 142, in run
    exitStatus = self._processSteps()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/iscesys/Component/Application.py", line 405, in _processSteps
    result = func(*pargs, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/TopsProc/Factories.py", line 40, in __call__
    return self.method(self.other, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/TopsProc/runGeocode.py", line 125, in runGeocode
    inImage,method = ge.create(prod)
                     ^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/stdproc/rectify/geocode/Geocodable.py", line 62, in create
    prop, fac, misc = parser.parse(filename + '.xml')
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/iscesys/Parsers/XmlParser.py", line 41, in parse
    root = ET.parse(filename)
           ^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/xml/etree/ElementTree.py", line 1218, in parse
    tree.parse(source, parser)
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/xml/etree/ElementTree.py", line 569, in parse
    source = open(source, "rb")
             ^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'merged/z.rdr.full.vrt.xml'
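A minimal sketch of the invariant the geocoder relies on (the helper name is made up; ISCE2's Geocodable.create parses filename + '.xml'), which explains why geocoding z.rdr.full would succeed while z.rdr.full.vrt fails:

```python
from pathlib import Path


def sidecar_xml_exists(image_path: str) -> bool:
    """ISCE2's geocoder parses image_path + '.xml'; with only
    z.rdr.full.vrt and z.rdr.full.xml present, the sidecar exists for
    'merged/z.rdr.full' but not for 'merged/z.rdr.full.vrt'."""
    return Path(image_path + '.xml').exists()
```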

OSError: [Errno 101] Network is unreachable

The bug

I noticed what seems like an intermittent network error. Simply waiting and re-running is a workaround, but I wanted to report it anyway (hyp3-isce2 v1.0.1).

requests.exceptions.ConnectionError: HTTPSConnectionPool(host='urs.earthdata.nasa.gov', port=443): Max retries exceeded with url: /oauth/authorize?response_type=code&client_id=BO_n7nTIlMljdvU6kRRB3g&redirect_uri=https%3A%2F%2Fauth.asf.alaska.edu%2Flogin (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f59187f18d0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))

To Reproduce

As mentioned, it isn't consistently reproducible; I ran the following when I encountered it:

python -m hyp3_isce2 ++process insar_tops_burst \
      S1_292377_IW2_20160314T020058_VV_C91D-BURST \
      S1_292377_IW2_20160407T020058_VV_EB2A-BURST \
      --looks 20x4 \
      --apply-water-mask false 

Additional context

Full Traceback
Traceback (most recent call last):
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
  sock = connection.create_connection(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
  raise err
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection
  sock.connect(sa)
OSError: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 791, in urlopen
  response = self._make_request(
             ^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 492, in _make_request
  raise new_e
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 468, in _make_request
  self._validate_conn(conn)
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 1097, in _validate_conn
  conn.connect()
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 611, in connect
  self.sock = sock = self._new_conn()
                     ^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 218, in _new_conn
  raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f59187f18d0>: Failed to establish a new connection: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
  resp = conn.urlopen(
         ^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 845, in urlopen
  retries = retries.increment(
            ^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment
  raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='urs.earthdata.nasa.gov', port=443): Max retries exceeded with url: /oauth/authorize?response_type=code&client_id=BO_n7nTIlMljdvU6kRRB3g&redirect_uri=https%3A%2F%2Fauth.asf.alaska.edu%2Flogin (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f59187f18d0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 51, in <module>
  main()
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 47, in main
  sys.exit(process_entry_point.load()())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 530, in main
  insar_tops_burst(
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 101, in insar_tops_burst
  ref_metadata, sec_metadata = download_bursts([ref_params, sec_params])
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/burst.py", line 331, in download_bursts
  with get_asf_session() as asf_session:
       ^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/burst.py", line 309, in get_asf_session
  response = session.get('https://urs.earthdata.nasa.gov/oauth/authorize', params=payload)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 602, in get
  return self.request("GET", url, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
  resp = self.send(prep, **send_kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
  r = adapter.send(request, **kwargs)
      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
  raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='urs.earthdata.nasa.gov', port=443): Max retries exceeded with url: /oauth/authorize?response_type=code&client_id=BO_n7nTIlMljdvU6kRRB3g&redirect_uri=https%3A%2F%2Fauth.asf.alaska.edu%2Flogin (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f59187f18d0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
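Since waiting and re-running works, one possible mitigation is a generic retry with backoff around the session setup. This is a hedged stdlib sketch, not hyp3-isce2 code:

```python
import time
from typing import Callable, TypeVar

T = TypeVar('T')


def with_retries(fn: Callable[[], T], attempts: int = 3, backoff: float = 2.0) -> T:
    """Retry fn on OSError (which covers [Errno 101] Network is
    unreachable), sleeping backoff * 2**attempt seconds between tries
    and re-raising after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * 2 ** attempt)
    raise RuntimeError('unreachable')
```

For example, the failing session creation could be wrapped as `with_retries(get_asf_session)` at the call site.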

Create a readme to be included with output products

Like hyp3-gamma, we need to create a readme describing the sub-products in the output product directory. The hyp3-gamma readme template is here. At a high level, this file should include:

  • source granule names
  • product naming specification
  • pixel spacing discussion
  • how to use this data
    • point to hyp3 docs
    • how to cite
    • information on input granules
    • information on input dem
  • product contents
    • description of each sub-product
  • InSAR processing
    • short summary as numbered list
    • in depth description of each step
  • information on the sentinel-1 mission
  • how to contact ASF

Instructions for trying this plugin

Background

Hi! I'm working with several students at UW who have been working with HyP3 (thank you, it is a wonderful resource for the research community). But we'd like to try ISCE2 processing for some projects. This seems to be a place where we could test things out and contribute.

But it's unclear to me from the docs if you need to deploy your own HYP3 to use this (https://hyp3-docs.asf.alaska.edu/plugins/)

Additional context

We have our own AWS account if needed.

hyp3lib.exceptions.OrbitDownloadError

The bug

Occasionally it seems some sort of network error leads to this traceback, which is a bit misleading. The orbit files do exist, the download just failed for some reason.

hyp3lib.exceptions.OrbitDownloadError: Unable to find a valid orbit file from providers: ('ESA', 'ASF')

To Reproduce

It's intermittent, so hard to reproduce.
(hyp3-isce2 v1.0.1)

python -m hyp3_isce2 ++process insar_tops_burst \
      S1_292377_IW2_20190504T020036_VV_6F58-BURST \
      S1_292377_IW2_20190516T020037_VV_9045-BURST \
      --looks 20x4 \
      --apply-water-mask false 

# S1B_OPER_AUX_POEORB_OPOD_20210302T092716_V20190503T225942_20190505T005942.EOF
# S1B_OPER_AUX_POEORB_OPOD_20210302T141618_V20190515T225942_20190517T005942.EOF

Additional context

2024-05-08 17:59:48,223 - root - WARNING - Error encountered fetching AUX_POEORB orbit file from ESA; looking for another
2024-05-08 18:04:17,124 - root - WARNING - Error encountered fetching AUX_POEORB orbit file from ASF; looking for another
2024-05-08 18:04:18,505 - root - INFO - Downloading None
2024-05-08 18:04:19,177 - root - WARNING - Error encountered fetching AUX_RESORB orbit file from ESA; looking for another
2024-05-08 18:06:13,669 - root - INFO - Downloading None
2024-05-08 18:06:13,669 - root - WARNING - Error encountered fetching AUX_RESORB orbit file from ASF; looking for another
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 530, in main
    insar_tops_burst(
  File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 128, in insar_tops_burst
    downloadSentinelOrbitFile(granule, str(orbit_dir), esa_credentials=(esa_username, esa_password))
  File "/home/runner/micromamba/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3lib/get_orb.py", line 189, in downloadSentinelOrbitFile
    raise OrbitDownloadError(f'Unable to find a valid orbit file from providers: {providers}')
hyp3lib.exceptions.OrbitDownloadError: Unable to find a valid orbit file from providers: ('ESA', 'ASF')

Unable to write ISCE format on MacOS (SystemError: Unkown GDAL Error)

The bug

With version 0.9.2, the example workflow fails when saving the glo30 DEM in ISCE format when run via a conda install on macOS. It seems the conda-forge GDAL build doesn't have a functional ISCE driver, since writing to other formats works fine.

dem_profile['nodata'] = None
dem_profile['driver'] = 'ISCE'

To Reproduce

conda activate hyp3-isce2
python -m hyp3_isce2 ++process insar_tops_burst \
  S1_136231_IW2_20200604T022312_VV_7C85-BURST \
  S1_136231_IW2_20200616T022313_VV_5D11-BURST \
  --looks 20x4 \
  --apply-water-mask False

Additional context

This seems to be a problem only on macOS; conda installs on Linux work fine.

(hyp3-isce2) ➜  conda list                      
gdal                      3.6.3           py311h619941e_1    conda-forge
libgdal                   3.6.3                h8ea55aa_1    conda-forge
rasterio                  1.3.6           py311hc41c901_0    conda-forge
2024-01-12 08:54:12,127 - hyp3_isce2.insar_tops_burst - INFO - DEM ROI: (53.07855985074144, 27.491831927404057, 54.15583739250667, 27.84714385769884)
Reading glo_30 Datasets: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:00<00:00,  2.11it/s]
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/scott/GitHub/hyp3-isce2/src/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/Users/scott/GitHub/hyp3-isce2/src/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 482, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 105, in insar_tops_burst
    dem_path = download_dem_for_isce2(dem_roi, dem_name='glo_30', dem_dir=dem_dir, buffer=0, resample_20m=False)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/hyp3-isce2/src/hyp3_isce2/dem.py", line 130, in download_dem_for_isce2
    with rasterio.open(dem_path, 'w', **dem_profile) as ds:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/scott/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/rasterio/env.py", line 451, in wrapper
    return f(*args, **kwds)
           ^^^^^^^^^^^^^^^^
  File "/Users/scott/miniconda3/envs/hyp3-isce2/lib/python3.11/site-packages/rasterio/__init__.py", line 314, in open
    dataset = writer(
              ^^^^^^^
  File "rasterio/_io.pyx", line 1450, in rasterio._io.DatasetWriterBase.__init__
  File "rasterio/_err.pyx", line 222, in rasterio._err.exc_wrap_pointer
SystemError: Unkown GDAL Error. To debug: https://rasterio.readthedocs.io/en/latest/topics/errors.html#debugging-internal-gdal-functions

Refactor how we use `asf_search`

Refactoring TODOs moved from #70 (which added asf_search as a dependency):

  • leverage the polygon returned by asf_search for determining bbox and dem aoi
  • leverage asf_search.download() when downloading the burst geotiffs
  • leverage asf_search.download_url() when downloading the burst xmls
  • eliminate BurstParams class in favor of using asf_search result directly
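The first TODO reduces to a bbox computation over the polygon the search result returns. A minimal sketch (the `geometry` dict is assumed to look like the GeoJSON polygon an asf_search result exposes, e.g. from `granule_search()`; the helper name is illustrative, not from the codebase):

```python
def polygon_bounds(geometry: dict) -> tuple:
    """Return (west, south, east, north) for a GeoJSON Polygon geometry.

    Sketch for the bbox/DEM-AOI TODO above: assumes the polygon shape that
    asf_search results return, rather than computing the footprint ourselves.
    """
    exterior = geometry['coordinates'][0]  # first (outer) ring of the polygon
    lons = [point[0] for point in exterior]
    lats = [point[1] for point in exterior]
    return (min(lons), min(lats), max(lons), max(lats))
```

The download TODOs would then be covered by calling the search result's own download method for the geotiffs and `asf_search.download_url()` for the XMLs, rather than our bespoke session handling.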

When I use `get_asf_session()`, it's wrong

I found some bugs when using the "Local Python Package Interface"
Code:

def get_asf_session() -> requests.Session:
    session = requests.Session()
    payload = {
        'response_type': 'code',
        'client_id': 'BO_n7nTIlMljdvU6kRRB3g',
        'redirect_uri': 'https://auth.asf.alaska.edu/login',
    }
    response = session.get('https://urs.earthdata.nasa.gov/oauth/authorize', headers=headers, params=payload)
    response.raise_for_status()
    return session
Error:
Traceback (most recent call last):
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 703, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 449, in _make_request
    six.raise_from(e, None)
  File "", line 3, in raise_from
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 444, in _make_request
    httplib_response = conn.getresponse()
                       ^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1378, in getresponse
    response.begin()
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
                              ^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 787, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 703, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 449, in _make_request
    six.raise_from(e, None)
  File "", line 3, in raise_from
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 444, in _make_request
    httplib_response = conn.getresponse()
                       ^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1378, in getresponse
    response.begin()
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
                              ^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/c/Users/23361/Desktop/hyp3-isce2-develop/src/hyp3_isce2/etc/download_example.py", line 21, in
    with get_asf_session() as session:
         ^^^^^^^^^^^^^^^^^
  File "/home/zgh/hyp3-isce2/src/hyp3_isce2/burst.py", line 292, in get_asf_session

  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 725, in send
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 725, in
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 266, in resolve_redirects
    resp = self.send(
           ^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/zgh/anaconda3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
ERROR conda.cli.main_run:execute(47): conda run python /mnt/c/Users/23361/Desktop/hyp3-isce2-develop/src/hyp3_isce2/etc/download_example.py failed. (See above for error)
ERROR conda.cli.main_run:execute(47): conda run python /mnt/c/Users/23361/Desktop/hyp3-isce2-develop/src/hyp3_isce2/etc/download_example.py failed. (See above for error)

S1 TopsApp processing (SLC or burst) cannot process SH (HH) scenes

Jira: https://asfdaac.atlassian.net/browse/TOOL-2451

Note: The above link is accessible only to members of ASF.


ISCE2 sets the default polarization to vv, so it will not be able to process hh scenes unless the polarization parameter is set for the reference and secondary scenes in topsApp.xml (see this discussion), which we do not do.
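A hedged sketch of the missing configuration: topsApp.xml would need the polarization property on both scene components, roughly like the following (the surrounding component structure is abbreviated, and the exact property placement should be checked against the linked discussion):

```xml
<component name="reference">
    <!-- ...existing safe/orbit/output-directory properties... -->
    <property name="polarization">hh</property>
</component>
<component name="secondary">
    <!-- ...existing safe/orbit/output-directory properties... -->
    <property name="polarization">hh</property>
</component>
```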

Traceback:

2023-12-14 22:23:08,019 - hyp3_isce2.burst - INFO - SAFEs created!
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 477, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/insar_tops_burst.py", line 93, in insar_tops_burst
    ref_footprint = get_isce2_burst_bbox(ref_params)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/burst.py", line 241, in get_isce2_burst_bbox
    s1_obj.parse()
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/Sensor/TOPS/Sentinel1.py", line 316, in parse
    self.validateUserInputs()
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/isce/components/isceobj/Sensor/TOPS/Sentinel1.py", line 238, in validateUserInputs
    raise Exception('No annotation xml file found in {0}'.format(dirname))
Exception: No annotation xml file found in /home/conda/S1A_IW_SLC__1SSH_20231103T035107_20231103T035135_051047_0627B4_9900.SAFE

Adding additional output options (COG, STAC)

Background

The outputs from hyp3-isce2 are publicly accessible, which is great for collaboration among research groups. Currently the outputs are in S3, but it's unclear how to access them directly. Instead we're using CloudFront links to zipped collections of all the files.

In the interest of a fully cloud-native workflow, it would be great to have COG images rather than just tiled GeoTIFFs. It would also be helpful to have a STAC Item JSON per job. And if zipping is necessary to save space, use SOZip.

This combination would allow very efficient construction of data cubes reading hyp3 outputs directly from S3, especially when using a system like OpenScienceLab also in AWS us-west-2.

Describe the solution you'd like

Chatted a bit with @forrestfwilliams about this at AGU and it seemed like a possibility to add at least the STAC generation here (though ultimately it would be useful for any hyp3 processor).

We have some (currently messy) code to generate a STAC Item here: https://github.com/relativeorbit/agu2023/blob/main/utils.py. Let me know if you'd be open to a pull request!
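For reference, a minimal hand-rolled STAC Item is just a dict; a sketch with no pystac dependency (the asset key and media type here are illustrative, not necessarily what hyp3-isce2 would emit):

```python
def make_stac_item(item_id: str, bbox: tuple, datetime_str: str, asset_href: str) -> dict:
    """Build a minimal STAC 1.0.0 Item as a plain dict.

    bbox is (west, south, east, north); datetime_str is RFC 3339 UTC.
    """
    west, south, east, north = bbox
    return {
        'type': 'Feature',
        'stac_version': '1.0.0',
        'id': item_id,
        'bbox': list(bbox),
        'geometry': {
            'type': 'Polygon',
            'coordinates': [[[west, south], [east, south], [east, north],
                             [west, north], [west, south]]],
        },
        'properties': {'datetime': datetime_str},
        'assets': {
            'wrapped_phase': {  # illustrative asset key
                'href': asset_href,
                'type': 'image/tiff; application=geotiff; profile=cloud-optimized',
            },
        },
        'links': [],
    }
```

A real implementation would presumably use pystac and add per-product assets, but even this shape is enough for odc-stac-style data cube construction.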

Alternatives

Have another step in the canned HYP3 workflow to generate COG+STAC Item for each job. (replacing - translate_outputs: Convert the outputs of hyp3-isce2 to hyp3-gamma formatted geotiff files)

Additional context

COGs+STAC would allow taking advantage of some nice existing tools:

  1. Dynamic tiling already "works" with current GeoTiffs with 256x256 tiling, but would be better if pyramid overviews were included:
    https://titiler.xyz/cog/map?url=/vsizip//vsicurl/https://d3gm2hf49xd6jj.cloudfront.net/cfd3566f-dafe-4859-ab05-39a83f86c98d/S1_245655_IW3_20220727_20230722_VV_INT80_67C8.zip/S1_245655_IW3_20220727_20230722_VV_INT80_67C8/S1_245655_IW3_20220727_20230722_VV_INT80_67C8_wrapped_phase.tif&rescale=-3.14,3.14&colormap_name=hsv

  2. For example, a simple static catalog allows efficient browsing of the outputs (this "works" already, which is pretty neat!, but doesn't do efficient COG tiling):
    https://radiantearth.github.io/stac-browser/#/external/raw.githubusercontent.com/relativeorbit/agu2023/main/catalog.json

  3. Having STAC Items allows easy construction of data cubes for postprocessing with Xarray via libraries like odc-stac (for example, https://github.com/relativeorbit/agu2023/blob/main/A64-postprocess.ipynb)

Standardize xml library

In various parts of the codebase, we use both lxml and ElementTree (ET). We should decide which library we prefer, then use only one.
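For scale, the simple parse/find patterns in question are covered by the stdlib ElementTree. This is only an illustration of the dependency-free option, not a decision; the element and attribute names are hypothetical:

```python
import xml.etree.ElementTree as ET


def get_swath_number(annotation_xml: str) -> str:
    """Pull an attribute out of an annotation-style XML string using only
    the stdlib ElementTree API (hypothetical element names)."""
    root = ET.fromstring(annotation_xml)
    swath = root.find('.//swath')  # XPath-lite: first <swath> anywhere below root
    return swath.get('number')
```

For example, `get_swath_number('<product><swath number="2">IW2</swath></product>')` returns `'2'`. lxml would only be needed where we rely on full XPath or schema validation.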

Retry downloading of aux cal files

When I run the command:

python src/hyp3_isce2/process.py --reference-scene S1A_IW_SLC__1SSV_20150621T120220_20150621T120232_006471_008934_72D8 --secondary-scene S1A_IW_SLC__1SSV_20150504T120217_20150504T120229_005771_00769E_EF9A --swath-number 1 --reference-burst-number 1 --secondary-burst-number 1

Sometimes I get the following error:

Traceback (most recent call last):
  File "/home/jth/code/hyp3-isce2/foo/../src/hyp3_isce2/process.py", line 108, in <module>
    main()
  File "/home/jth/code/hyp3-isce2/foo/../src/hyp3_isce2/process.py", line 104, in main
    topsapp_burst(**args.__dict__)
  File "/home/jth/code/hyp3-isce2/foo/../src/hyp3_isce2/process.py", line 63, in topsapp_burst
    download_aux_cal(aux_cal_dir)
  File "/home/jth/code/hyp3-isce2/src/hyp3_isce2/s1_auxcal.py", line 54, in download_aux_cal
    _download_platform(url, aux_cal_dir)
  File "/home/jth/code/hyp3-isce2/src/hyp3_isce2/s1_auxcal.py", line 35, in _download_platform
    response = requests.get(url)
               ^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 723, in send
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 723, in <listcomp>
    history = [resp for resp in gen]
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 266, in resolve_redirects
    resp = self.send(
           ^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jth/jth-apps/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/requests/adapters.py", line 565, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='obs.eu-nl.otc.t-systems.com', port=443): Max retries exceeded with url: /s1ece-archive/s1website/S1A/AUX_CAL/2019/02/28/S1A_AUX_CAL_V20190228T092500_G20210104T141310.SAFE.zip?AWSAccessKeyId=redacted&Signature=redacted&Expires=1681150701 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fd81179a4d0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

So it looks like we may want to retry the GET request in _download_platform a few more times.
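One way to do that, sketched with requests' standard Retry/HTTPAdapter machinery (an assumption about the shape of the fix, not necessarily what the project will adopt):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


def make_retrying_session(total: int = 3, backoff_factor: float = 10.0) -> requests.Session:
    """Session that retries connection errors and 5xx responses with exponential backoff."""
    retry = Retry(
        total=total,
        backoff_factor=backoff_factor,
        status_forcelist=[429, 500, 502, 503, 504],
        allowed_methods=['GET'],
    )
    session = requests.Session()
    adapter = HTTPAdapter(max_retries=retry)
    session.mount('https://', adapter)
    session.mount('http://', adapter)
    return session
```

`_download_platform` would then call `session.get(url)` on a session built this way instead of the bare `requests.get(url)`.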

`insar_tops_burst` support for products over the antimeridian

Jira: https://asfdaac.atlassian.net/browse/TOOL-2059

insar_tops_burst does not currently support InSAR for burst products over the antimeridian (longitude = 180/-180). Attempting to process such a product results in a Shrink your bounding area error:

$ insar_tops_burst S1_141655_IW3_20230625T063253_VV_563F-BURST S1_141655_IW3_20230613T063252_VV_FD06-BURST
2023-07-06 09:38:35,355 - hyp3_isce2.insar_tops_burst - INFO - Begin ISCE2 TopsApp run
2023-07-06 09:38:40,495 - hyp3_isce2.burst - INFO - Download attempt #1 for https://sentinel1-burst.asf.alaska.edu/S1A_IW_SLC__1SDV_20230613T063234_20230613T063309_048963_05E358_FD06/IW3/VV/6.xml
2023-07-06 09:38:40,496 - hyp3_isce2.burst - INFO - Download attempt #1 for https://sentinel1-burst.asf.alaska.edu/S1A_IW_SLC__1SDV_20230625T063234_20230625T063310_049138_05E8AB_563F/IW3/VV/6.xml
2023-07-06 09:38:40,498 - hyp3_isce2.burst - INFO - Download attempt #1 for https://sentinel1-burst.asf.alaska.edu/S1A_IW_SLC__1SDV_20230613T063234_20230613T063309_048963_05E358_FD06/IW3/VV/6.tiff
2023-07-06 09:38:40,500 - hyp3_isce2.burst - INFO - Download attempt #1 for https://sentinel1-burst.asf.alaska.edu/S1A_IW_SLC__1SDV_20230625T063234_20230625T063310_049138_05E8AB_563F/IW3/VV/6.tiff
2023-07-06 09:39:00,936 - hyp3_isce2.burst - INFO - SAFEs created!
Input XML files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230613T063234_20230613T063309_048963_05E358_FD06.SAFE/annotation/s1a-iw3-slc-vv-20230613t063235-20230613t063309-048963-05e358-006.xml']
Input TIFF files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230613T063234_20230613T063309_048963_05E358_FD06.SAFE/measurement/s1a-iw3-slc-vv-20230613t063235-20230613t063309-048963-05e358-006.tiff']
Manifest files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230613T063234_20230613T063309_048963_05E358_FD06.SAFE/manifest.safe']
Setting IPF version to :  003.61
Extracting orbit from annotation XML file
Input XML files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230625T063234_20230625T063310_049138_05E8AB_563F.SAFE/annotation/s1a-iw3-slc-vv-20230625t063236-20230625t063310-049138-05e8ab-006.xml']
Input TIFF files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230625T063234_20230625T063310_049138_05E8AB_563F.SAFE/measurement/s1a-iw3-slc-vv-20230625t063236-20230625t063310-049138-05e8ab-006.tiff']
Manifest files:  ['/home/asjohnston/tmp/burst/S1A_IW_SLC__1SDV_20230625T063234_20230625T063310_049138_05E8AB_563F.SAFE/manifest.safe']
Setting IPF version to :  003.61
Extracting orbit from annotation XML file
2023-07-06 09:39:01,448 - hyp3_isce2.insar_tops_burst - INFO - InSAR ROI: (-179.5828069472891, -16.95605848174402, -179.5728069472891, -16.94605848174402)
2023-07-06 09:39:01,448 - hyp3_isce2.insar_tops_burst - INFO - DEM ROI: (-179.5778069472891, -16.95105848174402, 179.83383173833087, -16.57232697288399)
Traceback (most recent call last):
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/bin/insar_tops_burst", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 387, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 85, in insar_tops_burst
    dem_path = download_dem_for_isce2(dem_roi, dem_name='glo_30', dem_dir=dem_dir, buffer=0)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/src/hyp3-isce2/src/hyp3_isce2/dem.py", line 82, in download_dem_for_isce2
    dem_array, dem_profile = dem_stitcher.stitch_dem(
                             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/dem_stitcher/stitcher.py", line 322, in stitch_dem
    glo_90_missing_intersection = intersects_missing_glo_30_tiles(bounds)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/dem_stitcher/datasets.py", line 110, in intersects_missing_glo_30_tiles
    df_missing = get_overlapping_dem_tiles(extent, 'glo_90_missing')
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/dem_stitcher/datasets.py", line 80, in get_overlapping_dem_tiles
    crossing = get_dateline_crossing(bounds)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asjohnston/mambaforge/envs/hyp3-isce2/lib/python3.11/site-packages/dem_stitcher/dateline.py", line 67, in get_dateline_crossing
    raise DoubleDatelineCrossing('Shrink your bounding area')
dem_stitcher.exceptions.DoubleDatelineCrossing: Shrink your bounding area

hyp3-isce2 cannot work with antimeridian pairs correctly

hyp3-isce2 uses the dem_stitcher package to produce the DEM file. The bounding box representing the extent of an antimeridian pair needs to be handled correctly, otherwise the DEM file cannot be produced. The current hyp3_isce2 has an issue producing the DEM file correctly.

For example, run hyp3-isce2 with the antimeridian pair:
S1_000693_IW1_20230620T183239_VV_F644-BURST
S1_000693_IW1_20230702T183239_VV_088F-BURST

the error:
dem_stitcher.exceptions.DoubleDatelineCrossing: Shrink your bounding area
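The underlying fix amounts to recognizing a crossing box and splitting it in two before requesting tiles. A minimal sketch, assuming longitudes in [-180, 180] and a (west, south, east, north) tuple (which is not necessarily the representation hyp3-isce2 or dem_stitcher uses internally):

```python
def split_antimeridian_bbox(bounds: tuple) -> list:
    """Split a (west, south, east, north) box that crosses the antimeridian
    (signalled by west > east) into two boxes within [-180, 180]."""
    west, south, east, north = bounds
    if west <= east:
        return [bounds]  # no crossing; use as-is
    return [(west, south, 180.0, north), (-180.0, south, east, north)]
```

Each sub-box could then be stitched separately and the two DEMs merged, avoiding the DoubleDatelineCrossing error.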

GitHub Actions fails to push `hyp3-isce2` image to GHCR with `403 Forbidden` (again)

Jira: https://asfdaac.atlassian.net/browse/TOOL-2191

Note: The above link is accessible only to members of ASF.


The dockerize workflow is failing for #134 with the following error:

Error: buildx failed with: ERROR: failed to solve: failed to push ghcr.io/asfhyp3/hyp3-isce2:0.7.2.dev2_gb21307f: unexpected status from POST request to https://ghcr.io/v2/asfhyp3/hyp3-isce2/blobs/uploads/: 403 Forbidden

This is very similar to #120, where the error was as follows:

Error: buildx failed with: ERROR: failed to solve: failed to push ghcr.io/asfhyp3/hyp3-isce2:0.6.2.dev5_g3db5604: unexpected status: 403 Forbidden

Since the previous issue seemed to resolve itself, I figured this one would, too. But the workflow has now been failing for about a week across several re-runs, so we should probably try to debug it. It might be worth revisiting the steps I took when attempting to debug the previous issue.
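One common cause of a 403 on ghcr.io pushes is the workflow's GITHUB_TOKEN lacking package write access; it may be worth confirming the workflow (or job) grants it, e.g.:

```yaml
permissions:
  contents: read
  packages: write  # required for the build-push step to push to ghcr.io
```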

Finish following the `hyp3-cookiecutter` docs

I left off at https://github.com/ASFHyP3/hyp3-cookiecutter/tree/update_staging#7-restart-the-github-actions after merging #1 and #2. The second PR was to fix what I thought was a typo in hyp3-cookiecutter here, but now I don't know if it actually was a typo.

The dockerize job is currently failing with:

ERROR: failed to solve: failed to push ghcr.io/asfhyp3/hyp3-isce2:0.0.1.dev4_g999cdf6: unexpected status: 403 Forbidden

TODO:

  • Solve the perms issue.
  • Revert #2
  • Finish following the hyp3-cookiecutter docs.

botocore.exceptions.ConnectTimeoutError and lots of DEBUG outputs

The bug

Running hyp3-isce2 (0.9.2) locally, I'm seeing a lot of initial logging messages related to botocore looking for cloud credentials. There is a traceback related to looking for instance metadata (which would only work on AWS), but in any case the code ultimately succeeds.
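If the noise alone is the problem, one possible mitigation (an assumption about a fix, not what hyp3-isce2 currently does) is to raise the level of the chatty third-party loggers before configuring root logging:

```python
import logging

# Keep our own DEBUG logging while silencing third-party credential probing.
for name in ('botocore', 'boto3', 'urllib3'):
    logging.getLogger(name).setLevel(logging.WARNING)
```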

To Reproduce

docker run -it --rm \
    -e EARTHDATA_USERNAME=$EARTHDATA_USERNAME \
    -e EARTHDATA_PASSWORD=$EARTHDATA_PASSWORD \
    -e ESA_USERNAME=$ESA_USERNAME \
    -e ESA_PASSWORD=$ESA_PASSWORD \
    ghcr.io/asfhyp3/hyp3-isce2:0.9.2 \
        ++process insar_tops_burst \
        S1_136231_IW2_20200604T022312_VV_7C85-BURST \
        S1_136231_IW2_20200616T022313_VV_5D11-BURST \
        --looks 20x4 \
        --apply-water-mask False

Additional context

2024-01-11 03:07:22,367 - botocore.credentials - DEBUG - Looking for credentials via: iam-role
2024-01-11 03:07:22,368 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): 169.254.169.254:80
2024-01-11 03:07:23,370 - botocore.utils - DEBUG - Caught retryable HTTP exception while making metadata service request to http://169.254.169.254/latest/api/token: Connect timeout on endpoint URL: "http://169.254.169.254/latest/api/token"
Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 174, in _new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 95, in create_connection
    raise err
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
    sock.connect(sa)
TimeoutError: timed out
full traceback
/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/hyp3_isce2/__main__.py:40: DeprecationWarning: SelectableGroups dict interface is deprecated. Use select.
  eps = entry_points()['hyp3']
2024-01-11 03:07:22,348 - botocore.hooks - DEBUG - Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
2024-01-11 03:07:22,351 - botocore.hooks - DEBUG - Changing event name from before-call.apigateway to before-call.api-gateway
2024-01-11 03:07:22,352 - botocore.hooks - DEBUG - Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
2024-01-11 03:07:22,355 - botocore.hooks - DEBUG - Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
2024-01-11 03:07:22,356 - botocore.hooks - DEBUG - Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
2024-01-11 03:07:22,357 - botocore.hooks - DEBUG - Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
2024-01-11 03:07:22,357 - botocore.hooks - DEBUG - Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
2024-01-11 03:07:22,359 - botocore.hooks - DEBUG - Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
2024-01-11 03:07:22,360 - botocore.hooks - DEBUG - Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
2024-01-11 03:07:22,360 - botocore.hooks - DEBUG - Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
2024-01-11 03:07:22,360 - botocore.hooks - DEBUG - Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
2024-01-11 03:07:22,363 - botocore.utils - DEBUG - IMDS ENDPOINT: http://169.254.169.254/
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: env
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: assume-role
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: assume-role-with-web-identity
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: sso
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: shared-credentials-file
2024-01-11 03:07:22,365 - botocore.credentials - DEBUG - Looking for credentials via: custom-process
2024-01-11 03:07:22,366 - botocore.credentials - DEBUG - Looking for credentials via: config-file
2024-01-11 03:07:22,366 - botocore.credentials - DEBUG - Looking for credentials via: ec2-credentials-file
2024-01-11 03:07:22,366 - botocore.credentials - DEBUG - Looking for credentials via: boto-config
2024-01-11 03:07:22,366 - botocore.credentials - DEBUG - Looking for credentials via: container-role
2024-01-11 03:07:22,367 - botocore.credentials - DEBUG - Looking for credentials via: iam-role
2024-01-11 03:07:22,368 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): 169.254.169.254:80
2024-01-11 03:07:23,370 - botocore.utils - DEBUG - Caught retryable HTTP exception while making metadata service request to http://169.254.169.254/latest/api/token: Connect timeout on endpoint URL: "http://169.254.169.254/latest/api/token"
Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 174, in _new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 95, in create_connection
    raise err
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
    sock.connect(sa)
TimeoutError: timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/httpsession.py", line 464, in send
    urllib_response = conn.urlopen(
                      ^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/retry.py", line 525, in increment
    raise six.reraise(type(error), error, _stacktrace)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/packages/six.py", line 770, in reraise
    raise value
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 416, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 96, in request
    rval = super().request(method, url, body, headers, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 244, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1286, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1332, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1281, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 123, in _send_output
    self.send(msg)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 223, in send
    return super().send(str)
           ^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 979, in send
    self.connect()
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 205, in connect
    conn = self._new_conn()
           ^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 179, in _new_conn
    raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<botocore.awsrequest.AWSHTTPConnection object at 0x7f88cc7e5c10>, 'Connection to 169.254.169.254 timed out. (connect timeout=1)')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/utils.py", line 460, in _fetch_metadata_token
    response = self._session.send(request.prepare())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/httpsession.py", line 499, in send
    raise ConnectTimeoutError(endpoint_url=request.url, error=e)
botocore.exceptions.ConnectTimeoutError: Connect timeout on endpoint URL: "http://169.254.169.254/latest/api/token"
2024-01-11 03:07:23,375 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (2): 169.254.169.254:80
2024-01-11 03:07:24,377 - botocore.utils - DEBUG - Caught retryable HTTP exception while making metadata service request to http://169.254.169.254/latest/meta-data/iam/security-credentials/: Connect timeout on endpoint URL: "http://169.254.169.254/latest/meta-data/iam/security-credentials/"
Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 174, in _new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 95, in create_connection
    raise err
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
    sock.connect(sa)
TimeoutError: timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/httpsession.py", line 464, in send
    urllib_response = conn.urlopen(
                      ^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/util/retry.py", line 525, in increment
    raise six.reraise(type(error), error, _stacktrace)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/packages/six.py", line 770, in reraise
    raise value
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connectionpool.py", line 416, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 96, in request
    rval = super().request(method, url, body, headers, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 244, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1286, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1332, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 1281, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 123, in _send_output
    self.send(msg)
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/awsrequest.py", line 223, in send
    return super().send(str)
           ^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/http/client.py", line 979, in send
    self.connect()
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 205, in connect
    conn = self._new_conn()
           ^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/urllib3/connection.py", line 179, in _new_conn
    raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<botocore.awsrequest.AWSHTTPConnection object at 0x7f88cc5cc890>, 'Connection to 169.254.169.254 timed out. (connect timeout=1)')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/utils.py", line 515, in _get_request
    response = self._session.send(request.prepare())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/httpsession.py", line 499, in send
    raise ConnectTimeoutError(endpoint_url=request.url, error=e)
botocore.exceptions.ConnectTimeoutError: Connect timeout on endpoint URL: "http://169.254.169.254/latest/meta-data/iam/security-credentials/"
2024-01-11 03:07:24,380 - botocore.utils - DEBUG - Max number of attempts exceeded (1) when attempting to retrieve data from metadata service.
2024-01-11 03:07:24,381 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/endpoints.json
2024-01-11 03:07:24,396 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/sdk-default-configuration.json
2024-01-11 03:07:24,397 - botocore.hooks - DEBUG - Event choose-service-name: calling handler <function handle_service_name_alias at 0x7f88cc718900>
2024-01-11 03:07:24,411 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/service-2.json
2024-01-11 03:07:24,427 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/endpoint-rule-set-1.json
2024-01-11 03:07:24,431 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/partitions.json
2024-01-11 03:07:24,433 - botocore.hooks - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_post at 0x7f88cc878e00>
2024-01-11 03:07:24,433 - botocore.hooks - DEBUG - Event creating-client-class.s3: calling handler <function lazy_call.<locals>._handler at 0x7f88ccc20220>
2024-01-11 03:07:24,447 - botocore.hooks - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_url at 0x7f88cc878b80>
2024-01-11 03:07:24,447 - botocore.configprovider - DEBUG - Looking for endpoint for s3 via: environment_service
2024-01-11 03:07:24,447 - botocore.configprovider - DEBUG - Looking for endpoint for s3 via: environment_global
2024-01-11 03:07:24,447 - botocore.configprovider - DEBUG - Looking for endpoint for s3 via: config_service
2024-01-11 03:07:24,447 - botocore.configprovider - DEBUG - Looking for endpoint for s3 via: config_global
2024-01-11 03:07:24,447 - botocore.configprovider - DEBUG - No configured endpoint found.
2024-01-11 03:07:24,460 - botocore.endpoint - DEBUG - Setting s3 timeout as (60, 60)
2024-01-11 03:07:24,462 - botocore.loaders - DEBUG - Loading JSON file: /opt/conda/envs/hyp3-isce2/lib/python3.11/site-packages/botocore/data/_retry.json
2024-01-11 03:07:24,462 - botocore.client - DEBUG - Registering retry handlers for service: s3
2024-01-11 03:07:24,463 - botocore.utils - DEBUG - Registering S3 region redirector handler
2024-01-11 03:07:24,463 - botocore.utils - DEBUG - Registering S3Express Identity Resolver
2024-01-11 03:07:25,524 - hyp3_isce2.insar_tops_burst - INFO - Begin ISCE2 TopsApp run
2024-01-11 03:07:30,789 - hyp3_isce2.burst - INFO - Download attempt #1 for https://sentinel1-burst.asf.alaska.edu/S1A_IW_SLC__1SDV_20200604T022251_20200604T022318_032861_03CE65_7C85/IW2/VV/7.xml

HTTPError: 502 Server Error: Bad Gateway for url

The bug

Running this workflow, I periodically get a 502 Server Error. It appears to be an intermittent server-side error, since waiting a few seconds and re-running tends to work (v0.9.2).

To Reproduce

python -m hyp3_isce2 ++process insar_tops_burst \
  S1_136231_IW2_20200604T022312_VV_7C85-BURST \
  S1_136231_IW2_20200616T022313_VV_5D11-BURST \
  --looks 20x4 \
  --apply-water-mask False

Additional context

/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/__main__.py:40: DeprecationWarning: SelectableGroups dict interface is deprecated. Use select.
  eps = entry_points()['hyp3']
2024-01-16 14:51:38,405 - hyp3_isce2.insar_tops_burst - INFO - Begin ISCE2 TopsApp run
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/__main__.py", line 51, in <module>
    main()
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/__main__.py", line 47, in main
    sys.exit(process_entry_point.load()())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 482, in main
    isce_output_dir = insar_tops_burst(
                      ^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/insar_tops_burst.py", line 90, in insar_tops_burst
    ref_metadata, sec_metadata = download_bursts([ref_params, sec_params])
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/burst.py", line 311, in download_bursts
    with get_asf_session() as asf_session:
         ^^^^^^^^^^^^^^^^^
  File "/Users/scott/GitHub/relativeorbit/hyp3-isce2/src/hyp3_isce2/burst.py", line 290, in get_asf_session
    response.raise_for_status()
  File "/Users/scott/miniforge3/envs/hyp3-isce2/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: https://auth.asf.alaska.edu/login?code=xxOZZcOSRk_giCfl0AwwYCehOu5MlGZFtgDjKuhIdYBM7gXCnB2CSddBGA 
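Since the 502s are transient, one possible client-side mitigation (a sketch, not part of the current hyp3-isce2 code; the function name and parameters below are illustrative) is to mount an `HTTPAdapter` with a urllib3 `Retry` policy on the session, so transient 5xx responses are retried with exponential backoff:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


def make_retrying_session(total: int = 5, backoff_factor: float = 2.0) -> requests.Session:
    """Session that retries transient 5xx responses with exponential backoff."""
    retry = Retry(
        total=total,
        backoff_factor=backoff_factor,
        status_forcelist=[500, 502, 503, 504],
        allowed_methods=['GET', 'HEAD'],
    )
    session = requests.Session()
    session.mount('https://', HTTPAdapter(max_retries=retry))
    return session
```

A session built this way could replace the plain session used in `get_asf_session`, turning the intermittent 502 into a few seconds of backoff instead of a failed job.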

Clean up logging

From @asjohnston-asf:

not getting log output, only isce output, I think because https://github.com/ASFHyP3/hyp3-isce2/blob/develop/src/hyp3_isce2/insar_tops_burst.py#L30 sets log to the module logger, and https://github.com/ASFHyP3/hyp3-isce2/blob/develop/src/hyp3_isce2/insar_tops_burst.py#L193 only configures the root logger

we have a mix of log = logging.getLogger() and log = logging.getLogger(__name__) in hyp3-gamma, I remember it being a mess to clean up there when I refactored RTC and I don't recall if we ever cleaned up InSAR

Possible loss of resolution for `insar_tops_burst` geotiffs at 5x1 looks

Jira: https://asfdaac.atlassian.net/browse/TOOL-2070

The expectation is that Sentinel-1 InSAR products at 5x1 looks should have around 20m resolution, hence setting the pixel size for these geotiffs to 20m as of hyp3-isce2 v0.6.0.

However, ISCE2's geocode step creates geotiffs using the pixel size of the input DEM, which is often around ~27m for the Copernicus 30m DEM. We then resample these geotiffs to a 20m pixel size, which is counterproductive as far as file size and image resolution are concerned.

ISCE2 allows separate DEMs to be used when generating the interferogram (demFilename in topsapp.xml) and when performing the final geocoding step (geocodeDemFilename in topsapp.xml). We may see improved resolution in our final 5x1 look geotiffs were we to resample the DEM to a finer pixel size prior to geocoding.
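One way to sketch the resampling step: since the Copernicus DEM is in geographic coordinates, the target ~20m pixel size must first be converted from meters to degrees at the scene latitude. The helper below (illustrative, not part of hyp3-isce2) computes an approximate value that could then be passed to a resampling tool such as `gdalwarp -tr` before geocoding.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation


def meters_to_degrees(pixel_size_m: float, latitude_deg: float) -> tuple[float, float]:
    """Approximate (lon, lat) pixel size in degrees for a ground pixel size in meters."""
    # One degree of latitude is ~111 km everywhere on a sphere.
    lat_deg = pixel_size_m / (math.pi * EARTH_RADIUS_M / 180.0)
    # Degrees of longitude shrink with the cosine of latitude.
    lon_deg = lat_deg / math.cos(math.radians(latitude_deg))
    return lon_deg, lat_deg


# e.g. a ~20 m pixel near 27.5 N is roughly 1.8e-4 degrees of latitude on a side
lon_res, lat_res = meters_to_degrees(20.0, 27.5)
```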

Use of BurstIDs from ESA S1 Burst Database

Background

When generating many burst interferograms, it is not always clear which burst number to use, because the framing can differ between granules.

IPF >= 3.4 includes standard burst IDs in the metadata of every granule, which would be convenient to use:
https://sentinels.copernicus.eu/web/sentinel/-/publication-of-brust-id-maps-for-copernicus-sentinel-1

For the Iran earthquake example in the readme, here is a quick plot of all the ASF S1 frames and the processed burst 64_IW2_136231:
[Screenshot: ASF S1 frames overlaid with burst 64_IW2_136231]

Describe the solution you'd like

As an alternative to setting --reference-burst-number and --secondary-burst-number, you could pass --burst-id IW2_136231 and have the software figure out which burst within the SLC to extract.
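A sketch of what the CLI side might look like, assuming a hypothetical `<subswath>_<burst_number>` burst ID format as in the example above (the function and format are illustrative, not an existing hyp3-isce2 interface):

```python
def parse_burst_id(burst_id: str) -> tuple[str, int]:
    """Parse a hypothetical 'IW2_136231'-style burst ID into (subswath, burst number)."""
    subswath, number = burst_id.split('_')
    if subswath not in ('IW1', 'IW2', 'IW3'):
        raise ValueError(f'Unrecognized subswath: {subswath}')
    return subswath, int(number)
```

The parsed subswath and burst number could then drive either the IPF >= 3.4 metadata lookup or the database query described below.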

Additional context

For IPF >= 3.4 the burst IDs should be in the metadata, but for IPF < 3.4 this would require an additional lookup, perhaps based on the burst ID database polygons (available via the link above). A short example of reading the burst ID info:

import geopandas as gpd

# Read only the bursts intersecting the area of interest
gf = gpd.read_file(
    'S1_burstid_20220530/IW/sqlite/burst_map_IW_000001_375887.sqlite3',
    bbox=(54.1, 27.4, 54.2, 27.5),
)
myburst = gf[
    (gf.relative_orbit_number == 64)
    & (gf.subswath_name == 'IW2')
    & (gf.burst_id == 136231)
]

which returns:

burst_id                                                          136231
subswath_name                                                        IW2
relative_orbit_number                                                 64
time_from_anx_sec                                             2514.65709
orbit_pass                                                    DESCENDING
geometry               MULTIPOLYGON Z (((54.146844 27.677253 0, 53.68...

The tmp directory must be on the same device partition as the hyp3-isce2 code in order for the tests to pass

pytest tests/test_merge_tops_bursts.py fails with the following error:

OSError: [Errno 18] Invalid cross-device link: '/tmp/pytest-of-jiangzhu/pytest-3/test_snaphu_unwrap0/merged/filt_topophase.unw' -> 'tmp.unw'

The problem is that the renameFile function called in maskUnwrap (from isceobj.TopsProc.runIon import maskUnwrap) cannot rename files across devices.

Solution:
Before running the tests, create a tmp directory on the same device partition as the hyp3-isce2 code and point the TMPDIR environment variable at it.

For example:

mkdir /home/user/tmp
export TMPDIR=/home/user/tmp
pytest tests/test_merge_tops_bursts.py

The tests then pass successfully.
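An alternative fix on the code side (a sketch, not the current implementation) would be to replace the bare rename with shutil.move, which falls back to copy-and-delete when the rename crosses filesystems:

```python
import os
import shutil


def safe_rename(src: str, dst: str) -> None:
    """Rename src to dst, surviving cross-device (EXDEV) moves."""
    try:
        os.rename(src, dst)
    except OSError:
        # os.rename raises EXDEV when src and dst are on different
        # filesystems; shutil.move copies the file and removes the
        # original instead.
        shutil.move(src, dst)
```

With this approach the tests would not depend on where TMPDIR lives.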

Remove rasterio dependency

While not necessary for the initial work, we should consider removing the rasterio dependency from the DEM download functionality. This issue is a reminder to do so.
