
Conversation


Copilot AI commented Dec 2, 2025

Empty PR to verify test suite functionality and identify environmental constraints.

Test Results

16 total tests:

  • ✅ 5 passing: tests without external data dependencies
  • ❌ 11 failing: DNS resolution errors for drive.google.com

Analysis

All failures stem from download_gdrive_file() attempting to fetch test data from Google Drive. The utility fails with:

requests.exceptions.ConnectionError: Failed to resolve 'drive.google.com' 
([Errno -5] No address associated with hostname)

Passing tests (no external dependencies):

  • test_load_schema - ASDF schema validation
  • test_combine_frames - Frame combination logic
  • test_create_dark - Dark frame generation
  • test_normalize_step - Normalization pipeline
  • test_parse_subarray_map - Subarray configuration parsing

Failing tests (require Google Drive access):

  • WCS assignment, background calculation, flat/dark calibration steps
  • Data model loading tests
  • Pipeline integration tests (stage1, stage2)

Test infrastructure functions correctly. Failures are environmental, not code defects. GitHub Actions CI with internet access should pass if Google Drive links remain valid.
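
One possible mitigation (a sketch only, not part of this PR; the requires_gdrive marker name is invented here) is to have pytest skip the network-dependent tests when drive.google.com is unreachable, so offline runs report skips instead of connection errors:

# conftest.py -- sketch: skip Google-Drive-dependent tests when the host is unreachable.
import socket

import pytest


def _gdrive_reachable(host="drive.google.com", port=443, timeout=3):
    """Return True if the Google Drive host resolves and accepts a connection."""
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False


def pytest_configure(config):
    # Register the (hypothetical) marker so pytest does not warn about it.
    config.addinivalue_line("markers", "requires_gdrive: test downloads data from Google Drive")


def pytest_collection_modifyitems(config, items):
    if _gdrive_reachable():
        return
    skip = pytest.mark.skip(reason="drive.google.com unreachable in this environment")
    for item in items:
        if "requires_gdrive" in item.keywords:
            item.add_marker(skip)

Tests that call download_gdrive_file() would then opt in with @pytest.mark.requires_gdrive.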

Warning

Firewall rules blocked me from connecting to one or more addresses.

I tried to connect to the following addresses, but was blocked by firewall rules:

  • drive.google.com
    • Triggering command: /home/REDACTED/.local/bin/pytest pytest -s (dns block)
    • Triggering command: /home/REDACTED/.local/bin/pytest pytest -v (dns block)
    • Triggering command: /home/REDACTED/.local/bin/pytest pytest --tb=no -q (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

make an empty pull request just to check that the tests still work. then run the tests and make sure tests still work in your environment. then debug failures



Copilot AI self-assigned this Dec 2, 2025
Copilot AI changed the title from "[WIP] Create empty pull request to verify test functionality" to "Verify test infrastructure and document network-dependent test failures" Dec 2, 2025
Copilot AI requested a review from zonca December 2, 2025 23:32
zonca (Member) commented Dec 3, 2025

@astrobc1 it seems there are issues downloading from Google Drive in GitHub Actions. If I try the link in the browser, it works.

1 similar comment from zonca (Member) on Dec 16, 2025.

astrobc1 (Contributor) commented:

@zonca Thanks for the reminder. I will take a look at this later this week or in early 2026. I haven't changed the simulation files since the summer, so I'm actually not sure what's going on.

Locally, I have completed (or nearly completed):

  • Distortion and differential atmospheric refraction steps with drizzle
  • Fully implemented readout processing steps and pipelines in Python
  • A consistent interface with the latest version of stpipe (i.e., overriding far fewer Step, Pipeline, and DataModel methods); a rough sketch of this pattern follows at the end of this comment.

I will ensure the testing files are working as expected once I merge the above into main.
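
For context on the stpipe point above, the pattern looks roughly like the following. This is a sketch only, not code from liger_iris_pipeline; the step name, spec options, and data model attributes are placeholders.

# Sketch of the stpipe pattern mentioned above: a calibration step that only
# overrides process(), leaving the Step/Pipeline plumbing to stpipe itself.
# NormalizeStep and the data model attributes are illustrative placeholders.
import numpy as np
from stpipe import Step


class NormalizeStep(Step):
    """Divide an image by a summary statistic so it is unit-scaled downstream."""

    spec = """
        method = option('median', 'mean', default='median')  # statistic used for normalization
    """

    def process(self, input_model):
        data = np.asarray(input_model.data, dtype=float)
        scale = np.median(data) if self.method == "median" else np.mean(data)
        result = input_model.copy()
        result.data = data / scale
        return result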

astrobc1 (Contributor) commented:

@zonca I still don't know the root cause, but this behavior also started occurring on my other project (HISPEC_DRP), which uses the same pattern for fetching testing data. I ended up removing the web-scraping step that beautifulsoup4 was doing and hardcoding the Google Drive file IDs, which solved it. Adding a file to the Google Drive testing-data folder now also requires adding the corresponding file ID to the Python package, so users can still specify paths/filenames rather than Google Drive IDs.

I think I will take the same approach for liger_iris_pipeline: hard-code the file IDs.
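
For reference, the hardcoded-ID approach could look roughly like this (a sketch only, not code from HISPEC_DRP or liger_iris_pipeline; the IDs below are placeholders):

# Sketch: map repository-relative test file names to hard-coded Google Drive file IDs,
# so callers keep specifying paths/filenames and no web scraping is needed.
import requests

GDRIVE_FILE_IDS = {
    "Liger/Cals/Liger_IMG_DARK_20240924000000_0.0.1.fits": "<drive-file-id>",
    "Liger/Cals/Liger_IMG_FLAT_20240924000000_0.0.1.fits": "<drive-file-id>",
}


def download_test_file(name, out_path):
    """Download a test file by its repository-relative name rather than its Drive ID."""
    file_id = GDRIVE_FILE_IDS[name]
    params = {"export": "download", "id": file_id}
    with requests.get("https://drive.google.com/uc", params=params, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    return out_path

Large files may additionally require handling Google's virus-scan confirmation page; a package such as gdown handles that automatically given only the file ID.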

In early 2026, many of the calibration files the DRS uses will be hosted at Keck. However, the DRS will still need some standalone files for testing.

Lastly, for any test that doesn't depend on the nature of the data itself, we now generate the data products on the fly. This has made updating tests easier. Eventual end-to-end tests will use other metrics, such as the calculated SNR, for validation.
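
For example, a fixture along these lines (a sketch; the array shape and header keyword are illustrative, not the actual liger_iris_pipeline conventions):

# Sketch: build a synthetic dark frame in a pytest fixture instead of downloading one,
# for tests that exercise pipeline mechanics rather than the content of the data.
import numpy as np
import pytest
from astropy.io import fits


@pytest.fixture
def synthetic_dark(tmp_path):
    rng = np.random.default_rng(42)  # fixed seed for reproducibility
    data = rng.normal(loc=5.0, scale=1.0, size=(2048, 2048)).astype("float32")
    hdu = fits.PrimaryHDU(data=data)
    hdu.header["OBSTYPE"] = "DARK"  # illustrative header keyword
    path = tmp_path / "synthetic_dark.fits"
    hdu.writeto(path)
    return path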

zonca (Member) commented Dec 18, 2025

@astrobc1 OK, would you like me to implement that for liger_iris_pipeline?

astrobc1 (Contributor) commented:

@zonca I think that is well isolated from what I'm working on. Here are the relevant scripts for HISPEC: https://github.com/oirlab/HISPEC_DRP/tree/main/hispecdrp/utils (see testing_data.py and gdrive.py). Feel free to make any other changes/improvements in this context as you see fit for liger_iris_pipeline.

Link to the Liger IRIS testing data:
https://drive.google.com/drive/folders/1y7dPZp-dqNuMgQVTiZJyiUZVVL9v1vOj?usp=sharing

Files used in current tests

Liger/L1/2024B-P001-001_Liger_IMG_SCI_LVL1_0001_M13-J-10mas-skyscale0.5.fits
Liger/L1/2024B-P001-001_Liger_IMG_SCI_LVL1_0001_M13-J-10mas-skyscale0.75.fits
Liger/L1/2024B-P001-001_Liger_IMG_SCI_LVL1_0001_M13-J-10mas-skyscale1.0.fits
Liger/L1/2024B-P001-001_Liger_IMG_SCI_LVL1_0001_M13-J-10mas-skyscale1.25.fits
Liger/L1/2024B-P001-001_Liger_IMG_SCI_LVL1_0001_M13-J-10mas-skyscale1.5.fits

Liger/Cals/Liger_IMG_DARK_20240924000000_0.0.1.fits
Liger/Cals/Liger_IMG_FLAT_20240924000000_0.0.1.fits

IRIS/L1/2024B-P001-001_IRIS_IMG1_SCI_LVL1_0001_M13-J-4mas.fits

