
Conversation


@cookpa (Member) commented Jan 30, 2026

The models and networks are too large to cache (the limit is 10 GB per repository), but this PR lets them be saved within a workflow run using artifacts. Thus we can run a build matrix to test different Python versions without repeated downloads. However, the artifacts aren't accessible to subsequent runs.
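For context, a minimal sketch of the download-once, test-in-matrix pattern described above, assuming the data lands in ~/.keras/ANTsXNet and that the repository's download_all_data.py script populates it; the artifact name, Python versions, and test command are illustrative, not taken from this PR:

```yaml
jobs:
  download-data:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Download models and networks once
        run: python download_all_data.py
      # Save the downloaded data so other jobs in this run can reuse it.
      # Note: artifacts are scoped to this workflow run; later runs cannot see them.
      - uses: actions/upload-artifact@v4
        with:
          name: antsxnet-data
          path: ~/.keras/ANTsXNet   # assumed download location

  test:
    needs: download-data
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9", "3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      # Restore the data saved earlier in this workflow run.
      - uses: actions/download-artifact@v4
        with:
          name: antsxnet-data
          path: ~/.keras/ANTsXNet
      - name: Install and test
        run: |
          pip install .
          pytest
```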

cookpa added 12 commits December 2, 2025 10:23
Fixes:

1. Cache data based on the files antspynet/utilities/get_antsxnet_data.py and antspynet/utilities/get_pretrained_network.py. These are what actually determine what gets downloaded; caching on the hash of download_all_data.py would produce cache hits even when the data changes (see the sketch after this list).

2. Only update the cache once, in a single job, rather than in the build matrix. This avoids race conditions and duplicated cache uploads.

3. Specify branches in the run conditions, so the workflow doesn't run twice on PRs.

4. Build from pyproject.toml, not the modified requirements.txt. This allows testing on recent Python versions. A separate test for the requirements.txt install may be needed, if that install path is still required.
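A hedged sketch of how fixes 1, 3, and 4 might look in the workflow YAML; the branch name, cache path, and key prefix are assumptions rather than the PR's actual configuration. Fix 2 would typically be handled by splitting actions/cache into actions/cache/restore (used in the matrix jobs) and actions/cache/save (run in a single job) so that only one job writes the cache.

```yaml
# Fix 3: run on pushes to the default branch and on PRs targeting it,
# so a PR from a branch in the same repository doesn't trigger both events.
on:
  push:
    branches: [ master ]   # assumed default branch name
  pull_request:
    branches: [ master ]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Fix 1: key the cache on the files that define the downloads, so the
      # cache is invalidated whenever the data or network lists change.
      - name: Cache ANTsXNet data
        uses: actions/cache@v4
        with:
          path: ~/.keras/ANTsXNet   # assumed download location
          key: antsxnet-data-${{ hashFiles('antspynet/utilities/get_antsxnet_data.py', 'antspynet/utilities/get_pretrained_network.py') }}

      # Fix 4: install from pyproject.toml instead of a pinned requirements.txt.
      - name: Install package
        run: pip install .
```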
Tired of guessing where I can put an environment variable
Not great but something