Conversation

@StevenClontz (Collaborator) commented Oct 20, 2025

Closes #638

I feel like I've probably reinvented some wheel, but this script should grab and extract the latest build artifact if it exists.

Once #623 is merged, I'll update this to run on provisioning a new codespace before building things, which should really speed things up.
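
For reference, a minimal sketch of what such a script might look like using the GitHub CLI (gh). The repository slug and artifact name below are assumptions rather than values taken from this PR, and this is not the actual scripts/download-latest-artifact.sh:

```bash
#!/usr/bin/env bash
# Hypothetical sketch: fetch and extract the most recent unexpired build
# artifact from GitHub Actions. Repo slug and artifact name are assumptions.
set -euo pipefail

REPO="Doenet/DoenetML"   # assumption
ARTIFACT_NAME="build"    # assumption: whatever name the CI upload step uses

# List recent artifacts and pick the newest unexpired one with that name.
ARTIFACT_ID=$(gh api "repos/${REPO}/actions/artifacts?per_page=100" \
  --jq ".artifacts
        | map(select(.name == \"${ARTIFACT_NAME}\" and .expired == false))
        | sort_by(.created_at) | last | .id")

if [ -z "${ARTIFACT_ID}" ] || [ "${ARTIFACT_ID}" = "null" ]; then
  echo "No unexpired '${ARTIFACT_NAME}' artifact found; skipping cache download."
  exit 0
fi

# The /zip endpoint redirects to a short-lived download URL; gh follows it.
gh api "repos/${REPO}/actions/artifacts/${ARTIFACT_ID}/zip" > /tmp/artifact.zip
unzip -o /tmp/artifact.zip -d .
```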

@StevenClontz StevenClontz requested a review from siefkenj October 20, 2025 10:27
@StevenClontz (Collaborator, Author) commented
CC @oscarlevin and @siwelwerd -- this pattern would allow us to use GitHub's built-in artifact system to download caches of PreTeXt-generated or CheckIt-generated assets for new codespaces/clones, without needing to upload them elsewhere as is done currently for TBIL.org

@siefkenj (Contributor) commented
Is this intended to be run during container setup or on every commit? If it is run on container setup, I am worried that the artifacts may have expired...

Two workarounds would be (1) to have another repo to which the artifact gets published: https://github.com/marketplace/actions/publish-artifact-to-git or (2) to publish the artifact zip bundle at a fixed URL on GitHub Pages and download it from there.
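
For concreteness, workaround (2) could reduce to something like the following in the setup script; the URL and file name here are hypothetical, chosen by whatever workflow publishes the bundle:

```bash
# Hypothetical sketch of workaround (2): a CI workflow publishes the zip to the
# gh-pages branch, so a fresh codespace can fetch it from a stable URL.
CACHE_URL="https://doenet.github.io/DoenetML/build-cache.zip"  # assumption

if curl --fail --silent --location "${CACHE_URL}" --output /tmp/build-cache.zip; then
  unzip -o /tmp/build-cache.zip -d .
else
  echo "No prebuilt cache available; proceeding with a full build."
fi
```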

Or we could trust that doenet will always be updated every 5 days and so the artifacts won't expire.

@dqnykamp do you have an opinion?

@StevenClontz (Collaborator, Author) commented
So the script is now successfully downloading and extracting something, but it doesn't seem to actually speed up the build process if I run bash scripts/download-latest-artifact.sh && npm install && npm run build (obviously I don't really understand the build process yet).

@siefkenj (Contributor) commented

> So the script is now successfully downloading and extracting something, but it doesn't seem to actually speed up the build process if I run bash scripts/download-latest-artifact.sh && npm install && npm run build (obviously I don't really understand the build process yet).

Can you make sure that each subdirectory of packages/ has a .wireit subdirectory inside it?
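
One quick way to check that (a hypothetical one-liner, not part of this PR):

```bash
# List any package that is missing a restored .wireit cache directory.
for pkg in packages/*/; do
  [ -d "${pkg}.wireit" ] || echo "missing cache: ${pkg}"
done
```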

@dqnykamp (Member) commented

> Two workarounds would be (1) to have another repo to which the artifact gets published: https://github.com/marketplace/actions/publish-artifact-to-git or (2) to publish the artifact zip bundle at a fixed URL on GitHub Pages and download it from there.
>
> Or we could trust that doenet will always be updated every 5 days and so the artifacts won't expire.

I expect that the DoenetML repository will frequently have stretches of longer than 5 days where the main branch does not change. If that means this caching will stop working during those intervals, I think it makes sense to implement one of these workarounds. I'd have a slight preference for storing the artifact on GitHub Pages rather than in the git repository, but I would be happy to go with whatever the two of you deem the best solution.
