[WIP] Github action for pulling in nightly builds from CircleCI #29
Part of ethereum/solidity#9258.
### What works
- The action clones the `solc-bin` repo without checking out all the 10 GB of data. This makes it run in under a minute (compared to ~5 minutes with a full checkout). See the sketch after this list.
  - Skipping blob download (`git clone --filter blob:none`) makes it almost instant, but then git downloads all the blobs when you commit anyway. Maybe I could get it to stop doing that somehow, but 1 minute is already good enough, even for debugging.
  - A shallow clone (`git clone --depth 1`) should make it download less data, but that doesn't seem to be the case and it even makes the whole operation run slower (~30 seconds longer).
- It's already running in my fork of `solc-bin`.
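For reference, a minimal sketch of how committing into a clone that never checks out the working tree could look. This is an illustration under assumptions, not the actual action script: it assumes the action shells out to git, the file name and commit message are made up, and a git identity plus push credentials are already configured. `git read-tree HEAD` fills the index from the current commit so that the new commit keeps every existing file even though nothing was written to disk:

```python
import os
import subprocess

def git(*args, cwd=None):
    # Run a git command, raising if it fails.
    subprocess.run(["git", *args], cwd=cwd, check=True)

# Clone without populating the ~10 GB working tree.
git("clone", "--no-checkout", "https://github.com/ethereum/solc-bin.git", "solc-bin")
# Fill the index from HEAD so the next commit keeps all existing files.
git("read-tree", "HEAD", cwd="solc-bin")

# Hypothetical file name; the real binary would come from a CircleCI artifact.
new_file = "bin/soljson-nightly.js"
os.makedirs(os.path.dirname(os.path.join("solc-bin", new_file)), exist_ok=True)
with open(os.path.join("solc-bin", new_file), "wb") as f:
    f.write(b"\x00")  # placeholder content

git("add", new_file, cwd="solc-bin")
git("commit", "-m", "Add nightly build", cwd="solc-bin")
git("push", "origin", "HEAD", cwd="solc-bin")
```

The sketch uses a plain `--no-checkout` clone, so all blobs are downloaded up front; that matches the ~1 minute figure above and avoids the surprise blob fetch that `--filter blob:none` triggers at commit time.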
### What's still in progress
- The binaries are currently downloaded from the latest `b_ems` job, even if it's still a PR. I need to switch to getting them only from `develop`.
- Builds from `develop` run only once per day so they can be several pages down the list. I need to start using the `offset` query parameter for pagination and do multiple requests (see the sketch after this list).
- The `wasm/` directory is not updated yet.
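A hedged sketch of the pagination change, under the assumption that the action talks to the CircleCI v1.1 REST API (the `/tree/:branch` endpoint with `limit` and `offset` query parameters, and a `workflows.job_name` field on each returned build). The function name, page size, and use of Python are illustrative:

```python
import requests

PROJECT_URL = "https://circleci.com/api/v1.1/project/github/ethereum/solidity"

def latest_b_ems_build_on_develop(limit=25, max_pages=10):
    """Page through recent develop builds until a successful b_ems job is found."""
    for page in range(max_pages):
        response = requests.get(
            f"{PROJECT_URL}/tree/develop",
            params={"filter": "successful", "limit": limit, "offset": page * limit},
        )
        response.raise_for_status()
        builds = response.json()
        if not builds:
            return None  # ran out of builds without finding a match
        for build in builds:
            if build.get("workflows", {}).get("job_name") == "b_ems":
                return build["build_num"]
    return None
```

Querying `tree/develop` directly would also take care of the first point above, since builds from PR branches never show up in that list.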
### Other
- Commits created by the action use a dedicated author identity (`pull-soljson action <[email protected]>`).
- I want to create a `master` branch in the repo and set it as the branch where the action adds new commits. Since there are external tools that rely on binaries hosted on GH pages, I think that it would be best to consider `gh-pages` frozen until we get S3 ready to serve the files instead and to work on `master` in the meantime.