Upgrade to v2.3/v2.4 results in post-setup failure
We are using `actions/setup-node` v2.2 in a Yarn-based monorepo. We attempted to upgrade to v2.3 and v2.4, but the updated version fails in the post-setup step with the error `Error: Cache folder path is retrieved for yarn but doesn't exist on disk` in workflows that also run the `yarn install` command. (We have a workflow that runs `yarn audit` without actually installing the dependencies, and the upgrade to v2.3/v2.4 passes there without failures.)
Our usage:
steps:
  - name: 'Checkout'
    uses: actions/checkout@v2.3.4
  - name: 'Setup Node'
    uses: actions/setup-node@v2.4.0
    with:
      node-version: 14.x
      cache: 'yarn'
Logs from “Setup Node” step (identical for passing and failing builds):
Run actions/setup-node@v2.4.0
  with:
    node-version: 14.x
    cache: yarn
    always-auth: false
    check-latest: false
    token: ***
Found in cache @ /opt/hostedtoolcache/node/14.17.4/x64
/usr/local/bin/yarn --version
1.22.11
/usr/local/bin/yarn cache dir
/home/runner/.cache/yarn/v6
yarn cache is not found
Logs from “Post Setup Node” in successful v2.4 run:
Post job cleanup.
/usr/local/bin/yarn --version
1.22.11
/usr/local/bin/yarn cache dir
/home/runner/.cache/yarn/v6
/usr/bin/tar --posix --use-compress-program zstd -T0 -cf cache.tzst -P -C /home/runner/work/<redacted>/<redacted> --files-from manifest.txt
Cache Size: ~0 MB (258 B)
Cache saved successfully
Cache saved with the key: node-cache-Linux-yarn-abc1f1c083ff490ac797c8966c624b3975d27c522daaed7ed60b66058fca009f
Logs from “Post Setup Node” in failed v2.4 run:
Post job cleanup.
/usr/local/bin/yarn --version
1.22.11
/usr/local/bin/yarn cache dir
/home/runner/.cache/yarn/v6
Error: Cache folder path is retrieved for yarn but doesn't exist on disk: /home/runner/.cache/yarn/v6
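Based on the error message, the post-setup step appears to resolve the package manager's cache directory and then require that it exists on disk. A minimal sketch of that check (an assumption reconstructed from the log output, not the action's actual source; `check_cache_dir` is a hypothetical helper):

```shell
#!/bin/sh
# Hypothetical reconstruction of the post-setup existence check.
# `yarn cache dir` reports a path even if nothing was ever written there,
# so the directory can be missing when no `yarn install` ran in the job.
check_cache_dir() {
  if [ -d "$1" ]; then
    echo "cache folder found: $1"
  else
    echo "Error: Cache folder path is retrieved for yarn but doesn't exist on disk: $1"
    return 1
  fi
}

# In the real action the path comes from `yarn cache dir`; simulate both cases.
dir="$(mktemp -d)/yarn/v6"
check_cache_dir "$dir" || true   # reported but never created: the check fails
mkdir -p "$dir"                  # after an install, the folder exists
check_cache_dir "$dir"
```

This would explain why jobs that never run `yarn install` fail only in the post step: the main step merely resolves the path, while the post step asserts that it exists before archiving it.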
Issue Analytics
- Created: 2 years ago
- Reactions: 5
- Comments: 15 (6 by maintainers)
Top GitHub Comments
The current behavior is expected and correct. We don't have plans to change it. If `cache` is enabled in YAML and the build doesn't install any dependencies, there is nothing to cache and the cache feature won't work properly. In previous versions we skipped such errors silently, but starting from 2.4.0 we have decided to fail the builds to notify users that the cache doesn't work, so the user can either fix their build or disable the cache.

Doesn't this strip `actions/setup-node` build steps of their idempotence? That is, `actions/setup-node` build steps may behave differently based on the success or failure of other steps in the job, whether or not the job has been run before, and/or the results of previous jobs. If so, I feel we may have lost something valuable here.

I personally would prefer the previous behavior, as I don't want to have to worry about whether setting the cache flag on `actions/setup-node` might cause otherwise successful jobs to fail. I get not wanting to silently swallow "cache doesn't work" issues; perhaps a clear warning in the job logs is more appropriate than a fatal error?

Here's a workaround I'm using for now, FWIW:
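The workaround snippet itself wasn't preserved in this archive. One possible shape (a sketch, not the commenter's original code, assuming the idea is to drop the built-in `cache` input and manage caching explicitly so jobs that never install dependencies simply save nothing instead of failing):

```yaml
# Hypothetical workaround, not the commenter's original snippet:
# remove `cache: yarn` from setup-node and cache the Yarn folder manually.
steps:
  - name: 'Checkout'
    uses: actions/checkout@v2.3.4
  - name: 'Setup Node'
    uses: actions/setup-node@v2.4.0
    with:
      node-version: 14.x   # note: no `cache: yarn` here
  - name: 'Cache Yarn'
    uses: actions/cache@v2
    with:
      path: ~/.cache/yarn/v6
      key: yarn-${{ runner.os }}-${{ hashFiles('**/yarn.lock') }}
```

With this layout, a job that only runs `yarn audit` restores nothing and saves an empty cache at worst, rather than failing in the post-setup step.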
Appreciate your continued investment in this project, Dmitry and team! 🙇 🙏