
Cache restoration intermittently hangs at +90%

See original GitHub issue

We are intermittently seeing cache restoration hang once it is more than 90% complete. It can hang for over 10 minutes, so we’ve set a 10-minute timeout to terminate the affected jobs. When our developers retry the workflow run without any changes, it often succeeds.
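
For reference, one way to enforce this kind of cut-off is GitHub Actions’ built-in timeout-minutes setting, which can be applied at either the job or the step level. The snippet below is a minimal sketch of that idea rather than our exact configuration:

      # Minimal sketch (illustrative): cap the cache restore step at 10 minutes
      # so a hung download fails fast instead of blocking the job.
      - name: Precompiled Assets Cache
        uses: actions/cache@v3
        id: precompiled-assets-cache
        timeout-minutes: 10
        with:
          path: tmp/**                       # abbreviated; the full path list is shown below
          key: precompiled-assets-${{ env.CACHE_VERSION }}-${{ github.ref }}-${{ github.sha }}

The same cut-off can also be applied to the whole job via jobs.<job_id>.timeout-minutes.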

Sample logs:

2022-07-25T20:24:42.4105298Z Received 20971520 of 532098493 (3.9%), 20.0 MBs/sec
2022-07-25T20:24:43.4104976Z Received 125829120 of 532098493 (23.6%), 60.0 MBs/sec
2022-07-25T20:24:44.4105956Z Received 243269632 of 532098493 (45.7%), 77.3 MBs/sec
2022-07-25T20:24:45.4128635Z Received 356515840 of 532098493 (67.0%), 84.9 MBs/sec
2022-07-25T20:24:46.4134718Z Received 427819008 of 532098493 (80.4%), 81.6 MBs/sec
2022-07-25T20:24:47.4133829Z Received 465567744 of 532098493 (87.5%), 74.0 MBs/sec
2022-07-25T20:24:48.4138417Z Received 507510784 of 532098493 (95.4%), 69.1 MBs/sec
2022-07-25T20:24:49.4148580Z Received 511126973 of 532098493 (96.1%), 60.9 MBs/sec
2022-07-25T20:24:50.4167174Z Received 511126973 of 532098493 (96.1%), 54.1 MBs/sec
2022-07-25T20:24:51.4178143Z Received 511126973 of 532098493 (96.1%), 48.7 MBs/sec
2022-07-25T20:24:52.4191064Z Received 511126973 of 532098493 (96.1%), 44.3 MBs/sec
2022-07-25T20:24:53.4205470Z Received 511126973 of 532098493 (96.1%), 40.6 MBs/sec
2022-07-25T20:24:54.4218795Z Received 511126973 of 532098493 (96.1%), 37.5 MBs/sec
2022-07-25T20:24:55.4232178Z Received 511126973 of 532098493 (96.1%), 34.8 MBs/sec
2022-07-25T20:24:56.4246845Z Received 511126973 of 532098493 (96.1%), 32.5 MBs/sec
< log lines showing 96.1% repeat, with MBs/sec steadily dropping, until the job is manually terminated; the byte count stalls at 511126973 of 532098493, exactly 20 MiB short of completion, ending with the following output >
2022-07-25T20:58:16.9998830Z Received 511126973 of 532098493 (96.1%), 0.2 MBs/sec
2022-07-25T20:58:17.0660707Z ##[error]The operation was canceled.

Here is the code that triggers this cache restoration. It runs inside a job matrix that spawns 20 concurrent jobs, so it’s possible that resource contention is a factor:

      - name: Precompiled Assets Cache
        uses: actions/cache@v3
        id: precompiled-assets-cache
        with:
          path: |
            vendor/assets/components/**
            public/assets/**
            public/packs/**
            tmp/**
          key: precompiled-assets-${{ env.CACHE_VERSION }}-${{github.ref}}-${{github.sha}}
          restore-keys: |
            precompiled-assets-${{ env.CACHE_VERSION }}-${{github.ref}}-
            precompiled-assets-${{ env.CACHE_VERSION }}-refs/heads/main-
            precompiled-assets-${{ env.CACHE_VERSION }}-

Here is the code that originally cached the data (it is the same step as above; actions/cache saves the cache in its post-job phase when the primary key did not produce an exact hit):

      - name: Precompiled Assets Cache
        uses: actions/cache@v3
        id: precompiled-assets-cache
        with:
          path: |
            vendor/assets/components/**
            public/assets/**
            public/packs/**
            tmp/**
          key: precompiled-assets-${{ env.CACHE_VERSION }}-${{github.ref}}-${{github.sha}}
          restore-keys: |
            precompiled-assets-${{ env.CACHE_VERSION }}-${{github.ref}}-
            precompiled-assets-${{ env.CACHE_VERSION }}-refs/heads/main-
            precompiled-assets-${{ env.CACHE_VERSION }}-
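
One related note on how this step can be consumed downstream: actions/cache exposes a cache-hit output that is only 'true' when the primary key matched exactly, so a later step can skip the expensive asset precompilation on an exact hit. The step below is a hypothetical illustration (the rails command is assumed), not part of the workflow above:

      # Hypothetical follow-up step: rebuild assets only when the exact cache
      # key missed (a restore-keys fallback hit still leaves cache-hit != 'true').
      - name: Precompile assets
        if: steps.precompiled-assets-cache.outputs.cache-hit != 'true'
        run: bundle exec rails assets:precompile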

Please let me know if we’re doing anything incorrectly. We’d love to learn whether this is happening because of the way we’re using the feature, or whether we’re hitting some limit.

Thanks, maintainers, for all that you do! ❤️

Issue Analytics

  • State: closed
  • Created: a year ago
  • Reactions: 6
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

4 reactions
bishal-pdMSFT commented, Jul 29, 2022

This is an issue in Azure storage SDK and we are following up with them.

3 reactions
asottile-sentry commented, Aug 15, 2022

@kotewar can that be configured down to say ~3 minutes? an hour is pretty bad still
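
For anyone landing here later: newer releases of actions/cache document a segment-download timeout that can be lowered through an environment variable. The sketch below assumes the SEGMENT_DOWNLOAD_TIMEOUT_MINS variable from the actions/cache README; the exact name and default may differ between releases:

      # Sketch: abort a stuck segment download after ~3 minutes instead of
      # waiting for the much longer default timeout.
      - name: Precompiled Assets Cache
        uses: actions/cache@v3
        id: precompiled-assets-cache
        env:
          SEGMENT_DOWNLOAD_TIMEOUT_MINS: 3   # assumed to be supported by the version in use
        with:
          path: tmp/**                       # abbreviated; same paths and keys as in the issue above
          key: precompiled-assets-${{ env.CACHE_VERSION }}-${{ github.ref }}-${{ github.sha }}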

Read more comments on GitHub >

