
How to speed up checkout for big repos?

I would like to limit the number of cloned branches to a minimum. As an example, cloning the master branch takes ~7s at a rate of ~3 MiB/s:

$ time git clone --single-branch --branch master https://github.com/plokhotnyuk/jsoniter-scala.git
Cloning into 'jsoniter-scala'...
remote: Enumerating objects: 260, done.
remote: Counting objects: 100% (260/260), done.
remote: Compressing objects: 100% (221/221), done.
remote: Total 29660 (delta 39), reused 236 (delta 27), pack-reused 29400
Receiving objects: 100% (29660/29660), 13.22 MiB | 2.92 MiB/s, done.
Resolving deltas: 100% (10355/10355), done.
 
real	0m6,484s
user	0m2,249s
sys	0m0,368s

Meanwhile, the checkout action with the following configuration takes more than 1.5 minutes:

      - uses: actions/checkout@v1
        with:
          ref: ${{ github.ref }}
          fetch-depth: 100

Which options can be used to speed it up?
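
For comparison, the leanest a workflow step can get is a manual shallow fetch of a single ref; a rough sketch (the use of GITHUB_REF and the step layout are assumptions for illustration, not part of the original question):

      - name: Manual shallow fetch (illustrative sketch)
        run: |
          # Fetch only the triggering ref, one commit deep, into a fresh repository
          git init repo && cd repo
          git remote add origin https://github.com/plokhotnyuk/jsoniter-scala.git
          git fetch --depth=1 origin "$GITHUB_REF"
          git checkout FETCH_HEAD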

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 11

Top GitHub Comments

1 reaction
ericsciple commented, Dec 2, 2019

@plokhotnyuk i’m making perf improvements with https://github.com/actions/checkout/pull/70

i’ll merge it into master tomorrow and push a tag v2-beta
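
Once that tag is pushed, pointing a workflow at it would look roughly like the following (a sketch that only assumes the tag name mentioned above; fetch-depth: 1 is shown explicitly to keep the fetch shallow):

      - uses: actions/checkout@v2-beta
        with:
          fetch-depth: 1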

0 reactions
plokhotnyuk commented, Dec 13, 2019

@ericsciple Thank you a lot for your support!

The following config works fine and completes both steps in ~4 seconds:

      - uses: actions/checkout@v2
        with:
          fetch-depth: 100
      - name: Fetch tags
        run: git fetch --depth=100 origin +refs/tags/*:refs/tags/*

It didn’t work with depth=1 for me.

Using tags to get the version of the latest release is quite handy, especially in the case of multiple maintained branches.
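
For context, one common way such a tag-based version lookup works once the tags are available locally is git describe, roughly as sketched below (illustrative, not taken from this issue):

      - name: Show version derived from the nearest tag (illustrative)
        run: git describe --tags --abbrev=0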

Top Results From Across the Web

speed up git in big repos with this trick (beginner) anthony ...
today I show a little tip for speeding up `git` operations in large repositories by enabling a feature flag! this made `git status`...
How to handle big repositories with Git | Atlassian Git Tutorial
Solution for big folder trees: git sparse-checkout · Clone the full repository once: 'git clone' · Activate the feature: 'git config core.sparsecheckout true' ...
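
A minimal sketch of the classic core.sparseCheckout recipe that snippet outlines (the repository URL and the checked-out path are illustrative; newer Git releases also provide a dedicated git sparse-checkout command):

$ git clone --no-checkout https://github.com/plokhotnyuk/jsoniter-scala.git
$ cd jsoniter-scala
$ git config core.sparseCheckout true
$ echo "jsoniter-scala-core/" >> .git/info/sparse-checkout   # only paths listed here get materialized
$ git checkout master
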
Faster Git checkouts on NFS and SSD with parallelism
Parallel checkout produces the best results for (large) repos on SSDs and NFS mounts. It is not recommended for small repos and/or repos...
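
Parallel checkout is exposed through configuration in Git 2.32 and later; a sketch of enabling it (values are illustrative, see the git-config documentation for exact semantics):

$ git config --global checkout.workers 0                    # a value below 1 means one worker per logical core
$ git config --global checkout.thresholdForParallelism 100  # stay sequential for small checkouts
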
Get up to speed with partial clone and shallow clone
If your repository has a deep history full of large blobs, then this option can significantly reduce your git clone times. The commit...
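
A sketch of the partial-clone option that article describes, assuming a blobless clone is acceptable (missing blobs are downloaded on demand later), with a shallow clone shown for contrast:

$ git clone --filter=blob:none https://github.com/plokhotnyuk/jsoniter-scala.git   # blobless partial clone
$ git clone --depth=1 https://github.com/plokhotnyuk/jsoniter-scala.git            # shallow clone: truncate history instead
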
Checkout large repository from gerrit is slow. - Google Groups
as the JGit object cache is allocated on the JVM's heap. git gc should be configured to generate bitmap indexes which speed up...
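
The bitmap advice applies on the server side; a sketch of what that typically looks like on the bare repository that serves the fetches (an assumption, not taken from this thread):

$ git config repack.writeBitmaps true      # have future repacks write reachability bitmaps
$ git repack -a -d --write-bitmap-index    # repack now and write the bitmap index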
