
Performance regression in CI from 6.6 -> 6.7 (and 6.8/7.0)

See original GitHub issue

Current behavior

Cypress got slower in v6.7.

Desired behavior

That it's about as fast as 6.6 (or faster! šŸ˜‰)

Details

My team is using Cypress with GitLab CI, and when we upgraded from 6.6 to 6.7 our test jobs slowed significantly (Cypress suites took 2-3x as long).

I can't share the code, so I produced a new project that demonstrates a performance regression, hopefully for the same reasons:

https://github.com/jrr/cypress-storybook-example

The project contains two Cypress suites: the example tests that come out of the box, and a separate suite that visits Storybook pages (rendering miscellaneous Material-UI React components).
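For context, the Storybook suite boils down to something like the following. This is a minimal sketch, not the repo's exact code: the story IDs are hypothetical and `baseUrl` is assumed to point at the static Storybook build, though `iframe.html?id=...` is Storybook's standard direct-render URL:

```js
// cypress/integration/visit-storybook.spec.js (illustrative file name)
// Visit each story via Storybook's iframe URL so only the component
// under test is rendered, then assert that something mounted.
const storyIds = [
  'button--primary', // hypothetical Material-UI story IDs
  'checkbox--checked',
  'slider--continuous',
];

describe('visit storybook pages', () => {
  storyIds.forEach((id) => {
    it(`renders ${id}`, () => {
      cy.visit(`/iframe.html?id=${id}`);
      cy.get('#root').should('not.be.empty'); // #root is the story mount point
    });
  });
});
```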

Locally (2018 MacBook Pro) the performance difference is negligible. In CI, though, there's a noticeable slowdown:

| Suite | Environment | 6.6 | 6.7 | 6.8 | 7.0 | 7.2 |
|---|---|---|---|---|---|---|
| Visit Storybook | Local | 0:27 / 0:26 | 0:27 / 0:27 | 0:31 / 0:28 | 0:29 / 0:30 | 0:31 / 0:30 |
| Visit Storybook | GitHub | 0:55 / 0:54 / 1:07 | 1:44 / 1:28 / 2:05 | 2:04 / 2:11 / 1:53 | 1:39 / 1:45 / 1:43 | 1:10 / 1:03 / 1:03 |
| Visit Storybook | GitLab | 2:54 / 2:53 / 2:54 | 5:53 / 5:15 / 6:01 | 5:23 / 5:27 / 4:57 | 5:09 / 4:45 / 4:48 | 4:25 / 3:26 / 4:33 |
| Cypress Examples | Local | 1:14 / 1:16 | 1:20 / 1:21 | 1:21 / 1:21 | 1:34 / 1:28 | 1:22 / 1:23 |
| Cypress Examples | GitHub | 1:31 / 1:25 / 1:27 | 1:48 / 1:40 / 1:49 | 2:03 / 2:09 / 1:42 | 1:45 / 1:54 / 1:46 | 1:27 / 1:20 / 1:22 |
| Cypress Examples | GitLab | 2:03 / 1:53 / 1:48 | 3:32 / 3:32 / 3:21 | 3:21 / 3:30 / 3:04 | 3:28 / 3:07 / 3:01 | 2:03 / 1:55 / 2:01 |

(Using Chrome --headless. I tried a few runs in case of outliers; each cell above lists the individual run times, mm:ss.)

Both GitHub Actions and GitLab CI are connected to the project repo and logs are publicly visible:

(CI pipeline status badges for GitHub Actions and GitLab CI)

Both jobs are running tests in the `cypress/browsers:node14.16.0-chrome89-ff86` Docker environment.

GitLab's "shared runners" are rather meager: 3.75 GB of RAM and 1 vCPU. GitHub's are slightly beefier, with 7 GB and 2 CPUs.

Here's what's reported by `cypress info`:

```
// GitHub
Cypress Version: 6.6.0
System Platform: linux (Debian - 10.8)
System Memory: 7.29 GB free 1.16 GB

// GitLab
Cypress Version: 6.6.0
System Platform: linux (Debian - 10.8)
System Memory: 3.87 GB free 573 MB
```

I'm not sure where the bottleneck is (e.g. CPU, memory, disk). Are you aware of a performance regression in the last couple of versions? Does Cypress maintain performance tests to catch major deltas from release to release? (It looks like the cypress-real-world-app tests may have gotten a bit slower, according to its dashboard.)

Is there anything I can try to further track it down and/or improve it?
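
One thing I can try on my end is watching memory growth during a run. Below is a minimal sketch, assuming memory pressure is part of the problem; the `logMemory` task name is mine, but `on('task', ...)` and `cy.task()` are Cypress's standard plugin/task API:

```js
// cypress/plugins/index.js (Cypress 6.x plugins file)
module.exports = (on) => {
  on('task', {
    // Log the plugin (Node) process's resident set size with a label.
    // Note: this samples the plugin process, not the browser itself.
    logMemory(label) {
      const mb = Math.round(process.memoryUsage().rss / 1024 / 1024);
      console.log(`[mem] ${label}: ${mb} MB rss`);
      return null; // a task must return a serializable value or null
    },
  });
};
```

Then an `afterEach(() => { cy.task('logMemory', 'after test'); })` in a spec prints a sample after every test, which would at least show whether the slowdown tracks memory pressure on the 3.75 GB GitLab runners.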

Side Note 1: Retained Prereleases

I started trying to bisect commits between 6.6 and 6.7 to track down what change introduced the performance regression (using the install pre-release version guide), but I didn't get far before I hit the end of the prerelease packages.

Could you retain them for longer? Say, 3 or 4 releases' worth? The prerelease packages are great for trying out the latest bleeding edge, but they could also be great for bisecting history if they were available longer.

Side Note 2: Cypress Stress Test

If you take the storybook test in this repo, and crank it up from 50 visits to ~300, it starts out pretty zippy:

https://user-images.githubusercontent.com/164652/113465256-a4291a00-9400-11eb-9472-c12df185a037.mp4

But by the end it gets real slow:

https://user-images.githubusercontent.com/164652/113465259-abe8be80-9400-11eb-90e0-a09c7dc4af1d.mp4

Do you know what's going on here? I thought it might be useful as a stress/performance test if you don't already have something like this.
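
For reference, the cranked-up version is just more of the same visits in a loop, roughly like this (a sketch; the story ID is hypothetical):

```js
// Hypothetical stress variant: repeat one Storybook visit ~300 times
// to surface the gradual slowdown shown in the videos above.
describe('storybook stress test', () => {
  for (let i = 1; i <= 300; i++) {
    it(`visit #${i}`, () => {
      cy.visit('/iframe.html?id=button--primary');
      cy.get('#root').should('not.be.empty');
    });
  }
});
```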


edit: added numbers for Cypress 7.0
edit: added numbers for Cypress 7.2

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 10
  • Comments: 27 (9 by maintainers)

Top GitHub Comments

1 reaction
timfaase commented, Apr 30, 2021

I'm seeing a similar performance drop between 6.6.0 and 7.2.0, from around 33 minutes to 40 minutes. Screen size did not matter.

1 reaction
GC-Mark commented, Apr 7, 2021

@flotwig Should we create a separate issue for the performance regression between 6.8.0 and 7.0.0?

Edit: done #15853. Close if you want to keep it all here.

Read more comments on GitHub
