
v7.rc.0 serving 2x - 3x slower than stable versions on my projects


Bug or Feature ?

- [x] bug report

Command ?

- [x] serve --sourceMap=false

Versions ?

    Angular CLI: 7.0.0-rc.1
    Node: 10.9.0
    OS: win32 x64
    Angular: 7.0.0-rc.0
    … common, compiler, compiler-cli, core, forms, http
    … language-service, platform-browser, platform-browser-dynamic
    … router

    Package                      Version
    @angular-devkit/*            0.9.0-rc.1
    @angular-devkit/core         7.0.0-rc.1
    @angular-devkit/schematics   7.0.0-rc.1
    @angular/cdk                 7.0.0-beta.2
    @angular/cli                 7.0.0-rc.1
    @angular/material            7.0.0-beta.2
    @ngtools/webpack             7.0.0-rc.1
    @schematics/angular          7.0.0-rc.1
    @schematics/update           0.9.0-rc.1
    rxjs                         6.3.3
    typescript                   3.1.1
    webpack                      4.19.1

Repro steps ?

Updated from 6.2.3 to v7 rc0; this is the first time I’ve tested CLI v7 on my project. It is slower even when serving without source maps: with either ng s or ng s --sourceMap=false, the dev-server refresh takes around 10 seconds, while it was around 2-3 seconds before.

Log ?

https://gist.github.com/istiti/9da2393a26aeb0117932e56bb04edd9a

Useful details ?

The project is big relative to what angular/cli can handle without Bazel (1000+ files), and unfortunately I can’t share it. I just went back to the stable versions of the CLI (v6.2.4) and Angular (see the working package.json), and I get these timings when serving: https://gist.github.com/istiti/fc8b629e0a7f7cab13b41b070cbfb94b

Thanks @clydin

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 2
  • Comments: 41 (30 by maintainers)

Top GitHub Comments

26 reactions
filipesilva commented, Oct 5, 2018

I’ve had some questions in the past about how we debug these performance regressions, so thought I could do a write up here for anyone that’s curious.

The first step is always to reproduce the problem in a side by side comparison, and to get some data on it. Since the projects in this issue were not open source, I tried to reproduce using Angular.io (AIO), which you can find inside the aio folder of https://github.com/angular/angular.

I cloned AIO twice. In one of the clones I used the latest v6 CLI (@angular/cli@6.2.4 + @angular-devkit/build-angular@0.8.4), and in the other clone I used the upcoming v7 RC (@angular/cli@7.0.0-rc.1 + @angular-devkit/build-angular@0.9.0-rc.1).

The reports were about non-AOT builds, but AIO has some custom configurations, so I added a new configuration that disables most things to make sure I was getting data from the simplest case:

          "configurations": {
            "debug": {
              "aot": false,
              "optimization": false,
              "buildOptimizer": false,
              "outputHashing": "none",
              "sourceMap": false,
              "statsJson": false,
              "extractCss": false,
              "extractLicenses": false,
              "vendorChunk": true
            },

I ran ng serve --configuration=debug, triggered changes by adding console.log(1); to src/main.ts, and gathered some numbers. It’s important to actually add code when triggering a rebuild, because the build system does a lot less work if the AST of the files does not change. I also ignored the numbers from the first few rebuilds, since they are sometimes artificially inflated while caches are being populated.
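
If you want to script that trigger, a minimal sketch could look like the following; it is not part of the CLI, it just assumes the default src/main.ts layout and appends a statement with a unique value so every run produces a real AST change:

    // trigger-rebuild.ts (hypothetical helper): run while `ng serve` is watching.
    import { appendFileSync } from 'fs';

    const target = 'src/main.ts'; // assumed default project layout

    // Appending a new statement (not just whitespace) forces the build system
    // to do real work; read the rebuild time from the ng serve output.
    appendFileSync(target, `\nconsole.log(${Date.now()});\n`);
    console.log(`Touched ${target} at ${new Date().toISOString()}`);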

With this setup I got v6 at ~300ms rebuilds and v7 at ~6000ms. This confirmed the original report: rebuilds were much slower in v7.

Since I didn’t really know where to start looking, I tried to get a profile of where time was being spent using https://github.com/GoogleChromeLabs/ndb, which is a simplified version of the workflow described in https://medium.com/@paul_irish/debugging-node-js-nightlies-with-chrome-devtools-7c4a1b95ae27.

I tried taking a few profiles of the initial build plus the rebuild, but ndb kept crashing. I’d seen this happen in the past when there are a lot of profile events, so instead I tried to profile a single rebuild. I took a sample of a v6 and a v7 rebuild by following these steps:

  • ran ndb node ./node_modules/@angular/cli/bin/ng serve --configuration=debug
  • waited for the initial build to finish
  • waited 5s, manually appended console.log(1); to src/main.ts, waited for the rebuild to finish, then waited another 5s to let all processes stop activity
  • triggered 3 rebuilds like this
  • went to the ndb Performance tab, clicked the record button at the top, and waited for the recording to start
  • triggered another rebuild, waiting 8s before and after to be doubly sure I captured only an isolated rebuild
  • stopped the recording in ndb, then saved it to disk
  • killed the serve process, which also kills ndb
I shared these profiles with the team, then opened ndb twice, side by side, and loaded the profiles I had saved. The initial view is called “Summary” and looked like this:

v6 Summary image

v7 Summary image

There are three processes listed: Main (which I think comes from ndb itself), the ng serve process, and the forked type checker. We run type checking in a separate process to speed up rebuilds.

  • v6 has the main thread working from 8080 to 8900 (820ms), typechecker from 8200 to 10200 (2000ms).
  • v7 has the main thread working from 10360 to 23420 (13060ms), typechecker from 10450 to 32450 (22000ms).

Remember that these numbers are for a process while being profiled: individual numbers are not representative of real world apps, but the comparison between two profiled processes is.

I couldn’t say much here besides that v7 took way longer. But it was interesting to see that even the type checker took way longer. Since the type checker doesn’t really use any webpack internals, this indicated that whatever was slowing things down wasn’t related to webpack.
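
As an aside, “forked type checker” just means a child process that runs the TypeScript checker so diagnostics don’t block the webpack rebuild. Here is a minimal single-file sketch of the idea, not the actual @ngtools/webpack implementation; the message shape and root file list are made up:

    // type-check-fork-sketch.ts: the same file acts as the parent (no IPC channel)
    // or as the forked worker (process.send is defined).
    import { fork } from 'child_process';
    import * as ts from 'typescript';

    if (process.send === undefined) {
      // Parent: fork the checker once, then ask it to re-check on every rebuild
      // while the main build keeps going in parallel.
      const checker = fork(__filename);
      checker.send({ kind: 'check', rootFiles: ['src/main.ts'] }); // hypothetical message shape
    } else {
      // Worker: run the TypeScript checker and report diagnostics back to the parent.
      process.on('message', (msg: any) => {
        if (msg.kind !== 'check') { return; }
        const program = ts.createProgram(msg.rootFiles, { strict: true });
        const diagnostics = ts.getPreEmitDiagnostics(program);
        process.send!({ kind: 'diagnostics', count: diagnostics.length });
      });
    }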

At the bottom of the ndb window I switched to the “Bottom-Up” view. This tells you how much time is spent in each function. Bear in mind there is one table for each process.

ng serve Bottom-Up, v6 left, v7 right image

type checker Bottom-Up, v6 left, v7 right image

What we care about here is the column called “Self Time”, which is the time spent in that specific function itself, excluding the functions it calls. We treat the v6 profile as the baseline and look at what’s different in the v7 one.

We (the team) went over these numbers and drew some conclusions:

  • v7 spends a lot more time on virtual file system functions (an abstraction we added)
  • there’s a lot more time spent inside RxJs observables
  • normalize looks expensive or is being called excessively
  • the v6 type checker is dominated by TS function calls, while v7 has a lot of internal and fs errors; drilling down shows they actually come from our devkit/core hosts

We use a lot of Observables and knew RxJs had been updated in our packages, so we wondered if there was a performance regression there.

To debug this I forced the v7 devkit packages to use the same RxJs version as v6 by copying it over to node_modules/@angular-devkit/node_modules/. This forces Node module resolution to pick up that copy instead of the top-level one. I made sure the copy was actually being used by deleting a couple of its files, which produced a bunch of RxJs errors. There was no real change in rebuild times, so this didn’t seem to be the culprit.
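
For reference, the nested-node_modules trick works because Node’s resolver walks up from the importing file and uses the first node_modules folder that contains the package, so a copy placed under @angular-devkit shadows the top-level rxjs for devkit code only. A rough sketch of the copy step, assuming the v6-era rxjs has been unpacked into a local folder and a Node version with fs.cpSync (16.7+; on older Node you would copy recursively by hand):

    import { cpSync } from 'fs';

    // Assumed: ./rxjs-v6-copy holds the rxjs version that shipped with the v6 CLI.
    // Only code under node_modules/@angular-devkit/* will resolve to this copy;
    // the rest of the project keeps using the top-level rxjs.
    cpSync('./rxjs-v6-copy', './node_modules/@angular-devkit/node_modules/rxjs', {
      recursive: true,
    });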

Then I started replacing the src/ folders of more of the @angular-devkit/* packages with their v6 versions to see when the rebuild times changed. The most relevant ones seemed to be @angular-devkit/core (where our virtual file system is) and @ngtools/webpack (where our webpack plugin is). This isn’t a great way of debugging things, but in this case it worked because not too much had changed.

Changing the @ngtools/webpack source back to v6 made rebuilds fast again, so the regression was somewhere in the source of this package. There were only some 15 commits to this package that were in v7 but not in v6, so we started looking at those.

https://github.com/angular/angular-cli/pull/12462 and https://github.com/angular/angular-cli/pull/12461 were tested and together reduced the rebuild times by some 40% (4.5s -> 2.7s). Still too slow, but it helped.

You might notice that the 4.5s rebuild time I just mentioned is different from the 6s one I reported at the beginning. I don’t know the specific reason for the discrepancy. Likely my machine had more resources available, fewer things running in the background, or the process just ended up on a CPU with less load. All of this is common, so comparing numbers from different debugging sessions doesn’t mean much. If you want accurate numbers you need to do a before and after that are close together, which is how I got the 4.5s -> 2.7s.

I took some more profiles and compared the latest changes (v7-64d1524) with the original v7.

ng serve Bottom-Up, v7 left, v7-64d1524 right image

The effect seemed as expected from https://github.com/angular/angular-cli/pull/12462 and https://github.com/angular/angular-cli/pull/12461: normalize calls took a lot less time and there were fewer file system calls in general.

Now most of the time was being spent in some kind of file system error. Drilling down showed it ultimately came from TypeScript looking up files:

image

After some time debugging, we saw that it was TypeScript trying to resolve modules, which probes for files with various names to see if they exist. We discussed this for a while and saw there had been some changes to how files were cached.
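
To make the “files with various names” part concrete, here is a small sketch using the public TypeScript compiler API (not CLI internals) that counts how many existence checks a single bare import triggers under Node-style resolution; the module specifier and containing file are just placeholders:

    import * as ts from 'typescript';

    let probes = 0;
    const host: ts.ModuleResolutionHost = {
      // Node-style resolution probes .ts, .tsx, .d.ts, package.json and index files,
      // and walks up through every parent node_modules folder.
      fileExists: (file) => { probes++; return ts.sys.fileExists(file); },
      readFile: (file) => ts.sys.readFile(file),
    };

    ts.resolveModuleName('rxjs/operators', 'src/main.ts', {
      moduleResolution: ts.ModuleResolutionKind.NodeJs,
    }, host);

    console.log(`One import triggered ${probes} fileExists probes`);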

We tried adding a cache to the TypeScript module name resolution, which further reduced rebuild times but still didn’t bring them close to the original ones. @clydin then discovered that we weren’t actually caching the TypeScript SourceFiles anymore, which caused full instead of incremental TypeScript rebuilds.
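
In sketch form, the two caches look roughly like the following; this is a simplification for illustration rather than the code from the PR (change detection, for instance, is reduced to a plain text comparison):

    import * as ts from 'typescript';

    const options: ts.CompilerOptions = { moduleResolution: ts.ModuleResolutionKind.NodeJs };

    // 1) Module name resolution cache: kept across rebuilds so repeated imports
    //    don't re-probe the file system every time.
    const resolutionCache = ts.createModuleResolutionCache(process.cwd(), (f) => f, options);
    function resolve(moduleName: string, containingFile: string) {
      return ts.resolveModuleName(moduleName, containingFile, options, ts.sys, resolutionCache);
    }

    // 2) SourceFile cache: reuse the parsed AST for files whose content hasn't changed,
    //    so a rebuild stays incremental instead of re-parsing the whole program.
    const sourceFiles = new Map<string, ts.SourceFile>();
    function getSourceFile(fileName: string, text: string): ts.SourceFile {
      const cached = sourceFiles.get(fileName);
      if (cached && cached.text === text) {
        return cached;
      }
      const fresh = ts.createSourceFile(fileName, text, ts.ScriptTarget.ES2015, true);
      sourceFiles.set(fileName, fresh);
      return fresh;
    }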

He put up https://github.com/angular/angular-cli/pull/12483, which showed ~254ms rebuilds (266, 281, 244, 246, 233), while v6 showed ~270ms rebuilds (318, 233, 248, 238, 313). So perhaps even slightly faster, but that might just be noise in the data. Either way, it looks like it re-establishes parity.

I’d like to say that we are looking at better automated ways to detect these performance regressions as they are introduced. We have an internal benchmarking tool (https://github.com/angular/angular-cli/pull/12022) and a new --profile flag (https://github.com/angular/angular-cli/pull/11497). Unfortunately neither of these is very useful for rebuilds right now, which is why this is still a lot of manual work.

I hope someone finds this write-up useful!

4 reactions
filipesilva commented, Oct 2, 2018

I could reproduce with https://github.com/angular/angular/tree/master/aio.

Using @angular-devkit/build-angular@0.8.3 I saw non-AOT ng serve rebuilds of ~300ms.

With the v7 RC of @angular-devkit/build-angular, rebuilds were ~6000ms. Most of the time seemed to be spent at 0% progress.
