Improving performance on Line charts with over 10,000 datapoints.
Description
Hello everyone. I was wondering if someone could help me improve performance on large data visualizations. Ideally, I want to have four or more of these visualizations per page. However, zooming and panning on just one of them takes a very long time. I have found that not rendering points helps somewhat, but performance is still very poor. My data currently has only 2,000 points, but I expect it to grow to 10,000. Is billboard.js just not the right choice for these types of visualizations?
Steps to check or reproduce
My component
```ts
import { AfterViewInit, Component, Input, ViewEncapsulation } from '@angular/core';
import { bb, area, bar, zoom, line, subchart } from "billboard.js";

@Component({
  selector: 'graph',
  templateUrl: './graph.component.html',
  styleUrls: ['./graph.component.scss'],
  encapsulation: ViewEncapsulation.None,
})
export class GraphComponent implements AfterViewInit {
  @Input()
  dataPath;

  async ngAfterViewInit(): Promise<void> {
    const data = await this.getJSON(this.dataPath);
    console.log(data);
    const chart = bb.generate({
      bindto: "#chart",
      data: {
        // for ESM import usage, import 'line' module and execute it as
        // type: line(),
        type: line(),
        columns: [
          ["data1", ...data['REDACTED']]
        ]
      },
      zoom: {
        // for ESM import usage, import 'zoom' module and execute it as
        // enabled: zoom()
        enabled: zoom()
      },
      line: {
        point: false,
      },
      padding: {
        top: 20,
        right: 20,
        bottom: 20,
        left: 20
      },
      axis: {
        x: {
          tick: {
            show: false,
          }
        }
      }
    });
  }

  // fetch().then(r => r.json()) resolves to the parsed JSON, not a Response,
  // so the return type should reflect that.
  async getJSON(path: string): Promise<any> {
    return await fetch(path).then(r => r.json());
  }
}
```
My truncated data (it contains 2,000-10,000 points):
[ 2144.93569540061092, 2421.26223494051994, 2237.770344371368225, 2434.59656950940412, ... ]
I appreciate the help.
Issue Analytics
- State: closed
- Created: 3 years ago
- Comments: 6 (4 by maintainers)
Top Results From Across the Web

Amazon Quicksight Line chart support for 10,000 data points
Amazon QuickSight now supports 10,000 data points for Line charts. With this update, line chart performance has been improved to support 10,000 ...

How to increase RadChart performance with thousands of ...
The collection contains 9000 data points, which makes the chart take 1 minute to display. How can we improve the performance of the ...

Performance | Chart.js
Line charts are able to do automatic data decimation during draw, when certain conditions are met. You should still consider decimating data ...

Improving chart performance – amCharts 4 Documentation
Normally, Line series is fast. However, those can be bogged down by tens or even hundreds of thousands of data points. Below are some ...

Chart Performance with Large number of Data Points
JavaScript Charts & Graphs with large amount of Data Points. Built for High Performance and Ease of Use. Offers 10X Better Performance over ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I had a similar scenario recently, and solved it via sampling. Just want to share it for future onlookers here since the original request is closed anyway.
The alternative to plotting a lot of points is, of course, to reduce their number by “sampling”, i.e. taking only a subset of them via some algorithm, resulting in a nice-to-handle 500-2,000 data points.
Algorithms for this are readily available, e.g. the largest-triangle-three-buckets approach used for flot [1]. This and other algorithms are also available as part of d3fc [2]. The logic is based on a publicly available Master’s thesis, and both named projects have their code available under MIT license conditions. They both contain ready-to-use JavaScript implementations that can simply be copy-pasted, or you can implement them server-side, as I did in C# (with some help from [3]). Using sampling allows you to keep all the performance and animations you are used to from “normal” plots, with few drawbacks, since the number of pixels in one screen dimension is limited to some 1,000-4,000 anyway.
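For illustration, here is a minimal TypeScript sketch of the largest-triangle-three-buckets idea referenced above. It is not the commenter's original code; the function name, the [x, y]-pair input shape, and the usage comment at the end are assumptions made for this example.

```ts
// Largest-Triangle-Three-Buckets downsampling (after Steinarsson's thesis).
// `data` is an array of [x, y] pairs; `threshold` is the desired output size.
function largestTriangleThreeBuckets(
  data: [number, number][],
  threshold: number
): [number, number][] {
  if (threshold >= data.length || threshold < 3) {
    return data; // nothing to downsample
  }

  const sampled: [number, number][] = [data[0]]; // always keep the first point
  const bucketSize = (data.length - 2) / (threshold - 2);
  let a = 0; // index of the most recently selected point

  for (let i = 0; i < threshold - 2; i++) {
    // Average of the *next* bucket, used as the third corner of the triangle.
    const nextStart = Math.floor((i + 1) * bucketSize) + 1;
    const nextEnd = Math.min(Math.floor((i + 2) * bucketSize) + 1, data.length);
    let avgX = 0;
    let avgY = 0;
    for (let j = nextStart; j < nextEnd; j++) {
      avgX += data[j][0];
      avgY += data[j][1];
    }
    avgX /= nextEnd - nextStart;
    avgY /= nextEnd - nextStart;

    // In the current bucket, keep the point forming the largest triangle with
    // the previously kept point and the next bucket's average.
    const start = Math.floor(i * bucketSize) + 1;
    const end = Math.floor((i + 1) * bucketSize) + 1;
    let maxArea = -1;
    let maxIndex = start;
    for (let j = start; j < end; j++) {
      const area =
        Math.abs(
          (data[a][0] - avgX) * (data[j][1] - data[a][1]) -
            (data[a][0] - data[j][0]) * (avgY - data[a][1])
        ) / 2;
      if (area > maxArea) {
        maxArea = area;
        maxIndex = j;
      }
    }
    sampled.push(data[maxIndex]);
    a = maxIndex;
  }

  sampled.push(data[data.length - 1]); // always keep the last point
  return sampled;
}

// Usage with a bare array of y-values like the one in the question:
//   const pairs = rawValues.map((y, i) => [i, y] as [number, number]);
//   const reduced = largestTriangleThreeBuckets(pairs, 1000).map(p => p[1]);
//   // columns: [["data1", ...reduced]]
```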
The only relevant obstacle is, of course, zooming. This can be addressed by updating the chart’s underlying data in one of the zoom.onzoom* callbacks [4] (I did it after zooming into the old sampled data via onzoomend, so my zooming is now effectively a two-step process). Depending on whether you handle it server-side or client-side, the data needs to be re-queried or re-sampled in this step and then updated, e.g. via the load() function. Having replaced the chart’s underlying data in this step, the second adjustment is to also use the zoom.resetButton.onclick hook, where you need to reload the original data back into the chart. I recommend simply saving the original (sampled) data in a variable before handing it to the billboard chart in the first place, and reusing that variable to reset the chart during zoom-reset. A rough sketch of this flow is shown below.
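The following is a rough sketch of that two-step flow using billboard.js's zoom.onzoomend, zoom.resetButton.onclick and load() hooks. The names fullData, TARGET_POINTS and sampleColumn are placeholders invented for this example, the sampler is the sketch above, and the domain handling assumes an indexed x axis.

```ts
import { bb, line, zoom } from "billboard.js";

const TARGET_POINTS = 1000;
const fullData: number[] = []; // placeholder: the complete, unsampled series

// Downsample a slice of raw y-values into a billboard.js column,
// reusing the largestTriangleThreeBuckets() sketch above.
const sampleColumn = (values: number[]): (string | number)[] => {
  const pairs = values.map((y, i) => [i, y] as [number, number]);
  return ["data1", ...largestTriangleThreeBuckets(pairs, TARGET_POINTS).map(p => p[1])];
};

const initialColumn = sampleColumn(fullData);

const chart = bb.generate({
  bindto: "#chart",
  data: { type: line(), columns: [initialColumn] },
  zoom: {
    enabled: zoom(),
    // After a zoom gesture ends, re-sample only the visible slice and swap it in.
    onzoomend: (domain) => {
      const [from, to] = domain.map((d) => Math.round(Number(d)));
      const visible = fullData.slice(Math.max(0, from), Math.min(fullData.length, to + 1));
      chart.load({ columns: [sampleColumn(visible)] });
    },
    resetButton: {
      // Put the originally sampled series back when the zoom is reset.
      onclick: () => {
        chart.load({ columns: [initialColumn] });
      },
    },
  },
});
```

As the comment describes, this makes zooming effectively a two-step process, since each load() re-indexes the chart against the newly sampled slice.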
This approach of course requires a few more code adjustments than simply plotting all the data points, but it will give you better performance on low(er)-end hardware and browsers, and it will still work when your data grows. [5]
[1] https://www.base.is/flot/
[2] https://blog.scottlogic.com/2015/11/16/sampling-large-data-in-d3fc.html
[3] https://gist.github.com/DanielWJudge/63300889f27c7f50eeb7
[4] https://naver.github.io/billboard.js/release/latest/doc/Options.html#.zoom
[5] Apart from the rendering problems, there appears to be a hard limit for some array size somewhere between 10,000 and 100,000 points (I forgot the exact number), where d3 will just fail and throw errors if you cross it.
Pan isn’t supported for the drag-type zoom.
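For reference, panning is available with the wheel-type zoom interaction (billboard.js's default), so switching away from drag selection is the usual workaround. A minimal sketch of that option, with dummy column values:

```ts
import { bb, line, zoom } from "billboard.js";

bb.generate({
  bindto: "#chart",
  data: {
    type: line(),
    columns: [["data1", 30, 200, 100, 400, 150, 250]],
  },
  zoom: {
    enabled: zoom(),
    // "wheel" zooms with the mouse wheel and pans by dragging the plot area;
    // "drag" zooms into a dragged selection but, as noted above, cannot pan.
    type: "wheel",
  },
});
```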