Batch write multiple points
Influx supports writing multiple points at once, and we would probably get much better performance with such batching.
I think the easiest approach is to rewrite writePoint so that it can take an array of points. The code should not need to change much.
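A minimal sketch of what such a batched call could look like with the node-influx client. The measurement, tag, and field names here are illustrative, and the client setup and actual network call are omitted:

```javascript
// Build an array of point objects so they can be written in one request
// instead of one request per point (names below are illustrative).
const points = [
  { measurement: 'cpu', tags: { host: 'server01' }, fields: { load: 0.64 } },
  { measurement: 'cpu', tags: { host: 'server02' }, fields: { load: 0.72 } },
];

// With a configured node-influx client this would be a single call:
// influx.writePoints(points).then(() => console.log('batch written'));

console.log(points.length); // 2
```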
Issue Analytics
- State:
- Created 10 years ago
- Comments:16 (8 by maintainers)
You can provide an array of points to .writePoints… https://node-influx.github.io/class/src/index.js~InfluxDB.html#instance-method-writePoints
@bencevans thank you for the support; it works as expected now. The error message
*.forEach is not a function
was just a little confusing to me: I had not passed the data in the right format. After adjusting the points data to objects inside an array (not nested arrays) and adjusting the schema accordingly, it batch writes to InfluxDB. As expected, this is a lot faster.
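To illustrate the mistake described above, here is a hypothetical sketch (the names are made up) of the payload shape that triggers "*.forEach is not a function" versus the array-of-objects shape that writePoints expects:

```javascript
// Wrong: a bare point object (or nested arrays) instead of an array
// of point objects. A plain object has no .forEach method.
const wrongShape = { measurement: 'temperature', fields: { value: 21.5 } };

// Right: an array of plain point objects matching the schema.
const rightShape = [
  { measurement: 'temperature', tags: { unit: 'C' }, fields: { value: 21.5 } },
  { measurement: 'temperature', tags: { unit: 'F' }, fields: { value: 70.7 } },
];

console.log(typeof wrongShape.forEach); // undefined
console.log(Array.isArray(rightShape)); // true
```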
Edit: btw, I had to set
max-body-size = 0
in /etc/influxdb/influxdb.conf
to prevent the "413 request entity too large" error.
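For reference, a sketch of where that setting lives in an InfluxDB 1.x config file (it belongs in the [http] section; a value of 0 disables the request body size limit):

```toml
# /etc/influxdb/influxdb.conf (InfluxDB 1.x)
[http]
  # Maximum accepted size of an HTTP request body. 0 disables the limit,
  # which avoids "413 Request Entity Too Large" on large batched writes.
  max-body-size = 0
```

Restart the influxdb service after changing the config for the setting to take effect.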