Payload does not work well with large csv files
I'm trying to use Artillery with a large CSV file. I've noticed that it hangs when it tries to parse the file. Perhaps streaming the contents of the file would be preferable over using readFileSync?
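For illustration, here is a minimal sketch of the streaming approach being suggested, using Node's fs.createReadStream piped into the csv-parse package. This is not Artillery's actual implementation; the helper name and file path are hypothetical, and csv-parse v5 is assumed.

```js
// Illustrative sketch only: stream-parse a CSV instead of reading the
// whole file with readFileSync. Assumes csv-parse v5 (npm install csv-parse).
const fs = require('fs');
const { parse } = require('csv-parse');

// Hypothetical helper: calls onRow for each parsed record and onDone when
// the file has been fully consumed, without buffering the entire file.
function loadPayload(filePath, onRow, onDone) {
  fs.createReadStream(filePath)     // read the file in chunks
    .pipe(parse({ trim: true }))    // emit one record per CSV row
    .on('data', onRow)
    .on('end', onDone)
    .on('error', (err) => { console.error(err); });
}

// Usage: count rows of a large file with roughly constant memory use.
let count = 0;
loadPayload('payload.csv', () => { count += 1; }, () => {
  console.log(`parsed ${count} rows`);
});
```

Because rows are handled as they arrive, memory use stays roughly constant regardless of file size, which is the advantage over buffering the whole file up front.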
Issue Analytics
- State:
- Created 7 years ago
- Comments: 6 (3 by maintainers)
Top Results From Across the Web
- Handling a large csv file (700-800 MB) which needs to be ...
  "I am using Mulesoft Community edition Runtime 4.4. Our use case is: Receive a CSV file (about 700-800 MB..."
- Memory issues: parse a large csv file, transform to json and ...
  "This means that if you use your payload to log the number of rows like this sizeOf(payload), it will consume the payload..."
- Memory issues processing and writing a large file - General
  "I'm trying to traverse through a large CSV file. Each line, I do some data remodelling and convert it to XML format."
- Processing a large CSV file with a lambda, line by line : r/aws
  "Well, the first issue is that you only have 512 MB of temp space for the file. Are you planning to put the..."
- sapui5 file upload error "413 Payload Too Large"
  "Smaller csv files work fine, but if the csv file is above 1 MB, it's throwing an error. I am sending the data in..."
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
We need the data as it is in the CSV to reliably test our scenarios, so chancejs isn't an option, unfortunately.
Hi, any updates on this ticket? I have a CSV with over 650,000 rows which is needed for testing.