Pass in an array of data [bulk insert] using an insert statement
Hi,
How can I pass in an array of 1,000 elements and have pg
do a bulk insert into the database?
I am trying to solve this problem brought up here, but using Postgres instead:
var sql = 'insert into ' + table +
    ' (Month, Merchant_Id, Merchant_Descriptor, Network, Qualification_Code, Transaction_Type,' +
    ' Issuer_Type, Card_Type, Txn_Count, Txn_Amount,' +
    ' Interchange) values ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11)';

// split the large array into chunks of 1,000 rows each
var reduceArray = function(arr, cb){
    var size = 1000,
        small_array = [];
    for (var i = 0; i < arr.length; i += size){
        var chunk = arr.slice(i, i + size);
        small_array.push(chunk);
    }
    cb(small_array);
};

psql.connect();
reduceArray(large_array, function(small_array){
    db(small_array);
});
db insert:
var db = function(small_array){
    async.times(small_array.length, function(n, next){
        // `data` is an entire chunk of rows, but the statement above
        // has eleven scalar placeholders, so this call fails
        var data = small_array[n];
        psql.query(sql, [data], function(err, result){
            next(err, result);
        });
    }, function(err){
        console.log(err);
    });
};
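One way to make the per-chunk query work — a sketch, assuming each row in a chunk is an 11-element array in the column order above — is to generate a placeholder group per row and flatten all the values into a single parameter list:

var insertChunk = function(chunk, done){
    var params = [];
    // build "($1,$2,...,$11)" for row 0, "($12,...,$22)" for row 1, etc.,
    // pushing each value onto the flat parameter list as we go
    var groups = chunk.map(function(row, r){
        var placeholders = row.map(function(val, c){
            params.push(val);
            return '$' + (r * 11 + c + 1);
        });
        return '(' + placeholders.join(',') + ')';
    });
    var text = 'insert into ' + table +
        ' (Month, Merchant_Id, Merchant_Descriptor, Network, Qualification_Code, Transaction_Type,' +
        ' Issuer_Type, Card_Type, Txn_Count, Txn_Amount, Interchange)' +
        ' values ' + groups.join(',');
    psql.query(text, params, done);
};

Postgres caps a single statement at 65,535 bind parameters, so chunks of 1,000 rows times 11 columns stay well under the limit.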
Top GitHub Comments
Yeah if you really need maximum speed you can string concatenate all the values into a single query - but absolutely do not do that for untrusted input. String concatenation is 99% of the time bad, bad, bad for sql.
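If you do build a single statement, one middle ground (not mentioned in this thread) is the pg-format package, which escapes each value as a SQL literal while producing a multi-row VALUES list. A sketch against a hypothetical two-column merchants table:

var format = require('pg-format');

// hypothetical table and data; %L escapes every value as a literal,
// so this is safer than raw concatenation, though parameterized
// queries or COPY remain preferable for untrusted input
var rows = [[1, 'alpha'], [2, 'beta']];
var sql = format('insert into merchants (id, name) values %L', rows);
psql.query(sql, function(err, result){
    if (err) console.log(err);
});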
You could also look at https://github.com/brianc/node-pg-copy-streams but it also does not sanitize the input. It doesn’t suffer from the same sql injection attacks because it doesn’t use a sql statement to do inserts, but there could still be issues w/ malicious input.
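A minimal pg-copy-streams sketch, again against a hypothetical merchants table (COPY's default text format is tab-delimited, one row per line, and as the comment says the input is not sanitized):

var copyFrom = require('pg-copy-streams').from;

var stream = psql.query(copyFrom('COPY merchants (id, name) FROM STDIN'));
stream.on('error', function(err){ console.log(err); });
stream.on('finish', function(){ console.log('copy complete'); });
stream.write('1\talpha\n');
stream.write('2\tbeta\n');
stream.end();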
On Tue, Oct 27, 2015 at 4:18 PM, John Skilbeck notifications@github.com wrote:
Ok sounds good. Yeah, the way I did it was just using .map … I thought that this might create unnecessary overhead however (single insert statement for each record)… Was thinking there might be savings if you could just have one insert statement with an array of records. An insert statement for each record also works though; was just trying to save fractions of a second.
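For the "one insert statement with an array of records" idea, Postgres's unnest gives a fully parameterized route: pass one array per column and let the server zip them back into rows. A sketch, assuming the same hypothetical two-column table:

// split rows into one array per column; node-postgres serializes
// JavaScript arrays into Postgres array literals automatically
var ids = [], names = [];
rows.forEach(function(row){
    ids.push(row[0]);
    names.push(row[1]);
});
psql.query(
    'insert into merchants (id, name) select * from unnest($1::int[], $2::text[])',
    [ids, names],
    function(err, result){
        if (err) console.log(err);
    }
);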