
Pass in an array of data [bulk insert] using an insert statement

See original GitHub issue

Hi,

How can I pass in an array of 1,000 elements and have pg do a bulk insert into the database?

I am trying to solve this problem brought up here, but using Postgres instead.

var sql = 'insert into '+table+
    ' (Month, Merchant_Id, Merchant_Descriptor, Network, Qualification_Code, Transaction_Type, '+ 
    ' Issuer_Type, Card_Type, Txn_Count, Txn_Amount, ' +
    'Interchange) values ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11)';

// Chunk the big array, then hand each chunk to the insert routine.
// (Defined before it is used: function expressions are not hoisted.)
var reduceArray = function(arr, cb){
    var size = 1000,
        small_array = [];

    for (var i = 0; i < arr.length; i += size){
        var chunk = arr.slice(i, i + size);
        small_array.push(chunk);
    }
    cb(small_array);
};

psql.connect();
reduceArray(large_array, function(small_array){
    db(small_array);
});

db insert:

var db = function(small_array){
    async.times(small_array.length, function(n, next){
        // Try to insert an entire 1,000-row chunk with one query;
        // the statement above only has 11 placeholders (one row's worth),
        // which is exactly the problem being asked about.
        var data = small_array[n];
        psql.query(sql, [data], function(err, result){
            next(err, result);
        });
    }, function(err){
        console.log(err);
    });
};

Issue Analytics

  • State: closed
  • Created: 8 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
brianc commented, Oct 27, 2015

Yeah, if you really need maximum speed you can string concatenate all the values into a single query - but absolutely do not do that for untrusted input. String concatenation is 99% of the time bad, bad, bad for SQL.
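For completeness, here is one way to get a single statement per chunk without concatenating values, not from this thread: generate the $n placeholders to match the number of rows, so node-postgres still does all the escaping. The buildBulkInsert helper and the shortened column list below are illustrative assumptions, and each row is assumed to be an array with one value per column:

// Hypothetical helper: build one parameterized INSERT for a chunk of rows.
var buildBulkInsert = function(table, columns, rows){
    var placeholders = rows.map(function(row, rowIndex){
        // Row 0 gets $1..$11, row 1 gets $12..$22, and so on.
        var params = row.map(function(_, colIndex){
            return '$' + (rowIndex * row.length + colIndex + 1);
        });
        return '(' + params.join(', ') + ')';
    });
    var text = 'insert into ' + table + ' (' + columns.join(', ') + ')' +
        ' values ' + placeholders.join(', ');
    // pg expects one flat values array covering every placeholder.
    var values = [].concat.apply([], rows);
    return { text: text, values: values };
};

var query = buildBulkInsert(table,
    ['Month', 'Merchant_Id', /* ...the other 8 columns... */ 'Interchange'],
    chunk);
psql.query(query.text, query.values, function(err, result){
    if (err) console.log(err);
});

Note that Postgres caps a single statement at 65,535 bind parameters, so chunks of 1,000 rows with 11 columns (11,000 parameters) fit comfortably; much wider chunks would need to be smaller.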

You could also look at https://github.com/brianc/node-pg-copy-streams but it also does not sanitize the input. It doesn’t suffer from the same sql injection attacks because it doesn’t use a sql statement to do inserts, but there could still be issues w/ malicious input.
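For reference, a rough sketch of what the COPY approach looks like; the table name, the assumption that client is an already-connected pg client, and the tab-separated formatting are all illustrative, and depending on the library version you may need to listen for 'end' rather than 'finish':

var copyFrom = require('pg-copy-streams').from;

// Assumes `client` is an already-connected pg client.
var stream = client.query(copyFrom('COPY merchant_txns FROM STDIN'));
stream.on('error', console.log);
stream.on('finish', function(){ console.log('copy complete'); });

large_array.forEach(function(row){
    // COPY text format: tab-separated columns, one row per line.
    // Real data would need tabs/newlines/backslashes escaped first.
    stream.write(row.join('\t') + '\n');
});
stream.end();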


0 reactions
skilbjo commented, Oct 27, 2015

Ok sounds good. Yeah, the way I did it was just using .map … I thought that this might create unnecessary overhead however (single insert statement for each record)…

Was thinking there might be savings if you could just have one insert statement with an array of records. An insert statement for each record also works though; was just trying to save fractions of a second.

data.map(function(item,index){
  psql.query(sql, item, function(err,result){
    if(err) console.log(err);
  });
});
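Another common Postgres idiom for exactly this "one insert statement with an array of records" shape, not mentioned in the thread, is to pass one array per column and let unnest() zip them back into rows server-side (the multi-array form needs Postgres 9.4+). A sketch with just two of the columns, assuming each row is an object with those fields:

// Two parameters total, regardless of row count: one Postgres array per column.
var sql = 'insert into txns (merchant_id, txn_amount) ' +
          'select * from unnest($1::text[], $2::numeric[])';

var merchantIds = data.map(function(row){ return row.merchant_id; });
var amounts     = data.map(function(row){ return row.txn_amount; });

// node-postgres serializes JS arrays to Postgres array literals.
psql.query(sql, [merchantIds, amounts], function(err, result){
    if (err) console.log(err);
});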

