
I have the following Node.js code that uses the pg library (https://github.com/brianc/node-postgres). My code to create a subscription for an employee looks like this:

    client.query(
      'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
      'VALUES ($1, $2, $3)',
      [
        datasetArr[0].subscription_guid,
        datasetArr[0].employer_guid,
        datasetArr[0].employee_guid
      ],
      function (err, result) {
        // Release the client back to the pool
        done();

        if (err) {
          set_response(500, err, res);
          logger.error('error running query', err);
          return console.error('error running query', err);
        }

        logger.info('subscription created');
        set_response(201);
      });

As you have already noticed, datasetArr is an array. I would like to create subscriptions for more than one employee at a time, without looping through the array myself. Is there a way to do this out of the box with pg?

Use whatever interface node offers to PostgreSQL's COPY command. – Richard Huxton Jun 3 '14 at 8:16
@RichardHuxton: As per postgresql.org/docs/9.1/static/sql-copy.html the 'COPY' command works only with STDIN (csv/file upload). How do I get it to work with an array? – lonelymo Jun 3 '14 at 9:33
I don't know – that's why it's a comment, not an answer. You'll need to read the documentation for the node-postgres library. – Richard Huxton Jun 3 '14 at 9:42
Wrap this all up in a transaction and execute the entire sequence at once: github.com/vitaly-t/pg-promise/wiki/… – vitaly-t Jun 28 at 0:42
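Following up on the COPY suggestion: the pg-copy-streams package (a separate module from node-postgres; treat the exact API shown in the comment below as an assumption to verify against its README) lets you stream data into `COPY ... FROM STDIN`. To feed an array through it, each row has to be serialized into PostgreSQL's tab-delimited text format. A minimal serializer for that format might look like:

```javascript
// Serialize rows into PostgreSQL COPY text format: one line per row,
// columns separated by tabs, NULL encoded as \N, and backslash, tab,
// newline and carriage return escaped.
function toCopyText(rows, columns) {
  return rows.map(function (row) {
    return columns.map(function (col) {
      var val = row[col];
      if (val === null || val === undefined) return '\\N';
      return String(val)
        .replace(/\\/g, '\\\\')
        .replace(/\t/g, '\\t')
        .replace(/\n/g, '\\n')
        .replace(/\r/g, '\\r');
    }).join('\t');
  }).join('\n') + '\n';
}

// Hypothetical usage with pg-copy-streams (names assumed, not verified here):
//   var copyFrom = require('pg-copy-streams').from;
//   var stream = client.query(copyFrom(
//     'COPY subscriptions (subscription_guid, employer_guid, employee_guid) FROM STDIN'));
//   stream.write(toCopyText(datasetArr,
//     ['subscription_guid', 'employer_guid', 'employee_guid']));
//   stream.end();
```

For large batches COPY is generally the fastest route, since the server parses the rows in bulk instead of planning one INSERT per row.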
Accepted answer

I searched for the same question but found no ready-made solution yet. With the async library it is very simple to run the query several times and do the necessary error handling.

Maybe this code variant helps (inserting 10,000 small JSON objects into an empty database took 6 seconds):

Christoph

function insertData(item, callback) {
  client.query(
    'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
    'VALUES ($1, $2, $3)',
    [
      item.subscription_guid,
      item.employer_guid,
      item.employee_guid
    ],
    function (err, result) {
      // Pass any error on to the async.each iterator
      callback(err);
    });
}

async.each(datasetArr, insertData, function (err) {
  // Release the client to the pg module
  done();
  if (err) {
    set_response(500, err, res);
    logger.error('error running query', err);
    return console.error('error running query', err);
  }
  logger.info('subscriptions created');
  set_response(201);
});

Create your data structure as:

[ [val1,val2],[val1,val2] ...]

Then convert it into a string:

 JSON.stringify([['a','b'],['c']]).replace(/\[/g,"(").replace(/\]/g,")").replace(/"/g,'\'').slice(1,-1)

Append it to the query and you are done!

Agreed, it has string-parsing costs, but it's way cheaper than single inserts.
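A safer variant of the same idea keeps the values parameterized and only generates the `($1,$2,$3),($4,$5,$6),...` placeholder list, avoiding the manual quoting entirely. A sketch (the helper name is my own, not part of pg):

```javascript
// Build a single parameterized multi-row INSERT from an array of rows.
// Returns { text, values } suitable for client.query(text, values).
function buildMultiInsert(table, columns, rows) {
  var values = [];
  var placeholders = rows.map(function (row, i) {
    var group = columns.map(function (col, j) {
      values.push(row[col]);
      // Placeholders are numbered $1, $2, ... across all rows
      return '$' + (i * columns.length + j + 1);
    });
    return '(' + group.join(',') + ')';
  });
  return {
    text: 'INSERT INTO ' + table + ' (' + columns.join(', ') + ') VALUES ' +
          placeholders.join(', '),
    values: values
  };
}
```

Usage would be something like `client.query(q.text, q.values, callback)` where `q = buildMultiInsert('subscriptions', ['subscription_guid', 'employer_guid', 'employee_guid'], datasetArr)`. Note that PostgreSQL caps the number of bind parameters per statement, so very large arrays still need to be chunked.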

