
In Oracle OCI and OCCI there are API facilities to perform array inserts: you build up an array of values in the client and send it, along with a prepared statement, to the server to insert thousands of rows into a table in a single shot, resulting in huge performance improvements in some scenarios. Is there anything similar in PostgreSQL?

I am using the stock PostgreSQL C API.

Some pseudocode to illustrate what I have in mind:

stmt = con->prepare("INSERT INTO mytable VALUES ($1, $2, $3)");
pg_c_api_array arr(stmt);
for triplet(a, b, c) in mylongarray:
    pg_c_api_variant var = arr.add();
    var.bind(1, a);
    var.bind(2, b);
    var.bind(3, c);
stmt->bindarray(arr);
stmt->exec()
  • What language/PostgreSQL client API are you using? This can make a difference as to what features can easily be used. Commented May 11, 2012 at 9:44

3 Answers


PostgreSQL has similar functionality: the COPY statement and the COPY API. It is very fast.

libpq documentation

/* Assumes an open connection PGconn *pconn and
 * #include <stdio.h>, <stdlib.h>, <string.h>, <libpq-fe.h>. */
const char *data = "10\t20\t30\n20\t30\t40\n";  /* tab-delimited rows */
PGresult *pres;

pres = PQexec(pconn, "COPY mytable FROM stdin");
if (PQresultStatus(pres) != PGRES_COPY_IN)
{
    fprintf(stderr, "Could not start COPY: %s\n",
            PQerrorMessage(pconn));
    exit(1);
}
PQclear(pres);

/* can be called repeatedly */
if (PQputCopyData(pconn, data, strlen(data)) != 1)
{
    fprintf(stderr, "Copy to target table failed: %s\n",
            PQerrorMessage(pconn));
    exit(1);
}

if (PQputCopyEnd(pconn, NULL) == -1)
{
    fprintf(stderr, "Copy to target table failed: %s\n",
            PQerrorMessage(pconn));
    exit(1);
}

pres = PQgetResult(pconn);
if (PQresultStatus(pres) != PGRES_COMMAND_OK)
{
    fprintf(stderr, "Copy to target table failed: %s\n",
            PQerrorMessage(pconn));
    exit(1);
}
PQclear(pres);
  • So is the data always rows of tab-delimited values? I take it that if one column is varchar, one must enclose it in quotes or perhaps call a PQ escape function? Commented May 11, 2012 at 14:09
  • You can use any character as the delimiter; tab is the default. See the COPY docs: postgresql.org/docs/9.1/static/sql-copy.html. A sketch of value escaping follows below. Commented May 11, 2012 at 15:56
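To expand on the escaping point: COPY's default text format does not use quoting at all; instead backslash, the delimiter (tab by default), newline, and carriage return are backslash-escaped, and a SQL NULL is written as \N. A minimal sketch of such an escaper (copy_escape is a hypothetical helper, not a libpq function):

#include <string.h>

/* Hypothetical helper: escape one value for COPY's default text
 * format.  Backslash, tab, newline and carriage return are
 * backslash-escaped; a SQL NULL is emitted as \N.  The out buffer
 * must hold at least 2 * strlen(in) + 1 bytes (3 bytes for NULL). */
static void copy_escape(const char *in, char *out)
{
    if (in == NULL) {
        strcpy(out, "\\N");
        return;
    }
    for (; *in; in++) {
        switch (*in) {
        case '\\': *out++ = '\\'; *out++ = '\\'; break;
        case '\t': *out++ = '\\'; *out++ = 't';  break;
        case '\n': *out++ = '\\'; *out++ = 'n';  break;
        case '\r': *out++ = '\\'; *out++ = 'r';  break;
        default:   *out++ = *in;                 break;
        }
    }
    *out = '\0';
}

Run each varchar value through an escaper like this before joining the columns with tabs and appending the line to the COPY buffer.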

As Pavel Stehule points out, there is the COPY command and, when using libpq in C, associated functions for transmitting the copy data. I haven't used those. I mostly program against PostgreSQL in Python and have used similar functionality from psycopg2. It's extremely simple:

conn = psycopg2.connect(CONN_STR)
cursor = conn.cursor()
f = open('data.tsv')
cursor.copy_from(f, 'incoming')
conn.commit()
f.close()

In fact I've often replaced open with a file-like wrapper object that performs some basic data cleaning first. It's pretty seamless.

  • It's a pity that one needs to put the data into a file first. In my case I receive an encoded file, decode values from the file, and want to insert them directly from memory. Putting them back into a new (PostgreSQL COPY-formatted) file seems unfortunate. Commented May 11, 2012 at 13:13
  • No, you can use COPY with a stream or directly with data lines in a string; see the sketch below. Commented May 11, 2012 at 13:36
  • It doesn't have to be in a string. One can pass binary data too: gist.github.com/ictlyh/12fe787ec265b33fd7e4b0bd08bc27cb Commented Jul 20, 2020 at 3:28
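To make the no-intermediate-file route concrete, here is a minimal libpq sketch, assuming an open PGconn *pconn, a table mytable with three integer columns, and a hypothetical struct row holding already-decoded values; each row is formatted in memory and pushed straight to the server:

#include <stdio.h>
#include <libpq-fe.h>

/* Hypothetical container for already-decoded values. */
struct row { int a; int b; int c; };

static int copy_rows(PGconn *pconn, const struct row *rows, size_t n)
{
    PGresult *res = PQexec(pconn, "COPY mytable FROM stdin");
    if (PQresultStatus(res) != PGRES_COPY_IN) {
        PQclear(res);
        return -1;
    }
    PQclear(res);

    for (size_t i = 0; i < n; i++) {
        /* one tab-delimited, newline-terminated line per row */
        char line[64];
        int len = snprintf(line, sizeof line, "%d\t%d\t%d\n",
                           rows[i].a, rows[i].b, rows[i].c);
        if (PQputCopyData(pconn, line, len) != 1)
            return -1;
    }

    if (PQputCopyEnd(pconn, NULL) != 1)
        return -1;

    res = PQgetResult(pconn);
    int ok = (PQresultStatus(res) == PGRES_COMMAND_OK) ? 0 : -1;
    PQclear(res);
    return ok;
}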

I like this way of creating thousands of rows in a single command:

INSERT INTO mytable VALUES (UNNEST($1), UNNEST($2), UNNEST($3));

Bind an array of the values of column 1 to $1, an array of the values of column 2 to $2, and so on. Providing the values in columns may seem a bit strange at first when you are used to thinking in rows.

You need PostgreSQL >= 8.4 for UNNEST, or your own function to convert arrays into sets.
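A minimal libpq sketch of the binding, assuming an open PGconn *pconn and three integer columns: the arrays are passed as text-format array literals, and the ::int[] casts let the server resolve the parameter types (the literal values are illustrative).

#include <stdio.h>
#include <libpq-fe.h>

void unnest_insert(PGconn *pconn)
{
    const char *params[3] = {
        "{1,2,3}",       /* values for column 1 */
        "{10,20,30}",    /* values for column 2 */
        "{100,200,300}"  /* values for column 3 */
    };

    PGresult *res = PQexecParams(pconn,
        "INSERT INTO mytable VALUES "
        "(UNNEST($1::int[]), UNNEST($2::int[]), UNNEST($3::int[]))",
        3,      /* number of parameters */
        NULL,   /* let the server deduce parameter types */
        params,
        NULL,   /* text format: lengths not needed */
        NULL,   /* all parameters in text format */
        0);     /* ask for text-format results */

    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "insert failed: %s\n", PQerrorMessage(pconn));
    PQclear(res);
}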
