I want to copy a CSV file to a Postgres table. There are about 100 columns in this table, so I do not want to rewrite them if I don't have to.

I am using the \copy table from 'table.csv' delimiter ',' csv; command, but without the table created first I get ERROR: relation "table" does not exist. If I add a blank table I get no error, but nothing happens. I tried this command two or three times and there was no output or messages, but when I checked the table through pgAdmin it had not been updated.

Is there a way to import the CSV into a table with the headers included, the way I am trying to do?

Your table is named table? Very confusing. Does the table exist, or do you want to create it based on the CSV? (You can't.) – wildplasser Jul 15 '13 at 19:57
Well, I named it something else, but for this example let's call it table. I tried with and without it existing; I also tried \copy table(column1, column2, ...) from 'table.csv' delimiter ',' csv; with no luck either. Ideally the table could be created from the CSV alone, using the headers in that file. – Peppered Lemons Jul 15 '13 at 20:11
Related: stackoverflow.com/questions/2987433/… – G. Cito Nov 19 '14 at 4:49
Just a heads up for anyone planning on turning a large CSV into a Postgres table -- Postgres is capped at 1600 columns in a single table. You cannot chunk tables into 1600-column-sized ones and then join them after; you need to redesign the DB. – Achekroud Nov 18 '15 at 15:52

This worked. The first row had column names in it.

COPY wheat FROM 'wheat_crop_data.csv' DELIMITER ';' CSV HEADER
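If you are not the DB superuser, a roughly equivalent client-side form in psql is \copy, which reads the file from the client machine instead of the server (see the comments below). A minimal sketch, assuming the table is created first to match the CSV's columns; the column definitions here are hypothetical:

CREATE TABLE wheat (variety text, yield_tonnes numeric);  -- hypothetical columns; match them to the CSV header
\copy wheat FROM 'wheat_crop_data.csv' DELIMITER ';' CSV HEADER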
I think the problem with this command is that you have to be the DB superuser. \copy works as a normal user, too. – Exocom Feb 8 '14 at 13:00
COPY does not create a table or add columns to it, it adds rows to an existing table with its existing columns. Presumably the asker wants to automate the creation of the ~100 columns, and COPY does not have this functionality, as of PG 9.3 at least. – Daniel Vérité Jul 28 '14 at 15:00
@Exocom good catch. Since I'm never an admin or superuser for DBs on the Postgres systems I use (pgAdmin makes me owner of the databases I use and gives me limited privileges/roles), I must have used `\COPY`. Cheers – G. Cito Jul 28 '14 at 16:11
@Daniel I understood the user's table already existed and had all the columns they needed and that they wanted to simply ADD data. – G. Cito Jul 28 '14 at 16:16

With the Python library pandas, you can easily create column names and infer data types from a CSV file.

from sqlalchemy import create_engine
import pandas as pd

# Connect to the target Postgres database
engine = create_engine('postgresql://user:pass@localhost/db_name')

# Column names come from the CSV header row; dtypes are inferred
df = pd.read_csv('/path/to/csv_file')

# Create the table (here named 'pandas_db') and load the data into it
df.to_sql('pandas_db', engine)

The if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql('pandas_db', engine, if_exists='replace'). This works for other input file types as well, since pandas has readers for Excel, JSON, and more.
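For example, to load further CSV files into the same table later, a minimal sketch reusing the engine and table name from the snippet above (the second file path is hypothetical):

# Append rows from another CSV to the existing 'pandas_db' table
more_rows = pd.read_csv('/path/to/another_csv_file')
more_rows.to_sql('pandas_db', engine, if_exists='append')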

I find that pd.DataFrame.from_csv gives me less trouble, but this answer is by far the easiest way to do this, IMO. – brock Oct 8 '15 at 14:34
True, I am not sure why I typed pd.read_excel instead of pd.read_csv. I updated the answer. – cheflo Oct 9 '15 at 1:30
This is a fantastic solution for when you do not want to pre-create the table that will hold a large CSV. Just a heads up though -- Postgres can only take 1600 columns in a table. Apparently other DB engines will allow more. Having this many columns is apparently poor SQL form, although this consensus has yet to filter through to epidemiology. – Achekroud Nov 18 '15 at 15:51
