I am trying to import big data from a CSV file into a PostgreSQL table. I found some posts saying the following code would do it:
>CREATE TABLE mytable();
>COPY mytable FROM 'C:/myCSVfile.csv' DELIMITER ',' CSV;
However, this code raised an error saying:
ERROR: extra data after last expected column
CONTEXT: COPY mytable, line 1: "Name, Gender, Age, Email, ... "
If I typed in
>COPY mytable FROM 'C:/myCSVfile.csv' DELIMITER ',' CSV HEADER;
instead, I got the following error:
ERROR: extra data after last expected column
CONTEXT: COPY mytable, line 2: "King Kong, M, 10, [email protected], ..."
I know that specifying columns like
>CREATE TABLE mytable(Name varchar(20), Gender char(1), Age int, Email varchar(40), ...);
>COPY mytable(Name, Gender, Age, Email, ...) FROM 'C:/myCSVfile.csv' DELIMITER ',' CSV HEADER;
would give me the PostgreSQL table I want.
But the CSV file has a large number of columns (over 250), so I need to import the data without specifying each column as above.
Would somebody help me out?
Thanks in advance.
COPY mytable FROM 'absolute/path/to/csvfile.csv' WITH CSV HEADER;
is enough. Note: the mytable field(s) and the CSV file header(s) should be the same. – wingedpanther Jul 28 '14 at 8:56
Is it a *.CSV file? – wingedpanther Jul 28 '14 at 8:57
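
For the 250+ columns part, here is a minimal sketch of one way to avoid typing the column list by hand: read the column names from the CSV header and generate the CREATE TABLE statement before running COPY. This is not from the original post; it assumes Python with psycopg2 is available, that loading every column as text is acceptable, and that the connection parameters (dbname, user, password) are placeholders. Only the file path C:/myCSVfile.csv and the table name mytable come from the question.

# Minimal sketch: build CREATE TABLE from the CSV header so the 250+
# columns never have to be typed by hand, then load the file with COPY.
# Every column is created as text here; types can be tightened later.
import csv
import psycopg2

CSV_PATH = 'C:/myCSVfile.csv'   # path taken from the question
TABLE = 'mytable'               # table name taken from the question

# Read only the header row to get the column names.
with open(CSV_PATH, newline='') as f:
    header = next(csv.reader(f))

# Quote each header as an identifier; strip stray spaces around names.
columns = ', '.join('"{}" text'.format(name.strip()) for name in header)

# Connection parameters below are placeholders, not from the question.
conn = psycopg2.connect(dbname='mydb', user='postgres', password='secret')
with conn, conn.cursor() as cur:
    cur.execute('CREATE TABLE {} ({});'.format(TABLE, columns))
    # copy_expert streams the file through the client connection, so it
    # also works when the CSV sits on the client rather than the server.
    with open(CSV_PATH, newline='') as f:
        cur.copy_expert(
            'COPY {} FROM STDIN WITH (FORMAT csv, HEADER true);'.format(TABLE),
            f)
conn.close()

Note that HEADER in COPY only tells PostgreSQL to skip the first line of the file; it does not match columns by name, so the table's column order must match the CSV's column order either way.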