How to import CSV file data into a PostgreSQL table?











491 votes, 168 favorites












How can I write a stored procedure that imports data from a CSV file and populates the table?










postgresql csv postgresql-copy

asked Jun 7 '10 at 6:19 by vardhan, edited Apr 9 '17 at 18:59 by Erwin Brandstetter




















  • 13




    Why a stored procedure? COPY does the trick
    – Frank Heikens
    Jun 7 '10 at 6:43










  • I have a user interface that uploads the CSV file; to hook this up I need a stored procedure that actually copies the data from the CSV file.
    – vardhan
    Jun 7 '10 at 6:58






  • 3




    Could you elaborate on how to use COPY?
    – vardhan
    Jun 7 '10 at 7:10






  • 15




    Bozhidar Batsov already gave you a link to an example, the fine manual could also help: postgresql.org/docs/8.4/interactive/sql-copy.html
    – Frank Heikens
    Jun 7 '10 at 7:11






  • 4




    Current manual: postgresql.org/docs/current/static/sql-copy.html
    – Basil Bourque
    Aug 9 '14 at 20:37


















13 Answers

















686 votes













Take a look at this short article.





Solution paraphrased here:



Create your table:



CREATE TABLE zip_codes 
(ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


Copy data from your CSV file to the table:



COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);
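For a quick sanity check after the load, you can count and peek at the rows. A minimal sketch (the sample CSV line below is hypothetical, not taken from the linked article):

-- hypothetical first line of ZIP_CODES.txt:
-- 00210,43.005895,-71.013202,PORTSMOUTH,NH,ROCKINGHAM,STANDARD

SELECT count(*) FROM zip_codes;
SELECT * FROM zip_codes LIMIT 5;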





answered Jun 7 '10 at 6:24 by Bozhidar Batsov, edited Aug 11 '17 at 7:16 by dylanfprice



















  • 41




    Actually, \copy would do the same trick if you do not have superuser access; it complains on my Fedora 16 when using COPY with a non-root account.
    – asksw0rder
    Oct 15 '12 at 17:07






  • 78




    TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
    – David Pelaez
    Jan 2 '13 at 5:16






  • 3




    @asksw0rder does \copy have the same syntax? Because I'm getting a syntax error with \copy.
    – JhovaniC
    May 29 '13 at 19:59






  • 6




    Should I include the header row?
    – bernie2436
    Oct 27 '13 at 23:09






  • 100




    You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
    – Barrett Clark
    Nov 8 '13 at 15:17




















154 votes













If you don't have permission to use COPY (which works on the db server), you can use \copy instead (which works in the db client). Using the same example as Bozhidar Batsov:



Create your table:



CREATE TABLE zip_codes 
(ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


Copy data from your CSV file to the table:



\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV


You can also specify the columns to read:



\copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
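If your file has a header row, \copy forwards the same options COPY accepts, so you can skip the header with the HEADER option. A minimal sketch using the option-list form:

\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv, HEADER)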





answered Jun 20 '15 at 7:26 by bjelli, edited Jan 27 '17 at 9:02























  • \copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
    – JZ.
    Sep 6 '15 at 17:29










  • @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
    – alex bennett
    Jul 8 '16 at 4:32






  • 4




    This is somewhat misleading: the difference between COPY and \copy is much more than just permissions, and you can't simply add a `\` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
    – IMSoP
    Jan 26 '17 at 16:29










  • @IMSoP: you're right, I added a mention of server and client to clarify
    – bjelli
    Jan 27 '17 at 9:03










  • @bjelli is \copy slower than COPY? I have a 1.5MB file and a db.m4.large instance on RDS, and this copy command has been running for hours (at least 3).
    – Sebastian
    May 28 at 22:38


















61 votes













One quick way of doing this is with the Python pandas library (version 0.15 or above works best). This will handle creating the columns for you - although obviously the choices it makes for data types might not be what you want. If it doesn't quite do what you want you can always use the 'create table' code generated as a template.



Here's a simple example:



import pandas as pd
df = pd.read_csv('mypath.csv')
df.columns = [c.lower() for c in df.columns] #postgres doesn't like capitals or spaces

from sqlalchemy import create_engine
engine = create_engine('postgresql://username:password@localhost:5432/dbname')

df.to_sql("my_table_name", engine)


And here's some code that shows you how to set various options:



# Set it up so the raw SQL output is logged
import logging
logging.basicConfig()
logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

import sqlalchemy

df.to_sql("my_table_name2",
          engine,
          if_exists="append",  # options are 'fail', 'replace', 'append'; default 'fail'
          index=False,         # do not output the index of the dataframe
          dtype={'col1': sqlalchemy.types.NUMERIC,
                 'col2': sqlalchemy.types.String})  # datatypes should be SQLAlchemy types
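If load speed matters (see the comments below about pandas being slow for bulk loads), one common workaround is to let to_sql create just the empty table, for example by calling it on df.head(0), and then bulk-load the file itself with COPY or \copy. A minimal sketch of the loading half in psql, assuming my_table_name already exists and mypath.csv has a header row:

\copy my_table_name FROM 'mypath.csv' WITH (FORMAT csv, HEADER)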





answered Apr 18 '15 at 20:22 by RobinL, edited Nov 7 at 19:22 by Brian Burns



















  • 6




    In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
    – joelostblom
    Apr 30 '15 at 0:47








  • 1




    username and password: you need to create a login and assign the DB to that user. If using pgAdmin, create a "Login/Group role" using the GUI.
    – Somnath Kadam
    Mar 24 '17 at 12:52






  • 7




    Pandas is a super slow way of loading to SQL (vs. CSV files). It can be orders of magnitude slower.
    – user48956
    May 4 '17 at 18:46










  • This could be a way to write data, but it is super slow even with batching and good computing power. Using CSVs is a good way to accomplish this.
    – Ankit Singh
    Jul 13 at 14:09


















28 votes













You could also use pgAdmin, which offers a GUI to do the import. That's shown in this SO thread. The advantage of using pgAdmin is that it also works for remote databases.



Much like the previous solutions though, you would need to have your table on the database already. Everyone has their own solution, but what I usually do is open the CSV in Excel, copy the headers, paste-special with transposition on a different worksheet, place the corresponding data type in the next column, and then copy and paste that into a text editor together with the appropriate SQL table-creation query, like so (see the illustration after the code block):



CREATE TABLE my_table (
/*paste data from Excel here for example ... */
col_1 bigint,
col_2 bigint,
/* ... */
col_n bigint
)
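For illustration, with hypothetical column names and types, the transposed middle part pasted from Excel might look like this:

order_id bigint,
order_date date,
amount numeric,
customer varchar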





answered Nov 3 '14 at 19:49 by Paul, edited May 23 '17 at 11:47























  • pls show a couple of sample rows of your pasted data
    – dcorking
    Apr 19 at 11:07


















19 votes













As Paul mentioned, import works in pgAdmin:



right click on table -> import



select local file, format and coding



Here is a German pgAdmin GUI screenshot:



[screenshot: pgAdmin import GUI]



You can do a similar thing with DbVisualizer (I have a license; not sure about the free version):



right click on a table -> Import Table Data...



[screenshot: DbVisualizer import GUI]

























  • 2




    DBVisualizer took 50 seconds to import 1400 rows with three fields -- and I had to cast everything back from a String to whatever it was supposed to be.
    – Noumenon
    Sep 29 '16 at 10:46


















17 votes













Most other solutions here require that you create the table in advance/manually. This may not be practical in some cases (e.g., if you have a lot of columns in the destination table). So, the approach below may come in handy.



Providing the path and column count of your CSV file, you can use the following function to load your data into a temp table that will end up named target_table:



The top row is assumed to have the column names.



create or replace function data.load_csv_file
(
    target_table text,
    csv_path     text,
    col_count    integer
)
returns void as $$

declare
    iter      integer;  -- dummy integer to iterate columns with
    col       text;     -- variable to keep the column name at each iteration
    col_first text;     -- first column name, e.g., top left corner on a csv file or spreadsheet

begin
    set schema 'your-schema';

    create table temp_table ();

    -- add just enough columns, all of type text
    for iter in 1..col_count
    loop
        execute format('alter table temp_table add column col_%s text;', iter);
    end loop;

    -- copy the data from the csv file
    execute format('copy temp_table from %L with delimiter '','' quote ''"'' csv ', csv_path);

    iter := 1;
    col_first := (select col_1 from temp_table limit 1);

    -- rename the columns based on the first row, which holds the column names
    for col in execute format('select unnest(string_to_array(trim(temp_table::text, ''()''), '','')) from temp_table where col_1 = %L', col_first)
    loop
        execute format('alter table temp_table rename column col_%s to %s', iter, col);
        iter := iter + 1;
    end loop;

    -- delete the header row
    execute format('delete from temp_table where %s = %L', col_first, col_first);

    -- rename the temp table to the name given as parameter, if not blank
    if length(target_table) > 0 then
        execute format('alter table temp_table rename to %I', target_table);
    end if;

end;

$$ language plpgsql;
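A hypothetical invocation, assuming a 7-column CSV with a header row and the 'your-schema' placeholder replaced by a real schema:

select data.load_csv_file('zip_codes', '/path/to/csv/ZIP_CODES.txt', 7);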
























  • 1




    Hi Mehmet, thanks for the answer you posted, but when I run your code I get the following error message: ERROR: schema "data" does not exist
    – user2867432
    Nov 8 '16 at 5:34










  • user2867432 you need to change the schema name that you use accordingly (e.g., public)
    – mehmet
    Nov 8 '16 at 13:05










  • Hi Mehmet, thanks for the solution, it's perfect. But this works only if the Postgres DB user is a superuser; is there any way to make it work without a superuser?
    – Geeme
    Jun 23 '17 at 9:05










  • Geeme: read "security definer" here, but I have not used it myself.
    – mehmet
    Jun 23 '17 at 16:55


















15 votes













COPY table_name FROM 'path/to/data.csv' DELIMITER ',' CSV HEADER;
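If the file lives on your client machine rather than on the server, the client-side equivalent (see the \copy answer above) is:

\copy table_name FROM 'path/to/data.csv' DELIMITER ',' CSV HEADER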



































6 votes













Personal experience with PostgreSQL, still waiting for a faster way.

1. Create the table skeleton first if the file is stored locally:

drop table if exists ur_table;
CREATE TABLE ur_table
(
id serial NOT NULL,
log_id numeric,
proc_code numeric,
date timestamp,
qty int,
name varchar,
price money
);
COPY
ur_table(id, log_id, proc_code, date, qty, name, price)
FROM 'pathxxx.csv' DELIMITER ',' CSV HEADER;

2. When pathxxx.csv is on the server but PostgreSQL doesn't have permission to access it, you will have to import the .csv file through pgAdmin's built-in functionality.

Right-click the table name and choose Import.

[screenshot: pgAdmin import dialog]

If you still have problems, please refer to this tutorial:
http://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/




































5 votes













Use this SQL code:

copy table_name(attribute1, attribute2, attribute3, ...)
from 'E:\test.csv' delimiter ',' csv header;

The header keyword lets the DBMS know that the CSV file has a header row with attribute names.

For more, visit http://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/




































4 votes













IMHO, the most convenient way is to follow "Import CSV data into postgresql, the comfortable way ;-)", using csvsql from csvkit, which is a Python package installable via pip.

























        • 2




          Link rot is voracious! The article you linked to no longer works, which makes me uncomfortable :(
          – chbrown
          Jul 27 '16 at 20:18










        • you might want to mention that this is Python.
          – mountainclimber
          Aug 9 '16 at 14:46






        • 1




          For me I get a MemoryError if trying to import a large CSV so it looks like it doesn't stream.
          – DavidC
          Oct 20 '16 at 12:32










        • @DavidC Interesting. How big is your file? How much memory do you have? If it doesn't stream as it appears, I suggest chunking the data before insertion.
          – sal
          Oct 31 '16 at 12:13






        • 1




          The file was 5GB in size and I have 2GB of memory. I gave up on it and used a script to generate CREATE TABLE and COPY commands in the end.
          – DavidC
          Nov 1 '16 at 9:51


















1 vote














1. Create a table first.

2. Then use the copy command to load the CSV data:

copy table_name (C1,C2,C3....)
from 'path to your csv file' delimiter ',' csv header;

Thanks




































0 votes













Create the table first, with the columns required to match the CSV file.

1. Open your Postgres GUI (e.g., pgAdmin), right-click the target table you want to load, select Import, and complete the following steps in the File Options section.

2. Browse to your file under Filename.

3. Select csv as the Format.

4. Set the Encoding (here, ISO_8859_5).

Now go to Misc. Options, check Header, and click Import.




































0 votes













If you need a simple mechanism to import from text / parse multiline CSV, you could use:

CREATE TABLE t   -- OR INSERT INTO tab(col_names)
AS
SELECT
 t.f[1]       AS col1
,t.f[2]::int  AS col2
,t.f[3]::date AS col3
,t.f[4]       AS col4
FROM (
  SELECT regexp_split_to_array(l, ',') AS f
  FROM regexp_split_to_table(
$$a,1,2016-01-01,bbb
c,2,2018-01-01,ddd
e,3,2019-01-01,eee$$, '\n') AS l) t;
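For reference, the statement above should yield three rows:

-- col1 | col2 | col3       | col4
-- a    | 1    | 2016-01-01 | bbb
-- c    | 2    | 2018-01-01 | ddd
-- e    | 3    | 2019-01-01 | eee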


DBFiddle Demo


























protected by eyllanesc Apr 23 at 17:56

















              13 Answers
              13






              active

              oldest

              votes








              13 Answers
              13






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes








              up vote
              686
              down vote













              Take a look at this short article.





              Solution paraphrased here:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);





              share|improve this answer



















              • 41




                actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
                – asksw0rder
                Oct 15 '12 at 17:07






              • 78




                TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
                – David Pelaez
                Jan 2 '13 at 5:16






              • 3




                @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
                – JhovaniC
                May 29 '13 at 19:59






              • 6




                Should I include the header row?
                – bernie2436
                Oct 27 '13 at 23:09






              • 100




                You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
                – Barrett Clark
                Nov 8 '13 at 15:17

















              up vote
              686
              down vote













              Take a look at this short article.





              Solution paraphrased here:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);





              share|improve this answer



















              • 41




                actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
                – asksw0rder
                Oct 15 '12 at 17:07






              • 78




                TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
                – David Pelaez
                Jan 2 '13 at 5:16






              • 3




                @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
                – JhovaniC
                May 29 '13 at 19:59






              • 6




                Should I include the header row?
                – bernie2436
                Oct 27 '13 at 23:09






              • 100




                You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
                – Barrett Clark
                Nov 8 '13 at 15:17















              up vote
              686
              down vote










              up vote
              686
              down vote









              Take a look at this short article.





              Solution paraphrased here:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);





              share|improve this answer














              Take a look at this short article.





              Solution paraphrased here:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);






              share|improve this answer














              share|improve this answer



              share|improve this answer








              edited Aug 11 '17 at 7:16









              dylanfprice

              284




              284










              answered Jun 7 '10 at 6:24









              Bozhidar Batsov

              45.4k1181105




              45.4k1181105








              • 41




                actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
                – asksw0rder
                Oct 15 '12 at 17:07






              • 78




                TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
                – David Pelaez
                Jan 2 '13 at 5:16






              • 3




                @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
                – JhovaniC
                May 29 '13 at 19:59






              • 6




                Should I include the header row?
                – bernie2436
                Oct 27 '13 at 23:09






              • 100




                You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
                – Barrett Clark
                Nov 8 '13 at 15:17
















              • 41




                actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
                – asksw0rder
                Oct 15 '12 at 17:07






              • 78




                TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
                – David Pelaez
                Jan 2 '13 at 5:16






              • 3




                @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
                – JhovaniC
                May 29 '13 at 19:59






              • 6




                Should I include the header row?
                – bernie2436
                Oct 27 '13 at 23:09






              • 100




                You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
                – Barrett Clark
                Nov 8 '13 at 15:17










              41




              41




              actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
              – asksw0rder
              Oct 15 '12 at 17:07




              actually use copy would do the same trick if you do not have the super user access; it complaints on my Fedora 16 when using COPY with a non-root account.
              – asksw0rder
              Oct 15 '12 at 17:07




              78




              78




              TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
              – David Pelaez
              Jan 2 '13 at 5:16




              TIP: you can indicate what columns you have in the CSV using zip_codes(col1, col2, col3). The columns must be listed in the same order that they appear in the file.
              – David Pelaez
              Jan 2 '13 at 5:16




              3




              3




              @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
              – JhovaniC
              May 29 '13 at 19:59




              @asksw0rder does copy have the same syntax? bcoz I'm getting a syntax error with copy
              – JhovaniC
              May 29 '13 at 19:59




              6




              6




              Should I include the header row?
              – bernie2436
              Oct 27 '13 at 23:09




              Should I include the header row?
              – bernie2436
              Oct 27 '13 at 23:09




              100




              100




              You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
              – Barrett Clark
              Nov 8 '13 at 15:17






              You can easily include the header row -- just add HEADER in the options: COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV HEADER; postgresql.org/docs/9.1/static/sql-copy.html
              – Barrett Clark
              Nov 8 '13 at 15:17














              up vote
              154
              down vote













              If you don't have permission to use COPY (which work on the db server), you can use copy instead (which works in the db client). Using the same example as Bozhidar Batsov:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV


              You can also specify the columns to read:



              copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV





              share|improve this answer























              • copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
                – JZ.
                Sep 6 '15 at 17:29










              • @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
                – alex bennett
                Jul 8 '16 at 4:32






              • 4




                This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
                – IMSoP
                Jan 26 '17 at 16:29










              • @IMSoP: you're right, I added a mention of server and client to clarify
                – bjelli
                Jan 27 '17 at 9:03










              • @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
                – Sebastian
                May 28 at 22:38















              up vote
              154
              down vote













              If you don't have permission to use COPY (which work on the db server), you can use copy instead (which works in the db client). Using the same example as Bozhidar Batsov:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV


              You can also specify the columns to read:



              copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV





              share|improve this answer























              • copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
                – JZ.
                Sep 6 '15 at 17:29










              • @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
                – alex bennett
                Jul 8 '16 at 4:32






              • 4




                This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
                – IMSoP
                Jan 26 '17 at 16:29










              • @IMSoP: you're right, I added a mention of server and client to clarify
                – bjelli
                Jan 27 '17 at 9:03










              • @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
                – Sebastian
                May 28 at 22:38













              up vote
              154
              down vote










              up vote
              154
              down vote









              If you don't have permission to use COPY (which work on the db server), you can use copy instead (which works in the db client). Using the same example as Bozhidar Batsov:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV


              You can also specify the columns to read:



              copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV





              share|improve this answer














              If you don't have permission to use COPY (which work on the db server), you can use copy instead (which works in the db client). Using the same example as Bozhidar Batsov:



              Create your table:



              CREATE TABLE zip_codes 
              (ZIP char(5), LATITUDE double precision, LONGITUDE double precision,
              CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);


              Copy data from your CSV file to the table:



              copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV


              You can also specify the columns to read:



              copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV






              share|improve this answer














              share|improve this answer



              share|improve this answer








              edited Jan 27 '17 at 9:02

























              answered Jun 20 '15 at 7:26









              bjelli

              6,21022235




              6,21022235












              • copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
                – JZ.
                Sep 6 '15 at 17:29










              • @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
                – alex bennett
                Jul 8 '16 at 4:32






              • 4




                This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
                – IMSoP
                Jan 26 '17 at 16:29










              • @IMSoP: you're right, I added a mention of server and client to clarify
                – bjelli
                Jan 27 '17 at 9:03










              • @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
                – Sebastian
                May 28 at 22:38


















              • copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
                – JZ.
                Sep 6 '15 at 17:29










              • @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
                – alex bennett
                Jul 8 '16 at 4:32






              • 4




                This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
                – IMSoP
                Jan 26 '17 at 16:29










              • @IMSoP: you're right, I added a mention of server and client to clarify
                – bjelli
                Jan 27 '17 at 9:03










              • @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
                – Sebastian
                May 28 at 22:38
















              copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
              – JZ.
              Sep 6 '15 at 17:29




              copy voters(ZIP,CITY) FROM '/Users/files/Downloads/WOOD.TXT' DELIMITER ',' CSV HEADER; ERROR: extra data after last expected column CONTEXT: COPY voters, line 2: "OH0012781511,87,26953,HOUSEHOLDER,SHERRY,LEIGH,,11/26/1965,08/19/1988,,211 N GARFIELD ST , ,BLOOMD..."
              – JZ.
              Sep 6 '15 at 17:29












              @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
              – alex bennett
              Jul 8 '16 at 4:32




              @JZ. I had a similar error. It was because I had extra blank columns. Check your csv and if you have blank columns, that could be the reason.
              – alex bennett
              Jul 8 '16 at 4:32




              4




              4




              This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
              – IMSoP
              Jan 26 '17 at 16:29




              This is somewhat misleading: the difference between COPY and copy is much more than just permissions, and you can't simply add a `` to make it magically work. See the description (in the context of export) here: stackoverflow.com/a/1517692/157957
              – IMSoP
              Jan 26 '17 at 16:29












              @IMSoP: you're right, I added a mention of server and client to clarify
              – bjelli
              Jan 27 '17 at 9:03




              @IMSoP: you're right, I added a mention of server and client to clarify
              – bjelli
              Jan 27 '17 at 9:03












              @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
              – Sebastian
              May 28 at 22:38




              @bjelli is copy slower than copy? I have a 1.5MB file and a db.m4.large instance on RDS and it's been hours that this copy command has been running (at least 3).
              – Sebastian
              May 28 at 22:38










              up vote
              61
              down vote













              One quick way of doing this is with the Python pandas library (version 0.15 or above works best). This will handle creating the columns for you - although obviously the choices it makes for data types might not be what you want. If it doesn't quite do what you want you can always use the 'create table' code generated as a template.



              Here's a simple example:



              import pandas as pd
              df = pd.read_csv('mypath.csv')
              df.columns = [c.lower() for c in df.columns] #postgres doesn't like capitals or spaces

              from sqlalchemy import create_engine
              engine = create_engine('postgresql://username:password@localhost:5432/dbname')

              df.to_sql("my_table_name", engine)


              And here's some code that shows you how to set various options:



              # Set it so the raw sql output is logged
              import logging
              logging.basicConfig()
              logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

              df.to_sql("my_table_name2",
              engine,
              if_exists="append", #options are ‘fail’, ‘replace’, ‘append’, default ‘fail’
              index=False, #Do not output the index of the dataframe
              dtype={'col1': sqlalchemy.types.NUMERIC,
              'col2': sqlalchemy.types.String}) #Datatypes should be [sqlalchemy types][1]





              share|improve this answer



















              • 6




                In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
                – joelostblom
                Apr 30 '15 at 0:47








              • 1




                username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
                – Somnath Kadam
                Mar 24 '17 at 12:52






              • 7




                Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
                – user48956
                May 4 '17 at 18:46










              • This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
                – Ankit Singh
                Jul 13 at 14:09















              up vote
              61
              down vote













              One quick way of doing this is with the Python pandas library (version 0.15 or above works best). This will handle creating the columns for you - although obviously the choices it makes for data types might not be what you want. If it doesn't quite do what you want you can always use the 'create table' code generated as a template.



              Here's a simple example:



              import pandas as pd
              df = pd.read_csv('mypath.csv')
              df.columns = [c.lower() for c in df.columns] #postgres doesn't like capitals or spaces

              from sqlalchemy import create_engine
              engine = create_engine('postgresql://username:password@localhost:5432/dbname')

              df.to_sql("my_table_name", engine)


              And here's some code that shows you how to set various options:



              # Set it so the raw sql output is logged
              import logging
              logging.basicConfig()
              logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

              df.to_sql("my_table_name2",
              engine,
              if_exists="append", #options are ‘fail’, ‘replace’, ‘append’, default ‘fail’
              index=False, #Do not output the index of the dataframe
              dtype={'col1': sqlalchemy.types.NUMERIC,
              'col2': sqlalchemy.types.String}) #Datatypes should be [sqlalchemy types][1]





              share|improve this answer



















              • 6




                In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
                – joelostblom
                Apr 30 '15 at 0:47








              • 1




                username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
                – Somnath Kadam
                Mar 24 '17 at 12:52






              • 7




                Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
                – user48956
                May 4 '17 at 18:46










              • This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
                – Ankit Singh
                Jul 13 at 14:09













              up vote
              61
              down vote










              up vote
              61
              down vote









              One quick way of doing this is with the Python pandas library (version 0.15 or above works best). This will handle creating the columns for you - although obviously the choices it makes for data types might not be what you want. If it doesn't quite do what you want you can always use the 'create table' code generated as a template.



              Here's a simple example:



              import pandas as pd
              df = pd.read_csv('mypath.csv')
              df.columns = [c.lower() for c in df.columns] #postgres doesn't like capitals or spaces

              from sqlalchemy import create_engine
              engine = create_engine('postgresql://username:password@localhost:5432/dbname')

              df.to_sql("my_table_name", engine)


              And here's some code that shows you how to set various options:



              # Set it so the raw sql output is logged
              import logging
              logging.basicConfig()
              logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

              df.to_sql("my_table_name2",
              engine,
              if_exists="append", #options are ‘fail’, ‘replace’, ‘append’, default ‘fail’
              index=False, #Do not output the index of the dataframe
              dtype={'col1': sqlalchemy.types.NUMERIC,
              'col2': sqlalchemy.types.String}) #Datatypes should be [sqlalchemy types][1]





              share|improve this answer














              One quick way of doing this is with the Python pandas library (version 0.15 or above works best). This will handle creating the columns for you - although obviously the choices it makes for data types might not be what you want. If it doesn't quite do what you want you can always use the 'create table' code generated as a template.



              Here's a simple example:



              import pandas as pd
              df = pd.read_csv('mypath.csv')
              df.columns = [c.lower() for c in df.columns] #postgres doesn't like capitals or spaces

              from sqlalchemy import create_engine
              engine = create_engine('postgresql://username:password@localhost:5432/dbname')

              df.to_sql("my_table_name", engine)


              And here's some code that shows you how to set various options:



              # Set it so the raw sql output is logged
              import logging
              logging.basicConfig()
              logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

              df.to_sql("my_table_name2",
              engine,
              if_exists="append", #options are ‘fail’, ‘replace’, ‘append’, default ‘fail’
              index=False, #Do not output the index of the dataframe
              dtype={'col1': sqlalchemy.types.NUMERIC,
              'col2': sqlalchemy.types.String}) #Datatypes should be [sqlalchemy types][1]






              share|improve this answer














              share|improve this answer



              share|improve this answer








              edited Nov 7 at 19:22









              Brian Burns

              6,11134145




              6,11134145










              answered Apr 18 '15 at 20:22









              RobinL

              5,00012537




              5,00012537








              • 6




                In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
                – joelostblom
                Apr 30 '15 at 0:47








              • 1




                username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
                – Somnath Kadam
                Mar 24 '17 at 12:52






              • 7




                Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
                – user48956
                May 4 '17 at 18:46










              • This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
                – Ankit Singh
                Jul 13 at 14:09














              • 6




                In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
                – joelostblom
                Apr 30 '15 at 0:47








              • 1




                username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
                – Somnath Kadam
                Mar 24 '17 at 12:52






              • 7




                Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
                – user48956
                May 4 '17 at 18:46










              • This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
                – Ankit Singh
                Jul 13 at 14:09








              6




              6




              In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
              – joelostblom
              Apr 30 '15 at 0:47






              In addition, the if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql("fhrs", engine, if_exists='replace')
              – joelostblom
              Apr 30 '15 at 0:47






              1




              1




              username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
              – Somnath Kadam
              Mar 24 '17 at 12:52




              username and password : need to create Login and assign DB to user. If uses pgAdmin, then create "Login/Group role" using GUI
              – Somnath Kadam
              Mar 24 '17 at 12:52




              7




              7




              Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
              – user48956
              May 4 '17 at 18:46




              Pandas is a super slow way of loading to sql (vs csv files). Can be orders of magnitude slower.
              – user48956
              May 4 '17 at 18:46












              This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
              – Ankit Singh
              Jul 13 at 14:09




              This could be a way to write data but it is super slow even with batch and good computing power. Using CSVs is a good way to accomplish this.
              – Ankit Singh
              Jul 13 at 14:09










              up vote
              28
              down vote













              You could also use pgAdmin, which offers a GUI to do the import. That's shown in this SO thread. The advantage of using pgAdmin is that it also works for remote databases.



              Much like the previous solutions though, you would need to have your table on the database already. Each person has his own solution but what I usually do is open the CSV in Excel, copy the headers, paste special with transposition on a different worksheet, place the corresponding data type on the next column then just copy and paste that to a text editor together with the appropriate SQL table creation query like so:



              CREATE TABLE my_table (
              /*paste data from Excel here for example ... */
              col_1 bigint,
              col_2 bigint,
              /* ... */
              col_n bigint
              )





edited May 23 '17 at 11:47 by Community; answered Nov 3 '14 at 19:49 by Paul

pls show a couple of sample rows of your pasted data
– dcorking
Apr 19 at 11:07
              up vote
              19
              down vote













As Paul mentioned, import works in pgAdmin:

right click on table -> import

select local file, format and encoding

Here is a German pgAdmin GUI screenshot:

[pgAdmin import GUI screenshot]

You can do a similar thing with DbVisualizer (I have a license, not sure about the free version):

right click on a table -> Import Table Data...

[DbVisualizer import GUI screenshot]






edited Oct 21 '15 at 13:37; answered Sep 17 '15 at 8:55 by Andreas L.

DBVisualizer took 50 seconds to import 1400 rows with three fields -- and I had to cast everything back from a String to whatever it was supposed to be.
– Noumenon
Sep 29 '16 at 10:46










              up vote
              17
              down vote













Most other solutions here require that you create the table in advance/manually. That may not be practical in some cases (e.g., if you have a lot of columns in the destination table), so the approach below may come in handy.

Given the path and the column count of your csv file, you can use the following function to load the data into a temp table, which is then renamed to target_table:

The top row is assumed to hold the column names.



create or replace function data.load_csv_file
(
    target_table text,
    csv_path     text,
    col_count    integer
)

returns void as $$

declare

    iter      integer; -- dummy integer to iterate columns with
    col       text;    -- variable to keep the column name at each iteration
    col_first text;    -- first column name, e.g., top left corner on a csv file or spreadsheet

begin
    set schema 'your-schema'; -- replace with the schema the table should live in

    create table temp_table ();

    -- add just enough columns, all of type text
    for iter in 1..col_count
    loop
        execute format('alter table temp_table add column col_%s text;', iter);
    end loop;

    -- copy the data from the csv file
    execute format('copy temp_table from %L with delimiter '','' quote ''"'' csv ', csv_path);

    iter := 1;
    col_first := (select col_1 from temp_table limit 1);

    -- rename the columns based on the first row, which holds the column names;
    -- %I quotes each identifier safely (a plain %s breaks on names that need quoting)
    for col in execute format('select unnest(string_to_array(trim(temp_table::text, ''()''), '','')) from temp_table where col_1 = %L', col_first)
    loop
        execute format('alter table temp_table rename column col_%s to %I', iter, col);
        iter := iter + 1;
    end loop;

    -- delete the header row
    execute format('delete from temp_table where %I = %L', col_first, col_first);

    -- rename the temp table to the name given as parameter, if not blank
    if length(target_table) > 0 then
        execute format('alter table temp_table rename to %I', target_table);
    end if;

end;

$$ language plpgsql;
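A minimal usage sketch, assuming the function was created under a schema named data as declared above; the table name and three-column file path are hypothetical:

    select data.load_csv_file('my_table', '/tmp/input.csv', 3);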





edited Jun 23 '17 at 17:01; answered May 6 '15 at 17:24 by mehmet

Hi Mehmet, thanks for the answer you posted, but when I run your code I get the following error message: ERROR: schema "data" does not exist
– user2867432
Nov 8 '16 at 5:34

user2867432 you need to change the schema name that you use accordingly (e.g., public)
– mehmet
Nov 8 '16 at 13:05

Hi Mehmet, thanks for the solution, it's perfect, but this works only if the postgres DB user is a superuser. Is there any way to make it work without superuser?
– Geeme
Jun 23 '17 at 9:05

Geeme: read "security definer" here, but I have not used it myself.
– mehmet
Jun 23 '17 at 16:55










              up vote
              15
              down vote













              COPY table_name FROM 'path/to/data.csv' DELIMITER ',' CSV HEADER;
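Note that a plain COPY ... FROM like the one above is read with the database server's permissions, so the file has to live on (and be readable by) the server machine. If the file is on your client machine instead, the psql meta-command \copy accepts the same options but streams the file over the connection; a sketch, with the same hypothetical path:

    \copy table_name FROM 'path/to/data.csv' DELIMITER ',' CSV HEADER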





edited Nov 16 '16 at 19:21 by Tunaki; answered Nov 16 '16 at 19:06 by Tim Siwula
                      up vote
                      6
                      down vote













Personal experience with PostgreSQL; still waiting for a faster way.

1. Create the table skeleton first if the file is stored locally:

    drop table if exists ur_table;
    CREATE TABLE ur_table
    (
        id serial NOT NULL,
        log_id numeric,
        proc_code numeric,
        date timestamp,
        qty int,
        name varchar,
        price money
    );
    COPY
        ur_table(id, log_id, proc_code, date, qty, name, price)
    FROM 'pathxxx.csv' DELIMITER ',' CSV HEADER;

2. When the PostgreSQL server process does not have permission to read pathxxx.csv (for example, because the database runs on a remote server), you will have to import the .csv file through pgAdmin's built-in functionality.

Right click the table name and choose import.

[pgAdmin import dialog screenshot]

If you still have problems, please refer to this tutorial:
http://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/






edited Oct 9 '17 at 20:53; answered Jul 26 '17 at 17:06 by flowera
                              up vote
                              5
                              down vote













Use this SQL code:

    copy table_name(attribute1, attribute2, attribute3, ...)
    from 'E:\test.csv' delimiter ',' csv header;

The header keyword lets the DBMS know that the csv file has a header row with the attribute names.

For more, visit http://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/






edited Jan 18 '17 at 9:02 by alfonx; answered Jan 12 '17 at 7:27 by djdere
                                      up vote
                                      4
                                      down vote













IMHO, the most convenient way is to follow "Import CSV data into postgresql, the comfortable way ;-)", using csvsql from csvkit, which is a Python package installable via pip.
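A usage sketch, assuming csvkit is installed (pip install csvkit); the database and table names and the file are hypothetical. csvsql infers the column types, and with --insert it creates the table and loads the rows in one step:

    csvsql --db postgresql:///mydb --tables my_table --insert data.csv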






edited Jun 21 at 13:58 by ruvim; answered Nov 7 '15 at 9:54 by sal

Link rot is voracious! The article you linked to no longer works, which makes me uncomfortable :(
– chbrown
Jul 27 '16 at 20:18

you might want to mention that this is py.
– mountainclimber
Aug 9 '16 at 14:46

For me I get a MemoryError if trying to import a large CSV, so it looks like it doesn't stream.
– DavidC
Oct 20 '16 at 12:32

@DavidC Interesting. How big is your file? How much memory do you have? If it doesn't stream as it appears, I suggest chunking the data before insertion
– sal
Oct 31 '16 at 12:13

The file was 5GB in size and I have 2GB memory. I gave up on it and used a script to generate CREATE TABLE and COPY commands in the end.
– DavidC
Nov 1 '16 at 9:51
                                      up vote
                                      1
                                      down vote














1. Create a table first.

2. Then use the copy command to load the file into it:

    copy table_name (C1, C2, C3, ...)
    from 'path to your csv file' delimiter ',' csv header;

Thanks
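A concrete instance of the template above, with a hypothetical three-column table and file path:

    create table people (first_name text, last_name text, email text);

    copy people (first_name, last_name, email)
    from '/tmp/people.csv' delimiter ',' csv header;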






edited Apr 23 at 17:57 by eyllanesc; answered Dec 22 '17 at 7:33 by user9130085
                                              up vote
                                              0
                                              down vote













Create the table with the required columns that are used in the csv file.

1. Open pgAdmin, right click on the target table you want to load, select import, and update the following steps in the file options section
2. Browse to your file in filename
3. Select csv in format
4. Encoding as ISO_8859_5

Now go to Misc. options, check header, and click on import.






edited Jul 4 '17 at 9:00 by Pang; answered Jul 4 '17 at 8:23 by suriruler
                                                      up vote
                                                      0
                                                      down vote













If you need a simple mechanism to import from text / parse a multiline CSV, you could use:

CREATE TABLE t   -- OR INSERT INTO tab(col_names)
AS
SELECT
 t.f[1]       AS col1
,t.f[2]::int  AS col2
,t.f[3]::date AS col3
,t.f[4]       AS col4
FROM (
SELECT regexp_split_to_array(l, ',') AS f
FROM regexp_split_to_table(
$$a,1,2016-01-01,bbb
c,2,2018-01-01,ddd
e,3,2019-01-01,eee$$, e'\n') AS l) t;

DBFiddle Demo






answered Apr 23 at 17:55 by Lukasz Szozda
protected by eyllanesc Apr 23 at 17:56

Thank you for your interest in this question.
Because it has attracted low-quality or spam answers that had to be removed, posting an answer now requires 10 reputation on this site (the association bonus does not count).