
All Questions

7 votes · 2 answers · 6k views

What causes large INSERT to slow down and disk usage to explode?

I have a table of about 3.1 million rows with the following definition and indexes: CREATE TABLE digiroad_liikenne_elementti ( ogc_fid serial NOT NULL, wkb_geometry geometry(Geometry,4258), ...
jeran (73)
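A minimal sketch of the bulk-load workflow that usually comes up for this class of problem, assuming index maintenance during the load is the culprit; the index name and file path below are placeholders, not taken from the question or its answers:

```sql
-- Hypothetical bulk-load workflow: drop the secondary index, load with COPY, rebuild once.
BEGIN;

DROP INDEX IF EXISTS digiroad_liikenne_elementti_geom_idx;

-- COPY avoids the per-statement overhead that millions of single-row INSERTs incur.
COPY digiroad_liikenne_elementti (ogc_fid, wkb_geometry)
    FROM '/tmp/digiroad.copy';          -- path is a placeholder

-- Rebuild the spatial index once, after all rows are in place.
CREATE INDEX digiroad_liikenne_elementti_geom_idx
    ON digiroad_liikenne_elementti USING gist (wkb_geometry);

COMMIT;

ANALYZE digiroad_liikenne_elementti;
```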
2 votes · 1 answer · 937 views

Create one row for every user in a Postgres table

I have a bunch of rows of users in a users table, for example:

| id | name  |
|----|-------|
| 1  | Chris |
| 2  | Max   |
| 3  | Steve |

For each one of these users, I'd like to create a row in ...
Chris Houghton
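A hedged sketch of the pattern the question appears to be after, assuming a second table (here called user_settings, which is hypothetical, since the excerpt is cut off) that should end up with exactly one row per user:

```sql
-- Hypothetical target table; the real one is not shown in the excerpt.
CREATE TABLE IF NOT EXISTS user_settings (
    user_id    integer PRIMARY KEY REFERENCES users (id),
    created_at timestamptz NOT NULL DEFAULT now()
);

-- INSERT ... SELECT creates one row per existing user in a single statement;
-- the anti-join skips users that already have a row, so it can be rerun safely.
INSERT INTO user_settings (user_id)
SELECT u.id
FROM users AS u
LEFT JOIN user_settings AS s ON s.user_id = u.id
WHERE s.user_id IS NULL;
```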
4 votes · 2 answers · 40k views

Return the id after insert or select

I want to build a function which will insert an email if the email value doesn't exist in the table and return the email_id of the row. How can I do this? Also how can I return the id if the email was ...
RockNinja (683)
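One common shape for this, offered as a sketch rather than what the accepted answer necessarily does: on PostgreSQL 9.5 and later, an INSERT ... ON CONFLICT ... RETURNING inside a CTE, with a fallback SELECT for the already-existing case. The table and column names (emails, email, email_id) are assumptions based on the excerpt.

```sql
-- Assumes: CREATE TABLE emails (email_id serial PRIMARY KEY, email text UNIQUE NOT NULL);
WITH ins AS (
    INSERT INTO emails (email)
    VALUES ('someone@example.com')
    ON CONFLICT (email) DO NOTHING      -- requires PostgreSQL 9.5 or later
    RETURNING email_id
)
SELECT email_id FROM ins
UNION ALL
SELECT email_id FROM emails WHERE email = 'someone@example.com'
LIMIT 1;                                -- note: not fully race-proof under heavy concurrency
```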
3 votes · 1 answer · 15k views

How to ignore duplicates during bulk inserts?

In Postgres 9.3.5, I'm importing records from an external source where duplicates are VERY rare, but they do happen. Given a readings table with a unique compound key on (real_time_device_id, ...
fearless_fool
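Because the question pins 9.3.5, ON CONFLICT DO NOTHING (added in 9.5) is not available; a common 9.x workaround is to load into a staging table and insert only the rows whose compound key is not already present. A sketch under those assumptions; the second key column (recorded_at) and the value column are hypothetical, since the excerpt truncates the key definition.

```sql
-- Stage the raw import; duplicates are allowed here.
CREATE TEMP TABLE readings_stage (LIKE readings INCLUDING DEFAULTS);

COPY readings_stage (real_time_device_id, recorded_at, value)  -- column names are assumptions
    FROM '/tmp/readings.csv' WITH (FORMAT csv);

-- Insert only rows whose compound key does not already exist;
-- DISTINCT ON also drops duplicates inside the batch itself.
INSERT INTO readings (real_time_device_id, recorded_at, value)
SELECT DISTINCT ON (s.real_time_device_id, s.recorded_at)
       s.real_time_device_id, s.recorded_at, s.value
FROM readings_stage AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM readings AS r
    WHERE r.real_time_device_id = s.real_time_device_id
      AND r.recorded_at = s.recorded_at
);
```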
12 votes · 1 answer · 43k views

Optimize PostgreSQL for a lot of INSERTS and bytea updates

What we have (software): PostgreSQL 9.3 with the base configuration (no changes in postgresql.conf), Windows 7 64-bit. Hardware: Intel Core i7-3770 3.9 GHz, 32 GB RAM, WDC WD10EZRX-00L4HB ATA drive (1000 GB, ...
Andremoniy
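A hedged postgresql.conf sketch for an insert/update-heavy workload on a 9.3 box with 32 GB RAM; the specific values are illustrative assumptions showing which knobs usually matter, not tuning advice taken from the thread.

```
# postgresql.conf sketch (PostgreSQL 9.3-era parameters; values are illustrative)
shared_buffers = 4GB               # the 9.3 default is far too small for 32 GB RAM
effective_cache_size = 24GB        # planner hint about memory available for caching
wal_buffers = 16MB
checkpoint_segments = 64           # 9.3 setting; replaced by max_wal_size in 9.5+
checkpoint_completion_target = 0.9
synchronous_commit = off           # trades a small durability window for commit latency
```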
0 votes · 1 answer · 463 views

Insert a group of consistent items with foreign keys that collide with existing items

Is there a way to insert a group of items that are interdependent and consistent among themselves, with unique primary keys and foreign keys, but whose keys collide with items already in the database? For example, given a ...
hokkos (101)
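A sketch of one way to handle this: shift the whole group's keys past everything already present, so the group stays internally consistent without colliding. The schema (parents, children, incoming_parents, incoming_children, parents_id_seq) is hypothetical, since the question's example is cut off.

```sql
BEGIN;

-- One offset pushes the group's parent keys past the current maximum,
-- preserving the group's internal references.
CREATE TEMP TABLE key_offset AS
SELECT COALESCE(max(id), 0) AS o FROM parents;

INSERT INTO parents (id, name)
SELECT ip.id + k.o, ip.name
FROM incoming_parents AS ip, key_offset AS k;

-- Remap the foreign keys inside the group with the same offset.
INSERT INTO children (parent_id, payload)
SELECT ic.parent_id + k.o, ic.payload
FROM incoming_children AS ic, key_offset AS k;

-- If parents.id is backed by a sequence, move it past the new maximum.
SELECT setval('parents_id_seq', (SELECT max(id) FROM parents));

COMMIT;
```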