Database Administrators Stack Exchange

I have a column defined like:

data_url character varying(32768) NOT NULL

and UNIQUE constraint on that column:

CONSTRAINT unique_data_url UNIQUE (data_url)

When a large object is being inserted into the table, the following error message appears:

ERROR:  index row requires 32584 bytes, maximum size is 8191

How can PostgreSQL be configured to index values larger than 8191 bytes? Space and speed are not a problem; it is a rarely changed table of at most a few hundred rows.

Environment: PostgreSQL 9.3.6, Fedora 20 x64

A common workaround is using something like CONSTRAINT unique_data_url_hash UNIQUE (MD5(data_url)) –  Josh Kupershmidt Mar 2 at 20:11

2 Answers

As suggested by @Josh Kupershmidt and @JoeNahmias, the solution is to enforce uniqueness on an MD5 hash of the long value. However, PostgreSQL 9.3 does not support expressions in UNIQUE constraints, so a unique index, which does support expressions, has to be used instead:

CREATE UNIQUE INDEX unique_data_url_index ON mytable (md5(data_url));
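As a minimal, self-contained sketch of the full workaround (the table name `mytable` is illustrative, standing in for the asker's actual table):

```sql
-- Illustrative table: the original UNIQUE constraint on data_url is omitted,
-- since the btree index backing it cannot hold index rows larger than ~8 kB.
CREATE TABLE mytable (
    data_url character varying(32768) NOT NULL
);

-- Enforce uniqueness on the fixed-length MD5 digest instead of the full value.
-- The digest is always 32 hex characters, well under the index row size limit.
CREATE UNIQUE INDEX unique_data_url_index ON mytable (md5(data_url));

-- A duplicate value now fails with a unique-violation error:
INSERT INTO mytable (data_url) VALUES ('http://example.com/a');
INSERT INTO mytable (data_url) VALUES ('http://example.com/a');  -- ERROR: duplicate key value violates unique constraint
```

Note that because this is an expression index rather than a constraint, it cannot be used as the target of an `ON CONFLICT` clause by constraint name, but it enforces uniqueness just the same.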

PostgreSQL builds an index to support the UNIQUE constraint. You cannot index a field that large. As @josh-kupershmidt suggested, create the constraint on a hash of the field and you should be alright, barring hash collisions.
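For the collision-averse, lookups can compare both the hash (so the expression index is usable) and the full value (so a hash collision cannot produce a false match). Table and value names below are illustrative:

```sql
SELECT *
FROM mytable
WHERE md5(data_url) = md5('http://example.com/a')  -- candidate rows found via the expression index
  AND data_url = 'http://example.com/a';           -- recheck the full value to rule out an MD5 collision
```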

