
In PostgreSQL 9.3, I have a table like this:

id | array_json
---+----------------------------
1  | ["{123: 456}", "{789: 987}", "{111: 222}"]
2  | ["{4322: 54662}", "{123: 5121}", "{1: 5345}" ... ]
3  | ["{3232: 413}", "{5235: 22}", "{2: 5453}" ... ]
4  | ["{22: 44}", "{12: 4324}", "{234: 4235}" ... ]
...

I want to get the sum of all values in the array_json column. So, for example, for the first row, I want:

id | total
---+-------
1  | 1665

where 1665 = 456 + 987 + 222 (the values of all the elements of the JSON array). I have no prior information about the keys of the JSON elements (they're just random numbers).

I'm reading the documentation page about JSON functions in PostgreSQL 9.3, and I think I should use json_each, but I can't find the right query. Could you please help me with it?

Many thanks in advance


You started looking at the right place (going to the docs is always the right place).

Since your values are JSON arrays, I would suggest using json_array_elements(json).

And since it's a JSON array that you have to explode into several rows, and then combine back by running sum over json_each_text(json), it might be best to create your own function (Postgres allows it).
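The explode-then-aggregate idea is easier to see outside the database. Here is a minimal Python sketch (hypothetical data, with keys properly quoted so the strings parse as JSON) of what json_array_elements plus json_each would do per row:

```python
import json

# Hypothetical rows mirroring the table: each value is a JSON array of
# one-key objects (keys quoted here, unlike the question's raw data,
# so json.loads can parse them).
rows = [
    (1, '[{"123": 456}, {"789": 987}, {"111": 222}]'),
]

totals = {}
for row_id, array_json in rows:
    total = 0
    for element in json.loads(array_json):   # like json_array_elements(...)
        for value in element.values():       # like json_each(...) per element
            total += value
    totals[row_id] = total

print(totals)  # {1: 1665}
```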

As for your specific case, assuming the structure you provided is correct, some string parsing plus heavy JSON wizardry can be used. Let's say your table is named "json_test_table" and the columns are "id" and "json_array"; here is a query that does the job:

select id, sum(val)
from (
    select id,
           substring(
               json_each_text(
                   replace(replace(replace(replace(replace(
                       json_array, ':', '":"'), '{', ''), '}', ''), ']', '}'), '[', '{'
                   )::json
               )::varchar
           from '\"(.*)\"')::int as val
    from json_test_table
) j
group by id;

If you plan to run it on a huge dataset, keep in mind that string manipulations are expensive in terms of performance.
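To see why the replace chain works, here is a Python replay of the same transformation on the sample value from the question. Each step mirrors one replace() in the query, and the end result is a single valid JSON object that can be iterated and summed:

```python
import json

# The stored value: a JSON-style array of strings whose inner "objects"
# have unquoted keys, so they are not valid JSON as-is.
array_json = '["{123: 456}", "{789: 987}", "{111: 222}"]'

fixed = (array_json
         .replace(':', '":"')   # quote around each colon: "{123":" 456}"
         .replace('{', '')      # drop the inner braces...
         .replace('}', '')      # ...on both sides
         .replace(']', '}')     # the outer [...] becomes {...}
         .replace('[', '{'))
# fixed == '{"123":" 456", "789":" 987", "111":" 222"}' -- valid JSON now

obj = json.loads(fixed)
total = sum(int(v) for v in obj.values())  # int() tolerates the leading space
print(total)  # 1665
```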

  • Great! I've found a solution, but using intermediate tables and/or with clauses. This is more elegant. Many thanks! – Jorge Arévalo Nov 2 '14 at 20:45
  • @jorgeas80 Look to my answer, perhaps you can find it more elegant too :) – Houari Nov 2 '14 at 21:08

You can get it using this:

/*
Sorry, sqlfiddle is busy :p
CREATE TABLE my_table
(
  id bigserial NOT NULL,
  array_json json[]
  --,CONSTRAINT my_table_pkey PRIMARY KEY (id)
)

INSERT INTO my_table(array_json)
   values (array['{"123": 456}'::json, '{"789": 987}'::json, '{"111": 222}'::json]);
*/   

select id, sum(json_value::integer)
from (
    select id, json_data ->> json_object_keys(json_data) as json_value
    from (
        select id, unnest(array_json) as json_data from my_table
    ) A
) B
group by id;
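The logic of the query above can be sketched in Python for one row (using the same sample data as the INSERT): unnest the array, read each object's single key, extract its value as text, cast, and sum:

```python
import json

# One row's array_json, as inserted above: a Postgres array of
# one-key JSON objects with quoted keys.
array_json = ['{"123": 456}', '{"789": 987}', '{"111": 222}']

total = 0
for json_data in array_json:   # like unnest(array_json)
    obj = json.loads(json_data)
    for key in obj:            # like json_object_keys(json_data)
        total += int(obj[key])  # like (json_data ->> key)::integer
print(total)  # 1665
```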
