Question

How well is the PostgreSQL JSON type optimized for large JSON documents? I am especially concerned about partial retrieval (e.g. the cost of getting the last N items of a JSON array or looking up one particular item in a JSON object) in situations where the JSON document is multiple MB in size and too large to be loaded efficiently in full.
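To make the question concrete, here is the kind of partial retrieval I mean, written against a hypothetical `records` table with a `json` column named `annotations` (table and column names are illustrative only). Note that 9.3 has no negative array subscripts, so the last element has to be addressed via `json_array_length`:

```sql
-- Look up one particular key in a JSON object:
SELECT annotations -> 'some_key'
FROM records
WHERE id = 1;

-- Get the last item of a JSON array (9.3 has no negative indexing,
-- so compute the index from the array length):
SELECT annotations -> (json_array_length(annotations) - 1)
FROM records
WHERE id = 1;
```

My worry is whether these operators can extract a small piece cheaply, or whether the server must load and reparse the entire multi-MB document for every such query.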

Background: I am working with a dataset where each record has tens of thousands of annotations. I do not need these annotations fully indexed, but I do need records to insert quickly, so I am considering storing them in a JSON field instead of creating thousands of additional rows in a mapping table.
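For reference, the two designs I am weighing look roughly like this (schema names are hypothetical):

```sql
-- Option A: one row per record, annotations packed into a single json value.
CREATE TABLE records (
    id          serial PRIMARY KEY,
    annotations json
);

-- A single insert carries all annotations at once:
INSERT INTO records (annotations)
VALUES ('[{"key": "color", "value": "red"}, {"key": "size", "value": "L"}]');

-- Option B: a mapping table, which would require tens of thousands
-- of extra rows per record:
CREATE TABLE record_annotations (
    record_id   integer NOT NULL,
    key         text    NOT NULL,
    value       text
);
```

Option A is what I would prefer for insert speed, provided the read-side costs described above are tolerable.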

This relates to PostgreSQL 9.3.

