Question

The table has three columns:

  • id
  • string (up to 400 characters)
  • length, which records the length of string (see the sketch after this list)
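
Roughly, a table like this (the name and column types here are illustrative, not the exact original definition):

-- Illustrative definition only; actual names and types may differ.
CREATE TABLE my_table (
    id     bigint PRIMARY KEY,
    string varchar(400),   -- up to 400 characters
    length integer         -- records length(string)
);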

The problem is that when I run a query, e.g.

select * from table where length = <whatever number>;

PostgreSQL never returns; it just keeps calculating.

I was wondering whether that is due to the large data set.

Should I somehow split the table into several?

Environment: 12 GB RAM, PostgreSQL 12 on Windows 10.


Solution

You have to have more patience; scanning a large table takes a while.

The classical remedy is to create an index:

CREATE INDEX ON "table" (length);

That will take a long time itself (and might use considerable disk space), but once you have that index, it will speed up your query (unless it returns so many rows that an index scan is no more efficient than a sequential table scan).
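
Once the index exists, you can check whether the planner actually uses it, for example (the value 42 is just a placeholder):

EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM "table" WHERE length = 42;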

An alternative might be to partition the table according to length. That would require that the length hardly ever changes. Then each search for a certain string length would scan only the appropriate partition and be as fast as possible. Try to get fast disks.
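
As an illustration only (table names, types, and partition boundaries below are made up, not from the original post), declarative range partitioning on length could look like this:

-- Partitioned parent table, split by ranges of length.
CREATE TABLE strings_partitioned (
    id     bigint,
    string varchar(400),
    length integer
) PARTITION BY RANGE (length);

-- FROM bound is inclusive, TO bound is exclusive; 401 covers length 400.
CREATE TABLE strings_len_0_100   PARTITION OF strings_partitioned FOR VALUES FROM (0)   TO (100);
CREATE TABLE strings_len_100_200 PARTITION OF strings_partitioned FOR VALUES FROM (100) TO (200);
CREATE TABLE strings_len_200_401 PARTITION OF strings_partitioned FOR VALUES FROM (200) TO (401);

With such a setup, a query like SELECT * FROM strings_partitioned WHERE length = 150 would be pruned to the single partition strings_len_100_200 instead of scanning the whole table.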
