Root.cz Forum
Main topics => Development => Topic started by: PetreS on 15. 05. 2017, 17:55:01
-
Hi, could anyone advise me how to optimize a PostgreSQL query for full-text search over a table with a larger amount of data (>= 700,000 rows)? At the moment I handle it by creating a GIN index on the table for the to_tsvector function over the required columns. The select then looks roughly like this:
{code}
SELECT *
FROM table
WHERE to_tsvector(...) @@ plainto_tsquery(...)
ORDER BY ts_rank(to_tsvector(...), plainto_tsquery(...), 1) DESC
{code}
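For illustration, a minimal sketch of the setup described above, assuming a hypothetical documents table with text columns col_a and col_b and the 'simple' text search configuration (all of these names are assumptions, not from the original post):
{code}
-- Hypothetical example: an expression GIN index over two text columns.
-- The one-argument form of to_tsvector is not immutable, so the text
-- search configuration ('simple' here) must be given explicitly in an index.
CREATE INDEX documents_fts_idx ON documents
  USING gin (to_tsvector('simple', coalesce(col_a, '') || ' ' || coalesce(col_b, '')));

-- The WHERE clause must repeat the exact indexed expression,
-- otherwise the planner cannot use the index.
SELECT *
FROM documents
WHERE to_tsvector('simple', coalesce(col_a, '') || ' ' || coalesce(col_b, ''))
      @@ plainto_tsquery('simple', 'search terms');
{code}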
-
You call to_tsvector twice there, once in WHERE and again in ORDER BY, which slows things down. I recommend wrapping it in a nested select, following this guide:
{code}
SELECT id, meta->>'title' AS title, meta FROM (
  SELECT id, meta, tsv
  FROM data_rows, plainto_tsquery('YOUR QUERY') AS q
  WHERE (tsv @@ q)
) AS t1
ORDER BY ts_rank_cd(t1.tsv, plainto_tsquery('YOUR QUERY')) DESC;
{code}
https://blog.lateral.io/2015/05/full-text-search-in-milliseconds-with-postgresql/
https://news.ycombinator.com/item?id=9512360
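The tsv in the snippet is a precomputed tsvector column, so the expensive to_tsvector call is not repeated for every row at query time. A minimal sketch of how such a column might be set up, assuming the searchable text lives in meta->>'title' and using the 'simple' configuration (both assumptions; the linked guide uses its own fields and configuration):
{code}
-- Store the tsvector once instead of recomputing it in every query.
ALTER TABLE data_rows ADD COLUMN tsv tsvector;

-- Populate it from the searchable text (assumed here to be meta->>'title').
UPDATE data_rows
SET tsv = to_tsvector('simple', coalesce(meta->>'title', ''));

-- Index the precomputed column directly.
CREATE INDEX data_rows_tsv_idx ON data_rows USING gin (tsv);
{code}
The column then has to be kept up to date on INSERT/UPDATE, e.g. with a trigger.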