Question

I have 10 SQL files that contain different OLAP queries.

I want to run them as a batch. By "batch" I mean that Postgres should consider each file as one batch and execute its queries at once (depending on its own process management).

The following command executes each query one by one:

psql -h localhost -d databasename -U user -p port -a -q -f input.sql -o output.txt

Is that possible? And how do I measure the time it takes to execute that batch of SQL queries?


Solution

You can use, for example, GNU Parallel, which makes it possible to run arbitrary commands/programs at the same time. The syntax would be something like:

parallel psql -h localhost -d databasename -U user -p port -a -q -f input{}.sql -o output{}.txt ::: 1 2

This command will execute the two query files input1.sql and input2.sql at the same time.
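To answer the timing part of the question, here is a minimal sketch: wrap the whole batch in the shell's `time` to get overall wall-clock time, and use GNU Parallel's `--joblog` option to record per-file runtimes. The file names input1.sql through input10.sql are an assumption for illustration; the question does not state how the files are named.

```shell
# Overall wall-clock time for the batch via `time`;
# --joblog writes one line per job, including its runtime in seconds.
# input1.sql..input10.sql and the connection details are assumptions.
time parallel --joblog batch.log \
  psql -h localhost -d databasename -U user -p port -a -q -f input{}.sql -o output{}.txt \
  ::: $(seq 1 10)

# The 4th tab-separated column of the job log (JobRuntime) holds
# each job's runtime in seconds:
cut -f4 batch.log
```

For timing a single file run sequentially, psql's `\timing` meta-command (placed at the top of the file or in `~/.psqlrc`) prints the execution time of each statement.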

Another option would be a dedicated load testing tool like Apache JMeter. If you add the Postgres JDBC Driver to the JMeter classpath, you can provide your connection details under JDBC Connection Configuration and the query (or queries) in JDBC Request samplers. Check out The Real Secret to Building a Database Test Plan With JMeter article for more details if needed.

Then you can increase the number of threads in the Thread Group, and JMeter will execute the queries according to your workload model. You can observe performance metrics either via Listeners or via the HTML Reporting Dashboard. This way you get more comprehensive results and can correlate an increasing number of concurrent threads with increased query execution time, identify saturation points, bottlenecks, etc.

Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange