Looking for slow and frequent queries with pg_stat_statements

Mark Haylock April 17, 2020

Today I stumbled upon something so useful for investigating Postgres performance issues that I wish I’d learnt about it much earlier.

There is a Postgres extension module called pg_stat_statements, and it is very helpfully already enabled on Heroku. This extension gathers statistics about every query made against your database.

Heroku uses this view for several CLI commands including "pg:diagnose", "pg:outliers" and "pg:long-running-queries" - but I think nothing substitutes for having access to the raw data.
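To get at that raw data on Heroku, you can open a psql session with "heroku pg:psql" and query the view directly. Here is a minimal sketch to get your bearings, assuming the extension is enabled as described above (column names are as of Postgres 12 and earlier; Postgres 13 renames total_time to total_exec_time):

```sql
-- How many distinct statements have been recorded so far?
SELECT count(*) FROM pg_stat_statements;

-- Peek at a few rows to get a feel for the columns.
SELECT query, calls, total_time, rows
FROM pg_stat_statements
LIMIT 5;
```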

As I understand it, these stats accumulate across all time until you reset them - something you can do yourself with "SELECT pg_stat_statements_reset();", as instructed by Heroku. Keep this in mind when looking at the view - if you make a change (e.g. add an index), you'll want to reset the stats and wait for fresh ones to be gathered before you can see what impact that change had.
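The reset itself is just that one call - a quick sketch of the before/after workflow:

```sql
-- Clear the accumulated statistics so the numbers gathered from here on
-- reflect only queries run after your change (e.g. a new index).
SELECT pg_stat_statements_reset();

-- Immediately afterwards the view should be (almost) empty again.
SELECT count(*) FROM pg_stat_statements;
```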

Here is a snippet from the documentation showing the kind of data it can provide.

[Image: example pg_stat_statements output from the Postgres documentation]

(and yes, track_io_timing is enabled on Heroku).
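For a concrete example of what those columns let you do, the pg_stat_statements documentation includes a query along these lines, listing the most time-consuming statements together with their shared-buffer hit rate (adapted slightly here; again, Postgres 13+ renames total_time):

```sql
-- The five most time-consuming statements and how often they hit shared buffers.
SELECT query,
       calls,
       total_time,
       rows,
       100.0 * shared_blks_hit /
         nullif(shared_blks_hit + shared_blks_read, 0) AS hit_percent
FROM pg_stat_statements
ORDER BY total_time DESC
LIMIT 5;
```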

We have other tools (New Relic, the Heroku Postgres dashboard) that provide some information about which of our queries are the slowest and most frequent. But this view lets us explore that information in much more depth and focus our efforts on the queries that will have the biggest impact when fixed.
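As a starting point, something like the following sketch surfaces both ends of that spectrum - the statements we run most often, and the ones that are slowest on average (the LIMITs are just placeholders):

```sql
-- Most frequently executed statements.
SELECT query, calls
FROM pg_stat_statements
ORDER BY calls DESC
LIMIT 10;

-- Slowest statements on average, in milliseconds
-- (mean_time/total_time become mean_exec_time/total_exec_time on Postgres 13+).
SELECT query, calls, mean_time AS avg_ms, total_time AS total_ms
FROM pg_stat_statements
ORDER BY mean_time DESC
LIMIT 10;
```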

If you’d like to learn more about the view, how it is populated, and how queries are grouped together, then I recommend reading the Postgres documentation. Happy hunting for those performance optimisations!
