Log Files

Starting with version 1.8.0, pgmetrics can extract information from PostgreSQL log files and make it available in its JSON output. Currently, the following information is collected:

  • Query execution plans logged by the auto_explain extension. The plan (in JSON or text format) is collected, along with the SQL query text, the name of the user executing the query, and the name of the database it is executed on. (A configuration sketch for enabling auto_explain follows this list.)
  • Autovacuum log entries, with the name of the table being autovacuumed, the start time, and the duration.
  • Deadlock detection logs that include the queries that caused the deadlock.
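
For the plans to appear in the log files in the first place, the auto_explain extension must be enabled on the PostgreSQL server. A minimal postgresql.conf sketch follows; the threshold and format values here are illustrative, not requirements of pgmetrics:

    # load auto_explain at server start
    shared_preload_libraries = 'auto_explain'

    # log the plan of any query that runs longer than 1 second
    auto_explain.log_min_duration = '1s'

    # pgmetrics collects plans in either 'json' or 'text' format
    auto_explain.log_format = 'json'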

Locating the Log Files

pgmetrics will attempt to locate the current PostgreSQL log file using:

  • the pg_current_logfile() function, if available (see the query after this list)
  • standard locations, e.g., /var/log/postgresql/postgresql-{VER}-main.log in Debian/Ubuntu
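
On PostgreSQL 10 and later, you can check the first method yourself. The function returns the path of the log file currently in use by the logging collector, or NULL if the server does not know it:

    SELECT pg_current_logfile();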

If this does not work for your installation, use the --log-file=path/to/log/file command-line option to specify the correct location of the file.
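
For example (the log file path and database name here are illustrative):

    pgmetrics --log-file=/var/log/postgresql/postgresql-15-main.log mydb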

Specifying How Much to Collect

By default, pgmetrics examines the last 5 minutes' worth of log file content, because pgmetrics is intended to be invoked periodically (for example, every 5 minutes) to collect metrics and information. You can change this duration using the --log-span=MINS command-line option.
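
For example, to examine the last 15 minutes of log entries instead (the value and the database name here are illustrative):

    pgmetrics --log-span=15 mydb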

Log Line Prefix

pgmetrics reads the log_line_prefix configuration setting directly from the database. Any value for this setting is acceptable to pgmetrics, as long as it includes one of %t (timestamp without milliseconds), %m (timestamp with milliseconds) or %n (epoch timestamp).

Additionally, it is highly recommended to include %u (username) and %d (database name), so that pgmetrics can associate log entries with the right user and database.
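
As an illustration, a prefix along the lines of the Debian/Ubuntu packaging default meets all of these recommendations, since it includes %m, %u and %d:

    log_line_prefix = '%m [%p] %q%u@%d '

You can check the value currently in effect with SHOW log_line_prefix; in psql.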

Skipping Log File Processing

pgmetrics will attempt to read and process logs by default. If this behavior is not required, disable it using the --omit=log command-line option.
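
For example (the database name here is illustrative):

    pgmetrics --omit=log mydb

--omit accepts other values as well; see pgmetrics --help for the full list.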

CSV Logs

Currently, only text-format PostgreSQL logs are supported for processing; csvlog-format logs are not.
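
If your server currently produces only csvlog output, text-format logs can be enabled alongside it. A sketch of the relevant postgresql.conf settings, assuming the standard logging collector is in use:

    # 'stderr' produces text-format log files; 'csvlog' entries are not processed
    log_destination = 'stderr'
    logging_collector = on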