Starting with version 1.8.0, pgmetrics can extract information from PostgreSQL log files and make it available in its JSON output. Several kinds of information are currently collected from the logs; from version 1.17.0, pgmetrics will also include all the raw log entries it reads in the JSON output.
pgmetrics will attempt to find the log file(s) at the following locations, in order:

- the file specified by the --log-file option
- the directory specified by the --log-dir option

Note: this behavior changed between pgmetrics v1.9.0 and v1.10.0.
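The lookup order above can be sketched as follows. This is a hypothetical illustration of the documented behavior, not pgmetrics' actual implementation:

```python
import os

def find_log_source(log_file=None, log_dir=None):
    """Pick a log source in the documented order: an explicitly
    given file wins over an explicitly given directory."""
    # Illustrative only; pgmetrics' own lookup is more involved.
    if log_file and os.path.isfile(log_file):
        return ("file", log_file)
    if log_dir and os.path.isdir(log_dir):
        return ("dir", log_dir)
    return None
```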
By default, pgmetrics will examine the last 5 minutes' worth of log file content, because pgmetrics is intended to be invoked periodically (for example, every 5 minutes) to collect metrics and information. You can change this duration using the command-line option --log-span=MINS.
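The effect of --log-span reduces to keeping only entries whose timestamps fall inside the window. A simplified sketch of that filter (not pgmetrics' actual code):

```python
import time

def within_span(entry_epoch, span_mins=5, now=None):
    """Return True if a log entry's epoch timestamp falls within
    the last span_mins minutes (default 5, as described above)."""
    now = time.time() if now is None else now
    return now - entry_epoch <= span_mins * 60

# Keep only entries from the last 5 minutes of a log:
# recent = [e for e in entries if within_span(e["at"])]
```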
pgmetrics reads the configuration setting log_line_prefix directly from the database. Any value for this setting is acceptable to pgmetrics, as long as it includes one of %t (timestamp without milliseconds), %m (timestamp with milliseconds) or %n (epoch timestamp). Additionally, it is highly recommended to include %u (username) and %d (database name).
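A check along these lines (hypothetical, not pgmetrics' actual code) would accept a prefix only if it carries one of the timestamp escapes:

```python
# Timestamp escapes that make a log_line_prefix usable, per the text above.
TIMESTAMP_ESCAPES = ("%t", "%m", "%n")

def prefix_is_usable(log_line_prefix: str) -> bool:
    """Return True if the prefix contains %t, %m or %n.
    A literal %% is not an escape, so strip those first."""
    cleaned = log_line_prefix.replace("%%", "")
    return any(esc in cleaned for esc in TIMESTAMP_ESCAPES)

# '%m [%p] %q%u@%d ' is a common prefix that also carries
# the recommended %u and %d escapes.
```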
By default, pgmetrics will attempt to read and process logs. If this behavior is not required, disable it using the --omit=log command-line option.
Starting with version 1.10.0, pgmetrics supports reading CSV logs. If the setting log_destination contains csvlog and the setting logging_collector is enabled, then pgmetrics assumes that the logs are in CSV format.
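The decision described above reduces to a simple predicate on the two settings; the following is an illustrative sketch, not pgmetrics' source:

```python
def logs_are_csv(log_destination: str, logging_collector: str) -> bool:
    """CSV format is assumed when log_destination contains 'csvlog'
    and logging_collector is enabled, as described above."""
    # log_destination is a comma-separated list, e.g. "stderr,csvlog".
    destinations = [d.strip() for d in log_destination.split(",")]
    return "csvlog" in destinations and logging_collector == "on"
```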
Starting with version 1.17.0, pgmetrics includes each raw log entry it processes in the JSON output. Each entry looks like this:
{
  "at": 1721810868,
  "atfull": "2024-07-24T08:47:48.463Z",
  "user": "alice",
  "db_name": "alice",
  "level": "ERROR",
  "line": "permission denied for function pg_current_logfile",
  "extra": [
    {
      "level": "STATEMENT",
      "line": "SELECT COALESCE(pg_current_logfile( 'stderr' ),'')"
    }
  ]
}
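Such entries are plain JSON and easy to consume with any JSON library; for example, with Python's standard library (field names taken from the sample above):

```python
import json

# The sample log entry shown above.
sample = '''{
  "at": 1721810868,
  "atfull": "2024-07-24T08:47:48.463Z",
  "user": "alice",
  "db_name": "alice",
  "level": "ERROR",
  "line": "permission denied for function pg_current_logfile",
  "extra": [
    {"level": "STATEMENT",
     "line": "SELECT COALESCE(pg_current_logfile( 'stderr' ),'')"}
  ]
}'''

entry = json.loads(sample)
# Pull out the message and any attached STATEMENT lines.
message = entry["line"]
statements = [x["line"] for x in entry["extra"] if x["level"] == "STATEMENT"]
print(entry["level"], message)
```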