Centralize Docker logs

Docker is awesome, but managing the logs can be a pain at the beginning.

When you scale and load-balance traffic, debugging the application becomes very complicated if you don't centralize all this information.

Scope: create a Logstash server and store the Docker logs in Elasticsearch via syslog.

docker run -d --name elastic elasticsearch
docker run --name kibana --link elastic:elasticsearch -p 81:5601 -d kibana

A very basic configuration is quite simple:


input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOG5424PRI}%{NONNEGINT:ver} +(?:%{TIMESTAMP_ISO8601:ts}|-) +(?:%{HOSTNAME:containerid}|-) +(?:%{NOTSPACE:containername}|-) +(?:%{NOTSPACE:proc}|-) +(?:%{WORD:msgid}|-) +(?:%{SYSLOG5424SD:sd}|-|) +%{GREEDYDATA:msg}" }
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
    if !("_grokparsefailure" in [tags]) {
      mutate {
        replace => [ "@source_host", "%{syslog_hostname}" ]
        replace => [ "@message", "%{syslog_message}" ]
      }
    }
    mutate {
      remove_field => [ "syslog_hostname", "syslog_message", "syslog_timestamp" ]
    }
  }
}

output {
  elasticsearch { hosts => ["http://elastichost:9200"] }
  stdout { codec => rubydebug }
}
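To sanity-check the field layout of that grok match outside Logstash, here is a rough Python equivalent of the pattern (a simplified sketch: grok's `HOSTNAME`, `TIMESTAMP_ISO8601`, and `SYSLOG5424SD` definitions are stricter than the `\S+` used here, and the sample line is hypothetical):

```python
import re

# Rough Python equivalent of the grok pattern above, for sanity-checking
# the field order only (grok's real sub-patterns are stricter than \S+).
SYSLOG_RE = re.compile(
    r"<(?P<pri>\d+)>(?P<ver>\d+) +"   # SYSLOG5424PRI + version
    r"(?P<ts>\S+) +"                  # timestamp (or "-")
    r"(?P<containerid>\S+) +"         # hostname / container id
    r"(?P<containername>\S+) +"       # container name
    r"(?P<proc>\S+) +"                # process
    r"(?P<msgid>\S+) +"               # message id (or "-")
    r"(?P<sd>\[.*\]|-) +"             # structured data (or "-")
    r"(?P<msg>.*)"                    # the actual log line
)

# A hypothetical line, roughly as the Docker syslog driver might emit it.
sample = "<30>1 2016-06-10T13:48:30Z myhost docker/nginx 1234 nginx - GET /ping 200"

m = SYSLOG_RE.match(sample)
print(m.group("containername"))  # docker/nginx
print(m.group("msg"))            # GET /ping 200
```

If a field comes out in the wrong group here, the grok match will misbehave in the same way, so this is a cheap way to iterate on the pattern.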


Of course, you'll need to add your custom patterns to Logstash.
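Grok can load extra definitions from a patterns directory via its `patterns_dir` option. A sketch (the `CONTAINERNAME` pattern name and the `/config-dir/patterns` path are made up for this example):

```conf
# /config-dir/patterns/docker -- one pattern definition per line
CONTAINERNAME [a-zA-Z0-9][a-zA-Z0-9_.-]*
```

Then point grok at the directory:

```conf
grok {
  patterns_dir => ["/config-dir/patterns"]
  match => { "message" => "container=%{CONTAINERNAME:containername}" }
}
```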

After that:

docker run -it --rm --name logstash --link elastic --link kibana -v "$PWD":/config-dir logstash logstash -f /config-dir/logstash-docker-syslog.conf

I use Docker Compose for orchestration (obviously), but the same configuration can be passed on the command line.

Important: configure whatever you run in the container to send its logs to STDOUT.

You can do it, for example, with:

RUN mkdir -p /app/var/logs/{type something here} \
    && touch /app/var/logs/{type something here}/error.log \
    && ln -sf /dev/stdout /app/var/logs/{type something here}/error.log 
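For reference, the official nginx image uses the same symlink trick in its Dockerfile to forward its logs to the Docker log collector:

```dockerfile
# from the official nginx Dockerfile
RUN ln -sf /dev/stdout /var/log/nginx/access.log \
    && ln -sf /dev/stderr /var/log/nginx/error.log
```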

OK, now imagine a Node app configured to send its output to STDOUT.

Docker Compose:

version: '2'
services:
  app:              # "app" is an example service name
    image: node:6
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://${LOGSTASH_HOST}:${LOGSTASH_PORT}"
        tag: "uid={{.ID}};container={{.Name}};image={{.ImageName}};"
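If you're not using Compose, the same logging setup can be passed to plain `docker run` (a sketch, assuming `LOGSTASH_HOST` and `LOGSTASH_PORT` are exported in your shell):

```shell
docker run -d \
  --log-driver syslog \
  --log-opt syslog-address="tcp://${LOGSTASH_HOST}:${LOGSTASH_PORT}" \
  --log-opt tag="uid={{.ID}};container={{.Name}};image={{.ImageName}};" \
  node:6
```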

Now you can go to Kibana, enjoy happy debugging, and extract metrics!


Unfortunately, that example doesn't always cover all requirements, but you can also send your application's logs to Logstash directly.

Centralize Symfony logs

Because other applications will write to the same log database, you first need a way to identify your logs.

The solution I found is to prefix the log message with the name of the application:

"<14>1 symfony : my-custom-app : [2016-06-10 13:48:30] request.INFO: Matched route \"get_ping\". {\"route_parameters\":{\"_controller\":\"AppBundle\\\\Controller\\\\PingController::getAction\",\"_format\":\"json\",\"_route\":\"get_ping\"},\"request_uri\":\"\"} []\n",

To do that, I created a service and passed it, as arguments, the parameter that identifies each application and the Logstash endpoint configuration:


parameters:
    app.name:      my-custom-app
    logstash.port: 5000

services:
    app.logger.syslog_formater:
        class: AppBundle\Monolog\PrefixSyslogFormatter
        arguments:
            - '%app.name%'

Monolog formatter


<?php

namespace AppBundle\Monolog;

use Monolog\Formatter\LineFormatter;

/**
 * Class PrefixSyslogFormatter
 * @package AppBundle\Monolog
 */
class PrefixSyslogFormatter extends LineFormatter
{
    /**
     * PrefixSyslogFormatter constructor.
     *
     * @param string      $prefix
     * @param string|null $format
     * @param string|null $dateFormat
     * @param bool        $allowInlineLineBreaks
     * @param bool        $ignoreEmptyContextAndExtra
     */
    public function __construct($prefix, $format = null, $dateFormat = null, $allowInlineLineBreaks = false, $ignoreEmptyContextAndExtra = false)
    {
        parent::__construct('symfony : ' . $prefix . ' : ' . ($format ?: static::SIMPLE_FORMAT), $dateFormat, $allowInlineLineBreaks, $ignoreEmptyContextAndExtra);
    }
}



Last step: configure Monolog.


monolog:
    handlers:
        syslog:
            type: syslogudp
            host: '%logstash.ip%'
            port: '%logstash.port%'
            level: error
            action_level: error
            formatter: app.logger.syslog_formater

Done! Now add the patterns to Logstash to parse the message.
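As a sketch, a grok match for the prefixed messages shown earlier could look like this (the field names `appname` and `symfony_log` are arbitrary choices for this example):

```conf
filter {
  grok {
    match => { "message" => "%{SYSLOG5424PRI}%{NONNEGINT:ver} symfony : %{NOTSPACE:appname} : %{GREEDYDATA:symfony_log}" }
  }
}
```

With `appname` extracted as its own field, you can filter each application's logs separately in Kibana.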