Fluentd "message does not exist" can happen when emitting logs

Bug #2064104 reported by Doug Szumski
This bug affects 1 person

Affects: kolla-ansible
Status: In Progress
Importance: Undecided
Assigned to: Doug Szumski

Bug Description

Seen with the Elasticsearch output when centralised logging is enabled. Example error:

```
"2024-03-14 11:14:36 +0000 [warn]: #0 dump an error event: error_class=ArgumentError error="message does not exist" location=nil tag="fluent.info" time=2024-03-14 11:14:18.336792552 +0000 record={
"Payload"=>"fluentd worker is now running worker=0",
"Hostname"=>"nova-svr2",
"programname"=>"fluent",
"log_level"=>"info",
 "@timestamp"=>"2024-03-14T11:14:18.336792552+00:00"}
```

How can this happen?

The Kolla Ansible Fluentd pipeline is not idempotent. For example, this section, which is used to standardise the log message field, removes the original field:

```
    <filter fluent.**>
        @type parser
        key_name message
        format /^(?<Payload>.*)$/
    </filter>
```

If a bulk request is sent to Elasticsearch/OpenSearch and not all logs are processed, the default behaviour of Fluentd is to re-emit the unprocessed logs using their original tag. These logs are then re-processed by the Fluentd pipeline. Since the pipeline is not idempotent, this can result in processing errors. In the above example, the 'message' field was consumed on the first pass, so re-processing fails with the "message does not exist" error.
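The failure mode can be sketched in a few lines of Python. This is a minimal model of the parser filter with its defaults (`reserve_data false`, so the record is replaced by the captured fields); `parser_filter` is a hypothetical name, not part of Fluentd:

```python
import re

def parser_filter(record, key_name="message", pattern=r"^(?P<Payload>.*)$"):
    """Model of Fluentd's parser filter: parse `key_name` with `pattern`
    and replace the record with the captured fields, consuming `key_name`."""
    if key_name not in record:
        raise KeyError(f"{key_name} does not exist")
    return re.match(pattern, record[key_name]).groupdict()

record = {"message": "fluentd worker is now running worker=0"}
once = parser_filter(record)  # {'Payload': 'fluentd worker is now running worker=0'}

# Re-emitting the already-processed record through the same filter fails,
# because 'message' was consumed on the first pass.
try:
    parser_filter(once)
except KeyError as err:
    print(err)  # prints: 'message does not exist'
```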

There are various ways to fix this. Making the pipeline idempotent is not necessary if we set `retry_tag` in the Elasticsearch/OpenSearch output plugin, e.g. https://github.com/fluent/fluent-plugin-opensearch?tab=readme-ov-file#retry_tag
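A sketch of what the output configuration could look like with `retry_tag`, adapted from the plugin README. Failed bulk items are re-emitted under a dedicated tag and matched by a second output block, so they never re-enter the non-idempotent filters; the tag name `retry_es` and the host details are placeholders:

```
    <match **>
        @type opensearch
        host opensearch.example.com
        port 9200
        # Re-emit failed bulk items under 'retry_es' instead of their
        # original tag, so they bypass the filter pipeline on retry.
        retry_tag retry_es
    </match>

    <match retry_es>
        @type opensearch
        host opensearch.example.com
        port 9200
    </match>
```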

Doug Szumski (dszumski)
Changed in kolla-ansible:
assignee: nobody → Doug Szumski (dszumski)

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to kolla-ansible (master)
Changed in kolla-ansible:
status: New → In Progress