
How can we split the Filebeat message into different fields and then add a field specifically for Postgres logs?

asked 2022-07-29 11:00:00 +0000


1 Answer


answered 2021-04-30 07:00:00 +0000

  1. First, we need to edit the Filebeat configuration file (filebeat.yml) so that extra fields are attached to each log event. Filebeat declares custom fields with a top-level "fields" option (or a per-input one), not with a "fields" setting inside the output section.

    Here's an example:

    output.elasticsearch:
      hosts: ["localhost:9200"]
      index: "my_logs-%{+yyyy.MM.dd}"
      # In recent Filebeat versions, a custom index also requires
      # setup.template.name and setup.template.pattern to be set

    # Custom fields are a top-level option, not part of the output config
    fields:
      service: my_app

    In this example, we're adding a "service" field with the value "my_app" to every log event. By default, Filebeat nests custom fields under a top-level "fields" key in each event (so it appears as fields.service), unless fields_under_root: true is also set.
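    The same option also works per input, which is handy when only the PostgreSQL logs should carry a marker. A minimal sketch, assuming the logs live under /var/log/postgresql/ (adjust the path and input type to your setup):

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/postgresql/*.log
        fields:
          # Only events read from this input get this field
          service: postgresql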

  2. Next, we need to create a Logstash pipeline to receive the Filebeat events and split them into different fields.

    Here's an example pipeline:

    input {
      # Receive events shipped by Filebeat
      beats {
        port => 5044
      }
    }

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \[%{DATA:component}\] %{GREEDYDATA:message}" }
        # Without overwrite, grok appends to "message" and turns it into an array
        overwrite => ["message"]
      }
      mutate {
        add_field => {
          "application" => "my_app"
          "type"        => "postgresql"
        }
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "my_logs-%{+yyyy.MM.dd}"
      }
    }


    In this pipeline, we're using the grok filter to split each log line into named fields based on a grok pattern (a library of reusable regular expressions). We're also adding two new fields: "application" with the value "my_app" and "type" with the value "postgresql".

    Note that in this example, we're assuming that the logs are coming from a PostgreSQL database. If your logs are coming from a different source, you may need to adjust the regular expression in the grok filter to match your log format.
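    For instance, with PostgreSQL's default log_line_prefix of '%m [%p] ', a log line looks like "2022-07-29 11:00:00.123 UTC [12345] LOG:  database system is ready to accept connections". A grok pattern along these lines could parse it (a sketch; adjust it to your actual log_line_prefix):

    grok {
      # Matches the default '%m [%p] ' prefix. PostgreSQL's "LOG" severity
      # is not covered by grok's LOGLEVEL pattern, hence WORD here.
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:timezone} \[%{POSINT:pid}\] %{WORD:severity}:\s+%{GREEDYDATA:message}" }
      overwrite => ["message"]
    }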

  3. Finally, we need to make sure that Filebeat is sending its events to the Logstash pipeline. Filebeat supports only one output at a time, so replace (or comment out) the output.elasticsearch section from step 1 with a Logstash output.

    Here's an example:

    output.logstash:
      hosts: ["localhost:5044"]

    This will send all Filebeat events to the Logstash server running on localhost:5044.
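    To sanity-check the configuration before restarting, Filebeat ships with test subcommands (the path below assumes a standard package install; adjust as needed):

    # Verify the YAML parses and the configured output is reachable
    filebeat test config -c /etc/filebeat/filebeat.yml
    filebeat test output -c /etc/filebeat/filebeat.yml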

With these steps, we should now be able to receive our log events with separate fields, including a "type" field specifically for PostgreSQL logs.
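If you want the "type" field added only to events that actually came from PostgreSQL, rather than to everything passing through the pipeline, one option is to key the mutate off the per-input field from step 1. A minimal sketch, assuming that input sets "service: postgresql":

    filter {
      # Filebeat nests custom fields under "fields" by default
      if [fields][service] == "postgresql" {
        mutate {
          add_field => { "type" => "postgresql" }
        }
      }
    }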

