First, we need to edit the Filebeat configuration file (filebeat.yml) to attach custom fields to our log events. We can do this by adding a top-level "fields" section. Note that "fields" sits at the root of the file, not under the output section; the actual splitting of log lines into fields happens later, in Logstash.
Here's an example:
fields:
  service: my_app

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "my_logs-%{+yyyy.MM.dd}"
In this example, we're specifying that we want to add a "service" field to all log events, with the value "my_app".
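By default, Filebeat nests custom fields under a "fields" key in each event (so the value shows up as fields.service). If you would rather have "service" appear at the root of the event, you can additionally set fields_under_root:

```yaml
fields:
  service: my_app
fields_under_root: true
```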
Next, we need to create a Logstash pipeline to receive the Filebeat events and split them into different fields.
Here's an example pipeline:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \[%{DATA:component}\] %{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
  mutate {
    add_field => { "application" => "my_app" }
    add_field => { "type" => "postgresql" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_logs-%{+yyyy.MM.dd}"
  }
}
In this pipeline, the grok filter parses each log line into separate fields based on a pattern, and the mutate filter adds two new fields: "application" with the value "my_app" and "type" with the value "postgresql".
Note that in this example, we're assuming that the logs are coming from a PostgreSQL database. If your logs are coming from a different source, you may need to adjust the grok pattern to match your log format.
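To sanity-check the pattern offline before deploying it, here is a rough Python equivalent of the grok expression above. The regex is a simplification (for instance, LOGLEVEL is approximated as a run of capital letters), and the sample log line is made up for illustration:

```python
import re

# Approximate Python translation of:
# %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \[%{DATA:component}\] %{GREEDYDATA:message}
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}(?:\.\d+)?)\s+"
    r"(?P<loglevel>[A-Z]+)\s+"
    r"\[(?P<component>.*?)\]\s+"
    r"(?P<message>.*)"
)

# Hypothetical sample line in the expected format
line = "2022-07-29 11:00:00 ERROR [postgres] connection refused"
m = pattern.match(line)
print(m.groupdict())
```

If the groupdict comes back empty or None, the real log format differs from the pattern and the grok expression needs adjusting too.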
Finally, we need to make sure that Filebeat is sending its events to the Logstash pipeline. To do this, we can edit the Filebeat configuration file to specify the Logstash output instead of Elasticsearch.
Here's an example:
output.logstash:
  hosts: ["localhost:5044"]
This will send all Filebeat events to the Logstash server running on localhost:5044.
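Putting the Filebeat side together, a minimal filebeat.yml might look like the sketch below. The log path is an assumption for illustration; point it at wherever your PostgreSQL logs actually live:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/postgresql/*.log   # assumed path; adjust for your system

fields:
  service: my_app

output.logstash:
  hosts: ["localhost:5044"]
```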
With these steps, we should now be able to receive our log events with separate fields, including a "type" field specifically for PostgreSQL logs.
Asked: 2022-07-29 11:00:00 +0000