To ensure that Airflow task logs persist when running the community Helm chart on Google Kubernetes Engine (GKE), you can follow these steps:
Create a Google Cloud Storage bucket to store the logs.
Modify the values.yaml file for the Airflow chart to enable remote logging to that bucket. The community chart exposes Airflow settings under its config: key, which it applies to every Airflow component.
Deploy the Airflow chart using the modified values.yaml file.
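As a sketch, assuming the apache-airflow community chart and a hypothetical bucket named my-airflow-logs, the relevant values.yaml fragment might look like:

```yaml
# values.yaml fragment (sketch): the chart turns the config: mapping into
# Airflow configuration for the scheduler, webserver, and workers.
config:
  logging:
    remote_logging: 'True'
    remote_base_log_folder: 'gs://my-airflow-logs/logs'
    remote_log_conn_id: 'google_cloud_default'
```

The bucket name here is an assumption; the google_cloud_default connection (or whichever connection id you use) must have write access to the bucket, for example via Workload Identity on GKE.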
Configure Airflow to use the Google Cloud Storage bucket as the remote log destination. If you manage airflow.cfg directly rather than through the chart, add the following to the [logging] section (on Airflow 1.10.x these settings live under [core]), and make sure the Google provider package (apache-airflow-providers-google) is installed in the Airflow images:
[logging]
remote_logging = True
remote_base_log_folder = gs://<bucket-name>/logs
remote_log_conn_id = google_cloud_default
Update the airflow.cfg file with the above configuration changes and restart the Airflow components so they pick them up. After following these steps, the logs generated by Airflow will be written to the Google Cloud Storage bucket, ensuring that they persist even if the Kubernetes pods are restarted or terminated.
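Before rolling the change out, the airflow.cfg stanza can be sanity-checked locally; a minimal sketch using Python's standard configparser (the my-airflow-logs bucket name is a placeholder):

```python
import configparser

# airflow.cfg fragment from the steps above; my-airflow-logs is a placeholder.
cfg_text = """
[logging]
remote_logging = True
remote_base_log_folder = gs://my-airflow-logs/logs
remote_log_conn_id = google_cloud_default
"""

parser = configparser.ConfigParser()
parser.read_string(cfg_text)

# remote_logging must parse as a boolean, and the log folder must be a gs:// URI,
# otherwise Airflow will fall back to local pod-level logs that vanish on restart.
assert parser.getboolean("logging", "remote_logging") is True
assert parser.get("logging", "remote_base_log_folder").startswith("gs://")
print(parser.get("logging", "remote_base_log_folder"))
```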
Asked: 2022-03-12 11:00:00 +0000
Last updated: Jun 21 '22