To make Airflow's logs persist when running the community Helm chart on Google Kubernetes Engine (GKE), you can ship task logs to Google Cloud Storage by following these steps:
1. Create a Google Cloud Storage bucket to store the logs.
2. Modify the chart's values.yaml file to point Airflow's remote logging at that bucket (the exact keys depend on the chart version; recent versions of the chart let you set airflow.cfg options directly from values.yaml).
3. Deploy (or upgrade) the Airflow release using the modified values.yaml file.
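The bucket can be created up front with the Cloud SDK. A minimal sketch, assuming the `gsutil` tool is installed and authenticated; the bucket name and region are placeholders you should replace:

```shell
# Create a regional GCS bucket for Airflow task logs.
# "my-airflow-logs" and "us-central1" are placeholders.
gsutil mb -l us-central1 gs://my-airflow-logs
```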
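As a sketch of step 2: with the official apache-airflow chart, airflow.cfg options can be injected through the chart's `config` key in values.yaml. The key names below assume a recent Airflow 2.x chart, and the bucket name is a placeholder:

```yaml
# values.yaml fragment: route Airflow task logs to GCS.
# "my-airflow-logs" is a placeholder bucket name.
config:
  logging:
    remote_logging: 'True'
    remote_base_log_folder: 'gs://my-airflow-logs/logs'
    remote_log_conn_id: 'google_cloud_default'
```

You would then apply it with something like `helm upgrade --install airflow apache-airflow/airflow -f values.yaml`.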
Finally, configure Airflow to write task logs to the bucket by enabling remote logging. In recent Airflow versions these options live in the [logging] section of the airflow.cfg file (in Airflow 1.10 the same keys are under [core]):

[logging]
remote_logging = True
remote_base_log_folder = gs://<bucket-name>/logs
remote_log_conn_id = google_cloud_default

Restart the Airflow components (scheduler, webserver, workers) so they pick up the configuration changes. After following these steps, the logs generated by Airflow will be uploaded to the Google Cloud Storage bucket, ensuring that they persist even if the Kubernetes pods are restarted or terminated.
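Equivalently, any airflow.cfg option can be supplied as an environment variable of the form `AIRFLOW__{SECTION}__{KEY}` (documented Airflow behavior), which is often easier in Kubernetes than editing the config file. The small helper below builds those names; the bucket name is a placeholder:

```python
# Build Airflow environment-variable names from (section, key) pairs.
# Airflow maps AIRFLOW__<SECTION>__<KEY> onto [section] key in airflow.cfg.

def airflow_env_var(section: str, key: str) -> str:
    """Return the env var name that overrides [section] key in airflow.cfg."""
    return f"AIRFLOW__{section.upper()}__{key.upper()}"

# Remote-logging overrides; "my-airflow-logs" is a placeholder bucket name.
overrides = {
    airflow_env_var("logging", "remote_logging"): "True",
    airflow_env_var("logging", "remote_base_log_folder"): "gs://my-airflow-logs/logs",
    airflow_env_var("logging", "remote_log_conn_id"): "google_cloud_default",
}

for name, value in overrides.items():
    print(f"{name}={value}")
```

These name=value pairs can be dropped into the chart's extra-environment values so every Airflow pod picks them up.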