To ensure that Airflow logs persist when you deploy the community Airflow Helm chart on Google Kubernetes Engine (GKE), you can follow these steps:

  1. Create a Google Cloud Storage bucket to store the logs.

  2. Modify the values.yaml file for the Airflow chart so that remote logging points at that bucket, for example through the chart's airflow.config map of Airflow settings (an end-to-end sketch follows this list).

  3. Deploy the Airflow chart using the modified values.yaml file.

  4. Configure Airflow to write task logs to the Google Cloud Storage bucket by enabling remote logging. You can do this by setting the following options in the [logging] section of the airflow.cfg file (in Airflow 1.10 they live under [core]):

    [logging]
    remote_logging = True
    remote_base_log_folder = gs://<bucket-name>/logs
    remote_log_conn_id = google_cloud_default
    
  5. Update the Airflow deployment in Kubernetes so that this configuration is applied, for example by mounting the updated airflow.cfg or by passing the same settings as AIRFLOW__LOGGING__* environment variables through the chart's values.
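
A minimal end-to-end sketch of steps 1-3 with gsutil and Helm is shown below. The bucket name, region, release name, and namespace are placeholders, and the airflow-stable repository alias and the airflow.config map are assumptions about the community chart's layout, so check the values reference for the chart version you use:

    # Step 1: create the bucket that will hold the task logs (placeholder name/region).
    gsutil mb -l us-central1 gs://my-airflow-logs/

    # Step 2: write a values override that enables remote logging to the bucket.
    # Entries under airflow.config are passed to Airflow as environment variables
    # (assumed layout of the community chart).
    cat > values-logging.yaml <<'EOF'
    airflow:
      config:
        AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
        AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "gs://my-airflow-logs/logs"
        AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "google_cloud_default"
    EOF

    # Step 3: install or upgrade the chart with the override applied.
    helm repo add airflow-stable https://airflow-helm.github.io/charts
    helm repo update
    helm upgrade --install airflow airflow-stable/airflow \
      --namespace airflow --create-namespace \
      -f values-logging.yaml

The pods also need permission to write to the bucket, for example through the GKE node service account or Workload Identity.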

After following these steps, the logs generated by Airflow will be stored in the Google Cloud Storage bucket, ensuring that they persist even if the Kubernetes pods are restarted or terminated.
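
As a quick check, once a task has run you can list the bucket (using the placeholder name from the sketch above) to confirm that log files are being uploaded; the Airflow UI should then read task logs back from GCS rather than from the pod's local disk.

    # Confirm that task logs are landing under the configured remote prefix
    # (placeholder bucket name from the sketch above).
    gsutil ls -r gs://my-airflow-logs/logs/ | head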