I've deployed an Airflow instance on Kubernetes using the stable/airflow helm chart. I slightly modified the puckel/docker-airflow image to be able to install the Kubernetes executor. All tasks are now being executed successfully on our Kubernetes cluster, but the logs of these tasks are nowhere to be found. I would like to upload the logs to our Azure Blob Storage account.

I've configured my environment variables like this:

```
AIRFLOW_CORE_REMOTE_BASE_LOG_FOLDER="wasb-airflow"
AIRFLOW_CORE_REMOTE_LOG_CONN_ID="wasb_default"
```

The `wasb_default` connection includes a login and password for the Azure Blob Storage account. I've tested this connection using a WasbHook and was able to delete a dummy file with success.

When I try to view the logs, this message is displayed:

```
*** Log file does not exist: /usr/local/airflow/logs/example_python_operator/print_the_context/T15:42:25+00:00/1.log
*** Fetching from:
*** Failed to fetch log file from worker. HTTPConnectionPool(host='examplepythonoperatorprintthecontext-4a6e6a1f11fd431f8c2a1dc081', port=8793): Max retries exceeded with url: /log/example_python_operator/print_the_context/T15:42:25+00:00/1.log (Caused by NewConnectionError(': Failed to establish a new connection: Name or service not known'))
```

**Answer:**

First, your env vars need to follow this structure, with double underscores between `AIRFLOW`, the config section, and the key:

```
AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER="wasb-airflow"
AIRFLOW__CORE__REMOTE_LOG_CONN_ID="wasb_default"
```

If you are working with the helm chart, you can modify the values.yaml or pass `--set` flags to your `helm upgrade` command:

```
helm show values apache-airflow/airflow > values.yaml
```

Then, in values.yaml, modify the logs section; it's at the end of the file.
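The logs section of the chart's values.yaml looks roughly like the sketch below. This follows the shape of the official apache-airflow chart's `logs.persistence` block, but exact keys and defaults vary by chart version, so treat it as an illustration rather than a verified copy:

```yaml
logs:
  persistence:
    # Keep task logs on a shared PersistentVolumeClaim so the webserver
    # can read them, instead of (or in addition to) remote storage.
    enabled: false
    size: 100Gi
    # Reuse a pre-provisioned claim instead of creating a new one.
    existingClaim: ~
    storageClassName: ~
```

The same values can be set inline without editing the file, e.g. `helm upgrade airflow apache-airflow/airflow --set logs.persistence.enabled=true`.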
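The naming rule the answer describes — Airflow reads a config option `key` in section `section` from the environment variable `AIRFLOW__{SECTION}__{KEY}`, upper-cased with double underscores — can be sketched as a tiny helper. The function below is illustrative, not part of Airflow:

```python
def airflow_config_env_var(section: str, key: str) -> str:
    """Build the environment-variable name Airflow reads for a config
    option: AIRFLOW__{SECTION}__{KEY}, upper-case, with DOUBLE
    underscores separating the prefix, section, and key."""
    return f"AIRFLOW__{section.upper()}__{key.upper()}"

# The two options from the question, in the form Airflow actually reads:
print(airflow_config_env_var("core", "remote_base_log_folder"))
# AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER
print(airflow_config_env_var("core", "remote_log_conn_id"))
# AIRFLOW__CORE__REMOTE_LOG_CONN_ID
```

Compare these to the single-underscore names in the question (`AIRFLOW_CORE_REMOTE_BASE_LOG_FOLDER`): Airflow silently ignores variables that don't match the double-underscore pattern, which is why the remote logging settings never took effect.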