Agent Check: Airflow

Overview

The Datadog Agent collects many metrics from Airflow, including those for:

  • DAGs (Directed Acyclic Graphs): Number of DAG processes, DAG bag size, etc.
  • Tasks: Task failures, successes, killed, etc.
  • Pools: Open slots, used slots, etc.
  • Executors: Open slots, queued tasks, running tasks, etc.

Metrics are collected through the Airflow StatsD plugin and sent to Datadog's DogStatsD.

In addition to metrics, the Datadog Agent also sends service checks related to Airflow's health.

Setup

Installation

All steps below are needed for the Airflow integration to work properly. Before you begin, install the Datadog Agent version >=6.17 or >=7.17, which includes the StatsD/DogStatsD mapping feature.

Configuration

There are two parts to the Airflow integration:

  • The Datadog Agent portion, which makes requests to a provided endpoint to report whether Airflow is reachable and healthy. The Agent integration also queries Airflow to produce some of its own metrics.
  • The Airflow StatsD portion, where Airflow can be configured to send metrics to the Datadog Agent, which remaps the Airflow metric notation to Datadog notation.

The Airflow integration's metrics come from both the Agent and StatsD portions.

Host

Configure Datadog Agent Airflow integration

Configure the Airflow check included in the Datadog Agent package to collect health metrics and service checks. To start collecting your Airflow service checks, edit the url in the airflow.d/conf.yaml file, in the conf.d/ folder at the root of your Agent's configuration directory. See the sample airflow.d/conf.yaml for all available configuration options.

Ensure that url matches your Airflow webserver base_url, the URL used to connect to your Airflow instance.
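For example, a minimal airflow.d/conf.yaml might look like the following (a sketch; adjust the URL to your webserver):

init_config:

instances:
  ## Must match the Airflow webserver base_url
  - url: "http://localhost:8080"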

Connect Airflow to DogStatsD

Connect Airflow to DogStatsD (included in the Datadog Agent) by using the Airflow statsd feature to collect metrics. For more information about the metrics reported by your Airflow version and the additional configuration options, see the Airflow documentation.

Note: The presence or absence of StatsD metrics reported by Airflow may vary depending on the Airflow executor used. For example, airflow.ti_failures/successes, airflow.operator_failures/successes, and airflow.dag.task.duration are not reported for KubernetesExecutor.

  1. Install the Airflow StatsD plugin.

    pip install 'apache-airflow[statsd]'
  2. Update the Airflow configuration file airflow.cfg by adding the following configs:

    Note: Do not set `statsd_datadog_enabled` to true. Enabling `statsd_datadog_enabled` can create conflicts. To prevent issues, ensure that the variable is set to `False`.

    [scheduler]
    statsd_on = True
    # Hostname or IP of the server running the Datadog Agent
    statsd_host = localhost
    # DogStatsD port configured in the Datadog Agent
    statsd_port = 8125
    statsd_prefix = airflow
    # Keep Airflow's built-in Datadog client disabled to avoid conflicts
    statsd_datadog_enabled = False
    
  3. Update the Datadog Agent main configuration file datadog.yaml by adding the following configs:

    # dogstatsd_mapper_cache_size: 1000  # default to 1000
    dogstatsd_mapper_profiles:
      - name: airflow
        prefix: "airflow."
        mappings:
          - match: "airflow.*_start"
            name: "airflow.job.start"
            tags:
              job_name: "$1"
          - match: "airflow.*_end"
            name: "airflow.job.end"
            tags:
              job_name: "$1"
          - match: "airflow.*_heartbeat_failure"
            name: airflow.job.heartbeat.failure
            tags:
              job_name: "$1"
          - match: "airflow.operator_failures_*"
            name: "airflow.operator_failures"
            tags:
              operator_name: "$1"
          - match: "airflow.operator_successes_*"
            name: "airflow.operator_successes"
            tags:
              operator_name: "$1"
          - match: 'airflow\.dag_processing\.last_runtime\.(.*)'
            match_type: "regex"
            name: "airflow.dag_processing.last_runtime"
            tags:
              dag_file: "$1"
          - match: 'airflow\.dag_processing\.last_run\.seconds_ago\.(.*)'
            match_type: "regex"
            name: "airflow.dag_processing.last_run.seconds_ago"
            tags:
              dag_file: "$1"
          - match: 'airflow\.dag\.loading-duration\.(.*)'
            match_type: "regex"
            name: "airflow.dag.loading_duration"
            tags:
              dag_file: "$1"
          - match: "airflow.local_task_job.task_exit.*.*.*.*"
            name: "airflow.local_task_job.task_exit"
            tags:
              job_id: "$1"
              dag_id: "$2"
              task_id: "$3"
              return_code: "$4"
          - match: "airflow.dag.*.*.queue_duration"
            name: "airflow.dag.queue_duration"
            tags:
              dag_id: "$1"
              task_id: "$2"
          - match: "airflow.dag.*.*.scheduled_duration"
            name: "airflow.dag.scheduled_duration"
            tags:
              dag_id: "$1"
              task_id: "$2"
          - match: "airflow.pool.open_slots.*"
          - match: "airflow.dagrun.*.first_task_scheduling_delay"
            name: "airflow.dagrun.first_task_scheduling_delay"
            tags:
              dag_id: "$1"
          - match: "airflow.pool.open_slots.*"
            name: "airflow.pool.open_slots"
            tags:
              pool_name: "$1"
          - match: "airflow.pool.queued_slots.*"
            name: "airflow.pool.queued_slots"
            tags:
              pool_name: "$1"
          - match: "airflow.pool.running_slots.*"
            name: "airflow.pool.running_slots"
            tags:
              pool_name: "$1"
          - match: "airflow.pool.used_slots.*"
            name: "airflow.pool.used_slots"
            tags:
              pool_name: "$1"
          - match: "airflow.pool.starving_tasks.*"
            name: "airflow.pool.starving_tasks"
            tags:
              pool_name: "$1"
          - match: 'airflow\.dagrun\.dependency-check\.(.*)'
            match_type: "regex"
            name: "airflow.dagrun.dependency_check"
            tags:
              dag_id: "$1"
          - match: 'airflow\.dag\.(.*)\.([^.]*)\.duration'
            match_type: "regex"
            name: "airflow.dag.task.duration"
            tags:
              dag_id: "$1"
              task_id: "$2"
          - match: 'airflow\.dag_processing\.last_duration\.(.*)'
            match_type: "regex"
            name: "airflow.dag_processing.last_duration"
            tags:
              dag_file: "$1"
          - match: 'airflow\.dagrun\.duration\.success\.(.*)'
            match_type: "regex"
            name: "airflow.dagrun.duration.success"
            tags:
              dag_id: "$1"
          - match: 'airflow\.dagrun\.duration\.failed\.(.*)'
            match_type: "regex"
            name: "airflow.dagrun.duration.failed"
            tags:
              dag_id: "$1"
          - match: 'airflow\.dagrun\.schedule_delay\.(.*)'
            match_type: "regex"
            name: "airflow.dagrun.schedule_delay"
            tags:
              dag_id: "$1"
          - match: 'airflow.scheduler.tasks.running'
            name: "airflow.scheduler.tasks.running"
          - match: 'airflow.scheduler.tasks.starving'
            name: "airflow.scheduler.tasks.starving"
          - match: 'airflow.sla_email_notification_failure'
            name: 'airflow.sla_email_notification_failure'
          - match: 'airflow\.task_removed_from_dag\.(.*)'
            match_type: "regex"
            name: "airflow.dag.task_removed"
            tags:
              dag_id: "$1"
          - match: 'airflow\.task_restored_to_dag\.(.*)'
            match_type: "regex"
            name: "airflow.dag.task_restored"
            tags:
              dag_id: "$1"
          - match: "airflow.task_instance_created-*"
            name: "airflow.task.instance_created"
            tags:
              task_class: "$1"
          - match: 'airflow\.ti\.start\.(.+)\.(\w+)'
            match_type: regex
            name: airflow.ti.start
            tags:
              dag_id: "$1"
              task_id: "$2"
          - match: 'airflow\.ti\.finish\.(\w+)\.(.+)\.(\w+)'
            name: airflow.ti.finish
            match_type: regex
            tags:
              dag_id: "$1"
              task_id: "$2"
              state: "$3"
Restart Datadog Agent and Airflow
  1. Restart the Agent.
  2. Restart Airflow to start sending your Airflow metrics to the Agent DogStatsD endpoint.
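For example, on a Linux host managed by systemd (a sketch; commands vary by platform and by how you run Airflow, and the airflow-scheduler and airflow-webserver unit names are hypothetical):

sudo systemctl restart datadog-agent
# Restart the Airflow components that emit StatsD metrics
sudo systemctl restart airflow-scheduler airflow-webserver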
Integration service checks

Use the default configuration in your airflow.d/conf.yaml file to activate your Airflow service checks. See the sample airflow.d/conf.yaml for all available configuration options.

Log collection

Available for Agent versions >6.0

  1. Collecting logs is disabled by default in the Datadog Agent. Enable it in your datadog.yaml file:

    logs_enabled: true
  2. Uncomment and edit this configuration block at the bottom of your airflow.d/conf.yaml. Change the path and service parameter values to match your environment.

    • Configuration for DAG processor manager and Scheduler logs:

      logs:
        - type: file
          path: "<PATH_TO_AIRFLOW>/logs/dag_processor_manager/dag_processor_manager.log"
          source: airflow
          log_processing_rules:
            - type: multi_line
              name: new_log_start_with_date
              pattern: \[\d{4}\-\d{2}\-\d{2}
        - type: file
          path: "<PATH_TO_AIRFLOW>/logs/scheduler/latest/*.log"
          source: airflow
          log_processing_rules:
            - type: multi_line
              name: new_log_start_with_date
              pattern: \[\d{4}\-\d{2}\-\d{2}

      Because scheduler logs rotate daily, regular cleanup of old log files is recommended.

    • Additional configuration for DAG tasks logs:

      logs:
        - type: file
          path: "<PATH_TO_AIRFLOW>/logs/*/*/*/*.log"
          source: airflow
          log_processing_rules:
            - type: multi_line
              name: new_log_start_with_date
              pattern: \[\d{4}\-\d{2}\-\d{2}

      Caveat: By default, Airflow uses the following log file template for tasks: log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log. The number of log files grows quickly if they are not cleaned regularly. The Airflow UI uses this pattern to display logs individually for each executed task.

      If you do not view logs in the Airflow UI, Datadog recommends setting log_filename_template = dag_tasks.log in airflow.cfg. Then rotate this log file (see the logrotate sketch after these steps) and use this configuration:

      logs:
        - type: file
          path: "<PATH_TO_AIRFLOW>/logs/dag_tasks.log"
          source: airflow
          log_processing_rules:
            - type: multi_line
              name: new_log_start_with_date
              pattern: \[\d{4}\-\d{2}\-\d{2}
  3. Restart the Agent.
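If you adopt the single dag_tasks.log file recommended above, a logrotate rule along these lines keeps it bounded (a sketch; the retention settings are assumptions to adapt):

<PATH_TO_AIRFLOW>/logs/dag_tasks.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    # copytruncate lets Airflow keep writing to the same file handle
    copytruncate
}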

Containerized

Configure Datadog Agent Airflow integration

For containerized environments, see the Autodiscovery Integration Templates for guidance on applying the parameters below.

Parameter            Value
<INTEGRATION_NAME>   airflow
<INIT_CONFIG>        blank or {}
<INSTANCE_CONFIG>    {"url": "http://%%host%%:8080"}

Ensure that url matches your Airflow webserver base_url, the URL used to connect to your Airflow instance. Replace localhost with the template variable %%host%%.

If you are using Airflow's Helm chart, the webserver is exposed as a ClusterIP service; use that service's address in the url parameter.

For example, your Autodiscovery annotations may look like the following:

apiVersion: v1
kind: Pod
# (...)
metadata:
  name: '<POD_NAME>'
  annotations:
    ad.datadoghq.com/<CONTAINER_IDENTIFIER>.checks: |
      {
        "airflow": {
          "instances": ["url": "http://airflow-ui.%%kube_namespace%%.svc.cluster.local:8080"]
        }
      }      
    # (...)
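In Docker (non-Kubernetes) environments, the equivalent Autodiscovery configuration can be provided through container labels, for example (a sketch, assuming the standard com.datadoghq.ad.* label scheme):

labels:
  com.datadoghq.ad.checks: '{"airflow": {"init_config": {}, "instances": [{"url": "http://%%host%%:8080"}]}}'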
Connect Airflow to DogStatsD

Connect Airflow to DogStatsD (included in the Datadog Agent) by using the Airflow statsd feature to collect metrics. For more information about the metrics reported by your Airflow version and the additional configuration options, see the Airflow documentation.

Note: The presence or absence of StatsD metrics reported by Airflow may vary depending on the Airflow executor used. For example, airflow.ti_failures/successes, airflow.operator_failures/successes, and airflow.dag.task.duration are not reported for KubernetesExecutor.

Note: The environment variables used to configure Airflow may differ between versions. For example, Airflow 2.0.0 uses the environment variable AIRFLOW__METRICS__STATSD_HOST, whereas Airflow 1.10.15 uses AIRFLOW__SCHEDULER__STATSD_HOST.

The Airflow StatsD configuration can be enabled with the following environment variables in a Kubernetes Deployment:

env:
  - name: AIRFLOW__SCHEDULER__STATSD_ON
    value: "True"
  - name: AIRFLOW__SCHEDULER__STATSD_PORT
    value: "8125"
  - name: AIRFLOW__SCHEDULER__STATSD_PREFIX
    value: "airflow"
  - name: AIRFLOW__SCHEDULER__STATSD_HOST
    valueFrom:
      fieldRef:
        fieldPath: status.hostIP
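
Per the version note above, on Airflow 2.x the same settings live under the [metrics] configuration section, so the equivalent variables use the AIRFLOW__METRICS__ prefix (a sketch mirroring the block above):

env:
  - name: AIRFLOW__METRICS__STATSD_ON
    value: "True"
  - name: AIRFLOW__METRICS__STATSD_PORT
    value: "8125"
  - name: AIRFLOW__METRICS__STATSD_PREFIX
    value: "airflow"
  - name: AIRFLOW__METRICS__STATSD_HOST
    valueFrom:
      fieldRef:
        fieldPath: status.hostIP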

The environment variable for the host endpoint, AIRFLOW__SCHEDULER__STATSD_HOST (or AIRFLOW__METRICS__STATSD_HOST on Airflow 2.x), is supplied with the node's host IP address so that the StatsD data is routed to the Datadog Agent pod on the same node as the Airflow pod. This setup also requires the Agent to open a hostPort for port 8125 and to accept non-local StatsD traffic. For more information, see DogStatsD on Kubernetes Setup.
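On the Agent side, these requirements can be met, for example, through the Datadog Helm chart (a sketch; the key names follow the chart's values.yaml):

datadog:
  dogstatsd:
    port: 8125
    # Expose DogStatsD on a hostPort so pods on the same node can reach it
    useHostPort: true
    # Accept StatsD packets originating outside the Agent pod
    nonLocalTraffic: true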

This should direct the StatsD traffic from the Airflow container to a Datadog Agent ready to accept the incoming data. The last step is to update the Datadog Agent with the corresponding dogstatsd_mapper_profiles. You can either copy the dogstatsd_mapper_profiles provided in the Host installation into your datadog.yaml file, or deploy your Datadog Agent with the equivalent JSON configuration in the environment variable DD_DOGSTATSD_MAPPER_PROFILES. In Kubernetes, the equivalent environment variable notation is:

env:
  - name: DD_DOGSTATSD_MAPPER_PROFILES
    value: >
      [{"name":"airflow","prefix":"airflow.","mappings":[{"match":"airflow.*_start","name":"airflow.job.start","tags":{"job_name":"$1"}},{"match":"airflow.*_end","name":"airflow.job.end","tags":{"job_name":"$1"}},{"match":"airflow.*_heartbeat_failure","name":"airflow.job.heartbeat.failure","tags":{"job_name":"$1"}},{"match":"airflow.operator_failures_*","name":"airflow.operator_failures","tags":{"operator_name":"$1"}},{"match":"airflow.operator_successes_*","name":"airflow.operator_successes","tags":{"operator_name":"$1"}},{"match":"airflow\\.dag_processing\\.last_runtime\\.(.*)","match_type":"regex","name":"airflow.dag_processing.last_runtime","tags":{"dag_file":"$1"}},{"match":"airflow\\.dag_processing\\.last_run\\.seconds_ago\\.(.*)","match_type":"regex","name":"airflow.dag_processing.last_run.seconds_ago","tags":{"dag_file":"$1"}},{"match":"airflow\\.dag\\.loading-duration\\.(.*)","match_type":"regex","name":"airflow.dag.loading_duration","tags":{"dag_file":"$1"}},{"match":"airflow.dagrun.*.first_task_scheduling_delay","name":"airflow.dagrun.first_task_scheduling_delay","tags":{"dag_id":"$1"}},{"match":"airflow.pool.open_slots.*","name":"airflow.pool.open_slots","tags":{"pool_name":"$1"}},{"match":"airflow.pool.queued_slots.*","name":"airflow.pool.queued_slots","tags":{"pool_name":"$1"}},{"match":"airflow.pool.running_slots.*","name":"airflow.pool.running_slots","tags":{"pool_name":"$1"}},{"match":"airflow.pool.used_slots.*","name":"airflow.pool.used_slots","tags":{"pool_name":"$1"}},{"match":"airflow.pool.starving_tasks.*","name":"airflow.pool.starving_tasks","tags":{"pool_name":"$1"}},{"match":"airflow\\.dagrun\\.dependency-check\\.(.*)","match_type":"regex","name":"airflow.dagrun.dependency_check","tags":{"dag_id":"$1"}},{"match":"airflow\\.dag\\.(.*)\\.([^.]*)\\.duration","match_type":"regex","name":"airflow.dag.task.duration","tags":{"dag_id":"$1","task_id":"$2"}},{"match":"airflow\\.dag_processing\\.last_duration\\.(.*)","match_type":"regex","name":"airflow.dag_processing.last_duration","tags":{"dag_file":"$1"}},{"match":"airflow\\.dagrun\\.duration\\.success\\.(.*)","match_type":"regex","name":"airflow.dagrun.duration.success","tags":{"dag_id":"$1"}},{"match":"airflow\\.dagrun\\.duration\\.failed\\.(.*)","match_type":"regex","name":"airflow.dagrun.duration.failed","tags":{"dag_id":"$1"}},{"match":"airflow\\.dagrun\\.schedule_delay\\.(.*)","match_type":"regex","name":"airflow.dagrun.schedule_delay","tags":{"dag_id":"$1"}},{"match":"airflow.scheduler.tasks.running","name":"airflow.scheduler.tasks.running"},{"match":"airflow.scheduler.tasks.starving","name":"airflow.scheduler.tasks.starving"},{"match":"airflow.sla_email_notification_failure","name":"airflow.sla_email_notification_failure"},{"match":"airflow\\.task_removed_from_dag\\.(.*)","match_type":"regex","name":"airflow.dag.task_removed","tags":{"dag_id":"$1"}},{"match":"airflow\\.task_restored_to_dag\\.(.*)","match_type":"regex","name":"airflow.dag.task_restored","tags":{"dag_id":"$1"}},{"match":"airflow.task_instance_created-*","name":"airflow.task.instance_created","tags":{"task_class":"$1"}},{"match":"airflow\\.ti\\.start\\.(.+)\\.(\\w+)","match_type":"regex","name":"airflow.ti.start","tags":{"dag_id":"$1","task_id":"$2"}},{"match":"airflow\\.ti\\.finish\\.(\\w+)\\.(.+)\\.(\\w+)","name":"airflow.ti.finish","match_type":"regex","tags":{"dag_id":"$1","task_id":"$2","state":"$3"}}]}]

To add non-static tags to the StatsD metrics, you must use DogStatsD mapper profiles. See an example mapper profile that adds service and env tags.
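For illustration, static service and env tags can be listed alongside the positional tags in any of the mappings shown earlier, for example (a sketch; the tag values are placeholders to replace):

dogstatsd_mapper_profiles:
  - name: airflow
    prefix: "airflow."
    mappings:
      - match: "airflow.*_start"
        name: "airflow.job.start"
        tags:
          job_name: "$1"
          # Static tags added to every metric produced by this mapping
          env: "dev"
          service: "my-airflow"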

Log collection

Available for Agent versions >6.0

Collecting logs is disabled by default in the Datadog Agent. To enable it, see Kubernetes Log Collection.

Parameter     Value
<LOG_CONFIG>  {"source": "airflow", "service": "<YOUR_APP_NAME>"}

Validation

Run the Agent's status subcommand and look for airflow under the Checks section.
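For example (a sketch; depending on your platform, the command may require sudo or a different path):

datadog-agent status | grep -A 10 airflow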

Annex

Airflow DatadogHook

In addition, the Airflow DatadogHook can be used to interact with Datadog:

  • Send Metric
  • Query Metric
  • Post Event
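
For instance, a task callable could push a custom metric through the hook (a sketch, assuming the apache-airflow-providers-datadog package is installed and a datadog_default connection is configured; the metric name and tag are illustrative placeholders):

from airflow.providers.datadog.hooks.datadog import DatadogHook

def report_rows_processed():
    # Uses the Datadog connection configured in Airflow
    hook = DatadogHook(datadog_conn_id="datadog_default")
    # Send Metric: one datapoint for a custom gauge
    hook.send_metric(
        metric_name="custom.airflow.rows_processed",
        datapoint=1234,
        tags=["team:data-eng"],
        type_="gauge",
    )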

Data Collected

Metrics

See metadata.csv for a list of metrics provided by this check.

Note: airflow.healthy, airflow.can_connect, airflow.dag.task.total_running, and airflow.dag.task.ongoing_duration metrics are collected from the Agent portion of the integration. All other metrics come from StatsD.

Events

The Airflow check does not include any events.

Service Checks

See service_checks.json for a list of service checks provided by this integration.

Troubleshooting

HTTP 403 errors for Agent integration

You may need to configure parameters for the Datadog Agent to make authenticated requests to Airflow's API. Use one of the available configuration options.
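For example, if your Airflow API expects a bearer token, one option is to pass it through the check's HTTP headers in airflow.d/conf.yaml (a sketch; <YOUR_TOKEN> is a placeholder, and the right mechanism depends on your Airflow auth backend):

instances:
  - url: "http://localhost:8080"
    headers:
      Authorization: "Bearer <YOUR_TOKEN>"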

Need help? Contact Datadog support.