This check monitors Flink. Datadog collects Flink metrics through Flink's Datadog HTTP Reporter, which uses Datadog's HTTP API.
The Flink check is included in the Datadog Agent package. No additional installation is needed on your server.
- Configure the Datadog HTTP Reporter in Flink. In your `<FLINK_HOME>/conf/flink-conf.yaml`, add these lines, replacing `<DATADOG_API_KEY>` with your Datadog API key:

  ```yaml
  metrics.reporter.dghttp.factory.class: org.apache.flink.metrics.datadog.DatadogHttpReporterFactory
  metrics.reporter.dghttp.apikey: <DATADOG_API_KEY>
  metrics.reporter.dghttp.dataCenter: US # (optional) The data center (EU/US) to connect to; defaults to US.
  ```
- Re-map system scopes in your `<FLINK_HOME>/conf/flink-conf.yaml`:

  ```yaml
  metrics.scope.jm: flink.jobmanager
  metrics.scope.jm.job: flink.jobmanager.job
  metrics.scope.tm: flink.taskmanager
  metrics.scope.tm.job: flink.taskmanager.job
  metrics.scope.task: flink.task
  metrics.scope.operator: flink.operator
  ```

  Note: The system scopes must be re-mapped for your Flink metrics to be supported; otherwise, they are submitted as custom metrics.
- Configure additional tags in `<FLINK_HOME>/conf/flink-conf.yaml`. Here is an example of custom tags:

  ```yaml
  metrics.reporter.dghttp.scope.variables.additional: <KEY1>:<VALUE1>, <KEY2>:<VALUE2>
  ```

  Note: By default, any variables in metric names are sent as tags, so there is no need to add custom tags for `job_id`, `task_id`, and so on.
- Restart Flink to start sending your Flink metrics to Datadog. A restart sketch for a standalone cluster follows this list.
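How you restart Flink depends on how it is deployed. A minimal sketch, assuming a standalone cluster managed with the scripts shipped in the Flink distribution; for YARN, Kubernetes, or other deployments, follow your platform's restart procedure instead:

```sh
# Minimal sketch for a standalone cluster; run from <FLINK_HOME>.
./bin/stop-cluster.sh
./bin/start-cluster.sh
```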
Log collection is available for Agent versions >6.0.
- Flink uses the `log4j` logger by default. To enable logging to a file, customize the format by editing the `log4j*.properties` configuration files in the `conf/` directory of the Flink distribution. See the Flink logging documentation for information on which configuration file is relevant for your setup, and see Flink's repository for the default configurations.
- By default, the integration pipeline supports the following layout pattern:

  ```text
  %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
  ```

  An example of a valid timestamp is `2020-02-03 18:43:12,251`. Clone and edit the integration pipeline if you have a different format. A sketch of a log4j configuration that emits this pattern appears after this list.
- Collecting logs is disabled by default in the Datadog Agent. Enable it in your `datadog.yaml` file:

  ```yaml
  logs_enabled: true
  ```
- Uncomment and edit the logs configuration block in your `flink.d/conf.yaml` file. Change the `path` and `service` parameter values based on your environment. See the sample flink.d/conf.yaml for all available configuration options.

  ```yaml
  logs:
    - type: file
      path: /var/log/flink/server.log
      source: flink
      service: myapp
      # To handle multi-line logs that start with yyyy-mm-dd, use the following pattern:
      # log_processing_rules:
      #   - type: multi_line
      #     pattern: \d{4}\-(0?[1-9]|1[012])\-(0?[1-9]|[12][0-9]|3[01])
      #     name: new_log_start_with_date
  ```
Run the Agent's status subcommand and look for `flink` under the Checks section.
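For example, on a Linux host (the exact invocation varies by platform; see the Agent documentation for yours):

```sh
sudo datadog-agent status
```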
See metadata.csv for a list of metrics provided by this integration.
Flink does not include any service checks.
Flink does not include any events.
Need help? Contact Datadog support.