Datadog Collect Slice

Overview

Datadog is a monitoring service for cloud-scale applications, providing monitoring of servers, databases, tools, and services, through a SaaS-based data analytics platform.

The Datacoral Datadog slice collects data from a Datadog account and enables data flow of metrics into a data warehouse, such as Redshift.

Steps to add this slice to your installation

The steps to launch your slice are:

  1. Generate Datadog API keys
  2. Specify the slice config
  3. Add the Datadog slice

1. Generate Datadog API keys

Setup requirements

Before getting started, please make sure you have the following:

  • Access to an active Datadog account

Setup instructions

  1. Generate a new Application key
     a. Go to URL: https://app.datadoghq.com/account/settings#api
     b. Navigate to the "Application Keys" section
     c. Enter "Datacoral" as the App Key Name & click "Create Application Key"
     d. Copy the generated key
  2. Generate a new API key
     a. Go to the same URL: https://app.datadoghq.com/account/settings#api
     b. Navigate to the "API Keys" section
     c. Enter "Datacoral" as the API Key Name & click "Create API Key"
     d. Copy the generated key
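
You can optionally verify that the API key is accepted before moving on by calling Datadog's key-validation endpoint (YOUR_API_KEY below is a placeholder for the key you just copied):

curl -s -X GET "https://api.datadoghq.com/api/v1/validate" \
  -H "DD-API-KEY: YOUR_API_KEY"

A response of {"valid": true} indicates the API key works. Note that this checks only the API key; the Application key is used together with it when metrics are queried.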

2. Specify the slice config

To get a template for the Datadog slice configuration, save the output of the describe --input-parameters command to a file as follows:

datacoral collect describe --slice-type datadog \
--input-parameters > datadog_parameters_file.json

Necessary input parameters:

  • api_key - your Datadog API token
  • app_key - your Datadog Application key

Optional input parameters:

  • schedule - in cron format (note: you can specify different schedules for 'metric_list' and 'metadata_list' to query the metrics and metadata at different rates)

  • filterByMetric - array of metric names that defines which metrics will be collected; if absent, all active metrics will be queried

    Example templates:

  1. Collect all active metrics

{
  "api_key": "YOUR_API_KEY",
  "app_key": "YOUR_APP_KEY",
  "filterByMetric": []
}

  2. Collect only YOUR_METRIC_1 and YOUR_METRIC_2

{
  "api_key": "YOUR_API_KEY",
  "app_key": "YOUR_APP_KEY",
  "filterByMetric": ["YOUR_METRIC_1", "YOUR_METRIC_2"]
}

Modify the datadog_parameters_file.json file to add the api_key and app_key generated in Datadog.

3. Add the Datadog slice

datacoral collect add --slice-type datadog --slice-name <slice-name> --parameters-file <params-file>

  • slice-name - name of your slice. A schema with your slice-name is automatically created in your warehouse.
  • params-file - file path to your input parameters file, e.g. datadog_parameters_file.json
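
For example, assuming the slice is named datadog_metrics (an illustrative name) and the parameters file from step 2:

datacoral collect add --slice-type datadog \
--slice-name datadog_metrics \
--parameters-file datadog_parameters_file.json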

Supported load units

  • metric_list
  • metadata_list
  • metric
  • metadata

Notes

By default, the slice runs daily. If desired, you can change the slice configuration to specify different schedules for the metadata_list and metric_list loadunits, so that metadata and metric updates run at different rates.
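
The exact layout for per-loadunit schedules is not shown here; as a rough sketch, a single cron-style schedule applied to the whole slice might be added to the parameters file like this (the cron value is only an illustration, not a confirmed default):

{
  "api_key": "YOUR_API_KEY",
  "app_key": "YOUR_APP_KEY",
  "filterByMetric": [],
  "schedule": "0 */6 * * *"
}

In this sketch, 0 */6 * * * would run collection every six hours instead of the daily default.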

Slice output

Output of this slice is stored in S3 and Redshift.

AWS S3: Data stored in AWS S3 is partitioned by date and time in the following bucket: s3://datacoral-data-bucket/<sliceName>
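
To spot-check the landed files, you can list the slice prefix with the AWS CLI (the partition layout under the prefix may differ from installation to installation):

aws s3 ls s3://datacoral-data-bucket/<sliceName>/ --recursive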

AWS Redshift: Schema - the schema name will be the same as the slice-name. Tables produced by the slice are:

- schema.metadata_list
- schema.metadata
- schema.metric_list
- schema.metric
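
After the first successful run, one way to confirm these tables exist is to query Redshift's information_schema via psql (the connection details and slice name below are placeholders):

psql -h <redshift-endpoint> -U <user> -d <database> -c \
  "SELECT table_name FROM information_schema.tables WHERE table_schema = '<slice-name>';"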

Questions? Interested?

If you have questions or feedback, feel free to reach out at hello@datacoral.co or Request a demo