
Log pipelines

Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying.

  • See the [pipeline configuration page](https://app.datadoghq.com/logs/pipelines) for a list of the pipelines and processors currently configured in the UI.

  • For more information about the processors API, see the processors documentation.

  • To learn more about pipelines, see the pipelines documentation.

  • Note: These endpoints are only available for admin users. Make sure to use an application key created by an admin.

Grok parsing rules can affect the JSON output and require that the returned data be configured before it is used in a request. For example, if you use the data returned by a request in another request body, and you have a parsing rule that uses a regex such as \s for spaces, you must configure all escaped spaces as %{space} in order to use the data.
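
As an illustration, here is a minimal sketch of a grok-parser processor whose match rule uses %{space} instead of an escaped \s. The processor name, source attribute, and rule shown are hypothetical, not values required by the API:

{
  "type": "grok-parser",
  "name": "Parse access logs",
  "is_enabled": true,
  "source": "message",
  "samples": [],
  "grok": {
    "support_rules": "",
    "match_rules": "access_rule %{word:http.method}%{space}%{notSpace:http.url}"
  }
}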

Create a pipeline

POST https://api.datadoghq.com/api/v1/logs/config/pipelines
POST https://api.datadoghq.eu/api/v1/logs/config/pipelines

Overview

Create a pipeline in your organization.

Request

Body Data (required)

Definition of the new pipeline.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name [required]    string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "is_enabled": false,
  "name": "",
  "processors": []
}
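
For a body with a non-empty processors list, each entry is a full processor definition. A minimal sketch, assuming a log status remapper processor (the pipeline name and processor name here are illustrative):

{
  "filter": {
    "query": "source:python"
  },
  "is_enabled": true,
  "name": "Python service pipeline",
  "processors": [
    {
      "type": "status-remapper",
      "name": "Define level as the official status of the log",
      "is_enabled": true,
      "sources": ["level"]
    }
  ]
}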

Response

OK

Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying.

Note: These endpoints are only available for admin users. Make sure to use an application key created by an admin.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name [required]    string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "id": "string",
  "is_enabled": false,
  "is_read_only": false,
  "name": "",
  "processors": [],
  "type": "pipeline"
}

Bad Request

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Curl command
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" \
-d @- << EOF
{
  "name": ""
}
EOF
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    body := *datadog.NewLogsPipeline("Name_example") // LogsPipeline | Definition of the new pipeline.

    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.CreateLogsPipeline(ctx).Body(body).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.CreateLogsPipeline`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `CreateLogsPipeline`: LogsPipeline
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.CreateLogsPipeline`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        LogsPipeline body = new LogsPipeline(); // LogsPipeline | Definition of the new pipeline.
        try {
            LogsPipeline result = apiInstance.createLogsPipeline()
                .body(body)
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#createLogsPipeline");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Delete a pipeline

DELETE https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}
DELETE https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}

Overview

Delete a given pipeline from your organization. This endpoint takes no JSON arguments.

Arguments

Path parameters

Name                      Type      Description
pipeline_id [required]    string    ID of the pipeline to delete.

Response

OK

Bad Request

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}"
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    pipelineId := "pipelineId_example" // string | ID of the pipeline to delete.

    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    _, r, err := api_client.LogsPipelinesApi.DeleteLogsPipeline(ctx, pipelineId).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.DeleteLogsPipeline`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        String pipelineId = "pipelineId_example"; // String | ID of the pipeline to delete.
        try {
            apiInstance.deleteLogsPipeline(pipelineId)
                .execute();
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#deleteLogsPipeline");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Get a pipeline

GET https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}
GET https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}

Overview

Get a specific pipeline from your organization. This endpoint takes no JSON arguments.

Arguments

Path parameters

Name                      Type      Description
pipeline_id [required]    string    ID of the pipeline to get.

Response

OK

Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying.

Note: These endpoints are only available for admin users. Make sure to use an application key created by an admin.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name [required]    string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "id": "string",
  "is_enabled": false,
  "is_read_only": false,
  "name": "",
  "processors": [],
  "type": "pipeline"
}

Bad Request

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command
curl -X GET "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}"
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    pipelineId := "pipelineId_example" // string | ID of the pipeline to get.

    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.GetLogsPipeline(ctx, pipelineId).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.GetLogsPipeline`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `GetLogsPipeline`: LogsPipeline
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.GetLogsPipeline`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        String pipelineId = "pipelineId_example"; // String | ID of the pipeline to get.
        try {
            LogsPipeline result = apiInstance.getLogsPipeline(pipelineId)
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#getLogsPipeline");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Get all pipelines

GET https://api.datadoghq.com/api/v1/logs/config/pipelines
GET https://api.datadoghq.eu/api/v1/logs/config/pipelines

Overview

Get all pipelines from your organization. This endpoint takes no JSON arguments.

Réponse

OK

Array of pipeline objects.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name               string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "id": "string",
  "is_enabled": false,
  "is_read_only": false,
  "name": "",
  "processors": [],
  "type": "pipeline"
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Curl command
curl -X GET "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}"
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )


    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.ListLogsPipelines(ctx).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.ListLogsPipelines`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `ListLogsPipelines`: []LogsPipeline
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.ListLogsPipelines`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;
import java.util.List;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        try {
            List<LogsPipeline> result = apiInstance.listLogsPipelines()
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#listLogsPipelines");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Get pipeline order

GET https://api.datadoghq.com/api/v1/logs/config/pipeline-order
GET https://api.datadoghq.eu/api/v1/logs/config/pipeline-order

Overview

Get the current order of your pipelines. This endpoint takes no JSON arguments.

Réponse

OK

Object containing the ordered list of pipeline IDs.

Field                      Type        Description
pipeline_ids [required]    [string]    Ordered array of <PIPELINE_ID> strings; the order of the IDs in the array defines the overall pipeline order for Datadog.

{
  "pipeline_ids": [
    "tags",
    "org_ids",
    "products"
  ]
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Curl command
curl -X GET "https://api.datadoghq.com/api/v1/logs/config/pipeline-order" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}"
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )


    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.GetLogsPipelineOrder(ctx).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.GetLogsPipelineOrder`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `GetLogsPipelineOrder`: LogsPipelinesOrder
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.GetLogsPipelineOrder`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        try {
            LogsPipelinesOrder result = apiInstance.getLogsPipelineOrder()
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#getLogsPipelineOrder");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Update a pipeline

PUT https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}
PUT https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}

Overview

Update a given pipeline configuration to change its processors or their order.

Note: Using this method updates your pipeline configuration by replacing your current configuration with the new one sent to your Datadog organization.
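
Because the update replaces the stored configuration, a common pattern is to fetch the current definition, edit it locally, and send the complete object back. A minimal shell sketch of that pattern (the pipeline.json file name is illustrative):

# Fetch the current definition
export pipeline_id="CHANGE_ME"
curl -s -X GET "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" > pipeline.json

# ... edit pipeline.json (rename the pipeline, add, remove, or reorder processors) ...

# Send the full, edited definition back
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" \
-d @pipeline.json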

Arguments

Path parameters

Name                      Type      Description
pipeline_id [required]    string    ID of the pipeline to update.

Request

Body Data (required)

New definition of the pipeline.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name [required]    string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "is_enabled": false,
  "name": "",
  "processors": []
}

Response

OK

Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying.

Note: These endpoints are only available for admin users. Make sure to use an application key created by an admin.

Field              Type       Description
filter             object     Filter for logs.
filter.query       string     The filter query.
id                 string     ID of the pipeline.
is_enabled         boolean    Whether or not the pipeline is enabled.
is_read_only       boolean    Whether or not the pipeline can be edited.
name [required]    string     Name of the pipeline.
processors         [object]   Ordered list of processors in this pipeline.
type               string     Type of pipeline.

{
  "filter": {
    "query": "source:python"
  },
  "id": "string",
  "is_enabled": false,
  "is_read_only": false,
  "name": "",
  "processors": [],
  "type": "pipeline"
}

Bad Request

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Code example


# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" \
-d @- << EOF
{
  "name": ""
}
EOF
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    pipelineId := "pipelineId_example" // string | ID of the pipeline to delete.
    body := *datadog.NewLogsPipeline("Name_example") // LogsPipeline | New definition of the pipeline.

    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.UpdateLogsPipeline(ctx, pipelineId).Body(body).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.UpdateLogsPipeline`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `UpdateLogsPipeline`: LogsPipeline
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.UpdateLogsPipeline`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        String pipelineId = "pipelineId_example"; // String | ID of the pipeline to update.
        LogsPipeline body = new LogsPipeline(); // LogsPipeline | New definition of the pipeline.
        try {
            LogsPipeline result = apiInstance.updateLogsPipeline(pipelineId)
                .body(body)
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#updateLogsPipeline");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}

Update pipeline order

PUT https://api.datadoghq.com/api/v1/logs/config/pipeline-order
PUT https://api.datadoghq.eu/api/v1/logs/config/pipeline-order

Overview

Update the order of your pipelines. Since logs are processed sequentially, reordering a pipeline may change the structure and content of the data processed by other pipelines and their processors.

Note: Using the PUT method updates your pipeline order by replacing your current order with the new one sent to your Datadog organization.
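
Because the PUT replaces the entire order, retrieve the current list first, move the IDs you need, and send the complete list back; every existing pipeline ID should still appear in the list. A minimal shell sketch (the order.json file name is illustrative):

# Fetch the current order
curl -s -X GET "https://api.datadoghq.com/api/v1/logs/config/pipeline-order" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" > order.json

# ... reorder the entries of "pipeline_ids" in order.json, keeping every ID ...

# Send the complete reordered list back
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/pipeline-order" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" \
-d @order.json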

Request

Body Data (required)

Object containing the new ordered list of pipeline IDs.

Field                      Type        Description
pipeline_ids [required]    [string]    Ordered array of <PIPELINE_ID> strings; the order of the IDs in the array defines the overall pipeline order for Datadog.

{
  "pipeline_ids": [
    "tags",
    "org_ids",
    "products"
  ]
}

Response

OK

Object containing the ordered list of pipeline IDs.

Field                      Type        Description
pipeline_ids [required]    [string]    Ordered array of <PIPELINE_ID> strings; the order of the IDs in the array defines the overall pipeline order for Datadog.

{
  "pipeline_ids": [
    "tags",
    "org_ids",
    "products"
  ]
}

Bad Request

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Forbidden

Error response object.

Field                Type       Description
errors [required]    [string]   Array of errors returned by the API.

{
  "errors": [
    "Bad Request"
  ]
}

Unprocessable Entity

Response returned by the Logs API when errors occur.

Field            Type       Description
error            object     Error returned by the Logs API.
error.code       string     Code identifying the error.
error.details    [object]   Additional error details.
error.message    string     Error message.

{
  "error": {
    "code": "string",
    "details": [],
    "message": "string"
  }
}

Code example


# Curl command
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/pipeline-order" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_CLIENT_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_CLIENT_APP_KEY}" \
-d @- << EOF
{
  "pipeline_ids": [
    "tags",
    "org_ids",
    "products"
  ]
}
EOF
package main

import (
    "context"
    "fmt"
    "os"
    datadog "github.com/DataDog/datadog-api-client-go/api/v1/datadog"
)

func main() {
    ctx := context.WithValue(
        context.Background(),
        datadog.ContextAPIKeys,
        map[string]datadog.APIKey{
            "apiKeyAuth": {
                Key: os.Getenv("DD_CLIENT_API_KEY"),
            },
            "appKeyAuth": {
                Key: os.Getenv("DD_CLIENT_APP_KEY"),
            },
        },
    )

    body := *datadog.NewLogsPipelinesOrder([]string{"PipelineIds_example"}) // LogsPipelinesOrder | Object containing the new ordered list of pipeline IDs.

    configuration := datadog.NewConfiguration()
    api_client := datadog.NewAPIClient(configuration)
    resp, r, err := api_client.LogsPipelinesApi.UpdateLogsPipelineOrder(ctx).Body(body).Execute()
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.UpdateLogsPipelineOrder`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
    // response from `UpdateLogsPipelineOrder`: LogsPipelinesOrder
    fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.UpdateLogsPipelineOrder`: %v\n", resp)
}
// Import classes:
import com.datadog.api.v1.client.ApiClient;
import com.datadog.api.v1.client.ApiException;
import com.datadog.api.v1.client.Configuration;
import com.datadog.api.v1.client.auth.*;
import com.datadog.api.v1.client.model.*;
import com.datadog.api.v1.client.api.LogsPipelinesApi;
import java.util.HashMap;

public class Example {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        // Configure the Datadog site to send API calls to
        HashMap<String, String> serverVariables = new HashMap<String, String>();
        String site = System.getenv("DD_SITE");
        if (site != null) {
            serverVariables.put("site", site);
            defaultClient.setServerVariables(serverVariables);
        }
        // Configure API key authorization: 
        HashMap<String, String> secrets = new HashMap<String, String>();
        secrets.put("apiKeyAuth", System.getenv("DD_CLIENT_API_KEY"));
        secrets.put("appKeyAuth", System.getenv("DD_CLIENT_APP_KEY"));
        defaultClient.configureApiKeys(secrets);

        LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);
        LogsPipelinesOrder body = new LogsPipelinesOrder(); // LogsPipelinesOrder | Object containing the new ordered list of pipeline IDs.
        try {
            LogsPipelinesOrder result = apiInstance.updateLogsPipelineOrder()
                .body(body)
                .execute();
            System.out.println(result);
        } catch (ApiException e) {
            System.err.println("Exception when calling LogsPipelinesApi#updateLogsPipelineOrder");
            System.err.println("Status code: " + e.getCode());
            System.err.println("Reason: " + e.getResponseBody());
            System.err.println("Response headers: " + e.getResponseHeaders());
            e.printStackTrace();
        }
    }
}