---
title: aws_transfer_workflow
description: Reference for the aws_transfer_workflow resource in the Datadog Resource Catalog.
---

# aws_transfer_workflow{% #aws_transfer_workflow %}

## `account_id`{% #account_id %}

**Type**: `STRING`

## `arn`{% #arn %}

**Type**: `STRING`

**Provider name**: `Arn`

**Description**: Specifies the unique Amazon Resource Name (ARN) for the workflow.

## `description`{% #description %}

**Type**: `STRING`

**Provider name**: `Description`

**Description**: Specifies the text description for the workflow.

## `on_exception_steps`{% #on_exception_steps %}

**Type**: `UNORDERED_LIST_STRUCT`

**Provider name**: `OnExceptionSteps`

**Description**: Specifies the steps (actions) to take if errors are encountered during execution of the workflow.

- `copy_step_details`
  **Type**: `STRUCT`
  **Provider name**: `CopyStepDetails`
  **Description**: Details for a step that performs a file copy. Consists of the following values:
  - A description
  - An Amazon S3 location for the destination of the file copy.
  - A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`.

  - `destination_file_location`
    **Type**: `STRUCT`
    **Provider name**: `DestinationFileLocation`
    **Description**: Specifies the location for the file being copied. Use `${Transfer:UserName}` or `${Transfer:UploadDate}` in this field to parametrize the destination prefix by username or upload date.
    - Set the value of `DestinationFileLocation` to `${Transfer:UserName}` to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
    - Set the value of `DestinationFileLocation` to `${Transfer:UploadDate}` to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves `UploadDate` to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.

    - `efs_file_location`
      **Type**: `STRUCT`
      **Provider name**: `EfsFileLocation`
      **Description**: Specifies the details for the Amazon Elastic File System (Amazon EFS) file that's being copied or decrypted.
      - `file_system_id`
        **Type**: `STRING`
        **Provider name**: `FileSystemId`
        **Description**: The identifier of the file system, assigned by Amazon EFS.
      - `path`
        **Type**: `STRING`
        **Provider name**: `Path`
        **Description**: The pathname for the folder being used by a workflow.
    - `s3_file_location`
      **Type**: `STRUCT`
      **Provider name**: `S3FileLocation`
      **Description**: Specifies the details for the Amazon S3 file that's being copied or decrypted.
      - `bucket`
        **Type**: `STRING`
        **Provider name**: `Bucket`
        **Description**: Specifies the S3 bucket for the customer input file.
      - `key`
        **Type**: `STRING`
        **Provider name**: `Key`
        **Description**: The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `overwrite_existing`
    **Type**: `STRING`
    **Provider name**: `OverwriteExisting`
    **Description**: A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
    - If `OverwriteExisting` is `TRUE`, the existing file is replaced with the file being processed.
    - If `OverwriteExisting` is `FALSE`, nothing happens, and the workflow processing stops.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `custom_step_details`
  **Type**: `STRUCT`
  **Provider name**: `CustomStepDetails`
  **Description**: Details for a step that invokes a Lambda function. Consists of the Lambda function's name, target, and timeout (in seconds).
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
  - `target`
    **Type**: `STRING`
    **Provider name**: `Target`
    **Description**: The ARN for the Lambda function that is being called.
  - `timeout_seconds`
    **Type**: `INT32`
    **Provider name**: `TimeoutSeconds`
    **Description**: Timeout, in seconds, for the step.
- `decrypt_step_details`
  **Type**: `STRUCT`
  **Provider name**: `DecryptStepDetails`
  **Description**: Details for a step that decrypts an encrypted file. Consists of the following values:
  - A descriptive name
  - An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
  - An S3 or Amazon EFS location for the destination of the file decryption.
  - A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`.
  - The type of encryption that's used. Currently, only PGP encryption is supported.

  - `destination_file_location`
    **Type**: `STRUCT`
    **Provider name**: `DestinationFileLocation`
    **Description**: Specifies the location for the file being decrypted. Use `${Transfer:UserName}` or `${Transfer:UploadDate}` in this field to parametrize the destination prefix by username or upload date.
    - Set the value of `DestinationFileLocation` to `${Transfer:UserName}` to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
    - Set the value of `DestinationFileLocation` to `${Transfer:UploadDate}` to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves `UploadDate` to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.

    - `efs_file_location`
      **Type**: `STRUCT`
      **Provider name**: `EfsFileLocation`
      **Description**: Specifies the details for the Amazon Elastic File System (Amazon EFS) file that's being decrypted.
      - `file_system_id`
        **Type**: `STRING`
        **Provider name**: `FileSystemId`
        **Description**: The identifier of the file system, assigned by Amazon EFS.
      - `path`
        **Type**: `STRING`
        **Provider name**: `Path`
        **Description**: The pathname for the folder being used by a workflow.
    - `s3_file_location`
      **Type**: `STRUCT`
      **Provider name**: `S3FileLocation`
      **Description**: Specifies the details for the Amazon S3 file that's being copied or decrypted.
      - `bucket`
        **Type**: `STRING`
        **Provider name**: `Bucket`
        **Description**: Specifies the S3 bucket for the customer input file.
      - `key`
        **Type**: `STRING`
        **Provider name**: `Key`
        **Description**: The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `overwrite_existing`
    **Type**: `STRING`
    **Provider name**: `OverwriteExisting`
    **Description**: A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
    - If `OverwriteExisting` is `TRUE`, the existing file is replaced with the file being processed.
    - If `OverwriteExisting` is `FALSE`, nothing happens, and the workflow processing stops.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
  - `type`
    **Type**: `STRING`
    **Provider name**: `Type`
    **Description**: The type of encryption used. Currently, this value must be `PGP`.
- `delete_step_details`
  **Type**: `STRUCT`
  **Provider name**: `DeleteStepDetails`
  **Description**: Details for a step that deletes the file.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `tag_step_details`
  **Type**: `STRUCT`
  **Provider name**: `TagStepDetails`
  **Description**: Details for a step that creates one or more tags. Each tag contains a key-value pair.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `type`
  **Type**: `STRING`
  **Provider name**: `Type`
  **Description**: Currently, the following step types are supported:
  - `COPY` - Copy the file to another location.
  - `CUSTOM` - Perform a custom step with a Lambda function target.
  - `DECRYPT` - Decrypt a file that was encrypted before it was uploaded.
  - `DELETE` - Delete the file.
  - `TAG` - Add a tag to the file.
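As a sketch of how the exception-step structure above fits together, the following Python snippet builds an `OnExceptionSteps`-shaped payload (plain dicts, using the provider names documented above) that deletes the original upload when a workflow step fails. The small validator is illustrative only, not an official API; it simply checks that each step's `Type` matches the details struct it carries.

```python
# Maps each supported step type to the details struct it requires.
REQUIRED_DETAILS = {
    "COPY": "CopyStepDetails",
    "CUSTOM": "CustomStepDetails",
    "DECRYPT": "DecryptStepDetails",
    "DELETE": "DeleteStepDetails",
    "TAG": "TagStepDetails",
}

def validate_steps(steps):
    """Check that each step's Type matches the details struct it carries."""
    for step in steps:
        details_key = REQUIRED_DETAILS[step["Type"]]
        if details_key not in step:
            raise ValueError(f"step of type {step['Type']} is missing {details_key}")
    return True

# Example: on error, delete the partially processed file.
on_exception_steps = [
    {
        "Type": "DELETE",
        "DeleteStepDetails": {
            "Name": "cleanup-on-error",
            # ${original.file} targets the originally uploaded file
            # rather than the previous step's output.
            "SourceFileLocation": "${original.file}",
        },
    }
]

validate_steps(on_exception_steps)
```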

## `steps`{% #steps %}

**Type**: `UNORDERED_LIST_STRUCT`

**Provider name**: `Steps`

**Description**: Specifies the details for the steps that are in the specified workflow.

- `copy_step_details`
  **Type**: `STRUCT`
  **Provider name**: `CopyStepDetails`
  **Description**: Details for a step that performs a file copy. Consists of the following values:
  - A description
  - An Amazon S3 location for the destination of the file copy.
  - A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`.

  - `destination_file_location`
    **Type**: `STRUCT`
    **Provider name**: `DestinationFileLocation`
    **Description**: Specifies the location for the file being copied. Use `${Transfer:UserName}` or `${Transfer:UploadDate}` in this field to parametrize the destination prefix by username or upload date.
    - Set the value of `DestinationFileLocation` to `${Transfer:UserName}` to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
    - Set the value of `DestinationFileLocation` to `${Transfer:UploadDate}` to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves `UploadDate` to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.

    - `efs_file_location`
      **Type**: `STRUCT`
      **Provider name**: `EfsFileLocation`
      **Description**: Specifies the details for the Amazon Elastic File System (Amazon EFS) file that's being copied or decrypted.
      - `file_system_id`
        **Type**: `STRING`
        **Provider name**: `FileSystemId`
        **Description**: The identifier of the file system, assigned by Amazon EFS.
      - `path`
        **Type**: `STRING`
        **Provider name**: `Path`
        **Description**: The pathname for the folder being used by a workflow.
    - `s3_file_location`
      **Type**: `STRUCT`
      **Provider name**: `S3FileLocation`
      **Description**: Specifies the details for the Amazon S3 file that's being copied or decrypted.
      - `bucket`
        **Type**: `STRING`
        **Provider name**: `Bucket`
        **Description**: Specifies the S3 bucket for the customer input file.
      - `key`
        **Type**: `STRING`
        **Provider name**: `Key`
        **Description**: The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `overwrite_existing`
    **Type**: `STRING`
    **Provider name**: `OverwriteExisting`
    **Description**: A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
    - If `OverwriteExisting` is `TRUE`, the existing file is replaced with the file being processed.
    - If `OverwriteExisting` is `FALSE`, nothing happens, and the workflow processing stops.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `custom_step_details`
  **Type**: `STRUCT`
  **Provider name**: `CustomStepDetails`
  **Description**: Details for a step that invokes a Lambda function. Consists of the Lambda function's name, target, and timeout (in seconds).
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
  - `target`
    **Type**: `STRING`
    **Provider name**: `Target`
    **Description**: The ARN for the Lambda function that is being called.
  - `timeout_seconds`
    **Type**: `INT32`
    **Provider name**: `TimeoutSeconds`
    **Description**: Timeout, in seconds, for the step.
- `decrypt_step_details`
  **Type**: `STRUCT`
  **Provider name**: `DecryptStepDetails`
  **Description**: Details for a step that decrypts an encrypted file. Consists of the following values:
  - A descriptive name
  - An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
  - An S3 or Amazon EFS location for the destination of the file decryption.
  - A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`.
  - The type of encryption that's used. Currently, only PGP encryption is supported.

  - `destination_file_location`
    **Type**: `STRUCT`
    **Provider name**: `DestinationFileLocation`
    **Description**: Specifies the location for the file being decrypted. Use `${Transfer:UserName}` or `${Transfer:UploadDate}` in this field to parametrize the destination prefix by username or upload date.
    - Set the value of `DestinationFileLocation` to `${Transfer:UserName}` to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
    - Set the value of `DestinationFileLocation` to `${Transfer:UploadDate}` to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves `UploadDate` to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.

    - `efs_file_location`
      **Type**: `STRUCT`
      **Provider name**: `EfsFileLocation`
      **Description**: Specifies the details for the Amazon Elastic File System (Amazon EFS) file that's being decrypted.
      - `file_system_id`
        **Type**: `STRING`
        **Provider name**: `FileSystemId`
        **Description**: The identifier of the file system, assigned by Amazon EFS.
      - `path`
        **Type**: `STRING`
        **Provider name**: `Path`
        **Description**: The pathname for the folder being used by a workflow.
    - `s3_file_location`
      **Type**: `STRUCT`
      **Provider name**: `S3FileLocation`
      **Description**: Specifies the details for the Amazon S3 file that's being copied or decrypted.
      - `bucket`
        **Type**: `STRING`
        **Provider name**: `Bucket`
        **Description**: Specifies the S3 bucket for the customer input file.
      - `key`
        **Type**: `STRING`
        **Provider name**: `Key`
        **Description**: The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `overwrite_existing`
    **Type**: `STRING`
    **Provider name**: `OverwriteExisting`
    **Description**: A flag that indicates whether to overwrite an existing file of the same name. The default is `FALSE`. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
    - If `OverwriteExisting` is `TRUE`, the existing file is replaced with the file being processed.
    - If `OverwriteExisting` is `FALSE`, nothing happens, and the workflow processing stops.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
  - `type`
    **Type**: `STRING`
    **Provider name**: `Type`
    **Description**: The type of encryption used. Currently, this value must be `PGP`.
- `delete_step_details`
  **Type**: `STRUCT`
  **Provider name**: `DeleteStepDetails`
  **Description**: Details for a step that deletes the file.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `tag_step_details`
  **Type**: `STRUCT`
  **Provider name**: `TagStepDetails`
  **Description**: Details for a step that creates one or more tags. Each tag contains a key-value pair.
  - `name`
    **Type**: `STRING`
    **Provider name**: `Name`
    **Description**: The name of the step, used as an identifier.
  - `source_file_location`
    **Type**: `STRING`
    **Provider name**: `SourceFileLocation`
    **Description**: Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
    - To use the previous file as the input, enter `${previous.file}`. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
    - To use the originally uploaded file location as input for this step, enter `${original.file}`.
- `type`
  **Type**: `STRING`
  **Provider name**: `Type`
  **Description**: Currently, the following step types are supported:
  - `COPY` - Copy the file to another location.
  - `CUSTOM` - Perform a custom step with a Lambda function target.
  - `DECRYPT` - Decrypt a file that was encrypted before it was uploaded.
  - `DELETE` - Delete the file.
  - `TAG` - Add a tag to the file.
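To show how the fields above compose into a concrete `Steps` payload, the following Python sketch copies each upload into a per-user S3 prefix and then tags the result. The bucket name and tag values are hypothetical placeholders; the dict keys are the provider names documented above.

```python
# Hypothetical two-step workflow payload: COPY into a per-user prefix, then TAG.
steps = [
    {
        "Type": "COPY",
        "CopyStepDetails": {
            "Name": "copy-to-user-prefix",
            "DestinationFileLocation": {
                "S3FileLocation": {
                    "Bucket": "example-archive-bucket",      # hypothetical bucket
                    "Key": "archive/${Transfer:UserName}/",  # resolves per uploading user
                }
            },
            "OverwriteExisting": "FALSE",  # default: stop rather than replace
            "SourceFileLocation": "${original.file}",
        },
    },
    {
        "Type": "TAG",
        "TagStepDetails": {
            "Name": "tag-archived",
            "SourceFileLocation": "${previous.file}",  # output of the COPY step
            "Tags": [{"Key": "status", "Value": "archived"}],
        },
    },
]

# With boto3, a payload like this would be passed to Transfer Family as, for example:
#   import boto3
#   boto3.client("transfer").create_workflow(
#       Description="archive uploads", Steps=steps)
```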

## `tags`{% #tags %}

**Type**: `UNORDERED_LIST_STRING`

## `workflow_id`{% #workflow_id %}

**Type**: `STRING`

**Provider name**: `WorkflowId`

**Description**: A unique identifier for the workflow.
