Use Observability Pipelines’ HTTP/S Client source to pull logs from the upstream HTTP/S server. Select and set up this source when you set up a pipeline.

Prerequisites

To use Observability Pipelines’ HTTP/S Client source, you need the following information available:

  1. The full path of the HTTP Server endpoint that the Observability Pipelines Worker collects log events from. For example, https://127.0.0.8/logs.
  2. The HTTP authentication token or password.

The HTTP/S Client source pulls data from your upstream HTTP server. Your HTTP server must support GET requests for the HTTP Client endpoint URL that you set as an environment variable when you install the Worker.
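As a concrete illustration of the requirement above, the sketch below is a minimal upstream HTTP server that answers GET requests with newline-delimited JSON log events. The `/logs` path, port, and payload are assumptions for the example, not part of the Observability Pipelines product:

```python
# Minimal sketch of an upstream HTTP server the Worker could poll.
# The /logs path and the NDJSON payload are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LOGS = [
    {"timestamp": "2024-01-01T00:00:00Z", "message": "service started"},
    {"timestamp": "2024-01-01T00:00:05Z", "message": "request handled"},
]

class LogHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/logs":
            self.send_error(404)
            return
        # Newline-delimited JSON: one log event per line.
        body = "\n".join(json.dumps(event) for event in LOGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/x-ndjson")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(host="127.0.0.1", port=8080):
    """Block forever, serving GET /logs on the given address."""
    HTTPServer((host, port), LogHandler).serve_forever()
```

With a server like this running, the Worker's HTTP Client endpoint URL environment variable would point at the full path, for example http://127.0.0.1:8080/logs.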

Set up the source in the pipeline UI

Select and set up this source when you set up a pipeline. The information below is for the source settings in the pipeline UI.

To configure your HTTP/S Client source:

  1. Select your authorization strategy.
  2. Select the decoder you want to use on the HTTP messages. Logs pulled from the HTTP source must be in this format.
  3. Optionally, toggle the switch to enable TLS. If you enable TLS, the following certificate and key files are required:
    • Server Certificate Path: The path to the certificate file that has been signed by your Certificate Authority (CA) Root File in DER or PEM (X.509) format.
    • CA Certificate Path: The path to the certificate file that is your Certificate Authority (CA) Root File in DER or PEM (X.509) format.
    • Private Key Path: The path to the .key private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
  4. Enter the interval between scrapes.
    • Your HTTP Server must be able to handle GET requests at this interval.
    • Because requests run concurrently, if a scrape takes longer than the given interval, a new scrape starts before the previous one finishes, which can consume extra resources. To prevent this, set the timeout to a value lower than the scrape interval.
  5. Enter the timeout for each scrape request.
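The interval-and-timeout interaction described in steps 4 and 5 can be sketched as a simple polling loop. The interval and timeout values, URL, and function names below are illustrative assumptions, not Worker internals; the point is only that the per-request timeout caps a slow scrape so it cannot outlive the interval:

```python
# Illustrative sketch of the scrape timing described above (not Worker code):
# a GET is issued every SCRAPE_INTERVAL seconds, and each request is capped
# by SCRAPE_TIMEOUT so a slow scrape is abandoned before the next one starts.
import time
import urllib.error
import urllib.request

SCRAPE_INTERVAL = 15  # seconds between scrapes (assumed value)
SCRAPE_TIMEOUT = 10   # per-request timeout; keep it below the interval

def scrape_once(url, timeout=SCRAPE_TIMEOUT):
    """Fetch one batch of logs; a timeout or network error yields None."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except (TimeoutError, urllib.error.URLError):
        return None  # slow or failed scrape abandoned

def scrape_loop(url):
    """Poll the upstream server on a fixed interval."""
    while True:
        started = time.monotonic()
        scrape_once(url)
        # Sleep out the remainder of the interval before the next scrape.
        time.sleep(max(0.0, SCRAPE_INTERVAL - (time.monotonic() - started)))
```

Keeping `SCRAPE_TIMEOUT` below `SCRAPE_INTERVAL` guarantees at most one request is in flight at a time, which is the behavior the guidance above aims for.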