Getting Started with Notebooks Analysis Features

Join the Preview!

Advanced Analysis is in Preview. To enable it, reach out to your Customer Success Manager.

Overview

The workspace datasets

This example notebook walks through a multi-step analysis that transforms and visualizes log and reference data. It begins by importing several data sources, then processes and enriches them through parsing, joins, and transformations to build derived datasets. The walkthrough concludes with a visualization that highlights the final results, helping illustrate the full data journey from raw logs to structured insights.

This diagram shows how the data flows through each transformation and analysis step.

A flowchart showing the steps that the data sources go through

Step-by-step walkthrough of building the analysis

1. Importing data sources

The analysis begins with two primary log data sources:

  • trade_start_logs: Contains information about trade initiation
  • trade_execution_logs: Contains details about trade execution

2. Extracting transaction IDs

The first transformation uses a transform cell to create parsed_execution_logs. This cell applies a grok parsing rule to extract transaction IDs from the message column of trade_execution_logs, creating a new transaction_id column:

transaction %{notSpace:transaction_id}

An example of the resulting parsed_execution_logs dataset:

timestamp | host | message | transaction_id
May 29 11:09:28.000 | shopist.internal | Executing trade for transaction 56519 | 56519
May 29 10:59:29.000 | shopist.internal | Executing trade for transaction 23269 | 23269
May 29 10:58:54.000 | shopist.internal | Executing trade for transaction 96870 | 96870
May 31 12:20:01.152 | shopist.internal | Executing trade for transaction 80207 | 80207
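Outside of a notebook, the same extraction can be approximated with a plain regular expression. This is a minimal Python sketch, not the product's grok engine; `\S+` plays the role of `%{notSpace}`, and the sample message comes from the table above:

```python
import re

# Approximates the grok rule `transaction %{notSpace:transaction_id}`:
# %{notSpace:...} captures a run of non-whitespace characters.
PATTERN = re.compile(r"transaction (?P<transaction_id>\S+)")

def parse_transaction_id(message: str):
    """Return the transaction_id extracted from a log message, or None."""
    match = PATTERN.search(message)
    return match.group("transaction_id") if match else None

print(parse_transaction_id("Executing trade for transaction 56519"))  # 56519
```

Messages that never mention a transaction simply yield no match, mirroring how unparsed rows would lack a transaction_id value.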

3. Joining trade start and execution logs

The next step uses an analysis cell to create transaction_record. This SQL query selects specific columns from both datasets, transforms the status field (converting 'INFO' to 'OK'), and joins the datasets on transaction_id:

SELECT
    start_logs.timestamp,
    start_logs.customer_id,
    start_logs.transaction_id,
    start_logs.dollar_value,
    CASE
        WHEN executed_logs.status = 'INFO' THEN 'OK'
        ELSE executed_logs.status
    END AS status
FROM
    trade_start_logs AS start_logs
JOIN
    trade_execution_logs AS executed_logs
ON
    start_logs.transaction_id = executed_logs.transaction_id;

An example of the resulting transaction_record dataset:

timestamp | customer_id | transaction_id | dollar_value | status
May 29 11:09:28.000 | 92446 | 085cc56c-a54f | 838.32 | OK
May 29 10:59:29.000 | 78037 | b1fad476-fd4f | 479.96 | OK
May 29 10:58:54.000 | 47694 | cb23d1a7-c0cb | 703.71 | OK
May 31 12:20:01.152 | 80207 | 2c75b835-4194 | 386.21 | ERROR
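To see how the CASE mapping and the inner join behave, the query can be reproduced against an in-memory SQLite database. This is an illustrative sketch with made-up rows shaped like the tables above, not the notebook's execution engine:

```python
import sqlite3

# In-memory sketch of the join and CASE mapping; the rows are
# illustrative stand-ins, not real trade data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trade_start_logs (
    timestamp TEXT, customer_id TEXT, transaction_id TEXT, dollar_value REAL);
CREATE TABLE trade_execution_logs (transaction_id TEXT, status TEXT);
INSERT INTO trade_start_logs VALUES
  ('May 29 11:09:28.000', '92446', '085cc56c-a54f', 838.32),
  ('May 31 12:20:01.152', '80207', '2c75b835-4194', 386.21);
INSERT INTO trade_execution_logs VALUES
  ('085cc56c-a54f', 'INFO'),
  ('2c75b835-4194', 'ERROR');
""")

rows = conn.execute("""
SELECT start_logs.timestamp,
       start_logs.customer_id,
       start_logs.transaction_id,
       start_logs.dollar_value,
       CASE WHEN executed_logs.status = 'INFO' THEN 'OK'
            ELSE executed_logs.status
       END AS status
FROM trade_start_logs AS start_logs
JOIN trade_execution_logs AS executed_logs
  ON start_logs.transaction_id = executed_logs.transaction_id
""").fetchall()

for row in rows:
    print(row)  # 'INFO' rows come out as 'OK'; other statuses pass through
```

Because this is an inner JOIN, only transactions present in both log sources appear in the result.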

Then the reference table trading_platform_users is added as a data source:

customer_name | customer_id | account_status
Meghan Key | 92446 | verified
Anthony Gill | 78037 | verified
Tanya Mejia | 47694 | verified
Michael Kaiser | 80207 | fraudulent

4. Enriching transaction data with customer information

The analysis cell transaction_record_with_names uses SQL to join the transaction data with customer information. This query selects columns from both datasets, enriching the transaction records with customer names and account status:

SELECT tr.timestamp, tr.customer_id, tpu.customer_name, tpu.account_status, tr.transaction_id, tr.dollar_value, tr.status
FROM transaction_record AS tr
LEFT JOIN trading_platform_users AS tpu ON tr.customer_id = tpu.customer_id;

An example of the resulting transaction_record_with_names dataset:

timestamp | customer_id | customer_name | account_status | transaction_id | dollar_value | status
May 29 11:09:28.000 | 92446 | Meghan Key | verified | 085cc56c-a54f | 838.32 | OK
May 29 10:59:29.000 | 78037 | Anthony Gill | verified | b1fad476-fd4f | 479.96 | OK
May 29 10:58:54.000 | 47694 | Tanya Mejia | verified | cb23d1a7-c0cb | 703.71 | OK
May 31 12:20:01.152 | 80207 | Michael Kaiser | fraudulent | 2c75b835-4194 | 386.21 | ERROR
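A LEFT JOIN keeps every transaction even when no matching customer exists in the reference table. This SQLite sketch uses a hypothetical unmatched customer_id (99999) to show the NULL-filled result; the identifiers are invented for illustration:

```python
import sqlite3

# Sketch of the enrichment LEFT JOIN. The second transaction references a
# hypothetical customer_id with no entry in the reference table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transaction_record (
    timestamp TEXT, customer_id TEXT, transaction_id TEXT,
    dollar_value REAL, status TEXT);
CREATE TABLE trading_platform_users (
    customer_name TEXT, customer_id TEXT, account_status TEXT);
INSERT INTO transaction_record VALUES
  ('May 29 11:09:28.000', '92446', '085cc56c-a54f', 838.32, 'OK'),
  ('May 29 12:00:00.000', '99999', 'ffffffff-0000', 10.00, 'OK');
INSERT INTO trading_platform_users VALUES ('Meghan Key', '92446', 'verified');
""")

rows = conn.execute("""
SELECT tr.timestamp, tr.customer_id, tpu.customer_name, tpu.account_status,
       tr.transaction_id, tr.dollar_value, tr.status
FROM transaction_record AS tr
LEFT JOIN trading_platform_users AS tpu ON tr.customer_id = tpu.customer_id
""").fetchall()

for row in rows:
    print(row)  # the unmatched transaction keeps NULL name/account_status
```

An inner JOIN here would silently drop transactions from unknown customers, which is usually the wrong behavior for an audit-style enrichment.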

5. Visualizing error transactions

Finally, a treemap visualization cell is created to identify problematic transactions. The visualization uses the transaction_record_with_names dataset with a filter for status:ERROR logs and groups the data by dollar_value, account_status, and customer_name, making it easy to spot patterns in failed transactions.
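The filter-and-group step behind the treemap can be sketched in plain Python. The rows below are illustrative stand-ins shaped like transaction_record_with_names, not output from the product:

```python
from collections import defaultdict

# Illustrative rows shaped like transaction_record_with_names.
rows = [
    {"customer_name": "Meghan Key", "account_status": "verified",
     "dollar_value": 838.32, "status": "OK"},
    {"customer_name": "Michael Kaiser", "account_status": "fraudulent",
     "dollar_value": 386.21, "status": "ERROR"},
]

# Filter for status:ERROR, then aggregate dollar_value by the
# treemap grouping dimensions (account_status, customer_name).
errors = [r for r in rows if r["status"] == "ERROR"]
groups = defaultdict(float)
for r in errors:
    groups[(r["account_status"], r["customer_name"])] += r["dollar_value"]

for key, total in groups.items():
    print(key, total)  # ('fraudulent', 'Michael Kaiser') 386.21
```

Each group corresponds to one rectangle in the treemap, sized by the aggregated dollar value of its failed transactions.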

The workspace datasets

Further reading

Additional helpful documentation, links, and articles: