Metrics Explorer

Query, visualize, and analyze your AI agent metrics interactively.

Overview

The Metrics Explorer lets you build custom queries to analyze your agent data. Create visualizations, compare time periods, and export data for further analysis.

Accessing Metrics Explorer

Navigate to Analysis → Explore to open the Metrics Explorer.

Building Queries

Select Metric

Choose from available metrics:

  • latency_ms - Execution duration in milliseconds
  • input_tokens - Input token count
  • output_tokens - Output token count
  • total_tokens - Combined input and output tokens
  • cost_usd - Estimated cost in USD
  • error_count - Number of errors
  • Custom KPI metrics

Aggregation

Choose how to aggregate values:

  • avg - Average value
  • sum - Total sum
  • count - Number of data points
  • min / max - Minimum and maximum values
  • p50 / p95 / p99 - 50th, 95th, and 99th percentiles
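The percentile aggregations can be illustrated with a short sketch. The `percentile` function below uses linear interpolation between neighboring ranks, which is one common convention; actual implementations vary.

```python
def percentile(values, p):
    """Return the p-th percentile (0-100) using linear interpolation."""
    xs = sorted(values)
    if not xs:
        raise ValueError("no data points")
    k = (len(xs) - 1) * p / 100              # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Made-up latency samples in milliseconds
latencies = [120, 95, 210, 180, 90, 400, 150, 130, 110, 170]
print(sum(latencies) / len(latencies))       # avg
print(percentile(latencies, 50))             # p50 (median)
print(percentile(latencies, 95))             # p95
```

Note how the p95 value sits far above the average: percentiles surface tail latency that averages hide.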

Group By

Break down by dimensions:

  • workflow_id - By workflow
  • model - By LLM model
  • status - By success/error
  • environment - By environment
  • Custom labels

Time Range

Select the time period:

  • Last 1 hour, 6 hours, 24 hours
  • Last 7 days, 30 days
  • Custom date range
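Putting the four steps together, a query is essentially a small structured object: one metric, one aggregation, optional group-by dimensions, and a time range. The shape below is a hypothetical illustration; the field names are assumptions for clarity, not the product's actual API.

```python
# Hypothetical query object; field names are assumptions, not a real schema.
query = {
    "metric": "latency_ms",          # what to measure
    "aggregation": "p95",            # how to aggregate
    "group_by": ["workflow_id"],     # dimensions to break down by
    "time_range": {"last": "24h"},   # period to query
}
print(query["metric"], query["aggregation"])
```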

Visualization Types

  • Time Series - Line chart over time
  • Bar Chart - Categorical comparison
  • Table - Raw data view
  • Heatmap - Time-based patterns
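The heatmap view reveals time-based patterns by counting events per time-of-day cell. A minimal sketch of that bucketing, assuming a (weekday, hour) grid and made-up Unix timestamps:

```python
from collections import Counter
from datetime import datetime, timezone

def heatmap_buckets(timestamps):
    """Count events per (weekday, hour) cell; weekday 0 = Monday."""
    cells = Counter()
    for ts in timestamps:
        dt = datetime.fromtimestamp(ts, tz=timezone.utc)
        cells[(dt.weekday(), dt.hour)] += 1
    return cells

# Three made-up events: two land in the same weekday/hour cell.
events = [1700000000, 1700000100, 1700090000]
print(heatmap_buckets(events))
```

Plotting cell counts as color intensity on the weekday/hour grid gives the heatmap.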

Example Queries

Average Latency by Workflow

Metric: latency_ms
Aggregation: avg
Group By: workflow_id
Time Range: Last 24 hours

Token Usage Over Time

Metric: total_tokens
Aggregation: sum
Group By: (none)
Time Range: Last 7 days
Interval: 1 hour
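The Interval setting controls the bucket width of the resulting time series: each point's value is summed into the fixed-width bucket its timestamp falls in. A minimal sketch of that bucketing, with timestamps as Unix seconds and made-up token counts:

```python
from collections import defaultdict

INTERVAL = 3600  # 1 hour in seconds

def bucket_sum(points, interval=INTERVAL):
    """Sum (timestamp, value) points into fixed-width time buckets."""
    buckets = defaultdict(int)
    for ts, value in points:
        buckets[ts - ts % interval] += value   # floor to bucket start
    return dict(sorted(buckets.items()))

points = [(100, 500), (1800, 700), (3700, 300), (7300, 250)]
print(bucket_sum(points))   # {0: 1200, 3600: 300, 7200: 250}
```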

Error Rate by Model

Metric: error_count
Aggregation: count
Group By: model
Filter: status = "error"
Time Range: Last 24 hours
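The query above counts error events per model; turning that count into a rate means dividing by the total event count for the same group. A sketch of that final step, using made-up per-model counts:

```python
# Hypothetical per-model counts for illustration only.
errors_by_model = {"gpt-4o": 12, "claude-sonnet": 3}
totals_by_model = {"gpt-4o": 1200, "claude-sonnet": 600}

# error rate = errors / total events, per model
error_rate = {
    model: errors_by_model.get(model, 0) / total
    for model, total in totals_by_model.items()
}
print(error_rate)   # {'gpt-4o': 0.01, 'claude-sonnet': 0.005}
```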

Saving & Sharing

  • Save Query - Save for later use
  • Add to Dashboard - Create a dashboard widget
  • Export Data - Download as CSV
  • Share Link - Share query URL with team
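Exported CSV data can be post-processed with standard tooling. A sketch using only the Python standard library; the column names below are assumptions about the export format, not a guaranteed schema.

```python
import csv
import io

# Sample rows in the shape an export might take (column names assumed).
raw = """workflow_id,latency_ms,total_tokens
checkout,120,450
checkout,180,520
search,95,210
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Group latencies by workflow and compute the average per group.
by_workflow = {}
for row in rows:
    by_workflow.setdefault(row["workflow_id"], []).append(int(row["latency_ms"]))

avg_latency = {wf: sum(v) / len(v) for wf, v in by_workflow.items()}
print(avg_latency)   # {'checkout': 150.0, 'search': 95.0}
```

The same pattern works on a real export: replace `io.StringIO(raw)` with an open file handle.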

Next Steps