[HDX-3919] Add @hyperdx/cli package — terminal TUI, source map upload, and agent-friendly commands (#2043)

## Summary

Adds `packages/cli` (`@hyperdx/cli`) — a unified CLI for HyperDX that provides an interactive TUI for searching/tailing logs and traces, source map upload (migrated from `hyperdx-js`), and agent-friendly commands for programmatic access.

Ref: HDX-3919
Ref: HDX-3920

### CLI Commands

```
hdx tui -s <url>                    # Interactive TUI (main command)
hdx sources -s <url>                # List sources with ClickHouse schemas
hdx sources -s <url> --json         # JSON output for agents / scripts
hdx dashboards -s <url>             # List dashboards with tile summaries
hdx dashboards -s <url> --json      # JSON output for agents / scripts
hdx query --source "Logs" --sql "SELECT count() FROM default.otel_logs"
hdx upload-sourcemaps -k <key>      # Upload source maps (ported from hyperdx-js)
hdx auth login -s <url>             # Sign in (interactive or -e/-p flags)
hdx auth status                     # Show auth status
hdx auth logout                     # Clear saved session
```

### Key Features

**Interactive TUI (`hdx tui`)**
- Table view with dynamic columns derived from query results (percentage-based widths)
- Follow mode (live tail) enabled by default, auto-pauses when detail panel is open
- Vim-like keybindings: j/k navigation, G/g jump, Ctrl+D/U half-page scroll
- `/` for Lucene search, `s` to edit SELECT clause in `$EDITOR`, `t` to edit time range
- Tab to cycle between sources and saved searches
- `w` to toggle line wrap

**Detail Panel (3 tabs, full-screen with Ctrl+D/U scrolling)**
- **Overview** — Structured view: Top Level Attributes, Log/Span Attributes, Resource Attributes
- **Column Values** — Full `SELECT *` row data with `__hdx_*` aliased columns
- **Trace** — Waterfall chart (port of `DBTraceWaterfallChart` DAG builder) with correlated log events, j/k span navigation, inverse highlight, Event Details section

**Agent-friendly commands**
- `hdx sources --json` — Full source metadata with ClickHouse `CREATE TABLE` DDL, expression mappings, and correlated source IDs. Detailed `--help` describes the JSON schema for LLM consumption. Schema queries run in parallel.
- `hdx dashboards --json` — Dashboard metadata with simplified tile summaries (name, type, source, sql). Resolves source names for human-readable output.
- `hdx query --source <name> --sql <query>` — Raw SQL execution against any source's ClickHouse connection. Supports `--format` for ClickHouse output formats (JSON, JSONEachRow, CSV, etc.).

**Source map upload (`hdx upload-sourcemaps`)**
- Ported from `hyperdx-js/packages/cli` to consolidate on a single `@hyperdx/cli` package
- Authenticates via service account API key (`-k` / `HYPERDX_SERVICE_KEY` env var)
- Globs `.js` and `.js.map` files, handles Next.js route groups
- Uploads to presigned URLs in parallel with retry (3 attempts) and progress
- Modernized: native `fetch` (Node 22+), ESM-compatible, proper TypeScript types
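The parallel-upload-with-retry behavior described above can be sketched roughly as follows. This is a hedged illustration only: `withRetry` is a hypothetical helper, not the CLI's actual uploader code.

```typescript
// Hypothetical sketch of the "retry (3 attempts)" upload behavior described
// above. Not the actual @hyperdx/cli implementation.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 1; i <= attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // the real uploader also reports progress between tries
    }
  }
  throw lastError;
}
```

Each file upload (to its presigned URL) would be wrapped in `withRetry` and the wrapped promises awaited in parallel with `Promise.all`.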

### Architecture

```
packages/cli/
├── src/
│   ├── cli.tsx              # Commander CLI: tui, sources, dashboards, query,
│   │                        #   upload-sourcemaps, auth
│   ├── App.tsx              # Ink app shell (login → source picker → EventViewer)
│   ├── sourcemaps.ts        # Source map upload logic (ported from hyperdx-js)
│   ├── api/
│   │   ├── client.ts        # ApiClient + ProxyClickhouseClient
│   │   └── eventQuery.ts    # Query builders (renderChartConfig, raw SQL)
│   ├── components/
│   │   ├── EventViewer/     # Main TUI (9 files)
│   │   │   ├── EventViewer.tsx    # Orchestrator (state, hooks, render shell)
│   │   │   ├── types.ts           # Shared types & constants
│   │   │   ├── utils.ts           # Row formatting functions
│   │   │   ├── SubComponents.tsx  # Header, TabBar, SearchBar, Footer, HelpScreen, TableHeader
│   │   │   ├── TableView.tsx      # Table rows rendering
│   │   │   ├── DetailPanel.tsx    # Detail panel (overview/columns/trace tabs)
│   │   │   ├── useEventData.ts    # Data fetching hook
│   │   │   └── useKeybindings.ts  # Input handler hook
│   │   ├── TraceWaterfall/  # Trace chart (6 files)
│   │   │   ├── TraceWaterfall.tsx  # Orchestrator + render
│   │   │   ├── types.ts           # SpanRow, SpanNode, props
│   │   │   ├── utils.ts           # Duration/status/bar helpers
│   │   │   ├── buildTree.ts       # DAG builder (port of DBTraceWaterfallChart)
│   │   │   └── useTraceData.ts    # Data fetching hook
│   │   ├── RowOverview.tsx
│   │   ├── ColumnValues.tsx
│   │   ├── LoginForm.tsx
│   │   └── SourcePicker.tsx
│   ├── shared/              # Ported from packages/app (@source annotated)
│   └── utils/               # Config, editor, log silencing
├── AGENTS.md
├── CONTRIBUTING.md
└── README.md
```

### Tech Stack
- **Ink v6.8.0** (React 19 for terminals) + Commander.js
- **@clickhouse/client** via ProxyClickhouseClient (routes through `/clickhouse-proxy`)
- **@hyperdx/common-utils** for query generation (`renderChartConfig`, `chSqlToAliasMap`)
- **glob v13** for source map file discovery
- **tsup** for bundling (all deps bundled via `noExternal: [/.*/]`, zero runtime deps)
- **Bun 1.3.11** for standalone binary compilation
- Session stored at `~/.config/hyperdx/cli/session.json`

### CI/CD (`release.yml`)
- CLI binaries compiled for macOS ARM64, macOS x64, and Linux x64
- GitHub Release created with download instructions
- Version-change gate: skips release if `cli-v{version}` tag already exists
- `softprops/action-gh-release` pinned to full SHA (v2.6.1) for supply chain safety
- Bun pinned to `1.3.11` for reproducible builds
- npm publishing handled by changesets

### Keybindings

| Key | Action |
|---|---|
| `j/k` | Navigate rows (or spans in Trace tab) |
| `l/Enter` | Expand row detail |
| `h/Esc` | Close detail / blur search |
| `G/g` | Jump to newest/oldest |
| `Ctrl+D/U` | Scroll half-page (table, detail panels, Event Details) |
| `/` | Search (global or detail filter) |
| `Tab` | Cycle sources/searches or detail tabs |
| `s` | Edit SELECT clause in $EDITOR |
| `t` | Edit time range in $EDITOR |
| `f` | Toggle follow mode (live tail) |
| `w` | Toggle line wrap |
| `?` | Help screen |

### Demo
#### Main Search View
<img width="1004" height="1014" alt="image" src="https://github.com/user-attachments/assets/bb6a7f00-38c9-4281-9915-c71b65d852f8" />

#### Event Details Overview
<img width="1003" height="1024" alt="image" src="https://github.com/user-attachments/assets/57025fa5-fddb-452a-9320-93465538d5b2" />

#### Trace Waterfall
<img width="1004" height="1029" alt="image" src="https://github.com/user-attachments/assets/3443c898-ea0d-47f3-acc5-edb7cdd31946" />

Commit `d995b78c66` (parent `5de23e1988`) — Warren Lee, 2026-04-09 13:21:34 -07:00, committed by GitHub.
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database).
41 changed files with 6765 additions and 6 deletions.


@@ -0,0 +1,5 @@
---
"@hyperdx/cli": minor
---
Add @hyperdx/cli package — terminal CLI for searching, tailing, and inspecting logs and traces from HyperDX with interactive TUI, trace waterfall, raw SQL queries, dashboard listing, and sourcemap uploads.


@@ -418,6 +418,104 @@ jobs:
"${IMAGE}:${VERSION}-arm64"
done
  # ---------------------------------------------------------------------------
  # CLI: compile standalone binaries and upload them as GitHub Release assets.
  # npm publishing is handled by changesets in the check_changesets job above.
  # This job only compiles platform-specific binaries and creates a GH Release.
  # ---------------------------------------------------------------------------
  release-cli:
    name: Release CLI Binaries
    needs: [check_changesets, check_version]
    if: needs.check_version.outputs.should_release == 'true'
    runs-on: ubuntu-24.04
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup node
        uses: actions/setup-node@v4
        with:
          node-version-file: '.nvmrc'
          cache-dependency-path: 'yarn.lock'
          cache: 'yarn'
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: '1.3.11'
      - name: Install dependencies
        run: yarn install
      - name: Get CLI version
        id: cli_version
        run: |
          CLI_VERSION=$(node -p "require('./packages/cli/package.json').version")
          echo "version=${CLI_VERSION}" >> $GITHUB_OUTPUT
          echo "CLI version: ${CLI_VERSION}"
      - name: Check if CLI release already exists
        id: check_cli_release
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          if gh release view "cli-v${{ steps.cli_version.outputs.version }}" > /dev/null 2>&1; then
            echo "Release cli-v${{ steps.cli_version.outputs.version }} already exists. Skipping."
            echo "exists=true" >> $GITHUB_OUTPUT
          else
            echo "Release does not exist. Proceeding."
            echo "exists=false" >> $GITHUB_OUTPUT
          fi
      - name: Compile CLI binaries
        if: steps.check_cli_release.outputs.exists == 'false'
        working-directory: packages/cli
        run: |
          yarn compile:linux
          yarn compile:macos
          yarn compile:macos-x64
      - name: Create GitHub Release
        if: steps.check_cli_release.outputs.exists == 'false'
        uses: softprops/action-gh-release@153bb8e04406b158c6c84fc1615b65b24149a1fe # v2.6.1
        with:
          tag_name: cli-v${{ steps.cli_version.outputs.version }}
          name: '@hyperdx/cli v${{ steps.cli_version.outputs.version }}'
          body: |
            ## @hyperdx/cli v${{ steps.cli_version.outputs.version }}
            ### Installation
            **npm (recommended):**
            ```bash
            npm install -g @hyperdx/cli
            ```
            **Or run directly with npx:**
            ```bash
            npx @hyperdx/cli tui -s <your-hyperdx-api-url>
            ```
            **Manual download (standalone binary, no Node.js required):**
            ```bash
            # macOS Apple Silicon
            curl -L https://github.com/hyperdxio/hyperdx/releases/download/cli-v${{ steps.cli_version.outputs.version }}/hdx-darwin-arm64 -o hdx
            # macOS Intel
            curl -L https://github.com/hyperdxio/hyperdx/releases/download/cli-v${{ steps.cli_version.outputs.version }}/hdx-darwin-x64 -o hdx
            # Linux x64
            curl -L https://github.com/hyperdxio/hyperdx/releases/download/cli-v${{ steps.cli_version.outputs.version }}/hdx-linux-x64 -o hdx
            chmod +x hdx && sudo mv hdx /usr/local/bin/
            ```
            ### Usage
            ```bash
            hdx auth login -s <your-hyperdx-api-url>
            hdx tui
            ```
          draft: false
          prerelease: false
          files: |
            packages/cli/dist/hdx-linux-x64
            packages/cli/dist/hdx-darwin-arm64
            packages/cli/dist/hdx-darwin-x64
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  # ---------------------------------------------------------------------------
  # Downstream notifications
  # ---------------------------------------------------------------------------
@@ -553,6 +651,7 @@ jobs:
publish-otel-collector,
publish-local,
publish-all-in-one,
release-cli,
notify_helm_charts,
notify_ch,
notify_clickhouse_clickstack,

packages/cli/.gitignore (vendored, new file, +5 lines)

@@ -0,0 +1,5 @@
# Build output
dist/
# Bun build artifacts
*.bun-build

packages/cli/AGENTS.md (new file, +250 lines)

@@ -0,0 +1,250 @@
# @hyperdx/cli Development Guide
## What is @hyperdx/cli?
A terminal CLI for searching, tailing, and inspecting logs and traces from
HyperDX. It provides both an interactive TUI (built with Ink — React for
terminals) and a non-interactive streaming mode for piping.
The CLI connects to the HyperDX API server and queries ClickHouse directly
through the API's `/clickhouse-proxy` endpoint, using the same query generation
logic (`@hyperdx/common-utils`) as the web frontend.
## CLI Commands
```
hdx tui -s <url> # Interactive TUI (main command)
hdx stream -s <url> --source "Logs" # Non-interactive streaming to stdout
hdx sources -s <url> # List available sources
hdx auth login -s <url> # Sign in (interactive or -e/-p flags)
hdx auth status # Show auth status (reads saved session)
hdx auth logout # Clear saved session
```
The `-s, --server <url>` flag tells commands which API server to talk to. If
omitted, the CLI falls back to the server URL saved in the session file by a
previous `hdx auth login`.
## Architecture
```
src/
├── cli.tsx # Entry point — Commander CLI with commands:
│ # tui, stream, sources, auth (login/logout/status)
│ # Also contains the standalone LoginPrompt component
├── App.tsx # App shell — state machine:
│ # loading → login → pick-source → EventViewer
├── api/
│ ├── client.ts # ApiClient (REST + session cookies)
│ │ # ProxyClickhouseClient (routes through /clickhouse-proxy)
│ └── eventQuery.ts # Query builders:
│ # buildEventSearchQuery (table view, uses renderChartConfig)
│ # buildTraceSpansSql (waterfall trace spans)
│ # buildTraceLogsSql (waterfall correlated logs)
│ # buildFullRowSql (SELECT * for row detail)
├── components/
│ ├── EventViewer.tsx # Main TUI view — table, search, detail panel with tabs
│ ├── TraceWaterfall.tsx # Trace waterfall chart with j/k navigation + event details
│ ├── RowOverview.tsx # Structured overview (top-level attrs, event attrs, resource attrs)
│ ├── ColumnValues.tsx # Shared key-value renderer (used by Column Values tab + Event Details)
│ ├── LoginForm.tsx # Email/password login form (used inside TUI App)
│ └── SourcePicker.tsx # Arrow-key source selector
└── utils/
├── config.ts # Session persistence (~/.config/hyperdx/cli/session.json)
├── editor.ts # $EDITOR integration for time range and select clause editing
└── silenceLogs.ts # Suppresses console.debug/warn/error from common-utils
```
## Key Components
### EventViewer (`components/EventViewer.tsx`)
The main TUI component (~1100 lines). Handles:
- **Table view**: Dynamic columns derived from query results, percentage-based
widths, `overflowX="hidden"` for truncation
- **Search**: Lucene query via `/` key, submits on Enter
- **Follow mode**: Slides time range forward every 2s, pauses when detail panel
is open, restores on close
- **Detail panel**: Three tabs (Overview / Column Values / Trace), cycled via
Tab key. Detail search via `/` filters content within the active tab.
- **Select editor**: `s` key opens `$EDITOR` with the current SELECT clause.
Custom selects are stored per source ID.
- **State management**: ~20 `useState` hooks. Key states include `events`,
`expandedRow`, `detailTab`, `isFollowing`, `customSelectMap`,
`traceSelectedIndex`.
### TraceWaterfall (`components/TraceWaterfall.tsx`)
Port of the web frontend's `DBTraceWaterfallChart`. Key details:
- **Tree building**: Single-pass DAG builder over time-sorted rows. Direct port
of the web frontend's logic — do NOT modify without checking
`DBTraceWaterfallChart` first.
- **Correlated logs**: Fetches log events via `buildTraceLogsSql` and merges
them into the span tree (logs attach as children of the span with matching
SpanId, using `SpanId-log` suffix to avoid key collisions).
- **j/k navigation**: `selectedIndex` + `onSelectedIndexChange` props controlled
by EventViewer. `effectiveIndex` falls back to `highlightHint` when no j/k
navigation has occurred.
- **Event Details**: `SELECT *` fetch for the selected span/log, rendered via
the shared `ColumnValues` component. Uses stable scalar deps
(`selectedNodeSpanId`, `selectedNodeTimestamp`, `selectedNodeKind`) to avoid
infinite re-fetch loops.
- **Duration formatting**: Dynamic units — `1.2s`, `3.5ms`, `45.2μs`, `123ns`.
- **Highlight**: `inverse` on label and duration text for the selected row. Bar
color unchanged.
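The dynamic duration units listed above can be sketched as follows. This is a hypothetical `formatDuration` (input assumed to be in nanoseconds), not the actual helper in the CLI's waterfall utils.

```typescript
// Hypothetical sketch of dynamic-unit duration formatting: pick the largest
// unit that fits, one decimal place for s/ms/μs, integer for ns.
// Input is assumed to be nanoseconds; the real helper may differ.
function formatDuration(ns: number): string {
  if (ns >= 1e9) return `${(ns / 1e9).toFixed(1)}s`;
  if (ns >= 1e6) return `${(ns / 1e6).toFixed(1)}ms`;
  if (ns >= 1e3) return `${(ns / 1e3).toFixed(1)}μs`;
  return `${ns}ns`;
}
```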
### RowOverview (`components/RowOverview.tsx`)
Port of the web frontend's `DBRowOverviewPanel`. Three sections:
1. **Top Level Attributes**: Standard OTel fields (TraceId, SpanId, SpanName,
ServiceName, Duration, StatusCode, etc.)
2. **Span/Log Attributes**: Flattened from `source.eventAttributesExpression`,
shown with key count header
3. **Resource Attributes**: Flattened from
`source.resourceAttributesExpression`, rendered as chips with
`backgroundColor="#3a3a3a"` and cyan key / white value
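The attribute flattening used by sections 2 and 3 (nested attribute maps rendered as dot-separated keys) can be sketched like this. A hypothetical helper, shown only to illustrate the shape of the output; the real code reads `source.eventAttributesExpression` / `source.resourceAttributesExpression` columns.

```typescript
// Hypothetical sketch: flatten a nested attribute object into dot-separated
// keys, e.g. { http: { method: 'GET' } } -> { 'http.method': 'GET' }.
function flattenAttributes(
  obj: Record<string, unknown>,
  prefix = '',
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(
        out,
        flattenAttributes(value as Record<string, unknown>, fullKey),
      );
    } else {
      out[fullKey] = String(value);
    }
  }
  return out;
}
```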
### ColumnValues (`components/ColumnValues.tsx`)
Shared component for rendering key-value pairs from a row data object. Used by:
- Column Values tab in the detail panel
- Event Details section in the Trace tab's waterfall
Supports `searchQuery` filtering and `wrapLines` toggle.
## Web Frontend Alignment
This package mirrors several web frontend components. **Always check the
corresponding web component before making changes** to ensure behavior stays
consistent:
| CLI Component | Web Component | Notes |
| ---------------- | ----------------------- | -------------------------------- |
| `TraceWaterfall` | `DBTraceWaterfallChart` | Tree builder is a direct port |
| `RowOverview` | `DBRowOverviewPanel` | Same sections and field list |
| Trace tab logic | `DBTracePanel` | Source resolution (trace/log) |
| Detail panel | `DBRowSidePanel` | Tab structure, highlight hint |
| Event query | `DBTraceWaterfallChart` | `getConfig()` → `buildTrace*Sql` |
Key expression mappings from the web frontend's `getConfig()`:
- `Timestamp` → `displayedTimestampValueExpression` (NOT
`timestampValueExpression`)
- `Duration` → `durationExpression` (raw, not seconds like web frontend)
- `Body` → `bodyExpression` (logs) or `spanNameExpression` (traces)
- `SpanId` → `spanIdExpression`
- `ParentSpanId` → `parentSpanIdExpression` (traces only)
## Keybindings (TUI mode)
| Key | Action |
| ------------- | ------------------------------------------ |
| `j` / `↓` | Move selection down |
| `k` / `↑` | Move selection up |
| `l` / `Enter` | Expand row detail |
| `h` / `Esc` | Close detail / blur search |
| `G` | Jump to newest |
| `g` | Jump to oldest |
| `/` | Search (global in table, filter in detail) |
| `Tab` | Cycle sources/searches or detail tabs |
| `Shift+Tab` | Cycle backwards |
| `s` | Edit SELECT clause in $EDITOR |
| `t` | Edit time range in $EDITOR |
| `f` | Toggle follow mode (live tail) |
| `w` | Toggle line wrap |
| `?` | Toggle help screen |
| `q` | Quit |
In the **Trace tab**, `j`/`k` navigate spans/logs in the waterfall instead of
the main table.
## Development
```bash
# Run in dev mode (tsx, no compile step)
cd packages/cli
yarn dev tui -s http://localhost:8000
# Type check
npx tsc --noEmit
# Bundle with tsup
yarn build
# Compile standalone binary (current platform)
yarn compile
# Cross-compile
yarn compile:macos # macOS ARM64
yarn compile:macos-x64 # macOS x64
yarn compile:linux # Linux x64
```
## Key Patterns
### Session Management
Session is stored at `~/.config/hyperdx/cli/session.json` with mode `0o600`.
Contains `apiUrl` and `cookies[]`. The `ApiClient` constructor loads the saved
session and checks if the stored `apiUrl` matches the requested one.
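A minimal sketch of this persistence pattern, assuming the `{ apiUrl, cookies }` shape and `0o600` mode described above. Hypothetical helper names; the real logic lives in `utils/config.ts`.

```typescript
// Minimal sketch of session persistence at
// ~/.config/hyperdx/cli/session.json with owner-only permissions.
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

interface Session {
  apiUrl: string;
  cookies: string[];
}

const SESSION_PATH = path.join(
  os.homedir(),
  '.config/hyperdx/cli/session.json',
);

function saveSession(session: Session, file = SESSION_PATH): void {
  fs.mkdirSync(path.dirname(file), { recursive: true });
  // 0o600: readable/writable by the owner only, since cookies are secrets.
  fs.writeFileSync(file, JSON.stringify(session, null, 2), { mode: 0o600 });
}

function loadSession(file = SESSION_PATH): Session | null {
  try {
    return JSON.parse(fs.readFileSync(file, 'utf8')) as Session;
  } catch {
    return null; // missing or unreadable file means "not logged in"
  }
}
```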
### ClickHouse Proxy Client
`ProxyClickhouseClient` extends `BaseClickhouseClient` from common-utils. It:
- Routes queries through `/clickhouse-proxy` (sets `pathname`)
- Injects session cookies for auth
- Passes `x-hyperdx-connection-id` header
- Disables basic auth (`set_basic_auth_header: false`)
- Forces `content-type: text/plain` to prevent Express body parser issues
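The request shape this produces can be sketched as follows. This is a hypothetical, simplified illustration; the real `ProxyClickhouseClient` extends `BaseClickhouseClient` from common-utils rather than building requests by hand.

```typescript
// Hypothetical sketch of the proxied request options: cookie auth,
// connection-id header, and a text/plain body so Express's JSON body
// parser leaves the raw SQL alone.
interface ProxyRequest {
  method: 'POST';
  headers: Record<string, string>;
  body: string;
}

function buildProxyRequest(
  sql: string,
  cookies: string[],
  connectionId: string,
): ProxyRequest {
  return {
    method: 'POST',
    headers: {
      cookie: cookies.join('; '),
      'x-hyperdx-connection-id': connectionId,
      'content-type': 'text/plain',
    },
    body: sql,
  };
}
```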
### Source Expressions
Sources have many expression fields that map to ClickHouse column names. Key
ones used in the CLI:
- `timestampValueExpression` — Primary timestamp (often `TimestampTime`,
DateTime)
- `displayedTimestampValueExpression` — High-precision timestamp (often
`Timestamp`, DateTime64 with nanoseconds). **Use this for waterfall queries.**
- `traceIdExpression`, `spanIdExpression`, `parentSpanIdExpression`
- `bodyExpression`, `spanNameExpression`, `serviceNameExpression`
- `durationExpression` + `durationPrecision` (3=ms, 6=μs, 9=ns)
- `eventAttributesExpression`, `resourceAttributesExpression`
- `logSourceId`, `traceSourceId` — Correlated source IDs
### useInput Handler Ordering
The `useInput` callback in EventViewer has a specific priority order. **Do not
reorder these checks**:
1. `?` toggles help (except when search focused)
2. Any key closes help when showing
3. `focusDetailSearch` — consumes all keys except Esc/Enter
4. `focusSearch` — consumes all keys except Tab/Esc
5. Trace tab j/k — when detail panel open and Trace tab active
6. General j/k, G/g, Enter/Esc, Tab, etc.
7. Single-key shortcuts: `w`, `f`, `/`, `s`, `t`, `q`
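The priority order above amounts to a chain of early returns, each check either consuming the key or falling through. A simplified, hypothetical sketch (the real `useInput` callback handles many more keys and Ink's key object, not plain strings):

```typescript
// Hypothetical sketch of the useInput priority chain: earlier checks
// consume the key before later ones see it.
interface UiState {
  showHelp: boolean;
  focusDetailSearch: boolean;
  focusSearch: boolean;
}

function dispatchKey(key: string, state: UiState): string {
  // 1. `?` toggles help (except when a search input is focused)
  if (key === '?' && !state.focusSearch && !state.focusDetailSearch)
    return 'toggle-help';
  // 2. Any key closes help while it is showing
  if (state.showHelp) return 'close-help';
  // 3. Detail search consumes everything except Esc (and Enter in the real UI)
  if (state.focusDetailSearch)
    return key === 'escape' ? 'blur-detail-search' : 'type';
  // 4. Global search consumes everything except Esc (and Tab in the real UI)
  if (state.focusSearch) return key === 'escape' ? 'blur-search' : 'type';
  // 6./7. General navigation and single-key shortcuts come last
  if (key === 'j') return 'move-down';
  if (key === 'q') return 'quit';
  return 'ignore';
}
```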
### Dynamic Table Columns
When `customSelect` is set (via `s` key), columns are derived from the query
result keys. Otherwise, hardcoded percentage-based columns are used per source
kind. The `getDynamicColumns` function distributes 60% evenly among non-last
columns, with the last column getting the remainder.
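The 60%/remainder split can be sketched like this. A hypothetical reimplementation for illustration, not the actual `getDynamicColumns`.

```typescript
// Hypothetical sketch of percentage-based column widths: 60% split evenly
// across all but the last column; the last column takes the remainder.
interface Column {
  key: string;
  widthPct: number;
}

function getDynamicColumns(keys: string[]): Column[] {
  if (keys.length === 0) return [];
  if (keys.length === 1) return [{ key: keys[0], widthPct: 100 }];
  const perColumn = Math.floor(60 / (keys.length - 1));
  return keys.map((key, i) => ({
    key,
    widthPct:
      i === keys.length - 1 ? 100 - perColumn * (keys.length - 1) : perColumn,
  }));
}
```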
### Follow Mode
- Enabled by default on startup
- Slides `timeRange` forward every 2s, triggering a replace fetch
- **Paused** when detail panel opens (`wasFollowingRef` saves previous state)
- **Restored** when detail panel closes
### Custom Select Per Source
`customSelectMap: Record<string, string>` stores custom SELECT overrides keyed
by `source.id`. Each source remembers its own custom select independently.


@@ -0,0 +1,306 @@
# @hyperdx/cli — Development Guide
## Prerequisites
- **Node.js** >= 22.16.0
- **Yarn** 4 (workspace managed from monorepo root)
- **Bun** (optional, for standalone binary compilation)
- A running HyperDX instance (API server or Next.js frontend with proxy)
## Getting Started
```bash
# From the monorepo root
yarn install
# Navigate to the CLI package
cd packages/cli
```
### Authentication
Before using the TUI, authenticate with your HyperDX instance:
```bash
# Interactive login (opens email/password prompts)
yarn dev auth login -s http://localhost:8000
# Non-interactive login (for scripting/CI)
yarn dev auth login -s http://localhost:8000 -e user@example.com -p password
# Verify auth status
yarn dev auth status
# Session is saved to ~/.config/hyperdx/cli/session.json
```
Once authenticated, the `-s` flag is optional — the CLI reads the server URL
from the saved session.
### Running in Dev Mode
`yarn dev` uses `tsx` for direct TypeScript execution — no compile step needed.
```bash
# Interactive TUI
yarn dev tui
# With explicit server URL
yarn dev tui -s http://localhost:8000
# Skip source picker
yarn dev tui --source "Logs"
# Start with a search query + follow mode
yarn dev tui -q "level:error" -f
# Non-interactive streaming
yarn dev stream --source "Logs"
# List available sources
yarn dev sources
```
## Building & Compiling
```bash
# Type check (no output on success)
npx tsc --noEmit
# Bundle with tsup (outputs to dist/cli.js)
yarn build
# Compile standalone binary for current platform
yarn compile
# Cross-compile
yarn compile:macos # macOS ARM64
yarn compile:macos-x64 # macOS x64
yarn compile:linux # Linux x64
```
The compiled binary is a single file at `dist/hdx` (or `dist/hdx-<platform>`).
## Project Structure
```
src/
├── cli.tsx # Entry point — Commander CLI commands
│ # (tui, stream, sources, auth login/logout/status)
│ # Also contains LoginPrompt component
├── App.tsx # Ink app shell — state machine:
│ # loading → login → pick-source → EventViewer
├── api/
│ ├── client.ts # ApiClient (REST + session cookies)
│ │ # ProxyClickhouseClient (ClickHouse via /clickhouse-proxy)
│ └── eventQuery.ts # Query builders:
│ # buildEventSearchQuery — table view (uses renderChartConfig)
│ # buildTraceSpansSql — waterfall trace spans
│ # buildTraceLogsSql — waterfall correlated logs
│ # buildFullRowQuery — SELECT * for row detail (uses renderChartConfig)
├── components/
│ ├── EventViewer.tsx # Main TUI view (~1275 lines)
│ │ # Table, search, detail panel with 3 tabs
│ ├── TraceWaterfall.tsx # Trace waterfall chart with j/k navigation
│ ├── RowOverview.tsx # Structured overview (Top Level, Attributes, Resources)
│ ├── ColumnValues.tsx # Shared key-value renderer with scroll support
│ ├── LoginForm.tsx # Email/password login form (used inside TUI App)
│ └── SourcePicker.tsx # j/k source selector
├── shared/ # Logic ported from packages/app (@source annotated)
│ ├── useRowWhere.ts # processRowToWhereClause, buildColumnMap, getRowWhere
│ ├── source.ts # getDisplayedTimestampValueExpression, getEventBody, etc.
│ └── rowDataPanel.ts # ROW_DATA_ALIASES, buildRowDataSelectList
└── utils/
├── config.ts # Session persistence (~/.config/hyperdx/cli/session.json)
├── editor.ts # $EDITOR integration for time range and SELECT editing
└── silenceLogs.ts # Suppresses console.debug/warn/error, verbose file logging
```
## Data Flow
### Table View Query
```
User types search → buildEventSearchQuery()
→ renderChartConfig() from @hyperdx/common-utils
→ ProxyClickhouseClient.query()
→ API /clickhouse-proxy → ClickHouse
→ JSON response with { data, meta }
→ Store chSql in lastTableChSqlRef, meta in lastTableMetaRef
→ Render dynamic table columns from row keys
```
### Row Detail Fetch (on Enter/l)
```
User expands row → buildFullRowQuery()
→ chSqlToAliasMap(lastTableChSql) for alias resolution
→ buildColumnMap(lastTableMeta, aliasMap) for type-aware WHERE
→ processRowToWhereClause() with proper type handling
→ renderChartConfig() with SELECT *, __hdx_* aliases
→ Results include LogAttributes, ResourceAttributes, etc.
```
### Trace Waterfall
```
User switches to Trace tab
→ buildTraceSpansSql() — fetch spans by TraceId
→ buildTraceLogsSql() — fetch correlated logs by TraceId (if logSource exists)
→ buildTree() — single-pass DAG builder (port of DBTraceWaterfallChart)
→ Render waterfall with timing bars
→ j/k navigation highlights spans, fetches SELECT * for Event Details
```
### ClickHouse Proxy Client
`ProxyClickhouseClient` extends `BaseClickhouseClient` from common-utils:
- Derives proxy pathname from API URL (e.g. `/api/clickhouse-proxy` for Next.js
proxy, `/clickhouse-proxy` for direct API)
- Passes `origin` only to `createClient` (not the path, which ClickHouse client
would interpret as a database name)
- Injects session cookies + `x-hyperdx-connection-id` header
- Forces `content-type: text/plain` to prevent Express body parser issues
### Server URL Resolution
The `-s` flag is optional on most commands. Resolution order:
1. Explicit `-s <url>` flag
2. Saved session's `apiUrl` from `~/.config/hyperdx/cli/session.json`
3. Error: "No server specified"
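The resolution order above is a straightforward fallback chain. A hypothetical sketch of the `resolveServer` helper mentioned elsewhere in this guide:

```typescript
// Hypothetical sketch: explicit -s flag wins, then the saved session's
// apiUrl, otherwise fail with the "No server specified" error.
function resolveServer(
  flag: string | undefined,
  savedApiUrl: string | undefined,
): string {
  if (flag) return flag;
  if (savedApiUrl) return savedApiUrl;
  throw new Error('No server specified');
}
```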
## Key Patterns
### useInput Handler Ordering
The `useInput` callback in EventViewer has a strict priority order. **Do not
reorder these checks**:
1. `?` toggles help (except when search focused)
2. Any key closes help when showing
3. `focusDetailSearch` — consumes all keys except Esc/Enter
4. `focusSearch` — consumes all keys except Tab/Esc
5. Trace tab j/k + Ctrl+D/U — when detail panel open and Trace tab active
6. Column Values / Overview Ctrl+D/U — scroll detail view
7. General j/k, G/g, Enter/Esc, Tab, etc.
8. Single-key shortcuts: `w`, `f`, `/`, `s`, `t`, `q`
### Follow Mode
- Enabled by default on startup
- Slides `timeRange` forward every 2s via `setInterval`, triggering a replace
fetch
- **Paused** when detail panel opens (`wasFollowingRef` saves previous state)
- **Restored** when detail panel closes
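The 2-second slide keeps the window length fixed and moves both endpoints up to "now". A minimal sketch of that step (hypothetical helper, not the actual hook):

```typescript
// Hypothetical sketch of one follow-mode tick: preserve the window span,
// move both endpoints so the range ends at "now". In the TUI this runs on
// a setInterval(..., 2000) while follow mode is active.
interface TimeRange {
  from: number; // epoch ms
  to: number; // epoch ms
}

function slideForward(range: TimeRange, now: number): TimeRange {
  const span = range.to - range.from;
  return { from: now - span, to: now };
}
```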
### Custom Select Per Source
`customSelectMap: Record<string, string>` stores custom SELECT overrides keyed
by `source.id`. Each source remembers its own custom select independently. Press
`s` to open `$EDITOR` with the current SELECT clause.
### Scrollable Detail Panels
All detail tabs have fixed-height viewports with Ctrl+D/U scrolling:
- **Overview / Column Values** — Uses `fullDetailMaxRows` (full screen minus
overhead)
- **Trace Event Details** — Uses `detailMaxRows` (1/3 of terminal, since
waterfall takes the rest)
## Web Frontend Alignment
This package ports several web frontend components. **Always check the
corresponding web component before making changes**:
| CLI Component | Web Component | Notes |
| ---------------- | ----------------------- | ----------------------------- |
| `TraceWaterfall` | `DBTraceWaterfallChart` | Tree builder is a direct port |
| `RowOverview` | `DBRowOverviewPanel` | Same sections and field list |
| Trace tab logic | `DBTracePanel` | Source resolution (trace/log) |
| Detail panel | `DBRowSidePanel` | Tab structure, highlight hint |
| Row WHERE clause | `useRowWhere.tsx` | processRowToWhereClause |
| Row data fetch | `DBRowDataPanel` | ROW_DATA_ALIASES, SELECT list |
| Source helpers | `source.ts` | Expression getters |
### Shared Modules (`src/shared/`)
Files in `src/shared/` are copied from `packages/app` with `@source` annotations
at the top of each file. These are candidates for future extraction to
`@hyperdx/common-utils`:
```typescript
/**
* Row WHERE clause builder.
*
* @source packages/app/src/hooks/useRowWhere.tsx
*/
```
When updating these files, check the original source in `packages/app` first.
## Common Tasks
### Adding a New Keybinding
1. Add the key handler in `EventViewer.tsx`'s `useInput` callback at the correct
priority level
2. Update the `HelpScreen` component's key list
3. Update `AGENTS.md` keybindings table
4. Update `README.md` keybindings table
### Adding a New Detail Tab
1. Add the tab key to the `detailTab` union type in `EventViewer.tsx`
2. Add the tab to the `tabs` array in the Tab key handler
3. Add the tab to the tab bar rendering
4. Add the tab content rendering block
5. Handle any tab-specific Ctrl+D/U scrolling
### Adding a New CLI Command
1. Add the command in `cli.tsx` using Commander
2. Use `resolveServer(opts.server)` for server URL resolution
3. Use `withVerbose()` if the command needs `--verbose` support
4. Update `README.md` and `AGENTS.md` with the new command
### Porting a Web Frontend Component
1. Create the file in `src/shared/` (if pure logic) or `src/components/` (if UI)
2. Add `@source packages/app/src/...` annotation at the top
3. Use `SourceResponse` type from `@/api/client` instead of `TSource` types
4. Replace React hooks with plain functions where possible (for non-React usage)
5. Add the mapping to the "Web Frontend Alignment" table in `AGENTS.md`
## Troubleshooting
### "Not logged in" error
Run `yarn dev auth login -s <url>` to authenticate. The session may have expired
or the server URL may have changed.
### HTML response instead of JSON
The API URL may be pointing to the Next.js frontend instead of the Express API
server. Both work — the CLI auto-detects the `/api` prefix in the URL and
adjusts the ClickHouse proxy path accordingly.
### "Database api does not exist"
The `ProxyClickhouseClient` was sending the URL path as a database name. This
was fixed by passing `origin` only (without path) to `createClient`.
### Row detail shows partial data
The `SELECT *` row detail fetch requires:
1. `lastTableChSqlRef` — the rendered SQL from the last table query (for
`chSqlToAliasMap`)
2. `lastTableMetaRef` — column metadata from the query response (for type-aware
WHERE clause)
If these are null (e.g. first render before any query), the fetch may fail
silently and fall back to the partial table row data.

packages/cli/README.md (new file, +40 lines)

@@ -0,0 +1,40 @@
# @hyperdx/cli
Command line interface for HyperDX.
## Uploading Source Maps
Upload JavaScript source maps to HyperDX for stack trace de-obfuscation.
In your build pipeline, run the CLI tool:
```bash
npx @hyperdx/cli upload-sourcemaps \
--serviceKey "$HYPERDX_API_ACCESS_KEY" \
--apiUrl "$HYPERDX_API_URL" \
--path .next \
--releaseId "$RELEASE_ID"
```
You can also add this as an npm script:
```json
{
"scripts": {
"upload-sourcemaps": "npx @hyperdx/cli upload-sourcemaps --path=\".next\""
}
}
```
### Options
| Flag | Description | Default |
| ------------------------- | ------------------------------------------------------ | ------- |
| `-k, --serviceKey <key>` | HyperDX service account API key | |
| `-p, --path <dir>` | Directory containing sourcemaps | `.` |
| `-u, --apiUrl <url>` | HyperDX API URL (required for self-hosted deployments) | |
| `-rid, --releaseId <id>` | Release ID to associate with the sourcemaps | |
| `-bp, --basePath <path>` | Base path for the uploaded sourcemaps | |
Optionally, set the `HYPERDX_SERVICE_KEY` environment variable to avoid passing
the `--serviceKey` flag.


@@ -0,0 +1,63 @@
import js from '@eslint/js';
import tseslint from 'typescript-eslint';
import prettierConfig from 'eslint-config-prettier';
import prettierPlugin from 'eslint-plugin-prettier';
export default [
  js.configs.recommended,
  ...tseslint.configs.recommended,
  prettierConfig,
  {
    ignores: [
      'node_modules/**',
      'dist/**',
      '**/*.config.mjs',
      '**/*.config.ts',
    ],
  },
  {
    files: ['src/**/*.ts', 'src/**/*.tsx'],
    plugins: {
      '@typescript-eslint': tseslint.plugin,
      prettier: prettierPlugin,
    },
    rules: {
      '@typescript-eslint/no-explicit-any': 'off',
      '@typescript-eslint/no-unused-vars': [
        'warn',
        {
          argsIgnorePattern: '^_',
          varsIgnorePattern: '^_',
        },
      ],
      'prettier/prettier': 'error',
    },
    languageOptions: {
      parser: tseslint.parser,
      parserOptions: {
        ecmaVersion: 'latest',
        sourceType: 'module',
        project: './tsconfig.json',
        tsconfigRootDir: import.meta.dirname,
      },
      globals: {
        console: 'readonly',
        process: 'readonly',
        setTimeout: 'readonly',
        clearTimeout: 'readonly',
        setInterval: 'readonly',
        clearInterval: 'readonly',
        React: 'readonly',
        fetch: 'readonly',
        Response: 'readonly',
        URL: 'readonly',
        URLSearchParams: 'readonly',
        Headers: 'readonly',
        RequestInfo: 'readonly',
        RequestInit: 'readonly',
        ReadableStream: 'readonly',
        Buffer: 'readonly',
      },
    },
  },
];

**packages/cli/package.json** (new file, 49 lines)

{
"name": "@hyperdx/cli",
"version": "0.1.3",
"license": "MIT",
"type": "module",
"publishConfig": {
"access": "public"
},
"bin": {
"hdx": "./dist/cli.js"
},
"files": [
"dist/*"
],
"engines": {
"node": ">=22.16.0",
"bun": "1.3.11"
},
"scripts": {
"dev": "tsx src/cli.tsx",
"build": "tsup",
"prepublishOnly": "yarn build",
"compile": "bun build src/cli.tsx --compile --outfile dist/hdx",
"compile:linux": "bun build src/cli.tsx --compile --target=bun-linux-x64 --outfile dist/hdx-linux-x64",
"compile:macos": "bun build src/cli.tsx --compile --target=bun-darwin-arm64 --outfile dist/hdx-darwin-arm64",
"compile:macos-x64": "bun build src/cli.tsx --compile --target=bun-darwin-x64 --outfile dist/hdx-darwin-x64"
},
"devDependencies": {
"@clickhouse/client": "^1.12.1",
"@clickhouse/client-common": "^1.12.1",
"@hyperdx/common-utils": "^0.17.0",
"@types/crypto-js": "^4.2.2",
"@types/react": "^19.0.0",
"@types/sqlstring": "^2.3.2",
"chalk": "^5.3.0",
"commander": "^12.1.0",
"crypto-js": "^4.2.0",
"glob": "^13.0.6",
"ink": "6.8.0",
"ink-spinner": "^5.0.0",
"ink-text-input": "^6.0.0",
"react": "^19.0.0",
"react-devtools-core": "^7.0.1",
"sqlstring": "^2.3.3",
"tsup": "^8.4.0",
"tsx": "^4.19.0",
"typescript": "^5.9.3"
}
}

**packages/cli/src/App.tsx** (new file, 173 lines)

import React, { useState, useEffect, useCallback } from 'react';
import { Box, Text } from 'ink';
import Spinner from 'ink-spinner';
import { SourceKind } from '@hyperdx/common-utils/dist/types';
import {
ApiClient,
type SourceResponse,
type SavedSearchResponse,
} from '@/api/client';
import LoginForm from '@/components/LoginForm';
import SourcePicker from '@/components/SourcePicker';
import EventViewer from '@/components/EventViewer';
type Screen = 'loading' | 'login' | 'pick-source' | 'events';
interface AppProps {
apiUrl: string;
/** Pre-set search query from CLI flags */
query?: string;
/** Pre-set source name from CLI flags */
sourceName?: string;
/** Start in follow/live tail mode */
follow?: boolean;
}
export default function App({ apiUrl, query, sourceName, follow }: AppProps) {
const [screen, setScreen] = useState<Screen>('loading');
const [client] = useState(() => new ApiClient({ apiUrl }));
const [eventSources, setLogSources] = useState<SourceResponse[]>([]);
const [savedSearches, setSavedSearches] = useState<SavedSearchResponse[]>([]);
const [selectedSource, setSelectedSource] = useState<SourceResponse | null>(
null,
);
const [activeQuery, setActiveQuery] = useState(query ?? '');
const [error, setError] = useState<string | null>(null);
// Check existing session on mount
useEffect(() => {
(async () => {
const valid = await client.checkSession();
if (valid) {
await loadData();
} else {
setScreen('login');
}
})();
}, []);
const loadData = async () => {
try {
const [sources, searches] = await Promise.all([
client.getSources(),
client.getSavedSearches().catch(() => [] as SavedSearchResponse[]),
]);
const queryableSources = sources.filter(
s => s.kind === SourceKind.Log || s.kind === SourceKind.Trace,
);
if (queryableSources.length === 0) {
setError(
'No log or trace sources found. Configure a source in HyperDX first.',
);
return;
}
setLogSources(queryableSources);
setSavedSearches(searches);
// Auto-select if source name was provided via CLI
if (sourceName) {
const match = queryableSources.find(
s => s.name.toLowerCase() === sourceName.toLowerCase(),
);
if (match) {
setSelectedSource(match);
setScreen('events');
return;
}
}
// Auto-select if only one source
if (queryableSources.length === 1) {
setSelectedSource(queryableSources[0]);
setScreen('events');
return;
}
setScreen('pick-source');
} catch (err: unknown) {
setError(err instanceof Error ? err.message : String(err));
}
};
const handleLogin = async (email: string, password: string) => {
const ok = await client.login(email, password);
if (ok) {
await loadData();
}
return ok;
};
const handleSourceSelect = (source: SourceResponse) => {
setSelectedSource(source);
setScreen('events');
};
const handleSavedSearchSelect = useCallback(
(search: SavedSearchResponse) => {
const source = eventSources.find(
s => s.id === search.source || s._id === search.source,
);
if (source) {
setSelectedSource(source);
}
setActiveQuery(search.where);
setScreen('events');
},
[eventSources],
);
if (error) {
return (
<Box paddingX={1}>
<Text color="red">Error: {error}</Text>
</Box>
);
}
switch (screen) {
case 'loading':
return (
<Box paddingX={1}>
<Text>
<Spinner type="dots" /> Connecting to {apiUrl}
</Text>
</Box>
);
case 'login':
return <LoginForm apiUrl={apiUrl} onLogin={handleLogin} />;
case 'pick-source':
return (
<Box flexDirection="column">
<Box flexDirection="column" marginBottom={1}>
<Text color="#00c28a" bold>
HyperDX TUI
</Text>
<Text dimColor>Search and tail events from the terminal</Text>
</Box>
<SourcePicker sources={eventSources} onSelect={handleSourceSelect} />
</Box>
);
case 'events':
if (!selectedSource) return null;
return (
<EventViewer
clickhouseClient={client.createClickHouseClient()}
metadata={client.createMetadata()}
source={selectedSource}
sources={eventSources}
savedSearches={savedSearches}
onSavedSearchSelect={handleSavedSearchSelect}
initialQuery={activeQuery}
follow={follow}
/>
);
}
}

**API client** (new file, 381 lines)

/**
* HTTP client for the HyperDX internal API.
*
* Handles session cookie auth and exposes:
* - REST calls (login, sources, connections, me)
* - A ClickHouse node client that routes through /clickhouse-proxy
* with session cookies and connection-id header injection
*/
import { createClient } from '@clickhouse/client';
import type {
BaseResultSet,
ClickHouseSettings,
DataFormat,
} from '@clickhouse/client-common';
import {
BaseClickhouseClient,
type ClickhouseClientOptions,
type QueryInputs,
} from '@hyperdx/common-utils/dist/clickhouse';
import {
getMetadata,
type Metadata,
} from '@hyperdx/common-utils/dist/core/metadata';
import { loadSession, saveSession, clearSession } from '@/utils/config';
// ------------------------------------------------------------------
// API Client (session management + REST calls)
// ------------------------------------------------------------------
interface ApiClientOptions {
apiUrl: string;
}
export class ApiClient {
private apiUrl: string;
private cookies: string[] = [];
constructor(opts: ApiClientOptions) {
this.apiUrl = opts.apiUrl.replace(/\/+$/, '');
const saved = loadSession();
if (saved && saved.apiUrl === this.apiUrl) {
this.cookies = saved.cookies;
}
}
getApiUrl(): string {
return this.apiUrl;
}
getCookieHeader(): string {
return this.cookies.join('; ');
}
// ---- Auth --------------------------------------------------------
async login(email: string, password: string): Promise<boolean> {
const res = await fetch(`${this.apiUrl}/login/password`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ email, password }),
redirect: 'manual',
});
if (res.status === 302 || res.status === 200) {
this.extractCookies(res);
saveSession({ apiUrl: this.apiUrl, cookies: this.cookies });
return true;
}
return false;
}
async checkSession(): Promise<boolean> {
try {
const res = await this.get('/me');
return res.ok;
} catch {
return false;
}
}
logout(): void {
this.cookies = [];
clearSession();
}
// ---- Generic HTTP ------------------------------------------------
async get(path: string): Promise<Response> {
return fetch(`${this.apiUrl}${path}`, {
headers: this.headers(),
redirect: 'manual',
});
}
async post(path: string, body?: unknown): Promise<Response> {
return fetch(`${this.apiUrl}${path}`, {
method: 'POST',
headers: { ...this.headers(), 'Content-Type': 'application/json' },
body: body != null ? JSON.stringify(body) : undefined,
redirect: 'manual',
});
}
// ---- Typed API calls ---------------------------------------------
async getMe(): Promise<MeResponse> {
const res = await this.get('/me');
if (!res.ok) throw new Error(`GET /me failed: ${res.status}`);
return res.json() as Promise<MeResponse>;
}
async getSources(): Promise<SourceResponse[]> {
const res = await this.get('/sources');
if (!res.ok) throw new Error(`GET /sources failed: ${res.status}`);
return res.json() as Promise<SourceResponse[]>;
}
async getConnections(): Promise<ConnectionResponse[]> {
const res = await this.get('/connections');
if (!res.ok) throw new Error(`GET /connections failed: ${res.status}`);
return res.json() as Promise<ConnectionResponse[]>;
}
async getSavedSearches(): Promise<SavedSearchResponse[]> {
const res = await this.get('/saved-searches');
if (!res.ok) throw new Error(`GET /saved-searches failed: ${res.status}`);
return res.json() as Promise<SavedSearchResponse[]>;
}
async getDashboards(): Promise<DashboardResponse[]> {
const res = await this.get('/dashboards');
if (!res.ok) throw new Error(`GET /dashboards failed: ${res.status}`);
return res.json() as Promise<DashboardResponse[]>;
}
// ---- ClickHouse client via proxy ---------------------------------
createClickHouseClient(
opts: Partial<ClickhouseClientOptions> = {},
): ProxyClickhouseClient {
return new ProxyClickhouseClient(this, opts);
}
createMetadata(opts: Partial<ClickhouseClientOptions> = {}): Metadata {
return getMetadata(this.createClickHouseClient(opts));
}
// ---- Internal ----------------------------------------------------
private headers(): Record<string, string> {
const h: Record<string, string> = {};
if (this.cookies.length > 0) {
h['cookie'] = this.cookies.join('; ');
}
return h;
}
private extractCookies(res: Response): void {
const setCookies = res.headers.getSetCookie?.() ?? [];
if (setCookies.length > 0) {
this.cookies = setCookies.map(c => c.split(';')[0]);
}
}
}
// ------------------------------------------------------------------
// ClickHouse client that proxies through /clickhouse-proxy
// using the native Node @clickhouse/client with cookie auth
// ------------------------------------------------------------------
export class ProxyClickhouseClient extends BaseClickhouseClient {
private apiClient: ApiClient;
constructor(
apiClient: ApiClient,
opts: Partial<ClickhouseClientOptions> = {},
) {
super({
host: `${apiClient.getApiUrl()}/clickhouse-proxy`,
...opts,
});
this.apiClient = apiClient;
// The @clickhouse/client treats the path portion of `url` as the
// database name, NOT the HTTP path. Pass only the origin to createClient
// and set the proxy path via `pathname`, so requests go to
// http://<host>/clickhouse-proxy/?query=...
// If apiUrl itself has a path (e.g. /api), the proxy path becomes
// /api/clickhouse-proxy, which routes through the Next.js proxy at
// pages/api/[...all].ts.
const apiUrlObj = new URL(apiClient.getApiUrl());
const basePath = apiUrlObj.pathname.replace(/\/+$/, '');
const chProxyPath = `${basePath}/clickhouse-proxy`;
this.client = createClient({
url: apiUrlObj.origin,
pathname: chProxyPath,
// No ClickHouse credentials — the proxy handles auth to ClickHouse.
// We authenticate to the proxy via session cookie.
username: '',
password: '',
// Disable the Authorization header — we auth via session cookie,
// and a stray "Authorization: Basic Og==" (empty creds) causes
// Express to reject the request before reading the session cookie.
set_basic_auth_header: false,
request_timeout: this.requestTimeout,
application: 'hyperdx-tui',
http_headers: {
cookie: apiClient.getCookieHeader(),
// Force text/plain so Express's body parsers keep req.body as a
// string. Without this, the proxy's proxyReq.write(req.body) fails
// because express.json() parses the body into an Object.
'content-type': 'text/plain',
},
keep_alive: { enabled: false },
});
}
// Silence the "Sending Query: ..." debug output from BaseClickhouseClient
protected override logDebugQuery(): void {}
protected async __query<Format extends DataFormat>({
query,
format = 'JSON' as Format,
query_params = {},
abort_signal,
clickhouse_settings: externalClickhouseSettings,
connectionId,
queryId,
shouldSkipApplySettings,
}: QueryInputs<Format>): Promise<BaseResultSet<ReadableStream, Format>> {
let clickhouseSettings: ClickHouseSettings | undefined;
if (!shouldSkipApplySettings) {
clickhouseSettings = await this.processClickhouseSettings({
connectionId,
externalClickhouseSettings,
});
}
// Pass connection ID as HTTP header — the proxy uses this to
// look up the ClickHouse connection credentials from MongoDB
const httpHeaders: Record<string, string> = {};
if (connectionId && connectionId !== 'local') {
httpHeaders['x-hyperdx-connection-id'] = connectionId;
}
return this.getClient().query({
query,
query_params,
format,
abort_signal,
http_headers: httpHeaders,
clickhouse_settings: clickhouseSettings,
query_id: queryId,
}) as unknown as Promise<BaseResultSet<ReadableStream, Format>>;
}
}
// ------------------------------------------------------------------
// Response types (matching the internal API shapes)
// ------------------------------------------------------------------
interface MeResponse {
accessKey: string;
createdAt: string;
email: string;
id: string;
name: string;
team: {
id: string;
name: string;
apiKey: string;
};
}
export interface SourceResponse {
id: string;
_id: string;
name: string;
kind: 'log' | 'trace' | 'session' | 'metric';
connection: string;
from: {
databaseName: string;
tableName: string;
};
timestampValueExpression?: string;
displayedTimestampValueExpression?: string;
defaultTableSelectExpression?: string;
implicitColumnExpression?: string;
orderByExpression?: string;
querySettings?: Array<{ setting: string; value: string }>;
// Log source-specific
bodyExpression?: string;
severityTextExpression?: string;
serviceNameExpression?: string;
// Trace-specific
traceIdExpression?: string;
spanIdExpression?: string;
parentSpanIdExpression?: string;
spanNameExpression?: string;
spanKindExpression?: string;
durationExpression?: string;
durationPrecision?: number;
statusCodeExpression?: string;
statusMessageExpression?: string;
eventAttributesExpression?: string;
resourceAttributesExpression?: string;
// Correlated source IDs
logSourceId?: string;
traceSourceId?: string;
metricSourceId?: string;
sessionSourceId?: string;
}
interface ConnectionResponse {
id: string;
_id: string;
name: string;
host: string;
username: string;
}
export interface SavedSearchResponse {
id: string;
_id: string;
name: string;
select: string;
where: string;
whereLanguage: 'lucene' | 'sql';
source: string;
tags: string[];
orderBy?: string;
}
interface DashboardTileConfig {
name?: string;
source?: string;
type?: string;
displayType?: string;
sql?: string;
[key: string]: unknown;
}
interface DashboardTile {
id: string;
x: number;
y: number;
w: number;
h: number;
config: DashboardTileConfig;
containerId?: string;
}
interface DashboardFilter {
key: string;
displayName?: string;
keyExpression?: string;
sourceId?: string;
}
interface DashboardResponse {
id: string;
_id: string;
name: string;
tags: string[];
tiles: DashboardTile[];
filters?: DashboardFilter[];
savedQuery?: string | null;
savedQueryLanguage?: string | null;
createdAt?: string;
updatedAt?: string;
}

**Search query builder** (new file, 266 lines)

/**
* Builds ClickHouse SQL for searching events (logs or traces) using
* renderChartConfig from common-utils.
*/
import type {
ChSql,
ColumnMetaType,
} from '@hyperdx/common-utils/dist/clickhouse';
import { chSqlToAliasMap } from '@hyperdx/common-utils/dist/clickhouse';
import { renderChartConfig } from '@hyperdx/common-utils/dist/core/renderChartConfig';
import type { Metadata } from '@hyperdx/common-utils/dist/core/metadata';
import { DisplayType } from '@hyperdx/common-utils/dist/types';
import type { BuilderChartConfigWithDateRange } from '@hyperdx/common-utils/dist/types';
import SqlString from 'sqlstring';
import type { SourceResponse } from './client';
import {
getFirstTimestampValueExpression,
getDisplayedTimestampValueExpression,
} from '@/shared/source';
import { buildRowDataSelectList } from '@/shared/rowDataPanel';
import { buildColumnMap, getRowWhere } from '@/shared/useRowWhere';
export interface SearchQueryOptions {
source: SourceResponse;
/** Override the SELECT clause (user-edited via $EDITOR) */
selectOverride?: string;
/** Lucene search string */
searchQuery?: string;
/** Date range */
startTime: Date;
endTime: Date;
/** Max rows */
limit?: number;
/** Offset for pagination */
offset?: number;
}
/**
* Build a default SELECT expression for a trace source when
* defaultTableSelectExpression is not set.
*/
function buildTraceSelectExpression(source: SourceResponse): string {
const cols: string[] = [];
const ts = source.timestampValueExpression ?? 'TimestampTime';
cols.push(ts);
if (source.spanNameExpression) cols.push(source.spanNameExpression);
if (source.serviceNameExpression) cols.push(source.serviceNameExpression);
if (source.durationExpression) cols.push(source.durationExpression);
if (source.statusCodeExpression) cols.push(source.statusCodeExpression);
if (source.traceIdExpression) cols.push(source.traceIdExpression);
if (source.spanIdExpression) cols.push(source.spanIdExpression);
return cols.join(', ');
}
/**
 * Build a search query using renderChartConfig; works for both
 * log and trace sources.
*/
export async function buildEventSearchQuery(
opts: SearchQueryOptions,
metadata: Metadata,
): Promise<ChSql> {
const {
source,
selectOverride,
searchQuery = '',
startTime,
endTime,
limit = 100,
offset,
} = opts;
const tsExpr = source.timestampValueExpression ?? 'TimestampTime';
const firstTsExpr = getFirstTimestampValueExpression(tsExpr) ?? tsExpr;
const orderBy = source.orderByExpression ?? `${firstTsExpr} DESC`;
// Use the override if provided, otherwise the source's default
let selectExpr = selectOverride ?? source.defaultTableSelectExpression ?? '';
if (!selectExpr && source.kind === 'trace') {
selectExpr = buildTraceSelectExpression(source);
}
const config: BuilderChartConfigWithDateRange = {
displayType: DisplayType.Search,
select: selectExpr,
from: source.from,
where: searchQuery,
whereLanguage: searchQuery ? 'lucene' : 'sql',
connection: source.connection,
timestampValueExpression: tsExpr,
implicitColumnExpression: source.implicitColumnExpression,
orderBy,
limit: { limit, offset },
dateRange: [startTime, endTime],
};
return renderChartConfig(config, metadata, source.querySettings);
}
// ---- Trace waterfall query (all spans for a traceId) ----------------
export interface TraceSpansQueryOptions {
source: SourceResponse;
traceId: string;
}
/**
* Build a raw SQL query to fetch all spans for a given traceId.
* Returns columns needed for the waterfall chart.
*/
export function buildTraceSpansSql(opts: TraceSpansQueryOptions): {
sql: string;
connectionId: string;
} {
const { source, traceId } = opts;
const db = source.from.databaseName;
const table = source.from.tableName;
const traceIdExpr = source.traceIdExpression ?? 'TraceId';
const spanIdExpr = source.spanIdExpression ?? 'SpanId';
const parentSpanIdExpr = source.parentSpanIdExpression ?? 'ParentSpanId';
const spanNameExpr = source.spanNameExpression ?? 'SpanName';
const serviceNameExpr = source.serviceNameExpression ?? 'ServiceName';
const durationExpr = source.durationExpression ?? 'Duration';
const statusCodeExpr = source.statusCodeExpression ?? 'StatusCode';
const tsExpr = getDisplayedTimestampValueExpression(source);
const cols = [
`${tsExpr} AS Timestamp`,
`${traceIdExpr} AS TraceId`,
`${spanIdExpr} AS SpanId`,
`${parentSpanIdExpr} AS ParentSpanId`,
`${spanNameExpr} AS SpanName`,
`${serviceNameExpr} AS ServiceName`,
`${durationExpr} AS Duration`,
`${statusCodeExpr} AS StatusCode`,
];
const escapedTraceId = SqlString.escape(traceId);
const sql = `SELECT ${cols.join(', ')} FROM ${db}.${table} WHERE ${traceIdExpr} = ${escapedTraceId} ORDER BY ${tsExpr} ASC LIMIT 10000`;
return {
sql,
connectionId: source.connection,
};
}
/**
* Build a raw SQL query to fetch correlated log events for a given traceId.
* Returns columns matching the SpanRow shape used by the waterfall chart.
* Logs are linked to spans via their SpanId.
*/
export function buildTraceLogsSql(opts: TraceSpansQueryOptions): {
sql: string;
connectionId: string;
} {
const { source, traceId } = opts;
const db = source.from.databaseName;
const table = source.from.tableName;
const traceIdExpr = source.traceIdExpression ?? 'TraceId';
const spanIdExpr = source.spanIdExpression ?? 'SpanId';
const bodyExpr = source.bodyExpression ?? 'Body';
const serviceNameExpr = source.serviceNameExpression ?? 'ServiceName';
const sevExpr = source.severityTextExpression ?? 'SeverityText';
const tsExpr = getDisplayedTimestampValueExpression(source);
const cols = [
`${tsExpr} AS Timestamp`,
`${traceIdExpr} AS TraceId`,
`${spanIdExpr} AS SpanId`,
`'' AS ParentSpanId`,
`${bodyExpr} AS SpanName`,
`${serviceNameExpr} AS ServiceName`,
`0 AS Duration`,
`${sevExpr} AS StatusCode`,
];
const escapedTraceId = SqlString.escape(traceId);
const sql = `SELECT ${cols.join(', ')} FROM ${db}.${table} WHERE ${traceIdExpr} = ${escapedTraceId} ORDER BY ${tsExpr} ASC LIMIT 10000`;
return {
sql,
connectionId: source.connection,
};
}
// ---- Full row fetch (SELECT *) -------------------------------------
export interface FullRowQueryOptions {
source: SourceResponse;
/** The partial row data from the table (used to build the WHERE clause) */
row: Record<string, unknown>;
}
/**
* Build a full row query using renderChartConfig, matching the web
* frontend's useRowData in DBRowDataPanel.tsx.
*
* @source packages/app/src/components/DBRowDataPanel.tsx (useRowData)
* @source packages/app/src/hooks/useRowWhere.tsx (processRowToWhereClause)
*
* Uses chSqlToAliasMap from the table query's rendered SQL + column
* metadata to build a proper WHERE clause with type-aware matching,
* then queries:
* SELECT *, <__hdx_* aliases>
* FROM source.from
* WHERE <processRowToWhereClause>
* WITH <aliasWith>
* LIMIT 1
*/
export async function buildFullRowQuery(
opts: FullRowQueryOptions & {
/** The rendered ChSql from the table query (for alias resolution) */
tableChSql: ChSql;
/** Column metadata from the table query response */
tableMeta: ColumnMetaType[];
metadata: Metadata;
},
): Promise<ChSql> {
const { source, row, tableChSql, tableMeta, metadata } = opts;
// Parse the rendered table SQL to get alias → expression mapping
const aliasMap = chSqlToAliasMap(tableChSql);
// Build column map using both meta (types) and aliasMap (expressions)
const columnMap = buildColumnMap(tableMeta, aliasMap);
// Build WHERE using the web frontend's processRowToWhereClause
const rowWhereResult = getRowWhere(
row as Record<string, unknown>,
columnMap,
aliasMap,
);
const selectList = buildRowDataSelectList(source);
// Use a very wide date range — the WHERE clause already uniquely
// identifies the row, so the time range is just a safety net
const now = new Date();
const yearAgo = new Date(now.getTime() - 365 * 24 * 60 * 60 * 1000);
const config: BuilderChartConfigWithDateRange = {
connection: source.connection,
from: source.from,
timestampValueExpression:
source.timestampValueExpression ?? 'TimestampTime',
dateRange: [yearAgo, now],
select: selectList,
where: rowWhereResult.where,
limit: { limit: 1 },
displayType: DisplayType.Table,
...(rowWhereResult.aliasWith.length > 0
? { with: rowWhereResult.aliasWith }
: {}),
};
return renderChartConfig(config, metadata, source.querySettings);
}

**packages/cli/src/cli.tsx** (new file, 638 lines)

#!/usr/bin/env node
// MUST be the first import — silences console.debug/warn/error before
// any common-utils code runs. ESM hoists imports above inline code,
// so this can't be done with inline statements.
import { _origError } from '@/utils/silenceLogs';
import React, { useState, useCallback } from 'react';
import { render, Box, Text, useApp } from 'ink';
import TextInput from 'ink-text-input';
import Spinner from 'ink-spinner';
import { Command } from 'commander';
import chalk from 'chalk';
import App from '@/App';
import { ApiClient } from '@/api/client';
import { clearSession, loadSession } from '@/utils/config';
import { uploadSourcemaps } from '@/sourcemaps';
// ---- Standalone interactive login for `hdx auth login` -------------
function LoginPrompt({
apiUrl,
client,
}: {
apiUrl: string;
client: ApiClient;
}) {
const { exit } = useApp();
const [field, setField] = useState<'email' | 'password'>('email');
const [email, setEmail] = useState('');
const [password, setPassword] = useState('');
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const handleSubmitEmail = useCallback(() => {
if (!email.trim()) return;
setField('password');
}, [email]);
const handleSubmitPassword = useCallback(async () => {
if (!password) return;
setLoading(true);
setError(null);
const ok = await client.login(email, password);
setLoading(false);
if (ok) {
exit();
// Small delay to let Ink unmount before writing to stdout
setTimeout(() => {
process.stdout.write(
chalk.green(`\nLogged in as ${email} (${apiUrl})\n`),
);
}, 50);
} else {
setError('Login failed. Check your email and password.');
setField('email');
setEmail('');
setPassword('');
}
}, [email, password, client, apiUrl, exit]);
return (
<Box flexDirection="column" paddingX={1}>
<Text bold color="cyan">
HyperDX Login
</Text>
<Text dimColor>Server: {apiUrl}</Text>
<Text> </Text>
{error && <Text color="red">{error}</Text>}
{loading ? (
<Text>
<Spinner type="dots" /> Logging in
</Text>
) : field === 'email' ? (
<Box>
<Text>Email: </Text>
<TextInput
value={email}
onChange={setEmail}
onSubmit={handleSubmitEmail}
/>
</Box>
) : (
<Box>
<Text>Password: </Text>
<TextInput
value={password}
onChange={setPassword}
onSubmit={handleSubmitPassword}
mask="*"
/>
</Box>
)}
</Box>
);
}
/**
* Resolve the server URL: use the provided flag, or fall back to the
* saved session's apiUrl. Exits with an error if neither is available.
*/
function resolveServer(flagValue: string | undefined): string {
if (flagValue) return flagValue;
const session = loadSession();
if (session?.apiUrl) return session.apiUrl;
_origError(
chalk.red(
`No server specified. Use ${chalk.bold('-s <url>')} or run ${chalk.bold('hdx auth login -s <url>')} first.\n`,
),
);
process.exit(1);
}
const program = new Command();
program
.name('hdx')
.description('HyperDX CLI — search and tail events from the terminal')
.version('0.1.3')
.enablePositionalOptions();
// ---- Interactive mode (default) ------------------------------------
program
.command('tui')
.description('Interactive TUI for event search and tail')
.option('-s, --server <url>', 'HyperDX API server URL')
.option('-q, --query <query>', 'Initial Lucene search query')
.option('--source <name>', 'Source name (skips picker)')
.option('-f, --follow', 'Start in follow/live tail mode')
.action(opts => {
const server = resolveServer(opts.server);
render(
<App
apiUrl={server}
query={opts.query}
sourceName={opts.source}
follow={opts.follow}
/>,
);
});
// ---- Auth (login / logout / status) --------------------------------
const auth = program
.command('auth')
.description('Manage authentication')
.enablePositionalOptions()
.passThroughOptions();
auth
.command('login')
.description('Sign in to your HyperDX account')
.requiredOption('-s, --server <url>', 'HyperDX API server URL')
.option('-e, --email <email>', 'Email address')
.option('-p, --password <password>', 'Password')
.action(async opts => {
const client = new ApiClient({ apiUrl: opts.server });
if (opts.email && opts.password) {
// Non-interactive login (for scripting/CI)
const ok = await client.login(opts.email, opts.password);
if (ok) {
process.stdout.write(
chalk.green(`Logged in as ${opts.email} (${opts.server})\n`),
);
} else {
_origError(chalk.red('Login failed. Check your email and password.\n'));
process.exit(1);
}
} else {
// Interactive login via Ink
const { waitUntilExit } = render(
<LoginPrompt apiUrl={opts.server} client={client} />,
);
await waitUntilExit();
}
});
auth
.command('logout')
.description('Log out from your HyperDX account')
.action(() => {
clearSession();
process.stdout.write('Session cleared.\n');
});
auth
.command('status')
.description('Show authentication status')
.action(async () => {
const session = loadSession();
if (!session) {
process.stdout.write(
chalk.yellow(
`Not logged in. Run ${chalk.bold('hdx auth login -s <url>')} to sign in.\n`,
),
);
process.exit(1);
}
const client = new ApiClient({ apiUrl: session.apiUrl });
const ok = await client.checkSession();
if (!ok) {
process.stdout.write(
chalk.yellow(
`Session expired. Run ${chalk.bold('hdx auth login -s <url>')} to sign in again.\n`,
),
);
process.exit(1);
}
try {
const me = await client.getMe();
process.stdout.write(
`${chalk.green('Logged in')} as ${chalk.bold(me.email)} (${session.apiUrl})\n`,
);
} catch {
process.stdout.write(chalk.green('Logged in') + ` (${session.apiUrl})\n`);
}
});
// ---- Sources -------------------------------------------------------
program
.command('sources')
.description(
'List data sources (log, trace, session, metric) with ClickHouse table schemas',
)
.option('-s, --server <url>', 'HyperDX API server URL')
.option('--json', 'Output as JSON (for programmatic consumption)')
.addHelpText(
'after',
`
About:
A "source" in HyperDX is a named data source backed by a ClickHouse table.
Each source has a kind (log, trace, session, or metric) and a set of
expression mappings that tell HyperDX which columns hold timestamps, trace
IDs, span names, severity levels, etc.
This command lists all sources and fetches the ClickHouse CREATE TABLE
schema for each (metric sources are skipped since their schema is not
useful for direct queries).
Use --json for structured output suitable for LLM / agent consumption.
JSON output schema (--json):
Array of objects, each with:
id - Source ID (use with other hdx commands)
name - Human-readable source name
kind - "log" | "trace" | "session" | "metric"
database - ClickHouse database name
table - ClickHouse table name
connection - Connection ID for the ClickHouse proxy
schema - Full CREATE TABLE DDL (null for metric sources)
expressions - Column expression mappings:
timestamp - Primary timestamp column (e.g. "TimestampTime")
displayedTimestamp - High-precision display timestamp (DateTime64)
body - Log body column
severityText - Severity level column (e.g. "SeverityText")
serviceName - Service name column
traceId - Trace ID column
spanId - Span ID column
parentSpanId - Parent span ID column
spanName - Span name column
duration - Duration column (raw value)
durationPrecision - Duration unit: 3=ms, 6=μs, 9=ns
statusCode - Status code column
eventAttributes - Span/log attributes (Map/JSON column)
resourceAttributes - Resource attributes (Map/JSON column)
implicitColumn - Implicit column for Lucene search
defaultTableSelect - Default SELECT clause for table view
orderBy - Default ORDER BY clause
correlatedSources - IDs of linked sources:
log - Correlated log source ID
trace - Correlated trace source ID
metric - Correlated metric source ID
session - Correlated session source ID
Examples:
$ hdx sources # Human-readable table with schemas
$ hdx sources --json # JSON for agents / scripts
$ hdx sources --json | jq '.[0]' # Inspect first source
`,
)
.action(async opts => {
const server = resolveServer(opts.server);
const client = new ApiClient({ apiUrl: server });
if (!(await client.checkSession())) {
_origError(
chalk.red(
`Not logged in. Run ${chalk.bold('hdx auth login')} to sign in.\n`,
),
);
process.exit(1);
}
const sources = await client.getSources();
if (sources.length === 0) {
if (opts.json) {
process.stdout.write('[]\n');
} else {
process.stdout.write('No sources found.\n');
}
return;
}
const chClient = client.createClickHouseClient();
// Fetch schemas for non-metric sources (in parallel)
const schemaEntries = await Promise.all(
sources.map(async (s): Promise<[string, string | null]> => {
if (s.kind === 'metric') return [s.id, null];
try {
const resultSet = await chClient.query({
query: `SHOW CREATE TABLE ${s.from.databaseName}.${s.from.tableName}`,
format: 'JSON',
connectionId: s.connection,
});
const json = await resultSet.json<{ statement: string }>();
const row = (json.data as { statement: string }[])?.[0];
return [s.id, row?.statement?.trimEnd() ?? null];
} catch {
return [s.id, null];
}
}),
);
const schemas = new Map(schemaEntries);
if (opts.json) {
const output = sources.map(s => ({
id: s.id,
name: s.name,
kind: s.kind,
database: s.from.databaseName,
table: s.from.tableName,
connection: s.connection,
schema: schemas.get(s.id) ?? null,
expressions: {
timestamp: s.timestampValueExpression ?? null,
displayedTimestamp: s.displayedTimestampValueExpression ?? null,
body: s.bodyExpression ?? null,
severityText: s.severityTextExpression ?? null,
serviceName: s.serviceNameExpression ?? null,
traceId: s.traceIdExpression ?? null,
spanId: s.spanIdExpression ?? null,
parentSpanId: s.parentSpanIdExpression ?? null,
spanName: s.spanNameExpression ?? null,
duration: s.durationExpression ?? null,
durationPrecision: s.durationPrecision ?? null,
statusCode: s.statusCodeExpression ?? null,
eventAttributes: s.eventAttributesExpression ?? null,
resourceAttributes: s.resourceAttributesExpression ?? null,
implicitColumn: s.implicitColumnExpression ?? null,
defaultTableSelect: s.defaultTableSelectExpression ?? null,
orderBy: s.orderByExpression ?? null,
},
correlatedSources: {
log: s.logSourceId ?? null,
trace: s.traceSourceId ?? null,
metric: s.metricSourceId ?? null,
session: s.sessionSourceId ?? null,
},
}));
process.stdout.write(JSON.stringify(output, null, 2) + '\n');
return;
}
// Human-readable output
for (const s of sources) {
const table = `${s.from.databaseName}.${s.from.tableName}`;
process.stdout.write(
`${chalk.bold.cyan(s.name)} ${chalk.dim(s.kind)} ${chalk.dim(table)}\n`,
);
const schema = schemas.get(s.id);
if (schema) {
const lines = schema.split('\n');
for (const line of lines) {
process.stdout.write(chalk.dim(` ${line}\n`));
}
} else if (s.kind !== 'metric') {
process.stdout.write(chalk.dim(' (schema unavailable)\n'));
}
process.stdout.write('\n');
}
});
// ---- Dashboards ----------------------------------------------------
program
.command('dashboards')
.description('List dashboards with tile summaries')
.option('-s, --server <url>', 'HyperDX API server URL')
.option('--json', 'Output as JSON (for programmatic consumption)')
.addHelpText(
'after',
`
About:
Lists all dashboards for the authenticated team. Each dashboard
contains tiles (charts/visualizations) that query ClickHouse sources.
Use --json for structured output suitable for LLM / agent consumption.
JSON output schema (--json):
Array of objects, each with:
id - Dashboard ID
name - Dashboard name
tags - Array of tag strings
filters - Dashboard-level filter keys (key, displayName, sourceId)
savedQuery - Default dashboard query (if set)
createdAt - ISO timestamp
updatedAt - ISO timestamp
tiles - Array of tile summaries:
id - Tile ID
name - Chart name (may be null)
type - Chart type (time, table, number, pie, bar, etc.)
source - Source ID referenced by this tile (null for raw SQL)
sql - Raw SQL query (null for builder-mode charts)
Examples:
$ hdx dashboards # Human-readable list with tiles
$ hdx dashboards --json # JSON for agents / scripts
$ hdx dashboards --json | jq '.[0].tiles' # List tiles of first dashboard
`,
)
.action(async opts => {
const server = resolveServer(opts.server);
const client = new ApiClient({ apiUrl: server });
if (!(await client.checkSession())) {
_origError(
chalk.red(
`Not logged in. Run ${chalk.bold('hdx auth login')} to sign in.\n`,
),
);
process.exit(1);
}
const dashboards = await client.getDashboards();
if (dashboards.length === 0) {
if (opts.json) {
process.stdout.write('[]\n');
} else {
process.stdout.write('No dashboards found.\n');
}
return;
}
if (opts.json) {
const output = dashboards.map(d => ({
id: d.id,
name: d.name,
tags: d.tags ?? [],
filters: d.filters ?? [],
savedQuery: d.savedQuery ?? null,
createdAt: d.createdAt ?? null,
updatedAt: d.updatedAt ?? null,
tiles: d.tiles.map(t => ({
id: t.id,
name: t.config.name ?? null,
type: t.config.type ?? t.config.displayType ?? null,
source: t.config.source ?? null,
sql: t.config.sql ?? null,
})),
}));
process.stdout.write(JSON.stringify(output, null, 2) + '\n');
return;
}
// Fetch sources to resolve source names for display
let sourceNames: Record<string, string> = {};
try {
const sources = await client.getSources();
sourceNames = Object.fromEntries(
sources.flatMap(s => [
[s.id, s.name],
[s._id, s.name],
]),
);
} catch {
// Non-fatal — just won't show source names
}
// Human-readable output
for (const d of dashboards) {
const tagList = d.tags ?? [];
const tags =
tagList.length > 0 ? ` ${chalk.dim(`[${tagList.join(', ')}]`)}` : '';
process.stdout.write(
`${chalk.bold.cyan(d.name)}${tags} ${chalk.dim(`${d.tiles.length} tile${d.tiles.length !== 1 ? 's' : ''}`)}\n`,
);
for (let i = 0; i < d.tiles.length; i++) {
const t = d.tiles[i];
const isLast = i === d.tiles.length - 1;
const prefix = isLast ? ' └─ ' : ' ├─ ';
const name = t.config.name || '(untitled)';
const chartType = t.config.type ?? t.config.displayType ?? 'chart';
let sourceLabel = '';
if (t.config.sql) {
sourceLabel = 'raw SQL';
} else if (t.config.source) {
sourceLabel = `source: ${sourceNames[t.config.source] ?? t.config.source}`;
}
const meta = [chartType, sourceLabel].filter(Boolean).join(', ');
process.stdout.write(
`${chalk.dim(prefix)}${name} ${chalk.dim(`(${meta})`)}\n`,
);
}
process.stdout.write('\n');
}
});
// ---- Query ---------------------------------------------------------
program
.command('query')
.description('Run a raw SQL query against a ClickHouse source')
.requiredOption('--source <nameOrId>', 'Source name or ID')
.requiredOption('--sql <query>', 'SQL query to execute')
.option('-s, --server <url>', 'HyperDX API server URL')
.option('--format <format>', 'ClickHouse output format', 'JSON')
.addHelpText(
'after',
`
About:
Execute a raw ClickHouse SQL query through the HyperDX proxy, using
the connection credentials associated with a source. This is useful
for ad-hoc exploration, debugging, and agent-driven queries.
The --source flag accepts either the source name (case-insensitive)
or the source ID (from 'hdx sources --json').
The query is sent as-is to ClickHouse; you are responsible for
writing valid SQL. Use 'hdx sources' to discover table names and
column schemas.
Output is written to stdout. Use --format to control the ClickHouse
response format (JSON, JSONEachRow, TabSeparated, CSV, etc.).
Examples:
$ hdx query --source "Logs" --sql "SELECT count() FROM default.otel_logs"
$ hdx query --source "Traces" --sql "SELECT * FROM default.otel_traces LIMIT 5"
$ hdx query --source "Logs" --sql "SELECT Body FROM default.otel_logs LIMIT 3" --format JSONEachRow
`,
)
.action(async opts => {
const server = resolveServer(opts.server);
const client = new ApiClient({ apiUrl: server });
if (!(await client.checkSession())) {
_origError(
chalk.red(
`Not logged in. Run ${chalk.bold('hdx auth login')} to sign in.\n`,
),
);
process.exit(1);
}
const sources = await client.getSources();
const source = sources.find(
s =>
s.name.toLowerCase() === opts.source.toLowerCase() ||
s.id === opts.source ||
s._id === opts.source,
);
if (!source) {
_origError(chalk.red(`Source "${opts.source}" not found.\n`));
_origError('Available sources:');
for (const s of sources) {
_origError(` - ${s.name} (${s.kind}) [${s.id}]`);
}
process.exit(1);
}
const chClient = client.createClickHouseClient();
try {
const resultSet = await chClient.query({
query: opts.sql,
format: opts.format,
connectionId: source.connection,
});
const text = await resultSet.text();
process.stdout.write(text);
// Ensure trailing newline for clean terminal output
if (text.length > 0 && !text.endsWith('\n')) {
process.stdout.write('\n');
}
} catch (err) {
const msg = err instanceof Error ? err.message : String(err);
_origError(chalk.red(`Query failed: ${msg}\n`));
process.exit(1);
}
});
// ---- Upload Sourcemaps ---------------------------------------------
program
.command('upload-sourcemaps')
.description(
'Upload JavaScript source maps to HyperDX for stack trace de-obfuscation',
)
.option('-k, --serviceKey <string>', 'The HyperDX service account API key')
.option(
'-u, --apiUrl [string]',
'An optional api url for self-hosted deployments',
)
.option(
'-rid, --releaseId [string]',
'An optional release id to associate the sourcemaps with',
)
.option(
'-p, --path [string]',
'Sets the directory of where the sourcemaps are',
'.',
)
.option(
'-bp, --basePath [string]',
'An optional base path for the uploaded sourcemaps',
)
.option(
'--apiVersion [string]',
'The API version to use (v1 for HyperDX V1 Cloud, v2 for latest)',
'v1',
)
.action(uploadSourcemaps);
program.parse();
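For agent or script consumers, the `--json` output documented in the `sources` help text can be parsed directly. A minimal sketch (the `SourceJson` interface and `queryableTables` helper below are hypothetical, modeling only a subset of the documented fields):

```typescript
// Hypothetical partial model of the `hdx sources --json` output schema.
interface SourceJson {
  id: string;
  name: string;
  kind: 'log' | 'trace' | 'session' | 'metric';
  database: string;
  table: string;
  schema: string | null;
}

// Pick fully-qualified table names for all queryable (non-metric) sources,
// mirroring the CLI's own rule that metric schemas are skipped.
function queryableTables(sources: SourceJson[]): string[] {
  return sources
    .filter(s => s.kind !== 'metric')
    .map(s => `${s.database}.${s.table}`);
}

const sample: SourceJson[] = [
  { id: '1', name: 'Logs', kind: 'log', database: 'default', table: 'otel_logs', schema: null },
  { id: '2', name: 'Metrics', kind: 'metric', database: 'default', table: 'otel_metrics', schema: null },
];
console.log(queryableTables(sample)); // → [ 'default.otel_logs' ]
```

The resulting table names feed straight into `hdx query --sql`, as the help text suggests.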


@@ -0,0 +1,98 @@
/**
* Renders a key-value list of column values from a row data object.
* Shared between the Column Values tab and Trace tab's Event Details.
*/
import React, { useMemo } from 'react';
import { Box, Text } from 'ink';
function flatten(s: string): string {
return s
.replace(/\n/g, ' ')
.replace(/\s{2,}/g, ' ')
.trim();
}
interface ColumnValuesProps {
data: Record<string, unknown>;
searchQuery?: string;
wrapLines?: boolean;
/** Max visible rows (enables scrolling viewport) */
maxRows?: number;
/** Scroll offset into the entries list */
scrollOffset?: number;
}
const ColumnValues = React.memo(function ColumnValues({
data,
searchQuery,
wrapLines,
maxRows,
scrollOffset = 0,
}: ColumnValuesProps) {
const entries = useMemo(() => {
return Object.entries(data)
.filter(([key, value]) => {
if (!searchQuery) return true;
const q = searchQuery.toLowerCase();
const strVal =
value != null && typeof value === 'object'
? JSON.stringify(value)
: String(value ?? '');
return (
key.toLowerCase().includes(q) || strVal.toLowerCase().includes(q)
);
})
.map(([key, value]) => {
let strVal: string;
if (value != null && typeof value === 'object') {
strVal = JSON.stringify(value, null, 2);
} else {
strVal = String(value ?? '');
}
let displayVal: string;
if (strVal.startsWith('{') || strVal.startsWith('[')) {
try {
displayVal = JSON.stringify(JSON.parse(strVal), null, 2);
} catch {
displayVal = strVal;
}
} else {
displayVal = strVal;
}
return { key, displayVal };
});
}, [data, searchQuery]);
const totalEntries = entries.length;
const visibleEntries =
maxRows != null
? entries.slice(scrollOffset, scrollOffset + maxRows)
: entries;
return (
<Box flexDirection="column">
{visibleEntries.map(({ key, displayVal }) => (
<Box
key={key}
height={wrapLines ? undefined : 1}
overflowX={wrapLines ? undefined : 'hidden'}
overflowY={wrapLines ? undefined : 'hidden'}
>
<Box width={35} flexShrink={0} overflowX="hidden">
<Text color="cyan" wrap="truncate">
{key}
</Text>
</Box>
<Box flexGrow={1} overflowX={wrapLines ? undefined : 'hidden'}>
<Text wrap={wrapLines ? 'wrap' : 'truncate'}>
{wrapLines ? displayVal : flatten(displayVal)}
</Text>
</Box>
</Box>
))}
</Box>
);
});
export default ColumnValues;
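The per-entry formatting in the `useMemo` above (stringify objects, pretty-print JSON-looking strings, keep unparseable ones as-is) can be pulled out as a pure function. A sketch (the standalone `formatValue` helper is hypothetical, mirroring the component's logic):

```typescript
// Hypothetical standalone version of the per-entry formatting in ColumnValues.
function formatValue(value: unknown): string {
  // Objects are stringified with 2-space indentation, as the component does.
  const strVal =
    value != null && typeof value === 'object'
      ? JSON.stringify(value, null, 2)
      : String(value ?? '');
  // Strings that look like JSON get re-parsed and pretty-printed;
  // unparseable ones fall through unchanged.
  if (strVal.startsWith('{') || strVal.startsWith('[')) {
    try {
      return JSON.stringify(JSON.parse(strVal), null, 2);
    } catch {
      return strVal;
    }
  }
  return strVal;
}

console.log(formatValue('{"a":1}')); // pretty-printed across three lines
console.log(formatValue(null)); // → "" (null/undefined become empty strings)
console.log(formatValue('{not json')); // → "{not json" (parse failed, unchanged)
```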


@@ -0,0 +1,232 @@
import React from 'react';
import { Box, Text } from 'ink';
import Spinner from 'ink-spinner';
import type { SourceResponse, ProxyClickhouseClient } from '@/api/client';
import ColumnValues from '@/components/ColumnValues';
import RowOverview from '@/components/RowOverview';
import TraceWaterfall from '@/components/TraceWaterfall';
import type { FormattedRow } from './types';
import { SearchBar } from './SubComponents';
import { flatten } from './utils';
type DetailTab = 'overview' | 'columns' | 'trace';
type DetailPanelProps = {
source: SourceResponse;
sources: SourceResponse[];
clickhouseClient: ProxyClickhouseClient;
detailTab: DetailTab;
expandedRowData: Record<string, unknown> | null;
expandedRowLoading: boolean;
expandedTraceId: string | null;
expandedSpanId: string | null;
traceSelectedIndex: number | null;
onTraceSelectedIndexChange: (index: number | null) => void;
detailSearchQuery: string;
focusDetailSearch: boolean;
onDetailSearchQueryChange: (v: string) => void;
onDetailSearchSubmit: () => void;
wrapLines: boolean;
termHeight: number;
fullDetailMaxRows: number;
detailMaxRows: number;
columnValuesScrollOffset: number;
traceDetailScrollOffset: number;
/** The formatted row for the summary header */
expandedFormattedRow?: FormattedRow & {
raw: Record<string, string | number>;
};
scrollOffset: number;
expandedRow: number;
};
export function DetailPanel({
source,
sources,
clickhouseClient,
detailTab,
expandedRowData,
expandedRowLoading,
expandedTraceId,
expandedSpanId,
traceSelectedIndex,
onTraceSelectedIndexChange,
detailSearchQuery,
focusDetailSearch,
onDetailSearchQueryChange,
onDetailSearchSubmit,
wrapLines,
termHeight,
fullDetailMaxRows,
detailMaxRows,
columnValuesScrollOffset,
traceDetailScrollOffset,
expandedFormattedRow,
scrollOffset,
expandedRow,
}: DetailPanelProps) {
const hasTrace =
source.kind === 'trace' || (source.kind === 'log' && source.traceSourceId);
const tabs: Array<{ key: DetailTab; label: string }> = [
{ key: 'overview', label: 'Overview' },
{ key: 'columns', label: 'Column Values' },
...(hasTrace ? [{ key: 'trace' as const, label: 'Trace' }] : []),
];
return (
<Box flexDirection="column" marginTop={1} flexGrow={1} overflowY="hidden">
{/* Back hint */}
<Text dimColor>esc=back to table</Text>
{/* Summary header */}
<Box marginTop={1} marginBottom={1}>
<Text color="cyan" bold>
{(() => {
if (!expandedFormattedRow) return '';
return source.kind === 'trace'
? `${expandedFormattedRow.cells[1] || ''} > ${expandedFormattedRow.cells[2] || ''}`
: flatten(
String(
expandedFormattedRow.raw[source.bodyExpression ?? 'Body'] ??
'',
),
).slice(0, 200);
})()}
</Text>
</Box>
{/* Detail tab bar */}
<Box marginBottom={1}>
{tabs.map(tab => (
<Box key={tab.key} marginRight={2}>
<Text
color={detailTab === tab.key ? 'cyan' : undefined}
bold={detailTab === tab.key}
dimColor={detailTab !== tab.key}
>
{detailTab === tab.key ? '▸ ' : ' '}
{tab.label}
</Text>
</Box>
))}
<Text dimColor>(tab to switch)</Text>
</Box>
{/* Detail search bar — only show when focused or has a query */}
{(focusDetailSearch || detailSearchQuery) && (
<SearchBar
focused={focusDetailSearch}
query={detailSearchQuery}
onChange={onDetailSearchQueryChange}
onSubmit={onDetailSearchSubmit}
/>
)}
<Text dimColor>{'─'.repeat(80)}</Text>
{/* Tab content */}
{detailTab === 'overview' && (
/* ---- Overview tab ---- */
<Box
flexDirection="column"
height={fullDetailMaxRows}
overflowY="hidden"
>
{expandedRowLoading ? (
<Text>
<Spinner type="dots" /> Loading
</Text>
) : expandedRowData ? (
<RowOverview
source={source}
rowData={expandedRowData}
searchQuery={detailSearchQuery}
wrapLines={wrapLines}
maxRows={fullDetailMaxRows}
scrollOffset={columnValuesScrollOffset}
/>
) : null}
</Box>
)}
{detailTab === 'trace' &&
/* ---- Trace waterfall tab ---- */
(() => {
if (!expandedTraceId) {
return expandedRowLoading ? (
<Text>
<Spinner type="dots" /> Loading trace ID
</Text>
) : (
<Text dimColor>No trace ID found for this row.</Text>
);
}
const findSource = (id: string | undefined) =>
id
? (sources.find(s => s.id === id || s._id === id) ?? null)
: null;
const traceSource =
source.kind === 'trace' ? source : findSource(source.traceSourceId);
const logSource =
source.kind === 'log' ? source : findSource(source.logSourceId);
if (!traceSource) {
return <Text dimColor>No correlated trace source found.</Text>;
}
// Reserve lines for: header, tab bar, search, separator,
// summary, col headers, separator, Event Details header +
// separator + content (~15 lines overhead)
const waterfallMaxRows = Math.max(10, termHeight - 15);
return (
<TraceWaterfall
clickhouseClient={clickhouseClient}
source={traceSource}
logSource={logSource}
traceId={expandedTraceId}
searchQuery={detailSearchQuery}
selectedIndex={traceSelectedIndex}
onSelectedIndexChange={onTraceSelectedIndexChange}
maxRows={waterfallMaxRows}
highlightHint={
expandedSpanId
? {
spanId: expandedSpanId,
kind: source.kind === 'log' ? 'log' : 'span',
}
: undefined
}
wrapLines={wrapLines}
detailScrollOffset={traceDetailScrollOffset}
detailMaxRows={detailMaxRows}
/>
);
})()}
{detailTab === 'columns' && (
/* ---- Column Values tab ---- */
<Box
flexDirection="column"
height={fullDetailMaxRows}
overflowY="hidden"
>
{expandedRowLoading ? (
<Text>
<Spinner type="dots" /> Loading all fields
</Text>
) : expandedRowData ? (
<ColumnValues
data={expandedRowData}
searchQuery={detailSearchQuery}
wrapLines={wrapLines}
maxRows={fullDetailMaxRows}
scrollOffset={columnValuesScrollOffset}
/>
) : null}
</Box>
)}
</Box>
);
}


@@ -0,0 +1,274 @@
import React, { useState, useCallback, useRef, useMemo } from 'react';
import { Box, useStdout } from 'ink';
import type { TimeRange } from '@/utils/editor';
import type { EventViewerProps, SwitchItem } from './types';
import { getColumns, getDynamicColumns, formatDynamicRow } from './utils';
import { Header, TabBar, SearchBar, Footer, HelpScreen } from './SubComponents';
import { TableView } from './TableView';
import { DetailPanel } from './DetailPanel';
import { useEventData } from './useEventData';
import { useKeybindings } from './useKeybindings';
export default function EventViewer({
clickhouseClient,
metadata,
source,
sources,
savedSearches,
onSavedSearchSelect,
initialQuery = '',
follow = true,
}: EventViewerProps) {
const { stdout } = useStdout();
const termHeight = stdout?.rows ?? 24;
const maxRows = Math.max(1, termHeight - 8);
// Fixed height for Event Details in Trace tab (about 1/3 of terminal)
const detailMaxRows = Math.max(5, Math.floor(termHeight / 3));
// Full-screen height for Overview/Column Values tabs
// (termHeight minus header, body preview, tab bar, separator, footer)
const fullDetailMaxRows = Math.max(5, termHeight - 9);
// ---- UI state ----------------------------------------------------
const [searchQuery, setSearchQuery] = useState(initialQuery);
const [submittedQuery, setSubmittedQuery] = useState(initialQuery);
const [isFollowing, setIsFollowing] = useState(follow);
const wasFollowingRef = useRef(false);
const [scrollOffset, setScrollOffset] = useState(0);
const [focusSearch, setFocusSearch] = useState(false);
const [showHelp, setShowHelp] = useState(false);
const [wrapLines, setWrapLines] = useState(false);
const [customSelectMap, setCustomSelectMap] = useState<
Record<string, string>
>({});
const customSelect = customSelectMap[source.id] as string | undefined;
const [selectedRow, setSelectedRow] = useState(0);
const [expandedRow, setExpandedRow] = useState<number | null>(null);
const [detailTab, setDetailTab] = useState<'overview' | 'columns' | 'trace'>(
'overview',
);
const [detailSearchQuery, setDetailSearchQuery] = useState('');
const [focusDetailSearch, setFocusDetailSearch] = useState(false);
const [traceDetailScrollOffset, setTraceDetailScrollOffset] = useState(0);
const [columnValuesScrollOffset, setColumnValuesScrollOffset] = useState(0);
const [traceSelectedIndex, setTraceSelectedIndex] = useState<number | null>(
null,
);
const [timeRange, setTimeRange] = useState<TimeRange>(() => {
const now = new Date();
return { start: new Date(now.getTime() - 60 * 60 * 1000), end: now };
});
// ---- Data fetching -----------------------------------------------
const {
events,
loading,
error,
hasMore,
loadingMore,
expandedRowData,
expandedRowLoading,
expandedTraceId,
expandedSpanId,
fetchNextPage,
} = useEventData({
clickhouseClient,
metadata,
source,
customSelect,
submittedQuery,
timeRange,
isFollowing,
setTimeRange,
expandedRow,
});
// ---- Derived values ----------------------------------------------
const columns = useMemo(
() => (events.length > 0 ? getDynamicColumns(events) : getColumns(source)),
[source, events],
);
const switchItems = useMemo<SwitchItem[]>(() => {
const items: SwitchItem[] = [];
for (const ss of savedSearches) {
const src = sources.find(s => s.id === ss.source || s._id === ss.source);
items.push({
type: 'saved',
label: `${ss.name}${src ? ` (${src.name})` : ''}`,
search: ss,
});
}
for (const src of sources) {
items.push({ type: 'source', label: src.name, source: src });
}
return items;
}, [savedSearches, sources]);
const findActiveIndex = useCallback(() => {
const ssIdx = switchItems.findIndex(
item =>
item.type === 'saved' &&
item.search &&
(item.search.source === source.id ||
item.search.source === source._id) &&
item.search.where === submittedQuery,
);
if (ssIdx >= 0) return ssIdx;
const srcIdx = switchItems.findIndex(
item =>
item.type === 'source' &&
item.source &&
(item.source.id === source.id || item.source._id === source._id),
);
return srcIdx >= 0 ? srcIdx : 0;
}, [switchItems, source, submittedQuery]);
const activeIdx = findActiveIndex();
const visibleRowCount = Math.min(events.length - scrollOffset, maxRows);
// ---- Keybindings -------------------------------------------------
useKeybindings({
focusSearch,
focusDetailSearch,
showHelp,
expandedRow,
detailTab,
selectedRow,
scrollOffset,
isFollowing,
hasMore,
events,
maxRows,
visibleRowCount,
source,
timeRange,
customSelect,
detailMaxRows,
fullDetailMaxRows,
switchItems,
findActiveIndex,
onSavedSearchSelect,
setFocusSearch,
setFocusDetailSearch,
setShowHelp,
setSelectedRow,
setScrollOffset,
setExpandedRow,
setDetailTab,
setIsFollowing,
setWrapLines,
setDetailSearchQuery,
setTraceSelectedIndex,
setTraceDetailScrollOffset,
setColumnValuesScrollOffset,
setTimeRange,
setCustomSelectMap,
wasFollowingRef,
fetchNextPage,
});
// ---- Pre-format visible rows -------------------------------------
const visibleRows = useMemo(() => {
return events.slice(scrollOffset, scrollOffset + maxRows).map(row => ({
...formatDynamicRow(row, columns),
raw: row,
}));
}, [events, scrollOffset, maxRows, columns]);
const errorLine = error ? error.slice(0, 200) : '';
// ---- Render ------------------------------------------------------
if (showHelp) {
return (
<Box flexDirection="column" paddingX={1} height={termHeight}>
<HelpScreen />
</Box>
);
}
return (
<Box flexDirection="column" paddingX={1} height={termHeight}>
<Header
sourceName={source.name}
dbName={source.from.databaseName}
tableName={source.from.tableName}
isFollowing={isFollowing}
loading={loading}
timeRange={timeRange}
/>
{expandedRow === null && (
<>
<TabBar items={switchItems} activeIdx={activeIdx} />
<SearchBar
focused={focusSearch}
query={searchQuery}
onChange={setSearchQuery}
onSubmit={() => {
setSubmittedQuery(searchQuery);
setScrollOffset(0);
setFocusSearch(false);
}}
/>
</>
)}
{expandedRow !== null ? (
<DetailPanel
source={source}
sources={sources}
clickhouseClient={clickhouseClient}
detailTab={detailTab}
expandedRowData={expandedRowData}
expandedRowLoading={expandedRowLoading}
expandedTraceId={expandedTraceId}
expandedSpanId={expandedSpanId}
traceSelectedIndex={traceSelectedIndex}
onTraceSelectedIndexChange={setTraceSelectedIndex}
detailSearchQuery={detailSearchQuery}
focusDetailSearch={focusDetailSearch}
onDetailSearchQueryChange={setDetailSearchQuery}
onDetailSearchSubmit={() => setFocusDetailSearch(false)}
wrapLines={wrapLines}
termHeight={termHeight}
fullDetailMaxRows={fullDetailMaxRows}
detailMaxRows={detailMaxRows}
columnValuesScrollOffset={columnValuesScrollOffset}
traceDetailScrollOffset={traceDetailScrollOffset}
expandedFormattedRow={visibleRows.find(
(_, i) => scrollOffset + i === expandedRow,
)}
scrollOffset={scrollOffset}
expandedRow={expandedRow}
/>
) : (
<TableView
columns={columns}
visibleRows={visibleRows}
selectedRow={selectedRow}
focusSearch={focusSearch}
wrapLines={wrapLines}
maxRows={maxRows}
errorLine={errorLine}
loading={loading}
/>
)}
<Footer
rowCount={events.length}
cursorPos={scrollOffset + selectedRow + 1}
wrapLines={wrapLines}
isFollowing={isFollowing}
loadingMore={loadingMore}
scrollInfo={expandedRow !== null ? `Ctrl+D/U to scroll` : undefined}
/>
</Box>
);
}
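The three row budgets computed at the top of `EventViewer` (`maxRows`, `detailMaxRows`, `fullDetailMaxRows`) are plain terminal-height arithmetic. A standalone sketch of the same math (the `viewportRows` helper is hypothetical, not an export of this file):

```typescript
// Mirrors the layout math at the top of EventViewer.
function viewportRows(termHeight: number) {
  return {
    // Table view: header, tab bar, search bar, footer etc. take ~8 rows.
    maxRows: Math.max(1, termHeight - 8),
    // Event Details section in the Trace tab: roughly 1/3 of the terminal.
    detailMaxRows: Math.max(5, Math.floor(termHeight / 3)),
    // Full-screen Overview / Column Values tabs: ~9 rows of chrome.
    fullDetailMaxRows: Math.max(5, termHeight - 9),
  };
}

console.log(viewportRows(24)); // → { maxRows: 16, detailMaxRows: 8, fullDetailMaxRows: 15 }
console.log(viewportRows(6)); // small terminals clamp to the minimums
```

The `Math.max` clamps keep the TUI usable on very short terminals instead of producing zero or negative heights.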


@@ -0,0 +1,219 @@
import React from 'react';
import { Box, Text } from 'ink';
import TextInput from 'ink-text-input';
import Spinner from 'ink-spinner';
import type { TimeRange } from '@/utils/editor';
import type { Column, SwitchItem } from './types';
import { formatShortDate } from './utils';
// ---- Header --------------------------------------------------------
type HeaderProps = {
sourceName: string;
dbName: string;
tableName: string;
isFollowing: boolean;
loading: boolean;
timeRange: TimeRange;
};
export const Header = React.memo(function Header({
sourceName,
dbName,
tableName,
isFollowing,
loading,
timeRange,
}: HeaderProps) {
return (
<Box>
<Text bold color="cyan">
HyperDX
</Text>
<Text> </Text>
<Text color="green">{sourceName}</Text>
<Text dimColor>
{' '}
({dbName}.{tableName})
</Text>
<Text dimColor>
{' '}
{formatShortDate(timeRange.start)} → {formatShortDate(timeRange.end)}
</Text>
{isFollowing && <Text color="yellow"> [FOLLOWING]</Text>}
{loading && (
<Text>
{' '}
<Spinner type="dots" />
</Text>
)}
</Box>
);
});
// ---- TabBar --------------------------------------------------------
type TabBarProps = {
items: SwitchItem[];
activeIdx: number;
};
export const TabBar = React.memo(function TabBar({
items,
activeIdx,
}: TabBarProps) {
if (items.length <= 1) return null;
return (
<Box height={1} overflowX="hidden" overflowY="hidden">
{items.map((item, i) => (
<Box key={`${item.type}-${i}`} marginRight={2}>
<Text
color={i === activeIdx ? 'cyan' : undefined}
bold={i === activeIdx}
dimColor={i !== activeIdx}
wrap="truncate"
>
{i === activeIdx ? '▸ ' : ' '}
{item.label}
</Text>
</Box>
))}
</Box>
);
});
// ---- SearchBar -----------------------------------------------------
type SearchBarProps = {
focused: boolean;
query: string;
onChange: (v: string) => void;
onSubmit: () => void;
};
export const SearchBar = React.memo(function SearchBar({
focused,
query,
onChange,
onSubmit,
}: SearchBarProps) {
return (
<Box>
<Text color={focused ? 'cyan' : 'gray'}>Search: </Text>
{focused ? (
<TextInput
value={query}
onChange={onChange}
onSubmit={onSubmit}
placeholder="Lucene query…"
/>
) : (
<Text dimColor>{query || '(empty)'}</Text>
)}
</Box>
);
});
// ---- Footer --------------------------------------------------------
type FooterProps = {
rowCount: number;
cursorPos: number;
wrapLines: boolean;
isFollowing: boolean;
loadingMore: boolean;
scrollInfo?: string;
};
export const Footer = React.memo(function Footer({
rowCount,
cursorPos,
wrapLines,
isFollowing,
loadingMore,
scrollInfo,
}: FooterProps) {
return (
<Box marginTop={1} justifyContent="space-between">
<Text dimColor>
{isFollowing ? '[FOLLOWING] ' : ''}
{wrapLines ? '[WRAP] ' : ''}
{loadingMore ? '[LOADING…] ' : ''}?=help q=quit
</Text>
<Text dimColor>
{scrollInfo ? `${scrollInfo} ` : ''}
{cursorPos}/{rowCount}
</Text>
</Box>
);
});
// ---- HelpScreen ----------------------------------------------------
export const HelpScreen = React.memo(function HelpScreen() {
const keys: Array<[string, string]> = [
['j / ↓', 'Move selection down'],
['k / ↑', 'Move selection up'],
['l / Enter', 'Expand row detail (SELECT *)'],
['h / Esc', 'Close row detail'],
['Tab (detail)', 'Switch Overview / Column Values / Trace'],
['G', 'Jump to last item'],
['g', 'Jump to first item'],
['Ctrl+D', 'Page down (half page)'],
['Ctrl+U', 'Page up (half page)'],
['/', 'Search (global or detail filter)'],
['Esc', 'Blur search bar'],
['Tab', 'Next source / saved search'],
['Shift+Tab', 'Previous source / saved search'],
['t', 'Edit time range in $EDITOR'],
['s', 'Edit select clause in $EDITOR'],
['f', 'Toggle follow mode (live tail)'],
['w', 'Toggle line wrap'],
['?', 'Toggle this help'],
['q', 'Quit'],
];
return (
<Box flexDirection="column" paddingX={1} paddingY={1}>
<Text bold color="cyan">
Keybindings
</Text>
<Text> </Text>
{keys.map(([key, desc]) => (
<Box key={key}>
<Box width={20}>
<Text bold color="yellow">
{key}
</Text>
</Box>
<Text>{desc}</Text>
</Box>
))}
<Text> </Text>
<Text dimColor>Press ? or Esc to close</Text>
</Box>
);
});
// ---- TableHeader ---------------------------------------------------
type TableHeaderProps = {
columns: Column[];
};
export const TableHeader = React.memo(function TableHeader({
columns,
}: TableHeaderProps) {
return (
<Box overflowX="hidden">
{columns.map((col, i) => (
<Box key={i} width={col.width} overflowX="hidden">
<Text bold dimColor wrap="truncate">
{col.header}
</Text>
</Box>
))}
</Box>
);
});


@@ -0,0 +1,86 @@
import React from 'react';
import { Box, Text } from 'ink';
import type { Column, FormattedRow } from './types';
import { TableHeader } from './SubComponents';
type VisibleRow = FormattedRow & { raw: Record<string, string | number> };
type TableViewProps = {
columns: Column[];
visibleRows: VisibleRow[];
selectedRow: number;
focusSearch: boolean;
wrapLines: boolean;
maxRows: number;
errorLine: string;
loading: boolean;
};
export function TableView({
columns,
visibleRows,
selectedRow,
focusSearch,
wrapLines,
maxRows,
errorLine,
loading,
}: TableViewProps) {
return (
<Box flexDirection="column" marginTop={1} height={maxRows + 1}>
<TableHeader columns={columns} />
{errorLine ? (
<Text color="red" wrap="truncate">
{errorLine}
</Text>
) : visibleRows.length === 0 && !loading ? (
<Text dimColor>No events found.</Text>
) : null}
{visibleRows.map((row, i) => {
const isSelected = i === selectedRow && !focusSearch;
return (
<Box key={i} overflowX="hidden">
<Box width={2}>
<Text color="cyan" bold>
{isSelected ? '▸' : ' '}
</Text>
</Box>
{row.cells.map((cell, ci) => (
<Box
key={ci}
width={columns[ci]?.width ?? '10%'}
overflowX={wrapLines ? undefined : 'hidden'}
>
<Text
wrap={wrapLines ? 'wrap' : 'truncate'}
color={
isSelected
? 'cyan'
: ci === 0
? 'gray'
: row.severityColor && ci === 1
? row.severityColor
: undefined
}
bold={(ci === 1 && !!row.severityColor) || isSelected}
dimColor={ci === 0 && !isSelected}
inverse={isSelected}
>
{cell}
</Text>
</Box>
))}
</Box>
);
})}
{visibleRows.length < maxRows &&
Array.from({ length: maxRows - visibleRows.length }).map((_, i) => (
<Text key={`pad-${i}`}> </Text>
))}
</Box>
);
}


@@ -0,0 +1 @@
export { default } from './EventViewer';


@@ -0,0 +1,44 @@
import type { Metadata } from '@hyperdx/common-utils/dist/core/metadata';
import type {
SourceResponse,
SavedSearchResponse,
ProxyClickhouseClient,
} from '@/api/client';
import type { TimeRange } from '@/utils/editor';
export interface EventViewerProps {
clickhouseClient: ProxyClickhouseClient;
metadata: Metadata;
source: SourceResponse;
sources: SourceResponse[];
savedSearches: SavedSearchResponse[];
onSavedSearchSelect: (search: SavedSearchResponse) => void;
initialQuery?: string;
follow?: boolean;
}
export interface EventRow {
[key: string]: string | number;
}
export interface Column {
header: string;
/** Percentage width string, e.g. "20%" */
width: string;
}
export interface FormattedRow {
cells: string[];
severityColor?: 'red' | 'yellow' | 'blue' | 'gray';
}
export interface SwitchItem {
type: 'saved' | 'source';
label: string;
search?: SavedSearchResponse;
source?: SourceResponse;
}
export const TAIL_INTERVAL_MS = 2000;
export const PAGE_SIZE = 200;
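`Column.width` is a percentage string, and `getDynamicColumns` (defined in `./utils`, not shown in this diff) derives such widths from query results. A hypothetical even-split sketch, purely for illustration; the real implementation may weight columns differently (e.g. a wider body column):

```typescript
interface Column {
  header: string;
  /** Percentage width string, e.g. "20%" */
  width: string;
}

// Hypothetical even-split sketch: one column per result key,
// each getting an equal share of the terminal width.
function evenColumns(row: Record<string, unknown>): Column[] {
  const keys = Object.keys(row);
  const pct = Math.floor(100 / Math.max(1, keys.length));
  return keys.map(header => ({ header, width: `${pct}%` }));
}

console.log(evenColumns({ Timestamp: 1, Severity: 'info', Body: 'hi' }));
// → three columns, each with width "33%"
```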


@@ -0,0 +1,304 @@
import { useState, useCallback, useEffect, useRef } from 'react';
import type { Metadata } from '@hyperdx/common-utils/dist/core/metadata';
import type { SourceResponse, ProxyClickhouseClient } from '@/api/client';
import { buildEventSearchQuery, buildFullRowQuery } from '@/api/eventQuery';
import { ROW_DATA_ALIASES } from '@/shared/rowDataPanel';
import type { TimeRange } from '@/utils/editor';
import type { EventRow } from './types';
import { TAIL_INTERVAL_MS, PAGE_SIZE } from './types';
// ---- Types ---------------------------------------------------------
export interface UseEventDataParams {
clickhouseClient: ProxyClickhouseClient;
metadata: Metadata;
source: SourceResponse;
customSelect: string | undefined;
submittedQuery: string;
timeRange: TimeRange;
isFollowing: boolean;
setTimeRange: React.Dispatch<React.SetStateAction<TimeRange>>;
expandedRow: number | null;
}
export interface UseEventDataReturn {
events: EventRow[];
loading: boolean;
error: string | null;
hasMore: boolean;
loadingMore: boolean;
expandedRowData: Record<string, unknown> | null;
expandedRowLoading: boolean;
expandedTraceId: string | null;
expandedSpanId: string | null;
fetchNextPage: () => Promise<void>;
}
// ---- Hook ----------------------------------------------------------
export function useEventData({
clickhouseClient,
metadata,
source,
customSelect,
submittedQuery,
timeRange,
isFollowing,
setTimeRange,
expandedRow,
}: UseEventDataParams): UseEventDataReturn {
const tsExpr = source.timestampValueExpression ?? 'TimestampTime';
const [events, setEvents] = useState<EventRow[]>([]);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const [hasMore, setHasMore] = useState(true);
const [loadingMore, setLoadingMore] = useState(false);
const [expandedRowData, setExpandedRowData] = useState<Record<
string,
unknown
> | null>(null);
const [expandedRowLoading, setExpandedRowLoading] = useState(false);
const [expandedTraceId, setExpandedTraceId] = useState<string | null>(null);
const [expandedSpanId, setExpandedSpanId] = useState<string | null>(null);
const lastTimestampRef = useRef<string | null>(null);
const dateRangeRef = useRef<{ start: Date; end: Date } | null>(null);
const lastTableChSqlRef = useRef<{
sql: string;
params: Record<string, unknown>;
} | null>(null);
const lastTableMetaRef = useRef<Array<{ name: string; type: string }> | null>(
null,
);
// ---- fetchEvents -------------------------------------------------
const fetchEvents = useCallback(
async (
query: string,
startTime: Date,
endTime: Date,
mode: 'replace' | 'prepend' = 'replace',
) => {
setLoading(true);
setError(null);
try {
const chSql = await buildEventSearchQuery(
{
source,
selectOverride: customSelect,
searchQuery: query,
startTime,
endTime,
limit: PAGE_SIZE,
},
metadata,
);
lastTableChSqlRef.current = chSql;
const resultSet = await clickhouseClient.query({
query: chSql.sql,
query_params: chSql.params,
format: 'JSON',
connectionId: source.connection,
});
const json = (await resultSet.json()) as {
data: EventRow[];
meta?: Array<{ name: string; type: string }>;
};
const rows = (json.data ?? []) as EventRow[];
if (json.meta) {
lastTableMetaRef.current = json.meta;
}
if (mode === 'prepend' && rows.length > 0) {
setEvents(prev => [...rows, ...prev]);
} else {
setEvents(rows);
setHasMore(rows.length >= PAGE_SIZE);
dateRangeRef.current = { start: startTime, end: endTime };
}
if (rows.length > 0) {
const ts = rows[0][tsExpr];
if (ts) lastTimestampRef.current = String(ts);
}
} catch (err: unknown) {
setError(err instanceof Error ? err.message : String(err));
} finally {
setLoading(false);
}
},
[clickhouseClient, metadata, source, tsExpr, customSelect],
);
// ---- fetchNextPage -----------------------------------------------
const fetchNextPage = useCallback(async () => {
if (!hasMore || loadingMore || !dateRangeRef.current) return;
setLoadingMore(true);
try {
const { start, end } = dateRangeRef.current;
const chSql = await buildEventSearchQuery(
{
source,
selectOverride: customSelect,
searchQuery: submittedQuery,
startTime: start,
endTime: end,
limit: PAGE_SIZE,
offset: events.length,
},
metadata,
);
const resultSet = await clickhouseClient.query({
query: chSql.sql,
query_params: chSql.params,
format: 'JSON',
connectionId: source.connection,
});
const json = await resultSet.json<EventRow>();
const rows = (json.data ?? []) as EventRow[];
if (rows.length > 0) {
setEvents(prev => [...prev, ...rows]);
}
setHasMore(rows.length >= PAGE_SIZE);
} catch {
// Non-fatal — just stop pagination
setHasMore(false);
} finally {
setLoadingMore(false);
}
}, [
hasMore,
loadingMore,
events.length,
submittedQuery,
customSelect,
source,
metadata,
clickhouseClient,
]);
// ---- Effect: initial fetch + re-fetch on query/time change -------
useEffect(() => {
fetchEvents(submittedQuery, timeRange.start, timeRange.end, 'replace');
}, [submittedQuery, timeRange, fetchEvents]);
// ---- Effect: follow mode -----------------------------------------
useEffect(() => {
if (!isFollowing) return;
const tick = () => {
// Slide the time range forward — the replace-fetch effect picks it up
setTimeRange(prev => {
const now = new Date();
const duration = prev.end.getTime() - prev.start.getTime();
return { start: new Date(now.getTime() - duration), end: now };
});
};
// Fire immediately on activation, then repeat on interval
tick();
const interval = setInterval(tick, TAIL_INTERVAL_MS);
return () => clearInterval(interval);
}, [isFollowing]);
// ---- Effect: fetch full row data when expanded (SELECT *) --------
useEffect(() => {
if (expandedRow === null) {
setExpandedRowData(null);
setExpandedTraceId(null);
setExpandedSpanId(null);
return;
}
const row = events[expandedRow];
if (!row) return;
let cancelled = false;
setExpandedRowLoading(true);
(async () => {
try {
const tableChSql = lastTableChSqlRef.current ?? {
sql: '',
params: {},
};
const tableMeta = lastTableMetaRef.current ?? [];
const chSql = await buildFullRowQuery({
source,
row: row as Record<string, unknown>,
tableChSql,
tableMeta,
metadata,
});
const resultSet = await clickhouseClient.query({
query: chSql.sql,
query_params: chSql.params,
format: 'JSON',
connectionId: source.connection,
});
const json = await resultSet.json<Record<string, unknown>>();
const fullRow = (json.data as Record<string, unknown>[])?.[0];
if (!cancelled) {
const data = fullRow ?? (row as Record<string, unknown>);
setExpandedRowData(data);
// Extract trace ID and span ID from the full row.
// Try the source expression name first, then __hdx_* alias.
if (source.kind === 'trace' || source.kind === 'log') {
const traceIdExpr = source.traceIdExpression ?? 'TraceId';
const traceVal = String(
data[traceIdExpr] ?? data[ROW_DATA_ALIASES.TRACE_ID] ?? '',
);
setExpandedTraceId(traceVal || null);
const spanIdExpr = source.spanIdExpression ?? 'SpanId';
const spanVal = String(
data[spanIdExpr] ?? data[ROW_DATA_ALIASES.SPAN_ID] ?? '',
);
setExpandedSpanId(spanVal || null);
}
}
} catch (err) {
// Non-fatal — fall back to partial row data, but include error
const errMsg = err instanceof Error ? err.message : String(err);
// Truncate HTML errors to a readable length
const shortErr = errMsg.startsWith('<!')
? errMsg.slice(0, 200)
: errMsg;
if (!cancelled) {
setExpandedRowData({
...(row as Record<string, unknown>),
__fetch_error: shortErr,
});
}
} finally {
if (!cancelled) setExpandedRowLoading(false);
}
})();
return () => {
cancelled = true;
};
}, [expandedRow, events, source, metadata, clickhouseClient]);
return {
events,
loading,
error,
hasMore,
loadingMore,
expandedRowData,
expandedRowLoading,
expandedTraceId,
expandedSpanId,
fetchNextPage,
};
}
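The follow-mode effect above keeps the query window a fixed width and slides it up to `now` on every tick; the replace-fetch effect then re-runs because `timeRange` changed. A minimal sketch of that window math (the `slideWindow` helper name is ours, for illustration):

```typescript
interface TimeRange {
  start: Date;
  end: Date;
}

// Slide a time range forward so it ends at `now`, preserving its duration.
// This mirrors the tick in the follow-mode effect: updating `timeRange`
// causes the replace-fetch effect to re-query the new window.
function slideWindow(prev: TimeRange, now: Date): TimeRange {
  const duration = prev.end.getTime() - prev.start.getTime();
  return { start: new Date(now.getTime() - duration), end: now };
}

const prev: TimeRange = {
  start: new Date('2024-01-01T00:00:00Z'),
  end: new Date('2024-01-01T00:15:00Z'),
};
const next = slideWindow(prev, new Date('2024-01-01T01:00:00Z'));
// The window still spans 15 minutes, now ending at 01:00.
console.log(next.start.toISOString(), next.end.toISOString());
```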


@@ -0,0 +1,356 @@
import { useCallback } from 'react';
import { useInput } from 'ink';
import type { SourceResponse, SavedSearchResponse } from '@/api/client';
import {
openEditorForSelect,
openEditorForTimeRange,
type TimeRange,
} from '@/utils/editor';
import type { EventRow, SwitchItem } from './types';
// ---- Types ---------------------------------------------------------
export interface KeybindingParams {
// State values
focusSearch: boolean;
focusDetailSearch: boolean;
showHelp: boolean;
expandedRow: number | null;
detailTab: 'overview' | 'columns' | 'trace';
selectedRow: number;
scrollOffset: number;
isFollowing: boolean;
hasMore: boolean;
events: EventRow[];
maxRows: number;
visibleRowCount: number;
source: SourceResponse;
timeRange: TimeRange;
customSelect: string | undefined;
detailMaxRows: number;
fullDetailMaxRows: number;
// Tab switching
switchItems: SwitchItem[];
findActiveIndex: () => number;
onSavedSearchSelect: (search: SavedSearchResponse) => void;
// State setters
setFocusSearch: React.Dispatch<React.SetStateAction<boolean>>;
setFocusDetailSearch: React.Dispatch<React.SetStateAction<boolean>>;
setShowHelp: React.Dispatch<React.SetStateAction<boolean>>;
setSelectedRow: React.Dispatch<React.SetStateAction<number>>;
setScrollOffset: React.Dispatch<React.SetStateAction<number>>;
setExpandedRow: React.Dispatch<React.SetStateAction<number | null>>;
setDetailTab: React.Dispatch<
React.SetStateAction<'overview' | 'columns' | 'trace'>
>;
setIsFollowing: React.Dispatch<React.SetStateAction<boolean>>;
setWrapLines: React.Dispatch<React.SetStateAction<boolean>>;
setDetailSearchQuery: React.Dispatch<React.SetStateAction<string>>;
setTraceSelectedIndex: React.Dispatch<React.SetStateAction<number | null>>;
setTraceDetailScrollOffset: React.Dispatch<React.SetStateAction<number>>;
setColumnValuesScrollOffset: React.Dispatch<React.SetStateAction<number>>;
setTimeRange: React.Dispatch<React.SetStateAction<TimeRange>>;
setCustomSelectMap: React.Dispatch<
React.SetStateAction<Record<string, string>>
>;
// Refs
wasFollowingRef: React.MutableRefObject<boolean>;
// Callbacks
fetchNextPage: () => Promise<void>;
}
// ---- Hook ----------------------------------------------------------
export function useKeybindings(params: KeybindingParams): void {
const {
focusSearch,
focusDetailSearch,
showHelp,
expandedRow,
detailTab,
selectedRow,
scrollOffset,
isFollowing,
hasMore,
events,
maxRows,
visibleRowCount,
source,
timeRange,
customSelect,
detailMaxRows,
fullDetailMaxRows,
switchItems,
findActiveIndex,
onSavedSearchSelect,
setFocusSearch,
setFocusDetailSearch,
setShowHelp,
setSelectedRow,
setScrollOffset,
setExpandedRow,
setDetailTab,
setIsFollowing,
setWrapLines,
setDetailSearchQuery,
setTraceSelectedIndex,
setTraceDetailScrollOffset,
setColumnValuesScrollOffset,
setTimeRange,
setCustomSelectMap,
wasFollowingRef,
fetchNextPage,
} = params;
const handleTabSwitch = useCallback(
(direction: 1 | -1) => {
if (switchItems.length === 0) return;
const currentIdx = findActiveIndex();
const nextIdx =
(currentIdx + direction + switchItems.length) % switchItems.length;
const item = switchItems[nextIdx];
if (item.type === 'saved' && item.search) {
onSavedSearchSelect(item.search);
} else if (item.type === 'source' && item.source) {
onSavedSearchSelect({
id: '',
_id: '',
name: item.source.name,
select: '',
where: '',
whereLanguage: 'lucene',
source: item.source.id,
tags: [],
});
}
},
[switchItems, findActiveIndex, onSavedSearchSelect],
);
useInput((input, key) => {
// ? toggles help from anywhere (except search input)
if (input === '?' && !focusSearch) {
setShowHelp(s => !s);
return;
}
// When help is showing, any key closes it
if (showHelp) {
setShowHelp(false);
return;
}
if (focusDetailSearch) {
if (key.escape || key.return) {
setFocusDetailSearch(false);
return;
}
return;
}
if (focusSearch) {
if (key.tab) {
handleTabSwitch(key.shift ? -1 : 1);
return;
}
if (key.escape) {
setFocusSearch(false);
return;
}
return;
}
// j/k in Trace tab: navigate spans/log events in the waterfall
// Ctrl+D/U: scroll Event Details section
if (expandedRow !== null && detailTab === 'trace') {
if (input === 'j' || key.downArrow) {
setTraceSelectedIndex(prev => (prev === null ? 0 : prev + 1));
setTraceDetailScrollOffset(0); // reset detail scroll on span change
return;
}
if (input === 'k' || key.upArrow) {
setTraceSelectedIndex(prev =>
prev === null ? 0 : Math.max(0, prev - 1),
);
setTraceDetailScrollOffset(0); // reset detail scroll on span change
return;
}
const detailHalfPage = Math.max(1, Math.floor(detailMaxRows / 2));
if (key.ctrl && input === 'd') {
setTraceDetailScrollOffset(prev => prev + detailHalfPage);
return;
}
if (key.ctrl && input === 'u') {
setTraceDetailScrollOffset(prev => Math.max(0, prev - detailHalfPage));
return;
}
}
// Ctrl+D/U in Column Values / Overview tab: scroll the detail view
if (
expandedRow !== null &&
(detailTab === 'columns' || detailTab === 'overview')
) {
const detailHalfPage = Math.max(1, Math.floor(fullDetailMaxRows / 2));
if (key.ctrl && input === 'd') {
setColumnValuesScrollOffset(prev => prev + detailHalfPage);
return;
}
if (key.ctrl && input === 'u') {
setColumnValuesScrollOffset(prev => Math.max(0, prev - detailHalfPage));
return;
}
}
// j/k move selection cursor within visible rows
if (input === 'j' || key.downArrow) {
const absPos = scrollOffset + selectedRow;
// If at the very last event and more pages available, fetch next page
if (absPos >= events.length - 1 && hasMore) {
fetchNextPage();
return;
}
setSelectedRow(r => {
const next = r + 1;
if (next >= maxRows) {
setScrollOffset(o =>
Math.min(o + 1, Math.max(0, events.length - maxRows)),
);
return r;
}
return Math.min(next, visibleRowCount - 1);
});
}
if (input === 'k' || key.upArrow) {
setSelectedRow(r => {
const next = r - 1;
if (next < 0) {
setScrollOffset(o => Math.max(0, o - 1));
return 0;
}
return next;
});
}
if (key.return || input === 'l') {
if (expandedRow === null) {
setExpandedRow(scrollOffset + selectedRow);
setDetailTab('overview');
setDetailSearchQuery('');
setFocusDetailSearch(false);
setColumnValuesScrollOffset(0);
// Pause follow mode while detail panel is open
wasFollowingRef.current = isFollowing;
setIsFollowing(false);
}
return;
}
if (key.escape || input === 'h') {
if (focusDetailSearch) {
setFocusDetailSearch(false);
return;
}
if (expandedRow !== null) {
setExpandedRow(null);
setDetailTab('columns');
// Restore follow mode if it was active before expanding
if (wasFollowingRef.current) {
setIsFollowing(true);
}
return;
}
}
// G = jump to last item (end of list), fetch more if available
if (input === 'G') {
if (hasMore) fetchNextPage();
const maxOffset = Math.max(0, events.length - maxRows);
setScrollOffset(maxOffset);
setSelectedRow(Math.min(events.length - 1, maxRows - 1));
}
// g = jump to first item (top of list)
if (input === 'g') {
setScrollOffset(0);
setSelectedRow(0);
}
// Ctrl+D = page down, Ctrl+U = page up (half-page scroll like vim)
if (key.ctrl && input === 'd') {
const half = Math.floor(maxRows / 2);
const maxOffset = Math.max(0, events.length - maxRows);
const newOffset = Math.min(scrollOffset + half, maxOffset);
// If scrolling near the end and more data available, fetch next page
if (newOffset >= maxOffset - half && hasMore) {
fetchNextPage();
}
setScrollOffset(newOffset);
}
if (key.ctrl && input === 'u') {
const half = Math.floor(maxRows / 2);
setScrollOffset(o => Math.max(0, o - half));
}
if (key.tab) {
// When detail panel is open, Tab cycles through detail tabs
if (expandedRow !== null) {
const hasTrace =
source.kind === 'trace' ||
(source.kind === 'log' && source.traceSourceId);
const tabs: Array<'overview' | 'columns' | 'trace'> = hasTrace
? ['overview', 'columns', 'trace']
: ['overview', 'columns'];
setTraceSelectedIndex(null);
setTraceDetailScrollOffset(0);
setColumnValuesScrollOffset(0);
setDetailTab(prev => {
const idx = tabs.indexOf(prev);
const dir = key.shift ? -1 : 1;
return tabs[(idx + dir + tabs.length) % tabs.length];
});
return;
}
handleTabSwitch(key.shift ? -1 : 1);
return;
}
if (input === 'w') setWrapLines(w => !w);
if (input === 'f') setIsFollowing(prev => !prev);
if (input === '/') {
if (expandedRow !== null) {
setFocusDetailSearch(true);
} else {
setFocusSearch(true);
}
}
// t = edit time range in $EDITOR
if (input === 't') {
// Use setTimeout to let Ink finish the current render cycle
// before we hand stdin/stdout to the editor
setTimeout(() => {
const result = openEditorForTimeRange(timeRange);
if (result) {
setTimeRange(result);
setScrollOffset(0);
setSelectedRow(0);
}
}, 50);
return;
}
// s = edit select clause in $EDITOR
if (input === 's') {
setTimeout(() => {
const currentSelect =
customSelect ?? source.defaultTableSelectExpression ?? '';
const result = openEditorForSelect(currentSelect);
if (result != null) {
setCustomSelectMap(prev => ({
...prev,
[source.id]: result,
}));
setScrollOffset(0);
setSelectedRow(0);
}
}, 50);
return;
}
if (input === 'q') process.exit(0);
});
}
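The Ctrl+D/U handlers above implement vim-style half-page scrolling with the offset clamped to the list bounds. The clamping in isolation (the `halfPageScroll` helper name is ours, not part of the hook):

```typescript
// Compute the next scroll offset for a half-page jump, clamped to
// [0, totalEvents - maxRows] so the viewport never runs past the list.
function halfPageScroll(
  offset: number,
  direction: 1 | -1,
  maxRows: number,
  totalEvents: number,
): number {
  const half = Math.floor(maxRows / 2);
  const maxOffset = Math.max(0, totalEvents - maxRows);
  return Math.min(maxOffset, Math.max(0, offset + direction * half));
}

console.log(halfPageScroll(0, 1, 20, 200)); // down: 0 -> 10
console.log(halfPageScroll(175, 1, 20, 200)); // clamped at maxOffset 180
console.log(halfPageScroll(5, -1, 20, 200)); // up: clamped at 0
```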


@@ -0,0 +1,77 @@
import type { SourceResponse } from '@/api/client';
import type { Column, EventRow, FormattedRow } from './types';
// ---- Column definitions per source kind ----------------------------
export function getColumns(source: SourceResponse): Column[] {
if (source.kind === 'trace') {
return [
{ header: 'Timestamp', width: '20%' },
{ header: 'Service', width: '15%' },
{ header: 'Span', width: '25%' },
{ header: 'Duration', width: '10%' },
{ header: 'Status', width: '8%' },
{ header: 'Trace ID', width: '22%' },
];
}
// Log source
return [
{ header: 'Timestamp', width: '20%' },
{ header: 'Severity', width: '8%' },
{ header: 'Body', width: '72%' },
];
}
/**
* Derive columns dynamically from the row data.
* Distributes percentage widths: last column gets the remaining space.
*/
export function getDynamicColumns(events: EventRow[]): Column[] {
if (events.length === 0) return [];
const keys = Object.keys(events[0]);
if (keys.length === 0) return [];
const count = keys.length;
if (count === 1) return [{ header: keys[0], width: '100%' }];
// Give the last column (usually Body) more space
const otherWidth = Math.floor(60 / (count - 1));
const lastWidth = 100 - otherWidth * (count - 1);
return keys.map((key, i) => ({
header: key,
width: `${i === count - 1 ? lastWidth : otherWidth}%`,
}));
}
export function flatten(s: string): string {
return s
.replace(/\n/g, ' ')
.replace(/\s{2,}/g, ' ')
.trim();
}
/**
* Format a row generically: just stringify each value in column order.
* Used when the user has a custom select clause.
*/
export function formatDynamicRow(
row: EventRow,
columns: Column[],
): FormattedRow {
return {
cells: columns.map(col => {
const val = row[col.header];
if (val == null) return '';
if (typeof val === 'object') return flatten(JSON.stringify(val));
return flatten(String(val));
}),
};
}
export function formatShortDate(d: Date): string {
return d
.toISOString()
.replace('T', ' ')
.replace(/\.\d{3}Z$/, '');
}
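A quick check of the width split in `getDynamicColumns`: the leading columns share 60% evenly and the last column absorbs the remainder, so the percentages always sum to 100. Sketched here as a standalone re-derivation, not the exported function:

```typescript
// Leading columns share 60% evenly; the last column (usually Body)
// takes whatever is left of the 100% total.
function splitWidths(count: number): string[] {
  if (count === 1) return ['100%'];
  const otherWidth = Math.floor(60 / (count - 1));
  const lastWidth = 100 - otherWidth * (count - 1);
  return Array.from({ length: count }, (_, i) =>
    i === count - 1 ? `${lastWidth}%` : `${otherWidth}%`,
  );
}

console.log(splitWidths(3)); // [ '30%', '30%', '40%' ]
console.log(splitWidths(4)); // [ '20%', '20%', '20%', '40%' ]
```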


@@ -0,0 +1,75 @@
import React, { useState } from 'react';
import { Box, Text } from 'ink';
import TextInput from 'ink-text-input';
import Spinner from 'ink-spinner';
interface LoginFormProps {
apiUrl: string;
onLogin: (email: string, password: string) => Promise<boolean>;
}
type Field = 'email' | 'password';
export default function LoginForm({ apiUrl, onLogin }: LoginFormProps) {
const [field, setField] = useState<Field>('email');
const [email, setEmail] = useState('');
const [password, setPassword] = useState('');
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const handleSubmitEmail = () => {
if (!email.trim()) return;
setField('password');
};
const handleSubmitPassword = async () => {
if (!password) return;
setLoading(true);
setError(null);
const ok = await onLogin(email, password);
setLoading(false);
if (!ok) {
setError('Login failed. Check your email and password.');
setField('email');
setEmail('');
setPassword('');
}
};
return (
<Box flexDirection="column" paddingX={1}>
<Text bold color="cyan">
HyperDX TUI Login
</Text>
<Text dimColor>Server: {apiUrl}</Text>
<Text> </Text>
{error && <Text color="red">{error}</Text>}
{loading ? (
<Text>
<Spinner type="dots" /> Logging in
</Text>
) : field === 'email' ? (
<Box>
<Text>Email: </Text>
<TextInput
value={email}
onChange={setEmail}
onSubmit={handleSubmitEmail}
/>
</Box>
) : (
<Box>
<Text>Password: </Text>
<TextInput
value={password}
onChange={setPassword}
onSubmit={handleSubmitPassword}
mask="*"
/>
</Box>
)}
</Box>
);
}


@@ -0,0 +1,299 @@
/**
* Structured overview of a row's data, mirroring the web frontend's
* DBRowOverviewPanel. Three sections:
*
* 1. Top Level Attributes: standard OTel fields
* 2. Span / Log Attributes: from eventAttributesExpression
* 3. Resource Attributes: from resourceAttributesExpression
*/
import React, { useMemo } from 'react';
import { Box, Text } from 'ink';
import type { SourceResponse } from '@/api/client';
import { ROW_DATA_ALIASES } from '@/shared/rowDataPanel';
// ---- helpers -------------------------------------------------------
/** Recursively flatten a nested object into dot-separated keys. */
function flattenObject(
obj: Record<string, unknown>,
prefix = '',
): Record<string, string> {
const result: Record<string, string> = {};
for (const [key, value] of Object.entries(obj)) {
const fullKey = prefix ? `${prefix}.${key}` : key;
if (value != null && typeof value === 'object' && !Array.isArray(value)) {
Object.assign(
result,
flattenObject(value as Record<string, unknown>, fullKey),
);
} else {
result[fullKey] =
value != null && typeof value === 'object'
? JSON.stringify(value)
: String(value ?? '');
}
}
return result;
}
function matchesQuery(
key: string,
value: string,
query: string | undefined,
): boolean {
if (!query) return true;
const q = query.toLowerCase();
return key.toLowerCase().includes(q) || value.toLowerCase().includes(q);
}
// ---- component -----------------------------------------------------
// Same list as the web frontend's DBRowOverviewPanel
const TOP_LEVEL_KEYS = [
'TraceId',
'SpanId',
'ParentSpanId',
'SpanName',
'SpanKind',
'ServiceName',
'ScopeName',
'ScopeVersion',
'Duration',
'StatusCode',
'StatusMessage',
'SeverityText',
'Body',
];
interface RowOverviewProps {
source: SourceResponse;
rowData: Record<string, unknown>;
searchQuery?: string;
wrapLines?: boolean;
/** Max visible lines (enables fixed-height viewport) */
maxRows?: number;
/** Scroll offset into the content */
scrollOffset?: number;
}
/** Collapse newlines and runs of whitespace into single spaces. */
function flatten(s: string): string {
return s
.replace(/\n/g, ' ')
.replace(/\s{2,}/g, ' ')
.trim();
}
/** A single key-value row. */
function KVRow({
label,
value,
wrapLines,
}: {
label: string;
value: string;
wrapLines?: boolean;
}) {
return (
<Box
height={wrapLines ? undefined : 1}
overflowX={wrapLines ? undefined : 'hidden'}
overflowY={wrapLines ? undefined : 'hidden'}
>
<Box width="25%" flexShrink={0} overflowX="hidden">
<Text color="cyan" wrap="truncate">
{label}
</Text>
</Box>
<Box width="75%" overflowX={wrapLines ? undefined : 'hidden'}>
<Text wrap={wrapLines ? 'wrap' : 'truncate'}>
{wrapLines ? value : flatten(value)}
</Text>
</Box>
</Box>
);
}
export default function RowOverview({
source,
rowData,
searchQuery,
wrapLines,
maxRows,
scrollOffset = 0,
}: RowOverviewProps) {
// ---- 1. Top Level Attributes ------------------------------------
const topLevelEntries = useMemo(() => {
return TOP_LEVEL_KEYS.map(key => {
const val = rowData[key];
if (val == null || val === '') return null;
const strVal =
typeof val === 'object' ? JSON.stringify(val) : String(val);
if (!matchesQuery(key, strVal, searchQuery)) return null;
return [key, strVal] as const;
}).filter(Boolean) as [string, string][];
}, [rowData, searchQuery]);
// ---- 2. Event Attributes (Span / Log) ---------------------------
const eventAttrExpr = source.eventAttributesExpression;
const eventAttrs = useMemo(() => {
// Try the source expression name first, then __hdx_* alias
const raw =
(eventAttrExpr ? rowData[eventAttrExpr] : null) ??
rowData[ROW_DATA_ALIASES.EVENT_ATTRIBUTES];
if (raw == null || typeof raw !== 'object') return null;
return flattenObject(raw as Record<string, unknown>);
}, [rowData, eventAttrExpr]);
const filteredEventAttrs = useMemo(() => {
if (!eventAttrs) return null;
const entries = Object.entries(eventAttrs).filter(([k, v]) =>
matchesQuery(k, v, searchQuery),
);
return entries.length > 0 ? entries : null;
}, [eventAttrs, searchQuery]);
const eventAttrLabel =
source.kind === 'log' ? 'Log Attributes' : 'Span Attributes';
const totalEventAttrKeys = eventAttrs ? Object.keys(eventAttrs).length : 0;
// ---- 3. Resource Attributes -------------------------------------
const resourceAttrExpr = source.resourceAttributesExpression;
const resourceAttrs = useMemo(() => {
// Try the source expression name first, then __hdx_* alias
const raw =
(resourceAttrExpr ? rowData[resourceAttrExpr] : null) ??
rowData[ROW_DATA_ALIASES.RESOURCE_ATTRIBUTES];
if (raw == null || typeof raw !== 'object') return null;
return flattenObject(raw as Record<string, unknown>);
}, [rowData, resourceAttrExpr]);
const filteredResourceAttrs = useMemo(() => {
if (!resourceAttrs) return null;
const entries = Object.entries(resourceAttrs).filter(([k, v]) =>
matchesQuery(k, v, searchQuery),
);
return entries.length > 0 ? entries : null;
}, [resourceAttrs, searchQuery]);
// ---- build flat list of renderable rows ---------------------------
const allRows = useMemo(() => {
const rows: React.ReactElement[] = [];
// Top Level Attributes
if (topLevelEntries.length > 0) {
rows.push(
<Text key="top-header" bold color="cyan">
Top Level Attributes
</Text>,
);
for (const [key, value] of topLevelEntries) {
rows.push(
<KVRow
key={`top-${key}`}
label={key}
value={value}
wrapLines={wrapLines}
/>,
);
}
}
// Event Attributes section
if (eventAttrExpr) {
rows.push(<Text key="evt-spacer"> </Text>);
rows.push(
<Box key="evt-header">
<Text bold color="cyan">
{eventAttrLabel}
</Text>
<Text dimColor>
{' '}
{`{} ${totalEventAttrKeys} key${totalEventAttrKeys !== 1 ? 's' : ''}`}
</Text>
</Box>,
);
if (filteredEventAttrs) {
for (const [key, value] of filteredEventAttrs) {
rows.push(
<KVRow
key={`evt-${key}`}
label={` ${key}`}
value={value}
wrapLines={wrapLines}
/>,
);
}
} else {
rows.push(
<Text key="evt-empty" dimColor>
{searchQuery ? 'No matching attributes.' : 'No attributes.'}
</Text>,
);
}
}
// Resource Attributes section
if (resourceAttrExpr) {
rows.push(<Text key="res-spacer"> </Text>);
rows.push(
<Text key="res-header" bold color="cyan">
Resource Attributes
</Text>,
);
if (filteredResourceAttrs) {
rows.push(
<Box
key="res-chips"
flexDirection="row"
flexWrap="wrap"
columnGap={1}
rowGap={1}
>
{filteredResourceAttrs.map(([key, value]) => (
<Text key={key} backgroundColor="#3a3a3a">
{' '}
<Text color="cyan">{key}</Text>
<Text color="whiteBright">
: {flatten(value).slice(0, 80)}
</Text>{' '}
</Text>
))}
</Box>,
);
} else {
rows.push(
<Text key="res-empty" dimColor>
{searchQuery
? 'No matching resource attributes.'
: 'No resource attributes.'}
</Text>,
);
}
}
return rows;
}, [
topLevelEntries,
eventAttrExpr,
eventAttrLabel,
totalEventAttrKeys,
filteredEventAttrs,
resourceAttrExpr,
filteredResourceAttrs,
searchQuery,
wrapLines,
]);
// ---- render with scrolling ---------------------------------------
const visibleRows =
maxRows != null
? allRows.slice(scrollOffset, scrollOffset + maxRows)
: allRows;
return <Box flexDirection="column">{visibleRows}</Box>;
}
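`flattenObject` above does the heavy lifting for the Overview tab; here it is as a standalone snippet with a sample nested-attribute input (plain objects recurse into dot-separated keys, arrays are JSON-stringified):

```typescript
// Recursively flatten a nested object into dot-separated keys.
function flattenObject(
  obj: Record<string, unknown>,
  prefix = '',
): Record<string, string> {
  const result: Record<string, string> = {};
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (value != null && typeof value === 'object' && !Array.isArray(value)) {
      // Plain objects recurse with the joined key as the new prefix
      Object.assign(
        result,
        flattenObject(value as Record<string, unknown>, fullKey),
      );
    } else {
      // Arrays (and any other leftover objects) serialize as JSON
      result[fullKey] =
        value != null && typeof value === 'object'
          ? JSON.stringify(value)
          : String(value ?? '');
    }
  }
  return result;
}

const flat = flattenObject({
  http: { method: 'GET', status_code: 200 },
  'service.name': 'api',
  tags: ['a', 'b'],
});
console.log(flat);
// { 'http.method': 'GET', 'http.status_code': '200',
//   'service.name': 'api', tags: '["a","b"]' }
```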


@@ -0,0 +1,45 @@
import React, { useState } from 'react';
import { Box, Text, useInput } from 'ink';
import type { SourceResponse } from '@/api/client';
interface SourcePickerProps {
sources: SourceResponse[];
onSelect: (source: SourceResponse) => void;
}
export default function SourcePicker({ sources, onSelect }: SourcePickerProps) {
const [selected, setSelected] = useState(0);
useInput((input, key) => {
if (key.upArrow || input === 'k') {
setSelected(s => Math.max(0, s - 1));
}
if (key.downArrow || input === 'j') {
setSelected(s => Math.min(sources.length - 1, s + 1));
}
if (key.return || input === 'l') {
onSelect(sources[selected]);
}
});
return (
<Box flexDirection="column" paddingX={1}>
<Text bold color="cyan">
Select a source:
</Text>
<Text> </Text>
{sources.map((source, i) => (
<Text key={source.id} color={i === selected ? 'green' : undefined}>
{i === selected ? '▸ ' : ' '}
{source.name}{' '}
<Text dimColor>
({source.from.databaseName}.{source.from.tableName})
</Text>
</Text>
))}
<Text> </Text>
<Text dimColor>j/k to navigate, Enter/l to select</Text>
</Box>
);
}


@@ -0,0 +1,326 @@
/**
* Trace waterfall chart for the TUI.
*
* Displays all spans for a given traceId as a tree with horizontal
* timing bars, similar to the web frontend's DBTraceWaterfallChart.
*
* When a correlated log source is provided, log events are fetched
* and merged into the tree as children of the matching trace span
* (linked via SpanId), shown with 0ms duration and a log icon.
*
* Layout per row:
* [indent] ServiceName > SpanName [===bar===] 12.3ms
*/
import React, { useEffect, useMemo } from 'react';
import { Box, Text, useStdout } from 'ink';
import Spinner from 'ink-spinner';
import ColumnValues from '@/components/ColumnValues';
import type { TraceWaterfallProps } from './types';
import {
durationMs,
formatDuration,
getStatusLabel,
getStatusColor,
getBarColor,
renderBar,
} from './utils';
import { buildTree } from './buildTree';
import { useTraceData } from './useTraceData';
export default function TraceWaterfall({
clickhouseClient,
source,
logSource,
traceId,
searchQuery,
highlightHint,
selectedIndex,
onSelectedIndexChange,
wrapLines,
detailScrollOffset = 0,
detailMaxRows,
width: propWidth,
maxRows: propMaxRows,
}: TraceWaterfallProps) {
const { stdout } = useStdout();
const termWidth = propWidth ?? stdout?.columns ?? 80;
const maxRows = propMaxRows ?? 50;
// ---- Data fetching -----------------------------------------------
const {
traceSpans,
logEvents,
loading,
error,
selectedRowData,
selectedRowLoading,
fetchSelectedRow,
} = useTraceData({ clickhouseClient, source, logSource, traceId });
// ---- Derived computations ----------------------------------------
const flatNodes = useMemo(
() => buildTree(traceSpans, logEvents),
[traceSpans, logEvents],
);
// Filter nodes by search query (fuzzy match on ServiceName + SpanName)
const filteredNodes = useMemo(() => {
if (!searchQuery) return flatNodes;
const q = searchQuery.toLowerCase();
return flatNodes.filter(node => {
const name = (node.SpanName || '').toLowerCase();
const svc = (node.ServiceName || '').toLowerCase();
return name.includes(q) || svc.includes(q);
});
}, [flatNodes, searchQuery]);
// Compute global min/max timestamps for bar positioning
// (use all nodes for consistent bar positions regardless of filter)
const { minMs, maxMs } = useMemo(() => {
let min = Infinity;
let max = -Infinity;
for (const node of flatNodes) {
const startMs = new Date(node.Timestamp).getTime();
const dur =
node.kind === 'log'
? 0
: durationMs(node.Duration || 0, source.durationPrecision);
min = Math.min(min, startMs);
max = Math.max(max, startMs + dur);
}
if (!isFinite(min)) min = 0;
if (!isFinite(max)) max = 0;
return { minMs: min, maxMs: max };
}, [flatNodes, source.durationPrecision]);
const totalDurationMs = maxMs - minMs;
const visibleNodesForIndex = useMemo(
() => filteredNodes.slice(0, propMaxRows ?? 50),
[filteredNodes, propMaxRows],
);
// Determine the effective highlighted index:
// - If selectedIndex is set (j/k navigation), use it (clamped)
// - Otherwise, find the highlightHint row
const effectiveIndex = useMemo(() => {
if (selectedIndex != null) {
return Math.max(
0,
Math.min(selectedIndex, visibleNodesForIndex.length - 1),
);
}
if (highlightHint) {
const idx = visibleNodesForIndex.findIndex(
n => n.SpanId === highlightHint.spanId && n.kind === highlightHint.kind,
);
return idx >= 0 ? idx : null;
}
return null;
}, [selectedIndex, highlightHint, visibleNodesForIndex]);
// Clamp selectedIndex if it exceeds bounds
useEffect(() => {
if (
selectedIndex != null &&
visibleNodesForIndex.length > 0 &&
selectedIndex >= visibleNodesForIndex.length
) {
onSelectedIndexChange?.(visibleNodesForIndex.length - 1);
}
}, [selectedIndex, visibleNodesForIndex.length, onSelectedIndexChange]);
// Fetch SELECT * for the selected span/log
const selectedNode =
effectiveIndex != null ? visibleNodesForIndex[effectiveIndex] : null;
useEffect(() => {
fetchSelectedRow(selectedNode);
}, [selectedNode?.SpanId, selectedNode?.Timestamp, selectedNode?.kind]);
// ---- Render ------------------------------------------------------
if (loading) {
return (
<Text>
<Spinner type="dots" /> Loading trace spans
</Text>
);
}
if (error) {
return <Text color="red">Error loading trace: {error}</Text>;
}
if (flatNodes.length === 0) {
return <Text dimColor>No spans found for this trace.</Text>;
}
// Layout: [label (~38% of width)] [bar (remaining)] [duration (10 cols)]
const labelWidth = Math.max(20, Math.floor(termWidth * 0.38));
const durationColWidth = 10;
const barWidth = Math.max(10, termWidth - labelWidth - durationColWidth - 4); // 4 for spacing
const spanCount = filteredNodes.filter(n => n.kind === 'span').length;
const logCount = filteredNodes.filter(n => n.kind === 'log').length;
const errorCount = filteredNodes.filter(n => {
if (n.kind === 'log') {
const sev = n.StatusCode?.toLowerCase();
return sev === 'error' || sev === 'fatal' || sev === 'critical';
}
return n.StatusCode === '2' || n.StatusCode === 'Error';
}).length;
const visibleNodes = filteredNodes.slice(0, maxRows);
const truncated = filteredNodes.length > maxRows;
return (
<Box flexDirection="column">
{/* Summary */}
<Box>
<Text dimColor>
{spanCount} span{spanCount !== 1 ? 's' : ''}
</Text>
{logCount > 0 && (
<Text dimColor>
, {logCount} log{logCount !== 1 ? 's' : ''}
</Text>
)}
{errorCount > 0 && (
<Text color="red">
{' '}
({errorCount} error{errorCount !== 1 ? 's' : ''})
</Text>
)}
<Text dimColor> | total {totalDurationMs.toFixed(1)}ms</Text>
</Box>
{/* Header */}
<Box>
<Box width={labelWidth}>
<Text bold dimColor>
Span
</Text>
</Box>
<Box width={barWidth + 2}>
<Text bold dimColor>
Timeline
</Text>
</Box>
<Box width={durationColWidth}>
<Text bold dimColor>
Duration
</Text>
</Box>
</Box>
<Text dimColor>{'─'.repeat(termWidth - 2)}</Text>
{/* Span rows */}
{visibleNodes.map((node, i) => {
const indent = ' '.repeat(Math.min(node.level, 8));
const isLog = node.kind === 'log';
const icon = isLog ? '[] ' : '';
const svc = node.ServiceName ? `${node.ServiceName} > ` : '';
const name = node.SpanName || '(unknown)';
const label = `${indent}${icon}${svc}${name}`;
// Truncate label to fit
const displayLabel =
label.length > labelWidth - 1
? label.slice(0, labelWidth - 2) + '…'
: label.padEnd(labelWidth);
const startMs = new Date(node.Timestamp).getTime();
const dur = isLog
? 0
: durationMs(node.Duration || 0, source.durationPrecision);
const bar = renderBar(startMs, dur, minMs, maxMs, barWidth);
const durStr = isLog
? '0ms'
: formatDuration(node.Duration || 0, source.durationPrecision);
const statusLabel = getStatusLabel(node);
const statusColor = getStatusColor(node);
const barClr = getBarColor(node);
const isHighlighted = effectiveIndex === i;
return (
<Box key={`${node.SpanId}-${node.kind}-${i}`} overflowX="hidden">
<Box width={labelWidth} overflowX="hidden">
<Text
wrap="truncate"
color={isHighlighted ? 'white' : isLog ? 'green' : statusColor}
bold={!!statusColor}
inverse={isHighlighted}
>
{displayLabel}
</Text>
</Box>
<Box width={barWidth + 2} overflowX="hidden">
<Text color={barClr} wrap="truncate">
{bar}
</Text>
</Box>
<Box width={durationColWidth} overflowX="hidden">
<Text
dimColor={!isHighlighted}
color={isHighlighted ? 'white' : undefined}
inverse={isHighlighted}
>
{durStr}
</Text>
{statusLabel ? (
<Text
color={isHighlighted ? 'white' : statusColor}
bold
inverse={isHighlighted}
>
{' '}
{statusLabel}
</Text>
) : null}
</Box>
</Box>
);
})}
{truncated && (
<Text dimColor>
and {filteredNodes.length - maxRows} more (showing first {maxRows})
</Text>
)}
{/* Event Details for selected span/log — fixed height viewport */}
<Box
flexDirection="column"
marginTop={1}
height={(detailMaxRows ?? 10) + 3}
overflowY="hidden"
>
<Text bold>Event Details</Text>
<Text dimColor>{'─'.repeat(termWidth - 2)}</Text>
{selectedRowLoading ? (
<Text>
<Spinner type="dots" /> Loading event details
</Text>
) : selectedRowData ? (
<ColumnValues
data={selectedRowData}
searchQuery={searchQuery}
wrapLines={wrapLines}
maxRows={detailMaxRows}
scrollOffset={detailScrollOffset}
/>
) : effectiveIndex == null ? (
<Text dimColor>Use j/k to select a span or log event.</Text>
) : (
<Text dimColor>No details available.</Text>
)}
</Box>
</Box>
);
}

@@ -0,0 +1,105 @@
/**
* Build a tree from trace spans and (optionally) correlated log events.
*
* This is a direct port of the web frontend's DBTraceWaterfallChart
* DAG builder. Do NOT modify without checking DBTraceWaterfallChart first.
*
* Mirrors the web frontend's logic:
* - All rows (traces + logs) are merged and sorted by timestamp
* - Single pass builds the tree:
* - Trace spans use ParentSpanId for parent-child
* - Log events use `SpanId-log` as their node key and attach to
* the trace span with matching SpanId
* - Placeholder mechanism: if a child arrives before its parent,
* a placeholder is created; when the parent arrives it inherits
* the placeholder's children
* - Children appear in insertion order (already chronological
* because input is time-sorted), so DFS produces a timeline
*/
import type { TaggedSpanRow, SpanNode } from './types';
export function buildTree(
traceSpans: TaggedSpanRow[],
logEvents: TaggedSpanRow[],
): SpanNode[] {
const validSpanIds = new Set(
traceSpans.filter(s => s.SpanId).map(s => s.SpanId),
);
const rootNodes: SpanNode[] = [];
const nodesMap = new Map<string, SpanNode>(); // Maps nodeId → Node
const spanIdMap = new Map<string, string>(); // Maps SpanId → nodeId of FIRST node
// Merge and sort by timestamp (matches web frontend)
const allRows: TaggedSpanRow[] = [...traceSpans, ...logEvents];
allRows.sort(
(a, b) => new Date(a.Timestamp).getTime() - new Date(b.Timestamp).getTime(),
);
let idCounter = 0;
for (const row of allRows) {
const { kind, SpanId, ParentSpanId } = row;
if (!SpanId) continue;
const nodeSpanId = kind === 'log' ? `${SpanId}-log` : SpanId;
const nodeParentSpanId = kind === 'log' ? SpanId : ParentSpanId || '';
const nodeId = `n-${idCounter++}`;
const curNode: SpanNode = { ...row, children: [], level: 0 };
if (kind === 'span') {
if (!spanIdMap.has(nodeSpanId)) {
spanIdMap.set(nodeSpanId, nodeId);
// Inherit children from placeholder if one was created earlier
const placeholderId = `placeholder-${nodeSpanId}`;
const placeholder = nodesMap.get(placeholderId);
if (placeholder) {
curNode.children = placeholder.children || [];
nodesMap.delete(placeholderId);
}
}
// Always add to nodesMap with unique nodeId
nodesMap.set(nodeId, curNode);
}
const isRootNode =
kind === 'span' &&
(!nodeParentSpanId || !validSpanIds.has(nodeParentSpanId));
if (isRootNode) {
rootNodes.push(curNode);
} else {
const parentNodeId = spanIdMap.get(nodeParentSpanId);
let parentNode = parentNodeId ? nodesMap.get(parentNodeId) : undefined;
if (!parentNode) {
const placeholderId = `placeholder-${nodeParentSpanId}`;
parentNode = nodesMap.get(placeholderId);
if (!parentNode) {
parentNode = { children: [] } as unknown as SpanNode;
nodesMap.set(placeholderId, parentNode);
}
}
parentNode.children.push(curNode);
}
}
// Flatten via in-order DFS traversal
const flattenNode = (node: SpanNode, result: SpanNode[], level: number) => {
node.level = level;
result.push(node);
for (const child of node.children) {
flattenNode(child, result, level + 1);
}
};
const flattened: SpanNode[] = [];
for (const root of rootNodes) {
flattenNode(root, flattened, 0);
}
return flattened;
}
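The placeholder mechanism above is the subtle part: because rows are time-sorted, a child can arrive before its parent. A trimmed, spans-only sketch (illustrative types and row data, not the CLI's real ones) shows the inheritance step in isolation:

```typescript
interface Row { SpanId: string; ParentSpanId: string; Timestamp: string }
interface Node extends Row { children: Node[]; level: number }

function buildSpanTree(rows: Row[]): Node[] {
  const valid = new Set(rows.map(r => r.SpanId));
  const roots: Node[] = [];
  const nodes = new Map<string, Node>(); // nodeId → node, incl. placeholders
  const spanIdToNodeId = new Map<string, string>();
  const sorted = [...rows].sort(
    (a, b) => new Date(a.Timestamp).getTime() - new Date(b.Timestamp).getTime(),
  );
  let id = 0;
  for (const row of sorted) {
    const nodeId = `n-${id++}`;
    const cur: Node = { ...row, children: [], level: 0 };
    if (!spanIdToNodeId.has(row.SpanId)) {
      spanIdToNodeId.set(row.SpanId, nodeId);
      const ph = nodes.get(`placeholder-${row.SpanId}`);
      if (ph) {
        // Parent arrived after its child: adopt the parked children.
        cur.children = ph.children;
        nodes.delete(`placeholder-${row.SpanId}`);
      }
    }
    nodes.set(nodeId, cur);
    if (!row.ParentSpanId || !valid.has(row.ParentSpanId)) {
      roots.push(cur);
    } else {
      const pid = spanIdToNodeId.get(row.ParentSpanId);
      let parent = pid ? nodes.get(pid) : undefined;
      if (!parent) {
        // Child arrived first: park it under a placeholder.
        const phId = `placeholder-${row.ParentSpanId}`;
        parent = nodes.get(phId);
        if (!parent) {
          parent = { children: [] } as unknown as Node;
          nodes.set(phId, parent);
        }
      }
      parent.children.push(cur);
    }
  }
  // In-order DFS assigns levels and produces the timeline order.
  const flat: Node[] = [];
  const dfs = (n: Node, level: number) => {
    n.level = level;
    flat.push(n);
    n.children.forEach(c => dfs(c, level + 1));
  };
  roots.forEach(r => dfs(r, 0));
  return flat;
}
```

Feeding it a child whose timestamp precedes its parent still yields the correct nesting, which is exactly the case the placeholder path covers.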

@@ -0,0 +1 @@
export { default } from './TraceWaterfall';

@@ -0,0 +1,51 @@
import type { ProxyClickhouseClient, SourceResponse } from '@/api/client';
export interface SpanRow {
Timestamp: string;
TraceId: string;
SpanId: string;
ParentSpanId: string;
SpanName: string;
ServiceName: string;
Duration: number;
StatusCode: string;
}
/** Extends SpanRow with a kind marker for distinguishing spans from logs */
export interface TaggedSpanRow extends SpanRow {
kind: 'span' | 'log';
}
export interface SpanNode extends TaggedSpanRow {
children: SpanNode[];
level: number;
}
export interface TraceWaterfallProps {
clickhouseClient: ProxyClickhouseClient;
source: SourceResponse;
/** Correlated log source (optional) */
logSource?: SourceResponse | null;
traceId: string;
/** Fuzzy filter query for span/log names */
searchQuery?: string;
/** Hint to identify the initial row to highlight in the waterfall */
highlightHint?: {
spanId: string;
kind: 'span' | 'log';
};
/** Currently selected row index (controlled by parent via j/k) */
selectedIndex?: number | null;
/** Callback when the selected index should change (e.g. clamping) */
onSelectedIndexChange?: (index: number | null) => void;
/** Toggle line wrap in Event Details */
wrapLines?: boolean;
/** Scroll offset for Event Details */
detailScrollOffset?: number;
/** Max visible rows for Event Details */
detailMaxRows?: number;
/** Available width for the chart (characters) */
width?: number;
/** Max visible rows before truncation */
maxRows?: number;
}

@@ -0,0 +1,201 @@
import { useState, useEffect, useRef } from 'react';
import SqlString from 'sqlstring';
import type { ProxyClickhouseClient, SourceResponse } from '@/api/client';
import { buildTraceSpansSql, buildTraceLogsSql } from '@/api/eventQuery';
import type { SpanRow, TaggedSpanRow, SpanNode } from './types';
// ---- Types ---------------------------------------------------------
export interface UseTraceDataParams {
clickhouseClient: ProxyClickhouseClient;
source: SourceResponse;
logSource?: SourceResponse | null;
traceId: string;
}
export interface UseTraceDataReturn {
traceSpans: TaggedSpanRow[];
logEvents: TaggedSpanRow[];
loading: boolean;
error: string | null;
selectedRowData: Record<string, unknown> | null;
selectedRowLoading: boolean;
fetchSelectedRow: (node: SpanNode | null) => void;
}
// ---- Hook ----------------------------------------------------------
export function useTraceData({
clickhouseClient,
source,
logSource,
traceId,
}: UseTraceDataParams): UseTraceDataReturn {
const [traceSpans, setTraceSpans] = useState<TaggedSpanRow[]>([]);
const [logEvents, setLogEvents] = useState<TaggedSpanRow[]>([]);
const [loading, setLoading] = useState(true);
const [error, setError] = useState<string | null>(null);
const [selectedRowData, setSelectedRowData] = useState<Record<
string,
unknown
> | null>(null);
const [selectedRowLoading, setSelectedRowLoading] = useState(false);
const fetchIdRef = useRef(0);
// ---- Fetch trace spans + correlated logs -------------------------
useEffect(() => {
let cancelled = false;
setLoading(true);
setError(null);
(async () => {
try {
// Fetch trace spans
const traceQuery = buildTraceSpansSql({ source, traceId });
const traceResultSet = await clickhouseClient.query({
query: traceQuery.sql,
format: 'JSON',
connectionId: traceQuery.connectionId,
});
const traceJson = await traceResultSet.json<SpanRow>();
const spans = ((traceJson.data ?? []) as SpanRow[]).map(r => ({
...r,
kind: 'span' as const,
}));
// Fetch correlated log events (if log source exists)
let logs: TaggedSpanRow[] = [];
if (logSource) {
const logQuery = buildTraceLogsSql({
source: logSource,
traceId,
});
const logResultSet = await clickhouseClient.query({
query: logQuery.sql,
format: 'JSON',
connectionId: logQuery.connectionId,
});
const logJson = await logResultSet.json<SpanRow>();
logs = ((logJson.data ?? []) as SpanRow[]).map(r => ({
...r,
Duration: 0, // Logs have no duration
kind: 'log' as const,
}));
}
if (!cancelled) {
setTraceSpans(spans);
setLogEvents(logs);
}
} catch (err) {
if (!cancelled) {
setError(err instanceof Error ? err.message : String(err));
}
} finally {
if (!cancelled) setLoading(false);
}
})();
return () => {
cancelled = true;
};
}, [clickhouseClient, source, logSource, traceId]);
// ---- Fetch SELECT * for the selected span/log --------------------
// Stable scalar deps (SpanId, Timestamp, kind) are used to avoid
// infinite re-fetch loops from array ref changes.
const lastNodeRef = useRef<{
spanId: string | null;
timestamp: string | null;
kind: 'span' | 'log' | null;
}>({ spanId: null, timestamp: null, kind: null });
const fetchSelectedRow = (node: SpanNode | null) => {
const spanId = node?.SpanId ?? null;
const timestamp = node?.Timestamp ?? null;
const kind = node?.kind ?? null;
// Skip if same node as last fetch
const last = lastNodeRef.current;
if (
last.spanId === spanId &&
last.timestamp === timestamp &&
last.kind === kind
) {
return;
}
lastNodeRef.current = { spanId, timestamp, kind };
if (!timestamp || !kind) {
// Don't clear existing data — keep showing old data while
// transitioning between spans to avoid flashing
return;
}
const isLog = kind === 'log';
const rowSource = isLog && logSource ? logSource : source;
const db = rowSource.from.databaseName;
const table = rowSource.from.tableName;
const spanIdExpr = rowSource.spanIdExpression ?? 'SpanId';
const tsExpr =
rowSource.displayedTimestampValueExpression ??
rowSource.timestampValueExpression ??
'TimestampTime';
const traceIdExpr = rowSource.traceIdExpression ?? 'TraceId';
const escapedTs = SqlString.escape(timestamp);
const escapedTraceId = SqlString.escape(traceId);
// Build WHERE: always use TraceId + Timestamp; add SpanId if available
const clauses = [
`${traceIdExpr} = ${escapedTraceId}`,
`${tsExpr} = parseDateTime64BestEffort(${escapedTs}, 9)`,
];
if (spanId) {
clauses.push(`${spanIdExpr} = ${SqlString.escape(spanId)}`);
}
const where = clauses.join(' AND ');
const sql = `SELECT * FROM ${db}.${table} WHERE ${where} LIMIT 1`;
const fetchId = ++fetchIdRef.current;
// Don't clear existing data or set loading — keep old data visible
// while fetching to avoid flashing
(async () => {
try {
const resultSet = await clickhouseClient.query({
query: sql,
format: 'JSON',
connectionId: rowSource.connection,
});
const json = await resultSet.json<Record<string, unknown>>();
const row = (json.data as Record<string, unknown>[])?.[0];
if (fetchId === fetchIdRef.current) {
setSelectedRowData(row ?? null);
setSelectedRowLoading(false);
}
} catch {
if (fetchId === fetchIdRef.current) {
setSelectedRowData(null);
setSelectedRowLoading(false);
}
}
})();
};
return {
traceSpans,
logEvents,
loading,
error,
selectedRowData,
selectedRowLoading,
fetchSelectedRow,
};
}

@@ -0,0 +1,82 @@
import type { SpanNode } from './types';
// ---- Duration formatting -------------------------------------------
export function durationMs(
durationRaw: number,
precision: number | undefined,
): number {
const p = precision ?? 3;
if (p === 9) return durationRaw / 1_000_000;
if (p === 6) return durationRaw / 1_000;
return durationRaw;
}
export function formatDuration(
durationRaw: number,
precision: number | undefined,
): string {
const ms = durationMs(durationRaw, precision);
if (ms === 0) return '0ms';
if (ms >= 1000) return `${(ms / 1000).toFixed(2)}s`;
if (ms >= 1) return `${ms.toFixed(1)}ms`;
if (ms >= 0.001) return `${(ms * 1000).toFixed(1)}μs`;
return `${(ms * 1_000_000).toFixed(0)}ns`;
}
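Duration columns arrive at the source's precision (9 = nanoseconds, 6 = microseconds, anything else treated as milliseconds), and the helpers above normalize to milliseconds before picking a display unit. A standalone copy for illustration:

```typescript
// Standalone copy of the conversion above, for illustration only.
function durationMs(durationRaw: number, precision: number | undefined): number {
  const p = precision ?? 3;
  if (p === 9) return durationRaw / 1_000_000; // ns → ms
  if (p === 6) return durationRaw / 1_000;     // µs → ms
  return durationRaw;                          // already ms
}

function formatDuration(durationRaw: number, precision: number | undefined): string {
  const ms = durationMs(durationRaw, precision);
  if (ms === 0) return '0ms';
  if (ms >= 1000) return `${(ms / 1000).toFixed(2)}s`;
  if (ms >= 1) return `${ms.toFixed(1)}ms`;
  if (ms >= 0.001) return `${(ms * 1000).toFixed(1)}μs`;
  return `${(ms * 1_000_000).toFixed(0)}ns`;
}
```

So a raw value of 1,500,000 at precision 9 renders as `1.5ms`, and 500 at precision 6 as `500.0μs`.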
// ---- Status helpers ------------------------------------------------
export function getStatusLabel(node: SpanNode): string {
if (node.kind === 'log') {
const sev = node.StatusCode?.toLowerCase();
if (sev === 'error' || sev === 'fatal' || sev === 'critical') return 'ERR';
if (sev === 'warn' || sev === 'warning') return 'WARN';
return '';
}
if (node.StatusCode === '2' || node.StatusCode === 'Error') return 'ERR';
if (node.StatusCode === '1') return 'WARN';
return '';
}
export function getStatusColor(node: SpanNode): 'red' | 'yellow' | undefined {
if (node.kind === 'log') {
const sev = node.StatusCode?.toLowerCase();
if (sev === 'error' || sev === 'fatal' || sev === 'critical') return 'red';
if (sev === 'warn' || sev === 'warning') return 'yellow';
return undefined;
}
if (node.StatusCode === '2' || node.StatusCode === 'Error') return 'red';
if (node.StatusCode === '1') return 'yellow';
return undefined;
}
export function getBarColor(node: SpanNode): string {
if (node.kind === 'log') return 'green';
if (node.StatusCode === '2' || node.StatusCode === 'Error') return 'red';
if (node.StatusCode === '1') return 'yellow';
return 'cyan';
}
// ---- Bar rendering -------------------------------------------------
export function renderBar(
startMs: number,
durMs: number,
minMs: number,
maxMs: number,
barWidth: number,
): string {
const totalMs = maxMs - minMs;
if (totalMs <= 0 || barWidth <= 0) return '';
const startFrac = (startMs - minMs) / totalMs;
const durFrac = durMs / totalMs;
const startCol = Math.round(startFrac * barWidth);
const barLen = Math.max(1, Math.round(durFrac * barWidth));
const endCol = Math.min(startCol + barLen, barWidth);
const leading = ' '.repeat(Math.max(0, startCol));
const bar = '█'.repeat(Math.max(1, endCol - Math.max(0, startCol)));
return (leading + bar).slice(0, barWidth);
}
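`renderBar` maps a span's start and duration onto a fixed-width character row, and zero-duration events (logs) still get a single block so they remain visible. The column maths can be sanity-checked with a standalone copy of the function:

```typescript
// Standalone copy of renderBar above, for a quick check of the column maths.
function renderBar(
  startMs: number,
  durMs: number,
  minMs: number,
  maxMs: number,
  barWidth: number,
): string {
  const totalMs = maxMs - minMs;
  if (totalMs <= 0 || barWidth <= 0) return '';
  const startFrac = (startMs - minMs) / totalMs;
  const durFrac = durMs / totalMs;
  const startCol = Math.round(startFrac * barWidth);
  const barLen = Math.max(1, Math.round(durFrac * barWidth));
  const endCol = Math.min(startCol + barLen, barWidth);
  const leading = ' '.repeat(Math.max(0, startCol));
  const bar = '█'.repeat(Math.max(1, endCol - Math.max(0, startCol)));
  return (leading + bar).slice(0, barWidth);
}
```

A span covering the second half of a 10-column window fills columns 5–9, and a log at the midpoint renders a single block at column 5.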

@@ -0,0 +1,71 @@
/**
* Row data panel helpers.
*
* @source packages/app/src/components/DBRowDataPanel.tsx
*
* Contains ROW_DATA_ALIASES and the SELECT list builder for the full row
* fetch query. Same exports as the web frontend for future move to common-utils.
*/
import type { SelectList } from '@hyperdx/common-utils/dist/types';
import type { SourceResponse } from '@/api/client';
import {
getDisplayedTimestampValueExpression,
getEventBody,
isLogSource,
isTraceSource,
} from '@/shared/source';
export enum ROW_DATA_ALIASES {
TIMESTAMP = '__hdx_timestamp',
BODY = '__hdx_body',
TRACE_ID = '__hdx_trace_id',
SPAN_ID = '__hdx_span_id',
SEVERITY_TEXT = '__hdx_severity_text',
SERVICE_NAME = '__hdx_service_name',
RESOURCE_ATTRIBUTES = '__hdx_resource_attributes',
EVENT_ATTRIBUTES = '__hdx_event_attributes',
EVENTS_EXCEPTION_ATTRIBUTES = '__hdx_events_exception_attributes',
SPAN_EVENTS = '__hdx_span_events',
}
/**
* Build the SELECT list with __hdx_* aliases for a full row fetch,
* matching the web frontend's useRowData in DBRowDataPanel.tsx.
*/
export function buildRowDataSelectList(source: SourceResponse): SelectList {
const select: SelectList = [{ valueExpression: '*' }];
const add = (expr: string | undefined, alias: string) => {
if (expr) select.push({ valueExpression: expr, alias });
};
add(getDisplayedTimestampValueExpression(source), ROW_DATA_ALIASES.TIMESTAMP);
const eventBodyExpr = getEventBody(source);
add(eventBodyExpr ?? undefined, ROW_DATA_ALIASES.BODY);
if (isLogSource(source) || isTraceSource(source)) {
add(source.traceIdExpression, ROW_DATA_ALIASES.TRACE_ID);
add(source.spanIdExpression, ROW_DATA_ALIASES.SPAN_ID);
add(source.serviceNameExpression, ROW_DATA_ALIASES.SERVICE_NAME);
}
if (isLogSource(source)) {
add(source.severityTextExpression, ROW_DATA_ALIASES.SEVERITY_TEXT);
} else if (isTraceSource(source)) {
add(source.statusCodeExpression, ROW_DATA_ALIASES.SEVERITY_TEXT);
}
add(
source.resourceAttributesExpression,
ROW_DATA_ALIASES.RESOURCE_ATTRIBUTES,
);
if (isLogSource(source) || isTraceSource(source)) {
add(source.eventAttributesExpression, ROW_DATA_ALIASES.EVENT_ATTRIBUTES);
}
return select;
}

@@ -0,0 +1,50 @@
/**
* Source helper functions.
*
* @source packages/app/src/source.ts
*
* Only the pure functions needed by the CLI are included (no React hooks,
* no API calls, no Mantine notifications).
*
* Same exports as the web frontend so they can be moved to common-utils later.
*/
import { splitAndTrimWithBracket } from '@hyperdx/common-utils/dist/core/utils';
import type { SourceResponse } from '@/api/client';
// If a user specifies a timestampValueExpression with multiple columns,
// this will return the first one. We'll want to refine this over time
export function getFirstTimestampValueExpression(valueExpression: string) {
return splitAndTrimWithBracket(valueExpression)[0];
}
export function getDisplayedTimestampValueExpression(
eventModel: SourceResponse,
) {
return (
eventModel.displayedTimestampValueExpression ??
getFirstTimestampValueExpression(
eventModel.timestampValueExpression ?? 'TimestampTime',
)
);
}
export function getEventBody(eventModel: SourceResponse) {
let expression: string | undefined;
if (eventModel.kind === 'trace') {
expression = eventModel.spanNameExpression ?? undefined;
} else if (eventModel.kind === 'log') {
expression = eventModel.bodyExpression;
}
const multiExpr = splitAndTrimWithBracket(expression ?? '');
return multiExpr.length === 1 ? expression : multiExpr[0];
}
export function isLogSource(source: SourceResponse): boolean {
return source.kind === 'log';
}
export function isTraceSource(source: SourceResponse): boolean {
return source.kind === 'trace';
}

@@ -0,0 +1,187 @@
/**
* Row WHERE clause builder.
*
* @source packages/app/src/hooks/useRowWhere.tsx
*
* Generates a WHERE clause to uniquely identify a row for a SELECT * lookup.
* Uses column metadata + alias map to resolve aliased column names back to
* actual ClickHouse expressions, with proper type handling for each column.
*
* This file uses the same exports as the web frontend so it can be moved
* to common-utils later.
*/
import MD5 from 'crypto-js/md5';
import SqlString from 'sqlstring';
import {
ColumnMetaType,
convertCHDataTypeToJSType,
JSDataType,
} from '@hyperdx/common-utils/dist/clickhouse';
import { aliasMapToWithClauses } from '@hyperdx/common-utils/dist/core/utils';
import type { BuilderChartConfig } from '@hyperdx/common-utils/dist/types';
const MAX_STRING_LENGTH = 512;
// Type for WITH clause entries, derived from ChartConfig's with property
export type WithClause = NonNullable<BuilderChartConfig['with']>[number];
// Internal row field names used by the table component for row tracking
const INTERNAL_ROW_FIELDS = {
ID: '__hyperdx_id',
ALIAS_WITH: '__hyperdx_alias_with',
} as const;
// Result type for row WHERE clause with alias support
export type RowWhereResult = {
where: string;
aliasWith: WithClause[];
};
type ColumnWithMeta = ColumnMetaType & {
valueExpr: string;
jsType: JSDataType | null;
};
function processRowToWhereClause(
row: Record<string, unknown>,
columnMap: Map<string, ColumnWithMeta>,
): string {
const res = Object.entries(row)
.map(([column, value]) => {
const cm = columnMap.get(column);
const chType = cm?.type;
const jsType = cm?.jsType;
const valueExpr = cm?.valueExpr;
if (chType == null) {
throw new Error(
`Column type not found for ${column}, ${JSON.stringify([...columnMap])}`,
);
}
if (valueExpr == null) {
throw new Error(
`valueExpr not found for ${column}, ${JSON.stringify([...columnMap])}`,
);
}
const strValue = value != null ? String(value) : null;
switch (jsType) {
case JSDataType.Date:
return SqlString.format(`?=parseDateTime64BestEffort(?, 9)`, [
SqlString.raw(valueExpr),
strValue,
]);
case JSDataType.Array:
case JSDataType.Map:
return SqlString.format(`?=JSONExtract(?, ?)`, [
SqlString.raw(valueExpr),
strValue,
chType,
]);
case JSDataType.Tuple:
return SqlString.format(`toJSONString(?)=?`, [
SqlString.raw(valueExpr),
strValue,
]);
case JSDataType.JSON:
// Handle case for whole json object, ex: json
return SqlString.format(`lower(hex(MD5(toString(?))))=?`, [
SqlString.raw(valueExpr),
MD5(strValue ?? '').toString(),
]);
case JSDataType.Dynamic:
// Handle case for json element, ex: json.c
// Currently we can't distinguish null or 'null'
if (value == null || strValue === 'null') {
return SqlString.format(`isNull(??)`, [valueExpr]);
}
if ((strValue?.length ?? 0) > 1000 || column.length > 1000) {
console.warn('Search value/object key too large.');
}
return SqlString.format(
"toJSONString(?) = coalesce(toJSONString(JSONExtract(?, 'Dynamic')), toJSONString(?))",
[SqlString.raw(valueExpr), strValue, strValue],
);
default:
// Handle nullish values
if (value == null) {
return SqlString.format(`isNull(?)`, [SqlString.raw(valueExpr)]);
}
// Handle the case when string is too long
if ((strValue?.length ?? 0) > MAX_STRING_LENGTH) {
return SqlString.format(`lower(hex(MD5(leftUTF8(?, 1000))))=?`, [
SqlString.raw(valueExpr),
MD5((strValue ?? '').substring(0, 1000)).toString(),
]);
}
return SqlString.format(`?=?`, [
SqlString.raw(valueExpr), // don't escape expressions
strValue,
]);
}
})
.join(' AND ');
return res;
}
/**
* Build a column map from query metadata and alias map.
* This is the non-React equivalent of the useRowWhere hook.
*/
export function buildColumnMap(
meta: ColumnMetaType[] | undefined,
aliasMap: Record<string, string | undefined> | undefined,
): Map<string, ColumnWithMeta> {
return new Map(
meta?.map(c => {
// if aliasMap is provided, use the alias as the valueExpr
// but if the alias is not found, use the column name as the valueExpr
const valueExpr =
aliasMap != null ? (aliasMap[c.name] ?? c.name) : c.name;
return [
c.name,
{
...c,
valueExpr,
jsType: convertCHDataTypeToJSType(c.type),
},
];
}),
);
}
/**
* Build aliasWith array from aliasMap.
*/
function buildAliasWith(
aliasMap: Record<string, string | undefined> | undefined,
): WithClause[] {
return aliasMapToWithClauses(aliasMap) ?? [];
}
/**
* Generate a RowWhereResult from a row, column map, and alias map.
* Non-React equivalent of the useRowWhere hook's returned callback.
*/
export function getRowWhere(
row: Record<string, unknown>,
columnMap: Map<string, ColumnWithMeta>,
aliasMap: Record<string, string | undefined> | undefined,
): RowWhereResult {
// Filter out synthetic columns that aren't in the database schema
const {
[INTERNAL_ROW_FIELDS.ID]: _id,
[INTERNAL_ROW_FIELDS.ALIAS_WITH]: _aliasWith,
...dbRow
} = row;
return {
where: processRowToWhereClause(dbRow, columnMap),
aliasWith: buildAliasWith(aliasMap),
};
}

@@ -0,0 +1,266 @@
/**
* Source map upload logic.
*
* Ported from hyperdx-js/packages/cli/src/lib.ts.
* Authenticates via a service account API key (not session cookies),
* globs for .js and .js.map files, and uploads them to presigned URLs.
*/
import { basename, join, resolve } from 'path';
import { readFileSync, statSync } from 'fs';
import { globSync } from 'glob';
import { createRequire } from 'module';
// Use process.stderr/stdout directly because console.error/log/info
// are silenced by silenceLogs.ts for the TUI mode.
const log = (msg: string) => process.stdout.write(`${msg}\n`);
const logError = (msg: string) => process.stderr.write(`${msg}\n`);
const require = createRequire(import.meta.url);
const PKG_VERSION: string = (require('../package.json') as { version: string })
.version;
const MAX_RETRIES = 3;
const RETRY_DELAYS = [1000, 3000]; // ms between retries
const sleep = (ms: number) => new Promise(r => setTimeout(r, ms));
/** Join URL paths without mangling the protocol (path.join strips '//') */
function urlJoin(base: string, ...segments: string[]): string {
const url = new URL(
segments.join('/'),
base.endsWith('/') ? base : `${base}/`,
);
return url.toString();
}
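As the comment notes, `path.join` collapses the `//` in `https://`, so URL segments are joined through the WHATWG `URL` constructor instead, with a trailing slash forced onto the base so relative resolution appends rather than replaces. A standalone copy of the helper:

```typescript
// Standalone copy of urlJoin above: join URL segments without mangling
// the protocol, using relative resolution against a slash-terminated base.
function urlJoin(base: string, ...segments: string[]): string {
  const url = new URL(
    segments.join('/'),
    base.endsWith('/') ? base : `${base}/`,
  );
  return url.toString();
}
```

With or without a trailing slash on the base, `urlJoin('https://api.hyperdx.io', 'api', 'v1')` resolves to `https://api.hyperdx.io/api/v1`.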
export interface UploadSourcemapsOptions {
allowNoop?: boolean;
serviceKey: string;
apiUrl?: string;
apiVersion?: string;
basePath?: string;
path: string;
releaseId?: string;
}
export async function uploadSourcemaps({
allowNoop,
serviceKey,
apiUrl,
apiVersion,
basePath,
path,
releaseId,
}: UploadSourcemapsOptions): Promise<void> {
if (!serviceKey || serviceKey === '') {
if (process.env.HYPERDX_SERVICE_KEY) {
serviceKey = process.env.HYPERDX_SERVICE_KEY;
} else {
throw new Error('service key cannot be empty');
}
}
const backend = apiUrl || 'https://api.hyperdx.io';
const version = apiVersion || 'v1';
const res = await fetch(urlJoin(backend, 'api', version), {
method: 'get',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${serviceKey}`,
},
})
.then(response => {
if (!response.ok) {
throw new Error(
`Authentication failed (${response.status}). Check your --serviceKey and --apiUrl.`,
);
}
return response.json();
})
.then(data => {
return data as { user?: { team?: string } };
})
.catch(e => {
logError(e.message || String(e));
return undefined;
});
const teamId = res?.user?.team;
if (!teamId) {
throw new Error('invalid service key');
}
log(`Starting to upload source maps from ${path}`);
const fileList = getAllSourceMapFiles([path], { allowNoop });
if (fileList.length === 0) {
logError(
`Error: No source maps found in ${path}, is this the correct path?`,
);
logError('Failed to upload source maps. Please see reason above.');
return;
}
const uploadKeys = fileList.map(({ name }) => ({
basePath: basePath || '',
fullName: name,
releaseId,
}));
const urlRes = await fetch(
urlJoin(backend, 'api', version, 'sourcemaps', 'upload-presigned-urls'),
{
method: 'post',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${serviceKey}`,
},
body: JSON.stringify({
pkgVersion: PKG_VERSION,
keys: uploadKeys,
}),
},
)
.then(response => {
if (!response.ok) {
throw new Error(`Failed to get upload URLs (${response.status}).`);
}
return response.json();
})
.then(data => {
return data as { data?: string[] };
})
.catch(e => {
logError(e.message || String(e));
return undefined;
});
if (!Array.isArray(urlRes?.data)) {
logError(
`Error: Unable to generate source map upload urls. Response: ${JSON.stringify(urlRes)}`,
);
logError('Failed to upload source maps. Please see reason above.');
return;
}
const uploadUrls = urlRes.data;
const results = await Promise.all(
fileList.map(({ path, name }, idx) =>
uploadFile(path, uploadUrls[idx], name, idx, fileList.length),
),
);
const succeeded = results.filter(Boolean).length;
const failed = results.length - succeeded;
log(
`\n[HyperDX] Upload complete: ${succeeded} succeeded, ${failed} failed out of ${results.length} files.`,
);
if (failed > 0) {
logError('[HyperDX] Some files failed to upload. See errors above.');
}
}
// ---- Helpers -------------------------------------------------------
function getAllSourceMapFiles(
paths: string[],
{ allowNoop }: { allowNoop?: boolean },
): { path: string; name: string }[] {
const map: { path: string; name: string }[] = [];
for (const path of paths) {
const realPath = resolve(path);
if (statSync(realPath).isFile()) {
map.push({
path: realPath,
name: basename(realPath),
});
continue;
}
if (
!allowNoop &&
!globSync('**/*.js.map', {
cwd: realPath,
nodir: true,
ignore: '**/node_modules/**/*',
}).length
) {
throw new Error(
'No .js.map files found. Please double check that you have generated sourcemaps for your app.',
);
}
for (const file of globSync('**/*.js?(.map)', {
cwd: realPath,
nodir: true,
ignore: '**/node_modules/**/*',
})) {
map.push({
path: join(realPath, file),
name: file,
});
const routeGroupRemovedPath = file.replaceAll(
new RegExp(/(\(.+?\))\//gm),
'',
);
if (file !== routeGroupRemovedPath) {
// also upload the file to a path without the route group for frontend errors
map.push({
path: join(realPath, file),
name: routeGroupRemovedPath,
});
}
}
}
return map;
}
async function uploadFile(
filePath: string,
uploadUrl: string,
name: string,
index: number,
total: number,
): Promise<boolean> {
const fileContent = readFileSync(filePath);
const prefix = `[HyperDX] [${index + 1}/${total}]`;
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
try {
const res = await fetch(uploadUrl, { method: 'put', body: fileContent });
if (res.ok) {
log(`${prefix} Uploaded ${name}`);
return true;
}
// 4xx — permanent failure, don't retry
if (res.status >= 400 && res.status < 500) {
logError(`${prefix} Failed to upload ${name} (${res.status})`);
return false;
}
// 5xx — server error, retry
throw new Error(`Server error (${res.status})`);
} catch (e) {
const msg = e instanceof Error ? e.message : String(e);
if (attempt < MAX_RETRIES) {
const delay = RETRY_DELAYS[attempt - 1] ?? 3000;
logError(
`${prefix} Upload failed (${msg}), retrying in ${delay / 1000}s...`,
);
await sleep(delay);
} else {
logError(
`${prefix} Failed to upload ${name} after ${MAX_RETRIES} attempts: ${msg}`,
);
return false;
}
}
}
return false;
}

@@ -0,0 +1,44 @@
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';

const CONFIG_DIR = path.join(os.homedir(), '.config', 'hyperdx', 'cli');
const SESSION_FILE = path.join(CONFIG_DIR, 'session.json');

export interface SessionConfig {
  apiUrl: string;
  cookies: string[];
}

function ensureConfigDir() {
  if (!fs.existsSync(CONFIG_DIR)) {
    fs.mkdirSync(CONFIG_DIR, { recursive: true, mode: 0o700 });
  }
}

export function saveSession(config: SessionConfig): void {
  ensureConfigDir();
  fs.writeFileSync(SESSION_FILE, JSON.stringify(config, null, 2), {
    mode: 0o600,
  });
}

export function loadSession(): SessionConfig | null {
  try {
    if (!fs.existsSync(SESSION_FILE)) return null;
    const data = fs.readFileSync(SESSION_FILE, 'utf-8');
    return JSON.parse(data) as SessionConfig;
  } catch {
    return null;
  }
}

export function clearSession(): void {
  try {
    if (fs.existsSync(SESSION_FILE)) {
      fs.unlinkSync(SESSION_FILE);
    }
  } catch {
    // ignore
  }
}
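A quick round-trip of the session file format above, as a sketch that swaps the real `~/.config/hyperdx/cli` location for a throwaway temp directory (the `SessionConfig` shape is copied from the module):

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';

// Same shape as SessionConfig above; the directory is a temp dir so the
// sketch never touches the real ~/.config/hyperdx/cli.
interface SessionConfig {
  apiUrl: string;
  cookies: string[];
}

const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'hdx-session-'));
const file = path.join(dir, 'session.json');

const session: SessionConfig = {
  apiUrl: 'https://api.hyperdx.example', // hypothetical URL for illustration
  cookies: ['connect.sid=abc123'],
};

// Write with owner-only permissions, as saveSession does.
fs.writeFileSync(file, JSON.stringify(session, null, 2), { mode: 0o600 });

// Read it back the way loadSession does.
const loaded = JSON.parse(fs.readFileSync(file, 'utf-8')) as SessionConfig;

// Clean up the temp dir (the real module keeps the file until logout).
fs.rmSync(dir, { recursive: true, force: true });
```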


@@ -0,0 +1,202 @@
/**
 * Opens $EDITOR with a temp file containing the current time range.
 * The user edits the file, saves, and we parse the result back.
 *
 * File format:
 *   # HyperDX Time Range
 *   # Edit the start and end times below. Supports:
 *   # - ISO 8601: 2026-03-18T05:00:00Z
 *   # - Relative: now-1h, now-30m, now-24h, now-7d
 *   # - Date only: 2026-03-18 (interpreted as start of day UTC)
 *   #
 *   # Lines starting with # are ignored.
 *
 *   start: 2026-03-18T04:00:00.000Z
 *   end: 2026-03-18T05:00:00.000Z
 */
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';
import { execSync } from 'child_process';

export interface TimeRange {
  start: Date;
  end: Date;
}

function formatDate(d: Date): string {
  return d.toISOString();
}
function buildFileContent(range: TimeRange): string {
  return [
    '# HyperDX Time Range',
    '# Edit the start and end times below. Supports:',
    '# - ISO 8601: 2026-03-18T05:00:00Z',
    '# - Relative: now-1h, now-30m, now-24h, now-7d',
    '# - Date only: 2026-03-18 (interpreted as start of day UTC)',
    '#',
    '# Lines starting with # are ignored.',
    '',
    `start: ${formatDate(range.start)}`,
    `end: ${formatDate(range.end)}`,
    '',
  ].join('\n');
}
/**
 * Parse a time string that can be:
 * - ISO 8601: "2026-03-18T05:00:00Z"
 * - Relative: "now-1h", "now-30m", "now-24h", "now-7d", "now"
 * - Date only: "2026-03-18"
 */
function parseTimeValue(value: string): Date | null {
  const trimmed = value.trim();
  // Relative time: now, now-1h, now-30m, etc.
  if (trimmed.startsWith('now')) {
    const now = Date.now();
    if (trimmed === 'now') return new Date(now);
    const match = trimmed.match(/^now-(\d+)(s|m|h|d|w)$/);
    if (match) {
      const n = parseInt(match[1], 10);
      const unit = match[2];
      const ms: Record<string, number> = {
        s: 1000,
        m: 60_000,
        h: 3_600_000,
        d: 86_400_000,
        w: 604_800_000,
      };
      return new Date(now - n * (ms[unit] ?? 3_600_000));
    }
    return null;
  }
  // Try parsing as ISO / date string
  const d = new Date(trimmed);
  if (!isNaN(d.getTime())) return d;
  return null;
}
function parseFileContent(content: string): TimeRange | null {
  let start: Date | null = null;
  let end: Date | null = null;
  for (const line of content.split('\n')) {
    const trimmed = line.trim();
    if (trimmed.startsWith('#') || trimmed === '') continue;
    const startMatch = trimmed.match(/^start:\s*(.+)$/i);
    if (startMatch) {
      start = parseTimeValue(startMatch[1]);
    }
    const endMatch = trimmed.match(/^end:\s*(.+)$/i);
    if (endMatch) {
      end = parseTimeValue(endMatch[1]);
    }
  }
  if (start && end && start < end) {
    return { start, end };
  }
  return null;
}
/**
 * Opens $EDITOR (or vi) with the current time range.
 * Returns the edited time range, or null if cancelled / invalid.
 *
 * This is a blocking call; Ink's render loop pauses while the editor
 * is open, which is the desired behavior (like git commit).
 */
// ---- Select clause editor ------------------------------------------

function buildSelectFileContent(currentSelect: string): string {
  return [
    '# HyperDX Select Clause',
    '# Edit the SELECT columns below.',
    '# One expression per line, or comma-separated on a single line.',
    '# Examples:',
    '# TimestampTime, Body, SeverityText',
    '# SpanName, ServiceName, Duration',
    '# toString(LogAttributes) AS attrs',
    '#',
    '# Lines starting with # are ignored.',
    '',
    currentSelect,
    '',
  ].join('\n');
}
function parseSelectFileContent(content: string): string | null {
  const lines: string[] = [];
  for (const line of content.split('\n')) {
    const trimmed = line.trim();
    if (trimmed.startsWith('#') || trimmed === '') continue;
    lines.push(trimmed);
  }
  const result = lines.join(', ').trim();
  return result || null;
}
/**
 * Opens $EDITOR with the current SELECT clause.
 * Returns the edited select string, or null if cancelled / empty.
 */
export function openEditorForSelect(currentSelect: string): string | null {
  const editor = process.env.EDITOR || process.env.VISUAL || 'vi';
  const tmpFile = path.join(os.tmpdir(), `hdx-select-${Date.now()}.sql`);
  try {
    fs.writeFileSync(tmpFile, buildSelectFileContent(currentSelect), 'utf-8');
    execSync(`${editor} ${tmpFile}`, {
      stdio: 'inherit',
    });
    const edited = fs.readFileSync(tmpFile, 'utf-8');
    return parseSelectFileContent(edited);
  } catch {
    return null;
  } finally {
    try {
      fs.unlinkSync(tmpFile);
    } catch {
      // ignore
    }
  }
}
// ---- Time range editor ---------------------------------------------

export function openEditorForTimeRange(
  currentRange: TimeRange,
): TimeRange | null {
  const editor = process.env.EDITOR || process.env.VISUAL || 'vi';
  const tmpFile = path.join(os.tmpdir(), `hdx-timerange-${Date.now()}.txt`);
  try {
    fs.writeFileSync(tmpFile, buildFileContent(currentRange), 'utf-8');
    // Open editor — this blocks until the user saves and exits
    execSync(`${editor} ${tmpFile}`, {
      stdio: 'inherit', // Inherit stdin/stdout/stderr so the editor works
    });
    const edited = fs.readFileSync(tmpFile, 'utf-8');
    return parseFileContent(edited);
  } catch {
    return null;
  } finally {
    try {
      fs.unlinkSync(tmpFile);
    } catch {
      // ignore
    }
  }
}
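The relative-time grammar accepted by `parseTimeValue` can be exercised in isolation; the helper below is a standalone copy of just that branch, operating on epoch milliseconds rather than `Date` objects for easy comparison:

```typescript
// Standalone copy of the "now-<n><unit>" branch of parseTimeValue above.
// Takes an explicit `now` (epoch ms) so results are deterministic.
function parseRelative(value: string, now: number): number | null {
  const trimmed = value.trim();
  if (!trimmed.startsWith('now')) return null;
  if (trimmed === 'now') return now;
  const match = trimmed.match(/^now-(\d+)(s|m|h|d|w)$/);
  if (!match) return null;
  const ms: Record<string, number> = {
    s: 1000,
    m: 60_000,
    h: 3_600_000,
    d: 86_400_000,
    w: 604_800_000,
  };
  return now - parseInt(match[1], 10) * ms[match[2]];
}
```

Note that the grammar is strict: only a single `now-<n><unit>` offset with one of `s/m/h/d/w` is accepted, so inputs like `now-2x` or `now+1h` fall through to `null` (and, in the real function, to the ISO date parser).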


@@ -0,0 +1,15 @@
/**
 * Must be the very first import in cli.tsx so it runs before
 * any common-utils code calls console.debug/warn/error.
 *
 * Exports the original methods for use in non-TUI commands
 * (e.g. auth, sources) that write directly to stderr.
 */
export const _origDebug = console.debug;
export const _origWarn = console.warn;
export const _origError = console.error;

console.debug = () => {};
console.warn = () => {};
console.error = () => {};


@@ -0,0 +1,22 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ES2022",
    "moduleResolution": "bundler",
    "jsx": "react-jsx",
    "outDir": "dist",
    "rootDir": "src",
    "declaration": true,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "strict": true,
    "skipLibCheck": true,
    "sourceMap": true,
    "resolveJsonModule": true,
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["src"],
  "exclude": ["node_modules", "dist"]
}


@@ -0,0 +1,28 @@
import { defineConfig } from 'tsup';
import module from 'node:module';

export default defineConfig({
  splitting: false,
  clean: true,
  format: ['esm'],
  platform: 'node',
  bundle: true,
  outDir: 'dist',
  entry: ['src/cli.tsx'],
  // Bundle all dependencies into a single file for distribution.
  noExternal: [/.*/],
  // Keep Node.js built-ins + optional deps external
  external: [
    ...module.builtinModules,
    ...module.builtinModules.map(m => `node:${m}`),
    'react-devtools-core',
  ],
  // Inject createRequire shim so CJS deps (signal-exit, etc.) can use
  // require() for Node.js built-ins inside the ESM bundle.
  banner: {
    js: [
      'import { createRequire as __cr } from "module";',
      'const require = __cr(import.meta.url);',
    ].join('\n'),
  },
});

yarn.lock: 735 lines changed (file diff suppressed because it is too large)