mirror of
https://github.com/hyperdxio/hyperdx
synced 2026-04-21 13:37:15 +00:00
feat: integrate Model Context Protocol (MCP) server for dashboards & investigations (#2030)
## Summary

Adds an MCP (Model Context Protocol) server to the HyperDX API, enabling AI assistants (Claude, Cursor, OpenCode, etc.) to query observability data, manage dashboards, and explore data sources directly via standardized tool calls.

Key changes:

- **MCP server** (`packages/api/src/mcp/`) — Streamable HTTP transport at `/api/mcp`, authenticated via Personal API Access Key
- **Tools** — `hyperdx_list_sources`, `hyperdx_query`, `hyperdx_get_dashboard`, `hyperdx_save_dashboard`, `hyperdx_delete_dashboard`, `hyperdx_query_tile`
- **Dashboard prompts** — Detailed prompt templates that guide LLMs in generating valid, high-quality dashboards
- **Shared logic** — Refactored dashboard validation/transformation out of the external API router into reusable utils (`packages/api/src/routers/external-api/v2/utils/dashboards.ts`)
- **Documentation** — `MCP.md` with setup instructions for Claude Code, OpenCode, Cursor, MCP Inspector, and other clients
- **Tests** — Unit tests for dashboard tools, query tools, tracing, and response trimming

### Screenshots

https://github.com/user-attachments/assets/8c5aa582-c79e-47e0-8f75-e03feabdf8a6

### How to test locally

1. Start the dev stack: `yarn dev`
2. Connect an MCP client (e.g. MCP Inspector):

   ```bash
   cd packages/api && yarn dev:mcp
   ```

   Then configure the inspector:

   - **Transport Type:** Streamable HTTP
   - **URL:** `http://localhost:8080/api/mcp`
   - **Header:** `Authorization: Bearer <your-personal-access-key>`
   - Click **Connect**

3. Alternatively, connect via Claude Code or OpenCode:

   ```bash
   claude mcp add --transport http hyperdx http://localhost:8080/api/mcp \
     --header "Authorization: Bearer <your-personal-access-key>"
   ```

4. Try listing sources, querying data, or creating/updating a dashboard through the connected AI assistant.
5. Run unit tests:

   ```bash
   cd packages/api && yarn ci:unit
   ```

### References

- Linear Issue: HDX-3710
This commit is contained in:
parent fe3ab41c43
commit 9781ae6387
39 changed files with 4673 additions and 245 deletions
5  .changeset/cool-pants-train.md  Normal file
@ -0,0 +1,5 @@
---
"@hyperdx/api": minor
---

Add an MCP (Model Context Protocol) server to the HyperDX API, enabling AI assistants (Claude, Cursor, OpenCode, etc.) to query observability data, manage dashboards, and explore data sources directly via standardized tool calls.
@ -125,7 +125,11 @@ yarn dev:unit

 ## AI-Assisted Development

-The repo ships with configuration for AI coding assistants that enables interactive browser-based E2E test generation and debugging via the [Playwright MCP server](https://github.com/microsoft/playwright-mcp).
+HyperDX includes an [MCP server](https://modelcontextprotocol.io/) that lets AI assistants query observability data, manage dashboards, and explore data sources. See [MCP.md](/MCP.md) for setup instructions.
+
+The repo also ships with configuration for AI coding assistants that enables interactive browser-based E2E test generation and debugging via the [Playwright MCP server](https://github.com/microsoft/playwright-mcp).

 ### Claude Code
89  MCP.md  Normal file
@ -0,0 +1,89 @@
# HyperDX MCP Server

HyperDX exposes a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) server that lets AI assistants query your observability data, manage dashboards, and explore data sources directly.

## Prerequisites

- A running HyperDX instance (see [CONTRIBUTING.md](/CONTRIBUTING.md) for local development setup, or [DEPLOY.md](/DEPLOY.md) for self-hosted deployment)
- A **Personal API Access Key** — find yours in the HyperDX UI under **Team Settings > API Keys > Personal API Access Key**

## Endpoint

The MCP server is available at the `/api/mcp` path on your HyperDX instance. For local development this is:

```
http://localhost:8080/api/mcp
```

Replace `localhost:8080` with your instance's host and port if you've customized the defaults.

## Connecting an MCP Client

The MCP server uses the **Streamable HTTP** transport with Bearer token authentication. In the examples below, replace `<your-hyperdx-url>` with your instance URL (e.g. `http://localhost:8080`).

### Claude Code

```bash
claude mcp add --transport http hyperdx <your-hyperdx-url>/api/mcp \
  --header "Authorization: Bearer <your-personal-access-key>"
```

### OpenCode

```bash
opencode mcp add --transport http hyperdx <your-hyperdx-url>/api/mcp \
  --header "Authorization: Bearer <your-personal-access-key>"
```

### Cursor

Add the following to `.cursor/mcp.json` in your project (or your global Cursor settings):

```json
{
  "mcpServers": {
    "hyperdx": {
      "url": "<your-hyperdx-url>/api/mcp",
      "headers": {
        "Authorization": "Bearer <your-personal-access-key>"
      }
    }
  }
}
```

### MCP Inspector

The MCP Inspector is useful for interactively testing and debugging the server.

```bash
cd packages/api && yarn dev:mcp
```

Then configure the inspector:

1. **Transport Type:** Streamable HTTP
2. **URL:** `<your-hyperdx-url>/api/mcp`
3. **Authentication:** Header `Authorization` with value `Bearer <your-personal-access-key>`
4. Click **Connect**

### Other Clients

Any MCP client that supports Streamable HTTP transport can connect. Configure it with:

- **URL:** `<your-hyperdx-url>/api/mcp`
- **Header:** `Authorization: Bearer <your-personal-access-key>`

## Available Tools

| Tool                       | Description                                                                                  |
| -------------------------- | -------------------------------------------------------------------------------------------- |
| `hyperdx_list_sources`     | List all data sources and database connections, including column schemas and attribute keys  |
| `hyperdx_query`            | Query observability data (logs, metrics, traces) using builder mode, search mode, or raw SQL |
| `hyperdx_get_dashboard`    | List all dashboards or get full detail for a specific dashboard                              |
| `hyperdx_save_dashboard`   | Create or update a dashboard with tiles (charts, tables, numbers, search, markdown)          |
| `hyperdx_delete_dashboard` | Permanently delete a dashboard and its attached alerts                                       |
| `hyperdx_query_tile`       | Execute the query for a specific dashboard tile to validate results                          |
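Under the hood, every client above speaks the same wire format: a JSON-RPC 2.0 `tools/call` request POSTed to `/api/mcp` with a Bearer header. A minimal sketch of that payload as plain data — the header set follows the MCP Streamable HTTP convention, and real clients (Claude Code, Cursor, the SDK) generate all of this for you:

```typescript
// Shape of a JSON-RPC 2.0 tool-call request as sent over Streamable HTTP.
type ToolCallRequest = {
  jsonrpc: '2.0';
  id: number;
  method: 'tools/call';
  params: { name: string; arguments: Record<string, unknown> };
};

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown> = {},
): ToolCallRequest {
  return {
    jsonrpc: '2.0',
    id,
    method: 'tools/call',
    params: { name, arguments: args },
  };
}

// Headers every request carries: Bearer auth plus the content types the
// Streamable HTTP transport expects (JSON responses or SSE streams).
function buildHeaders(accessKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${accessKey}`,
    'Content-Type': 'application/json',
    Accept: 'application/json, text/event-stream',
  };
}

const req = buildToolCall(1, 'hyperdx_list_sources');
console.log(JSON.stringify(req));
```

A client POSTs this body to `<your-hyperdx-url>/api/mcp` with `buildHeaders(...)` and reads the tool result from the response.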
@ -13,6 +13,7 @@
     "@hyperdx/common-utils": "^0.17.1",
     "@hyperdx/node-opentelemetry": "^0.9.0",
     "@hyperdx/passport-local-mongoose": "^9.0.1",
+    "@modelcontextprotocol/sdk": "^1.27.1",
     "@opentelemetry/api": "^1.8.0",
     "@opentelemetry/host-metrics": "^0.35.5",
     "@opentelemetry/sdk-metrics": "^1.30.1",

@ -82,6 +83,7 @@
   "scripts": {
     "start": "node ./build/index.js",
     "dev": "DOTENV_CONFIG_PATH=.env.development nodemon --exec 'ts-node' --transpile-only -r tsconfig-paths/register -r dotenv-expand/config -r '@hyperdx/node-opentelemetry/build/src/tracing' ./src/index.ts",
+    "dev:mcp": "npx @modelcontextprotocol/inspector",
     "dev-task": "DOTENV_CONFIG_PATH=.env.development nodemon --exec 'ts-node' --transpile-only -r tsconfig-paths/register -r dotenv-expand/config -r '@hyperdx/node-opentelemetry/build/src/tracing' ./src/tasks/index.ts",
     "build": "rimraf ./build && tsc && tsc-alias && cp -r ./src/opamp/proto ./build/opamp/",
     "lint": "npx eslint --quiet . --ext .ts",
@ -5,6 +5,7 @@ import session from 'express-session';
 import onHeaders from 'on-headers';

 import * as config from './config';
+import mcpRouter from './mcp/app';
 import { isUserAuthenticated } from './middleware/auth';
 import defaultCors from './middleware/cors';
 import { appErrorHandler } from './middleware/error';

@ -90,6 +91,9 @@ if (config.USAGE_STATS_ENABLED) {
 // PUBLIC ROUTES
 app.use('/', routers.rootRouter);

+// SELF-AUTHENTICATED ROUTES (validated via access key, not session middleware)
+app.use('/mcp', mcpRouter);
+
 // PRIVATE ROUTES
 app.use('/ai', isUserAuthenticated, routers.aiRouter);
 app.use('/alerts', isUserAuthenticated, routers.alertsRouter);
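The `/mcp` mount in the hunk above sits outside the session middleware because requests are validated by Personal API Access Key instead. A hypothetical sketch of the bearer-token extraction such a route needs — the repo's actual middleware lives under `packages/api/src/mcp/` and will differ:

```typescript
// Hypothetical helper: pull the access key out of an Authorization header.
// Returns null for a missing header or any non-Bearer scheme, so callers
// can reject the request with a 401 before touching the MCP server.
function extractBearerToken(authHeader: string | undefined): string | null {
  if (!authHeader) return null;
  const match = /^Bearer\s+(\S+)$/i.exec(authHeader.trim());
  return match ? match[1] : null;
}
```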
545  packages/api/src/mcp/__tests__/dashboards.test.ts  Normal file
@ -0,0 +1,545 @@
import { SourceKind } from '@hyperdx/common-utils/dist/types';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';

import * as config from '@/config';
import {
  DEFAULT_DATABASE,
  DEFAULT_TRACES_TABLE,
  getLoggedInAgent,
  getServer,
} from '@/fixtures';
import Connection from '@/models/connection';
import Dashboard from '@/models/dashboard';
import { Source } from '@/models/source';

import { McpContext } from '../tools/types';
import { callTool, createTestClient, getFirstText } from './mcpTestUtils';

describe('MCP Dashboard Tools', () => {
  const server = getServer();
  let team: any;
  let user: any;
  let traceSource: any;
  let connection: any;
  let client: Client;

  beforeAll(async () => {
    await server.start();
  });

  beforeEach(async () => {
    const result = await getLoggedInAgent(server);
    team = result.team;
    user = result.user;

    connection = await Connection.create({
      team: team._id,
      name: 'Default',
      host: config.CLICKHOUSE_HOST,
      username: config.CLICKHOUSE_USER,
      password: config.CLICKHOUSE_PASSWORD,
    });

    traceSource = await Source.create({
      kind: SourceKind.Trace,
      team: team._id,
      from: {
        databaseName: DEFAULT_DATABASE,
        tableName: DEFAULT_TRACES_TABLE,
      },
      timestampValueExpression: 'Timestamp',
      connection: connection._id,
      name: 'Traces',
    });

    const context: McpContext = {
      teamId: team._id.toString(),
      userId: user._id.toString(),
    };
    client = await createTestClient(context);
  });

  afterEach(async () => {
    await client.close();
    await server.clearDBs();
  });

  afterAll(async () => {
    await server.stop();
  });

  describe('hyperdx_list_sources', () => {
    it('should list available sources and connections', async () => {
      const result = await callTool(client, 'hyperdx_list_sources');

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);

      const output = JSON.parse(getFirstText(result));
      expect(output.sources).toHaveLength(1);
      expect(output.sources[0]).toMatchObject({
        id: traceSource._id.toString(),
        name: 'Traces',
        kind: SourceKind.Trace,
      });

      expect(output.connections).toHaveLength(1);
      expect(output.connections[0]).toMatchObject({
        id: connection._id.toString(),
        name: 'Default',
      });

      expect(output.usage).toBeDefined();
    });

    it('should include column schema for sources', async () => {
      const result = await callTool(client, 'hyperdx_list_sources');
      const output = JSON.parse(getFirstText(result));
      const source = output.sources[0];

      expect(source.columns).toBeDefined();
      expect(Array.isArray(source.columns)).toBe(true);
      expect(source.columns.length).toBeGreaterThan(0);
      // Each column should have name, type, and jsType
      expect(source.columns[0]).toHaveProperty('name');
      expect(source.columns[0]).toHaveProperty('type');
      expect(source.columns[0]).toHaveProperty('jsType');
    });

    it('should return empty sources for a team with no sources', async () => {
      // Clear everything and re-register with new team
      await client.close();
      await server.clearDBs();
      const result2 = await getLoggedInAgent(server);
      const context2: McpContext = {
        teamId: result2.team._id.toString(),
      };
      const client2 = await createTestClient(context2);

      const result = await callTool(client2, 'hyperdx_list_sources');
      const output = JSON.parse(getFirstText(result));

      expect(output.sources).toHaveLength(0);
      expect(output.connections).toHaveLength(0);

      await client2.close();
    });
  });

  describe('hyperdx_get_dashboard', () => {
    it('should list all dashboards when no id provided', async () => {
      await new Dashboard({
        name: 'Dashboard 1',
        tiles: [],
        team: team._id,
        tags: ['tag1'],
      }).save();
      await new Dashboard({
        name: 'Dashboard 2',
        tiles: [],
        team: team._id,
        tags: ['tag2'],
      }).save();

      const result = await callTool(client, 'hyperdx_get_dashboard', {});

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output).toHaveLength(2);
      expect(output[0]).toHaveProperty('id');
      expect(output[0]).toHaveProperty('name');
      expect(output[0]).toHaveProperty('tags');
    });

    it('should get dashboard detail when id is provided', async () => {
      const dashboard = await new Dashboard({
        name: 'My Dashboard',
        tiles: [],
        team: team._id,
        tags: ['test'],
      }).save();

      const result = await callTool(client, 'hyperdx_get_dashboard', {
        id: dashboard._id.toString(),
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.id).toBe(dashboard._id.toString());
      expect(output.name).toBe('My Dashboard');
      expect(output.tags).toEqual(['test']);
      expect(output.tiles).toEqual([]);
    });

    it('should return error for non-existent dashboard id', async () => {
      const fakeId = '000000000000000000000000';
      const result = await callTool(client, 'hyperdx_get_dashboard', {
        id: fakeId,
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('not found');
    });
  });

  describe('hyperdx_save_dashboard', () => {
    it('should create a new dashboard with tiles', async () => {
      const sourceId = traceSource._id.toString();
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'New MCP Dashboard',
        tiles: [
          {
            name: 'Line Chart',
            x: 0,
            y: 0,
            w: 12,
            h: 4,
            config: {
              displayType: 'line',
              sourceId,
              select: [{ aggFn: 'count', where: '' }],
            },
          },
        ],
        tags: ['mcp-test'],
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.id).toBeDefined();
      expect(output.name).toBe('New MCP Dashboard');
      expect(output.tiles).toHaveLength(1);
      expect(output.tiles[0].config.displayType).toBe('line');
      expect(output.tags).toEqual(['mcp-test']);

      // Verify in database
      const dashboard = await Dashboard.findById(output.id);
      expect(dashboard).not.toBeNull();
      expect(dashboard?.name).toBe('New MCP Dashboard');
    });

    it('should create a dashboard with a markdown tile', async () => {
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Markdown Dashboard',
        tiles: [
          {
            name: 'Notes',
            config: {
              displayType: 'markdown',
              markdown: '# Hello World',
            },
          },
        ],
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.tiles).toHaveLength(1);
      expect(output.tiles[0].config.displayType).toBe('markdown');
    });

    it('should update an existing dashboard', async () => {
      const sourceId = traceSource._id.toString();

      // Create first
      const createResult = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Original Name',
        tiles: [
          {
            name: 'Tile 1',
            config: {
              displayType: 'number',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });
      const created = JSON.parse(getFirstText(createResult));

      // Update
      const updateResult = await callTool(client, 'hyperdx_save_dashboard', {
        id: created.id,
        name: 'Updated Name',
        tiles: [
          {
            name: 'Updated Tile',
            config: {
              displayType: 'table',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
        tags: ['updated'],
      });

      expect(updateResult.isError).toBeFalsy();
      const updated = JSON.parse(getFirstText(updateResult));
      expect(updated.id).toBe(created.id);
      expect(updated.name).toBe('Updated Name');
      expect(updated.tiles).toHaveLength(1);
      expect(updated.tiles[0].name).toBe('Updated Tile');
      expect(updated.tiles[0].config.displayType).toBe('table');
    });

    it('should return error for missing source ID', async () => {
      const fakeSourceId = '000000000000000000000000';
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Bad Dashboard',
        tiles: [
          {
            name: 'Bad Tile',
            config: {
              displayType: 'line',
              sourceId: fakeSourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('source');
    });

    it('should return error when updating non-existent dashboard', async () => {
      const sourceId = traceSource._id.toString();
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        id: '000000000000000000000000',
        name: 'Ghost Dashboard',
        tiles: [
          {
            name: 'Tile',
            config: {
              displayType: 'line',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('not found');
    });

    it('should create a dashboard with multiple tile types', async () => {
      const sourceId = traceSource._id.toString();
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Multi-tile Dashboard',
        tiles: [
          {
            name: 'Line',
            x: 0,
            y: 0,
            w: 12,
            h: 4,
            config: {
              displayType: 'line',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
          {
            name: 'Table',
            x: 0,
            y: 4,
            w: 12,
            h: 4,
            config: {
              displayType: 'table',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
          {
            name: 'Number',
            x: 0,
            y: 8,
            w: 6,
            h: 3,
            config: {
              displayType: 'number',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
          {
            name: 'Pie',
            x: 6,
            y: 8,
            w: 6,
            h: 3,
            config: {
              displayType: 'pie',
              sourceId,
              select: [{ aggFn: 'count' }],
              groupBy: 'SpanName',
            },
          },
          {
            name: 'Notes',
            x: 0,
            y: 11,
            w: 12,
            h: 2,
            config: { displayType: 'markdown', markdown: '# Dashboard Notes' },
          },
        ],
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.tiles).toHaveLength(5);
    });

    it('should create a dashboard with a raw SQL tile', async () => {
      const connectionId = connection._id.toString();
      const result = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'SQL Dashboard',
        tiles: [
          {
            name: 'Raw SQL',
            config: {
              configType: 'sql',
              displayType: 'table',
              connectionId,
              sqlTemplate: 'SELECT 1 AS value LIMIT 1',
            },
          },
        ],
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.tiles).toHaveLength(1);
    });
  });

  describe('hyperdx_delete_dashboard', () => {
    it('should delete an existing dashboard', async () => {
      const dashboard = await new Dashboard({
        name: 'To Delete',
        tiles: [],
        team: team._id,
      }).save();

      const result = await callTool(client, 'hyperdx_delete_dashboard', {
        id: dashboard._id.toString(),
      });

      expect(result.isError).toBeFalsy();
      const output = JSON.parse(getFirstText(result));
      expect(output.deleted).toBe(true);
      expect(output.id).toBe(dashboard._id.toString());

      // Verify deleted from database
      const found = await Dashboard.findById(dashboard._id);
      expect(found).toBeNull();
    });

    it('should return error for non-existent dashboard', async () => {
      const result = await callTool(client, 'hyperdx_delete_dashboard', {
        id: '000000000000000000000000',
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('not found');
    });
  });

  describe('hyperdx_query_tile', () => {
    it('should return error for non-existent dashboard', async () => {
      const result = await callTool(client, 'hyperdx_query_tile', {
        dashboardId: '000000000000000000000000',
        tileId: 'some-tile-id',
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('not found');
    });

    it('should return error for non-existent tile', async () => {
      const sourceId = traceSource._id.toString();
      const createResult = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Tile Query Test',
        tiles: [
          {
            name: 'My Tile',
            config: {
              displayType: 'number',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });
      const dashboard = JSON.parse(getFirstText(createResult));

      const result = await callTool(client, 'hyperdx_query_tile', {
        dashboardId: dashboard.id,
        tileId: 'non-existent-tile-id',
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('Tile not found');
    });

    it('should return error for invalid time range', async () => {
      const sourceId = traceSource._id.toString();
      const createResult = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Time Range Test',
        tiles: [
          {
            name: 'Tile',
            config: {
              displayType: 'number',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });
      const dashboard = JSON.parse(getFirstText(createResult));

      const result = await callTool(client, 'hyperdx_query_tile', {
        dashboardId: dashboard.id,
        tileId: dashboard.tiles[0].id,
        startTime: 'not-a-date',
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('Invalid');
    });

    it('should execute query for a valid tile', async () => {
      const sourceId = traceSource._id.toString();
      const createResult = await callTool(client, 'hyperdx_save_dashboard', {
        name: 'Query Tile Test',
        tiles: [
          {
            name: 'Count Tile',
            config: {
              displayType: 'number',
              sourceId,
              select: [{ aggFn: 'count' }],
            },
          },
        ],
      });
      const dashboard = JSON.parse(getFirstText(createResult));

      const result = await callTool(client, 'hyperdx_query_tile', {
        dashboardId: dashboard.id,
        tileId: dashboard.tiles[0].id,
        startTime: new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      // Should succeed (may have empty results since no data inserted)
      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });
  });
});
53  packages/api/src/mcp/__tests__/mcpTestUtils.ts  Normal file
@ -0,0 +1,53 @@
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { InMemoryTransport } from '@modelcontextprotocol/sdk/inMemory.js';
import {
  type CallToolResult,
  CallToolResultSchema,
} from '@modelcontextprotocol/sdk/types.js';

import { createServer } from '../mcpServer';
import { McpContext } from '../tools/types';

/**
 * Connect an MCP server to an in-process Client via InMemoryTransport and
 * return the client. This is the officially supported way to test MCP servers
 * without accessing private SDK internals.
 */
export async function createTestClient(context: McpContext): Promise<Client> {
  const mcpServer = createServer(context);
  const [clientTransport, serverTransport] =
    InMemoryTransport.createLinkedPair();
  await mcpServer.connect(serverTransport);
  const client = new Client({ name: 'test-client', version: '1.0.0' });
  await client.connect(clientTransport);
  return client;
}

/**
 * Call a named MCP tool and return a properly-typed result.
 *
 * The SDK's `Client.callTool()` return type carries an index signature
 * `[x: string]: unknown` that widens all property accesses to `unknown`.
 * Re-parsing through `CallToolResultSchema` gives the concrete named type
 * needed for clean test assertions.
 */
export async function callTool(
  c: Client,
  name: string,
  args: Record<string, unknown> = {},
): Promise<CallToolResult> {
  const raw = await c.callTool({ name, arguments: args });
  return CallToolResultSchema.parse(raw);
}

/**
 * Extract the text from the first content item of a tool result.
 * Throws if the item is not a text block.
 */
export function getFirstText(result: CallToolResult): string {
  const item = result.content[0];
  if (!item || item.type !== 'text') {
    throw new Error(`Expected text content, got: ${JSON.stringify(item)}`);
  }
  return item.text;
}
74  packages/api/src/mcp/__tests__/query.test.ts  Normal file
|
|
@ -0,0 +1,74 @@
import { parseTimeRange } from '../tools/query/helpers';

describe('parseTimeRange', () => {
  it('should return default range (last 15 minutes) when no arguments provided', () => {
    const before = Date.now();
    const result = parseTimeRange();
    const after = Date.now();

    expect(result).not.toHaveProperty('error');
    if ('error' in result) return;

    // endDate should be approximately now
    expect(result.endDate.getTime()).toBeGreaterThanOrEqual(before);
    expect(result.endDate.getTime()).toBeLessThanOrEqual(after);
    // startDate should be ~15 minutes before endDate
    const diffMs = result.endDate.getTime() - result.startDate.getTime();
    expect(diffMs).toBe(15 * 60 * 1000);
  });

  it('should use provided startTime and endTime', () => {
    const result = parseTimeRange(
      '2025-01-01T00:00:00Z',
      '2025-01-02T00:00:00Z',
    );
    expect(result).not.toHaveProperty('error');
    if ('error' in result) return;

    expect(result.startDate.toISOString()).toBe('2025-01-01T00:00:00.000Z');
    expect(result.endDate.toISOString()).toBe('2025-01-02T00:00:00.000Z');
  });

  it('should default startTime to 15 minutes before endTime', () => {
    const result = parseTimeRange(undefined, '2025-06-15T10:00:00Z');
    expect(result).not.toHaveProperty('error');
    if ('error' in result) return;

    expect(result.endDate.toISOString()).toBe('2025-06-15T10:00:00.000Z');
    expect(result.startDate.toISOString()).toBe('2025-06-15T09:45:00.000Z');
  });

  it('should default endTime to now', () => {
    const before = Date.now();
    const result = parseTimeRange('2025-06-15T11:00:00Z');
    const after = Date.now();

    expect(result).not.toHaveProperty('error');
    if ('error' in result) return;

    expect(result.startDate.toISOString()).toBe('2025-06-15T11:00:00.000Z');
    expect(result.endDate.getTime()).toBeGreaterThanOrEqual(before);
    expect(result.endDate.getTime()).toBeLessThanOrEqual(after);
  });

  it('should return error for invalid startTime', () => {
    const result = parseTimeRange('not-a-date', '2025-01-01T00:00:00Z');
    expect(result).toHaveProperty('error');
    if (!('error' in result)) return;

    expect(result.error).toContain('Invalid');
  });

  it('should return error for invalid endTime', () => {
    const result = parseTimeRange('2025-01-01T00:00:00Z', 'garbage');
    expect(result).toHaveProperty('error');
    if (!('error' in result)) return;

    expect(result.error).toContain('Invalid');
  });

  it('should return error when both times are invalid', () => {
    const result = parseTimeRange('bad', 'also-bad');
    expect(result).toHaveProperty('error');
  });
});
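The `parseTimeRange` helper itself is not part of this diff chunk, but the tests above fully pin down its contract: both arguments optional, `endTime` defaulting to now, `startTime` defaulting to 15 minutes before `endTime`, and an `{ error }` result for unparseable inputs. A minimal self-contained sketch of that behavior could look like the following; the result type name and the exact error wording are assumptions, not the actual implementation in `tools/query/helpers`.

```typescript
type TimeRangeResult =
  | { startDate: Date; endDate: Date }
  | { error: string };

const DEFAULT_WINDOW_MS = 15 * 60 * 1000; // last 15 minutes

function parseTimeRange(startTime?: string, endTime?: string): TimeRangeResult {
  // endTime defaults to "now"
  const endDate = endTime !== undefined ? new Date(endTime) : new Date();
  if (Number.isNaN(endDate.getTime())) {
    return { error: `Invalid endTime: ${endTime}` };
  }
  // startTime defaults to 15 minutes before endDate
  const startDate =
    startTime !== undefined
      ? new Date(startTime)
      : new Date(endDate.getTime() - DEFAULT_WINDOW_MS);
  if (Number.isNaN(startDate.getTime())) {
    return { error: `Invalid startTime: ${startTime}` };
  }
  return { startDate, endDate };
}
```

Returning a discriminated union instead of throwing matches how the tests branch on `'error' in result`, which keeps tool handlers free to turn a bad time range into an MCP error response rather than an exception.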
packages/api/src/mcp/__tests__/queryTool.test.ts (233 lines, new file)
@@ -0,0 +1,233 @@
import { SourceKind } from '@hyperdx/common-utils/dist/types';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';

import * as config from '@/config';
import {
  DEFAULT_DATABASE,
  DEFAULT_TRACES_TABLE,
  getLoggedInAgent,
  getServer,
} from '@/fixtures';
import Connection from '@/models/connection';
import { Source } from '@/models/source';

import { McpContext } from '../tools/types';
import { callTool, createTestClient, getFirstText } from './mcpTestUtils';

describe('MCP Query Tool', () => {
  const server = getServer();
  let team: any;
  let user: any;
  let traceSource: any;
  let connection: any;
  let client: Client;

  beforeAll(async () => {
    await server.start();
  });

  beforeEach(async () => {
    const result = await getLoggedInAgent(server);
    team = result.team;
    user = result.user;

    connection = await Connection.create({
      team: team._id,
      name: 'Default',
      host: config.CLICKHOUSE_HOST,
      username: config.CLICKHOUSE_USER,
      password: config.CLICKHOUSE_PASSWORD,
    });

    traceSource = await Source.create({
      kind: SourceKind.Trace,
      team: team._id,
      from: {
        databaseName: DEFAULT_DATABASE,
        tableName: DEFAULT_TRACES_TABLE,
      },
      timestampValueExpression: 'Timestamp',
      connection: connection._id,
      name: 'Traces',
    });

    const context: McpContext = {
      teamId: team._id.toString(),
      userId: user._id.toString(),
    };
    client = await createTestClient(context);
  });

  afterEach(async () => {
    await client.close();
    await server.clearDBs();
  });

  afterAll(async () => {
    await server.stop();
  });

  describe('builder queries', () => {
    it('should execute a number query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'number',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
      const output = JSON.parse(getFirstText(result));
      expect(output).toHaveProperty('result');
    });

    it('should execute a line chart query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'line',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should execute a table query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'table',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        groupBy: 'SpanName',
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should execute a pie query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'pie',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        groupBy: 'SpanName',
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should execute a stacked_bar query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'stacked_bar',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should use default time range when not provided', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'number',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should return result for query with no matching data', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'number',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count', where: 'SpanName:z_impossible_value_xyz' }],
        startTime: new Date(Date.now() - 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });
  });

  describe('search queries', () => {
    it('should execute a search query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'search',
        sourceId: traceSource._id.toString(),
        where: '',
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should respect maxResults parameter', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'search',
        sourceId: traceSource._id.toString(),
        maxResults: 10,
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
    });
  });

  describe('SQL queries', () => {
    it('should execute a raw SQL query', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'sql',
        connectionId: connection._id.toString(),
        sql: 'SELECT 1 AS value',
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });

    it('should execute SQL with time macros', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'sql',
        connectionId: connection._id.toString(),
        sql: `SELECT count() AS cnt FROM ${DEFAULT_DATABASE}.${DEFAULT_TRACES_TABLE} WHERE $__timeFilter(Timestamp) LIMIT 10`,
        startTime: new Date(Date.now() - 60 * 60 * 1000).toISOString(),
        endTime: new Date().toISOString(),
      });

      expect(result.isError).toBeFalsy();
      expect(result.content).toHaveLength(1);
    });
  });

  describe('error handling', () => {
    it('should return error for invalid time range', async () => {
      const result = await callTool(client, 'hyperdx_query', {
        displayType: 'number',
        sourceId: traceSource._id.toString(),
        select: [{ aggFn: 'count' }],
        startTime: 'invalid-date',
      });

      expect(result.isError).toBe(true);
      expect(getFirstText(result)).toContain('Invalid');
    });
  });
});
packages/api/src/mcp/__tests__/tracing.test.ts (160 lines, new file)
@@ -0,0 +1,160 @@
// Mock OpenTelemetry and all modules that transitively import it
// These must be declared before any imports

const mockSpan = {
  setAttribute: jest.fn(),
  setStatus: jest.fn(),
  recordException: jest.fn(),
  end: jest.fn(),
};

const mockTracer = {
  startActiveSpan: (
    _name: string,
    fn: (span: typeof mockSpan) => Promise<unknown>,
  ) => fn(mockSpan),
};

jest.mock('@opentelemetry/api', () => ({
  __esModule: true,
  default: {
    trace: {
      getTracer: () => mockTracer,
    },
  },
  SpanStatusCode: {
    OK: 1,
    ERROR: 2,
  },
}));

jest.mock('@/config', () => ({
  CODE_VERSION: 'test-version',
}));

jest.mock('@/utils/logger', () => ({
  __esModule: true,
  default: {
    info: jest.fn(),
    warn: jest.fn(),
    error: jest.fn(),
    debug: jest.fn(),
  },
}));

import { withToolTracing } from '../utils/tracing';

describe('withToolTracing', () => {
  const context = { teamId: 'team-123', userId: 'user-456' };

  beforeEach(() => {
    jest.clearAllMocks();
  });

  it('should call the handler and return its result', async () => {
    const handler = jest.fn().mockResolvedValue({
      content: [{ type: 'text', text: 'hello' }],
    });

    const traced = withToolTracing('test_tool', context, handler);
    const result = await traced({ some: 'args' });

    expect(handler).toHaveBeenCalledWith({ some: 'args' });
    expect(result).toEqual({
      content: [{ type: 'text', text: 'hello' }],
    });
  });

  it('should set span attributes for tool name, team, and user', async () => {
    const handler = jest.fn().mockResolvedValue({
      content: [{ type: 'text', text: 'ok' }],
    });

    const traced = withToolTracing('my_tool', context, handler);
    await traced({});

    expect(mockSpan.setAttribute).toHaveBeenCalledWith(
      'mcp.tool.name',
      'my_tool',
    );
    expect(mockSpan.setAttribute).toHaveBeenCalledWith(
      'mcp.team.id',
      'team-123',
    );
    expect(mockSpan.setAttribute).toHaveBeenCalledWith(
      'mcp.user.id',
      'user-456',
    );
  });

  it('should not set user id attribute when userId is undefined', async () => {
    const noUserContext = { teamId: 'team-123' };
    const handler = jest.fn().mockResolvedValue({
      content: [{ type: 'text', text: 'ok' }],
    });

    const traced = withToolTracing('my_tool', noUserContext, handler);
    await traced({});

    expect(mockSpan.setAttribute).not.toHaveBeenCalledWith(
      'mcp.user.id',
      expect.anything(),
    );
  });

  it('should set OK status for successful results', async () => {
    const handler = jest.fn().mockResolvedValue({
      content: [{ type: 'text', text: 'ok' }],
    });

    const traced = withToolTracing('my_tool', context, handler);
    await traced({});

    expect(mockSpan.setStatus).toHaveBeenCalledWith({ code: 1 }); // SpanStatusCode.OK
    expect(mockSpan.end).toHaveBeenCalled();
  });

  it('should set ERROR status for isError results', async () => {
    const handler = jest.fn().mockResolvedValue({
      isError: true,
      content: [{ type: 'text', text: 'something went wrong' }],
    });

    const traced = withToolTracing('my_tool', context, handler);
    await traced({});

    expect(mockSpan.setStatus).toHaveBeenCalledWith({ code: 2 }); // SpanStatusCode.ERROR
    expect(mockSpan.setAttribute).toHaveBeenCalledWith('mcp.tool.error', true);
    expect(mockSpan.end).toHaveBeenCalled();
  });

  it('should set ERROR status and re-throw on handler exception', async () => {
    const error = new Error('boom');
    const handler = jest.fn().mockRejectedValue(error);

    const traced = withToolTracing('my_tool', context, handler);

    await expect(traced({})).rejects.toThrow('boom');

    expect(mockSpan.setStatus).toHaveBeenCalledWith({
      code: 2,
      message: 'boom',
    });
    expect(mockSpan.recordException).toHaveBeenCalledWith(error);
    expect(mockSpan.end).toHaveBeenCalled();
  });

  it('should record duration on the span', async () => {
    const handler = jest.fn().mockResolvedValue({
      content: [{ type: 'text', text: 'ok' }],
    });

    const traced = withToolTracing('my_tool', context, handler);
    await traced({});

    expect(mockSpan.setAttribute).toHaveBeenCalledWith(
      'mcp.tool.duration_ms',
      expect.any(Number),
    );
  });
});
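The traced wrapper itself lives in `../utils/tracing` and is not shown in this chunk, but the mocked expectations above spell out its contract: set `mcp.tool.name` / `mcp.team.id` / optional `mcp.user.id` attributes, record duration, map `isError` results and thrown exceptions to an ERROR status, and always end the span. A self-contained sketch of that shape follows; the `SpanLike` interface and the injected `startSpan` parameter stand in for the OpenTelemetry tracer, so names and span naming are assumptions rather than the actual implementation.

```typescript
interface SpanLike {
  setAttribute(key: string, value: unknown): void;
  setStatus(status: { code: number; message?: string }): void;
  recordException(err: unknown): void;
  end(): void;
}

const SpanStatusCode = { OK: 1, ERROR: 2 } as const;

interface ToolContext {
  teamId: string;
  userId?: string;
}

interface ToolResult {
  isError?: boolean;
  content: Array<{ type: string; text: string }>;
}

function withToolTracingSketch(
  toolName: string,
  context: ToolContext,
  handler: (args: Record<string, unknown>) => Promise<ToolResult>,
  startSpan: (name: string) => SpanLike, // stands in for tracer.startActiveSpan
) {
  return async (args: Record<string, unknown>): Promise<ToolResult> => {
    const span = startSpan(`mcp.tool.${toolName}`);
    span.setAttribute('mcp.tool.name', toolName);
    span.setAttribute('mcp.team.id', context.teamId);
    if (context.userId !== undefined) {
      span.setAttribute('mcp.user.id', context.userId);
    }
    const startMs = Date.now();
    try {
      const result = await handler(args);
      span.setAttribute('mcp.tool.duration_ms', Date.now() - startMs);
      if (result.isError) {
        // Tool-level failures arrive as isError results, not exceptions
        span.setAttribute('mcp.tool.error', true);
        span.setStatus({ code: SpanStatusCode.ERROR });
      } else {
        span.setStatus({ code: SpanStatusCode.OK });
      }
      return result;
    } catch (err) {
      // Unexpected exceptions are recorded, marked ERROR, and re-thrown
      span.recordException(err);
      span.setStatus({
        code: SpanStatusCode.ERROR,
        message: err instanceof Error ? err.message : String(err),
      });
      throw err;
    } finally {
      span.end();
    }
  };
}
```

The `finally` block is what guarantees the "span not ended" class of leaks cannot happen regardless of which of the three outcome paths (success, `isError` result, thrown exception) the handler takes.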
packages/api/src/mcp/app.ts (58 lines, new file)
@@ -0,0 +1,58 @@
import { setTraceAttributes } from '@hyperdx/node-opentelemetry';
import { createMcpExpressApp } from '@modelcontextprotocol/sdk/server/express.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';

import { validateUserAccessKey } from '../middleware/auth';
import logger from '../utils/logger';
import rateLimiter, { rateLimiterKeyGenerator } from '../utils/rateLimiter';
import { createServer } from './mcpServer';
import { McpContext } from './tools/types';

const app = createMcpExpressApp();

const mcpRateLimiter = rateLimiter({
  windowMs: 60 * 1000, // 1 minute
  max: 100,
  standardHeaders: true,
  legacyHeaders: false,
  keyGenerator: rateLimiterKeyGenerator,
});

app.all('/', mcpRateLimiter, validateUserAccessKey, async (req, res) => {
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined, // stateless
  });

  const teamId = req.user?.team;

  if (!teamId) {
    logger.warn('MCP request rejected: no teamId');
    res.sendStatus(403);
    return;
  }

  const userId = req.user?._id?.toString();
  const context: McpContext = {
    teamId: teamId.toString(),
    userId,
  };

  setTraceAttributes({
    'mcp.team.id': context.teamId,
    ...(userId && { 'mcp.user.id': userId }),
  });

  logger.info({ teamId: context.teamId, userId }, 'MCP request received');

  const server = createServer(context);

  try {
    await server.connect(transport);
    await transport.handleRequest(req, res, req.body);
  } finally {
    await server.close();
    await transport.close();
  }
});

export default app;
packages/api/src/mcp/mcpServer.ts (21 lines, new file)
@@ -0,0 +1,21 @@
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

import { CODE_VERSION } from '@/config';

import dashboardPrompts from './prompts/dashboards/index';
import dashboardsTools from './tools/dashboards/index';
import queryTools from './tools/query/index';
import { McpContext } from './tools/types';

export function createServer(context: McpContext) {
  const server = new McpServer({
    name: 'hyperdx',
    version: `${CODE_VERSION}-beta`,
  });

  dashboardsTools(server, context);
  queryTools(server, context);
  dashboardPrompts(server, context);

  return server;
}
packages/api/src/mcp/prompts/dashboards/content.ts (719 lines, new file)
@@ -0,0 +1,719 @@
|
||||||
|
// ─── Prompt content builders ──────────────────────────────────────────────────
|
||||||
|
// Each function returns a plain string that is injected as a prompt message.
|
||||||
|
|
||||||
|
export function buildCreateDashboardPrompt(
|
||||||
|
sourceSummary: string,
|
||||||
|
traceSourceId: string,
|
||||||
|
logSourceId: string,
|
||||||
|
description?: string,
|
||||||
|
): string {
|
||||||
|
const userContext = description
|
||||||
|
? `\nThe user wants to create a dashboard for: ${description}\nTailor the dashboard tiles to match this goal.\n`
|
||||||
|
: '';
|
||||||
|
|
||||||
|
return `You are an expert at creating HyperDX observability dashboards.
|
||||||
|
${userContext}
|
||||||
|
${sourceSummary}
|
||||||
|
|
||||||
|
IMPORTANT: Call hyperdx_list_sources first to get the full column schema and attribute keys for each source. The source IDs above are correct, but you need the schema details to write accurate queries.
|
||||||
|
|
||||||
|
== WORKFLOW ==
|
||||||
|
|
||||||
|
1. Call hyperdx_list_sources — get source IDs, column schemas, and attribute keys
|
||||||
|
2. Design tiles — pick tile types that match the monitoring goal
|
||||||
|
3. Call hyperdx_save_dashboard — create the dashboard with all tiles
|
||||||
|
4. Call hyperdx_query_tile on each tile — validate queries return data
|
||||||
|
|
||||||
|
== TILE TYPE GUIDE ==
|
||||||
|
|
||||||
|
Use BUILDER tiles (with sourceId) for most cases:
|
||||||
|
line — Time-series trends (error rate, request volume, latency over time)
|
||||||
|
stacked_bar — Compare categories over time (requests by service, errors by status code)
|
||||||
|
number — Single KPI metric (total requests, current error rate, p99 latency)
|
||||||
|
table — Ranked lists (top endpoints by latency, error counts by service)
|
||||||
|
pie — Proportional breakdowns (traffic share by service, errors by type)
|
||||||
|
search — Browse raw log/event rows (error logs, recent traces)
|
||||||
|
markdown — Dashboard notes, section headers, or documentation
|
||||||
|
|
||||||
|
Use RAW SQL tiles (with connectionId) only for advanced queries:
|
||||||
|
Requires configType: "sql" plus a displayType (line, stacked_bar, table, number, pie)
|
||||||
|
Use when you need JOINs, sub-queries, CTEs, or queries the builder cannot express
|
||||||
|
|
||||||
|
== COLUMN NAMING ==
|
||||||
|
|
||||||
|
- Top-level columns use PascalCase by default: Duration, StatusCode, SpanName, Body, SeverityText, ServiceName
|
||||||
|
NOTE: These are defaults for the standard HyperDX schema. Custom sources may use different names.
|
||||||
|
Always call hyperdx_list_sources to get the real column names and keyColumns for each source.
|
||||||
|
- Map-type columns use bracket syntax: SpanAttributes['http.method'], ResourceAttributes['service.name']
|
||||||
|
NEVER use dot notation for Map columns (SpanAttributes.http.method) — always use brackets.
|
||||||
|
- JSON-type columns use dot notation: JsonColumn.key.subkey
|
||||||
|
Check the jsType returned by hyperdx_list_sources to determine whether a column is Map or JSON.
|
||||||
|
- Call hyperdx_list_sources to discover the exact column names, types, and attribute keys
|
||||||
|
|
||||||
|
== LAYOUT GRID ==
|
||||||
|
|
||||||
|
The dashboard grid is 24 columns wide. Tiles are positioned with (x, y, w, h):
|
||||||
|
- Number tiles: w=6, h=4 — fit 4 across in a row
|
||||||
|
- Line/Bar charts: w=12, h=4 — fit 2 side-by-side
|
||||||
|
- Tables: w=24, h=6 — full width
|
||||||
|
- Search tiles: w=24, h=6 — full width
|
||||||
|
- Markdown: w=24, h=2 — full width section header
|
||||||
|
|
||||||
|
Recommended layout pattern (top to bottom):
|
||||||
|
Row 0: KPI number tiles across the top (y=0)
|
||||||
|
Row 1: Time-series charts (y=4)
|
||||||
|
Row 2: Tables or search tiles (y=8)
|
||||||
|
|
||||||
|
== FILTER SYNTAX (Lucene) ==
|
||||||
|
|
||||||
|
Simple match: level:error
|
||||||
|
AND: service.name:api AND http.status_code:>=500
|
||||||
|
OR: level:error OR level:fatal
|
||||||
|
Wildcards: service.name:front*
|
||||||
|
Negation: NOT level:debug
|
||||||
|
Exists: _exists_:http.route
|
||||||
|
Range: Duration:>1000000000
|
||||||
|
Phrase: Body:"connection refused"
|
||||||
|
Grouped: (level:error OR level:fatal) AND service.name:api
|
||||||
|
|
||||||
|
== COMPLETE EXAMPLE ==
|
||||||
|
|
||||||
|
Here is a full dashboard creation call with properly structured tiles:
|
||||||
|
|
||||||
|
hyperdx_save_dashboard({
|
||||||
|
name: "Service Overview",
|
||||||
|
tags: ["overview"],
|
||||||
|
tiles: [
|
||||||
|
{
|
||||||
|
name: "Total Requests",
|
||||||
|
x: 0, y: 0, w: 6, h: 4,
|
||||||
|
config: {
|
||||||
|
displayType: "number",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
select: [{ aggFn: "count" }]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Error Count",
|
||||||
|
x: 6, y: 0, w: 6, h: 4,
|
||||||
|
config: {
|
||||||
|
displayType: "number",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
select: [{ aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR" }]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "P95 Latency (ms)",
|
||||||
|
x: 12, y: 0, w: 6, h: 4,
|
||||||
|
config: {
|
||||||
|
displayType: "number",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
select: [{ aggFn: "quantile", valueExpression: "Duration", level: 0.95 }]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Request Rate by Service",
|
||||||
|
x: 0, y: 4, w: 12, h: 4,
|
||||||
|
config: {
|
||||||
|
displayType: "line",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
select: [{ aggFn: "count" }],
|
||||||
|
groupBy: "ResourceAttributes['service.name']"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Error Rate Over Time",
|
||||||
|
x: 12, y: 4, w: 12, h: 4,
|
||||||
|
config: {
|
||||||
|
displayType: "line",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
select: [
|
||||||
|
{ aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR", alias: "Errors" },
|
||||||
|
{ aggFn: "count", alias: "Total" }
|
||||||
|
],
|
||||||
|
asRatio: true
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "Top Endpoints by Request Count",
|
||||||
|
x: 0, y: 8, w: 24, h: 6,
|
||||||
|
config: {
|
||||||
|
displayType: "table",
|
||||||
|
sourceId: "${traceSourceId}",
|
||||||
|
groupBy: "SpanName",
|
||||||
|
select: [
|
||||||
|
{ aggFn: "count", alias: "Requests" },
|
||||||
|
{ aggFn: "avg", valueExpression: "Duration", alias: "Avg Duration" },
|
||||||
|
{ aggFn: "quantile", valueExpression: "Duration", level: 0.95, alias: "P95 Duration" }
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
]
|
||||||
|
})
|
||||||
|
|
||||||
|
== STATUS CODE & SEVERITY VALUES ==
|
||||||
|
|
||||||
|
IMPORTANT: The exact values for StatusCode and SeverityText vary by deployment.
|
||||||
|
Do NOT assume values like "STATUS_CODE_ERROR", "Ok", "error", or "fatal".
|
||||||
|
Always call hyperdx_list_sources first and inspect the keyValues / mapAttributeKeys
|
||||||
|
returned for each source to discover the real values used in your data.
|
||||||
|
|
||||||
|
== COMMON MISTAKES TO AVOID ==
|
||||||
|
|
||||||
|
- Using valueExpression with aggFn "count" — count does not take a valueExpression
|
||||||
|
- Forgetting valueExpression for non-count aggFns — avg, sum, min, max, quantile all require it
|
||||||
|
- Using dot notation for Map-type attributes — always use SpanAttributes['key'] bracket syntax for Map columns
|
||||||
|
- Not calling hyperdx_list_sources first — you need real source IDs, not placeholders
|
||||||
|
- Not validating with hyperdx_query_tile after saving — tiles can silently fail
|
||||||
|
- Number and Pie tiles accept exactly 1 select item — not multiple
|
||||||
|
- Missing level for quantile aggFn — must specify 0.5, 0.9, 0.95, or 0.99
|
||||||
|
- Assuming StatusCode or SeverityText values — always inspect the source first`;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function buildDashboardExamplesPrompt(
|
||||||
|
traceSourceId: string,
|
||||||
|
logSourceId: string,
|
||||||
|
connectionId: string,
|
||||||
|
pattern?: string,
|
||||||
|
): string {
|
||||||
|
const examples: Record<string, string> = {};
|
||||||
|
|
||||||
|
examples['service_overview'] = `
|
||||||
|
== SERVICE HEALTH OVERVIEW ==
|
||||||
|
|
||||||
|
A high-level view of service health with KPIs, trends, and endpoint details.
|
||||||
|
|
||||||
|
{
|
||||||
|
name: "Service Health Overview",
|
||||||
|
tags: ["overview", "service"],
|
||||||
|
tiles: [
|
||||||
|
{
|
      name: "Total Requests",
      x: 0, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "count" }]
      }
    },
    {
      name: "Error Count",
      x: 6, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR" }]
      }
    },
    {
      name: "Avg Latency",
      x: 12, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "avg", valueExpression: "Duration" }]
      }
    },
    {
      name: "P99 Latency",
      x: 18, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "quantile", valueExpression: "Duration", level: 0.99 }]
      }
    },
    {
      name: "Request Volume Over Time",
      x: 0, y: 4, w: 12, h: 4,
      config: {
        displayType: "line",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "count" }],
        groupBy: "ResourceAttributes['service.name']"
      }
    },
    {
      name: "Error Rate Over Time",
      x: 12, y: 4, w: 12, h: 4,
      config: {
        displayType: "line",
        sourceId: "${traceSourceId}",
        select: [
          { aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR", alias: "Errors" },
          { aggFn: "count", alias: "Total" }
        ],
        asRatio: true
      }
    },
    {
      name: "Top Endpoints",
      x: 0, y: 8, w: 24, h: 6,
      config: {
        displayType: "table",
        sourceId: "${traceSourceId}",
        groupBy: "SpanName",
        select: [
          { aggFn: "count", alias: "Requests" },
          { aggFn: "avg", valueExpression: "Duration", alias: "Avg Duration" },
          { aggFn: "quantile", valueExpression: "Duration", level: 0.95, alias: "P95" },
          { aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR", alias: "Errors" }
        ]
      }
    }
  ]
}`;

  examples['error_tracking'] = `
== ERROR TRACKING ==

Focus on errors: volume, distribution, and raw error logs.

{
  name: "Error Tracking",
  tags: ["errors"],
  tiles: [
    {
      name: "Total Errors",
      x: 0, y: 0, w: 8, h: 4,
      config: {
        displayType: "number",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count", where: "SeverityText:error OR SeverityText:fatal" }]
      }
    },
    {
      name: "Errors Over Time by Service",
      x: 0, y: 4, w: 12, h: 4,
      config: {
        displayType: "line",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count", where: "SeverityText:error OR SeverityText:fatal" }],
        groupBy: "ResourceAttributes['service.name']"
      }
    },
    {
      name: "Error Breakdown by Service",
      x: 12, y: 4, w: 12, h: 4,
      config: {
        displayType: "pie",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count", where: "SeverityText:error OR SeverityText:fatal" }],
        groupBy: "ResourceAttributes['service.name']"
      }
    },
    {
      name: "Error Logs",
      x: 0, y: 8, w: 24, h: 6,
      config: {
        displayType: "search",
        sourceId: "${logSourceId}",
        where: "SeverityText:error OR SeverityText:fatal"
      }
    }
  ]
}`;

  examples['latency'] = `
== LATENCY MONITORING ==

Track response times with percentile breakdowns and slow endpoint identification.

{
  name: "Latency Monitoring",
  tags: ["latency", "performance"],
  tiles: [
    {
      name: "P50 Latency",
      x: 0, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "quantile", valueExpression: "Duration", level: 0.5 }]
      }
    },
    {
      name: "P95 Latency",
      x: 6, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "quantile", valueExpression: "Duration", level: 0.95 }]
      }
    },
    {
      name: "P99 Latency",
      x: 12, y: 0, w: 6, h: 4,
      config: {
        displayType: "number",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "quantile", valueExpression: "Duration", level: 0.99 }]
      }
    },
    {
      name: "Latency Percentiles Over Time",
      x: 0, y: 4, w: 24, h: 4,
      config: {
        displayType: "line",
        sourceId: "${traceSourceId}",
        select: [
          { aggFn: "quantile", valueExpression: "Duration", level: 0.5, alias: "P50" },
          { aggFn: "quantile", valueExpression: "Duration", level: 0.95, alias: "P95" },
          { aggFn: "quantile", valueExpression: "Duration", level: 0.99, alias: "P99" }
        ]
      }
    },
    {
      name: "Latency by Service",
      x: 0, y: 8, w: 12, h: 4,
      config: {
        displayType: "stacked_bar",
        sourceId: "${traceSourceId}",
        select: [{ aggFn: "avg", valueExpression: "Duration" }],
        groupBy: "ResourceAttributes['service.name']"
      }
    },
    {
      name: "Slowest Endpoints",
      x: 12, y: 8, w: 12, h: 6,
      config: {
        displayType: "table",
        sourceId: "${traceSourceId}",
        groupBy: "SpanName",
        select: [
          { aggFn: "quantile", valueExpression: "Duration", level: 0.95, alias: "P95 Duration" },
          { aggFn: "avg", valueExpression: "Duration", alias: "Avg Duration" },
          { aggFn: "count", alias: "Request Count" }
        ]
      }
    }
  ]
}`;

  examples['log_analysis'] = `
== LOG ANALYSIS ==

Analyze log volume, severity distribution, and browse log events.

{
  name: "Log Analysis",
  tags: ["logs"],
  tiles: [
    {
      name: "Total Log Events",
      x: 0, y: 0, w: 8, h: 4,
      config: {
        displayType: "number",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count" }]
      }
    },
    {
      name: "Log Volume by Severity",
      x: 0, y: 4, w: 12, h: 4,
      config: {
        displayType: "stacked_bar",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count" }],
        groupBy: "SeverityText"
      }
    },
    {
      name: "Severity Breakdown",
      x: 12, y: 4, w: 12, h: 4,
      config: {
        displayType: "pie",
        sourceId: "${logSourceId}",
        select: [{ aggFn: "count" }],
        groupBy: "SeverityText"
      }
    },
    {
      name: "Top Services by Log Volume",
      x: 0, y: 8, w: 12, h: 6,
      config: {
        displayType: "table",
        sourceId: "${logSourceId}",
        groupBy: "ResourceAttributes['service.name']",
        select: [
          { aggFn: "count", alias: "Log Count" },
          { aggFn: "count", where: "SeverityText:error OR SeverityText:fatal", alias: "Error Count" }
        ]
      }
    },
    {
      name: "Recent Logs",
      x: 12, y: 8, w: 12, h: 6,
      config: {
        displayType: "search",
        sourceId: "${logSourceId}"
      }
    }
  ]
}`;

  examples['infrastructure_sql'] = `
== INFRASTRUCTURE MONITORING (Raw SQL) ==

Advanced dashboard using raw SQL tiles for custom ClickHouse queries.
Use this pattern when you need JOINs, CTEs, or queries the builder cannot express.

{
  name: "Infrastructure (SQL)",
  tags: ["infrastructure", "sql"],
  tiles: [
    {
      name: "Log Ingestion Rate Over Time",
      x: 0, y: 0, w: 12, h: 4,
      config: {
        configType: "sql",
        displayType: "line",
        connectionId: "${connectionId}",
        sqlTemplate: "SELECT $__timeInterval(Timestamp) AS ts, count() AS logs_per_interval FROM otel_logs WHERE $__timeFilter(Timestamp) GROUP BY ts ORDER BY ts"
      }
    },
    {
      name: "Top 20 Services by Span Count",
      x: 12, y: 0, w: 12, h: 4,
      config: {
        configType: "sql",
        displayType: "table",
        connectionId: "${connectionId}",
        sqlTemplate: "SELECT ServiceName, count() AS span_count, avg(Duration) AS avg_duration FROM otel_traces WHERE Timestamp >= fromUnixTimestamp64Milli({startDateMilliseconds:Int64}) AND Timestamp < fromUnixTimestamp64Milli({endDateMilliseconds:Int64}) GROUP BY ServiceName ORDER BY span_count DESC LIMIT 20"
      }
    },
    {
      name: "Error Rate by Service (SQL)",
      x: 0, y: 4, w: 24, h: 4,
      config: {
        configType: "sql",
        displayType: "line",
        connectionId: "${connectionId}",
        sqlTemplate: "SELECT $__timeInterval(Timestamp) AS ts, ServiceName, countIf(StatusCode = 'STATUS_CODE_ERROR') / count() AS error_rate FROM otel_traces WHERE $__timeFilter(Timestamp) GROUP BY ServiceName, ts ORDER BY ts"
      }
    }
  ]
}

SQL TEMPLATE REFERENCE:
Macros (expanded before execution):
$__timeFilter(col) — col >= <start> AND col <= <end> (DateTime)
$__timeFilter_ms(col) — same with DateTime64 millisecond precision
$__dateFilter(col) — same with Date precision
$__timeInterval(col) — time bucket: toStartOfInterval(toDateTime(col), INTERVAL ...)
$__timeInterval_ms(col) — same with millisecond precision
$__fromTime / $__toTime — start/end as DateTime values
$__fromTime_ms / $__toTime_ms — start/end as DateTime64 values
$__interval_s — raw interval in seconds
$__filters — dashboard filter conditions (resolves to 1=1 when none)

Query parameters:
{startDateMilliseconds:Int64} — start of date range in milliseconds
{endDateMilliseconds:Int64} — end of date range in milliseconds
{intervalSeconds:Int64} — time bucket size in seconds
{intervalMilliseconds:Int64} — time bucket size in milliseconds

Available parameters by displayType:
line / stacked_bar — startDate, endDate, interval (all available)
table / number / pie — startDate, endDate only (no interval)`;

  if (pattern) {
    const key = pattern.toLowerCase().replace(/[\s-]+/g, '_');
    const matched = Object.entries(examples).find(([k]) => k === key);
    if (matched) {
      return `Dashboard example for pattern: ${pattern}\n\nReplace sourceId/connectionId values with real IDs from hyperdx_list_sources.\nNOTE: Column names below (Duration, StatusCode, SpanName, etc.) are defaults for the standard schema. Call hyperdx_list_sources to get the actual column names for your sources.\n${matched[1]}`;
    }
    return (
      `No example found for pattern "${pattern}". Available patterns: ${Object.keys(examples).join(', ')}\n\n` +
      `Showing all examples below.\n\n` +
      Object.values(examples).join('\n')
    );
  }

  return (
    `Complete dashboard examples for common observability patterns.\n` +
    `Replace sourceId/connectionId values with real IDs from hyperdx_list_sources.\n` +
    `NOTE: Column names below (Duration, StatusCode, SpanName, etc.) are defaults for the standard schema. Call hyperdx_list_sources to get the actual column names for your sources.\n\n` +
    `Available patterns: ${Object.keys(examples).join(', ')}\n` +
    Object.values(examples).join('\n')
  );
}
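The pattern lookup above normalizes user input before matching an example key, so "Service Overview", "service-overview", and "service_overview" all resolve to the same entry. A minimal standalone sketch of that normalization rule (illustrative only; `normalizePatternKey` is a hypothetical name, not part of the PR):

```typescript
// Lowercase the pattern and collapse runs of whitespace/hyphens into a
// single underscore — the same rule the prompt handler applies above.
function normalizePatternKey(pattern: string): string {
  return pattern.toLowerCase().replace(/[\s-]+/g, '_');
}

// normalizePatternKey('Service Overview') === 'service_overview'
// normalizePatternKey('error-tracking') === 'error_tracking'
```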

export function buildQueryGuidePrompt(): string {
  return `Reference guide for writing queries with HyperDX MCP tools (hyperdx_query and hyperdx_save_dashboard).

== AGGREGATION FUNCTIONS (aggFn) ==

count — Count matching rows. Does NOT take a valueExpression.
sum — Sum of a numeric column. Requires valueExpression.
avg — Average of a numeric column. Requires valueExpression.
min — Minimum value. Requires valueExpression.
max — Maximum value. Requires valueExpression.
count_distinct — Count of unique values. Requires valueExpression.
quantile — Percentile value. Requires valueExpression AND level (0.5, 0.9, 0.95, or 0.99).
last_value — Most recent value of a column. Requires valueExpression.
none — Pass a raw expression unchanged. Requires valueExpression.

Examples:
{ aggFn: "count" }
{ aggFn: "avg", valueExpression: "Duration" }
{ aggFn: "quantile", valueExpression: "Duration", level: 0.95 }
{ aggFn: "count_distinct", valueExpression: "ResourceAttributes['service.name']" }
{ aggFn: "sum", valueExpression: "Duration", where: "StatusCode:STATUS_CODE_ERROR" }

== COLUMN NAMING ==

Top-level columns (PascalCase defaults — use directly in valueExpression and groupBy):
Duration, StatusCode, SpanName, ServiceName, Body, SeverityText,
Timestamp, TraceId, SpanId, SpanKind, ParentSpanId
NOTE: These are the defaults for the standard HyperDX schema. Custom sources may
use different column names. Always verify with hyperdx_list_sources, which returns
the real column names and keyColumns expressions for each source.

Map-type columns (bracket syntax — access keys via ['key']):
SpanAttributes['http.method']
SpanAttributes['http.route']
SpanAttributes['http.status_code']
ResourceAttributes['service.name']
ResourceAttributes['deployment.environment']

IMPORTANT: Always use bracket syntax for Map-type columns. Never use dot notation for Maps.
Correct: SpanAttributes['http.method']
Incorrect: SpanAttributes.http.method

JSON-type columns (dot notation — access nested keys via dot path):
JsonColumn.key.subkey
NOTE: Check the jsType field returned by hyperdx_list_sources to determine
whether a column is Map (use brackets) or JSON (use dots).

== LUCENE FILTER SYNTAX ==

Used in the "where" field of select items and search tiles.

Basic match: level:error
AND: service.name:api AND http.status_code:>=500
OR: level:error OR level:fatal
NOT: NOT level:debug
Wildcard: service.name:front*
Phrase: Body:"connection refused"
Exists: _exists_:http.route
Range (numeric): Duration:>1000000000
Range (inclusive): http.status_code:[400 TO 499]
Grouped: (level:error OR level:fatal) AND service.name:api

NOTE: In Lucene filters, use dot notation for attribute keys (service.name, http.method).
This is different from valueExpression/groupBy which requires bracket syntax (SpanAttributes['http.method']).

== SQL FILTER SYNTAX ==

Alternative to Lucene. Set whereLanguage: "sql" when using SQL syntax.

Basic: SeverityText = 'error'
AND/OR: ServiceName = 'api' AND StatusCode = 'STATUS_CODE_ERROR'
IN: ServiceName IN ('api', 'web', 'worker')
LIKE: Body LIKE '%timeout%'
Comparison: Duration > 1000000000
Map access: SpanAttributes['http.status_code'] = '500'

== RAW SQL TEMPLATES ==

For configType: "sql" tiles, write ClickHouse SQL with template macros:

MACROS (expanded before execution):
$__timeFilter(col) — col >= <start> AND col <= <end>
$__timeFilter_ms(col) — same with DateTime64 millisecond precision
$__dateFilter(col) — same with Date precision
$__dateTimeFilter(d, t) — filters on both Date and DateTime columns
$__timeInterval(col) — time bucket expression for GROUP BY
$__timeInterval_ms(col) — same with millisecond precision
$__fromTime / $__toTime — start/end as DateTime values
$__fromTime_ms / $__toTime_ms — start/end as DateTime64 values
$__interval_s — raw interval in seconds (for arithmetic)
$__filters — dashboard filter conditions (1=1 when none)

QUERY PARAMETERS (ClickHouse parameterized syntax):
{startDateMilliseconds:Int64}
{endDateMilliseconds:Int64}
{intervalSeconds:Int64}
{intervalMilliseconds:Int64}

TIME-SERIES EXAMPLE (line / stacked_bar):
SELECT
  $__timeInterval(Timestamp) AS ts,
  ServiceName,
  count() AS requests
FROM otel_traces
WHERE $__timeFilter(Timestamp)
GROUP BY ServiceName, ts
ORDER BY ts

TABLE EXAMPLE:
SELECT
  ServiceName,
  count() AS request_count,
  avg(Duration) AS avg_duration,
  quantile(0.95)(Duration) AS p95_duration
FROM otel_traces
WHERE Timestamp >= fromUnixTimestamp64Milli({startDateMilliseconds:Int64})
  AND Timestamp < fromUnixTimestamp64Milli({endDateMilliseconds:Int64})
GROUP BY ServiceName
ORDER BY request_count DESC
LIMIT 50

IMPORTANT: Always include a LIMIT clause in table/number/pie SQL queries.

== PER-TILE TYPE CONSTRAINTS ==

number — Exactly 1 select item. No groupBy.
pie — Exactly 1 select item. groupBy defines the slices.
line — 1-20 select items. Optional groupBy splits into series.
stacked_bar — 1-20 select items. Optional groupBy splits into stacks.
table — 1-20 select items. Optional groupBy defines row groups.
search — No select items (select is a column list string). where is the filter.
markdown — No select items. Set markdown field with content.

== asRatio ==

Set asRatio: true on line/stacked_bar/table tiles with exactly 2 select items
to plot the first as a ratio of the second. Useful for error rates:
select: [
  { aggFn: "count", where: "StatusCode:STATUS_CODE_ERROR", alias: "Errors" },
  { aggFn: "count", alias: "Total" }
],
asRatio: true

== COMMON MISTAKES ==

1. Using valueExpression with aggFn "count"
   Wrong: { aggFn: "count", valueExpression: "Duration" }
   Correct: { aggFn: "count" }

2. Forgetting valueExpression for non-count aggFns
   Wrong: { aggFn: "avg" }
   Correct: { aggFn: "avg", valueExpression: "Duration" }

3. Using dot notation for Map-type attributes in valueExpression/groupBy
   Wrong: groupBy: "SpanAttributes.http.method"
   Correct: groupBy: "SpanAttributes['http.method']"
   NOTE: JSON-type columns DO use dot notation. Check jsType from hyperdx_list_sources.

4. Multiple select items on number/pie tiles
   Wrong: displayType: "number", select: [{ aggFn: "count" }, { aggFn: "avg", ... }]
   Correct: displayType: "number", select: [{ aggFn: "count" }]

5. Missing level for quantile
   Wrong: { aggFn: "quantile", valueExpression: "Duration" }
   Correct: { aggFn: "quantile", valueExpression: "Duration", level: 0.95 }

6. Forgetting to validate tiles after saving
   Always call hyperdx_query_tile after hyperdx_save_dashboard to verify each tile returns data.

7. Using sourceId with SQL tiles or connectionId with builder tiles
   Builder tiles (line, table, etc.) use sourceId.
   SQL tiles (configType: "sql") use connectionId.

8. Assuming StatusCode or SeverityText values
   Values like STATUS_CODE_ERROR, Ok, error, fatal vary by deployment.
   Always call hyperdx_list_sources and inspect real keyValues from the source
   before writing filters that depend on these columns.`;
}

48 packages/api/src/mcp/prompts/dashboards/helpers.ts Normal file
@@ -0,0 +1,48 @@

// ─── Source/connection summary helpers ───────────────────────────────────────

export function buildSourceSummary(
  sources: { _id: unknown; name: string; kind: string; connection: unknown }[],
  connections: { _id: unknown; name: string }[],
): string {
  if (sources.length === 0 && connections.length === 0) {
    return 'No sources or connections found. Call hyperdx_list_sources to discover available data.';
  }

  const lines: string[] = [];

  if (sources.length > 0) {
    lines.push('AVAILABLE SOURCES (use sourceId with builder tiles):');
    for (const s of sources) {
      lines.push(
        ` - "${s.name}" (${s.kind}) — sourceId: "${s._id}", connectionId: "${s.connection}"`,
      );
    }
  }

  if (connections.length > 0) {
    lines.push('');
    lines.push(
      'AVAILABLE CONNECTIONS (use connectionId with raw SQL tiles only):',
    );
    for (const c of connections) {
      lines.push(` - "${c.name}" — connectionId: "${c._id}"`);
    }
  }

  return lines.join('\n');
}

export function getFirstSourceId(
  sources: { _id: unknown; kind: string }[],
  preferredKind?: string,
): string {
  const preferred = preferredKind
    ? sources.find(s => s.kind === preferredKind)
    : undefined;
  const source = preferred ?? sources[0];
  return source ? String(source._id) : '<SOURCE_ID>';
}

export function getFirstConnectionId(connections: { _id: unknown }[]): string {
  return connections[0] ? String(connections[0]._id) : '<CONNECTION_ID>';
}
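For a standalone illustration of the fallback behavior: with a preferred kind present the helper picks the first matching source, otherwise it falls back to the first source of any kind, and with no sources at all it returns the `<SOURCE_ID>` placeholder rather than throwing. `getFirstSourceId` is restated below verbatim so the snippet runs on its own:

```typescript
// Restated from helpers.ts above, solely so this sketch is self-contained.
function getFirstSourceId(
  sources: { _id: unknown; kind: string }[],
  preferredKind?: string,
): string {
  const preferred = preferredKind
    ? sources.find(s => s.kind === preferredKind)
    : undefined;
  const source = preferred ?? sources[0];
  return source ? String(source._id) : '<SOURCE_ID>';
}

// getFirstSourceId([], 'trace') → '<SOURCE_ID>'  (placeholder fallback)
// getFirstSourceId([{ _id: 'a', kind: 'log' }, { _id: 'b', kind: 'trace' }], 'trace') → 'b'
// getFirstSourceId([{ _id: 'a', kind: 'log' }], 'trace') → 'a'  (no trace source)
```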

195 packages/api/src/mcp/prompts/dashboards/index.ts Normal file
@@ -0,0 +1,195 @@

import { z } from 'zod';

import { getConnectionsByTeam } from '@/controllers/connection';
import { getSources } from '@/controllers/sources';
import logger from '@/utils/logger';

import type { PromptDefinition } from '../../tools/types';
import {
  buildCreateDashboardPrompt,
  buildDashboardExamplesPrompt,
  buildQueryGuidePrompt,
} from './content';
import {
  buildSourceSummary,
  getFirstConnectionId,
  getFirstSourceId,
} from './helpers';

const dashboardPrompts: PromptDefinition = (server, context) => {
  const { teamId } = context;

  // ── create_dashboard ──────────────────────────────────────────────────────

  server.registerPrompt(
    'create_dashboard',
    {
      title: 'Create a Dashboard',
      description:
        'Create a HyperDX dashboard with the MCP tools. ' +
        'Follow the recommended workflow, pick tile types, write queries, ' +
        'and validate results — using your real data sources.',
      argsSchema: {
        description: z
          .string()
          .optional()
          .describe(
            'What the dashboard should monitor (e.g. "API error rates and latency")',
          ),
      },
    },
    async ({ description }) => {
      let sourceSummary: string;
      let traceSourceId: string;
      let logSourceId: string;

      try {
        const [sources, connections] = await Promise.all([
          getSources(teamId),
          getConnectionsByTeam(teamId),
        ]);

        sourceSummary = buildSourceSummary(
          sources.map(s => ({
            _id: s._id,
            name: s.name,
            kind: s.kind,
            connection: s.connection,
          })),
          connections.map(c => ({ _id: c._id, name: c.name })),
        );
        traceSourceId = getFirstSourceId(
          sources.map(s => ({ _id: s._id, kind: s.kind })),
          'trace',
        );
        logSourceId = getFirstSourceId(
          sources.map(s => ({ _id: s._id, kind: s.kind })),
          'log',
        );
      } catch (e) {
        logger.warn(
          { teamId, error: e },
          'Failed to fetch sources for create_dashboard prompt',
        );
        sourceSummary =
          'Could not fetch sources. Call hyperdx_list_sources to discover available data.';
        traceSourceId = '<SOURCE_ID>';
        logSourceId = '<SOURCE_ID>';
      }

      return {
        messages: [
          {
            role: 'user' as const,
            content: {
              type: 'text' as const,
              text: buildCreateDashboardPrompt(
                sourceSummary,
                traceSourceId,
                logSourceId,
                description,
              ),
            },
          },
        ],
      };
    },
  );

  // ── dashboard_examples ────────────────────────────────────────────────────

  server.registerPrompt(
    'dashboard_examples',
    {
      title: 'Dashboard Examples',
      description:
        'Get copy-paste-ready dashboard examples for common observability patterns: ' +
        'service_overview, error_tracking, latency, log_analysis, infrastructure_sql.',
      argsSchema: {
        pattern: z
          .string()
          .optional()
          .describe(
            'Filter to a specific pattern: service_overview, error_tracking, latency, log_analysis, infrastructure_sql',
          ),
      },
    },
    async ({ pattern }) => {
      let traceSourceId: string;
      let logSourceId: string;
      let connectionId: string;

      try {
        const [sources, connections] = await Promise.all([
          getSources(teamId),
          getConnectionsByTeam(teamId),
        ]);

        traceSourceId = getFirstSourceId(
          sources.map(s => ({ _id: s._id, kind: s.kind })),
          'trace',
        );
        logSourceId = getFirstSourceId(
          sources.map(s => ({ _id: s._id, kind: s.kind })),
          'log',
        );
        connectionId = getFirstConnectionId(
          connections.map(c => ({ _id: c._id })),
        );
      } catch (e) {
        logger.warn(
          { teamId, error: e },
          'Failed to fetch sources for dashboard_examples prompt',
        );
        traceSourceId = '<TRACE_SOURCE_ID>';
        logSourceId = '<LOG_SOURCE_ID>';
        connectionId = '<CONNECTION_ID>';
      }

      return {
        messages: [
          {
            role: 'user' as const,
            content: {
              type: 'text' as const,
              text: buildDashboardExamplesPrompt(
                traceSourceId,
                logSourceId,
                connectionId,
                pattern,
              ),
            },
          },
        ],
      };
    },
  );

  // ── query_guide ───────────────────────────────────────────────────────────

  server.registerPrompt(
    'query_guide',
    {
      title: 'Query Writing Guide',
      description:
        'Look up HyperDX query syntax: aggregation functions, ' +
        'Lucene/SQL filters, raw SQL macros, column naming, ' +
        'per-tile constraints, and common mistakes.',
    },
    async () => {
      return {
        messages: [
          {
            role: 'user' as const,
            content: {
              type: 'text' as const,
              text: buildQueryGuidePrompt(),
            },
          },
        ],
      };
    },
  );
};

export default dashboardPrompts;

63 packages/api/src/mcp/tools/dashboards/deleteDashboard.ts Normal file
@@ -0,0 +1,63 @@

import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import mongoose from 'mongoose';
import { z } from 'zod';

import { deleteDashboard } from '@/controllers/dashboard';
import Dashboard from '@/models/dashboard';

import { withToolTracing } from '../../utils/tracing';
import type { McpContext } from '../types';

export function registerDeleteDashboard(
  server: McpServer,
  context: McpContext,
): void {
  const { teamId } = context;

  server.registerTool(
    'hyperdx_delete_dashboard',
    {
      title: 'Delete Dashboard',
      description:
        'Permanently delete a dashboard by ID. Also removes any alerts attached to its tiles. ' +
        'Use hyperdx_get_dashboard (without an ID) to list available dashboard IDs.',
      inputSchema: z.object({
        id: z.string().describe('Dashboard ID to delete.'),
      }),
    },
    withToolTracing(
      'hyperdx_delete_dashboard',
      context,
      async ({ id: dashboardId }) => {
        if (!mongoose.Types.ObjectId.isValid(dashboardId)) {
          return {
            isError: true,
            content: [{ type: 'text' as const, text: 'Invalid dashboard ID' }],
          };
        }

        const existing = await Dashboard.findOne({
          _id: dashboardId,
          team: teamId,
        }).lean();
        if (!existing) {
          return {
            isError: true,
            content: [{ type: 'text' as const, text: 'Dashboard not found' }],
          };
        }

        await deleteDashboard(dashboardId, new mongoose.Types.ObjectId(teamId));

        return {
          content: [
            {
              type: 'text' as const,
              text: JSON.stringify({ deleted: true, id: dashboardId }, null, 2),
            },
          ],
        };
      },
    ),
  );
}
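The tool guards with `mongoose.Types.ObjectId.isValid` before touching the database, so malformed IDs return an error without a lookup. A simplified sketch of the common valid shape, a 24-character hex string (mongoose additionally accepts 12-byte inputs, which this check deliberately omits; `looksLikeObjectId` is a hypothetical name):

```typescript
// Simplified stand-in for mongoose.Types.ObjectId.isValid: accept only the
// canonical 24-hex-character form of a MongoDB ObjectId.
function looksLikeObjectId(id: string): boolean {
  return /^[0-9a-fA-F]{24}$/.test(id);
}

// looksLikeObjectId('507f1f77bcf86cd799439011') → true
// looksLikeObjectId('not-an-id') → false
```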

87 packages/api/src/mcp/tools/dashboards/getDashboard.ts Normal file
@@ -0,0 +1,87 @@

import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import mongoose from 'mongoose';
import { z } from 'zod';

import * as config from '@/config';
import { getDashboards } from '@/controllers/dashboard';
import Dashboard from '@/models/dashboard';
import { convertToExternalDashboard } from '@/routers/external-api/v2/utils/dashboards';

import { withToolTracing } from '../../utils/tracing';
import type { McpContext } from '../types';

export function registerGetDashboard(
  server: McpServer,
  context: McpContext,
): void {
  const { teamId } = context;
  const frontendUrl = config.FRONTEND_URL;

  server.registerTool(
    'hyperdx_get_dashboard',
    {
      title: 'Get Dashboard(s)',
      description:
        'Without an ID: list all dashboards (returns IDs, names, tags). ' +
        'With an ID: get full dashboard detail including all tiles and configuration.',
      inputSchema: z.object({
        id: z
          .string()
          .optional()
          .describe(
            'Dashboard ID. Omit to list all dashboards, provide to get full detail.',
          ),
      }),
    },
    withToolTracing('hyperdx_get_dashboard', context, async ({ id }) => {
      if (!id) {
        const dashboards = await getDashboards(
          new mongoose.Types.ObjectId(teamId),
        );
        const output = dashboards.map(d => ({
          id: d._id.toString(),
          name: d.name,
          tags: d.tags,
          ...(frontendUrl ? { url: `${frontendUrl}/dashboards/${d._id}` } : {}),
        }));
        return {
          content: [
            { type: 'text' as const, text: JSON.stringify(output, null, 2) },
          ],
        };
      }

      if (!mongoose.Types.ObjectId.isValid(id)) {
        return {
          isError: true,
          content: [{ type: 'text' as const, text: 'Invalid dashboard ID' }],
        };
      }

      const dashboard = await Dashboard.findOne({ _id: id, team: teamId });
      if (!dashboard) {
        return {
          isError: true,
          content: [{ type: 'text' as const, text: 'Dashboard not found' }],
        };
      }
      return {
        content: [
          {
            type: 'text' as const,
            text: JSON.stringify(
              {
                ...convertToExternalDashboard(dashboard),
                ...(frontendUrl
                  ? { url: `${frontendUrl}/dashboards/${dashboard._id}` }
                  : {}),
              },
              null,
              2,
            ),
          },
        ],
      };
    }),
  );
}
|
||||||
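Every tool in this router returns the same MCP result shape: a `content` array holding a single `text` entry of pretty-printed JSON, with an optional `isError` flag for failures. A minimal sketch of that shape; the `textResult`/`errorResult` helper names are ours, the PR inlines the shape at each return site:

```typescript
// Sketch of the MCP tool result shape used throughout these dashboard tools.
// Hypothetical helpers; the actual code builds these object literals inline.
type McpToolResult = {
  isError?: boolean;
  content: { type: 'text'; text: string }[];
};

// Success results carry pretty-printed JSON.
function textResult(value: unknown): McpToolResult {
  return {
    content: [{ type: 'text', text: JSON.stringify(value, null, 2) }],
  };
}

// Error results carry a plain message and set isError.
function errorResult(message: string): McpToolResult {
  return { isError: true, content: [{ type: 'text', text: message }] };
}

const ok = textResult({ deleted: true, id: 'abc123' });
const err = errorResult('Dashboard not found');
```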
packages/api/src/mcp/tools/dashboards/index.ts (new file, 23 lines)

import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

import type { McpContext, ToolDefinition } from '../types';
import { registerDeleteDashboard } from './deleteDashboard';
import { registerGetDashboard } from './getDashboard';
import { registerListSources } from './listSources';
import { registerQueryTile } from './queryTile';
import { registerSaveDashboard } from './saveDashboard';

export * from './schemas';

const dashboardsTools: ToolDefinition = (
  server: McpServer,
  context: McpContext,
) => {
  registerListSources(server, context);
  registerGetDashboard(server, context);
  registerSaveDashboard(server, context);
  registerDeleteDashboard(server, context);
  registerQueryTile(server, context);
};

export default dashboardsTools;
packages/api/src/mcp/tools/dashboards/listSources.ts (new file, 183 lines)

import {
  convertCHDataTypeToJSType,
  filterColumnMetaByType,
  JSDataType,
} from '@hyperdx/common-utils/dist/clickhouse';
import { ClickhouseClient } from '@hyperdx/common-utils/dist/clickhouse/node';
import { getMetadata } from '@hyperdx/common-utils/dist/core/metadata';
import { SourceKind } from '@hyperdx/common-utils/dist/types';
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';

import {
  getConnectionById,
  getConnectionsByTeam,
} from '@/controllers/connection';
import { getSources } from '@/controllers/sources';
import logger from '@/utils/logger';

import { withToolTracing } from '../../utils/tracing';
import type { McpContext } from '../types';

export function registerListSources(
  server: McpServer,
  context: McpContext,
): void {
  const { teamId } = context;

  server.registerTool(
    'hyperdx_list_sources',
    {
      title: 'List Sources & Connections',
      description:
        'List all data sources (logs, metrics, traces) and database connections available to this team. ' +
        'Returns source IDs (use as sourceId in hyperdx_query and dashboard tiles) and ' +
        'connection IDs (use as connectionId for advanced raw SQL queries). ' +
        'Each source includes its full column schema and sampled attribute keys from map columns ' +
        '(e.g. SpanAttributes, ResourceAttributes). ' +
        'Column names are PascalCase (e.g. Duration, not duration). ' +
        "Map attributes must be accessed via bracket syntax: SpanAttributes['key'].\n\n" +
        'NOTE: For most queries, use source IDs with the builder display types. ' +
        'Connection IDs are only needed for advanced raw SQL queries (displayType "sql").',
      inputSchema: z.object({}),
    },
    withToolTracing('hyperdx_list_sources', context, async () => {
      const [sources, connections] = await Promise.all([
        getSources(teamId.toString()),
        getConnectionsByTeam(teamId.toString()),
      ]);

      const sourcesWithSchema = await Promise.all(
        sources.map(async s => {
          const meta: Record<string, unknown> = {
            id: s._id.toString(),
            name: s.name,
            kind: s.kind,
            connectionId: s.connection.toString(),
            timestampColumn: s.timestampValueExpression,
          };

          if ('eventAttributesExpression' in s && s.eventAttributesExpression) {
            meta.eventAttributesColumn = s.eventAttributesExpression;
          }
          if (
            'resourceAttributesExpression' in s &&
            s.resourceAttributesExpression
          ) {
            meta.resourceAttributesColumn = s.resourceAttributesExpression;
          }

          if (s.kind === SourceKind.Trace) {
            meta.keyColumns = {
              spanName: s.spanNameExpression,
              duration: s.durationExpression,
              durationPrecision: s.durationPrecision,
              statusCode: s.statusCodeExpression,
              serviceName: s.serviceNameExpression,
              traceId: s.traceIdExpression,
              spanId: s.spanIdExpression,
            };
          } else if (s.kind === SourceKind.Log) {
            meta.keyColumns = {
              body: s.bodyExpression,
              serviceName: s.serviceNameExpression,
              severityText: s.severityTextExpression,
              traceId: s.traceIdExpression,
            };
          } else if (s.kind === SourceKind.Metric) {
            meta.metricTables = s.metricTables;
          }

          // Skip column schema fetch for sources without a table (e.g. metrics
          // sources store their tables in metricTables, not from.tableName).
          if (s.from.tableName) {
            try {
              const connection = await getConnectionById(
                teamId.toString(),
                s.connection.toString(),
                true,
              );
              if (!connection) {
                throw new Error(`Connection not found for source ${s._id}`);
              }

              const clickhouseClient = new ClickhouseClient({
                host: connection.host,
                username: connection.username,
                password: connection.password,
              });
              const metadata = getMetadata(clickhouseClient);

              const columns = await metadata.getColumns({
                databaseName: s.from.databaseName,
                tableName: s.from.tableName,
                connectionId: s.connection.toString(),
              });

              meta.columns = columns.map(c => ({
                name: c.name,
                type: c.type,
                jsType: convertCHDataTypeToJSType(c.type),
              }));

              const mapColumns = filterColumnMetaByType(columns, [
                JSDataType.Map,
              ]);
              const mapKeysResults: Record<string, string[]> = {};
              await Promise.all(
                (mapColumns ?? []).map(async col => {
                  try {
                    const keys = await metadata.getMapKeys({
                      databaseName: s.from.databaseName,
                      tableName: s.from.tableName,
                      column: col.name,
                      maxKeys: 50,
                      connectionId: s.connection.toString(),
                    });
                    mapKeysResults[col.name] = keys;
                  } catch {
                    // Skip columns where key sampling fails
                  }
                }),
              );
              if (Object.keys(mapKeysResults).length > 0) {
                meta.mapAttributeKeys = mapKeysResults;
              }
            } catch (e) {
              logger.warn(
                { teamId, sourceId: s._id, error: e },
                'Failed to fetch schema for source',
              );
            }
          }

          return meta;
        }),
      );

      const output = {
        sources: sourcesWithSchema,
        connections: connections.map(c => ({
          id: c._id.toString(),
          name: c.name,
        })),
        usage: {
          topLevelColumns:
            'Use directly in valueExpression/groupBy with PascalCase: Duration, StatusCode, SpanName',
          mapAttributes:
            "Use bracket syntax: SpanAttributes['http.method'], ResourceAttributes['service.name']",
          sourceIds:
            'Use sourceId with builder display types (line, stacked_bar, table, number, pie, search) for standard queries',
          connectionIds:
            'ADVANCED: Use connectionId only with raw SQL queries (displayType "sql" or configType "sql"). ' +
            'Raw SQL is for advanced use cases like JOINs, sub-queries, or querying tables not registered as sources.',
        },
      };
      return {
        content: [
          { type: 'text' as const, text: JSON.stringify(output, null, 2) },
        ],
      };
    }),
  );
}
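The usage notes above document two column-reference conventions: top-level PascalCase columns used directly, and map attributes accessed with bracket syntax. A small illustrative helper (ours, not part of the PR) that builds such expressions; the single-quote escaping is our assumption as a precaution, since sampled keys could in principle contain quotes:

```typescript
// Builds a column reference following the conventions from the
// hyperdx_list_sources usage notes. Hypothetical helper, not part of the PR.
function mapAttrExpression(column: string, key: string): string {
  // Escape single quotes in the key (our assumption; sampled keys may
  // already exclude such characters).
  const escaped = key.replace(/'/g, "\\'");
  return `${column}['${escaped}']`;
}

// Top-level columns need no wrapping: "Duration", "StatusCode", "SpanName".
const byMethod = mapAttrExpression('SpanAttributes', 'http.method');
const byService = mapAttrExpression('ResourceAttributes', 'service.name');
```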
packages/api/src/mcp/tools/dashboards/queryTile.ts (new file, 97 lines)

import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import mongoose from 'mongoose';
import { z } from 'zod';

import Dashboard from '@/models/dashboard';
import { convertToExternalDashboard } from '@/routers/external-api/v2/utils/dashboards';

import { withToolTracing } from '../../utils/tracing';
import { parseTimeRange, runConfigTile } from '../query/helpers';
import type { McpContext } from '../types';

export function registerQueryTile(
  server: McpServer,
  context: McpContext,
): void {
  const { teamId } = context;

  server.registerTool(
    'hyperdx_query_tile',
    {
      title: 'Query a Dashboard Tile',
      description:
        'Execute the query for a specific tile on an existing dashboard. ' +
        'Useful for validating that a tile returns data or for spot-checking results ' +
        'without rebuilding the query from scratch. ' +
        'Use hyperdx_get_dashboard with an ID to find tile IDs.',
      inputSchema: z.object({
        dashboardId: z.string().describe('Dashboard ID.'),
        tileId: z
          .string()
          .describe(
            'Tile ID within the dashboard. ' +
              'Obtain from hyperdx_get_dashboard.',
          ),
        startTime: z
          .string()
          .optional()
          .describe(
            'Start of the query window as ISO 8601. Default: 15 minutes ago. ' +
              'If results are empty, try a wider range (e.g. 24 hours).',
          ),
        endTime: z
          .string()
          .optional()
          .describe('End of the query window as ISO 8601. Default: now.'),
      }),
    },
    withToolTracing(
      'hyperdx_query_tile',
      context,
      async ({ dashboardId, tileId, startTime, endTime }) => {
        const timeRange = parseTimeRange(startTime, endTime);
        if ('error' in timeRange) {
          return {
            isError: true,
            content: [{ type: 'text' as const, text: timeRange.error }],
          };
        }
        const { startDate, endDate } = timeRange;

        if (!mongoose.Types.ObjectId.isValid(dashboardId)) {
          return {
            isError: true,
            content: [{ type: 'text' as const, text: 'Invalid dashboard ID' }],
          };
        }

        const dashboard = await Dashboard.findOne({
          _id: dashboardId,
          team: teamId,
        });
        if (!dashboard) {
          return {
            isError: true,
            content: [{ type: 'text' as const, text: 'Dashboard not found' }],
          };
        }

        const externalDashboard = convertToExternalDashboard(dashboard);
        const tile = externalDashboard.tiles.find(t => t.id === tileId);
        if (!tile) {
          return {
            isError: true,
            content: [
              {
                type: 'text' as const,
                text: `Tile not found: ${tileId}. Available tile IDs: ${externalDashboard.tiles.map(t => t.id).join(', ')}`,
              },
            ],
          };
        }

        return runConfigTile(teamId.toString(), tile, startDate, endDate);
      },
    ),
  );
}
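`parseTimeRange` lives in `../query/helpers` and is not part of this diff; the tool's input descriptions state its contract (optional ISO 8601 strings, defaulting to a 15-minute window ending now, with an `error` result for bad input). A sketch of that contract under those assumptions; the real implementation may differ:

```typescript
// Sketch of the time-range contract described by hyperdx_query_tile's
// inputs. Assumption: the real parseTimeRange in ../query/helpers behaves
// similarly, but it is not shown in this diff.
const FIFTEEN_MINUTES_MS = 15 * 60 * 1000;

function parseTimeRangeSketch(
  startTime?: string,
  endTime?: string,
): { startDate: Date; endDate: Date } | { error: string } {
  const endDate = endTime ? new Date(endTime) : new Date();
  const startDate = startTime
    ? new Date(startTime)
    : new Date(endDate.getTime() - FIFTEEN_MINUTES_MS);
  if (Number.isNaN(startDate.getTime()) || Number.isNaN(endDate.getTime())) {
    return { error: 'Invalid ISO 8601 timestamp' };
  }
  if (startDate.getTime() >= endDate.getTime()) {
    return { error: 'startTime must be before endTime' };
  }
  return { startDate, endDate };
}
```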
packages/api/src/mcp/tools/dashboards/saveDashboard.ts (new file, 338 lines)

import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { uniq } from 'lodash';
import mongoose from 'mongoose';
import { z } from 'zod';

import * as config from '@/config';
import Dashboard from '@/models/dashboard';
import {
  cleanupDashboardAlerts,
  convertExternalFiltersToInternal,
  convertExternalTilesToInternal,
  convertToExternalDashboard,
  createDashboardBodySchema,
  getMissingConnections,
  getMissingSources,
  resolveSavedQueryLanguage,
  updateDashboardBodySchema,
} from '@/routers/external-api/v2/utils/dashboards';
import type { ExternalDashboardTileWithId } from '@/utils/zod';

import { withToolTracing } from '../../utils/tracing';
import type { McpContext } from '../types';
import { mcpTilesParam } from './schemas';

export function registerSaveDashboard(
  server: McpServer,
  context: McpContext,
): void {
  const { teamId } = context;
  const frontendUrl = config.FRONTEND_URL;

  server.registerTool(
    'hyperdx_save_dashboard',
    {
      title: 'Create or Update Dashboard',
      description:
        'Create a new dashboard (omit id) or update an existing one (provide id). ' +
        'Call hyperdx_list_sources first to obtain sourceId and connectionId values. ' +
        'IMPORTANT: After saving a dashboard, always run hyperdx_query_tile on each tile ' +
        'to confirm the queries work and return expected data. Tiles can silently fail ' +
        'due to incorrect filter syntax, missing attributes, or wrong column names.',
      inputSchema: z.object({
        id: z
          .string()
          .optional()
          .describe(
            'Dashboard ID. Omit to create a new dashboard, provide to update an existing one.',
          ),
        name: z.string().describe('Dashboard name'),
        tiles: mcpTilesParam,
        tags: z.array(z.string()).optional().describe('Dashboard tags'),
      }),
    },
    withToolTracing(
      'hyperdx_save_dashboard',
      context,
      async ({ id: dashboardId, name, tiles: inputTiles, tags }) => {
        if (!dashboardId) {
          return createDashboard({
            teamId,
            frontendUrl,
            name,
            inputTiles,
            tags,
          });
        }
        return updateDashboard({
          teamId,
          frontendUrl,
          dashboardId,
          name,
          inputTiles,
          tags,
        });
      },
    ),
  );
}

// ─── Create helper ────────────────────────────────────────────────────────────

async function createDashboard({
  teamId,
  frontendUrl,
  name,
  inputTiles,
  tags,
}: {
  teamId: string;
  frontendUrl: string | undefined;
  name: string;
  inputTiles: unknown[];
  tags: string[] | undefined;
}) {
  const parsed = createDashboardBodySchema.safeParse({
    name,
    tiles: inputTiles,
    tags,
  });
  if (!parsed.success) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Validation error: ${JSON.stringify(parsed.error.errors)}`,
        },
      ],
    };
  }

  const { tiles, filters } = parsed.data;
  const tilesWithId = tiles as ExternalDashboardTileWithId[];

  const [missingSources, missingConnections] = await Promise.all([
    getMissingSources(teamId, tilesWithId, filters),
    getMissingConnections(teamId, tilesWithId),
  ]);
  if (missingSources.length > 0) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Could not find source IDs: ${missingSources.join(', ')}`,
        },
      ],
    };
  }
  if (missingConnections.length > 0) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Could not find connection IDs: ${missingConnections.join(', ')}`,
        },
      ],
    };
  }

  const internalTiles = convertExternalTilesToInternal(tilesWithId);
  const filtersWithIds = convertExternalFiltersToInternal(filters ?? []);

  const normalizedSavedQueryLanguage = resolveSavedQueryLanguage({
    savedQuery: undefined,
    savedQueryLanguage: undefined,
  });

  const newDashboard = await new Dashboard({
    name: parsed.data.name,
    tiles: internalTiles,
    tags: tags && uniq(tags),
    filters: filtersWithIds,
    savedQueryLanguage: normalizedSavedQueryLanguage,
    savedFilterValues: parsed.data.savedFilterValues,
    team: teamId,
  }).save();

  return {
    content: [
      {
        type: 'text' as const,
        text: JSON.stringify(
          {
            ...convertToExternalDashboard(newDashboard),
            ...(frontendUrl
              ? { url: `${frontendUrl}/dashboards/${newDashboard._id}` }
              : {}),
            hint: 'Use hyperdx_query to test individual tile queries before viewing the dashboard.',
          },
          null,
          2,
        ),
      },
    ],
  };
}

// ─── Update helper ────────────────────────────────────────────────────────────

async function updateDashboard({
  teamId,
  frontendUrl,
  dashboardId,
  name,
  inputTiles,
  tags,
}: {
  teamId: string;
  frontendUrl: string | undefined;
  dashboardId: string;
  name: string;
  inputTiles: unknown[];
  tags: string[] | undefined;
}) {
  if (!mongoose.Types.ObjectId.isValid(dashboardId)) {
    return {
      isError: true,
      content: [{ type: 'text' as const, text: 'Invalid dashboard ID' }],
    };
  }

  const parsed = updateDashboardBodySchema.safeParse({
    name,
    tiles: inputTiles,
    tags,
  });
  if (!parsed.success) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Validation error: ${JSON.stringify(parsed.error.errors)}`,
        },
      ],
    };
  }

  const { tiles, filters } = parsed.data;
  const tilesWithId = tiles as ExternalDashboardTileWithId[];

  const [missingSources, missingConnections] = await Promise.all([
    getMissingSources(teamId, tilesWithId, filters),
    getMissingConnections(teamId, tilesWithId),
  ]);
  if (missingSources.length > 0) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Could not find source IDs: ${missingSources.join(', ')}`,
        },
      ],
    };
  }
  if (missingConnections.length > 0) {
    return {
      isError: true,
      content: [
        {
          type: 'text' as const,
          text: `Could not find connection IDs: ${missingConnections.join(', ')}`,
        },
      ],
    };
  }

  const existingDashboard = await Dashboard.findOne(
    { _id: dashboardId, team: teamId },
    { tiles: 1, filters: 1 },
  ).lean();

  if (!existingDashboard) {
    return {
      isError: true,
      content: [{ type: 'text' as const, text: 'Dashboard not found' }],
    };
  }

  const existingTileIds = new Set(
    (existingDashboard.tiles ?? []).map((t: { id: string }) => t.id),
  );
  const existingFilterIds = new Set(
    (existingDashboard.filters ?? []).map((f: { id: string }) => f.id),
  );

  const internalTiles = convertExternalTilesToInternal(
    tilesWithId,
    existingTileIds,
  );

  const setPayload: Record<string, unknown> = {
    name,
    tiles: internalTiles,
    tags: tags && uniq(tags),
  };

  if (filters !== undefined) {
    setPayload.filters = convertExternalFiltersToInternal(
      filters,
      existingFilterIds,
    );
  }

  const normalizedSavedQueryLanguage = resolveSavedQueryLanguage({
    savedQuery: undefined,
    savedQueryLanguage: undefined,
  });
  if (normalizedSavedQueryLanguage !== undefined) {
    setPayload.savedQueryLanguage = normalizedSavedQueryLanguage;
  }

  if (parsed.data.savedFilterValues !== undefined) {
    setPayload.savedFilterValues = parsed.data.savedFilterValues;
  }

  const updatedDashboard = await Dashboard.findOneAndUpdate(
    { _id: dashboardId, team: teamId },
    { $set: setPayload },
    { new: true },
  );

  if (!updatedDashboard) {
    return {
      isError: true,
      content: [{ type: 'text' as const, text: 'Dashboard not found' }],
    };
  }

  await cleanupDashboardAlerts({
    dashboardId,
    teamId,
    internalTiles,
    existingTileIds,
  });

  return {
    content: [
      {
        type: 'text' as const,
        text: JSON.stringify(
          {
            ...convertToExternalDashboard(updatedDashboard),
            ...(frontendUrl
              ? { url: `${frontendUrl}/dashboards/${updatedDashboard._id}` }
              : {}),
            hint: 'Use hyperdx_query to test individual tile queries before viewing the dashboard.',
          },
          null,
          2,
        ),
      },
    ],
  };
}
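Both the create and update paths validate tile references the same way: collect the source IDs the tiles use, compare against the team's known sources, and fail fast with a list of what's missing. A simplified sketch of that check; the real `getMissingSources` also takes filters and queries MongoDB, while here the known-ID set is passed in directly:

```typescript
// Simplified sketch of the missing-source check performed by
// hyperdx_save_dashboard before writing anything. Assumption: the real
// getMissingSources resolves known IDs from MongoDB instead of a Set.
type TileRef = { config: { sourceId?: string } };

function findMissingSources(tiles: TileRef[], knownIds: Set<string>): string[] {
  const referenced = new Set(
    tiles
      .map(t => t.config.sourceId)
      .filter((id): id is string => id !== undefined),
  );
  return [...referenced].filter(id => !knownIds.has(id));
}

const missing = findMissingSources(
  [{ config: { sourceId: 'src-1' } }, { config: { sourceId: 'src-2' } }],
  new Set(['src-1']),
);
```

Failing before any write keeps a bad tile definition from producing a dashboard that renders empty charts.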
packages/api/src/mcp/tools/dashboards/schemas.ts (new file, 320 lines)

import {
  AggregateFunctionSchema,
  SearchConditionLanguageSchema,
} from '@hyperdx/common-utils/dist/types';
import { z } from 'zod';

import { externalQuantileLevelSchema } from '@/utils/zod';

// ─── Shared tile schemas for MCP dashboard tools ─────────────────────────────
const mcpTileSelectItemSchema = z
  .object({
    aggFn: AggregateFunctionSchema.describe(
      'Aggregation function. "count" requires no valueExpression; all others do.',
    ),
    valueExpression: z
      .string()
      .optional()
      .describe(
        'Column or expression to aggregate. Required for all aggFn except "count". ' +
          'Use PascalCase for top-level columns (e.g. "Duration", "StatusCode"). ' +
          "For span attributes use: SpanAttributes['key'] (e.g. SpanAttributes['http.method']). " +
          "For resource attributes use: ResourceAttributes['key'] (e.g. ResourceAttributes['service.name']).",
      ),
    where: z
      .string()
      .optional()
      .default('')
      .describe('Filter in Lucene syntax. Example: "level:error"'),
    whereLanguage: SearchConditionLanguageSchema.optional().default('lucene'),
    alias: z.string().optional().describe('Display label for this series'),
    level: externalQuantileLevelSchema
      .optional()
      .describe('Percentile level for aggFn="quantile"'),
  })
  .superRefine((data, ctx) => {
    if (data.level && data.aggFn !== 'quantile') {
      ctx.addIssue({
        code: z.ZodIssueCode.custom,
        message: 'Level can only be used with quantile aggregation function',
      });
    }
    if (data.valueExpression && data.aggFn === 'count') {
      ctx.addIssue({
        code: z.ZodIssueCode.custom,
        message:
          'Value expression cannot be used with count aggregation function',
      });
    } else if (!data.valueExpression && data.aggFn !== 'count') {
      ctx.addIssue({
        code: z.ZodIssueCode.custom,
        message:
          'Value expression is required for non-count aggregation functions',
      });
    }
  });

const mcpTileLayoutSchema = z.object({
  name: z.string().describe('Tile title shown on the dashboard'),
  x: z
    .number()
    .min(0)
    .max(23)
    .optional()
    .default(0)
    .describe('Horizontal grid position (0–23). Default 0'),
  y: z
    .number()
    .min(0)
    .optional()
    .default(0)
    .describe('Vertical grid position. Default 0'),
  w: z
    .number()
    .min(1)
    .max(24)
    .optional()
    .default(12)
    .describe('Width in grid columns (1–24). Default 12'),
  h: z
    .number()
    .min(1)
    .optional()
    .default(4)
    .describe('Height in grid rows. Default 4'),
  id: z
    .string()
    .max(36)
    .optional()
    .describe('Tile ID (auto-generated if omitted)'),
});

const mcpLineTileSchema = mcpTileLayoutSchema.extend({
  config: z.object({
    displayType: z.literal('line').describe('Line chart over time'),
    sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
    select: z
      .array(mcpTileSelectItemSchema)
      .min(1)
      .max(20)
      .describe('Metrics to plot (one series per item)'),
    groupBy: z
      .string()
      .optional()
      .describe(
        'Column to split/group by. ' +
          'Top-level columns use PascalCase (e.g. "SpanName", "StatusCode"). ' +
          "Span attributes: SpanAttributes['key'] (e.g. SpanAttributes['http.method']). " +
          "Resource attributes: ResourceAttributes['key'] (e.g. ResourceAttributes['service.name']).",
      ),
    fillNulls: z.boolean().optional().default(true),
    alignDateRangeToGranularity: z.boolean().optional(),
    asRatio: z
      .boolean()
      .optional()
      .describe(
        'Plot as ratio of two metrics (requires exactly 2 select items)',
      ),
  }),
});

const mcpBarTileSchema = mcpTileLayoutSchema.extend({
  config: z.object({
    displayType: z
      .literal('stacked_bar')
      .describe('Stacked bar chart over time'),
    sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
    select: z.array(mcpTileSelectItemSchema).min(1).max(20),
    groupBy: z.string().optional(),
    fillNulls: z.boolean().optional().default(true),
    alignDateRangeToGranularity: z.boolean().optional(),
    asRatio: z.boolean().optional(),
  }),
});

const mcpTableTileSchema = mcpTileLayoutSchema.extend({
  config: z.object({
    displayType: z.literal('table').describe('Tabular aggregated data'),
    sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
    select: z.array(mcpTileSelectItemSchema).min(1).max(20),
    groupBy: z
      .string()
      .optional()
      .describe(
        'Group rows by this column. Use PascalCase for top-level columns (e.g. "SpanName"). ' +
          "For attributes: SpanAttributes['key'] or ResourceAttributes['key'].",
      ),
    orderBy: z.string().optional().describe('Sort results by this column'),
    asRatio: z.boolean().optional(),
  }),
});

const mcpNumberFormatSchema = z
  .object({
    output: z
      .enum(['currency', 'percent', 'byte', 'time', 'number'])
      .describe(
        'Format category. "time" auto-formats durations (use factor for input unit). ' +
          '"byte" formats as KB/MB/GB. "currency" prepends a symbol. "percent" appends %.',
      ),
    mantissa: z
      .number()
      .int()
      .optional()
      .describe('Decimal places (0–10). Not used for "time" output.'),
    thousandSeparated: z
      .boolean()
      .optional()
      .describe('Separate thousands (e.g. 1,234,567)'),
    average: z
      .boolean()
      .optional()
      .describe('Abbreviate large numbers (e.g. 1.2m)'),
    decimalBytes: z
      .boolean()
      .optional()
      .describe(
        'Use decimal base for bytes (1KB = 1000). Only for "byte" output.',
      ),
    factor: z
      .number()
      .optional()
      .describe(
        'Input unit factor for "time" output. ' +
          '1 = seconds, 0.001 = milliseconds, 0.000001 = microseconds, 0.000000001 = nanoseconds.',
      ),
    currencySymbol: z
      .string()
      .optional()
      .describe('Currency symbol (e.g. "$"). Only for "currency" output.'),
    unit: z
      .string()
      .optional()
      .describe('Suffix appended to the value (e.g. " req/s")'),
  })
  .describe(
    'Controls how the number value is formatted for display. ' +
      'Most useful: { output: "time", factor: 0.000000001 } to auto-format nanosecond durations, ' +
      'or { output: "number", mantissa: 2, thousandSeparated: true } for clean counts.',
  );

const mcpNumberTileSchema = mcpTileLayoutSchema.extend({
  config: z.object({
    displayType: z.literal('number').describe('Single aggregate scalar value'),
    sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
    select: z
      .array(mcpTileSelectItemSchema)
      .length(1)
      .describe('Exactly one metric to display'),
    numberFormat: mcpNumberFormatSchema
      .optional()
      .describe(
        'Display formatting for the number value. Example: { output: "time", factor: 0.000000001 } ' +
          'to auto-format nanosecond durations as human-readable time.',
      ),
  }),
});

const mcpPieTileSchema = mcpTileLayoutSchema.extend({
  config: z.object({
    displayType: z.literal('pie').describe('Pie chart'),
    sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
    select: z.array(mcpTileSelectItemSchema).length(1),
    groupBy: z
      .string()
      .optional()
      .describe(
        'Column that defines pie slices. Use PascalCase for top-level columns. ' +
          "For attributes: SpanAttributes['key'] or ResourceAttributes['key'].",
      ),
  }),
});

const mcpSearchTileSchema = mcpTileLayoutSchema.extend({
|
||||||
|
config: z.object({
|
||||||
|
displayType: z.literal('search').describe('Log/event search results list'),
|
||||||
|
sourceId: z.string().describe('Source ID – call hyperdx_list_sources'),
|
||||||
|
where: z
|
||||||
|
.string()
|
||||||
|
.optional()
|
||||||
|
.default('')
|
||||||
|
.describe('Filter in Lucene syntax. Example: "level:error"'),
|
||||||
|
whereLanguage: SearchConditionLanguageSchema.optional().default('lucene'),
|
||||||
|
select: z
|
||||||
|
.string()
|
||||||
|
.optional()
|
||||||
|
.default('')
|
||||||
|
.describe(
|
||||||
|
'Columns to display (empty = defaults). Example: "body,service.name,duration"',
|
||||||
|
),
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
const mcpMarkdownTileSchema = mcpTileLayoutSchema.extend({
|
||||||
|
config: z.object({
|
||||||
|
displayType: z.literal('markdown').describe('Free-form Markdown text tile'),
|
||||||
|
markdown: z.string().optional().default(''),
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
const mcpSqlTileSchema = mcpTileLayoutSchema.extend({
|
||||||
|
config: z.object({
|
||||||
|
configType: z
|
||||||
|
.literal('sql')
|
||||||
|
.describe(
|
||||||
|
'Must be "sql" for raw SQL tiles. ' +
|
||||||
|
'ADVANCED: Only use raw SQL tiles when the builder tile types cannot express the query you need.',
|
||||||
|
),
|
||||||
|
displayType: z
|
||||||
|
.enum(['line', 'stacked_bar', 'table', 'number', 'pie'])
|
||||||
|
.describe('How to render the SQL results'),
|
||||||
|
connectionId: z
|
||||||
|
.string()
|
||||||
|
.describe(
|
||||||
|
'Connection ID (not sourceId) – call hyperdx_list_sources to find available connections',
|
||||||
|
),
|
||||||
|
sqlTemplate: z
|
||||||
|
.string()
|
||||||
|
.describe(
|
||||||
|
'Raw ClickHouse SQL query. Always include a LIMIT clause to avoid excessive data.\n' +
|
||||||
|
'Use query parameters: {startDateMilliseconds:Int64}, {endDateMilliseconds:Int64}, ' +
|
||||||
|
'{intervalSeconds:Int64}, {intervalMilliseconds:Int64}.\n' +
|
||||||
|
'Or use macros: $__timeFilter(col), $__timeFilter_ms(col), $__dateFilter(col), ' +
|
||||||
|
'$__fromTime, $__toTime, $__fromTime_ms, $__toTime_ms, ' +
|
||||||
|
'$__timeInterval(col), $__timeInterval_ms(col), $__interval_s, $__filters.\n' +
|
||||||
|
'Example: "SELECT $__timeInterval(TimestampTime) AS ts, ServiceName, count() ' +
|
||||||
|
'FROM otel_logs WHERE $__timeFilter(TimestampTime) AND $__filters ' +
|
||||||
|
'GROUP BY ServiceName, ts ORDER BY ts"',
|
||||||
|
),
|
||||||
|
fillNulls: z.boolean().optional(),
|
||||||
|
alignDateRangeToGranularity: z.boolean().optional(),
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
const mcpTileSchema = z.union([
|
||||||
|
mcpLineTileSchema,
|
||||||
|
mcpBarTileSchema,
|
||||||
|
mcpTableTileSchema,
|
||||||
|
mcpNumberTileSchema,
|
||||||
|
mcpPieTileSchema,
|
||||||
|
mcpSearchTileSchema,
|
||||||
|
mcpMarkdownTileSchema,
|
||||||
|
mcpSqlTileSchema,
|
||||||
|
]);
|
||||||
|
|
||||||
|
export const mcpTilesParam = z
|
||||||
|
.array(mcpTileSchema)
|
||||||
|
.describe(
|
||||||
|
'Array of dashboard tiles. Each tile needs a name, optional layout (x/y/w/h), and a config block. ' +
|
||||||
|
'The config block varies by displayType – use hyperdx_list_sources for sourceId and connectionId values.\n\n' +
|
||||||
|
'Example tiles:\n' +
|
||||||
|
'1. Line chart: { "name": "Error Rate", "config": { "displayType": "line", "sourceId": "<from list_sources>", ' +
|
||||||
|
'"groupBy": "ResourceAttributes[\'service.name\']", "select": [{ "aggFn": "count", "where": "StatusCode:STATUS_CODE_ERROR" }] } }\n' +
|
||||||
|
'2. Table: { "name": "Top Endpoints", "config": { "displayType": "table", "sourceId": "<from list_sources>", ' +
|
||||||
|
'"groupBy": "SpanAttributes[\'http.route\']", "select": [{ "aggFn": "count" }, { "aggFn": "avg", "valueExpression": "Duration" }] } }\n' +
|
||||||
|
'3. Number: { "name": "Total Requests", "config": { "displayType": "number", "sourceId": "<from list_sources>", ' +
|
||||||
|
'"select": [{ "aggFn": "count" }], "numberFormat": { "output": "number", "average": true } } }\n' +
|
||||||
|
'4. Number (duration): { "name": "P95 Latency", "config": { "displayType": "number", "sourceId": "<from list_sources>", ' +
|
||||||
|
'"select": [{ "aggFn": "quantile", "level": 0.95, "valueExpression": "Duration" }], ' +
|
||||||
|
'"numberFormat": { "output": "time", "factor": 0.000000001 } } }',
|
||||||
|
);
|
||||||
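A minimal sketch of the `factor` semantics documented on `mcpNumberFormatSchema` for "time" output. The `scaleToSeconds` helper is hypothetical and only illustrates the documented contract; it is not HyperDX's actual formatter.

```typescript
// Illustrative only: mirrors the documented `factor` semantics of the
// "time" numberFormat output. factor converts the stored unit to seconds:
// 1 = seconds, 0.001 = milliseconds, 0.000001 = microseconds,
// 0.000000001 = nanoseconds.
function scaleToSeconds(value: number, factor: number): number {
  return value * factor;
}

// A 1.5-second span duration stored as nanoseconds:
const seconds = scaleToSeconds(1_500_000_000, 0.000000001);
console.log(seconds); // 1.5
```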
264  packages/api/src/mcp/tools/query/helpers.ts  Normal file
@@ -0,0 +1,264 @@
import { ClickhouseClient } from '@hyperdx/common-utils/dist/clickhouse/node';
import { getMetadata } from '@hyperdx/common-utils/dist/core/metadata';
import { getFirstTimestampValueExpression } from '@hyperdx/common-utils/dist/core/utils';
import { isRawSqlSavedChartConfig } from '@hyperdx/common-utils/dist/guards';
import type {
  ChartConfigWithDateRange,
  MetricTable,
} from '@hyperdx/common-utils/dist/types';
import { DisplayType, SourceKind } from '@hyperdx/common-utils/dist/types';
import ms from 'ms';

import { getConnectionById } from '@/controllers/connection';
import { getSource } from '@/controllers/sources';
import {
  convertToInternalTileConfig,
  isConfigTile,
} from '@/routers/external-api/v2/utils/dashboards';
import { trimToolResponse } from '@/utils/trimToolResponse';
import type { ExternalDashboardTileWithId } from '@/utils/zod';

// ─── Time range ──────────────────────────────────────────────────────────────

export function parseTimeRange(
  startTime?: string,
  endTime?: string,
): { error: string } | { startDate: Date; endDate: Date } {
  const endDate = endTime ? new Date(endTime) : new Date();
  const startDate = startTime
    ? new Date(startTime)
    : new Date(endDate.getTime() - ms('15m'));
  if (isNaN(endDate.getTime()) || isNaN(startDate.getTime())) {
    return {
      error: 'Invalid startTime or endTime: must be valid ISO 8601 strings',
    };
  }
  return { startDate, endDate };
}

// ─── Result helpers ──────────────────────────────────────────────────────────

function isEmptyResult(result: unknown): boolean {
  if (result == null) return true;
  if (Array.isArray(result)) return result.length === 0;
  if (typeof result === 'object' && result !== null) {
    const obj = result as Record<string, unknown>;
    if (Array.isArray(obj.data) && obj.data.length === 0) return true;
    if (obj.rows != null && Number(obj.rows) === 0) return true;
  }
  return false;
}

function formatQueryResult(result: unknown) {
  const trimmedResult = trimToolResponse(result);
  const isTrimmed =
    JSON.stringify(trimmedResult).length < JSON.stringify(result).length;
  const empty = isEmptyResult(result);
  return {
    content: [
      {
        type: 'text' as const,
        text: JSON.stringify(
          {
            result: trimmedResult,
            ...(isTrimmed
              ? {
                  note: 'Result was trimmed for context size. Narrow the time range or add filters to reduce data.',
                }
              : {}),
            ...(empty
              ? {
                  hint: 'No data found in the queried time range. Try setting startTime to a wider window (e.g. 24 hours ago) or check that filters match existing data.',
                }
              : {}),
          },
          null,
          2,
        ),
      },
    ],
  };
}

// ─── Tile execution ──────────────────────────────────────────────────────────

export async function runConfigTile(
  teamId: string,
  tile: ExternalDashboardTileWithId,
  startDate: Date,
  endDate: Date,
  options?: { maxResults?: number },
) {
  if (!isConfigTile(tile)) {
    return {
      isError: true as const,
      content: [
        { type: 'text' as const, text: 'Invalid tile: config field missing' },
      ],
    };
  }

  const internalTile = convertToInternalTileConfig(tile);
  const savedConfig = internalTile.config;

  if (!isRawSqlSavedChartConfig(savedConfig)) {
    const builderConfig = savedConfig;

    if (
      !builderConfig.source ||
      builderConfig.displayType === DisplayType.Markdown
    ) {
      return {
        content: [
          {
            type: 'text' as const,
            text: 'Markdown tile: no query to execute.',
          },
        ],
      };
    }

    const source = await getSource(teamId, builderConfig.source);
    if (!source) {
      return {
        isError: true as const,
        content: [
          {
            type: 'text' as const,
            text: `Source not found: ${builderConfig.source}`,
          },
        ],
      };
    }

    const connection = await getConnectionById(
      teamId,
      source.connection.toString(),
      true,
    );
    if (!connection) {
      return {
        isError: true as const,
        content: [
          {
            type: 'text' as const,
            text: `Connection not found for source: ${builderConfig.source}`,
          },
        ],
      };
    }

    const clickhouseClient = new ClickhouseClient({
      host: connection.host,
      username: connection.username,
      password: connection.password,
    });

    const isSearch = builderConfig.displayType === DisplayType.Search;
    const defaultTableSelect =
      'defaultTableSelectExpression' in source
        ? source.defaultTableSelectExpression
        : undefined;
    const implicitColumn =
      'implicitColumnExpression' in source
        ? source.implicitColumnExpression
        : undefined;
    const searchOverrides = isSearch
      ? {
          select: builderConfig.select || defaultTableSelect || '*',
          groupBy: undefined,
          granularity: undefined,
          orderBy: [
            {
              ordering: 'DESC' as const,
              valueExpression: getFirstTimestampValueExpression(
                source.timestampValueExpression,
              ),
            },
          ],
          limit: { limit: options?.maxResults ?? 50, offset: 0 },
        }
      : {};

    const chartConfig = {
      ...builderConfig,
      ...searchOverrides,
      from: {
        databaseName: source.from.databaseName,
        tableName: source.from.tableName,
      },
      connection: source.connection.toString(),
      timestampValueExpression: source.timestampValueExpression,
      implicitColumnExpression: implicitColumn,
      dateRange: [startDate, endDate] as [Date, Date],
    } satisfies ChartConfigWithDateRange;

    const metadata = getMetadata(clickhouseClient);
    const result = await clickhouseClient.queryChartConfig({
      config: chartConfig,
      metadata,
      querySettings: source.querySettings,
    });

    return formatQueryResult(result);
  }

  // Raw SQL tile — hydrate source fields for macro support ($__sourceTable, $__filters)
  let sourceFields: {
    from?: { databaseName: string; tableName: string };
    implicitColumnExpression?: string;
    metricTables?: MetricTable;
  } = {};
  if (savedConfig.source) {
    const source = await getSource(teamId, savedConfig.source);
    if (source) {
      sourceFields = {
        from: source.from,
        implicitColumnExpression:
          'implicitColumnExpression' in source
            ? source.implicitColumnExpression
            : undefined,
        metricTables:
          source.kind === SourceKind.Metric ? source.metricTables : undefined,
      };
    }
  }

  const connection = await getConnectionById(
    teamId,
    savedConfig.connection,
    true,
  );
  if (!connection) {
    return {
      isError: true as const,
      content: [
        {
          type: 'text' as const,
          text: `Connection not found: ${savedConfig.connection}`,
        },
      ],
    };
  }

  const clickhouseClient = new ClickhouseClient({
    host: connection.host,
    username: connection.username,
    password: connection.password,
  });

  const chartConfig = {
    ...savedConfig,
    ...sourceFields,
    dateRange: [startDate, endDate] as [Date, Date],
  } satisfies ChartConfigWithDateRange;

  const metadata = getMetadata(clickhouseClient);
  const result = await clickhouseClient.queryChartConfig({
    config: chartConfig,
    metadata,
    querySettings: undefined,
  });

  return formatQueryResult(result);
}
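The `isTrimmed` flag in `formatQueryResult` is detected by comparing serialized lengths before and after trimming. A toy sketch of that check, using a hypothetical stand-in trimmer since `trimToolResponse` is defined elsewhere in this PR:

```typescript
// Hypothetical stand-in for trimToolResponse: keep at most `max` array rows.
function trimRows<T>(rows: T[], max = 2): T[] {
  return rows.length > max ? rows.slice(0, max) : rows;
}

const result = [{ n: 1 }, { n: 2 }, { n: 3 }];
const trimmed = trimRows(result);

// Same length comparison the helper uses to decide whether to attach the
// "Result was trimmed for context size" note.
const isTrimmed =
  JSON.stringify(trimmed).length < JSON.stringify(result).length;
console.log(isTrimmed); // true
```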
117  packages/api/src/mcp/tools/query/index.ts  Normal file
@@ -0,0 +1,117 @@
import { ObjectId } from 'mongodb';

import type { ExternalDashboardTileWithId } from '@/utils/zod';
import { externalDashboardTileSchemaWithId } from '@/utils/zod';

import { withToolTracing } from '../../utils/tracing';
import type { ToolDefinition } from '../types';
import { parseTimeRange, runConfigTile } from './helpers';
import { hyperdxQuerySchema } from './schemas';

// ─── Tool definition ─────────────────────────────────────────────────────────

const queryTools: ToolDefinition = (server, context) => {
  const { teamId } = context;

  server.registerTool(
    'hyperdx_query',
    {
      title: 'Query Data',
      description:
        'Query observability data (logs, metrics, traces) from HyperDX. ' +
        'Use hyperdx_list_sources first to find sourceId/connectionId values. ' +
        'Set displayType to control the query shape.\n\n' +
        'PREFERRED: Use the builder display types (line, stacked_bar, table, number, pie) ' +
        'for aggregated metrics, or "search" for browsing individual log/event rows. ' +
        'These are safer, easier to construct, and cover most use cases.\n\n' +
        'ADVANCED: Use displayType "sql" only when you need capabilities the builder cannot express, ' +
        'such as JOINs, sub-queries, CTEs, or querying tables not registered as sources. ' +
        'Raw SQL requires a connectionId (not sourceId) and a hand-written ClickHouse SQL query.\n\n' +
        'Column naming: Top-level columns are PascalCase (Duration, StatusCode, SpanName). ' +
        "Map attributes use bracket syntax: SpanAttributes['http.method'], ResourceAttributes['service.name']. " +
        'Call hyperdx_list_sources to discover available columns and attribute keys for each source.',
      inputSchema: hyperdxQuerySchema,
    },
    withToolTracing('hyperdx_query', context, async input => {
      const timeRange = parseTimeRange(input.startTime, input.endTime);
      if ('error' in timeRange) {
        return {
          isError: true,
          content: [{ type: 'text' as const, text: timeRange.error }],
        };
      }
      const { startDate, endDate } = timeRange;

      let tile: ExternalDashboardTileWithId;

      if (input.displayType === 'sql') {
        tile = externalDashboardTileSchemaWithId.parse({
          id: new ObjectId().toString(),
          name: 'MCP SQL',
          x: 0,
          y: 0,
          w: 24,
          h: 6,
          config: {
            configType: 'sql' as const,
            displayType: 'table' as const,
            connectionId: input.connectionId,
            sqlTemplate: input.sql,
          },
        });
      } else if (input.displayType === 'search') {
        tile = externalDashboardTileSchemaWithId.parse({
          id: new ObjectId().toString(),
          name: 'MCP Search',
          x: 0,
          y: 0,
          w: 24,
          h: 6,
          config: {
            displayType: 'search' as const,
            sourceId: input.sourceId,
            select: input.columns ?? '',
            where: input.where ?? '',
            whereLanguage: input.whereLanguage ?? 'lucene',
          },
        });
      } else {
        tile = externalDashboardTileSchemaWithId.parse({
          id: new ObjectId().toString(),
          name: 'MCP Query',
          x: 0,
          y: 0,
          w: 12,
          h: 4,
          config: {
            displayType: input.displayType,
            sourceId: input.sourceId,
            select: input.select.map(s => ({
              aggFn: s.aggFn,
              where: s.where ?? '',
              whereLanguage: s.whereLanguage ?? 'lucene',
              valueExpression: s.valueExpression,
              alias: s.alias,
              level: s.level,
            })),
            groupBy: input.groupBy ?? undefined,
            orderBy: input.orderBy ?? undefined,
            ...(input.granularity ? { granularity: input.granularity } : {}),
          },
        });
      }

      return runConfigTile(
        teamId.toString(),
        tile,
        startDate,
        endDate,
        input.displayType === 'search'
          ? { maxResults: input.maxResults }
          : undefined,
      );
    }),
  );
};

export default queryTools;
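The handler above branches on `input.displayType`, which the schema declares as a discriminated union, so each branch sees only the fields valid for that variant (e.g. `connectionId`/`sql` only in the `'sql'` branch). A minimal sketch of that narrowing with simplified types (the field names follow this PR's schemas, but the union here is illustrative, not the real `hyperdxQuerySchema` output type):

```typescript
// Simplified discriminated union keyed on displayType.
type QueryInput =
  | { displayType: 'sql'; connectionId: string; sql: string }
  | { displayType: 'search'; sourceId: string; where?: string }
  | { displayType: 'line' | 'table'; sourceId: string };

function describeInput(input: QueryInput): string {
  if (input.displayType === 'sql') {
    // TypeScript narrows to the sql variant: connectionId/sql are in scope.
    return `raw SQL on connection ${input.connectionId}`;
  }
  if (input.displayType === 'search') {
    return `search on source ${input.sourceId}`;
  }
  return `${input.displayType} chart on source ${input.sourceId}`;
}

console.log(
  describeInput({ displayType: 'sql', connectionId: 'c1', sql: 'SELECT 1' }),
); // raw SQL on connection c1
```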
220  packages/api/src/mcp/tools/query/schemas.ts  Normal file
@@ -0,0 +1,220 @@
import { z } from 'zod';

// ─── Shared schemas ──────────────────────────────────────────────────────────

const mcpAggFnSchema = z
  .enum([
    'avg',
    'count',
    'count_distinct',
    'last_value',
    'max',
    'min',
    'quantile',
    'sum',
    'none',
  ])
  .describe(
    'Aggregation function:\n' +
      '  count – count matching rows (no valueExpression needed)\n' +
      '  sum / avg / min / max – aggregate a numeric column (valueExpression required)\n' +
      '  count_distinct – unique value count (valueExpression required)\n' +
      '  quantile – percentile; also set level (valueExpression required)\n' +
      '  last_value – most recent value of a column\n' +
      '  none – pass a raw expression through unchanged',
  );

const mcpSelectItemSchema = z.object({
  aggFn: mcpAggFnSchema,
  valueExpression: z
    .string()
    .optional()
    .describe(
      'Column or expression to aggregate. Required for every aggFn except "count". ' +
        'Use PascalCase for top-level columns (e.g. "Duration", "StatusCode"). ' +
        "For span attributes use: SpanAttributes['key'] (e.g. SpanAttributes['http.method']). " +
        "For resource attributes use: ResourceAttributes['key'] (e.g. ResourceAttributes['service.name']).",
    ),
  where: z
    .string()
    .optional()
    .default('')
    .describe(
      'Row filter in Lucene syntax. ' +
        'Examples: "level:error", "service.name:api AND http.status_code:>=500"',
    ),
  whereLanguage: z
    .enum(['lucene', 'sql'])
    .optional()
    .default('lucene')
    .describe('Query language for the where filter. Default: lucene'),
  alias: z
    .string()
    .optional()
    .describe('Display label for this series. Example: "Error rate"'),
  level: z
    .union([z.literal(0.5), z.literal(0.9), z.literal(0.95), z.literal(0.99)])
    .optional()
    .describe(
      'Percentile level. Only applicable when aggFn is "quantile". ' +
        'Allowed values: 0.5, 0.9, 0.95, 0.99',
    ),
});

const mcpTimeRangeSchema = z.object({
  startTime: z
    .string()
    .optional()
    .describe(
      'Start of the query window as ISO 8601. Default: 15 minutes ago. ' +
        'If results are empty, try a wider range (e.g. 24 hours).',
    ),
  endTime: z
    .string()
    .optional()
    .describe('End of the query window as ISO 8601. Default: now.'),
});

// ─── Discriminated union schema for hyperdx_query ───────────────────────────

const builderQuerySchema = mcpTimeRangeSchema.extend({
  displayType: z
    .enum(['line', 'stacked_bar', 'table', 'number', 'pie'])
    .describe(
      'How to visualize the query results:\n' +
        '  line – time-series line chart\n' +
        '  stacked_bar – time-series stacked bar chart\n' +
        '  table – grouped aggregation as rows\n' +
        '  number – single aggregate scalar\n' +
        '  pie – pie chart (one metric, grouped)',
    ),
  sourceId: z
    .string()
    .describe(
      'Source ID. Call hyperdx_list_sources to find available sources.',
    ),
  select: z
    .array(mcpSelectItemSchema)
    .min(1)
    .max(10)
    .describe(
      'Metrics to compute. Each item defines an aggregation. ' +
        'For "number" display, provide exactly 1 item. ' +
        'Example: [{ aggFn: "count" }, { aggFn: "avg", valueExpression: "Duration" }]',
    ),
  groupBy: z
    .string()
    .optional()
    .describe(
      'Column to group/split by. ' +
        'Top-level columns use PascalCase (e.g. "SpanName", "StatusCode"). ' +
        "Span attributes: SpanAttributes['key'] (e.g. SpanAttributes['http.method']). " +
        "Resource attributes: ResourceAttributes['key'] (e.g. ResourceAttributes['service.name']).",
    ),
  orderBy: z
    .string()
    .optional()
    .describe('Column to sort results by (table display only).'),
  granularity: z
    .string()
    .optional()
    .describe(
      'Time bucket size for time-series charts (line, stacked_bar). ' +
        'Format: "<number> <unit>" where unit is second, minute, hour, or day. ' +
        'Examples: "1 minute", "5 minute", "1 hour", "1 day". ' +
        'Omit to let HyperDX pick automatically based on the time range.',
    ),
});

const searchQuerySchema = mcpTimeRangeSchema.extend({
  displayType: z
    .literal('search')
    .describe('Search and filter individual log/event rows'),
  sourceId: z
    .string()
    .describe(
      'Source ID. Call hyperdx_list_sources to find available sources.',
    ),
  where: z
    .string()
    .optional()
    .default('')
    .describe(
      'Row filter. Examples: "level:error", "service.name:api AND duration:>500"',
    ),
  whereLanguage: z
    .enum(['lucene', 'sql'])
    .optional()
    .default('lucene')
    .describe('Query language for the where filter. Default: lucene'),
  columns: z
    .string()
    .optional()
    .default('')
    .describe(
      'Comma-separated columns to include. Leave empty for defaults. ' +
        'Example: "body,service.name,duration"',
    ),
  maxResults: z
    .number()
    .min(1)
    .max(200)
    .optional()
    .default(50)
    .describe(
      'Maximum number of rows to return (1–200). Default: 50. ' +
        'Use smaller values to reduce response size.',
    ),
});

const sqlQuerySchema = mcpTimeRangeSchema.extend({
  displayType: z
    .literal('sql')
    .describe(
      'ADVANCED: Execute raw SQL directly against ClickHouse. ' +
        'Only use this when the builder query types (line, stacked_bar, table, number, pie, search) ' +
        'cannot express the query you need — e.g. complex JOINs, sub-queries, CTEs, or ' +
        'querying tables not registered as sources. ' +
        'Prefer the builder display types for standard queries as they are safer and easier to use.',
    ),
  connectionId: z
    .string()
    .describe(
      'Connection ID (not sourceId). Call hyperdx_list_sources to find available connections.',
    ),
  sql: z
    .string()
    .describe(
      'Raw ClickHouse SQL query to execute. ' +
        'Always include a LIMIT clause to avoid returning excessive data.\n\n' +
        'QUERY PARAMETERS (ClickHouse native parameterized syntax):\n' +
        '  {startDateMilliseconds:Int64} — start of date range in ms since epoch\n' +
        '  {endDateMilliseconds:Int64} — end of date range in ms since epoch\n' +
        '  {intervalSeconds:Int64} — time bucket size in seconds (time-series only)\n' +
        '  {intervalMilliseconds:Int64} — time bucket size in milliseconds (time-series only)\n\n' +
        'MACROS (expanded before execution):\n' +
        '  $__timeFilter(column) — expands to: column >= <start> AND column <= <end> (DateTime precision)\n' +
        '  $__timeFilter_ms(column) — same but with DateTime64 millisecond precision\n' +
        '  $__dateFilter(column) — same but with Date precision\n' +
        '  $__dateTimeFilter(dateCol, timeCol) — filters on both a Date and DateTime column\n' +
        '  $__dt(dateCol, timeCol) — alias for $__dateTimeFilter\n' +
        '  $__fromTime / $__toTime — start/end as DateTime values\n' +
        '  $__fromTime_ms / $__toTime_ms — start/end as DateTime64 values\n' +
        '  $__timeInterval(column) — time bucket expression: toStartOfInterval(toDateTime(column), INTERVAL ...)\n' +
        '  $__timeInterval_ms(column) — same with millisecond precision\n' +
        '  $__interval_s — raw interval in seconds\n' +
        '  $__filters — placeholder for dashboard filter conditions (resolves to 1=1 when no filters)\n\n' +
        'Example (time-series): "SELECT $__timeInterval(TimestampTime) AS ts, ServiceName, count() ' +
        'FROM otel_logs WHERE $__timeFilter(TimestampTime) GROUP BY ServiceName, ts ORDER BY ts"\n\n' +
        'Example (table): "SELECT ServiceName, count() AS n FROM otel_logs ' +
        'WHERE TimestampTime >= fromUnixTimestamp64Milli({startDateMilliseconds:Int64}) ' +
        'AND TimestampTime < fromUnixTimestamp64Milli({endDateMilliseconds:Int64}) ' +
        'GROUP BY ServiceName ORDER BY n DESC LIMIT 20"',
    ),
});

export const hyperdxQuerySchema = z.discriminatedUnion('displayType', [
  builderQuerySchema,
  searchQuerySchema,
  sqlQuerySchema,
]);
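The `granularity` field documents a `"<number> <unit>"` format. A hypothetical parser illustrating that contract (HyperDX's actual granularity handling lives elsewhere; `granularityToSeconds` is not part of this PR):

```typescript
// Hypothetical: parse the documented granularity format "<number> <unit>"
// into seconds. Returns undefined for strings that don't match the contract.
function granularityToSeconds(granularity: string): number | undefined {
  const match = granularity.match(/^(\d+)\s+(second|minute|hour|day)s?$/);
  if (!match) return undefined;
  const unitSeconds: Record<string, number> = {
    second: 1,
    minute: 60,
    hour: 3600,
    day: 86400,
  };
  return Number(match[1]) * unitSeconds[match[2]];
}

console.log(granularityToSeconds('5 minute')); // 300
console.log(granularityToSeconds('1 hour')); // 3600
console.log(granularityToSeconds('fortnight')); // undefined
```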
10  packages/api/src/mcp/tools/types.ts  Normal file
@@ -0,0 +1,10 @@
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

export type McpContext = {
  teamId: string;
  userId?: string;
};

export type ToolDefinition = (server: McpServer, context: McpContext) => void;

export type PromptDefinition = (server: McpServer, context: McpContext) => void;
83  packages/api/src/mcp/utils/tracing.ts  Normal file
@@ -0,0 +1,83 @@
import opentelemetry, { SpanStatusCode } from '@opentelemetry/api';

import { CODE_VERSION } from '@/config';
import logger from '@/utils/logger';

import type { McpContext } from '../tools/types';

const mcpTracer = opentelemetry.trace.getTracer('hyperdx-mcp', CODE_VERSION);

type ToolResult = {
  content: { type: 'text'; text: string }[];
  isError?: boolean;
};

/**
 * Wraps an MCP tool handler with tracing and structured logging.
 * Creates a span for each tool invocation and logs start/end with duration.
 */
export function withToolTracing<TArgs>(
  toolName: string,
  context: McpContext,
  handler: (args: TArgs) => Promise<ToolResult>,
): (args: TArgs) => Promise<ToolResult> {
  return async (args: TArgs) => {
    return mcpTracer.startActiveSpan(`mcp.tool.${toolName}`, async span => {
      const startTime = Date.now();
      const logContext = {
        tool: toolName,
        teamId: context.teamId,
        userId: context.userId,
      };

      span.setAttribute('mcp.tool.name', toolName);
      span.setAttribute('mcp.team.id', context.teamId);
      if (context.userId) {
        span.setAttribute('mcp.user.id', context.userId);
      }

      logger.info(logContext, `MCP tool invoked: ${toolName}`);

      try {
        const result = await handler(args);
        const durationMs = Date.now() - startTime;

        if (result.isError) {
          span.setStatus({ code: SpanStatusCode.ERROR });
          span.setAttribute('mcp.tool.error', true);
          logger.warn(
            { ...logContext, durationMs },
            `MCP tool error: ${toolName}`,
          );
        } else {
          span.setStatus({ code: SpanStatusCode.OK });
          logger.info(
            { ...logContext, durationMs },
            `MCP tool completed: ${toolName}`,
          );
        }

        span.setAttribute('mcp.tool.duration_ms', durationMs);
        span.end();
        return result;
      } catch (err) {
        const durationMs = Date.now() - startTime;
        span.setStatus({
          code: SpanStatusCode.ERROR,
          message: err instanceof Error ? err.message : String(err),
        });
        span.recordException(
          err instanceof Error ? err : new Error(String(err)),
        );
        span.setAttribute('mcp.tool.duration_ms', durationMs);
        span.end();

        logger.error(
          { ...logContext, durationMs, error: err },
          `MCP tool failed: ${toolName}`,
        );
        throw err;
      }
    });
  };
}
|
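The control flow of `withToolTracing` (wrap, time, set status from the result, always end the span) can be shown without the OpenTelemetry dependency. Below is a simplified, dependency-free sketch of that pattern; `FakeSpan` and the `spans` sink are stand-ins invented here for illustration, not part of the PR.

```typescript
type ToolResult = { content: { type: 'text'; text: string }[]; isError?: boolean };

// A plain object stands in for an OpenTelemetry span so the wrapper's
// control flow can be demonstrated in isolation.
type FakeSpan = {
  attrs: Record<string, unknown>;
  status?: 'OK' | 'ERROR';
  ended: boolean;
};

function withToolTracing<TArgs>(
  toolName: string,
  handler: (args: TArgs) => Promise<ToolResult>,
  spans: FakeSpan[],
): (args: TArgs) => Promise<ToolResult> {
  return async (args: TArgs) => {
    const span: FakeSpan = { attrs: { 'mcp.tool.name': toolName }, ended: false };
    spans.push(span);
    const startTime = Date.now();
    try {
      const result = await handler(args);
      // A tool can report failure via isError without throwing.
      span.status = result.isError ? 'ERROR' : 'OK';
      span.attrs['mcp.tool.duration_ms'] = Date.now() - startTime;
      span.ended = true;
      return result;
    } catch (err) {
      // Thrown errors still end the span before propagating.
      span.status = 'ERROR';
      span.attrs['mcp.tool.duration_ms'] = Date.now() - startTime;
      span.ended = true;
      throw err;
    }
  };
}

// Usage: wrap a handler, invoke it, then inspect the recorded span.
const spans: FakeSpan[] = [];
const wrapped = withToolTracing(
  'hyperdx_list_sources',
  async () => ({ content: [{ type: 'text' as const, text: 'ok' }] }),
  spans,
);
```

The key property, mirrored from the real implementation, is that the span is ended on every path: success, soft error (`isError`), and thrown exception.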
@@ -1,105 +1,30 @@
-import { displayTypeSupportsRawSqlAlerts } from '@hyperdx/common-utils/dist/core/utils';
-import { isRawSqlSavedChartConfig } from '@hyperdx/common-utils/dist/guards';
-import { SearchConditionLanguageSchema as whereLanguageSchema } from '@hyperdx/common-utils/dist/types';
 import express from 'express';
 import { uniq } from 'lodash';
-import { ObjectId } from 'mongodb';
 import mongoose from 'mongoose';
 import { z } from 'zod';

-import { deleteDashboardAlerts } from '@/controllers/alerts';
-import { getConnectionsByTeam } from '@/controllers/connection';
 import { deleteDashboard } from '@/controllers/dashboard';
 import { getSources } from '@/controllers/sources';
 import Dashboard from '@/models/dashboard';
 import { validateRequestWithEnhancedErrors as validateRequest } from '@/utils/enhancedErrors';
-import {
-  translateExternalChartToTileConfig,
-  translateExternalFilterToFilter,
-} from '@/utils/externalApi';
 import logger from '@/utils/logger';
-import {
-  ExternalDashboardFilter,
-  externalDashboardFilterSchema,
-  externalDashboardFilterSchemaWithId,
-  ExternalDashboardFilterWithId,
-  externalDashboardSavedFilterValueSchema,
-  externalDashboardTileListSchema,
-  ExternalDashboardTileWithId,
-  objectIdSchema,
-  tagsSchema,
-} from '@/utils/zod';
+import { ExternalDashboardTileWithId, objectIdSchema } from '@/utils/zod';

 import {
+  cleanupDashboardAlerts,
+  convertExternalFiltersToInternal,
+  convertExternalTilesToInternal,
   convertToExternalDashboard,
-  convertToInternalTileConfig,
+  createDashboardBodySchema,
+  getMissingConnections,
+  getMissingSources,
   isConfigTile,
   isRawSqlExternalTileConfig,
   isSeriesTile,
+  resolveSavedQueryLanguage,
+  updateDashboardBodySchema,
 } from './utils/dashboards';

-/** Returns an array of source IDs that are referenced in the tiles/filters but do not exist in the team's sources */
-async function getMissingSources(
-  team: string | mongoose.Types.ObjectId,
-  tiles: ExternalDashboardTileWithId[],
-  filters?: (ExternalDashboardFilter | ExternalDashboardFilterWithId)[],
-): Promise<string[]> {
-  const sourceIds = new Set<string>();
-
-  for (const tile of tiles) {
-    if (isSeriesTile(tile)) {
-      for (const series of tile.series) {
-        if ('sourceId' in series) {
-          sourceIds.add(series.sourceId);
-        }
-      }
-    } else if (isConfigTile(tile)) {
-      if ('sourceId' in tile.config && tile.config.sourceId) {
-        sourceIds.add(tile.config.sourceId);
-      }
-    }
-  }
-
-  if (filters?.length) {
-    for (const filter of filters) {
-      if ('sourceId' in filter) {
-        sourceIds.add(filter.sourceId);
-      }
-    }
-  }
-
-  const existingSources = await getSources(team.toString());
-  const existingSourceIds = new Set(
-    existingSources.map(source => source._id.toString()),
-  );
-  return [...sourceIds].filter(sourceId => !existingSourceIds.has(sourceId));
-}
-
-/** Returns an array of connection IDs that are referenced in the tiles but do not belong to the team */
-async function getMissingConnections(
-  team: string | mongoose.Types.ObjectId,
-  tiles: ExternalDashboardTileWithId[],
-): Promise<string[]> {
-  const connectionIds = new Set<string>();
-
-  for (const tile of tiles) {
-    if (isConfigTile(tile) && isRawSqlExternalTileConfig(tile.config)) {
-      connectionIds.add(tile.config.connectionId);
-    }
-  }
-
-  if (connectionIds.size === 0) return [];
-
-  const existingConnections = await getConnectionsByTeam(team.toString());
-  const existingConnectionIds = new Set(
-    existingConnections.map(connection => connection._id.toString()),
-  );
-
-  return [...connectionIds].filter(
-    connectionId => !existingConnectionIds.has(connectionId),
-  );
-}
-
 async function getSourceConnectionMismatches(
   team: string | mongoose.Types.ObjectId,
   tiles: ExternalDashboardTileWithId[],
@@ -124,62 +49,6 @@ async function getSourceConnectionMismatches(
   return sourcesWithInvalidConnections;
 }

-type SavedQueryLanguage = z.infer<typeof whereLanguageSchema>;
-
-function resolveSavedQueryLanguage(params: {
-  savedQuery: string | null | undefined;
-  savedQueryLanguage: SavedQueryLanguage | null | undefined;
-}): SavedQueryLanguage | null | undefined {
-  const { savedQuery, savedQueryLanguage } = params;
-  if (savedQueryLanguage !== undefined) return savedQueryLanguage;
-  if (savedQuery === null) return null;
-  if (savedQuery) return 'lucene';
-
-  return undefined;
-}
-
-const dashboardBodyBaseShape = {
-  name: z.string().max(1024),
-  tiles: externalDashboardTileListSchema,
-  tags: tagsSchema,
-  savedQuery: z.string().nullable().optional(),
-  savedQueryLanguage: whereLanguageSchema.nullable().optional(),
-  savedFilterValues: z
-    .array(externalDashboardSavedFilterValueSchema)
-    .optional(),
-};
-
-function buildDashboardBodySchema(filterSchema: z.ZodTypeAny): z.ZodEffects<
-  z.ZodObject<
-    typeof dashboardBodyBaseShape & {
-      filters: z.ZodOptional<z.ZodArray<z.ZodTypeAny>>;
-    }
-  >
-> {
-  return z
-    .object({
-      ...dashboardBodyBaseShape,
-      filters: z.array(filterSchema).optional(),
-    })
-    .superRefine((data, ctx) => {
-      if (data.savedQuery != null && data.savedQueryLanguage === null) {
-        ctx.addIssue({
-          code: z.ZodIssueCode.custom,
-          message:
-            'savedQueryLanguage cannot be null when savedQuery is provided',
-          path: ['savedQueryLanguage'],
-        });
-      }
-    });
-}
-
-const createDashboardBodySchema = buildDashboardBodySchema(
-  externalDashboardFilterSchema,
-);
-const updateDashboardBodySchema = buildDashboardBodySchema(
-  externalDashboardFilterSchemaWithId,
-);
-
 /**
  * @openapi
  * components:
@@ -1749,27 +1618,8 @@ router.post(
       });
     }

-    const internalTiles = tiles.map(tile => {
-      const tileId = new ObjectId().toString();
-      if (isConfigTile(tile)) {
-        return convertToInternalTileConfig({
-          ...tile,
-          id: tileId,
-        });
-      }
-
-      return translateExternalChartToTileConfig({
-        ...tile,
-        id: tileId,
-      });
-    });
-
-    const filtersWithIds = (filters || []).map(filter =>
-      translateExternalFilterToFilter({
-        ...filter,
-        id: new ObjectId().toString(),
-      }),
-    );
+    const internalTiles = convertExternalTilesToInternal(tiles);
+    const filtersWithIds = convertExternalFiltersToInternal(filters || []);

     const normalizedSavedQueryLanguage = resolveSavedQueryLanguage({
       savedQuery,
@@ -2002,18 +1852,10 @@ router.put(
       (existingDashboard?.filters ?? []).map((f: { id: string }) => f.id),
     );

-    // Convert external tiles to internal charts format.
-    // Generate a new id for any tile whose id doesn't match an existing tile.
-    const internalTiles = tiles.map(tile => {
-      const tileId = existingTileIds.has(tile.id)
-        ? tile.id
-        : new ObjectId().toString();
-      if (isConfigTile(tile)) {
-        return convertToInternalTileConfig({ ...tile, id: tileId });
-      }
-
-      return translateExternalChartToTileConfig({ ...tile, id: tileId });
-    });
+    const internalTiles = convertExternalTilesToInternal(
+      tiles,
+      existingTileIds,
+    );

     const setPayload: Record<string, unknown> = {
       name,
@@ -2021,13 +1863,9 @@ router.put(
       tags: tags && uniq(tags),
     };
     if (filters !== undefined) {
-      setPayload.filters = filters.map(
-        (filter: ExternalDashboardFilterWithId) => {
-          const filterId = existingFilterIds.has(filter.id)
-            ? filter.id
-            : new ObjectId().toString();
-          return translateExternalFilterToFilter({ ...filter, id: filterId });
-        },
-      );
+      setPayload.filters = convertExternalFiltersToInternal(
+        filters,
+        existingFilterIds,
+      );
     }
     if (savedQuery !== undefined) {
@@ -2054,25 +1892,12 @@ router.put(
       return res.sendStatus(404);
     }

-    // Delete alerts for tiles that now do not support alerts
-    const newTileIdSet = new Set(internalTiles.map(t => t.id));
-    const tileIdsToDeleteAlerts = [
-      ...internalTiles
-        .filter(
-          tile =>
-            isRawSqlSavedChartConfig(tile.config) &&
-            !displayTypeSupportsRawSqlAlerts(tile.config.displayType),
-        )
-        .map(tile => tile.id),
-      ...[...existingTileIds].filter(id => !newTileIdSet.has(id)),
-    ];
-    if (tileIdsToDeleteAlerts.length > 0) {
-      logger.info(
-        { dashboardId, teamId, tileIds: tileIdsToDeleteAlerts },
-        `Deleting alerts for tiles with unsupported config or removed tiles`,
-      );
-      await deleteDashboardAlerts(dashboardId, teamId, tileIdsToDeleteAlerts);
-    }
+    await cleanupDashboardAlerts({
+      dashboardId,
+      teamId,
+      internalTiles,
+      existingTileIds,
+    });

     res.json({
       data: convertToExternalDashboard(updatedDashboard),
@@ -6,17 +6,13 @@ import chartsRouter from '@/routers/external-api/v2/charts';
 import dashboardRouter from '@/routers/external-api/v2/dashboards';
 import sourcesRouter from '@/routers/external-api/v2/sources';
 import webhooksRouter from '@/routers/external-api/v2/webhooks';
-import rateLimiter from '@/utils/rateLimiter';
+import rateLimiter, { rateLimiterKeyGenerator } from '@/utils/rateLimiter';

 const router = express.Router();

-const rateLimiterKeyGenerator = (req: express.Request): string => {
-  return req.headers.authorization ?? req.ip ?? 'unknown';
-};
-
 const defaultRateLimiter = rateLimiter({
   windowMs: 60 * 1000, // 1 minute
-  max: 100, // Limit each IP to 100 requests per `window`
+  max: 100, // Limit each API key to 100 requests per `window`
   standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
   legacyHeaders: false, // Disable the `X-RateLimit-*` headers
   keyGenerator: rateLimiterKeyGenerator,
@@ -1,3 +1,4 @@
+import { displayTypeSupportsRawSqlAlerts } from '@hyperdx/common-utils/dist/core/utils';
 import { isRawSqlSavedChartConfig } from '@hyperdx/common-utils/dist/guards';
 import {
   AggregateFunctionSchema,
@@ -6,19 +7,35 @@ import {
   RawSqlSavedChartConfig,
   SavedChartConfig,
 } from '@hyperdx/common-utils/dist/types';
+import { SearchConditionLanguageSchema as whereLanguageSchema } from '@hyperdx/common-utils/dist/types';
 import { pick } from 'lodash';
 import _ from 'lodash';
+import mongoose from 'mongoose';
+import { z } from 'zod';

+import { deleteDashboardAlerts } from '@/controllers/alerts';
+import { getConnectionsByTeam } from '@/controllers/connection';
+import { getSources } from '@/controllers/sources';
 import { DashboardDocument } from '@/models/dashboard';
-import { translateFilterToExternalFilter } from '@/utils/externalApi';
+import {
+  translateExternalChartToTileConfig,
+  translateExternalFilterToFilter,
+  translateFilterToExternalFilter,
+} from '@/utils/externalApi';
 import logger from '@/utils/logger';
 import {
+  ExternalDashboardFilter,
+  externalDashboardFilterSchema,
+  externalDashboardFilterSchemaWithId,
   ExternalDashboardFilterWithId,
   ExternalDashboardRawSqlTileConfig,
+  externalDashboardSavedFilterValueSchema,
   ExternalDashboardSelectItem,
   ExternalDashboardTileConfig,
+  externalDashboardTileListSchema,
   ExternalDashboardTileWithId,
   externalQuantileLevelSchema,
+  tagsSchema,
 } from '@/utils/zod';

 // --------------------------------------------------------------------------------
@@ -475,3 +492,220 @@ export function convertToInternalTileConfig(
     config: strippedConfig,
   };
 }
+
+// --------------------------------------------------------------------------------
+// Shared dashboard validation helpers (used by both the REST router and MCP tools)
+// --------------------------------------------------------------------------------
+
+/** Returns source IDs referenced in tiles/filters that do not exist for the team */
+export async function getMissingSources(
+  team: string | mongoose.Types.ObjectId,
+  tiles: ExternalDashboardTileWithId[],
+  filters?: (ExternalDashboardFilter | ExternalDashboardFilterWithId)[],
+): Promise<string[]> {
+  const sourceIds = new Set<string>();
+
+  for (const tile of tiles) {
+    if (isSeriesTile(tile)) {
+      for (const series of tile.series) {
+        if ('sourceId' in series) {
+          sourceIds.add(series.sourceId);
+        }
+      }
+    } else if (isConfigTile(tile)) {
+      if ('sourceId' in tile.config && tile.config.sourceId) {
+        sourceIds.add(tile.config.sourceId);
+      }
+    }
+  }
+
+  if (filters?.length) {
+    for (const filter of filters) {
+      if ('sourceId' in filter) {
+        sourceIds.add(filter.sourceId);
+      }
+    }
+  }
+
+  const existingSources = await getSources(team.toString());
+  const existingSourceIds = new Set(
+    existingSources.map(source => source._id.toString()),
+  );
+  return [...sourceIds].filter(sourceId => !existingSourceIds.has(sourceId));
+}
+
+/** Returns connection IDs referenced in tiles that do not belong to the team */
+export async function getMissingConnections(
+  team: string | mongoose.Types.ObjectId,
+  tiles: ExternalDashboardTileWithId[],
+): Promise<string[]> {
+  const connectionIds = new Set<string>();
+
+  for (const tile of tiles) {
+    if (isConfigTile(tile) && isRawSqlExternalTileConfig(tile.config)) {
+      connectionIds.add(tile.config.connectionId);
+    }
+  }
+
+  if (connectionIds.size === 0) return [];
+
+  const existingConnections = await getConnectionsByTeam(team.toString());
+  const existingConnectionIds = new Set(
+    existingConnections.map(connection => connection._id.toString()),
+  );
+
+  return [...connectionIds].filter(
+    connectionId => !existingConnectionIds.has(connectionId),
+  );
+}
+
+type SavedQueryLanguage = z.infer<typeof whereLanguageSchema>;
+
+export function resolveSavedQueryLanguage(params: {
+  savedQuery: string | null | undefined;
+  savedQueryLanguage: SavedQueryLanguage | null | undefined;
+}): SavedQueryLanguage | null | undefined {
+  const { savedQuery, savedQueryLanguage } = params;
+  if (savedQueryLanguage !== undefined) return savedQueryLanguage;
+  if (savedQuery === null) return null;
+  if (savedQuery) return 'lucene';
+
+  return undefined;
+}
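The precedence in `resolveSavedQueryLanguage` above is easy to misread: an explicit language (including explicit `null`) always wins, clearing the query clears the language, and a query with no language defaults to Lucene. A standalone copy demonstrates the truth table; the `SavedQueryLanguage` union is narrowed to `'lucene' | 'sql'` here as an assumption, since the real type is inferred from `whereLanguageSchema`.

```typescript
// Assumed narrowing of z.infer<typeof whereLanguageSchema> for illustration.
type SavedQueryLanguage = 'lucene' | 'sql';

function resolveSavedQueryLanguage(params: {
  savedQuery: string | null | undefined;
  savedQueryLanguage: SavedQueryLanguage | null | undefined;
}): SavedQueryLanguage | null | undefined {
  const { savedQuery, savedQueryLanguage } = params;
  // An explicitly provided language (even null) takes precedence.
  if (savedQueryLanguage !== undefined) return savedQueryLanguage;
  // Explicitly clearing the saved query also clears its language.
  if (savedQuery === null) return null;
  // A query with no stated language defaults to Lucene.
  if (savedQuery) return 'lucene';
  // Neither field supplied: leave both untouched.
  return undefined;
}
```

Returning `undefined` rather than a default in the last case matters on the update path, where `undefined` means "do not modify the stored value".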
+
+const dashboardBodyBaseShape = {
+  name: z.string().max(1024),
+  tiles: externalDashboardTileListSchema,
+  tags: tagsSchema,
+  savedQuery: z.string().nullable().optional(),
+  savedQueryLanguage: whereLanguageSchema.nullable().optional(),
+  savedFilterValues: z
+    .array(externalDashboardSavedFilterValueSchema)
+    .optional(),
+};
+
+// --------------------------------------------------------------------------------
+// Shared tile/filter conversion helpers (used by both external API and MCP)
+// --------------------------------------------------------------------------------
+
+/**
+ * Convert external tile definitions to internal Mongoose-compatible format.
+ * Generates new ObjectIds for tiles that don't already have a matching ID in
+ * `existingTileIds` (update path) or for all tiles (create path).
+ */
+export function convertExternalTilesToInternal(
+  tiles: ExternalDashboardTileWithId[],
+  existingTileIds?: Set<string>,
+): DashboardDocument['tiles'] {
+  return tiles.map(tile => {
+    const tileId =
+      existingTileIds && tile.id && existingTileIds.has(tile.id)
+        ? tile.id
+        : new mongoose.Types.ObjectId().toString();
+    const tileWithId = { ...tile, id: tileId };
+    if (isConfigTile(tileWithId)) {
+      return convertToInternalTileConfig(tileWithId);
+    }
+    if (isSeriesTile(tileWithId)) {
+      return translateExternalChartToTileConfig(tileWithId);
+    }
+    // Fallback for tiles with neither config nor series — treat as empty series tile.
+    // This shouldn't happen with valid input, but matches the previous behavior.
+    return translateExternalChartToTileConfig(tileWithId as SeriesTile);
+  });
+}
+
+/**
+ * Convert external filter definitions to internal format, preserving IDs that
+ * match `existingFilterIds` (update path) or generating new ones (create path).
+ */
+export function convertExternalFiltersToInternal(
+  filters: (ExternalDashboardFilter | ExternalDashboardFilterWithId)[],
+  existingFilterIds?: Set<string>,
+) {
+  return filters.map(filter => {
+    const filterId =
+      existingFilterIds && 'id' in filter && existingFilterIds.has(filter.id)
+        ? filter.id
+        : new mongoose.Types.ObjectId().toString();
+    return translateExternalFilterToFilter({ ...filter, id: filterId });
+  });
+}
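Both conversion helpers share one ID-preservation rule: keep an incoming ID only if it matches one the dashboard already has, otherwise mint a fresh one (so create and update flow through the same code, with the create path simply passing no existing-ID set). A minimal sketch of just that rule, with a counter standing in for `new mongoose.Types.ObjectId()` (the `resolveId`/`mintId` names are invented for this illustration):

```typescript
// Counter-based stand-in for mongoose's ObjectId generation.
let nextId = 0;
const mintId = (): string => `generated-${nextId++}`;

// Keep incomingId only when it matches a known existing ID; otherwise mint.
// On the create path existingIds is omitted, so every item gets a new ID.
function resolveId(
  incomingId: string | undefined,
  existingIds?: Set<string>,
): string {
  return existingIds && incomingId && existingIds.has(incomingId)
    ? incomingId
    : mintId();
}

// Update path: 'tile-a' survives, an unknown ID is replaced.
const existing = new Set(['tile-a']);
const preserved = resolveId('tile-a', existing);
const replaced = resolveId('tile-b', existing);
// Create path: even a provided ID is replaced.
const created = resolveId('tile-a');
```

This prevents clients from grafting arbitrary IDs onto a dashboard while keeping stable IDs (and therefore attached alerts) for tiles that genuinely persist across an update.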
+
+/**
+ * Delete alerts for tiles that were removed or converted to raw SQL
+ * (which doesn't support alerts).
+ */
+export async function cleanupDashboardAlerts({
+  dashboardId,
+  teamId,
+  internalTiles,
+  existingTileIds,
+}: {
+  dashboardId: string;
+  teamId: string | mongoose.Types.ObjectId;
+  internalTiles: DashboardDocument['tiles'];
+  existingTileIds: Set<string>;
+}) {
+  const newTileIdSet = new Set(internalTiles.map(t => t.id));
+  const tileIdsToDeleteAlerts = [
+    ...internalTiles
+      .filter(
+        tile =>
+          isRawSqlSavedChartConfig(tile.config) &&
+          !displayTypeSupportsRawSqlAlerts(tile.config.displayType),
+      )
+      .map(tile => tile.id),
+    ...[...existingTileIds].filter(id => !newTileIdSet.has(id)),
+  ];
+  if (tileIdsToDeleteAlerts.length > 0) {
+    logger.info(
+      { dashboardId, teamId, tileIds: tileIdsToDeleteAlerts },
+      'Deleting alerts for tiles with unsupported config or removed tiles',
+    );
+    const teamObjectId =
+      teamId instanceof mongoose.Types.ObjectId
+        ? teamId
+        : new mongoose.Types.ObjectId(teamId);
+    await deleteDashboardAlerts(
+      dashboardId,
+      teamObjectId,
+      tileIdsToDeleteAlerts,
+    );
+  }
+}
+
+// --------------------------------------------------------------------------------
+// Body validation schemas
+// --------------------------------------------------------------------------------
+
+function buildDashboardBodySchema(filterSchema: z.ZodTypeAny): z.ZodEffects<
+  z.ZodObject<
+    typeof dashboardBodyBaseShape & {
+      filters: z.ZodOptional<z.ZodArray<z.ZodTypeAny>>;
+    }
+  >
+> {
+  return z
+    .object({
+      ...dashboardBodyBaseShape,
+      filters: z.array(filterSchema).optional(),
+    })
+    .superRefine((data, ctx) => {
+      if (data.savedQuery != null && data.savedQueryLanguage === null) {
+        ctx.addIssue({
+          code: z.ZodIssueCode.custom,
+          message:
+            'savedQueryLanguage cannot be null when savedQuery is provided',
+          path: ['savedQueryLanguage'],
+        });
+      }
+    });
+}
+
+export const createDashboardBodySchema = buildDashboardBodySchema(
+  externalDashboardFilterSchema,
+);
+export const updateDashboardBodySchema = buildDashboardBodySchema(
+  externalDashboardFilterSchemaWithId,
+);
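The selection logic inside `cleanupDashboardAlerts` combines two sources of stale alerts: raw-SQL tiles whose display type no longer supports alerting, and tiles that disappeared from the dashboard entirely. A self-contained sketch of that set arithmetic, with the raw-SQL/display-type predicate collapsed to a boolean flag (the `Tile` shape and flag name are invented for this illustration):

```typescript
// Stand-in tile shape: the flag replaces
// isRawSqlSavedChartConfig(...) && !displayTypeSupportsRawSqlAlerts(...).
type Tile = { id: string; rawSqlWithoutAlertSupport: boolean };

function tileIdsToDeleteAlerts(
  internalTiles: Tile[],
  existingTileIds: Set<string>,
): string[] {
  const newTileIdSet = new Set(internalTiles.map(t => t.id));
  return [
    // Tiles that survive but can no longer carry alerts.
    ...internalTiles.filter(t => t.rawSqlWithoutAlertSupport).map(t => t.id),
    // Tiles that existed before but are gone from the new tile list.
    ...[...existingTileIds].filter(id => !newTileIdSet.has(id)),
  ];
}

const staleIds = tileIdsToDeleteAlerts(
  [
    { id: 'a', rawSqlWithoutAlertSupport: false },
    { id: 'b', rawSqlWithoutAlertSupport: true },
  ],
  new Set(['a', 'c']),
);
```

Here tile `a` keeps its alerts, `b` loses them because its new config cannot alert, and `c` loses them because it was removed.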
packages/api/src/utils/__tests__/trimToolResponse.test.ts (new file, 121 lines)
@@ -0,0 +1,121 @@
import { trimToolResponse } from '../trimToolResponse';

describe('trimToolResponse', () => {
  describe('small data (within maxSize)', () => {
    it('should return primitive values unchanged', () => {
      expect(trimToolResponse(42)).toBe(42);
      expect(trimToolResponse('hello')).toBe('hello');
      expect(trimToolResponse(null)).toBeNull();
      expect(trimToolResponse(true)).toBe(true);
    });

    it('should return small arrays unchanged', () => {
      const data = [1, 2, 3, 4, 5];
      expect(trimToolResponse(data)).toEqual(data);
    });

    it('should return small objects unchanged', () => {
      const data = { a: 1, b: 'hello', c: [1, 2, 3] };
      expect(trimToolResponse(data)).toEqual(data);
    });
  });

  describe('large arrays', () => {
    it('should trim large arrays to fit within maxSize', () => {
      // Create an array that exceeds maxSize
      const largeArray = Array.from({ length: 500 }, (_, i) => ({
        id: i,
        data: 'x'.repeat(200),
      }));

      const result = trimToolResponse(largeArray, 5000);
      expect(Array.isArray(result)).toBe(true);
      expect(result.length).toBeLessThan(largeArray.length);
      expect(result.length).toBeGreaterThanOrEqual(10); // minimum 10 items
      expect(JSON.stringify(result).length).toBeLessThanOrEqual(5000);
    });

    it('should keep at least 10 items', () => {
      const largeArray = Array.from({ length: 100 }, (_, i) => ({
        id: i,
        data: 'x'.repeat(500),
      }));

      // maxSize so small even 10 items may exceed it, but we keep 10 minimum
      const result = trimToolResponse(largeArray, 100);
      expect(Array.isArray(result)).toBe(true);
      expect(result.length).toBeGreaterThanOrEqual(10);
    });

    it('should not trim arrays that fit within maxSize', () => {
      const smallArray = [1, 2, 3, 4, 5];
      const result = trimToolResponse(smallArray, 50000);
      expect(result).toEqual(smallArray);
    });
  });

  describe('large objects', () => {
    it('should trim large objects to fit within maxSize', () => {
      const largeObj: Record<string, string> = {};
      for (let i = 0; i < 100; i++) {
        largeObj[`key_${i}`] = 'x'.repeat(200);
      }

      const result = trimToolResponse(largeObj, 5000);
      // The trimmed result must be smaller than the original
      expect(JSON.stringify(result).length).toBeLessThan(
        JSON.stringify(largeObj).length,
      );
      // All keys should still be present (values are trimmed, not dropped)
      expect(
        Object.keys(result).filter(k => k !== '__hdx_trimmed'),
      ).toHaveLength(100);
      // The sentinel flag should be set to indicate trimming occurred
      expect(result.__hdx_trimmed).toBe(true);
    });

    it('should not trim objects that fit within maxSize', () => {
      const obj = { a: 1, b: 2 };
      const result = trimToolResponse(obj, 50000);
      expect(result).toEqual(obj);
    });
  });

  describe('getAIMetadata structure', () => {
    it('should handle objects with allFieldsWithKeys and keyValues', () => {
      const metadataObj = {
        allFieldsWithKeys: Array.from({ length: 200 }, (_, i) => ({
          field: `field_${i}`,
          key: `key_${i}`,
          extra: 'x'.repeat(100),
        })),
        keyValues: Object.fromEntries(
          Array.from({ length: 200 }, (_, i) => [`kv_${i}`, 'x'.repeat(100)]),
        ),
        otherProp: 'preserved',
      };

      const result = trimToolResponse(metadataObj, 5000);
      expect(result).toHaveProperty('allFieldsWithKeys');
      expect(result).toHaveProperty('keyValues');
      expect(result).toHaveProperty('otherProp', 'preserved');
      expect(Array.isArray(result.allFieldsWithKeys)).toBe(true);
      expect(typeof result.keyValues).toBe('object');
    });
  });

  describe('default maxSize', () => {
    it('should use 50000 as default maxSize', () => {
      // Create data just over default size
      const data = Array.from({ length: 1000 }, (_, i) => ({
        id: i,
        payload: 'x'.repeat(100),
      }));

      const resultDefault = trimToolResponse(data);
      const resultExplicit = trimToolResponse(data, 50000);
      // Both should produce the same result
      expect(resultDefault.length).toBe(resultExplicit.length);
    });
  });
});
|
|
@@ -1,5 +1,10 @@
+import express from 'express';
 import rateLimit, { Options } from 'express-rate-limit';
 
+export const rateLimiterKeyGenerator = (req: express.Request): string => {
+  return req.headers.authorization ?? req.ip ?? 'unknown';
+};
+
 export default (config?: Partial<Options>) => {
   return rateLimit({
     ...config,
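The new `rateLimiterKeyGenerator` buckets authenticated clients by their `Authorization` header and falls back to the client IP, so each API key gets its own rate-limit counter. A minimal standalone sketch of that fallback chain (the `ReqLike` type and `keyFor` name are introduced here for illustration and stand in for a real `express.Request`):

```typescript
// Sketch of the keying strategy: Authorization header first, then IP,
// then a shared "unknown" bucket for requests with neither.
type ReqLike = { headers: { authorization?: string }; ip?: string };

const keyFor = (req: ReqLike): string =>
  req.headers.authorization ?? req.ip ?? 'unknown';

console.log(keyFor({ headers: { authorization: 'Bearer abc' }, ip: '1.2.3.4' })); // "Bearer abc"
console.log(keyFor({ headers: {}, ip: '1.2.3.4' })); // "1.2.3.4"
console.log(keyFor({ headers: {} })); // "unknown"
```

One consequence of this design: two MCP clients sharing the same Personal API Access Key also share one rate-limit bucket, while anonymous traffic degrades to per-IP limiting.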
packages/api/src/utils/trimToolResponse.ts (new file, 109 lines)

@@ -0,0 +1,109 @@
import logger from '@/utils/logger';

/**
 * Trims large data structures to prevent "Request Entity Too Large" errors
 * when multiple tool calls accumulate data in the conversation history.
 */
export function trimToolResponse(data: any, maxSize: number = 50000): any {
  const serialized = JSON.stringify(data);

  // If data is within acceptable size, return as-is
  if (serialized.length <= maxSize) {
    return data;
  }

  logger.warn(
    `Tool response too large, trimming data. Original Size: ${serialized.length}, Max Size: ${maxSize}`,
  );

  // Handle different data structures
  if (Array.isArray(data)) {
    return trimArray(data, maxSize);
  }

  if (typeof data === 'object' && data !== null) {
    return trimObject(data, maxSize);
  }

  return data;
}

function trimArray(arr: any[], maxSize: number): any[] {
  // Keep reducing array size until it fits
  let result = [...arr];
  let resultSize = JSON.stringify(result).length;

  while (resultSize > maxSize && result.length > 10) {
    // Keep at least 10 items
    const newLength = Math.max(10, Math.floor(result.length * 0.7));
    result = result.slice(0, newLength);
    resultSize = JSON.stringify(result).length;
  }

  // If we're still over budget (e.g. a single item exceeds maxSize), truncate
  // individual oversized items so the array itself stays within the limit.
  if (resultSize > maxSize) {
    result = result.map(item => {
      const itemStr = JSON.stringify(item);
      if (itemStr.length > maxSize) {
        logger.info(
          `Trimming oversized array item (${itemStr.length} bytes > ${maxSize} limit)`,
        );
        if (typeof item === 'object' && item !== null) {
          return trimObject(item, maxSize);
        }
        // Scalar that is itself too large — return a truncation marker
        return { __hdx_trimmed: true, originalSize: itemStr.length };
      }
      return item;
    });
  }

  if (result.length < arr.length) {
    logger.info(`Trimmed array from ${arr.length} to ${result.length} items`);
  }

  return result;
}

// Keys in trimObject come exclusively from Object.entries() on internal tool
// response data — never from user-supplied HTTP input — so bracket-notation
// writes are not an injection risk; see inline eslint-disable comments below.
function trimObject(obj: any, maxSize: number): any {
  const entries = Object.entries(obj);
  if (entries.length === 0) return obj;

  const result: any = {};

  // Give each key an equal share of the budget so that no single large value
  // crowds out the rest (e.g. a large array at key[0] eating all the budget
  // before key[1] gets a chance to appear).
  const perKeyBudget = Math.floor(maxSize / entries.length);
  let trimmed = false;

  for (const [key, value] of entries) {
    const valueStr = JSON.stringify(value);

    if (valueStr.length <= perKeyBudget) {
      result[key] = value; // eslint-disable-line security/detect-object-injection
    } else {
      logger.info(
        `Trimming oversized object value at key "${key}" (${valueStr.length} bytes > ${perKeyBudget} per-key budget)`,
      );
      if (Array.isArray(value)) {
        result[key] = trimArray(value, perKeyBudget); // eslint-disable-line security/detect-object-injection
      } else if (typeof value === 'object' && value !== null) {
        result[key] = trimObject(value, perKeyBudget); // eslint-disable-line security/detect-object-injection
      } else {
        result[key] = { __hdx_trimmed: true, originalSize: valueStr.length }; // eslint-disable-line security/detect-object-injection
      }
      trimmed = true;
    }
  }

  if (trimmed) {
    result.__hdx_trimmed = true;
  }

  return result;
}
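The core of `trimArray` above is the shrink loop: cut the array to 70% of its length per pass until the serialized form fits `maxSize`, never dropping below 10 items. A standalone sketch of just that loop (`shrinkToFit` is a hypothetical name reimplemented here for illustration; it is not exported by the module):

```typescript
// Minimal sketch of the array-shrinking loop in trimToolResponse.
function shrinkToFit<T>(arr: T[], maxSize: number): T[] {
  let result = [...arr];
  while (JSON.stringify(result).length > maxSize && result.length > 10) {
    // Shrink by ~30% per pass, keeping at least 10 items
    result = result.slice(0, Math.max(10, Math.floor(result.length * 0.7)));
  }
  return result;
}

// 1000 rows of ~120 serialized bytes each (~120 KB total) against a 50 KB budget
const rows = Array.from({ length: 1000 }, (_, i) => ({
  id: i,
  payload: 'x'.repeat(100),
}));

const trimmed = shrinkToFit(rows, 50_000);
console.log(trimmed.length, JSON.stringify(trimmed).length);
```

Because the loop re-serializes after every pass, cost is quadratic in the worst case, but in practice only a few passes are needed (here 1000 → 700 → 490 → 343 items).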
@@ -3,14 +3,9 @@
   "compilerOptions": {
     "baseUrl": "./src",
     "paths": {
-      "@/*": [
-        "./*"
-      ]
+      "@/*": ["./*"]
     },
-    "types": [
-      "jest",
-      "node"
-    ],
+    "types": ["jest", "node"],
     "outDir": "build",
     "isolatedModules": true,
     "skipLibCheck": true,
@@ -18,12 +13,6 @@
     "strict": true,
     "target": "ES2022"
   },
-  "include": [
-    "src",
-    "migrations",
-    "scripts"
-  ],
+  "include": ["src", "migrations", "scripts"],
-  "exclude": [
-    "node_modules"
-  ]
+  "exclude": ["node_modules"]
 }
@@ -2,9 +2,7 @@
   "version": "0.1.0",
   "name": ".NET Runtime Metrics",
   "description": "Garbage collection, heap fragmentation, exception, thread pool, and CPU metrics for .NET v9+ applications with the OpenTelemetry.Instrumentation.Runtime package",
-  "tags": [
-    "OTel Runtime Metrics"
-  ],
+  "tags": ["OTel Runtime Metrics"],
   "tiles": [
     {
       "id": "6d4b7e",
@@ -2,9 +2,7 @@
   "version": "0.1.0",
   "name": "Go Runtime Metrics",
   "description": "Memory usage, allocations, GC targets, CPU utilization, and goroutine metrics for Go applications with the OTel Runtime (v0.62+) and Host Metrics instrumentations",
-  "tags": [
-    "OTel Runtime Metrics"
-  ],
+  "tags": ["OTel Runtime Metrics"],
   "tiles": [
     {
       "id": "109853",
@@ -2,9 +2,7 @@
   "version": "0.1.0",
   "name": "JVM Runtime Metrics",
   "description": "Heap memory, CPU utilization, threads, and GC metrics for JVM applications with the OTel Java Agent v2+ with JVM v17+",
-  "tags": [
-    "OTel Runtime Metrics"
-  ],
+  "tags": ["OTel Runtime Metrics"],
   "tiles": [
     {
       "id": "dd919b",
@@ -2,9 +2,7 @@
   "version": "0.1.0",
   "name": "Node.js Runtime Metrics",
   "description": "Event loop delay, heap usage, CPU utilization, and V8 memory for Node.js applications with OTel Runtime and Host Metrics instrumentations",
-  "tags": [
-    "OTel Runtime Metrics"
-  ],
+  "tags": ["OTel Runtime Metrics"],
   "tiles": [
     {
       "id": "55ef66",
yarn.lock (162 lines changed)

@@ -4366,6 +4366,15 @@ __metadata:
   languageName: node
   linkType: hard
 
+"@hono/node-server@npm:^1.19.9":
+  version: 1.19.12
+  resolution: "@hono/node-server@npm:1.19.12"
+  peerDependencies:
+    hono: ^4
+  checksum: 10c0/06b5c7ba775d585abebe1ece155f3b00cc9013319818c58bba6f1b1e71df44d1d0d6c6e66cd50350ab6f0b9219a182f83c9fe3074b81a1d1ebb0a1493a73db9e
+  languageName: node
+  linkType: hard
+
 "@hookform/resolvers@npm:^3.9.0":
   version: 3.9.0
   resolution: "@hookform/resolvers@npm:3.9.0"
@@ -4416,6 +4425,7 @@ __metadata:
     "@hyperdx/common-utils": "npm:^0.17.1"
     "@hyperdx/node-opentelemetry": "npm:^0.9.0"
     "@hyperdx/passport-local-mongoose": "npm:^9.0.1"
+    "@modelcontextprotocol/sdk": "npm:^1.27.1"
     "@opentelemetry/api": "npm:^1.8.0"
     "@opentelemetry/host-metrics": "npm:^0.35.5"
     "@opentelemetry/sdk-metrics": "npm:^1.30.1"
@@ -6081,6 +6091,39 @@ __metadata:
   languageName: node
   linkType: hard
 
+"@modelcontextprotocol/sdk@npm:^1.27.1":
+  version: 1.29.0
+  resolution: "@modelcontextprotocol/sdk@npm:1.29.0"
+  dependencies:
+    "@hono/node-server": "npm:^1.19.9"
+    ajv: "npm:^8.17.1"
+    ajv-formats: "npm:^3.0.1"
+    content-type: "npm:^1.0.5"
+    cors: "npm:^2.8.5"
+    cross-spawn: "npm:^7.0.5"
+    eventsource: "npm:^3.0.2"
+    eventsource-parser: "npm:^3.0.0"
+    express: "npm:^5.2.1"
+    express-rate-limit: "npm:^8.2.1"
+    hono: "npm:^4.11.4"
+    jose: "npm:^6.1.3"
+    json-schema-typed: "npm:^8.0.2"
+    pkce-challenge: "npm:^5.0.0"
+    raw-body: "npm:^3.0.0"
+    zod: "npm:^3.25 || ^4.0"
+    zod-to-json-schema: "npm:^3.25.1"
+  peerDependencies:
+    "@cfworker/json-schema": ^4.1.1
+    zod: ^3.25 || ^4.0
+  peerDependenciesMeta:
+    "@cfworker/json-schema":
+      optional: true
+    zod:
+      optional: false
+  checksum: 10c0/7c4bc339205b1652330cd4e6b121cc859079655f2b9c0506bbb15563ba0d07924bda3d949705530532db7f4d2cb86d633dc8f92bc32803d97c7bece2ac63e29f
+  languageName: node
+  linkType: hard
+
 "@mongodb-js/saslprep@npm:^1.1.0, @mongodb-js/saslprep@npm:^1.1.9":
   version: 1.2.2
   resolution: "@mongodb-js/saslprep@npm:1.2.2"
@@ -11403,6 +11446,20 @@ __metadata:
   languageName: node
   linkType: hard
 
+"ajv-formats@npm:^3.0.1":
+  version: 3.0.1
+  resolution: "ajv-formats@npm:3.0.1"
+  dependencies:
+    ajv: "npm:^8.0.0"
+  peerDependencies:
+    ajv: ^8.0.0
+  peerDependenciesMeta:
+    ajv:
+      optional: true
+  checksum: 10c0/168d6bca1ea9f163b41c8147bae537e67bd963357a5488a1eaf3abe8baa8eec806d4e45f15b10767e6020679315c7e1e5e6803088dfb84efa2b4e9353b83dd0a
+  languageName: node
+  linkType: hard
+
 "ajv-keywords@npm:^3.5.2":
   version: 3.5.2
   resolution: "ajv-keywords@npm:3.5.2"
@@ -13523,7 +13580,7 @@ __metadata:
   languageName: node
   linkType: hard
 
-"content-type@npm:~1.0.4, content-type@npm:~1.0.5":
+"content-type@npm:^1.0.5, content-type@npm:~1.0.4, content-type@npm:~1.0.5":
   version: 1.0.5
   resolution: "content-type@npm:1.0.5"
   checksum: 10c0/b76ebed15c000aee4678c3707e0860cb6abd4e680a598c0a26e17f0bfae723ec9cc2802f0ff1bc6e4d80603719010431d2231018373d4dde10f9ccff9dadf5af
@@ -13739,7 +13796,7 @@ __metadata:
   languageName: node
   linkType: hard
 
-"cross-spawn@npm:^7.0.0, cross-spawn@npm:^7.0.3, cross-spawn@npm:^7.0.6":
+"cross-spawn@npm:^7.0.0, cross-spawn@npm:^7.0.3, cross-spawn@npm:^7.0.5, cross-spawn@npm:^7.0.6":
   version: 7.0.6
   resolution: "cross-spawn@npm:7.0.6"
   dependencies:
@@ -16124,13 +16181,22 @@ __metadata:
   languageName: node
   linkType: hard
 
-"eventsource-parser@npm:^3.0.6":
+"eventsource-parser@npm:^3.0.0, eventsource-parser@npm:^3.0.1, eventsource-parser@npm:^3.0.6":
   version: 3.0.6
   resolution: "eventsource-parser@npm:3.0.6"
   checksum: 10c0/70b8ccec7dac767ef2eca43f355e0979e70415701691382a042a2df8d6a68da6c2fca35363669821f3da876d29c02abe9b232964637c1b6635c940df05ada78a
   languageName: node
   linkType: hard
 
+"eventsource@npm:^3.0.2":
+  version: 3.0.7
+  resolution: "eventsource@npm:3.0.7"
+  dependencies:
+    eventsource-parser: "npm:^3.0.1"
+  checksum: 10c0/c48a73c38f300e33e9f11375d4ee969f25cbb0519608a12378a38068055ae8b55b6e0e8a49c3f91c784068434efe1d9f01eb49b6315b04b0da9157879ce2f67d
+  languageName: node
+  linkType: hard
+
 "evp_bytestokey@npm:^1.0.0, evp_bytestokey@npm:^1.0.3":
   version: 1.0.3
   resolution: "evp_bytestokey@npm:1.0.3"
@@ -16246,6 +16312,17 @@ __metadata:
   languageName: node
   linkType: hard
 
+"express-rate-limit@npm:^8.2.1":
+  version: 8.3.2
+  resolution: "express-rate-limit@npm:8.3.2"
+  dependencies:
+    ip-address: "npm:10.1.0"
+  peerDependencies:
+    express: ">= 4.11"
+  checksum: 10c0/5b64d0691071086cdb8cfc6bcd5e761f5687cf4fabdebfe2a043ea5b4d31443637181e7be71e7ffabce76aee816daee62c1ca83250045847957da408a129f650
+  languageName: node
+  linkType: hard
+
 "express-session@npm:^1.17.3":
   version: 1.17.3
   resolution: "express-session@npm:1.17.3"
@@ -17717,6 +17794,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"hono@npm:^4.11.4":
+  version: 4.12.9
+  resolution: "hono@npm:4.12.9"
+  checksum: 10c0/393256552642f681e52935163508d9605e5552e186d9b99ff2caf219d4248341b83e3eb975c40a97149c86890d19d73421efc889aa465a28eb5920ccc42cff34
+  languageName: node
+  linkType: hard
+
 "hookified@npm:^1.13.0":
   version: 1.14.0
   resolution: "hookified@npm:1.14.0"
@@ -17984,6 +18068,15 @@ __metadata:
   languageName: node
   linkType: hard
 
+"iconv-lite@npm:~0.7.0":
+  version: 0.7.2
+  resolution: "iconv-lite@npm:0.7.2"
+  dependencies:
+    safer-buffer: "npm:>= 2.1.2 < 3.0.0"
+  checksum: 10c0/3c228920f3bd307f56bf8363706a776f4a060eb042f131cd23855ceca962951b264d0997ab38a1ad340e1c5df8499ed26e1f4f0db6b2a2ad9befaff22f14b722
+  languageName: node
+  linkType: hard
+
 "icss-utils@npm:^5.0.0, icss-utils@npm:^5.1.0":
   version: 5.1.0
   resolution: "icss-utils@npm:5.1.0"
@@ -18261,6 +18354,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"ip-address@npm:10.1.0":
+  version: 10.1.0
+  resolution: "ip-address@npm:10.1.0"
+  checksum: 10c0/0103516cfa93f6433b3bd7333fa876eb21263912329bfa47010af5e16934eeeff86f3d2ae700a3744a137839ddfad62b900c7a445607884a49b5d1e32a3d7566
+  languageName: node
+  linkType: hard
+
 "ip-address@npm:^9.0.5":
   version: 9.0.5
   resolution: "ip-address@npm:9.0.5"
@@ -20043,6 +20143,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"jose@npm:^6.1.3":
+  version: 6.2.2
+  resolution: "jose@npm:6.2.2"
+  checksum: 10c0/201f4776d77eccd339de99fb3ba940fdf03db15e64be7a99b511e53c232e3f3818e3f21b95223d62f99315a2ab76b4251cedd94e067de56893e45273a8d2151b
+  languageName: node
+  linkType: hard
+
 "jotai@npm:^2.5.1":
   version: 2.5.1
   resolution: "jotai@npm:2.5.1"
@@ -20206,6 +20313,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"json-schema-typed@npm:^8.0.2":
+  version: 8.0.2
+  resolution: "json-schema-typed@npm:8.0.2"
+  checksum: 10c0/89f5e2fb1495483b705c027203c07277ee6bf2665165ad25a9cb55de5af7f72570326d13d32565180781e4083ad5c9688102f222baed7b353c2f39c1e02b0428
+  languageName: node
+  linkType: hard
+
 "json-schema@npm:^0.4.0":
   version: 0.4.0
   resolution: "json-schema@npm:0.4.0"
@@ -23523,6 +23637,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"pkce-challenge@npm:^5.0.0":
+  version: 5.0.1
+  resolution: "pkce-challenge@npm:5.0.1"
+  checksum: 10c0/207f4cb976682f27e8324eb49cf71937c98fbb8341a0b8f6142bc6f664825b30e049a54a21b5c034e823ee3c3d412f10d74bd21de78e17452a6a496c2991f57c
+  languageName: node
+  linkType: hard
+
 "pkg-dir@npm:^4.1.0, pkg-dir@npm:^4.2.0":
   version: 4.2.0
   resolution: "pkg-dir@npm:4.2.0"
@@ -24318,6 +24439,18 @@ __metadata:
   languageName: node
   linkType: hard
 
+"raw-body@npm:^3.0.0":
+  version: 3.0.2
+  resolution: "raw-body@npm:3.0.2"
+  dependencies:
+    bytes: "npm:~3.1.2"
+    http-errors: "npm:~2.0.1"
+    iconv-lite: "npm:~0.7.0"
+    unpipe: "npm:~1.0.0"
+  checksum: 10c0/d266678d08e1e7abea62c0ce5864344e980fa81c64f6b481e9842c5beaed2cdcf975f658a3ccd67ad35fc919c1f6664ccc106067801850286a6cbe101de89f29
+  languageName: node
+  linkType: hard
+
 "raw-body@npm:~2.5.3":
   version: 2.5.3
   resolution: "raw-body@npm:2.5.3"
@@ -29559,6 +29692,15 @@ __metadata:
   languageName: node
   linkType: hard
 
+"zod-to-json-schema@npm:^3.25.1":
+  version: 3.25.2
+  resolution: "zod-to-json-schema@npm:3.25.2"
+  peerDependencies:
+    zod: ^3.25.28 || ^4
+  checksum: 10c0/dd300554393903022487688af14fbda5c719ba8179702bb55b3aa86318830467f0f7beb7d654036975ac963dc4843b72e59636448bfff9a0608f277bb6a14939
+  languageName: node
+  linkType: hard
+
 "zod-validation-error@npm:^3.0.3":
   version: 3.4.0
   resolution: "zod-validation-error@npm:3.4.0"
@@ -29591,6 +29733,13 @@ __metadata:
   languageName: node
   linkType: hard
 
+"zod@npm:^3.25 || ^4.0, zod@npm:^4.1.11, zod@npm:^4.3.6":
+  version: 4.3.6
+  resolution: "zod@npm:4.3.6"
+  checksum: 10c0/860d25a81ab41d33aa25f8d0d07b091a04acb426e605f396227a796e9e800c44723ed96d0f53a512b57be3d1520f45bf69c0cb3b378a232a00787a2609625307
+  languageName: node
+  linkType: hard
+
 "zod@npm:^3.25.0 || ^4.0.0":
   version: 4.1.13
   resolution: "zod@npm:4.1.13"
@@ -29598,13 +29747,6 @@ __metadata:
   languageName: node
   linkType: hard
 
-"zod@npm:^4.1.11, zod@npm:^4.3.6":
-  version: 4.3.6
-  resolution: "zod@npm:4.3.6"
-  checksum: 10c0/860d25a81ab41d33aa25f8d0d07b091a04acb426e605f396227a796e9e800c44723ed96d0f53a512b57be3d1520f45bf69c0cb3b378a232a00787a2609625307
-  languageName: node
-  linkType: hard
-
 "zustand@npm:^4.4.0":
   version: 4.5.7
   resolution: "zustand@npm:4.5.7"