mirror of
https://github.com/MinaSaad1/pbi-cli
synced 2026-04-21 13:37:19 +00:00
feat: v3.10.0 -- Report layer, 12 skills, Desktop auto-sync
This commit is contained in:
commit 3eb68c56d3
84 changed files with 15011 additions and 216 deletions
106
CHANGELOG.md
@@ -5,6 +5,112 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [3.10.0] - 2026-04-02

### Added

- Split `power-bi-report` skill into 5 focused skills: `power-bi-report` (overview), `power-bi-visuals`, `power-bi-pages`, `power-bi-themes`, `power-bi-filters` (12 skills total)
- CLAUDE.md snippet now organises skills by layer (Semantic Model vs Report Layer)
- Skill triggering test suite (19 prompts, 12 skills)

### Fixed

- `filter_add_topn` inner subquery now correctly references the category table when it differs from the order-by table
- `theme_set` resourcePackages structure now matches Desktop format (flat `items` array)
- `visual_bind` type annotation corrected to `list[dict[str, Any]]`
- `tmdl_diff` hierarchy changes reported as `hierarchies_*` instead of falling through to `other_*`
- Missing `VisualTypeError` and `ReportNotFoundError` classes added to `errors.py`
- `report`, `visual`, `filters`, `format`, `bookmarks` command groups registered in CLI

### Changed

- README rewritten to cover both semantic model and report layers, 12 skills, 27 command groups, 32 visual types
## [3.9.0] - 2026-04-01

### Added

- `pbi database diff-tmdl` command: compare two TMDL export folders offline, summarise changes (tables, measures, columns, relationships, model properties); lineageTag-only changes are stripped to avoid false positives
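The lineageTag handling described above can be sketched as follows. This is a minimal illustration of the idea only, not pbi-cli's actual implementation: the helper name and the sample TMDL snippets are made up for the demo.

```python
import difflib

def strip_lineage_tags(tmdl_text: str) -> str:
    """Drop lineageTag lines so GUID churn doesn't register as a change."""
    return "\n".join(
        line for line in tmdl_text.splitlines()
        if not line.strip().startswith("lineageTag:")
    )

# Two exports of the same table, differing only in their lineageTag GUIDs.
before = "table Sales\n\tlineageTag: 1111-aaaa\n\tmeasure Revenue\n"
after = "table Sales\n\tlineageTag: 2222-bbbb\n\tmeasure Revenue\n"

# With the tags stripped, the folders compare as identical.
diff = list(difflib.unified_diff(
    strip_lineage_tags(before).splitlines(),
    strip_lineage_tags(after).splitlines(),
))
print(diff)  # -> [] : no false positive
```

The same filter applied before diffing each file pair is enough to keep GUID-only churn out of the summary.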
### Fixed

- `filter_add_topn` inner subquery now correctly references the category table when it differs from the order-by table (cross-table TopN filters)
- `theme_set` resourcePackages structure now matches Desktop format (flat `items`, not nested `resourcePackage`)
- `visual_bind` type annotation corrected from `list[dict[str, str]]` to `list[dict[str, Any]]`
- `tmdl_diff` hierarchy changes now reported as `hierarchies_*` instead of falling through to `other_*`
- Missing `VisualTypeError` and `ReportNotFoundError` error classes added to `errors.py`
- `report`, `visual`, `filters`, `format`, `bookmarks` command groups registered in CLI (were implemented but inaccessible)
## [3.8.0] - 2026-04-01

### Added

- `azureMap` visual type (Azure Maps) with Category and Size roles
- `pageBinding` field surfaced in `page_get()` for drillthrough pages

### Fixed

- `card` and `multiRowCard` queryState role corrected from `Fields` to `Values` (matches Desktop)
- `kpi` template: added `TrendLine` queryState key (date/axis column for sparkline)
- `gauge` template: added `MaxValue` queryState key (target/max measure)
- `MaxValue` added to `MEASURE_ROLES`
- kpi role aliases: `--trend`, `--trend_line`
- gauge role aliases: `--max`, `--max_value`, `--target`
## [3.7.0] - 2026-04-01

### Added

- `page_type`, `filter_config`, and `visual_interactions` fields in page read operations (`page_get`, `page_list`)
## [3.6.0] - 2026-04-01

### Added

- `image` visual type (static images, no data binding)
- `shape` visual type (decorative shapes)
- `textbox` visual type (rich text)
- `pageNavigator` visual type (page navigation buttons)
- `advancedSlicerVisual` visual type (tile/image slicer)
## [3.5.0] - 2026-04-01

### Added

- `clusteredColumnChart` visual type with alias `clustered_column`
- `clusteredBarChart` visual type with alias `clustered_bar`
- `textSlicer` visual type with alias `text_slicer`
- `listSlicer` visual type with alias `list_slicer`
## [3.4.0] - 2026-03-31

### Added

- `cardVisual` (modern card) visual type with `Data` role and aliases `card_visual`, `modern_card`
- `actionButton` visual type with aliases `action_button`, `button`
- `pbi report set-background` command to set page background colour
- `pbi report set-visibility` command to hide/show pages
- `pbi visual set-container` command for border, background, and title on visual containers

### Fixed

- Visual container schema URL updated from 1.5.0 to 2.7.0
- `visualGroup` containers tagged as type `group` in `visual_list`
- Colour validation, KeyError guards, visibility surfacing, no-op detection
## [3.0.0] - 2026-03-31

### Added

- **PBIR report layer**: `pbi report` command group (create, info, validate, list-pages, add-page, delete-page, get-page, set-theme, get-theme, diff-theme, preview, reload, convert)
- **Visual CRUD**: `pbi visual` command group (add, get, list, update, delete, bind, where, bulk-bind, bulk-update, bulk-delete, calc-add, calc-list, calc-delete, set-container)
- **Filters**: `pbi filters` command group (list, add-categorical, add-topn, add-relative-date, remove, clear)
- **Formatting**: `pbi format` command group (get, clear, background-gradient, background-conditional, background-measure)
- **Bookmarks**: `pbi bookmarks` command group (list, get, add, delete, set-visibility)
- 20 visual type templates (barChart, lineChart, card, tableEx, pivotTable, slicer, kpi, gauge, donutChart, columnChart, areaChart, ribbonChart, waterfallChart, scatterChart, funnelChart, multiRowCard, treemap, cardNew, stackedBarChart, lineStackedColumnComboChart)
- HTML preview server (`pbi report preview`) with live reload
- Power BI Desktop reload trigger (`pbi report reload`)
- PBIR path auto-detection (walk-up from CWD, `.pbip` sibling detection)
- `power-bi-report` Claude Code skill (8th skill)
- Visual data binding with `Table[Column]` notation and role aliases
- Visual calculations (calc-add, calc-list, calc-delete)
- Bulk operations for mass visual updates across pages
### Changed

- Architecture: pbi-cli now covers both the semantic model layer (via .NET TOM) and the report layer (via PBIR JSON files)

## [2.2.0] - 2026-03-27

### Added

- Promotional SVG assets and redesigned README

## [2.0.0] - 2026-03-27

### Breaking
69
README.md
@@ -4,7 +4,7 @@

<p align="center">
<b>Give Claude Code the Power BI skills it needs.</b><br/>
- Install once, then just ask Claude to work with your semantic models.
+ Install once, then just ask Claude to work with your semantic models <i>and</i> reports.
</p>

<p align="center">
@@ -117,22 +117,34 @@ Add the printed path to your system PATH, then restart your terminal. We recomme

## Skills

- After running `pbi connect`, Claude Code discovers **7 Power BI skills** automatically. Each skill teaches Claude a different area. You don't need to memorize commands.
+ After running `pbi connect`, Claude Code discovers **12 Power BI skills** automatically. Each skill teaches Claude a different area. You don't need to memorize commands.

<p align="center">
- <img src="https://raw.githubusercontent.com/MinaSaad1/pbi-cli/master/assets/skills-hub.svg" alt="7 Skills" width="850"/>
+ <img src="https://raw.githubusercontent.com/MinaSaad1/pbi-cli/master/assets/skills-hub.svg" alt="12 Skills" width="850"/>
</p>

### Semantic Model Skills (require `pbi connect`)

| Skill | What you say | What Claude does |
|-------|-------------|-----------------|
| **DAX** | *"What are the top 10 products by revenue?"* | Writes and executes DAX queries, validates syntax |
| **Modeling** | *"Create a star schema with Sales and Calendar"* | Creates tables, relationships, measures, hierarchies |
- | **Deployment** | *"Save a snapshot before I make changes"* | Exports/imports TMDL, manages transactions |
+ | **Deployment** | *"Save a snapshot before I make changes"* | Exports/imports TMDL, manages transactions, diffs snapshots |
| **Security** | *"Set up RLS for regional managers"* | Creates roles, filters, perspectives |
| **Docs** | *"Document everything in this model"* | Generates data dictionaries, measure inventories |
| **Partitions** | *"Show me the M query for the Sales table"* | Manages partitions, expressions, calendar config |
| **Diagnostics** | *"Why is this query so slow?"* | Traces queries, checks model health, benchmarks |

### Report Layer Skills (no connection needed)

| Skill | What you say | What Claude does |
|-------|-------------|-----------------|
| **Report** | *"Create a new report project for Sales"* | Scaffolds PBIR reports, validates structure, previews layout |
| **Visuals** | *"Add a bar chart showing revenue by region"* | Adds, binds, updates, bulk-manages 32 visual types |
| **Pages** | *"Add an Executive Overview page"* | Manages pages, bookmarks, visibility, drillthrough |
| **Themes** | *"Apply our corporate brand colours"* | Applies themes, conditional formatting, colour scales |
| **Filters** | *"Show only the top 10 products"* | Adds page/visual filters (TopN, date, categorical) |

---

## Architecture
@@ -141,7 +153,10 @@ After running `pbi connect`, Claude Code discovers **7 Power BI skills** automat

<img src="https://raw.githubusercontent.com/MinaSaad1/pbi-cli/master/assets/architecture-flow.svg" alt="Architecture" width="850"/>
</p>

- Direct in-process .NET interop from Python to Power BI Desktop. No MCP server, no external binaries, sub-second execution.
+ **Two layers, one CLI:**
+
+ - **Semantic Model layer** -- Direct in-process .NET interop from Python to Power BI Desktop via TOM/ADOMD. No MCP server, no external binaries, sub-second execution.
+ - **Report layer** -- Reads and writes PBIR (Enhanced Report Format) JSON files directly. No connection needed. Works with `.pbip` projects.

<details>
<summary><b>Configuration details</b></summary>
@@ -163,21 +178,57 @@ Bundled DLLs ship inside the Python package (`pbi_cli/dlls/`).

## All Commands

<p align="center">
<img src="https://raw.githubusercontent.com/MinaSaad1/pbi-cli/master/assets/feature-grid.svg" alt="22 Command Groups" width="850"/>
</p>

27 command groups covering both the semantic model and the report layer.

| Category | Commands |
|----------|----------|
| **Queries** | `dax execute`, `dax validate`, `dax clear-cache` |
| **Model** | `table`, `column`, `measure`, `relationship`, `hierarchy`, `calc-group` |
| **Deploy** | `database export-tmdl`, `database import-tmdl`, `database export-tmsl`, `database diff-tmdl`, `transaction` |
| **Security** | `security-role`, `perspective` |
| **Connect** | `connect`, `disconnect`, `connections list`, `connections last` |
| **Data** | `partition`, `expression`, `calendar`, `advanced culture` |
| **Diagnostics** | `trace start/stop/fetch/export`, `model stats` |
| **Report** | `report create`, `report info`, `report validate`, `report preview`, `report reload` |
| **Pages** | `report add-page`, `report delete-page`, `report get-page`, `report set-background`, `report set-visibility` |
| **Visuals** | `visual add/get/list/update/delete`, `visual bind`, `visual bulk-bind/bulk-update/bulk-delete`, `visual where` |
| **Filters** | `filters list`, `filters add-categorical/add-topn/add-relative-date`, `filters remove/clear` |
| **Formatting** | `format get/clear`, `format background-gradient/background-conditional/background-measure` |
| **Bookmarks** | `bookmarks list/get/add/delete/set-visibility` |
| **Tools** | `setup`, `repl`, `skills install/list/uninstall` |

Use `--json` for machine-readable output (for scripts and AI agents):

```bash
pbi --json measure list
pbi --json dax execute "EVALUATE Sales"
pbi --json visual list --page overview
```
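From a script, the `--json` output can be parsed directly. A minimal sketch: the subprocess call uses the commands shown above, but the payload shape in the sample is illustrative, not a documented schema.

```python
import json
import subprocess

def list_measures() -> list:
    """Call pbi with --json and parse stdout (requires pbi on PATH)."""
    proc = subprocess.run(
        ["pbi", "--json", "measure", "list"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)

# Offline demonstration with a hypothetical payload in the same spirit:
sample = '[{"name": "Total Revenue", "table": "Sales"}]'
measures = json.loads(sample)
print([m["name"] for m in measures])  # -> ['Total Revenue']
```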
Run `pbi <command> --help` for full options.

---
## Supported Visual Types (32)

pbi-cli supports creating and binding data to 32 Power BI visual types:

**Charts:** bar, line, column, area, ribbon, waterfall, stacked bar, clustered bar, clustered column, scatter, funnel, combo, donut/pie, treemap

**Cards/KPIs:** card (legacy), cardVisual (modern), cardNew, multi-row card, KPI, gauge

**Tables:** table, matrix (pivot table)

**Slicers:** slicer, text slicer, list slicer, advanced slicer (tile/image)

**Maps:** Azure Map

**Decorative:** action button, image, shape, textbox, page navigator

Use friendly aliases: `pbi visual add --page p1 --type bar` instead of `--type barChart`.
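The alias resolution amounts to a simple lookup before the visual template is chosen. The pairs below are taken from this README and the changelog; the table itself is an illustrative sketch, not pbi-cli's internal mapping.

```python
# Alias -> canonical visual type name (sketch; pairs from the docs above)
VISUAL_ALIASES = {
    "bar": "barChart",
    "clustered_bar": "clusteredBarChart",
    "clustered_column": "clusteredColumnChart",
    "text_slicer": "textSlicer",
    "list_slicer": "listSlicer",
    "card_visual": "cardVisual",
    "modern_card": "cardVisual",
    "action_button": "actionButton",
    "button": "actionButton",
}

def resolve_visual_type(name: str) -> str:
    """Accept either a friendly alias or an already-canonical type name."""
    return VISUAL_ALIASES.get(name, name)

print(resolve_visual_type("bar"))       # -> barChart
print(resolve_visual_type("barChart"))  # -> barChart
```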
---

## REPL Mode

For interactive work, the REPL keeps a persistent connection:
@@ -208,7 +259,7 @@ pip install -e ".[dev]"

```bash
ruff check src/ tests/   # Lint
mypy src/                # Type check
- pytest -m "not e2e"      # Run tests
+ pytest -m "not e2e"      # Run tests (488 tests)
```

---
252
README.pypi.md
@@ -1,7 +1,7 @@

<img src="https://raw.githubusercontent.com/MinaSaad1/pbi-cli/master/assets/header.svg" alt="pbi-cli" width="800"/>

**Give Claude Code the Power BI skills it needs.**
- Install once, then just ask Claude to work with your semantic models.
+ Install once, then just ask Claude to work with your semantic models *and* reports.

<a href="https://pypi.org/project/pbi-cli-tool/"><img src="https://img.shields.io/pypi/pyversions/pbi-cli-tool?style=flat-square&color=3776ab&label=Python" alt="Python"></a>
<a href="https://github.com/MinaSaad1/pbi-cli/actions"><img src="https://img.shields.io/github/actions/workflow/status/MinaSaad1/pbi-cli/ci.yml?branch=master&style=flat-square&label=CI" alt="CI"></a>
@@ -13,14 +13,18 @@ Install once, then just ask Claude to work with your semantic models.

## What is this?

- pbi-cli gives **Claude Code** (and other AI agents) the ability to manage Power BI semantic models. It ships with 7 skills that Claude discovers automatically. You ask in plain English, Claude uses the right `pbi` commands.
+ pbi-cli gives **Claude Code** (and other AI agents) the ability to manage Power BI semantic models **and reports**. It ships with 12 skills that Claude discovers automatically. You ask in plain English, Claude uses the right `pbi` commands.

```
You                     Claude Code            pbi-cli             Power BI
"Add a YTD measure ---> Uses Power BI   --->   CLI commands --->   Desktop
- to the Sales table"   skills
+ to the Sales table"   skills (12)
```

**Two layers, one CLI:**
- **Semantic Model** -- Direct .NET interop to Power BI Desktop (measures, tables, DAX, security)
- **Report Layer** -- Reads/writes PBIR JSON files directly (visuals, pages, themes, filters)

---

## Get Started
@@ -40,8 +44,6 @@ pbi connect # 2. Auto-detects Power BI Desktop and installs ski

That's it. Open Power BI Desktop with a `.pbix` file, run `pbi connect`, and everything is set up automatically. Open Claude Code and start asking.

You can also specify the port manually: `pbi connect -d localhost:54321`

> **Requires:** Windows with Python 3.10+ and Power BI Desktop running.

<details>
@@ -60,12 +62,7 @@ Find the directory:

python -c "import site; print(site.getusersitepackages().replace('site-packages','Scripts'))"
```

- Add the printed path to your system PATH:
- ```cmd
- setx PATH "%PATH%;C:\Users\YourName\AppData\Roaming\Python\PythonXXX\Scripts"
- ```
-
- Then **restart your terminal**. We recommend `pipx` instead to avoid this entirely.
+ Add the printed path to your system PATH, then restart your terminal. We recommend `pipx` to avoid this entirely.

</details>
@@ -73,182 +70,76 @@ Then **restart your terminal**. We recommend `pipx` instead to avoid this entire

## Skills

- After running `pbi connect`, Claude Code discovers **7 Power BI skills**. Each skill teaches Claude a different area of Power BI development. You don't need to memorize commands. Just describe what you want.
+ After running `pbi connect`, Claude Code discovers **12 Power BI skills**. Each skill teaches Claude a different area. You don't need to memorize commands.

- ```
- You: "Set up RLS for regional managers"
-         |
-         v
- Claude Code --> Picks the right skill
-         |
-         +-- Modeling
-         +-- DAX
-         +-- Deployment
-         +-- Security
-         +-- Documentation
-         +-- Diagnostics
-         +-- Partitions
- ```
+ ### Semantic Model (require `pbi connect`)

- ### Modeling
+ | Skill | What you say | What Claude does |
+ |-------|-------------|-----------------|
+ | **DAX** | *"Top 10 products by revenue?"* | Writes and executes DAX queries |
+ | **Modeling** | *"Create a star schema"* | Creates tables, relationships, measures |
+ | **Deployment** | *"Save a snapshot"* | Exports/imports TMDL, diffs snapshots |
+ | **Security** | *"Set up RLS"* | Creates roles, filters, perspectives |
+ | **Docs** | *"Document this model"* | Generates data dictionaries |
+ | **Partitions** | *"Show the M query"* | Manages partitions, expressions |
+ | **Diagnostics** | *"Why is this slow?"* | Traces queries, benchmarks |

- > *"Create a star schema with Sales, Products, and Calendar tables"*
+ ### Report Layer (no connection needed)

- Claude creates the tables, sets up relationships, marks the date table, and adds formatted measures. Covers tables, columns, measures, relationships, hierarchies, and calculation groups.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi table create Sales --mode Import
- pbi table create Products --mode Import
- pbi table create Calendar --mode Import
- pbi relationship create --from-table Sales --from-column ProductKey --to-table Products --to-column ProductKey
- pbi relationship create --from-table Sales --from-column DateKey --to-table Calendar --to-column DateKey
- pbi table mark-date Calendar --date-column Date
- pbi measure create "Total Revenue" -e "SUM(Sales[Revenue])" -t Sales --format-string "$#,##0"
- ```
- </details>
-
- ### DAX
-
- > *"What are the top 10 products by revenue this year?"*
-
- Claude writes and executes DAX queries, validates syntax, and creates measures with time intelligence patterns like YTD, previous year, and rolling averages.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi dax execute "
- EVALUATE
- TOPN(
-   10,
-   ADDCOLUMNS(VALUES(Products[Name]), \"Revenue\", CALCULATE(SUM(Sales[Amount]))),
-   [Revenue], DESC
- )
- "
- ```
- </details>
-
- ### Deployment
-
- > *"Export the model to Git for version control"*
-
- Claude exports your model as TMDL files for version control and imports them back. Handles transactions for safe multi-step changes.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi database export-tmdl ./model/
- # ... you commit to git ...
- pbi database import-tmdl ./model/
- ```
- </details>
-
- ### Security
-
- > *"Set up row-level security so regional managers only see their region"*
-
- Claude creates RLS roles with descriptions, sets up perspectives for different user groups, and exports the model for version control.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi security-role create "Regional Manager" --description "Users see only their region's data"
- pbi perspective create "Executive Dashboard"
- pbi perspective create "Regional Detail"
- pbi database export-tmdl ./model-backup/
- ```
- </details>
-
- ### Documentation
-
- > *"Document everything in this model"*
-
- Claude catalogs every table, measure, column, and relationship. Generates data dictionaries, measure inventories, and can export the full model as TMDL for human-readable reference.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi --json model get
- pbi --json model stats
- pbi --json table list
- pbi --json measure list
- pbi --json relationship list
- pbi database export-tmdl ./model-docs/
- ```
- </details>
-
- ### Diagnostics
-
- > *"Why is this DAX query so slow?"*
-
- Claude traces query execution, clears caches for clean benchmarks, checks model health, and verifies the environment.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi dax clear-cache
- pbi trace start
- pbi dax execute "EVALUATE SUMMARIZECOLUMNS(...)" --timeout 300
- pbi trace stop
- pbi trace export ./trace.json
- ```
- </details>
-
- ### Partitions & Expressions
-
- > *"Set up partitions for incremental refresh on the Sales table"*
-
- Claude manages table partitions, shared M/Power Query expressions, and calendar table configuration.
-
- <details>
- <summary>Example: what Claude runs behind the scenes</summary>
-
- ```bash
- pbi partition list --table Sales
- pbi partition create "Sales_2024" --table Sales --expression "..." --mode Import
- pbi expression create "ServerURL" --expression '"https://api.example.com"'
- pbi calendar mark Calendar --date-column Date
- ```
- </details>
+ | Skill | What you say | What Claude does |
+ |-------|-------------|-----------------|
+ | **Report** | *"Create a new report"* | Scaffolds PBIR reports, validates, previews |
+ | **Visuals** | *"Add a bar chart"* | Adds, binds, bulk-manages 32 visual types |
+ | **Pages** | *"Add a new page"* | Manages pages, bookmarks, drillthrough |
+ | **Themes** | *"Apply brand colours"* | Themes, conditional formatting |
+ | **Filters** | *"Show top 10 only"* | TopN, date, categorical filters |
---

## All Commands

- 22 command groups covering the full Power BI Tabular Object Model. You rarely need these directly when using Claude Code, but they're available for scripting, CI/CD, or manual use.
+ 27 command groups covering both the semantic model and the report layer.

| Category | Commands |
|----------|----------|
| **Queries** | `dax execute`, `dax validate`, `dax clear-cache` |
| **Model** | `table`, `column`, `measure`, `relationship`, `hierarchy`, `calc-group` |
- | **Deploy** | `database export-tmdl`, `database import-tmdl`, `database export-tmsl`, `transaction` |
+ | **Deploy** | `database export-tmdl/import-tmdl/export-tmsl/diff-tmdl`, `transaction` |
| **Security** | `security-role`, `perspective` |
- | **Connect** | `connect`, `disconnect`, `connections list`, `connections last` |
+ | **Connect** | `connect`, `disconnect`, `connections list/last` |
| **Data** | `partition`, `expression`, `calendar`, `advanced culture` |
- | **Diagnostics** | `trace start`, `trace stop`, `trace fetch`, `trace export`, `model stats` |
- | **Tools** | `setup`, `repl`, `skills install`, `skills list` |
+ | **Diagnostics** | `trace start/stop/fetch/export`, `model stats` |
+ | **Report** | `report create/info/validate/preview/reload`, `report add-page/delete-page/get-page` |
+ | **Visuals** | `visual add/get/list/update/delete/bind`, `visual bulk-bind/bulk-update/bulk-delete` |
+ | **Filters** | `filters list/add-categorical/add-topn/add-relative-date/remove/clear` |
+ | **Formatting** | `format get/clear/background-gradient/background-conditional/background-measure` |
+ | **Bookmarks** | `bookmarks list/get/add/delete/set-visibility` |
+ | **Tools** | `setup`, `repl`, `skills install/list/uninstall` |

- Use `--json` for machine-readable output (for scripts and AI agents):
+ Use `--json` for machine-readable output:

```bash
pbi --json measure list
pbi --json dax execute "EVALUATE Sales"
pbi --json visual list --page overview
```

Run `pbi <command> --help` for full options.

---

## 32 Supported Visual Types

**Charts:** bar, line, column, area, ribbon, waterfall, stacked bar, clustered bar, clustered column, scatter, funnel, combo, donut/pie, treemap

**Cards/KPIs:** card, cardVisual (modern), cardNew, multi-row card, KPI, gauge

**Tables:** table, matrix • **Slicers:** slicer, text, list, advanced • **Maps:** Azure Map

**Decorative:** action button, image, shape, textbox, page navigator

---

## REPL Mode

- For interactive work, the REPL keeps a persistent connection alive between commands:
+ For interactive work, the REPL keeps a persistent connection:

```
$ pbi repl
@@ -261,44 +152,7 @@ pbi(localhost-54321)> dax execute "EVALUATE TOPN(5, Sales)"

pbi(localhost-54321)> exit
```

- Tab completion, command history, and a dynamic prompt showing your active connection.
-
- ---
-
- ## How It Works
-
- pbi-cli connects directly to Power BI Desktop's Analysis Services engine via pythonnet and the .NET Tabular Object Model (TOM). No external binaries or MCP servers needed. Everything runs in-process for sub-second command execution.
-
- ```
- +------------------+             +---------------------+          +------------------+
- |  pbi-cli         |             |  Bundled TOM DLLs   |          |  Power BI        |
- |  (Python CLI)    |  pythonnet  |  (.NET in-process)  |  XMLA    |  Desktop         |
- |  Click commands  |------------>|  TOM / ADOMD.NET    |--------->|  msmdsrv.exe     |
- +------------------+             +---------------------+          +------------------+
- ```
-
- **Why a CLI?** When an AI agent uses an MCP server directly, the tool schemas consume ~4,000+ tokens per tool in the context window. A `pbi` command costs ~30 tokens. Same capabilities, 100x less context.
-
- <details>
- <summary><b>Configuration details</b></summary>
-
- All config lives in `~/.pbi-cli/`:
-
- ```
- ~/.pbi-cli/
-   config.json        # Default connection preference
-   connections.json   # Named connections
-   repl_history       # REPL command history
- ```
-
- Bundled DLLs ship inside the Python package (`pbi_cli/dlls/`):
- - Microsoft.AnalysisServices.Tabular.dll
- - Microsoft.AnalysisServices.AdomdClient.dll
- - Microsoft.AnalysisServices.Core.dll
- - Microsoft.AnalysisServices.Tabular.Json.dll
- - Microsoft.AnalysisServices.dll
-
- </details>
+ Tab completion, command history, and a dynamic prompt.

---
@@ -313,7 +167,7 @@ pip install -e ".[dev]"

```bash
ruff check src/ tests/   # Lint
mypy src/                # Type check
- pytest -m "not e2e"      # Run tests
+ pytest -m "not e2e"      # Run tests (488 tests)
```

---
pyproject.toml
@@ -4,15 +4,15 @@ build-backend = "setuptools.build_meta"

[project]
name = "pbi-cli-tool"
- version = "2.2.0"
- description = "CLI for Power BI semantic models - direct .NET connection for token-efficient AI agent usage"
+ version = "3.10.0"
+ description = "CLI for Power BI semantic models and PBIR reports - direct .NET connection for token-efficient AI agent usage"
readme = "README.pypi.md"
license = {text = "MIT"}
requires-python = ">=3.10"
authors = [
    {name = "pbi-cli contributors"},
]
- keywords = ["power-bi", "cli", "semantic-model", "dax", "claude-code", "tom"]
+ keywords = ["power-bi", "cli", "semantic-model", "dax", "claude-code", "tom", "pbir", "report"]
classifiers = [
    "Development Status :: 5 - Production/Stable",
    "Environment :: Console",
@@ -50,6 +50,8 @@ dev = [

    "ruff>=0.4.0",
    "mypy>=1.10",
]
+ reload = ["pywin32>=306"]
+ preview = ["websockets>=12.0"]

[tool.setuptools.packages.find]
where = ["src"]
@@ -57,6 +59,7 @@ where = ["src"]

[tool.setuptools.package-data]
"pbi_cli.skills" = ["**/*.md"]
"pbi_cli.dlls" = ["*.dll"]
+ "pbi_cli.templates" = ["**/*.json"]

[tool.ruff]
target-version = "py310"
@@ -71,6 +74,10 @@ select = ["E", "F", "I", "N", "W", "UP"]

"src/pbi_cli/core/session.py" = ["N806"]
"src/pbi_cli/core/tom_backend.py" = ["N806", "N814"]
"src/pbi_cli/core/dotnet_loader.py" = ["N806", "N814"]
+ # Win32 API constants use UPPER_CASE; PowerShell inline scripts are long
+ "src/pbi_cli/utils/desktop_reload.py" = ["N806", "E501"]
+ # HTML/SVG template strings are inherently long
+ "src/pbi_cli/preview/renderer.py" = ["E501"]
# Mock objects mirror .NET CamelCase API
"tests/conftest.py" = ["N802", "N806"]
@@ -1,3 +1,3 @@

"""pbi-cli: CLI for Power BI semantic models via direct .NET interop."""

- __version__ = "2.2.0"
+ __version__ = "3.10.0"
@@ -5,12 +5,20 @@ from __future__ import annotations

from collections.abc import Callable
from typing import TYPE_CHECKING, Any

import click

from pbi_cli.core.errors import TomError
from pbi_cli.core.output import format_result, print_error

if TYPE_CHECKING:
    from pbi_cli.main import PbiContext

# Statuses that indicate a write operation (triggers Desktop sync)
_WRITE_STATUSES = frozenset({
    "created", "deleted", "updated", "applied", "added",
    "cleared", "bound", "removed", "set",
})


def run_command(
    ctx: PbiContext,
@@ -20,18 +28,82 @@

    """Execute a backend function with standard error handling.

    Calls ``fn(**kwargs)`` and formats the output based on the
-   ``--json`` flag. Returns the result or exits on error.
+   ``--json`` flag.

    If the current Click context has a ``report_path`` key (set by
    report-layer command groups), write operations automatically
    trigger a safe Desktop sync: save Desktop's work, re-apply our
    PBIR changes, and reopen.
    """
    try:
        result = fn(**kwargs)
        format_result(result, ctx.json_output)
-       return result
    except Exception as e:
        print_error(str(e))
        if not ctx.repl_mode:
            raise SystemExit(1)
        raise TomError(fn.__name__, str(e))

    # Auto-sync Desktop for report-layer write operations
    if _is_report_write(result):
        definition_path = kwargs.get("definition_path")
        _try_desktop_sync(definition_path)

    return result


def _is_report_write(result: Any) -> bool:
    """Check if the result indicates a report-layer write."""
    if not isinstance(result, dict):
        return False
    status = result.get("status", "")
    if status not in _WRITE_STATUSES:
        return False

    # Only sync if we're inside a report-layer command group
    click_ctx = click.get_current_context(silent=True)
    if click_ctx is None:
        return False

    # Walk up to the group to find report_path
    parent = click_ctx.parent
    while parent is not None:
        obj = parent.obj
        if isinstance(obj, dict) and "report_path" in obj:
            return True
        parent = parent.parent
    return False


def _try_desktop_sync(definition_path: Any = None) -> None:
    """Attempt Desktop sync, silently ignore failures."""
    try:
        from pbi_cli.utils.desktop_sync import sync_desktop

        # Find report_path hint from the Click context chain
        report_path = None
        click_ctx = click.get_current_context(silent=True)
        parent = click_ctx.parent if click_ctx else None
        while parent is not None:
            obj = parent.obj
            if isinstance(obj, dict) and "report_path" in obj:
                report_path = obj["report_path"]
||||
break
|
||||
parent = parent.parent
|
||||
|
||||
# Convert definition_path to string for sync
|
||||
defn_str = str(definition_path) if definition_path is not None else None
|
||||
|
||||
result = sync_desktop(report_path, definition_path=defn_str)
|
||||
status = result.get("status", "")
|
||||
msg = result.get("message", "")
|
||||
if status == "success":
|
||||
print_error(f" Desktop: {msg}")
|
||||
elif status == "manual":
|
||||
print_error(f" {msg}")
|
||||
except Exception:
|
||||
pass # sync is best-effort, never block the command
|
||||
|
||||
|
||||
def build_definition(
|
||||
required: dict[str, Any],
|
||||
|
|
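The Desktop auto-sync gate above combines two checks: a write-looking result status and a `report_path` key somewhere up the Click context chain. A standalone sketch of that logic, using a hypothetical `FakeContext` stand-in instead of a real `click.Context`:

```python
from dataclasses import dataclass
from typing import Any, Optional

# Mirrors _WRITE_STATUSES from _helpers.py.
_WRITE_STATUSES = frozenset({
    "created", "deleted", "updated", "applied", "added",
    "cleared", "bound", "removed", "set",
})

@dataclass
class FakeContext:
    # Stand-in for click.Context: just an obj payload and a parent link.
    obj: Any = None
    parent: Optional["FakeContext"] = None

def is_report_write(result: Any, ctx: Optional[FakeContext]) -> bool:
    """Same shape as _is_report_write: write status + report_path in the chain."""
    if not isinstance(result, dict) or result.get("status", "") not in _WRITE_STATUSES:
        return False
    parent = ctx.parent if ctx else None
    while parent is not None:
        # The key's presence marks a report-layer group, even if its value is None.
        if isinstance(parent.obj, dict) and "report_path" in parent.obj:
            return True
        parent = parent.parent
    return False

group = FakeContext(obj={"report_path": None})   # e.g. the `bookmarks` group
cmd = FakeContext(parent=group)                  # a subcommand's context
print(is_report_write({"status": "created"}, cmd))  # True -> sync Desktop
print(is_report_write({"status": "listed"}, cmd))   # False -> read-only, no sync
```

Note that the presence check (`"report_path" in obj`) deliberately fires even when `--path` was omitted, since auto-detection still makes it a report-layer command.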
src/pbi_cli/commands/bookmarks.py (new file, 132 lines)
@@ -0,0 +1,132 @@
"""PBIR bookmark management commands."""

from __future__ import annotations

import click

from pbi_cli.commands._helpers import run_command
from pbi_cli.main import PbiContext, pass_context


@click.group()
@click.option(
    "--path",
    "-p",
    default=None,
    help="Path to .Report folder (auto-detected from CWD if omitted).",
)
@click.pass_context
def bookmarks(ctx: click.Context, path: str | None) -> None:
    """Manage report bookmarks."""
    ctx.ensure_object(dict)
    ctx.obj["report_path"] = path


@bookmarks.command(name="list")
@click.pass_context
@pass_context
def list_bookmarks(ctx: PbiContext, click_ctx: click.Context) -> None:
    """List all bookmarks in the report."""
    from pbi_cli.core.bookmark_backend import bookmark_list
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(ctx, bookmark_list, definition_path=definition_path)


@bookmarks.command(name="get")
@click.argument("name")
@click.pass_context
@pass_context
def get_bookmark(ctx: PbiContext, click_ctx: click.Context, name: str) -> None:
    """Get full details for a bookmark by NAME."""
    from pbi_cli.core.bookmark_backend import bookmark_get
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(ctx, bookmark_get, definition_path=definition_path, name=name)


@bookmarks.command(name="add")
@click.option("--display-name", "-d", required=True, help="Human-readable bookmark name.")
@click.option("--page", "-g", required=True, help="Target page name (active section).")
@click.option("--name", "-n", default=None, help="Bookmark ID (auto-generated if omitted).")
@click.pass_context
@pass_context
def add_bookmark(
    ctx: PbiContext,
    click_ctx: click.Context,
    display_name: str,
    page: str,
    name: str | None,
) -> None:
    """Add a new bookmark pointing to a page."""
    from pbi_cli.core.bookmark_backend import bookmark_add
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        bookmark_add,
        definition_path=definition_path,
        display_name=display_name,
        target_page=page,
        name=name,
    )


@bookmarks.command(name="delete")
@click.argument("name")
@click.pass_context
@pass_context
def delete_bookmark(ctx: PbiContext, click_ctx: click.Context, name: str) -> None:
    """Delete a bookmark by NAME."""
    from pbi_cli.core.bookmark_backend import bookmark_delete
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(ctx, bookmark_delete, definition_path=definition_path, name=name)


@bookmarks.command(name="set-visibility")
@click.argument("name")
@click.option("--page", "-g", required=True, help="Page name (folder name).")
@click.option("--visual", "-v", required=True, help="Visual name (folder name).")
@click.option(
    "--hidden/--visible",
    default=True,
    help="Set the visual as hidden (default) or visible in the bookmark.",
)
@click.pass_context
@pass_context
def set_visibility(
    ctx: PbiContext,
    click_ctx: click.Context,
    name: str,
    page: str,
    visual: str,
    hidden: bool,
) -> None:
    """Set a visual hidden or visible inside bookmark NAME.

    NAME is the bookmark identifier (hex folder name).
    Use --hidden to hide the visual, --visible to show it.
    """
    from pbi_cli.core.bookmark_backend import bookmark_set_visibility
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        bookmark_set_visibility,
        definition_path=definition_path,
        name=name,
        page_name=page,
        visual_name=visual,
        hidden=hidden,
    )
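Every subcommand above resolves the `.Report` folder the same way: an explicit `--path` wins, otherwise it is auto-detected from the working directory. A simplified stand-in for that resolution (an assumption — `resolve_report_path` in `pbir_path.py` may search differently):

```python
from pathlib import Path
from typing import Optional

def resolve_report_folder(explicit: Optional[str], cwd: Path) -> Path:
    # Hypothetical sketch: prefer the explicit --path value, otherwise
    # pick the first *.Report directory under the current directory.
    if explicit:
        return Path(explicit)
    candidates = sorted(
        p for p in cwd.iterdir() if p.is_dir() and p.name.endswith(".Report")
    )
    if not candidates:
        raise FileNotFoundError("no .Report folder found; pass --path/-p")
    return candidates[0]
```

Centralising this in `pbir_path.py` keeps the per-command bodies down to the same three lines: read the group's `report_path`, resolve it, call the backend.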
@@ -48,6 +48,24 @@ def export_tmdl(ctx: PbiContext, folder_path: str) -> None:
    run_command(ctx, _export_tmdl, database=session.database, folder_path=folder_path)


@database.command(name="diff-tmdl")
@click.argument("base_folder", type=click.Path(exists=True, file_okay=False))
@click.argument("head_folder", type=click.Path(exists=True, file_okay=False))
@pass_context
def diff_tmdl_cmd(ctx: PbiContext, base_folder: str, head_folder: str) -> None:
    """Compare two TMDL export folders and show what changed.

    Useful for CI/CD to summarise model changes between branches:

        pbi database diff-tmdl ./base-export/ ./head-export/

    No Power BI Desktop connection is required.
    """
    from pbi_cli.core.tmdl_diff import diff_tmdl_folders

    run_command(ctx, diff_tmdl_folders, base_folder=base_folder, head_folder=head_folder)


@database.command(name="export-tmsl")
@pass_context
def export_tmsl(ctx: PbiContext) -> None:
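Per the 3.9.0 changelog entry, `diff_tmdl_folders` strips lineageTag-only changes so Desktop's tag churn doesn't surface as model changes. The core idea reduced to a toy line-level sketch (the real implementation parses whole TMDL folders; the sample lines here are illustrative):

```python
def strip_lineage_tags(lines: list[str]) -> list[str]:
    # Drop lineageTag lines before diffing; tag-only edits then compare equal.
    return [ln for ln in lines if not ln.strip().startswith("lineageTag:")]

base = ["measure 'Total Sales'", "  lineageTag: aaa-old", "  expression: SUM(Sales[Amount])"]
head = ["measure 'Total Sales'", "  lineageTag: bbb-new", "  expression: SUM(Sales[Amount])"]

# Raw lines differ, but after stripping there is no real change to report.
print(strip_lineage_tags(base) == strip_lineage_tags(head))  # True
```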
src/pbi_cli/commands/filters.py (new file, 244 lines)
@@ -0,0 +1,244 @@
"""PBIR filter management commands."""

from __future__ import annotations

import click

from pbi_cli.commands._helpers import run_command
from pbi_cli.main import PbiContext, pass_context


@click.group()
@click.option(
    "--path",
    "-p",
    default=None,
    help="Path to .Report folder (auto-detected from CWD if omitted).",
)
@click.pass_context
def filters(ctx: click.Context, path: str | None) -> None:
    """Manage page and visual filters."""
    ctx.ensure_object(dict)
    ctx.obj["report_path"] = path


@filters.command(name="list")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--visual", default=None, help="Visual name (returns visual filters if given).")
@click.pass_context
@pass_context
def filter_list_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual: str | None,
) -> None:
    """List filters on a page or visual."""
    from pbi_cli.core.filter_backend import filter_list
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_list,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
    )


@filters.command(name="add-categorical")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--table", required=True, help="Table name.")
@click.option("--column", required=True, help="Column name.")
@click.option(
    "--value",
    "values",
    multiple=True,
    required=True,
    help="Value to include (repeat for multiple).",
)
@click.option("--visual", default=None, help="Visual name (adds visual filter if given).")
@click.option("--name", "-n", default=None, help="Filter ID (auto-generated if omitted).")
@click.pass_context
@pass_context
def add_categorical_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    table: str,
    column: str,
    values: tuple[str, ...],
    visual: str | None,
    name: str | None,
) -> None:
    """Add a categorical filter to a page or visual."""
    from pbi_cli.core.filter_backend import filter_add_categorical
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_add_categorical,
        definition_path=definition_path,
        page_name=page,
        table=table,
        column=column,
        values=list(values),
        visual_name=visual,
        name=name,
    )


@filters.command(name="add-topn")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--table", required=True, help="Table containing the filtered column.")
@click.option("--column", required=True, help="Column to filter (e.g. Country).")
@click.option("--n", type=int, required=True, help="Number of items to keep.")
@click.option("--order-by-table", required=True, help="Table containing the ordering column.")
@click.option("--order-by-column", required=True, help="Column to rank by (e.g. Sales).")
@click.option(
    "--direction",
    default="Top",
    show_default=True,
    help="'Top' (highest N) or 'Bottom' (lowest N).",
)
@click.option("--visual", default=None, help="Visual name (adds visual filter if given).")
@click.option("--name", "-n_", default=None, help="Filter ID (auto-generated if omitted).")
@click.pass_context
@pass_context
def add_topn_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    table: str,
    column: str,
    n: int,
    order_by_table: str,
    order_by_column: str,
    direction: str,
    visual: str | None,
    name: str | None,
) -> None:
    """Add a TopN filter (keep top/bottom N rows by a ranking column)."""
    from pbi_cli.core.filter_backend import filter_add_topn
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_add_topn,
        definition_path=definition_path,
        page_name=page,
        table=table,
        column=column,
        n=n,
        order_by_table=order_by_table,
        order_by_column=order_by_column,
        direction=direction,
        visual_name=visual,
        name=name,
    )


@filters.command(name="add-relative-date")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--table", required=True, help="Table containing the date column.")
@click.option("--column", required=True, help="Date column to filter (e.g. Date).")
@click.option("--amount", type=int, required=True, help="Number of periods (e.g. 3).")
@click.option(
    "--unit",
    required=True,
    help="Time unit: days, weeks, months, or years.",
)
@click.option("--visual", default=None, help="Visual name (adds visual filter if given).")
@click.option("--name", "-n", default=None, help="Filter ID (auto-generated if omitted).")
@click.pass_context
@pass_context
def add_relative_date_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    table: str,
    column: str,
    amount: int,
    unit: str,
    visual: str | None,
    name: str | None,
) -> None:
    """Add a RelativeDate filter (e.g. last 3 months)."""
    from pbi_cli.core.filter_backend import filter_add_relative_date
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_add_relative_date,
        definition_path=definition_path,
        page_name=page,
        table=table,
        column=column,
        amount=amount,
        time_unit=unit,
        visual_name=visual,
        name=name,
    )


@filters.command(name="remove")
@click.argument("filter_name")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--visual", default=None, help="Visual name (removes from visual if given).")
@click.pass_context
@pass_context
def remove_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    filter_name: str,
    page: str,
    visual: str | None,
) -> None:
    """Remove a filter by name from a page or visual."""
    from pbi_cli.core.filter_backend import filter_remove
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_remove,
        definition_path=definition_path,
        page_name=page,
        filter_name=filter_name,
        visual_name=visual,
    )


@filters.command(name="clear")
@click.option("--page", required=True, help="Page name (folder name, not display name).")
@click.option("--visual", default=None, help="Visual name (clears visual filters if given).")
@click.pass_context
@pass_context
def clear_cmd(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual: str | None,
) -> None:
    """Remove all filters from a page or visual."""
    from pbi_cli.core.filter_backend import filter_clear
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        filter_clear,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
    )
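Filter, visual, and bookmark identifiers in these commands are 20-character hex folder names (e.g. 5b30ba9c6ce5b695a8df); when `--name` is omitted the backend auto-generates one. A plausible stdlib generator — an assumption, `filter_backend` may produce names differently:

```python
import secrets

def new_pbir_name(length: int = 20) -> str:
    # Hypothetical helper: lowercase hex chars, matching folder names
    # like "5b30ba9c6ce5b695a8df". Not necessarily what pbi-cli uses.
    return secrets.token_hex((length + 1) // 2)[:length]

print(new_pbir_name())  # e.g. a fresh random 20-char hex name
```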
src/pbi_cli/commands/format_cmd.py (new file, 251 lines)
@@ -0,0 +1,251 @@
"""PBIR visual conditional formatting commands."""

from __future__ import annotations

import click

from pbi_cli.commands._helpers import run_command
from pbi_cli.main import PbiContext, pass_context


@click.group(name="format")
@click.option(
    "--report-path",
    default=None,
    help="Path to .Report folder (auto-detected from CWD if omitted).",
)
@click.pass_context
def format_cmd(ctx: click.Context, report_path: str | None) -> None:
    """Manage visual conditional formatting."""
    ctx.ensure_object(dict)
    ctx.obj["report_path"] = report_path


@format_cmd.command(name="get")
@click.argument("visual")
@click.option("--page", "-p", required=True, help="Page name (folder name, not display name).")
@click.pass_context
@pass_context
def format_get(ctx: PbiContext, click_ctx: click.Context, visual: str, page: str) -> None:
    """Show current formatting objects for a visual.

    VISUAL is the visual folder name (e.g. 5b30ba9c6ce5b695a8df).
    """
    from pbi_cli.core.format_backend import format_get as _format_get
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        _format_get,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
    )


@format_cmd.command(name="clear")
@click.argument("visual")
@click.option("--page", "-p", required=True, help="Page name (folder name, not display name).")
@click.pass_context
@pass_context
def format_clear(ctx: PbiContext, click_ctx: click.Context, visual: str, page: str) -> None:
    """Remove all conditional formatting from a visual.

    VISUAL is the visual folder name (e.g. 5b30ba9c6ce5b695a8df).
    """
    from pbi_cli.core.format_backend import format_clear as _format_clear
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        _format_clear,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
    )


@format_cmd.command(name="background-gradient")
@click.argument("visual")
@click.option("--page", "-p", required=True, help="Page name (folder name, not display name).")
@click.option("--input-table", required=True, help="Table name driving the gradient.")
@click.option("--input-column", required=True, help="Column name driving the gradient.")
@click.option(
    "--field",
    "field_query_ref",
    required=True,
    help='queryRef of the target field (e.g. "Sum(financials.Profit)").',
)
@click.option("--min-color", default="minColor", show_default=True, help="Gradient minimum color.")
@click.option("--max-color", default="maxColor", show_default=True, help="Gradient maximum color.")
@click.pass_context
@pass_context
def background_gradient(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual: str,
    page: str,
    input_table: str,
    input_column: str,
    field_query_ref: str,
    min_color: str,
    max_color: str,
) -> None:
    """Apply a linear gradient background color rule to a visual column.

    VISUAL is the visual folder name (e.g. 5b30ba9c6ce5b695a8df).

    Example:

        pbi format background-gradient MyVisual --page overview
            --input-table financials --input-column Profit
            --field "Sum(financials.Profit)"
    """
    from pbi_cli.core.format_backend import format_background_gradient
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        format_background_gradient,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
        input_table=input_table,
        input_column=input_column,
        field_query_ref=field_query_ref,
        min_color=min_color,
        max_color=max_color,
    )


@format_cmd.command(name="background-conditional")
@click.argument("visual")
@click.option("--page", "-p", required=True, help="Page name (folder name, not display name).")
@click.option("--input-table", required=True, help="Table containing the evaluated column.")
@click.option("--input-column", required=True, help="Column whose aggregation is tested.")
@click.option(
    "--threshold",
    type=float,
    required=True,
    help="Numeric threshold value to compare against.",
)
@click.option(
    "--color",
    "color_hex",
    required=True,
    help="Hex color to apply when condition is met (e.g. #12239E).",
)
@click.option(
    "--comparison",
    default="gt",
    show_default=True,
    help="Comparison: eq, neq, gt, gte, lt, lte.",
)
@click.option(
    "--field",
    "field_query_ref",
    default=None,
    help='queryRef of the target field. Defaults to "Sum(table.column)".',
)
@click.pass_context
@pass_context
def background_conditional(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual: str,
    page: str,
    input_table: str,
    input_column: str,
    threshold: float,
    color_hex: str,
    comparison: str,
    field_query_ref: str | None,
) -> None:
    """Apply a rule-based conditional background color to a visual column.

    VISUAL is the visual folder name (e.g. 5b30ba9c6ce5b695a8df).
    Colors the cell when Sum(input_column) satisfies the comparison.

    Example:

        pbi format background-conditional MyVisual --page overview
            --input-table financials --input-column "Units Sold"
            --threshold 100000 --color "#12239E" --comparison gt
    """
    from pbi_cli.core.format_backend import format_background_conditional
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        format_background_conditional,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
        input_table=input_table,
        input_column=input_column,
        threshold=threshold,
        color_hex=color_hex,
        comparison=comparison,
        field_query_ref=field_query_ref,
    )


@format_cmd.command(name="background-measure")
@click.argument("visual")
@click.option("--page", "-p", required=True, help="Page name (folder name, not display name).")
@click.option("--measure-table", required=True, help="Table containing the color measure.")
@click.option(
    "--measure-property", required=True, help="Name of the DAX measure returning hex color."
)
@click.option(
    "--field",
    "field_query_ref",
    required=True,
    help='queryRef of the target field (e.g. "Sum(financials.Sales)").',
)
@click.pass_context
@pass_context
def background_measure(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual: str,
    page: str,
    measure_table: str,
    measure_property: str,
    field_query_ref: str,
) -> None:
    """Apply a DAX measure-driven background color rule to a visual column.

    VISUAL is the visual folder name (e.g. 5b30ba9c6ce5b695a8df).
    The DAX measure must return a valid hex color string.

    Example:

        pbi format background-measure MyVisual --page overview
            --measure-table financials
            --measure-property "Conditional Formatting Sales"
            --field "Sum(financials.Sales)"
    """
    from pbi_cli.core.format_backend import format_background_measure
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        format_background_measure,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual,
        measure_table=measure_table,
        measure_property=measure_property,
        field_query_ref=field_query_ref,
    )
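The `--comparison` keywords accepted by `background-conditional` map naturally onto Python operators. A minimal sketch of that evaluation — illustrative only, since the actual rule is written into the PBIR JSON and evaluated by Power BI, not by pbi-cli:

```python
import operator

# Comparison keywords accepted by `pbi format background-conditional`.
_COMPARISONS = {
    "eq": operator.eq, "neq": operator.ne,
    "gt": operator.gt, "gte": operator.ge,
    "lt": operator.lt, "lte": operator.le,
}

def condition_met(value: float, comparison: str, threshold: float) -> bool:
    """Return True when `value <comparison> threshold` holds."""
    if comparison not in _COMPARISONS:
        raise ValueError(f"comparison must be one of {sorted(_COMPARISONS)}")
    return _COMPARISONS[comparison](value, threshold)

print(condition_met(150000, "gt", 100000))  # True  -> cell gets the rule color
print(condition_met(99999, "gte", 100000))  # False -> cell keeps its default
```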
src/pbi_cli/commands/report.py (new file, 324 lines)
@@ -0,0 +1,324 @@
"""PBIR report management commands."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from pathlib import Path
|
||||
|
||||
import click
|
||||
|
||||
from pbi_cli.commands._helpers import run_command
|
||||
from pbi_cli.main import PbiContext, pass_context
|
||||
|
||||
|
||||
@click.group()
|
||||
@click.option(
|
||||
"--path",
|
||||
"-p",
|
||||
default=None,
|
||||
help="Path to .Report folder (auto-detected from CWD if omitted).",
|
||||
)
|
||||
@click.pass_context
|
||||
def report(ctx: click.Context, path: str | None) -> None:
|
||||
"""Manage Power BI PBIR reports (pages, themes, validation)."""
|
||||
ctx.ensure_object(dict)
|
||||
ctx.obj["report_path"] = path
|
||||
|
||||
|
||||
@report.command()
|
||||
@click.pass_context
|
||||
@pass_context
|
||||
def info(ctx: PbiContext, click_ctx: click.Context) -> None:
|
||||
"""Show report metadata summary."""
|
||||
from pbi_cli.core.pbir_path import resolve_report_path
|
||||
from pbi_cli.core.report_backend import report_info
|
||||
|
||||
report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
|
||||
definition_path = resolve_report_path(report_path)
|
||||
run_command(ctx, report_info, definition_path=definition_path)
|
||||
|
||||
|
||||
@report.command()
|
||||
@click.argument("target_path", type=click.Path())
|
||||
@click.option("--name", "-n", required=True, help="Report name.")
|
||||
@click.option(
|
||||
"--dataset-path",
|
||||
default=None,
|
||||
help="Relative path to semantic model folder (e.g. ../MyModel.Dataset).",
|
||||
)
|
||||
@pass_context
|
||||
def create(
|
||||
ctx: PbiContext, target_path: str, name: str, dataset_path: str | None
|
||||
) -> None:
|
||||
"""Scaffold a new PBIR report project."""
|
||||
from pbi_cli.core.report_backend import report_create
|
||||
|
||||
run_command(
|
||||
ctx,
|
||||
report_create,
|
||||
target_path=Path(target_path),
|
||||
name=name,
|
||||
dataset_path=dataset_path,
|
||||
)
|
||||
|
||||
|
||||
@report.command(name="list-pages")
|
||||
@click.pass_context
|
||||
@pass_context
|
||||
def list_pages(ctx: PbiContext, click_ctx: click.Context) -> None:
|
||||
"""List all pages in the report."""
|
||||
from pbi_cli.core.pbir_path import resolve_report_path
|
||||
from pbi_cli.core.report_backend import page_list
|
||||
|
||||
report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
|
||||
definition_path = resolve_report_path(report_path)
|
||||
run_command(ctx, page_list, definition_path=definition_path)
|
||||
|
||||
|
||||
@report.command(name="add-page")
|
||||
@click.option("--display-name", "-d", required=True, help="Page display name.")
|
||||
@click.option("--name", "-n", default=None, help="Page ID (auto-generated if omitted).")
|
||||
@click.option("--width", type=int, default=1280, help="Page width in pixels.")
|
||||
@click.option("--height", type=int, default=720, help="Page height in pixels.")
|
||||
@click.pass_context
|
||||
@pass_context
|
||||
def add_page(
|
||||
ctx: PbiContext,
|
||||
click_ctx: click.Context,
|
||||
display_name: str,
|
||||
name: str | None,
|
||||
width: int,
|
||||
height: int,
|
||||
) -> None:
|
||||
"""Add a new page to the report."""
|
||||
from pbi_cli.core.pbir_path import resolve_report_path
|
||||
from pbi_cli.core.report_backend import page_add
|
||||
|
||||
report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
|
||||
definition_path = resolve_report_path(report_path)
|
||||
run_command(
|
||||
ctx,
|
||||
page_add,
|
||||
definition_path=definition_path,
|
||||
display_name=display_name,
|
||||
name=name,
|
||||
width=width,
|
||||
height=height,
|
||||
)
|
||||
|
||||
|
||||
@report.command(name="delete-page")
|
||||
@click.argument("name")
|
||||
@click.pass_context
|
||||
@pass_context
|
||||
def delete_page(ctx: PbiContext, click_ctx: click.Context, name: str) -> None:
|
||||
"""Delete a page and all its visuals."""
|
||||
from pbi_cli.core.pbir_path import resolve_report_path
|
||||
from pbi_cli.core.report_backend import page_delete
|
||||
|
||||
report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
|
||||
definition_path = resolve_report_path(report_path)
|
||||
run_command(ctx, page_delete, definition_path=definition_path, page_name=name)
|
||||
|
||||
|
||||
@report.command(name="get-page")
|
||||
@click.argument("name")
|
||||
@click.pass_context
|
||||
@pass_context
|
||||
def get_page(ctx: PbiContext, click_ctx: click.Context, name: str) -> None:
|
||||
"""Get details of a specific page."""
|
||||
from pbi_cli.core.pbir_path import resolve_report_path
|
||||
from pbi_cli.core.report_backend import page_get
|
||||
|
||||
report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
|
||||
definition_path = resolve_report_path(report_path)
|
||||
run_command(ctx, page_get, definition_path=definition_path, page_name=name)
|
||||
|
||||
|
||||
@report.command(name="set-theme")
@click.option("--file", "-f", required=True, type=click.Path(exists=True), help="Theme JSON file.")
@click.pass_context
@pass_context
def set_theme(ctx: PbiContext, click_ctx: click.Context, file: str) -> None:
    """Apply a custom theme to the report."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.report_backend import theme_set

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        theme_set,
        definition_path=definition_path,
        theme_path=Path(file),
    )


@report.command(name="get-theme")
@click.pass_context
@pass_context
def get_theme(ctx: PbiContext, click_ctx: click.Context) -> None:
    """Show the current theme (base and custom) applied to the report."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.report_backend import theme_get

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(ctx, theme_get, definition_path=definition_path)


@report.command(name="diff-theme")
@click.option(
    "--file", "-f", required=True, type=click.Path(exists=True),
    help="Proposed theme JSON file.",
)
@click.pass_context
@pass_context
def diff_theme(ctx: PbiContext, click_ctx: click.Context, file: str) -> None:
    """Compare a proposed theme JSON against the currently applied theme.

    Shows which theme keys would be added, removed, or changed.
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.report_backend import theme_diff

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        theme_diff,
        definition_path=definition_path,
        theme_path=Path(file),
    )


@report.command(name="set-background")
@click.argument("page_name")
@click.option("--color", "-c", required=True, help="Hex color, e.g. '#F8F9FA'.")
@click.pass_context
@pass_context
def set_background(
    ctx: PbiContext, click_ctx: click.Context, page_name: str, color: str
) -> None:
    """Set the background color of a page."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.report_backend import page_set_background

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        page_set_background,
        definition_path=definition_path,
        page_name=page_name,
        color=color,
    )


@report.command(name="set-visibility")
@click.argument("page_name")
@click.option(
    "--hidden/--visible",
    default=True,
    help="Hide or show the page in navigation.",
)
@click.pass_context
@pass_context
def set_visibility(
    ctx: PbiContext, click_ctx: click.Context, page_name: str, hidden: bool
) -> None:
    """Hide or show a page in the report navigation."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.report_backend import page_set_visibility

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        page_set_visibility,
        definition_path=definition_path,
        page_name=page_name,
        hidden=hidden,
    )


@report.command()
@click.option("--full", is_flag=True, default=False, help="Run enhanced validation with warnings.")
@click.pass_context
@pass_context
def validate(ctx: PbiContext, click_ctx: click.Context, full: bool) -> None:
    """Validate the PBIR report structure and JSON files."""
    from pbi_cli.core.pbir_path import resolve_report_path

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)

    if full:
        from pbi_cli.core.pbir_validators import validate_report_full

        run_command(ctx, validate_report_full, definition_path=definition_path)
    else:
        from pbi_cli.core.report_backend import report_validate

        run_command(ctx, report_validate, definition_path=definition_path)


@report.command()
@pass_context
def reload(ctx: PbiContext) -> None:
    """Trigger Power BI Desktop to reload the current report.

    Sends Ctrl+Shift+F5 to Power BI Desktop. Tries pywin32 first,
    falls back to PowerShell, then prints manual instructions.

    Install pywin32 for best results: pip install pbi-cli-tool[reload]
    """
    from pbi_cli.utils.desktop_reload import reload_desktop

    run_command(ctx, reload_desktop)


@report.command()
@click.option("--port", type=int, default=8080, help="HTTP server port (WebSocket uses port+1).")
@click.pass_context
@pass_context
def preview(ctx: PbiContext, click_ctx: click.Context, port: int) -> None:
    """Start a live preview server for the PBIR report.

    Opens an HTTP server that renders the report as HTML/SVG.
    Auto-reloads in the browser when PBIR files change.

    Install websockets for this feature: pip install pbi-cli-tool[preview]
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.preview.server import start_preview_server

    report_path = click_ctx.parent.obj.get("report_path") if click_ctx.parent else None
    definition_path = resolve_report_path(report_path)
    run_command(
        ctx,
        start_preview_server,
        definition_path=definition_path,
        port=port,
    )


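The `preview` command's auto-reload depends on detecting PBIR file changes on disk. A minimal sketch of mtime-snapshot change detection — a standalone illustration, not the `pbi_cli.preview.server` implementation; the `snapshot`/`changed` helpers are assumed names:

```python
from pathlib import Path


def snapshot(root: Path) -> dict[str, float]:
    """Map every JSON file under root to its last-modified time."""
    return {str(p): p.stat().st_mtime for p in root.rglob("*.json")}


def changed(before: dict[str, float], after: dict[str, float]) -> bool:
    """True when any file was added, removed, or rewritten since `before`."""
    return before != after
```

A polling loop would take a snapshot once per interval and, when `changed` returns True, push a reload message to connected WebSocket clients.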
@report.command()
@click.argument("source_path", type=click.Path(exists=True))
@click.option("--output", "-o", default=None, type=click.Path(), help="Output directory.")
@click.option("--force", is_flag=True, default=False, help="Overwrite existing .pbip file.")
@pass_context
def convert(ctx: PbiContext, source_path: str, output: str | None, force: bool) -> None:
    """Convert a .Report folder into a distributable .pbip project.

    Creates the .pbip project file and .gitignore for version control.
    Note: does NOT convert .pbix to .pbip (use Power BI Desktop for that).
    """
    from pbi_cli.core.report_backend import report_convert

    run_command(
        ctx,
        report_convert,
        source_path=Path(source_path),
        output_path=Path(output) if output else None,
        force=force,
    )
652 src/pbi_cli/commands/visual.py Normal file

@@ -0,0 +1,652 @@
"""PBIR visual CRUD commands."""

from __future__ import annotations

import click

from pbi_cli.commands._helpers import run_command
from pbi_cli.main import PbiContext, pass_context


@click.group()
@click.option(
    "--path",
    "-p",
    default=None,
    help="Path to .Report folder (auto-detected from CWD if omitted).",
)
@click.pass_context
def visual(ctx: click.Context, path: str | None) -> None:
    """Manage visuals in PBIR report pages."""
    ctx.ensure_object(dict)
    ctx.obj["report_path"] = path


def _get_report_path(click_ctx: click.Context) -> str | None:
    """Extract report_path from the parent context."""
    if click_ctx.parent:
        return click_ctx.parent.obj.get("report_path")
    return None


@visual.command(name="list")
@click.option("--page", required=True, help="Page name/ID.")
@click.pass_context
@pass_context
def visual_list(ctx: PbiContext, click_ctx: click.Context, page: str) -> None:
    """List all visuals on a page."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_list as _visual_list

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(ctx, _visual_list, definition_path=definition_path, page_name=page)


@visual.command()
@click.argument("name")
@click.option("--page", required=True, help="Page name/ID.")
@click.pass_context
@pass_context
def get(ctx: PbiContext, click_ctx: click.Context, name: str, page: str) -> None:
    """Get detailed information about a visual."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_get

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_get,
        definition_path=definition_path,
        page_name=page,
        visual_name=name,
    )


@visual.command()
@click.option("--page", required=True, help="Page name/ID.")
@click.option(
    "--type",
    "visual_type",
    required=True,
    help="Visual type (bar_chart, line_chart, card, table, matrix).",
)
@click.option("--name", "-n", default=None, help="Visual name (auto-generated if omitted).")
@click.option("--x", type=float, default=None, help="X position on canvas.")
@click.option("--y", type=float, default=None, help="Y position on canvas.")
@click.option("--width", type=float, default=None, help="Visual width in pixels.")
@click.option("--height", type=float, default=None, help="Visual height in pixels.")
@click.pass_context
@pass_context
def add(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual_type: str,
    name: str | None,
    x: float | None,
    y: float | None,
    width: float | None,
    height: float | None,
) -> None:
    """Add a new visual to a page."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_add

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_add,
        definition_path=definition_path,
        page_name=page,
        visual_type=visual_type,
        name=name,
        x=x,
        y=y,
        width=width,
        height=height,
    )


@visual.command()
@click.argument("name")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--x", type=float, default=None, help="New X position.")
@click.option("--y", type=float, default=None, help="New Y position.")
@click.option("--width", type=float, default=None, help="New width.")
@click.option("--height", type=float, default=None, help="New height.")
@click.option("--hidden/--visible", default=None, help="Toggle visibility.")
@click.pass_context
@pass_context
def update(
    ctx: PbiContext,
    click_ctx: click.Context,
    name: str,
    page: str,
    x: float | None,
    y: float | None,
    width: float | None,
    height: float | None,
    hidden: bool | None,
) -> None:
    """Update visual position, size, or visibility."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_update

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_update,
        definition_path=definition_path,
        page_name=page,
        visual_name=name,
        x=x,
        y=y,
        width=width,
        height=height,
        hidden=hidden,
    )


@visual.command()
@click.argument("name")
@click.option("--page", required=True, help="Page name/ID.")
@click.pass_context
@pass_context
def delete(ctx: PbiContext, click_ctx: click.Context, name: str, page: str) -> None:
    """Delete a visual from a page."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_delete

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_delete,
        definition_path=definition_path,
        page_name=page,
        visual_name=name,
    )


@visual.command()
@click.argument("name")
@click.option("--page", required=True, help="Page name/ID.")
@click.option(
    "--category",
    multiple=True,
    help="Category/axis column: bar, line, donut charts. Table[Column] format.",
)
@click.option(
    "--value",
    multiple=True,
    help="Value/measure: all chart types. Treated as measure. Table[Measure] format.",
)
@click.option(
    "--row",
    multiple=True,
    help="Row grouping column: matrix only. Table[Column] format.",
)
@click.option(
    "--field",
    multiple=True,
    help="Data field: card, slicer. Treated as measure for cards. Table[Field] format.",
)
@click.option(
    "--legend",
    multiple=True,
    help="Legend/series column: bar, line, donut charts. Table[Column] format.",
)
@click.option(
    "--indicator",
    multiple=True,
    help="KPI indicator measure. Table[Measure] format.",
)
@click.option(
    "--goal",
    multiple=True,
    help="KPI goal measure. Table[Measure] format.",
)
@click.pass_context
@pass_context
def bind(
    ctx: PbiContext,
    click_ctx: click.Context,
    name: str,
    page: str,
    category: tuple[str, ...],
    value: tuple[str, ...],
    row: tuple[str, ...],
    field: tuple[str, ...],
    legend: tuple[str, ...],
    indicator: tuple[str, ...],
    goal: tuple[str, ...],
) -> None:
    """Bind semantic model fields to a visual's data roles.

    Examples:

        pbi visual bind mychart --page p1 --category "Geo[Region]" --value "Sales[Amount]"

        pbi visual bind mycard --page p1 --field "Sales[Total Revenue]"

        pbi visual bind mymatrix --page p1 --row "Product[Category]" --value "Sales[Qty]"

        pbi visual bind mykpi --page p1 --indicator "Sales[Revenue]" --goal "Sales[Target]"
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_bind

    bindings: list[dict[str, str]] = []
    for f in category:
        bindings.append({"role": "category", "field": f})
    for f in value:
        bindings.append({"role": "value", "field": f})
    for f in row:
        bindings.append({"role": "row", "field": f})
    for f in field:
        bindings.append({"role": "field", "field": f})
    for f in legend:
        bindings.append({"role": "legend", "field": f})
    for f in indicator:
        bindings.append({"role": "indicator", "field": f})
    for f in goal:
        bindings.append({"role": "goal", "field": f})

    if not bindings:
        raise click.UsageError(
            "At least one binding required "
            "(--category, --value, --row, --field, --legend, --indicator, or --goal)."
        )

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_bind,
        definition_path=definition_path,
        page_name=page,
        visual_name=name,
        bindings=bindings,
    )


# ---------------------------------------------------------------------------
# v3.1.0 Bulk operations
# ---------------------------------------------------------------------------


@visual.command()
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--type", "visual_type", default=None, help="Filter by PBIR visual type or alias.")
@click.option("--name-pattern", default=None, help="fnmatch glob on visual name (e.g. 'Chart_*').")
@click.option("--x-min", type=float, default=None, help="Minimum x position.")
@click.option("--x-max", type=float, default=None, help="Maximum x position.")
@click.option("--y-min", type=float, default=None, help="Minimum y position.")
@click.option("--y-max", type=float, default=None, help="Maximum y position.")
@click.pass_context
@pass_context
def where(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual_type: str | None,
    name_pattern: str | None,
    x_min: float | None,
    x_max: float | None,
    y_min: float | None,
    y_max: float | None,
) -> None:
    """Filter visuals by type and/or position bounds.

    Examples:

        pbi visual where --page overview --type barChart

        pbi visual where --page overview --x-max 640

        pbi visual where --page overview --type kpi --name-pattern "KPI_*"
    """
    from pbi_cli.core.bulk_backend import visual_where
    from pbi_cli.core.pbir_path import resolve_report_path

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_where,
        definition_path=definition_path,
        page_name=page,
        visual_type=visual_type,
        name_pattern=name_pattern,
        x_min=x_min,
        x_max=x_max,
        y_min=y_min,
        y_max=y_max,
    )


@visual.command(name="bulk-bind")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--type", "visual_type", required=True, help="Target PBIR visual type or alias.")
@click.option("--name-pattern", default=None, help="Restrict to visuals matching fnmatch pattern.")
@click.option("--category", multiple=True, help="Category/axis. Table[Column].")
@click.option("--value", multiple=True, help="Value/measure: all chart types. Table[Measure].")
@click.option("--row", multiple=True, help="Row grouping: matrix only. Table[Column].")
@click.option("--field", multiple=True, help="Data field: card, slicer. Table[Field].")
@click.option("--legend", multiple=True, help="Legend/series. Table[Column].")
@click.option("--indicator", multiple=True, help="KPI indicator measure. Table[Measure].")
@click.option("--goal", multiple=True, help="KPI goal measure. Table[Measure].")
@click.option("--column", "col_value", multiple=True, help="Combo column Y. Table[Measure].")
@click.option("--line", multiple=True, help="Line Y axis for combo chart. Table[Measure].")
@click.option("--x", "x_field", multiple=True, help="X axis for scatter chart. Table[Measure].")
@click.option("--y", "y_field", multiple=True, help="Y axis for scatter chart. Table[Measure].")
@click.pass_context
@pass_context
def bulk_bind(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual_type: str,
    name_pattern: str | None,
    category: tuple[str, ...],
    value: tuple[str, ...],
    row: tuple[str, ...],
    field: tuple[str, ...],
    legend: tuple[str, ...],
    indicator: tuple[str, ...],
    goal: tuple[str, ...],
    col_value: tuple[str, ...],
    line: tuple[str, ...],
    x_field: tuple[str, ...],
    y_field: tuple[str, ...],
) -> None:
    """Bind fields to ALL visuals of a given type on a page.

    Examples:

        pbi visual bulk-bind --page overview --type barChart \\
            --category "Date[Month]" --value "Sales[Revenue]"

        pbi visual bulk-bind --page overview --type kpi \\
            --indicator "Sales[Revenue]" --goal "Sales[Target]"

        pbi visual bulk-bind --page overview --type lineStackedColumnComboChart \\
            --column "Sales[Revenue]" --line "Sales[Margin]"
    """
    from pbi_cli.core.bulk_backend import visual_bulk_bind
    from pbi_cli.core.pbir_path import resolve_report_path

    bindings: list[dict[str, str]] = []
    for f in category:
        bindings.append({"role": "category", "field": f})
    for f in value:
        bindings.append({"role": "value", "field": f})
    for f in row:
        bindings.append({"role": "row", "field": f})
    for f in field:
        bindings.append({"role": "field", "field": f})
    for f in legend:
        bindings.append({"role": "legend", "field": f})
    for f in indicator:
        bindings.append({"role": "indicator", "field": f})
    for f in goal:
        bindings.append({"role": "goal", "field": f})
    for f in col_value:
        bindings.append({"role": "column", "field": f})
    for f in line:
        bindings.append({"role": "line", "field": f})
    for f in x_field:
        bindings.append({"role": "x", "field": f})
    for f in y_field:
        bindings.append({"role": "y", "field": f})

    if not bindings:
        raise click.UsageError("At least one binding role required.")

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_bulk_bind,
        definition_path=definition_path,
        page_name=page,
        visual_type=visual_type,
        bindings=bindings,
        name_pattern=name_pattern,
    )


@visual.command(name="bulk-update")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--type", "visual_type", default=None, help="Filter by visual type or alias.")
@click.option("--name-pattern", default=None, help="fnmatch filter on visual name.")
@click.option("--width", type=float, default=None, help="Set width for all matching visuals.")
@click.option("--height", type=float, default=None, help="Set height for all matching visuals.")
@click.option("--x", "set_x", type=float, default=None, help="Set x position.")
@click.option("--y", "set_y", type=float, default=None, help="Set y position.")
@click.option("--hidden/--visible", default=None, help="Show or hide all matching visuals.")
@click.pass_context
@pass_context
def bulk_update(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual_type: str | None,
    name_pattern: str | None,
    width: float | None,
    height: float | None,
    set_x: float | None,
    set_y: float | None,
    hidden: bool | None,
) -> None:
    """Update dimensions or visibility for ALL visuals matching the filter.

    Examples:

        pbi visual bulk-update --page overview --type kpi --height 200 --width 300

        pbi visual bulk-update --page overview --name-pattern "Temp_*" --hidden
    """
    from pbi_cli.core.bulk_backend import visual_bulk_update
    from pbi_cli.core.pbir_path import resolve_report_path

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_bulk_update,
        definition_path=definition_path,
        page_name=page,
        where_type=visual_type,
        where_name_pattern=name_pattern,
        set_hidden=hidden,
        set_width=width,
        set_height=height,
        set_x=set_x,
        set_y=set_y,
    )


@visual.command(name="bulk-delete")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--type", "visual_type", default=None, help="Filter by visual type or alias.")
@click.option("--name-pattern", default=None, help="fnmatch filter on visual name.")
@click.pass_context
@pass_context
def bulk_delete(
    ctx: PbiContext,
    click_ctx: click.Context,
    page: str,
    visual_type: str | None,
    name_pattern: str | None,
) -> None:
    """Delete ALL visuals matching the filter (requires --type or --name-pattern).

    Examples:

        pbi visual bulk-delete --page overview --type barChart

        pbi visual bulk-delete --page overview --name-pattern "Draft_*"
    """
    from pbi_cli.core.bulk_backend import visual_bulk_delete
    from pbi_cli.core.pbir_path import resolve_report_path

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_bulk_delete,
        definition_path=definition_path,
        page_name=page,
        where_type=visual_type,
        where_name_pattern=name_pattern,
    )


# ---------------------------------------------------------------------------
# v3.2.0 Visual Calculations (Phase 7)
# ---------------------------------------------------------------------------


@visual.command(name="calc-add")
@click.argument("visual_name")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--name", "calc_name", required=True, help="Display name for the calculation.")
@click.option("--expression", required=True, help="DAX expression for the calculation.")
@click.option("--role", default="Y", show_default=True, help="Target data role (e.g. Y, Values).")
@click.pass_context
@pass_context
def calc_add(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual_name: str,
    page: str,
    calc_name: str,
    expression: str,
    role: str,
) -> None:
    """Add a visual calculation to a data role's projections.

    Examples:

        pbi visual calc-add MyChart --page overview --name "Running sum" \\
            --expression "RUNNINGSUM([Sum of Sales])"

        pbi visual calc-add MyChart --page overview --name "Rank" \\
            --expression "RANK()" --role Y
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_calc_add

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_calc_add,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual_name,
        calc_name=calc_name,
        expression=expression,
        role=role,
    )


@visual.command(name="calc-list")
@click.argument("visual_name")
@click.option("--page", required=True, help="Page name/ID.")
@click.pass_context
@pass_context
def calc_list(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual_name: str,
    page: str,
) -> None:
    """List all visual calculations on a visual across all roles.

    Examples:

        pbi visual calc-list MyChart --page overview
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_calc_list

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_calc_list,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual_name,
    )


@visual.command(name="set-container")
@click.argument("name")
@click.option("--page", required=True, help="Page name/ID.")
@click.option(
    "--border-show",
    type=bool,
    default=None,
    help="Show (true) or hide (false) the visual border.",
)
@click.option(
    "--background-show",
    type=bool,
    default=None,
    help="Show (true) or hide (false) the visual background.",
)
@click.option("--title", default=None, help="Set container title text.")
@click.pass_context
@pass_context
def set_container(
    ctx: PbiContext,
    click_ctx: click.Context,
    name: str,
    page: str,
    border_show: bool | None,
    background_show: bool | None,
    title: str | None,
) -> None:
    """Set container-level border, background, or title on a visual."""
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_set_container

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_set_container,
        definition_path=definition_path,
        page_name=page,
        visual_name=name,
        border_show=border_show,
        background_show=background_show,
        title=title,
    )


@visual.command(name="calc-delete")
@click.argument("visual_name")
@click.option("--page", required=True, help="Page name/ID.")
@click.option("--name", "calc_name", required=True, help="Name of the calculation to delete.")
@click.pass_context
@pass_context
def calc_delete(
    ctx: PbiContext,
    click_ctx: click.Context,
    visual_name: str,
    page: str,
    calc_name: str,
) -> None:
    """Delete a visual calculation by name.

    Examples:

        pbi visual calc-delete MyChart --page overview --name "Running sum"
    """
    from pbi_cli.core.pbir_path import resolve_report_path
    from pbi_cli.core.visual_backend import visual_calc_delete

    definition_path = resolve_report_path(_get_report_path(click_ctx))
    run_command(
        ctx,
        visual_calc_delete,
        definition_path=definition_path,
        page_name=page,
        visual_name=visual_name,
        calc_name=calc_name,
    )
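Both `bind` and `bulk-bind` reduce their repeated role options to one flat bindings list before calling the backend. The collapse can be sketched as a single helper — a standalone illustration with an assumed name, not a function in pbi_cli:

```python
def collapse_roles(**roles: tuple[str, ...]) -> list[dict[str, str]]:
    """Flatten {role: (field, ...)} into [{'role': ..., 'field': ...}],
    preserving per-role field order so the backend sees fields as given."""
    return [
        {"role": role, "field": field}
        for role, fields in roles.items()
        for field in fields
    ]
```

With this shape, a command only has to pass its option tuples by keyword, and an empty result signals the "at least one binding required" usage error.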
246 src/pbi_cli/core/bookmark_backend.py Normal file

@@ -0,0 +1,246 @@
"""Pure-function backend for PBIR bookmark operations.

Mirrors ``report_backend.py`` but operates on the bookmarks subfolder.
Every function takes a ``Path`` to the definition folder and returns a plain
Python dict suitable for ``format_result()``.
"""

from __future__ import annotations

import json
import secrets
from pathlib import Path
from typing import Any

from pbi_cli.core.errors import PbiCliError

# ---------------------------------------------------------------------------
# Schema constants
# ---------------------------------------------------------------------------

SCHEMA_BOOKMARKS_METADATA = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/bookmarksMetadata/1.0.0/schema.json"
)
SCHEMA_BOOKMARK = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/bookmark/2.1.0/schema.json"
)

# ---------------------------------------------------------------------------
# JSON helpers
# ---------------------------------------------------------------------------


def _read_json(path: Path) -> dict[str, Any]:
    """Read and parse a JSON file."""
    return json.loads(path.read_text(encoding="utf-8"))


def _write_json(path: Path, data: dict[str, Any]) -> None:
    """Write JSON with consistent formatting."""
    path.write_text(
        json.dumps(data, indent=2, ensure_ascii=False) + "\n",
        encoding="utf-8",
    )


def _generate_name() -> str:
    """Generate a 20-character hex identifier matching PBIR convention."""
    return secrets.token_hex(10)


# ---------------------------------------------------------------------------
# Path helpers
# ---------------------------------------------------------------------------


def _bookmarks_dir(definition_path: Path) -> Path:
    return definition_path / "bookmarks"


def _index_path(definition_path: Path) -> Path:
    return _bookmarks_dir(definition_path) / "bookmarks.json"


def _bookmark_path(definition_path: Path, name: str) -> Path:
    return _bookmarks_dir(definition_path) / f"{name}.bookmark.json"


# ---------------------------------------------------------------------------
# Public API
# ---------------------------------------------------------------------------


def bookmark_list(definition_path: Path) -> list[dict[str, Any]]:
|
||||
"""List all bookmarks.
|
||||
|
||||
Returns a list of ``{name, display_name, active_section}`` dicts.
|
||||
Returns ``[]`` if the bookmarks folder or index does not exist.
|
||||
"""
|
||||
index_file = _index_path(definition_path)
|
||||
if not index_file.exists():
|
||||
return []
|
||||
|
||||
index = _read_json(index_file)
|
||||
items: list[dict[str, Any]] = index.get("items", [])
|
||||
|
||||
results: list[dict[str, Any]] = []
|
||||
for item in items:
|
||||
name = item.get("name", "")
|
||||
bm_file = _bookmark_path(definition_path, name)
|
||||
if not bm_file.exists():
|
||||
continue
|
||||
bm = _read_json(bm_file)
|
||||
exploration = bm.get("explorationState", {})
|
||||
results.append({
|
||||
"name": name,
|
||||
"display_name": bm.get("displayName", ""),
|
||||
"active_section": exploration.get("activeSection"),
|
||||
})
|
||||
|
||||
return results
|
||||
|
||||
|
||||
def bookmark_get(definition_path: Path, name: str) -> dict[str, Any]:
|
||||
"""Get the full data for a single bookmark by name.
|
||||
|
||||
Raises ``PbiCliError`` if the bookmark does not exist.
|
||||
"""
|
||||
bm_file = _bookmark_path(definition_path, name)
|
||||
if not bm_file.exists():
|
||||
raise PbiCliError(f"Bookmark '{name}' not found.")
|
||||
return _read_json(bm_file)
|
||||
|
||||
|
||||
def bookmark_add(
|
||||
definition_path: Path,
|
||||
display_name: str,
|
||||
target_page: str,
|
||||
name: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Create a new bookmark pointing to *target_page*.
|
||||
|
||||
Creates the ``bookmarks/`` directory and ``bookmarks.json`` index if they
|
||||
do not already exist. Returns a status dict with the created bookmark info.
|
||||
"""
|
||||
bm_name = name if name is not None else _generate_name()
|
||||
|
||||
bm_dir = _bookmarks_dir(definition_path)
|
||||
bm_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
index_file = _index_path(definition_path)
|
||||
if index_file.exists():
|
||||
index = _read_json(index_file)
|
||||
else:
|
||||
index = {"$schema": SCHEMA_BOOKMARKS_METADATA, "items": []}
|
||||
|
||||
index["items"] = list(index.get("items", []))
|
||||
index["items"].append({"name": bm_name})
|
||||
_write_json(index_file, index)
|
||||
|
||||
bookmark_data: dict[str, Any] = {
|
||||
"$schema": SCHEMA_BOOKMARK,
|
||||
"displayName": display_name,
|
||||
"name": bm_name,
|
||||
"options": {"targetVisualNames": []},
|
||||
"explorationState": {
|
||||
"version": "1.3",
|
||||
"activeSection": target_page,
|
||||
},
|
||||
}
|
||||
_write_json(_bookmark_path(definition_path, bm_name), bookmark_data)
|
||||
|
||||
return {
|
||||
"status": "created",
|
||||
"name": bm_name,
|
||||
"display_name": display_name,
|
||||
"target_page": target_page,
|
||||
}
|
||||
|
||||
|
||||
def bookmark_delete(
|
||||
definition_path: Path,
|
||||
name: str,
|
||||
) -> dict[str, Any]:
|
||||
"""Delete a bookmark by name.
|
||||
|
||||
Removes the ``.bookmark.json`` file and its entry in ``bookmarks.json``.
|
||||
Raises ``PbiCliError`` if the bookmark is not found.
|
||||
"""
|
||||
index_file = _index_path(definition_path)
|
||||
if not index_file.exists():
|
||||
raise PbiCliError(f"Bookmark '{name}' not found.")
|
||||
|
||||
index = _read_json(index_file)
|
||||
items: list[dict[str, Any]] = index.get("items", [])
|
||||
existing_names = [i.get("name") for i in items]
|
||||
|
||||
if name not in existing_names:
|
||||
raise PbiCliError(f"Bookmark '{name}' not found.")
|
||||
|
||||
bm_file = _bookmark_path(definition_path, name)
|
||||
if bm_file.exists():
|
||||
bm_file.unlink()
|
||||
|
||||
updated_items = [i for i in items if i.get("name") != name]
|
||||
updated_index = {**index, "items": updated_items}
|
||||
_write_json(index_file, updated_index)
|
||||
|
||||
return {"status": "deleted", "name": name}
|
||||
|
||||
|
||||
def bookmark_set_visibility(
|
||||
definition_path: Path,
|
||||
name: str,
|
||||
page_name: str,
|
||||
visual_name: str,
|
||||
hidden: bool,
|
||||
) -> dict[str, Any]:
|
||||
"""Set a visual's hidden/visible state inside a bookmark's explorationState.
|
||||
|
||||
When *hidden* is ``True``, sets ``singleVisual.display.mode = "hidden"``.
|
||||
When *hidden* is ``False``, removes the ``display`` key from ``singleVisual``
|
||||
(presence of ``display`` is what hides the visual in Power BI Desktop).
|
||||
|
||||
Creates the ``explorationState.sections.{page_name}.visualContainers.{visual_name}``
|
||||
path if it does not already exist in the bookmark.
|
||||
|
||||
Raises ``PbiCliError`` if the bookmark does not exist.
|
||||
Returns a status dict with name, page, visual, and the new visibility state.
|
||||
"""
|
||||
bm_file = _bookmark_path(definition_path, name)
|
||||
if not bm_file.exists():
|
||||
raise PbiCliError(f"Bookmark '{name}' not found.")
|
||||
|
||||
bm = _read_json(bm_file)
|
||||
|
||||
# Navigate / build the explorationState path immutably.
|
||||
exploration = dict(bm.get("explorationState") or {})
|
||||
sections = dict(exploration.get("sections") or {})
|
||||
page_section = dict(sections.get(page_name) or {})
|
||||
visual_containers = dict(page_section.get("visualContainers") or {})
|
||||
container = dict(visual_containers.get(visual_name) or {})
|
||||
single_visual = dict(container.get("singleVisual") or {})
|
||||
|
||||
if hidden:
|
||||
single_visual = {**single_visual, "display": {"mode": "hidden"}}
|
||||
else:
|
||||
single_visual = {k: v for k, v in single_visual.items() if k != "display"}
|
||||
|
||||
new_container = {**container, "singleVisual": single_visual}
|
||||
new_visual_containers = {**visual_containers, visual_name: new_container}
|
||||
new_page_section = {**page_section, "visualContainers": new_visual_containers}
|
||||
new_sections = {**sections, page_name: new_page_section}
|
||||
new_exploration = {**exploration, "sections": new_sections}
|
||||
new_bm = {**bm, "explorationState": new_exploration}
|
||||
|
||||
_write_json(bm_file, new_bm)
|
||||
|
||||
return {
|
||||
"status": "updated",
|
||||
"bookmark": name,
|
||||
"page": page_name,
|
||||
"visual": visual_name,
|
||||
"hidden": hidden,
|
||||
}
|
||||
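A minimal sketch of the on-disk layout `bookmark_add()` writes -- a `bookmarks/bookmarks.json` index plus one `<name>.bookmark.json` file per bookmark. The `add_bookmark` helper below is an illustrative simplification, not the shipped function (it omits the `$schema` and `options` fields the real code writes):

```python
import json
import secrets
import tempfile
from pathlib import Path


def add_bookmark(definition: Path, display_name: str, target_page: str) -> str:
    """Simplified mirror of bookmark_add(): index entry + bookmark file."""
    bm_name = secrets.token_hex(10)  # 20-char hex id, PBIR convention
    bm_dir = definition / "bookmarks"
    bm_dir.mkdir(parents=True, exist_ok=True)

    # Append the new name to the bookmarks.json index (create it if missing).
    index_file = bm_dir / "bookmarks.json"
    index = json.loads(index_file.read_text()) if index_file.exists() else {"items": []}
    index["items"].append({"name": bm_name})
    index_file.write_text(json.dumps(index, indent=2) + "\n")

    # Write the per-bookmark file pointing at the target page.
    bookmark = {
        "displayName": display_name,
        "name": bm_name,
        "explorationState": {"version": "1.3", "activeSection": target_page},
    }
    (bm_dir / f"{bm_name}.bookmark.json").write_text(json.dumps(bookmark, indent=2) + "\n")
    return bm_name


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    name = add_bookmark(root, "Sales view", "page1")
    index = json.loads((root / "bookmarks" / "bookmarks.json").read_text())

print(len(name))  # 20
```

Both writes keep the index and the bookmark file in lockstep, which is why `bookmark_delete()` has to undo both.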
217  src/pbi_cli/core/bulk_backend.py  Normal file
@@ -0,0 +1,217 @@
"""Bulk visual operations for PBIR reports.
|
||||
|
||||
Orchestration layer over visual_backend.py -- applies filtering
|
||||
(by type, name pattern, position bounds) and fans out to the
|
||||
individual visual_* pure functions.
|
||||
|
||||
Every function follows the same signature contract as the rest of the
|
||||
report layer: takes a ``definition_path: Path`` and returns a plain dict.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import fnmatch
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from pbi_cli.core.visual_backend import (
|
||||
VISUAL_DATA_ROLES,
|
||||
_resolve_visual_type,
|
||||
visual_bind,
|
||||
visual_delete,
|
||||
visual_list,
|
||||
visual_update,
|
||||
)
|
||||
|
||||
|
||||
def visual_where(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
visual_type: str | None = None,
|
||||
name_pattern: str | None = None,
|
||||
x_min: float | None = None,
|
||||
x_max: float | None = None,
|
||||
y_min: float | None = None,
|
||||
y_max: float | None = None,
|
||||
) -> list[dict[str, Any]]:
|
||||
"""Filter visuals on a page by type and/or position bounds.
|
||||
|
||||
Returns the subset of ``visual_list()`` matching ALL provided criteria.
|
||||
All filter arguments are optional -- omitting all returns every visual.
|
||||
|
||||
Args:
|
||||
definition_path: Path to the ``definition/`` folder.
|
||||
page_name: Name of the page to search.
|
||||
visual_type: Resolved PBIR visualType or user alias (e.g. ``"bar"``).
|
||||
name_pattern: fnmatch pattern matched against visual names (e.g. ``"Chart_*"``).
|
||||
x_min: Minimum x position (inclusive).
|
||||
x_max: Maximum x position (inclusive).
|
||||
y_min: Minimum y position (inclusive).
|
||||
y_max: Maximum y position (inclusive).
|
||||
"""
|
||||
resolved_type: str | None = None
|
||||
if visual_type is not None:
|
||||
resolved_type = _resolve_visual_type(visual_type)
|
||||
|
||||
all_visuals = visual_list(definition_path, page_name)
|
||||
result: list[dict[str, Any]] = []
|
||||
|
||||
for v in all_visuals:
|
||||
if resolved_type is not None and v.get("visual_type") != resolved_type:
|
||||
continue
|
||||
if name_pattern is not None and not fnmatch.fnmatch(v.get("name", ""), name_pattern):
|
||||
continue
|
||||
x = v.get("x", 0.0)
|
||||
y = v.get("y", 0.0)
|
||||
if x_min is not None and x < x_min:
|
||||
continue
|
||||
if x_max is not None and x > x_max:
|
||||
continue
|
||||
if y_min is not None and y < y_min:
|
||||
continue
|
||||
if y_max is not None and y > y_max:
|
||||
continue
|
||||
result.append(v)
|
||||
|
||||
return result
|
||||
|
||||
|
||||
def visual_bulk_bind(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
visual_type: str,
|
||||
bindings: list[dict[str, str]],
|
||||
name_pattern: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Bind fields to every visual of a given type on a page.
|
||||
|
||||
Applies the same ``bindings`` list to each matching visual by calling
|
||||
``visual_bind()`` in sequence. Stops and raises on the first error.
|
||||
|
||||
Args:
|
||||
definition_path: Path to the ``definition/`` folder.
|
||||
page_name: Name of the page.
|
||||
visual_type: PBIR visualType or user alias -- required (unlike ``visual_where``).
|
||||
bindings: List of ``{"role": ..., "field": ...}`` dicts, same format as
|
||||
``visual_bind()``.
|
||||
name_pattern: Optional fnmatch filter on visual name.
|
||||
|
||||
Returns:
|
||||
``{"bound": N, "page": page_name, "type": resolved_type, "visuals": [names],
|
||||
"bindings": bindings}``
|
||||
"""
|
||||
matching = visual_where(
|
||||
definition_path,
|
||||
page_name,
|
||||
visual_type=visual_type,
|
||||
name_pattern=name_pattern,
|
||||
)
|
||||
bound_names: list[str] = []
|
||||
for v in matching:
|
||||
visual_bind(definition_path, page_name, v["name"], bindings)
|
||||
bound_names.append(v["name"])
|
||||
|
||||
resolved_type = _resolve_visual_type(visual_type)
|
||||
return {
|
||||
"bound": len(bound_names),
|
||||
"page": page_name,
|
||||
"type": resolved_type,
|
||||
"visuals": bound_names,
|
||||
"bindings": bindings,
|
||||
}
|
||||
|
||||
|
||||
def visual_bulk_update(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
where_type: str | None = None,
|
||||
where_name_pattern: str | None = None,
|
||||
set_hidden: bool | None = None,
|
||||
set_width: float | None = None,
|
||||
set_height: float | None = None,
|
||||
set_x: float | None = None,
|
||||
set_y: float | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Apply position/visibility updates to all visuals matching the filter.
|
||||
|
||||
Delegates to ``visual_update()`` for each match. At least one ``set_*``
|
||||
argument must be provided.
|
||||
|
||||
Returns:
|
||||
``{"updated": N, "page": page_name, "visuals": [names]}``
|
||||
"""
|
||||
if all(v is None for v in (set_hidden, set_width, set_height, set_x, set_y)):
|
||||
raise ValueError("At least one set_* argument must be provided to bulk-update")
|
||||
|
||||
matching = visual_where(
|
||||
definition_path,
|
||||
page_name,
|
||||
visual_type=where_type,
|
||||
name_pattern=where_name_pattern,
|
||||
)
|
||||
updated_names: list[str] = []
|
||||
for v in matching:
|
||||
visual_update(
|
||||
definition_path,
|
||||
page_name,
|
||||
v["name"],
|
||||
x=set_x,
|
||||
y=set_y,
|
||||
width=set_width,
|
||||
height=set_height,
|
||||
hidden=set_hidden,
|
||||
)
|
||||
updated_names.append(v["name"])
|
||||
|
||||
return {
|
||||
"updated": len(updated_names),
|
||||
"page": page_name,
|
||||
"visuals": updated_names,
|
||||
}
|
||||
|
||||
|
||||
def visual_bulk_delete(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
where_type: str | None = None,
|
||||
where_name_pattern: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Delete all visuals on a page matching the filter criteria.
|
||||
|
||||
Delegates to ``visual_delete()`` for each match.
|
||||
|
||||
Returns:
|
||||
``{"deleted": N, "page": page_name, "visuals": [names]}``
|
||||
"""
|
||||
if where_type is None and where_name_pattern is None:
|
||||
raise ValueError(
|
||||
"Provide at least --type or --name-pattern to prevent accidental bulk deletion"
|
||||
)
|
||||
|
||||
matching = visual_where(
|
||||
definition_path,
|
||||
page_name,
|
||||
visual_type=where_type,
|
||||
name_pattern=where_name_pattern,
|
||||
)
|
||||
deleted_names: list[str] = []
|
||||
for v in matching:
|
||||
visual_delete(definition_path, page_name, v["name"])
|
||||
deleted_names.append(v["name"])
|
||||
|
||||
return {
|
||||
"deleted": len(deleted_names),
|
||||
"page": page_name,
|
||||
"visuals": deleted_names,
|
||||
}
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _supported_roles_for_type(visual_type: str) -> list[str]:
|
||||
"""Return the data role names for a visual type (for help text generation)."""
|
||||
resolved = _resolve_visual_type(visual_type)
|
||||
return VISUAL_DATA_ROLES.get(resolved, [])
|
||||
|
|
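The matching rule in `visual_where()` -- a visual survives only if it passes every provided criterion -- can be sketched standalone. The sample visual dicts below are made up for illustration; in the real code they come from `visual_list()`:

```python
import fnmatch


def matches(v, visual_type=None, name_pattern=None, x_max=None):
    """Mirror of the visual_where() predicate (type, fnmatch pattern, x bound)."""
    if visual_type is not None and v.get("visual_type") != visual_type:
        return False
    if name_pattern is not None and not fnmatch.fnmatch(v.get("name", ""), name_pattern):
        return False
    if x_max is not None and v.get("x", 0.0) > x_max:
        return False
    return True


visuals = [
    {"name": "Chart_1", "visual_type": "barChart", "x": 0.0},
    {"name": "Chart_2", "visual_type": "barChart", "x": 640.0},  # past x_max
    {"name": "Card_1", "visual_type": "card", "x": 0.0},         # wrong type
]
hits = [v["name"] for v in visuals if matches(v, visual_type="barChart", x_max=600)]
print(hits)  # ['Chart_1']
```

Because the criteria are AND-ed, omitting every argument returns all visuals, which is why `visual_bulk_delete()` refuses to run without at least one filter.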
@@ -19,13 +19,21 @@ _PBI_CLI_CLAUDE_MD_SNIPPET = (
"When working with Power BI, DAX, semantic models, or data modeling,\n"
|
||||
"invoke the relevant pbi-cli skill before responding:\n"
|
||||
"\n"
|
||||
"**Semantic Model (requires `pbi connect`):**\n"
|
||||
"- **power-bi-dax** -- DAX queries, measures, calculations\n"
|
||||
"- **power-bi-modeling** -- tables, columns, measures, relationships\n"
|
||||
"- **power-bi-diagnostics** -- troubleshooting, tracing, setup\n"
|
||||
"- **power-bi-deployment** -- TMDL export/import, transactions\n"
|
||||
"- **power-bi-deployment** -- TMDL export/import, transactions, diff\n"
|
||||
"- **power-bi-docs** -- model documentation, data dictionary\n"
|
||||
"- **power-bi-partitions** -- partitions, M expressions, data sources\n"
|
||||
"- **power-bi-security** -- RLS roles, perspectives, access control\n"
|
||||
"- **power-bi-diagnostics** -- troubleshooting, tracing, setup\n"
|
||||
"\n"
|
||||
"**Report Layer (no connection needed):**\n"
|
||||
"- **power-bi-report** -- scaffold, validate, preview PBIR reports\n"
|
||||
"- **power-bi-visuals** -- add, bind, update, bulk-manage visuals\n"
|
||||
"- **power-bi-pages** -- pages, bookmarks, visibility, drillthrough\n"
|
||||
"- **power-bi-themes** -- themes, conditional formatting, styling\n"
|
||||
"- **power-bi-filters** -- page and visual filters (TopN, date, categorical)\n"
|
||||
"\n"
|
||||
"Critical: Multi-line DAX (VAR/RETURN) cannot be passed via `-e`.\n"
|
||||
"Use `--file` or stdin piping instead. See power-bi-dax skill.\n"
|
||||
|
|
|
|||
|
|
@@ -40,3 +40,27 @@ class TomError(PbiCliError):
        self.operation = operation
        self.detail = detail
        super().__init__(f"{operation}: {detail}")


class VisualTypeError(PbiCliError):
    """Raised when a visual type is not recognised."""

    def __init__(self, visual_type: str) -> None:
        self.visual_type = visual_type
        super().__init__(
            f"Unknown visual type '{visual_type}'. "
            "Run 'pbi visual types' to see supported types."
        )


class ReportNotFoundError(PbiCliError):
    """Raised when no PBIR report definition folder can be found."""

    def __init__(
        self,
        message: str = (
            "No PBIR report found. Run this command inside a .pbip project "
            "or pass --path to the .Report folder."
        ),
    ) -> None:
        super().__init__(message)
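Both new classes subclass `PbiCliError`, so a single except clause at the CLI boundary catches every domain error uniformly. A self-contained sketch (re-declaring a stand-in base class here rather than importing the package):

```python
class PbiCliError(Exception):
    """Stand-in for pbi_cli.core.errors.PbiCliError."""


class VisualTypeError(PbiCliError):
    """Raised when a visual type is not recognised."""

    def __init__(self, visual_type: str) -> None:
        self.visual_type = visual_type
        super().__init__(
            f"Unknown visual type '{visual_type}'. "
            "Run 'pbi visual types' to see supported types."
        )


try:
    raise VisualTypeError("pie3d")
except PbiCliError as exc:  # base-class handler catches the subclass
    msg = str(exc)

print(msg)
```

Carrying the offending value on the exception (`exc.visual_type`) lets callers build richer output than the message string alone.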
514  src/pbi_cli/core/filter_backend.py  Normal file
@@ -0,0 +1,514 @@
"""Pure-function backend for PBIR filter operations.
|
||||
|
||||
Every function takes a ``Path`` to the definition folder and returns a plain
|
||||
Python dict suitable for ``format_result()``.
|
||||
|
||||
Filters are stored in ``filterConfig.filters[]`` inside either:
|
||||
- ``pages/<page_name>/page.json`` for page-level filters
|
||||
- ``pages/<page_name>/visuals/<visual_name>/visual.json`` for visual-level filters
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import secrets
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from pbi_cli.core.errors import PbiCliError
|
||||
from pbi_cli.core.pbir_path import get_page_dir, get_visual_dir
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# JSON helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _read_json(path: Path) -> dict[str, Any]:
|
||||
"""Read and parse a JSON file."""
|
||||
return json.loads(path.read_text(encoding="utf-8"))
|
||||
|
||||
|
||||
def _write_json(path: Path, data: dict[str, Any]) -> None:
|
||||
"""Write JSON with consistent formatting."""
|
||||
path.write_text(
|
||||
json.dumps(data, indent=2, ensure_ascii=False) + "\n",
|
||||
encoding="utf-8",
|
||||
)
|
||||
|
||||
|
||||
def _generate_name() -> str:
|
||||
"""Generate a 20-character hex identifier matching PBIR convention."""
|
||||
return secrets.token_hex(10)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Path resolution helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _resolve_target_path(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
visual_name: str | None,
|
||||
) -> Path:
|
||||
"""Return the JSON file path for the target (page or visual)."""
|
||||
if visual_name is None:
|
||||
return get_page_dir(definition_path, page_name) / "page.json"
|
||||
return get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
|
||||
|
||||
|
||||
def _get_filters(data: dict[str, Any]) -> list[dict[str, Any]]:
|
||||
"""Extract the filters list from a page or visual JSON dict."""
|
||||
filter_config = data.get("filterConfig")
|
||||
if not isinstance(filter_config, dict):
|
||||
return []
|
||||
filters = filter_config.get("filters")
|
||||
if not isinstance(filters, list):
|
||||
return []
|
||||
return filters
|
||||
|
||||
|
||||
def _set_filters(data: dict[str, Any], filters: list[dict[str, Any]]) -> dict[str, Any]:
|
||||
"""Return a new dict with filterConfig.filters replaced (immutable update)."""
|
||||
filter_config = dict(data.get("filterConfig") or {})
|
||||
filter_config["filters"] = filters
|
||||
return {**data, "filterConfig": filter_config}
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Public API
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def filter_list(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
visual_name: str | None = None,
|
||||
) -> list[dict[str, Any]]:
|
||||
"""List filters on a page or specific visual.
|
||||
|
||||
If visual_name is None, returns page-level filters from page.json.
|
||||
If visual_name is given, returns visual-level filters from visual.json.
|
||||
Returns the raw filter dicts from filterConfig.filters[].
|
||||
"""
|
||||
target = _resolve_target_path(definition_path, page_name, visual_name)
|
||||
if not target.exists():
|
||||
raise PbiCliError(f"File not found: {target}")
|
||||
data = _read_json(target)
|
||||
return _get_filters(data)
|
||||
|
||||
|
||||
def _to_pbi_literal(value: str) -> str:
|
||||
"""Convert a CLI string value to a Power BI literal.
|
||||
|
||||
Power BI uses typed literals: strings are single-quoted (``'text'``),
|
||||
integers use an ``L`` suffix (``123L``), and doubles use ``D`` (``1.5D``).
|
||||
"""
|
||||
# Try integer first (e.g. "2014" -> "2014L")
|
||||
try:
|
||||
int(value)
|
||||
return f"{value}L"
|
||||
except ValueError:
|
||||
pass
|
||||
# Try float (e.g. "3.14" -> "3.14D")
|
||||
try:
|
||||
float(value)
|
||||
return f"{value}D"
|
||||
except ValueError:
|
||||
pass
|
||||
# Fall back to string literal
|
||||
return f"'{value}'"
|
||||
|
||||
|
||||
def filter_add_categorical(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
table: str,
|
||||
column: str,
|
||||
values: list[str],
|
||||
visual_name: str | None = None,
|
||||
name: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Add a categorical filter to a page or visual.
|
||||
|
||||
Builds the full filterConfig entry from table/column/values.
|
||||
The source alias is always the first letter of the table name (lowercase).
|
||||
Returns a status dict with name, type, and scope.
|
||||
"""
|
||||
target = _resolve_target_path(definition_path, page_name, visual_name)
|
||||
if not target.exists():
|
||||
raise PbiCliError(f"File not found: {target}")
|
||||
|
||||
filter_name = name if name is not None else _generate_name()
|
||||
alias = table[0].lower()
|
||||
scope = "visual" if visual_name is not None else "page"
|
||||
|
||||
where_values: list[list[dict[str, Any]]] = [
|
||||
[{"Literal": {"Value": _to_pbi_literal(v)}}] for v in values
|
||||
]
|
||||
|
||||
entry: dict[str, Any] = {
|
||||
"name": filter_name,
|
||||
"field": {
|
||||
"Column": {
|
||||
"Expression": {"SourceRef": {"Entity": table}},
|
||||
"Property": column,
|
||||
}
|
||||
},
|
||||
"type": "Categorical",
|
||||
"filter": {
|
||||
"Version": 2,
|
||||
"From": [{"Name": alias, "Entity": table, "Type": 0}],
|
||||
"Where": [
|
||||
{
|
||||
"Condition": {
|
||||
"In": {
|
||||
"Expressions": [
|
||||
{
|
||||
"Column": {
|
||||
"Expression": {"SourceRef": {"Source": alias}},
|
||||
"Property": column,
|
||||
}
|
||||
}
|
||||
],
|
||||
"Values": where_values,
|
||||
}
|
||||
}
|
||||
}
|
||||
],
|
||||
},
|
||||
}
|
||||
|
||||
if scope == "page":
|
||||
entry["howCreated"] = "User"
|
||||
|
||||
data = _read_json(target)
|
||||
filters = list(_get_filters(data))
|
||||
filters.append(entry)
|
||||
updated = _set_filters(data, filters)
|
||||
_write_json(target, updated)
|
||||
|
||||
return {"status": "added", "name": filter_name, "type": "Categorical", "scope": scope}
|
||||
|
||||
|
||||
def filter_remove(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
filter_name: str,
|
||||
visual_name: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Remove a filter by name from a page or visual.
|
||||
|
||||
Raises PbiCliError if filter_name is not found.
|
||||
Returns a status dict with the removed filter name.
|
||||
"""
|
||||
target = _resolve_target_path(definition_path, page_name, visual_name)
|
||||
if not target.exists():
|
||||
raise PbiCliError(f"File not found: {target}")
|
||||
|
||||
data = _read_json(target)
|
||||
filters = _get_filters(data)
|
||||
remaining = [f for f in filters if f.get("name") != filter_name]
|
||||
|
||||
if len(remaining) == len(filters):
|
||||
raise PbiCliError(
|
||||
f"Filter '{filter_name}' not found on "
|
||||
f"{'visual ' + visual_name if visual_name else 'page'} '{page_name}'."
|
||||
)
|
||||
|
||||
updated = _set_filters(data, remaining)
|
||||
_write_json(target, updated)
|
||||
return {"status": "removed", "name": filter_name}
|
||||
|
||||
|
||||
def filter_add_topn(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
table: str,
|
||||
column: str,
|
||||
n: int,
|
||||
order_by_table: str,
|
||||
order_by_column: str,
|
||||
direction: str = "Top",
|
||||
visual_name: str | None = None,
|
||||
name: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Add a TopN filter to a page or visual.
|
||||
|
||||
*direction* is ``"Top"`` (highest N by *order_by_column*) or
|
||||
``"Bottom"`` (lowest N). Direction maps to Power BI query Direction
|
||||
values: Top = 2 (Descending), Bottom = 1 (Ascending).
|
||||
|
||||
Returns a status dict with name, type, scope, n, and direction.
|
||||
"""
|
||||
direction_upper = direction.strip().capitalize()
|
||||
if direction_upper not in ("Top", "Bottom"):
|
||||
raise PbiCliError(f"direction must be 'Top' or 'Bottom', got '{direction}'.")
|
||||
|
||||
pbi_direction = 2 if direction_upper == "Top" else 1
|
||||
|
||||
target = _resolve_target_path(definition_path, page_name, visual_name)
|
||||
if not target.exists():
|
||||
raise PbiCliError(f"File not found: {target}")
|
||||
|
||||
filter_name = name if name is not None else _generate_name()
|
||||
cat_alias = table[0].lower()
|
||||
ord_alias = order_by_table[0].lower()
|
||||
# Avoid alias collision when both tables start with the same letter
|
||||
if ord_alias == cat_alias and order_by_table != table:
|
||||
ord_alias = ord_alias + "2"
|
||||
scope = "visual" if visual_name is not None else "page"
|
||||
|
||||
# Inner subquery From: include both tables when they differ
|
||||
inner_from: list[dict[str, Any]] = [
|
||||
{"Name": cat_alias, "Entity": table, "Type": 0},
|
||||
]
|
||||
if order_by_table != table:
|
||||
inner_from.append({"Name": ord_alias, "Entity": order_by_table, "Type": 0})
|
||||
|
||||
entry: dict[str, Any] = {
|
||||
"name": filter_name,
|
||||
"field": {
|
||||
"Column": {
|
||||
"Expression": {"SourceRef": {"Entity": table}},
|
||||
"Property": column,
|
||||
}
|
||||
},
|
||||
"type": "TopN",
|
||||
"filter": {
|
||||
"Version": 2,
|
||||
"From": [
|
||||
{
|
||||
"Name": "subquery",
|
||||
"Expression": {
|
||||
"Subquery": {
|
||||
"Query": {
|
||||
"Version": 2,
|
||||
"From": inner_from,
|
||||
"Select": [
|
||||
{
|
||||
"Column": {
|
||||
"Expression": {
|
||||
"SourceRef": {"Source": cat_alias}
|
||||
},
|
||||
"Property": column,
|
||||
},
|
||||
"Name": "field",
|
||||
}
|
||||
],
|
||||
"OrderBy": [
|
||||
{
|
||||
"Direction": pbi_direction,
|
||||
"Expression": {
|
||||
"Aggregation": {
|
||||
"Expression": {
|
||||
"Column": {
|
||||
"Expression": {
|
||||
"SourceRef": {
|
||||
"Source": ord_alias
|
||||
if order_by_table != table
|
||||
else cat_alias
|
||||
}
|
||||
},
|
||||
"Property": order_by_column,
|
||||
}
|
||||
},
|
||||
"Function": 0,
|
||||
}
|
||||
},
|
||||
}
|
||||
],
|
||||
"Top": n,
|
||||
}
|
||||
}
|
||||
},
|
||||
"Type": 2,
|
||||
},
|
||||
{"Name": cat_alias, "Entity": table, "Type": 0},
|
||||
],
|
||||
"Where": [
|
||||
{
|
||||
"Condition": {
|
||||
"In": {
|
||||
"Expressions": [
|
||||
{
|
||||
"Column": {
|
||||
"Expression": {
|
||||
"SourceRef": {"Source": cat_alias}
|
||||
},
|
||||
"Property": column,
|
||||
}
|
||||
}
|
||||
],
|
||||
"Table": {"SourceRef": {"Source": "subquery"}},
|
||||
}
|
||||
}
|
||||
}
|
||||
],
|
||||
},
|
||||
}
|
||||
|
||||
if scope == "page":
|
||||
entry["howCreated"] = "User"
|
||||
|
||||
data = _read_json(target)
|
||||
filters = list(_get_filters(data))
|
||||
filters.append(entry)
|
||||
updated = _set_filters(data, filters)
|
||||
_write_json(target, updated)
|
||||
|
||||
return {
|
||||
"status": "added",
|
||||
"name": filter_name,
|
||||
"type": "TopN",
|
||||
"scope": scope,
|
||||
"n": n,
|
||||
"direction": direction_upper,
|
||||
}
|
||||
|
||||
|
||||
# TimeUnit integer codes used by Power BI for RelativeDate filters.
|
||||
_RELATIVE_DATE_TIME_UNITS: dict[str, int] = {
|
||||
"days": 0,
|
||||
"weeks": 1,
|
||||
"months": 2,
|
||||
"years": 3,
|
||||
}
|
||||
|
||||
|
||||
def filter_add_relative_date(
|
||||
definition_path: Path,
|
||||
page_name: str,
|
||||
table: str,
|
||||
column: str,
|
||||
amount: int,
|
||||
time_unit: str,
|
||||
visual_name: str | None = None,
|
||||
name: str | None = None,
|
||||
) -> dict[str, Any]:
|
||||
"""Add a RelativeDate filter (e.g. "last 3 months") to a page or visual.
|
||||
|
||||
*amount* is a positive integer representing the period count.
|
||||
*time_unit* is one of ``"days"``, ``"weeks"``, ``"months"``, ``"years"``.
|
||||
|
||||
The filter matches rows where *column* falls in the last *amount* *time_unit*
|
||||
relative to today (inclusive of the current period boundary).
|
||||
|
||||
Returns a status dict with name, type, scope, amount, and time_unit.
|
||||
"""
|
||||
time_unit_lower = time_unit.strip().lower()
|
||||
if time_unit_lower not in _RELATIVE_DATE_TIME_UNITS:
|
||||
valid = ", ".join(_RELATIVE_DATE_TIME_UNITS)
|
||||
raise PbiCliError(
|
||||
f"time_unit must be one of {valid}, got '{time_unit}'."
|
||||
)
|
||||
time_unit_code = _RELATIVE_DATE_TIME_UNITS[time_unit_lower]
|
||||
days_code = _RELATIVE_DATE_TIME_UNITS["days"]
|
||||
|
||||
target = _resolve_target_path(definition_path, page_name, visual_name)
|
||||
if not target.exists():
|
||||
        raise PbiCliError(f"File not found: {target}")

    filter_name = name if name is not None else _generate_name()
    alias = table[0].lower()
    scope = "visual" if visual_name is not None else "page"

    # LowerBound: DateSpan(DateAdd(DateAdd(Now(), +1, days), -amount, time_unit), days)
    lower_bound: dict[str, Any] = {
        "DateSpan": {
            "Expression": {
                "DateAdd": {
                    "Expression": {
                        "DateAdd": {
                            "Expression": {"Now": {}},
                            "Amount": 1,
                            "TimeUnit": days_code,
                        }
                    },
                    "Amount": -amount,
                    "TimeUnit": time_unit_code,
                }
            },
            "TimeUnit": days_code,
        }
    }

    # UpperBound: DateSpan(Now(), days)
    upper_bound: dict[str, Any] = {
        "DateSpan": {
            "Expression": {"Now": {}},
            "TimeUnit": days_code,
        }
    }

    entry: dict[str, Any] = {
        "name": filter_name,
        "field": {
            "Column": {
                "Expression": {"SourceRef": {"Entity": table}},
                "Property": column,
            }
        },
        "type": "RelativeDate",
        "filter": {
            "Version": 2,
            "From": [{"Name": alias, "Entity": table, "Type": 0}],
            "Where": [
                {
                    "Condition": {
                        "Between": {
                            "Expression": {
                                "Column": {
                                    "Expression": {"SourceRef": {"Source": alias}},
                                    "Property": column,
                                }
                            },
                            "LowerBound": lower_bound,
                            "UpperBound": upper_bound,
                        }
                    }
                }
            ],
        },
    }

    if scope == "page":
        entry["howCreated"] = "User"

    data = _read_json(target)
    filters = list(_get_filters(data))
    filters.append(entry)
    updated = _set_filters(data, filters)
    _write_json(target, updated)

    return {
        "status": "added",
        "name": filter_name,
        "type": "RelativeDate",
        "scope": scope,
        "amount": amount,
        "time_unit": time_unit_lower,
    }


def filter_clear(
    definition_path: Path,
    page_name: str,
    visual_name: str | None = None,
) -> dict[str, Any]:
    """Remove all filters from a page or visual.

    Returns a status dict with the count of removed filters and scope.
    """
    target = _resolve_target_path(definition_path, page_name, visual_name)
    if not target.exists():
        raise PbiCliError(f"File not found: {target}")

    scope = "visual" if visual_name is not None else "page"
    data = _read_json(target)
    filters = _get_filters(data)
    removed = len(filters)

    updated = _set_filters(data, [])
    _write_json(target, updated)
    return {"status": "cleared", "removed": removed, "scope": scope}
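The nested `DateSpan`/`DateAdd` structure built above is easier to verify in isolation. Below is a minimal sketch of the bound construction (a hypothetical `make_relative_date_bounds` helper, not part of the module) showing the shape the relative-date filter expects:

```python
from typing import Any


def make_relative_date_bounds(
    amount: int, time_unit_code: int, days_code: int = 0
) -> tuple[dict[str, Any], dict[str, Any]]:
    """Build the LowerBound/UpperBound pair used by the relative-date filter.

    LowerBound: DateSpan(DateAdd(DateAdd(Now(), +1, days), -amount, unit), days)
    UpperBound: DateSpan(Now(), days)
    """
    lower: dict[str, Any] = {
        "DateSpan": {
            "Expression": {
                "DateAdd": {
                    "Expression": {
                        "DateAdd": {
                            "Expression": {"Now": {}},
                            "Amount": 1,
                            "TimeUnit": days_code,
                        }
                    },
                    "Amount": -amount,
                    "TimeUnit": time_unit_code,
                }
            },
            "TimeUnit": days_code,
        }
    }
    upper: dict[str, Any] = {
        "DateSpan": {"Expression": {"Now": {}}, "TimeUnit": days_code}
    }
    return lower, upper


# "last 30 days" -- the unit code 0 for days is an assumption of this sketch
lo, hi = make_relative_date_bounds(30, 0)
```

The inner `DateAdd(Now(), +1, days)` nudges the window forward one day so that "today" is included before subtracting the span.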
403  src/pbi_cli/core/format_backend.py  Normal file
@@ -0,0 +1,403 @@
"""Pure-function backend for PBIR conditional formatting operations.

Mirrors ``report_backend.py`` but focuses on visual conditional formatting.
Every function takes a ``Path`` to the definition folder and returns a plain
Python dict suitable for ``format_result()``.
"""

from __future__ import annotations

import json
from pathlib import Path
from typing import Any

from pbi_cli.core.errors import PbiCliError
from pbi_cli.core.pbir_path import get_visual_dir

# ---------------------------------------------------------------------------
# JSON helpers (same as report_backend / visual_backend)
# ---------------------------------------------------------------------------


def _read_json(path: Path) -> dict[str, Any]:
    """Read and parse a JSON file."""
    return json.loads(path.read_text(encoding="utf-8"))


def _write_json(path: Path, data: dict[str, Any]) -> None:
    """Write JSON with consistent formatting."""
    path.write_text(
        json.dumps(data, indent=2, ensure_ascii=False) + "\n",
        encoding="utf-8",
    )


# ---------------------------------------------------------------------------
# Internal helpers
# ---------------------------------------------------------------------------


def _load_visual(definition_path: Path, page_name: str, visual_name: str) -> dict[str, Any]:
    """Load and return visual JSON data, raising PbiCliError if missing."""
    visual_path = get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
    if not visual_path.exists():
        raise PbiCliError(
            f"Visual '{visual_name}' not found on page '{page_name}'. "
            f"Expected: {visual_path}"
        )
    return _read_json(visual_path)


def _save_visual(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    data: dict[str, Any],
) -> None:
    """Write visual JSON data back to disk."""
    visual_path = get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
    _write_json(visual_path, data)


def _get_values_list(objects: dict[str, Any]) -> list[dict[str, Any]]:
    """Return the objects.values list, defaulting to empty."""
    return list(objects.get("values", []))


def _replace_or_append(
    values: list[dict[str, Any]],
    new_entry: dict[str, Any],
    field_query_ref: str,
) -> list[dict[str, Any]]:
    """Return a new list with *new_entry* replacing any existing entry
    whose ``selector.metadata`` matches *field_query_ref*, or appended
    if no match exists. Immutable -- does not modify the input list.
    """
    replaced = False
    result: list[dict[str, Any]] = []
    for entry in values:
        meta = entry.get("selector", {}).get("metadata", "")
        if meta == field_query_ref:
            result.append(new_entry)
            replaced = True
        else:
            result.append(entry)
    if not replaced:
        result.append(new_entry)
    return result


# ---------------------------------------------------------------------------
# Public API
# ---------------------------------------------------------------------------


def format_get(
    definition_path: Path,
    page_name: str,
    visual_name: str,
) -> dict[str, Any]:
    """Return current formatting objects for a visual.

    Returns ``{"visual": visual_name, "objects": {...}}`` where *objects*
    is the content of ``visual.objects`` (empty dict if absent).
    """
    data = _load_visual(definition_path, page_name, visual_name)
    objects = data.get("visual", {}).get("objects", {})
    return {"visual": visual_name, "objects": objects}


def format_clear(
    definition_path: Path,
    page_name: str,
    visual_name: str,
) -> dict[str, Any]:
    """Clear all formatting objects from a visual.

    Sets ``visual.objects`` to ``{}`` and persists the change.
    Returns ``{"status": "cleared", "visual": visual_name}``.
    """
    data = _load_visual(definition_path, page_name, visual_name)
    visual_section = dict(data.get("visual", {}))
    visual_section["objects"] = {}
    new_data = {**data, "visual": visual_section}
    _save_visual(definition_path, page_name, visual_name, new_data)
    return {"status": "cleared", "visual": visual_name}


def format_background_gradient(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    input_table: str,
    input_column: str,
    field_query_ref: str,
    min_color: str = "minColor",
    max_color: str = "maxColor",
) -> dict[str, Any]:
    """Add a linear gradient background color rule to a visual column.

    *input_table* / *input_column*: the measure/column driving the gradient
    (used for the FillRule.Input Aggregation).

    *field_query_ref*: the queryRef of the target field (e.g.
    ``"Sum(financials.Profit)"``). Used as ``selector.metadata``.

    Adds/replaces the entry in ``visual.objects.values[]`` whose
    ``selector.metadata`` matches *field_query_ref*.

    Returns ``{"status": "applied", "visual": visual_name,
    "rule": "gradient", "field": field_query_ref}``.
    """
    data = _load_visual(definition_path, page_name, visual_name)
    visual_section = dict(data.get("visual", {}))
    objects = dict(visual_section.get("objects", {}))
    values = _get_values_list(objects)

    new_entry: dict[str, Any] = {
        "properties": {
            "backColor": {
                "solid": {
                    "color": {
                        "expr": {
                            "FillRule": {
                                "Input": {
                                    "Aggregation": {
                                        "Expression": {
                                            "Column": {
                                                "Expression": {
                                                    "SourceRef": {"Entity": input_table}
                                                },
                                                "Property": input_column,
                                            }
                                        },
                                        "Function": 0,
                                    }
                                },
                                "FillRule": {
                                    "linearGradient2": {
                                        "min": {
                                            "color": {
                                                "Literal": {"Value": f"'{min_color}'"}
                                            }
                                        },
                                        "max": {
                                            "color": {
                                                "Literal": {"Value": f"'{max_color}'"}
                                            }
                                        },
                                        "nullColoringStrategy": {
                                            "strategy": {
                                                "Literal": {"Value": "'asZero'"}
                                            }
                                        },
                                    }
                                },
                            }
                        }
                    }
                }
            }
        },
        "selector": {
            "data": [{"dataViewWildcard": {"matchingOption": 1}}],
            "metadata": field_query_ref,
        },
    }

    new_values = _replace_or_append(values, new_entry, field_query_ref)
    new_objects = {**objects, "values": new_values}
    new_visual = {**visual_section, "objects": new_objects}
    new_data = {**data, "visual": new_visual}
    _save_visual(definition_path, page_name, visual_name, new_data)

    return {
        "status": "applied",
        "visual": visual_name,
        "rule": "gradient",
        "field": field_query_ref,
    }


# ComparisonKind integer codes (Power BI query expression).
_COMPARISON_KINDS: dict[str, int] = {
    "eq": 0,
    "neq": 1,
    "gt": 2,
    "gte": 3,
    "lt": 4,
    "lte": 5,
}


def format_background_conditional(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    input_table: str,
    input_column: str,
    threshold: float | int,
    color_hex: str,
    comparison: str = "gt",
    field_query_ref: str | None = None,
) -> dict[str, Any]:
    """Add a rule-based conditional background color to a visual column.

    When the aggregated value of *input_column* satisfies the comparison
    against *threshold*, the cell background is set to *color_hex*.

    *comparison* is one of ``"eq"``, ``"neq"``, ``"gt"``, ``"gte"``,
    ``"lt"``, ``"lte"`` (default ``"gt"``).

    *field_query_ref* is the ``selector.metadata`` queryRef of the target
    field (e.g. ``"Sum(financials.Units Sold)"``). Defaults to
    ``"Sum({table}.{column})"`` if omitted.

    Returns ``{"status": "applied", "visual": visual_name,
    "rule": "conditional", "field": field_query_ref}``.
    """
    comparison_lower = comparison.strip().lower()
    if comparison_lower not in _COMPARISON_KINDS:
        valid = ", ".join(_COMPARISON_KINDS)
        raise PbiCliError(
            f"comparison must be one of {valid}, got '{comparison}'."
        )
    comparison_kind = _COMPARISON_KINDS[comparison_lower]

    if field_query_ref is None:
        field_query_ref = f"Sum({input_table}.{input_column})"

    # Format threshold as a Power BI decimal literal (D suffix).
    threshold_literal = f"{threshold}D"

    data = _load_visual(definition_path, page_name, visual_name)
    visual_section = dict(data.get("visual", {}))
    objects = dict(visual_section.get("objects", {}))
    values = _get_values_list(objects)

    new_entry: dict[str, Any] = {
        "properties": {
            "backColor": {
                "solid": {
                    "color": {
                        "expr": {
                            "Conditional": {
                                "Cases": [
                                    {
                                        "Condition": {
                                            "Comparison": {
                                                "ComparisonKind": comparison_kind,
                                                "Left": {
                                                    "Aggregation": {
                                                        "Expression": {
                                                            "Column": {
                                                                "Expression": {
                                                                    "SourceRef": {
                                                                        "Entity": input_table
                                                                    }
                                                                },
                                                                "Property": input_column,
                                                            }
                                                        },
                                                        "Function": 0,
                                                    }
                                                },
                                                "Right": {
                                                    "Literal": {
                                                        "Value": threshold_literal
                                                    }
                                                },
                                            }
                                        },
                                        "Value": {
                                            "Literal": {"Value": f"'{color_hex}'"}
                                        },
                                    }
                                ]
                            }
                        }
                    }
                }
            }
        },
        "selector": {
            "data": [{"dataViewWildcard": {"matchingOption": 1}}],
            "metadata": field_query_ref,
        },
    }

    new_values = _replace_or_append(values, new_entry, field_query_ref)
    new_objects = {**objects, "values": new_values}
    new_visual = {**visual_section, "objects": new_objects}
    new_data = {**data, "visual": new_visual}
    _save_visual(definition_path, page_name, visual_name, new_data)

    return {
        "status": "applied",
        "visual": visual_name,
        "rule": "conditional",
        "field": field_query_ref,
    }


def format_background_measure(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    measure_table: str,
    measure_property: str,
    field_query_ref: str,
) -> dict[str, Any]:
    """Add a measure-driven background color rule to a visual column.

    *measure_table* / *measure_property*: the DAX measure that returns a
    hex color string.

    *field_query_ref*: the queryRef of the target field.

    Adds/replaces the entry in ``visual.objects.values[]`` whose
    ``selector.metadata`` matches *field_query_ref*.

    Returns ``{"status": "applied", "visual": visual_name,
    "rule": "measure", "field": field_query_ref}``.
    """
    data = _load_visual(definition_path, page_name, visual_name)
    visual_section = dict(data.get("visual", {}))
    objects = dict(visual_section.get("objects", {}))
    values = _get_values_list(objects)

    new_entry: dict[str, Any] = {
        "properties": {
            "backColor": {
                "solid": {
                    "color": {
                        "expr": {
                            "Measure": {
                                "Expression": {
                                    "SourceRef": {"Entity": measure_table}
                                },
                                "Property": measure_property,
                            }
                        }
                    }
                }
            }
        },
        "selector": {
            "data": [{"dataViewWildcard": {"matchingOption": 1}}],
            "metadata": field_query_ref,
        },
    }

    new_values = _replace_or_append(values, new_entry, field_query_ref)
    new_objects = {**objects, "values": new_values}
    new_visual = {**visual_section, "objects": new_objects}
    new_data = {**data, "visual": new_visual}
    _save_visual(definition_path, page_name, visual_name, new_data)

    return {
        "status": "applied",
        "visual": visual_name,
        "rule": "measure",
        "field": field_query_ref,
    }
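`_replace_or_append` is what makes every `format_background_*` command idempotent: re-running a command replaces the matching entry rather than stacking duplicates. A standalone sketch with illustrative (not real PBIR) entries:

```python
from typing import Any


def replace_or_append(
    values: list[dict[str, Any]],
    new_entry: dict[str, Any],
    field_query_ref: str,
) -> list[dict[str, Any]]:
    # Replace the entry whose selector.metadata matches, else append;
    # the input list is never mutated.
    result: list[dict[str, Any]] = []
    replaced = False
    for entry in values:
        if entry.get("selector", {}).get("metadata", "") == field_query_ref:
            result.append(new_entry)
            replaced = True
        else:
            result.append(entry)
    if not replaced:
        result.append(new_entry)
    return result


existing = [{"selector": {"metadata": "Sum(f.Profit)"}, "properties": {}}]
updated = replace_or_append(
    existing,
    {"selector": {"metadata": "Sum(f.Profit)"}, "properties": {"x": 1}},
    "Sum(f.Profit)",
)
appended = replace_or_append(
    existing, {"selector": {"metadata": "Sum(f.Sales)"}}, "Sum(f.Sales)"
)
```

Matching on the full `selector.metadata` queryRef means a gradient rule and a conditional rule on the same field overwrite each other, which mirrors how Desktop keeps one background rule per field.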
220  src/pbi_cli/core/pbir_models.py  Normal file
@@ -0,0 +1,220 @@
"""Frozen dataclasses for PBIR (Power BI Enhanced Report Format) structures."""

from __future__ import annotations

from dataclasses import dataclass, field
from pathlib import Path

# -- PBIR Schema URLs -------------------------------------------------------

SCHEMA_REPORT = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/report/1.2.0/schema.json"
)
SCHEMA_PAGE = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/page/2.1.0/schema.json"
)
SCHEMA_PAGES_METADATA = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/pagesMetadata/1.0.0/schema.json"
)
SCHEMA_VERSION = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/versionMetadata/1.0.0/schema.json"
)
SCHEMA_VISUAL_CONTAINER = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/visualContainer/2.7.0/schema.json"
)
SCHEMA_BOOKMARKS_METADATA = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/bookmarksMetadata/1.0.0/schema.json"
)
SCHEMA_BOOKMARK = (
    "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/bookmark/2.1.0/schema.json"
)


# -- Visual type identifiers ------------------------------------------------

SUPPORTED_VISUAL_TYPES: frozenset[str] = frozenset({
    # Original 9
    "barChart",
    "lineChart",
    "card",
    "pivotTable",
    "tableEx",
    "slicer",
    "kpi",
    "gauge",
    "donutChart",
    # v3.1.0 additions
    "columnChart",
    "areaChart",
    "ribbonChart",
    "waterfallChart",
    "scatterChart",
    "funnelChart",
    "multiRowCard",
    "treemap",
    "cardNew",
    "stackedBarChart",
    "lineStackedColumnComboChart",
    # v3.4.0 additions
    "cardVisual",
    "actionButton",
    # v3.5.0 additions (confirmed from HR Analysis Desktop export)
    "clusteredColumnChart",
    "clusteredBarChart",
    "textSlicer",
    "listSlicer",
    # v3.6.0 additions (confirmed from HR Analysis Desktop export)
    "image",
    "shape",
    "textbox",
    "pageNavigator",
    "advancedSlicerVisual",
    # v3.8.0 additions
    "azureMap",
})

# Mapping from user-friendly names to PBIR visualType identifiers
VISUAL_TYPE_ALIASES: dict[str, str] = {
    # Original 9
    "bar_chart": "barChart",
    "bar": "barChart",
    "line_chart": "lineChart",
    "line": "lineChart",
    "card": "card",
    "table": "tableEx",
    "matrix": "pivotTable",
    "slicer": "slicer",
    "kpi": "kpi",
    "gauge": "gauge",
    "donut": "donutChart",
    "donut_chart": "donutChart",
    "pie": "donutChart",
    # v3.1.0 additions
    "column": "columnChart",
    "column_chart": "columnChart",
    "area": "areaChart",
    "area_chart": "areaChart",
    "ribbon": "ribbonChart",
    "ribbon_chart": "ribbonChart",
    "waterfall": "waterfallChart",
    "waterfall_chart": "waterfallChart",
    "scatter": "scatterChart",
    "scatter_chart": "scatterChart",
    "funnel": "funnelChart",
    "funnel_chart": "funnelChart",
    "multi_row_card": "multiRowCard",
    "treemap": "treemap",
    "card_new": "cardNew",
    "new_card": "cardNew",
    "stacked_bar": "stackedBarChart",
    "stacked_bar_chart": "stackedBarChart",
    "combo": "lineStackedColumnComboChart",
    "combo_chart": "lineStackedColumnComboChart",
    # v3.4.0 additions
    "card_visual": "cardVisual",
    "modern_card": "cardVisual",
    "action_button": "actionButton",
    "button": "actionButton",
    # v3.5.0 additions
    "clustered_column": "clusteredColumnChart",
    "clustered_column_chart": "clusteredColumnChart",
    "clustered_bar": "clusteredBarChart",
    "clustered_bar_chart": "clusteredBarChart",
    "text_slicer": "textSlicer",
    "list_slicer": "listSlicer",
    # v3.6.0 additions
    "img": "image",
    "text_box": "textbox",
    "page_navigator": "pageNavigator",
    "page_nav": "pageNavigator",
    "navigator": "pageNavigator",
    "advanced_slicer": "advancedSlicerVisual",
    "adv_slicer": "advancedSlicerVisual",
    "tile_slicer": "advancedSlicerVisual",
    # v3.8.0 additions
    "azure_map": "azureMap",
    "map": "azureMap",
}


# -- Default theme -----------------------------------------------------------

DEFAULT_BASE_THEME = {
    "name": "CY24SU06",
    "reportVersionAtImport": "5.55",
    "type": "SharedResources",
}


# -- Dataclasses -------------------------------------------------------------


@dataclass(frozen=True)
class PbirPosition:
    """Visual position and dimensions on a page canvas."""

    x: float
    y: float
    width: float
    height: float
    z: int = 0
    tab_order: int = 0


@dataclass(frozen=True)
class PbirVisual:
    """Summary of a single PBIR visual."""

    name: str
    visual_type: str
    position: PbirPosition
    page_name: str
    folder_path: Path
    has_query: bool = False


@dataclass(frozen=True)
class PbirPage:
    """Summary of a single PBIR page."""

    name: str
    display_name: str
    ordinal: int
    width: int
    height: int
    display_option: str
    visual_count: int
    folder_path: Path


@dataclass(frozen=True)
class PbirReport:
    """Summary of a PBIR report."""

    name: str
    definition_path: Path
    page_count: int
    theme_name: str
    pages: list[PbirPage] = field(default_factory=list)


@dataclass(frozen=True)
class FieldBinding:
    """A single field binding for a visual data role."""

    role: str
    table: str
    column: str
    is_measure: bool = False

    @property
    def qualified_name(self) -> str:
        """Return Table[Column] notation."""
        return f"{self.table}[{self.column}]"
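The alias table above is typically consulted through a small lookup. A hedged sketch (hypothetical `resolve_visual_type` helper over a subset of the mapping; the CLI's actual matching rules, e.g. case handling, may differ):

```python
# Illustrative subset of VISUAL_TYPE_ALIASES
ALIASES: dict[str, str] = {
    "bar": "barChart",
    "matrix": "pivotTable",
    "combo": "lineStackedColumnComboChart",
    "map": "azureMap",
}


def resolve_visual_type(name: str) -> str:
    # Fall back to the input so canonical PBIR names pass through unchanged.
    return ALIASES.get(name.strip().lower(), name)
```

Canonical identifiers like `azureMap` fall through untouched, so callers can pass either a friendly alias or a raw PBIR `visualType`.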
163  src/pbi_cli/core/pbir_path.py  Normal file
@@ -0,0 +1,163 @@
"""PBIR report folder resolution and path utilities."""

from __future__ import annotations

from pathlib import Path

from pbi_cli.core.errors import ReportNotFoundError

# Maximum parent directories to walk up when auto-detecting
_MAX_WALK_UP = 5


def resolve_report_path(explicit_path: str | None = None) -> Path:
    """Resolve the PBIR definition folder path.

    Resolution order:
    1. Explicit ``--path`` provided by user.
    2. Walk up from CWD looking for ``*.Report/definition/report.json``.
    3. Look for a sibling ``.pbip`` file and derive the ``.Report`` folder.
    4. Raise ``ReportNotFoundError``.
    """
    if explicit_path is not None:
        return _resolve_explicit(Path(explicit_path))

    cwd = Path.cwd()

    # Try walk-up detection
    found = _find_definition_walkup(cwd)
    if found is not None:
        return found

    # Try .pbip sibling detection
    found = _find_from_pbip(cwd)
    if found is not None:
        return found

    raise ReportNotFoundError()


def _resolve_explicit(path: Path) -> Path:
    """Normalise an explicit path to the definition folder."""
    path = path.resolve()

    # User pointed directly at the definition folder
    if path.name == "definition" and (path / "report.json").exists():
        return path

    # User pointed at the .Report folder
    defn = path / "definition"
    if defn.is_dir() and (defn / "report.json").exists():
        return defn

    # User pointed at something that contains a .Report child
    for child in path.iterdir() if path.is_dir() else []:
        if child.name.endswith(".Report") and child.is_dir():
            defn = child / "definition"
            if (defn / "report.json").exists():
                return defn

    raise ReportNotFoundError(
        f"No PBIR definition found at '{path}'. "
        "Expected a folder containing definition/report.json."
    )


def _find_definition_walkup(start: Path) -> Path | None:
    """Walk up from *start* looking for a .Report/definition/ folder."""
    current = start.resolve()
    for _ in range(_MAX_WALK_UP):
        for child in current.iterdir():
            if child.is_dir() and child.name.endswith(".Report"):
                defn = child / "definition"
                if defn.is_dir() and (defn / "report.json").exists():
                    return defn
        parent = current.parent
        if parent == current:
            break
        current = parent
    return None


def _find_from_pbip(start: Path) -> Path | None:
    """Look for a .pbip file and derive the .Report folder."""
    if not start.is_dir():
        return None
    try:
        for item in start.iterdir():
            if item.is_file() and item.suffix == ".pbip":
                report_folder = start / f"{item.stem}.Report"
                defn = report_folder / "definition"
                if defn.is_dir() and (defn / "report.json").exists():
                    return defn
    except (OSError, PermissionError):
        return None
    return None


def get_pages_dir(definition_path: Path) -> Path:
    """Return the pages directory, creating it if needed."""
    pages = definition_path / "pages"
    pages.mkdir(exist_ok=True)
    return pages


def get_page_dir(definition_path: Path, page_name: str) -> Path:
    """Return the directory for a specific page."""
    return definition_path / "pages" / page_name


def get_visuals_dir(definition_path: Path, page_name: str) -> Path:
    """Return the visuals directory for a specific page."""
    visuals = definition_path / "pages" / page_name / "visuals"
    visuals.mkdir(parents=True, exist_ok=True)
    return visuals


def get_visual_dir(
    definition_path: Path, page_name: str, visual_name: str
) -> Path:
    """Return the directory for a specific visual."""
    return definition_path / "pages" / page_name / "visuals" / visual_name


def validate_report_structure(definition_path: Path) -> list[str]:
    """Check that the PBIR folder structure is valid.

    Returns a list of error messages (empty if valid).
    """
    errors: list[str] = []

    if not definition_path.is_dir():
        errors.append(f"Definition folder does not exist: {definition_path}")
        return errors

    report_json = definition_path / "report.json"
    if not report_json.exists():
        errors.append("Missing required file: report.json")

    version_json = definition_path / "version.json"
    if not version_json.exists():
        errors.append("Missing required file: version.json")

    pages_dir = definition_path / "pages"
    if pages_dir.is_dir():
        for page_dir in sorted(pages_dir.iterdir()):
            if not page_dir.is_dir():
                continue
            page_json = page_dir / "page.json"
            if not page_json.exists():
                errors.append(f"Page folder '{page_dir.name}' missing page.json")
            visuals_dir = page_dir / "visuals"
            if visuals_dir.is_dir():
                for visual_dir in sorted(visuals_dir.iterdir()):
                    if not visual_dir.is_dir():
                        continue
                    visual_json = visual_dir / "visual.json"
                    if not visual_json.exists():
                        errors.append(
                            f"Visual folder '{page_dir.name}/visuals/{visual_dir.name}' "
                            "missing visual.json"
                        )

    return errors
435  src/pbi_cli/core/pbir_validators.py  Normal file
@@ -0,0 +1,435 @@
"""Enhanced PBIR validation beyond basic structure checks.
|
||||
|
||||
Provides three tiers of validation:
|
||||
1. Structural: folder layout and file existence (in pbir_path.py)
|
||||
2. Schema: required fields, valid types, cross-file consistency
|
||||
3. Model-aware: field bindings against a connected semantic model (optional)
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from dataclasses import dataclass
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class ValidationResult:
|
||||
"""Immutable container for a single validation finding."""
|
||||
|
||||
level: str # "error", "warning", "info"
|
||||
file: str
|
||||
message: str
|
||||
|
||||
|
||||
def validate_report_full(definition_path: Path) -> dict[str, Any]:
|
||||
"""Run all validation tiers and return a structured report.
|
||||
|
||||
Returns a dict with ``valid``, ``errors``, ``warnings``, and ``summary``.
|
||||
"""
|
||||
findings: list[ValidationResult] = []
|
||||
|
||||
# Tier 1: structural (reuse existing)
|
||||
from pbi_cli.core.pbir_path import validate_report_structure
|
||||
|
||||
structural = validate_report_structure(definition_path)
|
||||
for msg in structural:
|
||||
findings.append(ValidationResult("error", "", msg))
|
||||
|
||||
if not definition_path.is_dir():
|
||||
return _build_result(findings)
|
||||
|
||||
# Tier 2: JSON syntax
|
||||
findings.extend(_validate_json_syntax(definition_path))
|
||||
|
||||
# Tier 2: schema validation per file type
|
||||
findings.extend(_validate_report_json(definition_path))
|
||||
findings.extend(_validate_version_json(definition_path))
|
||||
findings.extend(_validate_pages_metadata(definition_path))
|
||||
findings.extend(_validate_all_pages(definition_path))
|
||||
findings.extend(_validate_all_visuals(definition_path))
|
||||
|
||||
# Tier 2: cross-file consistency
|
||||
findings.extend(_validate_page_order_consistency(definition_path))
|
||||
findings.extend(_validate_visual_name_uniqueness(definition_path))
|
||||
|
||||
return _build_result(findings)
|
||||
|
||||
|
||||
def validate_bindings_against_model(
|
||||
definition_path: Path,
|
||||
model_tables: list[dict[str, Any]],
|
||||
) -> list[ValidationResult]:
|
||||
"""Tier 3: cross-reference visual field bindings against a model.
|
||||
|
||||
``model_tables`` should be a list of dicts with 'name' and 'columns' keys,
|
||||
where 'columns' is a list of dicts with 'name' keys. Measures are included
|
||||
as columns.
|
||||
"""
|
||||
findings: list[ValidationResult] = []
|
||||
|
||||
# Build lookup set
|
||||
valid_fields: set[str] = set()
|
||||
for table in model_tables:
|
||||
table_name = table.get("name", "")
|
||||
for col in table.get("columns", []):
|
||||
valid_fields.add(f"{table_name}[{col.get('name', '')}]")
|
||||
for mea in table.get("measures", []):
|
||||
valid_fields.add(f"{table_name}[{mea.get('name', '')}]")
|
||||
|
||||
pages_dir = definition_path / "pages"
|
||||
if not pages_dir.is_dir():
|
||||
return findings
|
||||
|
||||
for page_dir in sorted(pages_dir.iterdir()):
|
||||
if not page_dir.is_dir():
|
||||
continue
|
||||
visuals_dir = page_dir / "visuals"
|
||||
if not visuals_dir.is_dir():
|
||||
continue
|
||||
for vdir in sorted(visuals_dir.iterdir()):
|
||||
if not vdir.is_dir():
|
||||
continue
|
||||
vfile = vdir / "visual.json"
|
||||
if not vfile.exists():
|
||||
continue
|
||||
try:
|
||||
data = json.loads(vfile.read_text(encoding="utf-8"))
|
||||
visual_config = data.get("visual", {})
|
||||
query = visual_config.get("query", {})
|
||||
|
||||
# Check Commands-based bindings
|
||||
for cmd in query.get("Commands", []):
|
||||
sq = cmd.get("SemanticQueryDataShapeCommand", {}).get("Query", {})
|
||||
sources = {s["Name"]: s["Entity"] for s in sq.get("From", [])}
|
||||
for sel in sq.get("Select", []):
|
||||
ref = _extract_field_ref(sel, sources)
|
||||
if ref and ref not in valid_fields:
|
||||
rel = f"{page_dir.name}/visuals/{vdir.name}"
|
||||
findings.append(ValidationResult(
|
||||
"warning",
|
||||
rel,
|
||||
f"Field '{ref}' not found in semantic model",
|
||||
))
|
||||
except (json.JSONDecodeError, KeyError, TypeError):
|
||||
continue
|
||||
|
||||
return findings
|


# ---------------------------------------------------------------------------
# Tier 2 validators
# ---------------------------------------------------------------------------


def _validate_json_syntax(definition_path: Path) -> list[ValidationResult]:
    """Check all JSON files parse without errors."""
    findings: list[ValidationResult] = []
    for json_file in definition_path.rglob("*.json"):
        try:
            json.loads(json_file.read_text(encoding="utf-8"))
        except json.JSONDecodeError as e:
            rel = str(json_file.relative_to(definition_path))
            findings.append(ValidationResult("error", rel, f"Invalid JSON: {e}"))
    return findings


def _validate_report_json(definition_path: Path) -> list[ValidationResult]:
    """Validate report.json required fields and schema."""
    findings: list[ValidationResult] = []
    report_json = definition_path / "report.json"
    if not report_json.exists():
        return findings  # Structural check already caught this

    try:
        data = json.loads(report_json.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        return findings

    if "$schema" not in data:
        findings.append(ValidationResult("warning", "report.json", "Missing $schema reference"))

    if "themeCollection" not in data:
        findings.append(ValidationResult(
            "error", "report.json", "Missing required 'themeCollection'"
        ))
    else:
        tc = data["themeCollection"]
        if "baseTheme" not in tc:
            findings.append(ValidationResult(
                "warning", "report.json", "themeCollection missing 'baseTheme'"
            ))

    if "layoutOptimization" not in data:
        findings.append(ValidationResult(
            "error", "report.json", "Missing required 'layoutOptimization'"
        ))

    return findings


def _validate_version_json(definition_path: Path) -> list[ValidationResult]:
    """Validate version.json content."""
    findings: list[ValidationResult] = []
    version_json = definition_path / "version.json"
    if not version_json.exists():
        return findings

    try:
        data = json.loads(version_json.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        return findings

    if "version" not in data:
        findings.append(ValidationResult("error", "version.json", "Missing required 'version'"))

    return findings


def _validate_pages_metadata(definition_path: Path) -> list[ValidationResult]:
    """Validate pages.json if present."""
    findings: list[ValidationResult] = []
    pages_json = definition_path / "pages" / "pages.json"
    if not pages_json.exists():
        return findings

    try:
        data = json.loads(pages_json.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        return findings

    page_order = data.get("pageOrder", [])
    if not isinstance(page_order, list):
        findings.append(ValidationResult(
            "error", "pages/pages.json", "'pageOrder' must be an array"
        ))

    return findings


def _validate_all_pages(definition_path: Path) -> list[ValidationResult]:
    """Validate individual page.json files."""
    findings: list[ValidationResult] = []
    pages_dir = definition_path / "pages"
    if not pages_dir.is_dir():
        return findings

    for page_dir in sorted(pages_dir.iterdir()):
        if not page_dir.is_dir():
            continue
        page_json = page_dir / "page.json"
        if not page_json.exists():
            continue

        try:
            data = json.loads(page_json.read_text(encoding="utf-8"))
        except json.JSONDecodeError:
            continue

        rel = f"pages/{page_dir.name}/page.json"

        for req in ("name", "displayName", "displayOption"):
            if req not in data:
                findings.append(ValidationResult("error", rel, f"Missing required '{req}'"))

        valid_options = {
            "FitToPage", "FitToWidth", "ActualSize",
            "ActualSizeTopLeft", "DeprecatedDynamic",
        }
        opt = data.get("displayOption")
        if opt and opt not in valid_options:
            findings.append(ValidationResult(
                "warning", rel, f"Unknown displayOption '{opt}'"
            ))

        if opt != "DeprecatedDynamic":
            if "width" not in data:
                findings.append(ValidationResult("error", rel, "Missing required 'width'"))
            if "height" not in data:
                findings.append(ValidationResult("error", rel, "Missing required 'height'"))

        name = data.get("name", "")
        if name and len(name) > 50:
            findings.append(ValidationResult(
                "warning", rel, f"Name exceeds 50 chars: '{name[:20]}...'"
            ))

    return findings


def _validate_all_visuals(definition_path: Path) -> list[ValidationResult]:
    """Validate individual visual.json files."""
    findings: list[ValidationResult] = []
    pages_dir = definition_path / "pages"
    if not pages_dir.is_dir():
        return findings

    for page_dir in sorted(pages_dir.iterdir()):
        if not page_dir.is_dir():
            continue
        visuals_dir = page_dir / "visuals"
        if not visuals_dir.is_dir():
            continue
        for vdir in sorted(visuals_dir.iterdir()):
            if not vdir.is_dir():
                continue
            vfile = vdir / "visual.json"
            if not vfile.exists():
                continue

            try:
                data = json.loads(vfile.read_text(encoding="utf-8"))
            except json.JSONDecodeError:
                continue

            rel = f"pages/{page_dir.name}/visuals/{vdir.name}/visual.json"

            if "name" not in data:
                findings.append(ValidationResult("error", rel, "Missing required 'name'"))

            if "position" not in data:
                findings.append(ValidationResult("error", rel, "Missing required 'position'"))
            else:
                pos = data["position"]
                for req in ("x", "y", "width", "height"):
                    if req not in pos:
                        findings.append(ValidationResult(
                            "error", rel, f"Position missing required '{req}'"
                        ))

            visual_config = data.get("visual", {})
            vtype = visual_config.get("visualType", "")
            if not vtype:
                # Could be a visualGroup, which is also valid
                if "visualGroup" not in data:
                    findings.append(ValidationResult(
                        "warning", rel, "Missing 'visual.visualType' (not a visual group either)"
                    ))

    return findings


# ---------------------------------------------------------------------------
# Cross-file consistency
# ---------------------------------------------------------------------------


def _validate_page_order_consistency(definition_path: Path) -> list[ValidationResult]:
    """Check that pages.json references match actual page folders."""
    findings: list[ValidationResult] = []
    pages_json = definition_path / "pages" / "pages.json"
    if not pages_json.exists():
        return findings

    try:
        data = json.loads(pages_json.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        return findings

    page_order = data.get("pageOrder", [])
    pages_dir = definition_path / "pages"

    actual_pages = {
        d.name
        for d in pages_dir.iterdir()
        if d.is_dir() and (d / "page.json").exists()
    }

    for name in page_order:
        if name not in actual_pages:
            findings.append(ValidationResult(
                "warning",
                "pages/pages.json",
                f"pageOrder references '{name}' but no such page folder exists",
            ))

    unlisted = actual_pages - set(page_order)
    for name in sorted(unlisted):
        findings.append(ValidationResult(
            "info",
            "pages/pages.json",
            f"Page '{name}' exists but is not listed in pageOrder",
        ))

    return findings


def _validate_visual_name_uniqueness(definition_path: Path) -> list[ValidationResult]:
    """Check that visual names are unique within each page."""
    findings: list[ValidationResult] = []
    pages_dir = definition_path / "pages"
    if not pages_dir.is_dir():
        return findings

    for page_dir in sorted(pages_dir.iterdir()):
        if not page_dir.is_dir():
            continue
        visuals_dir = page_dir / "visuals"
        if not visuals_dir.is_dir():
            continue

        names_seen: dict[str, str] = {}
        for vdir in sorted(visuals_dir.iterdir()):
            if not vdir.is_dir():
                continue
            vfile = vdir / "visual.json"
            if not vfile.exists():
                continue
            try:
                data = json.loads(vfile.read_text(encoding="utf-8"))
                name = data.get("name", "")
                if name in names_seen:
                    rel = f"pages/{page_dir.name}/visuals/{vdir.name}/visual.json"
                    findings.append(ValidationResult(
                        "error",
                        rel,
                        f"Duplicate visual name '{name}' (also in {names_seen[name]})",
                    ))
                else:
                    names_seen[name] = vdir.name
            except (json.JSONDecodeError, KeyError):
                continue

    return findings


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------


def _build_result(findings: list[ValidationResult]) -> dict[str, Any]:
    """Build the final validation report dict."""
    errors = [f for f in findings if f.level == "error"]
    warnings = [f for f in findings if f.level == "warning"]
    infos = [f for f in findings if f.level == "info"]

    return {
        "valid": len(errors) == 0,
        "errors": [{"file": f.file, "message": f.message} for f in errors],
        "warnings": [{"file": f.file, "message": f.message} for f in warnings],
        "info": [{"file": f.file, "message": f.message} for f in infos],
        "summary": {
            "errors": len(errors),
            "warnings": len(warnings),
            "info": len(infos),
        },
    }


def _extract_field_ref(
    select_item: dict[str, Any], sources: dict[str, str]
) -> str | None:
    """Extract a Table[Column] reference from a semantic query select item."""
    for kind in ("Column", "Measure"):
        if kind in select_item:
            item = select_item[kind]
            source_name = (
                item.get("Expression", {}).get("SourceRef", {}).get("Source", "")
            )
            prop = item.get("Property", "")
            table = sources.get(source_name, source_name)
            if table and prop:
                return f"{table}[{prop}]"
    return None

src/pbi_cli/core/report_backend.py (new file, 797 lines)
@@ -0,0 +1,797 @@
"""Pure-function backend for PBIR report and page operations.
|
||||
|
||||
Mirrors ``tom_backend.py`` but operates on JSON files instead of .NET TOM.
|
||||
Every function takes a ``Path`` to the definition folder and returns a plain
|
||||
Python dict suitable for ``format_result()``.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import re
|
||||
import secrets
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from pbi_cli.core.errors import PbiCliError
|
||||
from pbi_cli.core.pbir_models import (
|
||||
DEFAULT_BASE_THEME,
|
||||
SCHEMA_PAGE,
|
||||
SCHEMA_PAGES_METADATA,
|
||||
SCHEMA_REPORT,
|
||||
SCHEMA_VERSION,
|
||||
)
|
||||
from pbi_cli.core.pbir_path import (
|
||||
get_page_dir,
|
||||
get_pages_dir,
|
||||
validate_report_structure,
|
||||
)
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# JSON helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _read_json(path: Path) -> dict[str, Any]:
|
||||
"""Read and parse a JSON file."""
|
||||
return json.loads(path.read_text(encoding="utf-8"))
|
||||
|
||||
|
||||
def _write_json(path: Path, data: dict[str, Any]) -> None:
|
||||
"""Write JSON with consistent formatting."""
|
||||
path.write_text(
|
||||
json.dumps(data, indent=2, ensure_ascii=False) + "\n",
|
||||
encoding="utf-8",
|
||||
)
|
||||
|
||||
|
||||
def _generate_name() -> str:
|
||||
"""Generate a 20-character hex identifier matching PBIR convention."""
|
||||
return secrets.token_hex(10)
|
||||
|
||||
|
||||


# ---------------------------------------------------------------------------
# Report operations
# ---------------------------------------------------------------------------


def report_info(definition_path: Path) -> dict[str, Any]:
    """Return report metadata summary."""
    report_data = _read_json(definition_path / "report.json")
    pages_dir = definition_path / "pages"

    pages: list[dict[str, Any]] = []
    if pages_dir.is_dir():
        for page_dir in sorted(pages_dir.iterdir()):
            if not page_dir.is_dir():
                continue
            page_json = page_dir / "page.json"
            if page_json.exists():
                page_data = _read_json(page_json)
                visual_count = 0
                visuals_dir = page_dir / "visuals"
                if visuals_dir.is_dir():
                    visual_count = sum(
                        1
                        for v in visuals_dir.iterdir()
                        if v.is_dir() and (v / "visual.json").exists()
                    )
                pages.append({
                    "name": page_data.get("name", page_dir.name),
                    "display_name": page_data.get("displayName", ""),
                    "ordinal": page_data.get("ordinal", 0),
                    "visual_count": visual_count,
                })

    theme = report_data.get("themeCollection", {}).get("baseTheme", {})

    return {
        "page_count": len(pages),
        "theme": theme.get("name", "Default"),
        "pages": pages,
        "total_visuals": sum(p["visual_count"] for p in pages),
        "path": str(definition_path),
    }


def report_create(
    target_path: Path,
    name: str,
    dataset_path: str | None = None,
) -> dict[str, Any]:
    """Scaffold a new PBIR report project structure.

    Creates:
        <target_path>/<name>.Report/definition/report.json
        <target_path>/<name>.Report/definition/version.json
        <target_path>/<name>.Report/definition/pages/ (empty)
        <target_path>/<name>.Report/definition.pbir
        <target_path>/<name>.pbip (optional project file)
    """
    target_path = target_path.resolve()
    report_folder = target_path / f"{name}.Report"
    definition_dir = report_folder / "definition"
    pages_dir = definition_dir / "pages"
    pages_dir.mkdir(parents=True, exist_ok=True)

    # version.json
    _write_json(definition_dir / "version.json", {
        "$schema": SCHEMA_VERSION,
        "version": "2.0.0",
    })

    # report.json (matches Desktop defaults)
    _write_json(definition_dir / "report.json", {
        "$schema": SCHEMA_REPORT,
        "themeCollection": {
            "baseTheme": dict(DEFAULT_BASE_THEME),
        },
        "layoutOptimization": "None",
        "settings": {
            "useStylableVisualContainerHeader": True,
            "defaultDrillFilterOtherVisuals": True,
            "allowChangeFilterTypes": True,
            "useEnhancedTooltips": True,
            "useDefaultAggregateDisplayName": True,
        },
        "slowDataSourceSettings": {
            "isCrossHighlightingDisabled": False,
            "isSlicerSelectionsButtonEnabled": False,
            "isFilterSelectionsButtonEnabled": False,
            "isFieldWellButtonEnabled": False,
            "isApplyAllButtonEnabled": False,
        },
    })

    # pages.json (empty page order)
    _write_json(definition_dir / "pages" / "pages.json", {
        "$schema": SCHEMA_PAGES_METADATA,
        "pageOrder": [],
    })

    # Scaffold a blank semantic model if no dataset path provided
    if not dataset_path:
        dataset_path = f"../{name}.SemanticModel"
        _scaffold_blank_semantic_model(target_path, name)

    # definition.pbir (datasetReference is REQUIRED by Desktop)
    _write_json(report_folder / "definition.pbir", {
        "version": "4.0",
        "datasetReference": {
            "byPath": {"path": dataset_path},
        },
    })

    # .platform file for the report
    _write_json(report_folder / ".platform", {
        "$schema": (
            "https://developer.microsoft.com/json-schemas/"
            "fabric/gitIntegration/platformProperties/2.0.0/schema.json"
        ),
        "metadata": {
            "type": "Report",
            "displayName": name,
        },
        "config": {
            "version": "2.0",
            "logicalId": "00000000-0000-0000-0000-000000000000",
        },
    })

    # .pbip project file
    _write_json(target_path / f"{name}.pbip", {
        "version": "1.0",
        "artifacts": [
            {
                "report": {"path": f"{name}.Report"},
            }
        ],
    })

    return {
        "status": "created",
        "name": name,
        "path": str(report_folder),
        "definition_path": str(definition_dir),
    }


def report_validate(definition_path: Path) -> dict[str, Any]:
    """Validate the PBIR report structure and JSON files.

    Returns a dict with ``valid`` bool and ``errors`` list.
    """
    errors = validate_report_structure(definition_path)

    # Validate JSON syntax of all files
    if definition_path.is_dir():
        for json_file in definition_path.rglob("*.json"):
            try:
                _read_json(json_file)
            except json.JSONDecodeError as e:
                rel = json_file.relative_to(definition_path)
                errors.append(f"Invalid JSON in {rel}: {e}")

    # Validate required schema fields
    report_json = definition_path / "report.json"
    if report_json.exists():
        try:
            data = _read_json(report_json)
            if "themeCollection" not in data:
                errors.append("report.json missing required 'themeCollection'")
            if "layoutOptimization" not in data:
                errors.append("report.json missing required 'layoutOptimization'")
        except json.JSONDecodeError:
            pass  # Already caught above

    # Validate pages
    pages_dir = definition_path / "pages"
    if pages_dir.is_dir():
        for page_dir in sorted(pages_dir.iterdir()):
            if not page_dir.is_dir():
                continue
            page_json = page_dir / "page.json"
            if page_json.exists():
                try:
                    pdata = _read_json(page_json)
                    for req in ("name", "displayName", "displayOption"):
                        if req not in pdata:
                            errors.append(
                                f"Page '{page_dir.name}' missing required '{req}'"
                            )
                except json.JSONDecodeError:
                    pass

    return {
        "valid": len(errors) == 0,
        "errors": errors,
        "files_checked": sum(1 for _ in definition_path.rglob("*.json"))
        if definition_path.is_dir()
        else 0,
    }


# ---------------------------------------------------------------------------
# Page operations
# ---------------------------------------------------------------------------


def page_list(definition_path: Path) -> list[dict[str, Any]]:
    """List all pages in the report."""
    pages_dir = definition_path / "pages"
    if not pages_dir.is_dir():
        return []

    # Read page order if available
    pages_meta = pages_dir / "pages.json"
    page_order: list[str] = []
    if pages_meta.exists():
        meta = _read_json(pages_meta)
        page_order = meta.get("pageOrder", [])

    results: list[dict[str, Any]] = []
    for page_dir in sorted(pages_dir.iterdir()):
        if not page_dir.is_dir():
            continue
        page_json = page_dir / "page.json"
        if not page_json.exists():
            continue
        data = _read_json(page_json)
        visual_count = 0
        visuals_dir = page_dir / "visuals"
        if visuals_dir.is_dir():
            visual_count = sum(
                1
                for v in visuals_dir.iterdir()
                if v.is_dir() and (v / "visual.json").exists()
            )
        results.append({
            "name": data.get("name", page_dir.name),
            "display_name": data.get("displayName", ""),
            "ordinal": data.get("ordinal", 0),
            "width": data.get("width", 1280),
            "height": data.get("height", 720),
            "display_option": data.get("displayOption", "FitToPage"),
            "visual_count": visual_count,
            "is_hidden": data.get("visibility") == "HiddenInViewMode",
            "page_type": data.get("type", "Default"),
        })

    # Sort by page order if available, then by ordinal
    if page_order:
        order_map = {name: i for i, name in enumerate(page_order)}
        results.sort(key=lambda p: order_map.get(p["name"], 9999))
    else:
        results.sort(key=lambda p: p["ordinal"])

    return results


def page_add(
    definition_path: Path,
    display_name: str,
    name: str | None = None,
    width: int = 1280,
    height: int = 720,
    display_option: str = "FitToPage",
) -> dict[str, Any]:
    """Add a new page to the report."""
    page_name = name or _generate_name()
    pages_dir = get_pages_dir(definition_path)
    page_dir = pages_dir / page_name

    if page_dir.exists():
        raise PbiCliError(f"Page '{page_name}' already exists.")

    page_dir.mkdir(parents=True)
    (page_dir / "visuals").mkdir()

    # Write page.json (no ordinal - Desktop uses pages.json pageOrder instead)
    _write_json(page_dir / "page.json", {
        "$schema": SCHEMA_PAGE,
        "name": page_name,
        "displayName": display_name,
        "displayOption": display_option,
        "height": height,
        "width": width,
    })

    # Update pages.json
    _update_page_order(definition_path, page_name, action="add")

    return {
        "status": "created",
        "name": page_name,
        "display_name": display_name,
    }


def page_delete(definition_path: Path, page_name: str) -> dict[str, Any]:
    """Delete a page and all its visuals."""
    page_dir = get_page_dir(definition_path, page_name)

    if not page_dir.exists():
        raise PbiCliError(f"Page '{page_name}' not found.")

    # Recursively remove
    _rmtree(page_dir)

    # Update pages.json
    _update_page_order(definition_path, page_name, action="remove")

    return {"status": "deleted", "name": page_name}


def page_get(definition_path: Path, page_name: str) -> dict[str, Any]:
    """Get details of a specific page."""
    page_dir = get_page_dir(definition_path, page_name)
    page_json = page_dir / "page.json"

    if not page_json.exists():
        raise PbiCliError(f"Page '{page_name}' not found.")

    data = _read_json(page_json)
    visual_count = 0
    visuals_dir = page_dir / "visuals"
    if visuals_dir.is_dir():
        visual_count = sum(
            1
            for v in visuals_dir.iterdir()
            if v.is_dir() and (v / "visual.json").exists()
        )

    return {
        "name": data.get("name", page_name),
        "display_name": data.get("displayName", ""),
        "ordinal": data.get("ordinal", 0),
        "width": data.get("width", 1280),
        "height": data.get("height", 720),
        "display_option": data.get("displayOption", "FitToPage"),
        "visual_count": visual_count,
        "is_hidden": data.get("visibility") == "HiddenInViewMode",
        "page_type": data.get("type", "Default"),
        "filter_config": data.get("filterConfig"),
        "visual_interactions": data.get("visualInteractions"),
        "page_binding": data.get("pageBinding"),
    }


def page_set_background(
    definition_path: Path,
    page_name: str,
    color: str,
) -> dict[str, Any]:
    """Set the background color of a page.

    Updates the ``objects.background`` property in ``page.json``.
    The color must be a hex string, e.g. ``'#F8F9FA'``.
    """
    if not re.fullmatch(r"#[0-9A-Fa-f]{3,8}", color):
        raise PbiCliError(
            f"Invalid color '{color}' -- expected hex format like '#F8F9FA'."
        )

    page_dir = get_page_dir(definition_path, page_name)
    page_json_path = page_dir / "page.json"
    if not page_json_path.exists():
        raise PbiCliError(f"Page '{page_name}' not found.")

    page_data = _read_json(page_json_path)
    background_entry = {
        "properties": {
            "color": {
                "solid": {
                    "color": {
                        "expr": {
                            "Literal": {"Value": f"'{color}'"}
                        }
                    }
                }
            }
        }
    }
    objects = {**page_data.get("objects", {}), "background": [background_entry]}
    _write_json(page_json_path, {**page_data, "objects": objects})
    return {"status": "updated", "page": page_name, "background_color": color}
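The `re.fullmatch` guard in `page_set_background` accepts 3 to 8 hex digits after the `#`, so `#RGB`, `#RRGGBB`, and `#RRGGBBAA` forms all pass while missing hashes, bad digits, and over-long strings are rejected. A quick standalone check of that pattern:

```python
import re

# Same pattern page_set_background uses to validate the color argument.
HEX_COLOR = r"#[0-9A-Fa-f]{3,8}"

for value in ("#FFF", "#F8F9FA", "#F8F9FA80"):
    assert re.fullmatch(HEX_COLOR, value) is not None  # valid hex colors pass

for value in ("F8F9FA", "#GGG", "#F8F9FA8000"):
    assert re.fullmatch(HEX_COLOR, value) is None  # no '#', bad digits, too long
```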


def page_set_visibility(
    definition_path: Path,
    page_name: str,
    hidden: bool,
) -> dict[str, Any]:
    """Show or hide a page in the report navigation.

    Setting ``hidden=True`` writes ``"visibility": "HiddenInViewMode"`` to
    ``page.json``. Setting ``hidden=False`` removes the key if present.
    """
    page_dir = get_page_dir(definition_path, page_name)
    page_json_path = page_dir / "page.json"
    if not page_json_path.exists():
        raise PbiCliError(f"Page '{page_name}' not found.")

    page_data = _read_json(page_json_path)
    if hidden:
        updated = {**page_data, "visibility": "HiddenInViewMode"}
    else:
        updated = {k: v for k, v in page_data.items() if k != "visibility"}
    _write_json(page_json_path, updated)
    return {"status": "updated", "page": page_name, "hidden": hidden}


# ---------------------------------------------------------------------------
# Theme operations
# ---------------------------------------------------------------------------


def theme_set(
    definition_path: Path, theme_path: Path
) -> dict[str, Any]:
    """Apply a custom theme JSON to the report."""
    if not theme_path.exists():
        raise PbiCliError(f"Theme file not found: {theme_path}")

    theme_data = _read_json(theme_path)
    report_json_path = definition_path / "report.json"
    report_data = _read_json(report_json_path)

    # Set custom theme
    theme_collection = report_data.get("themeCollection", {})
    theme_collection["customTheme"] = {
        "name": theme_data.get("name", theme_path.stem),
        "reportVersionAtImport": "5.55",
        "type": "RegisteredResources",
    }
    report_data["themeCollection"] = theme_collection

    # Copy theme file to RegisteredResources if needed
    report_folder = definition_path.parent
    resources_dir = report_folder / "StaticResources" / "RegisteredResources"
    resources_dir.mkdir(parents=True, exist_ok=True)
    theme_dest = resources_dir / theme_path.name
    theme_dest.write_text(
        theme_path.read_text(encoding="utf-8"), encoding="utf-8"
    )

    # Update resource packages in report.json
    resource_packages = report_data.get("resourcePackages", [])
    found = False
    for pkg in resource_packages:
        if pkg.get("name") == "RegisteredResources":
            found = True
            items = pkg.get("items", [])
            # Add or update theme entry
            theme_item = {
                "name": theme_path.name,
                "type": 202,
                "path": f"BaseThemes/{theme_path.name}",
            }
            existing_names = {i["name"] for i in items}
            if theme_path.name not in existing_names:
                items.append(theme_item)
            pkg["items"] = items
            break

    if not found:
        resource_packages.append({
            "name": "RegisteredResources",
            "type": "RegisteredResources",
            "items": [{
                "name": theme_path.name,
                "type": 202,
                "path": f"BaseThemes/{theme_path.name}",
            }],
        })
    report_data["resourcePackages"] = resource_packages

    _write_json(report_json_path, report_data)

    return {
        "status": "applied",
        "theme": theme_data.get("name", theme_path.stem),
        "file": str(theme_dest),
    }


def theme_get(definition_path: Path) -> dict[str, Any]:
    """Return current theme information for the report.

    Reads ``report.json`` to determine the base and custom theme names.
    If a custom theme is set and the theme file exists in
    ``StaticResources/RegisteredResources/``, the full theme JSON is also
    returned.

    Returns:
        ``{"base_theme": str, "custom_theme": str | None,
        "theme_data": dict | None}``
    """
    report_json_path = definition_path / "report.json"
    if not report_json_path.exists():
        raise PbiCliError("report.json not found -- is this a valid PBIR definition folder?")

    report_data = _read_json(report_json_path)
    theme_collection = report_data.get("themeCollection", {})

    base_theme = theme_collection.get("baseTheme", {}).get("name", "")
    custom_theme_info = theme_collection.get("customTheme")
    custom_theme_name: str | None = None
    theme_data: dict[str, Any] | None = None

    if custom_theme_info:
        custom_theme_name = custom_theme_info.get("name")
        # Try to load from RegisteredResources
        report_folder = definition_path.parent
        resources_dir = report_folder / "StaticResources" / "RegisteredResources"
        if resources_dir.is_dir():
            for candidate in resources_dir.glob("*.json"):
                try:
                    parsed = _read_json(candidate)
                    if parsed.get("name") == custom_theme_name:
                        theme_data = parsed
                        break
                except Exception:
                    continue

    return {
        "base_theme": base_theme,
        "custom_theme": custom_theme_name,
        "theme_data": theme_data,
    }


def theme_diff(definition_path: Path, theme_path: Path) -> dict[str, Any]:
    """Compare a proposed theme JSON file against the currently applied theme.

    If no custom theme is set, the diff compares against an empty dict
    (i.e. everything in the proposed file is an addition).

    Returns:
        ``{"current": str, "proposed": str,
        "added": list[str], "removed": list[str], "changed": list[str]}``
    """
    if not theme_path.exists():
        raise PbiCliError(f"Proposed theme file not found: {theme_path}")

    current_info = theme_get(definition_path)
    current_data: dict[str, Any] = current_info.get("theme_data") or {}
    proposed_data = _read_json(theme_path)

    current_name = current_info.get("custom_theme") or current_info.get("base_theme") or "(none)"
    proposed_name = proposed_data.get("name", theme_path.stem)

    added, removed, changed = _dict_diff(current_data, proposed_data)

    return {
        "current": current_name,
        "proposed": proposed_name,
        "added": added,
        "removed": removed,
        "changed": changed,
    }


def _dict_diff(
    current: dict[str, Any],
    proposed: dict[str, Any],
    prefix: str = "",
) -> tuple[list[str], list[str], list[str]]:
    """Recursively diff two dicts and return (added, removed, changed) key paths."""
    added: list[str] = []
    removed: list[str] = []
    changed: list[str] = []

    all_keys = set(current) | set(proposed)
    for key in sorted(all_keys):
        path = f"{prefix}{key}" if not prefix else f"{prefix}.{key}"
        if key not in current:
            added.append(path)
        elif key not in proposed:
            removed.append(path)
        elif isinstance(current[key], dict) and isinstance(proposed[key], dict):
            sub_added, sub_removed, sub_changed = _dict_diff(
                current[key], proposed[key], prefix=path
            )
            added.extend(sub_added)
            removed.extend(sub_removed)
            changed.extend(sub_changed)
        elif current[key] != proposed[key]:
            changed.append(path)

    return added, removed, changed
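A minimal standalone sketch of the `_dict_diff` recursion: nested dicts are compared key by key, scalars by equality, and each finding is reported as a dotted key path. The two theme fragments below are invented for illustration.

```python
def dict_diff(current, proposed, prefix=""):
    # Same recursion as _dict_diff above: dotted key paths, nested dicts
    # descended into, scalar values compared by equality.
    added, removed, changed = [], [], []
    for key in sorted(set(current) | set(proposed)):
        path = key if not prefix else f"{prefix}.{key}"
        if key not in current:
            added.append(path)
        elif key not in proposed:
            removed.append(path)
        elif isinstance(current[key], dict) and isinstance(proposed[key], dict):
            a, r, c = dict_diff(current[key], proposed[key], prefix=path)
            added.extend(a)
            removed.extend(r)
            changed.extend(c)
        elif current[key] != proposed[key]:
            changed.append(path)
    return added, removed, changed


# Invented theme fragments, not real Power BI themes:
current = {"name": "Old", "dataColors": {"a": "#111", "b": "#222"}}
proposed = {"name": "New", "dataColors": {"a": "#111", "c": "#333"}, "background": "#FFF"}
print(dict_diff(current, proposed))
# -> (['background', 'dataColors.c'], ['dataColors.b'], ['name'])
```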
# ---------------------------------------------------------------------------
|
||||
# Convert operations
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def report_convert(
    source_path: Path,
    output_path: Path | None = None,
    force: bool = False,
) -> dict[str, Any]:
    """Convert a PBIR report project to a distributable .pbip package.

    This scaffolds the proper .pbip project structure from an existing
    .Report folder. It does NOT convert .pbix to .pbip (that requires
    Power BI Desktop's "Save as .pbip" feature).
    """
    source_path = source_path.resolve()

    # Find the .Report folder
    report_folder: Path | None = None
    if source_path.name.endswith(".Report") and source_path.is_dir():
        report_folder = source_path
    else:
        for child in source_path.iterdir():
            if child.is_dir() and child.name.endswith(".Report"):
                report_folder = child
                break

    if report_folder is None:
        raise PbiCliError(
            f"No .Report folder found in '{source_path}'. "
            "Expected a folder ending in .Report."
        )

    # removesuffix avoids mangling names that contain ".Report" elsewhere
    name = report_folder.name.removesuffix(".Report")
    target = output_path.resolve() if output_path else source_path

    # Create the .pbip file
    pbip_path = target / f"{name}.pbip"
    if pbip_path.exists() and not force:
        raise PbiCliError(
            f".pbip file already exists at '{pbip_path}'. Use --force to overwrite."
        )
    _write_json(pbip_path, {
        "version": "1.0",
        "artifacts": [
            {"report": {"path": f"{name}.Report"}},
        ],
    })

    # Create a .gitignore if not present
    gitignore = target / ".gitignore"
    gitignore_created = not gitignore.exists()
    if gitignore_created:
        gitignore_content = (
            "# Power BI local settings\n"
            ".pbi/\n"
            "*.pbix\n"
            "*.bak\n"
        )
        gitignore.write_text(gitignore_content, encoding="utf-8")

    # Report whether definition.pbir exists (informational, not enforced)
    defn_pbir = report_folder / "definition.pbir"

    return {
        "status": "converted",
        "name": name,
        "pbip_path": str(pbip_path),
        "report_folder": str(report_folder),
        "has_definition_pbir": defn_pbir.exists(),
        "gitignore_created": gitignore_created,
    }
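For reference, the .pbip file written here is a tiny JSON pointer at the `.Report` folder. This sketch builds the same payload for a hypothetical report named `Sales` and serializes it the way `_write_json` does (2-space indent, trailing newline):

```python
import json

name = "Sales"  # hypothetical report name
pbip = {
    "version": "1.0",
    "artifacts": [
        {"report": {"path": f"{name}.Report"}},
    ],
}
# Serialize the same way _write_json does
text = json.dumps(pbip, indent=2, ensure_ascii=False) + "\n"
print(text)
```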


# ---------------------------------------------------------------------------
# Internal helpers
# ---------------------------------------------------------------------------


def _scaffold_blank_semantic_model(target_path: Path, name: str) -> None:
    """Create a minimal TMDL semantic model so Desktop can open the report."""
    model_dir = target_path / f"{name}.SemanticModel"
    defn_dir = model_dir / "definition"
    defn_dir.mkdir(parents=True, exist_ok=True)

    # model.tmdl (minimal valid TMDL)
    (defn_dir / "model.tmdl").write_text(
        "model Model\n"
        " culture: en-US\n"
        " defaultPowerBIDataSourceVersion: powerBI_V3\n",
        encoding="utf-8",
    )

    # .platform file (required by Desktop)
    _write_json(model_dir / ".platform", {
        "$schema": (
            "https://developer.microsoft.com/json-schemas/"
            "fabric/gitIntegration/platformProperties/2.0.0/schema.json"
        ),
        "metadata": {
            "type": "SemanticModel",
            "displayName": name,
        },
        "config": {
            "version": "2.0",
            "logicalId": "00000000-0000-0000-0000-000000000000",
        },
    })

    # definition.pbism (matches Desktop format)
    _write_json(model_dir / "definition.pbism", {
        "version": "4.1",
        "settings": {},
    })


def _update_page_order(
    definition_path: Path, page_name: str, action: str
) -> None:
    """Update pages.json after a page add/remove."""
    pages_meta_path = definition_path / "pages" / "pages.json"

    if pages_meta_path.exists():
        meta = _read_json(pages_meta_path)
    else:
        meta = {"$schema": SCHEMA_PAGES_METADATA}

    order = meta.get("pageOrder", [])

    if action == "add" and page_name not in order:
        order.append(page_name)
    elif action == "remove" and page_name in order:
        order = [p for p in order if p != page_name]

    meta["pageOrder"] = order

    # Keep activePageName valid (Desktop requires one); default to the first page
    if order:
        meta["activePageName"] = meta.get("activePageName", order[0])
        # If the active page was removed, reset to the first page
        if meta["activePageName"] not in order:
            meta["activePageName"] = order[0]

    _write_json(pages_meta_path, meta)


def _rmtree(path: Path) -> None:
    """Recursively remove a directory tree (stdlib-only)."""
    if path.is_dir():
        for child in path.iterdir():
            _rmtree(child)
        path.rmdir()
    else:
        path.unlink()

329  src/pbi_cli/core/tmdl_diff.py  Normal file

@@ -0,0 +1,329 @@

"""TMDL folder diff -- pure Python, no .NET required."""

from __future__ import annotations

import re
from pathlib import Path
from typing import Any

from pbi_cli.core.errors import PbiCliError

# Entity keywords inside table files (at 1-tab indent).
# "variation" is intentionally excluded: it is a sub-property of a column,
# not a sibling entity, so its content stays inside the parent column block.
_TABLE_ENTITY_KEYWORDS = frozenset({"measure", "column", "hierarchy", "partition"})


def diff_tmdl_folders(base_folder: str, head_folder: str) -> dict[str, Any]:
    """Compare two TMDL export folders and return a structured diff.

    Works on any two folders produced by ``pbi database export-tmdl`` or
    exported from Power BI Desktop / Fabric Git. No live connection needed.

    Returns a dict with keys: base, head, changed, summary, tables,
    relationships, model.
    """
    base = Path(base_folder)
    head = Path(head_folder)
    if not base.is_dir():
        raise PbiCliError(f"Base folder not found: {base}")
    if not head.is_dir():
        raise PbiCliError(f"Head folder not found: {head}")

    base_def = _find_definition_dir(base)
    head_def = _find_definition_dir(head)

    tables_diff = _diff_tables(base_def, head_def)
    rels_diff = _diff_relationships(base_def, head_def)
    model_diff = _diff_model(base_def, head_def)

    any_changed = bool(
        tables_diff["added"]
        or tables_diff["removed"]
        or tables_diff["changed"]
        or rels_diff["added"]
        or rels_diff["removed"]
        or rels_diff["changed"]
        or model_diff["changed_properties"]
    )

    summary: dict[str, Any] = {
        "tables_added": len(tables_diff["added"]),
        "tables_removed": len(tables_diff["removed"]),
        "tables_changed": len(tables_diff["changed"]),
        "relationships_added": len(rels_diff["added"]),
        "relationships_removed": len(rels_diff["removed"]),
        "relationships_changed": len(rels_diff["changed"]),
        "model_changed": bool(model_diff["changed_properties"]),
    }

    return {
        "base": str(base),
        "head": str(head),
        "changed": any_changed,
        "summary": summary,
        "tables": tables_diff,
        "relationships": rels_diff,
        "model": model_diff,
    }


def _find_definition_dir(folder: Path) -> Path:
    """Return the directory that directly contains model.tmdl / tables/.

    Handles both:
    - Direct layout: folder/model.tmdl
    - SemanticModel: folder/definition/model.tmdl
    """
    candidate = folder / "definition"
    if candidate.is_dir():
        return candidate
    return folder


def _read_tmdl(path: Path) -> str:
    """Read a TMDL file, returning an empty string if absent."""
    if not path.exists():
        return ""
    return path.read_text(encoding="utf-8")


def _strip_lineage_tags(text: str) -> str:
    """Remove lineageTag lines so spurious GUID regeneration is ignored."""
    return re.sub(r"[ \t]*lineageTag:.*\n?", "", text)
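The stripping regex removes whole `lineageTag:` lines including the newline, so two exports that differ only in regenerated GUIDs compare equal. A standalone check with the same pattern:

```python
import re


def strip_lineage_tags(text: str) -> str:
    # Same pattern as _strip_lineage_tags above
    return re.sub(r"[ \t]*lineageTag:.*\n?", "", text)


a = "\tcolumn Amount\n\t\tlineageTag: 1111-aaaa\n\t\tdataType: int64\n"
b = "\tcolumn Amount\n\t\tlineageTag: 2222-bbbb\n\t\tdataType: int64\n"
print(strip_lineage_tags(a) == strip_lineage_tags(b))
```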


# ---------------------------------------------------------------------------
# Table diffing
# ---------------------------------------------------------------------------


def _diff_tables(base_def: Path, head_def: Path) -> dict[str, Any]:
    base_tables_dir = base_def / "tables"
    head_tables_dir = head_def / "tables"

    base_names = _list_tmdl_names(base_tables_dir)
    head_names = _list_tmdl_names(head_tables_dir)

    added = sorted(head_names - base_names)
    removed = sorted(base_names - head_names)
    changed: dict[str, Any] = {}

    for name in sorted(base_names & head_names):
        base_text = _read_tmdl(base_tables_dir / f"{name}.tmdl")
        head_text = _read_tmdl(head_tables_dir / f"{name}.tmdl")
        if _strip_lineage_tags(base_text) == _strip_lineage_tags(head_text):
            continue
        table_diff = _diff_table_entities(base_text, head_text)
        if any(table_diff[k] for k in table_diff):
            changed[name] = table_diff

    return {"added": added, "removed": removed, "changed": changed}


def _list_tmdl_names(tables_dir: Path) -> set[str]:
    """Return the stem names of all .tmdl files in a directory."""
    if not tables_dir.is_dir():
        return set()
    return {p.stem for p in tables_dir.glob("*.tmdl")}


def _diff_table_entities(
    base_text: str, head_text: str
) -> dict[str, list[str]]:
    """Compare entity blocks within two table TMDL files."""
    base_entities = _parse_table_entities(base_text)
    head_entities = _parse_table_entities(head_text)

    result: dict[str, list[str]] = {
        "measures_added": [],
        "measures_removed": [],
        "measures_changed": [],
        "columns_added": [],
        "columns_removed": [],
        "columns_changed": [],
        "partitions_added": [],
        "partitions_removed": [],
        "partitions_changed": [],
        "hierarchies_added": [],
        "hierarchies_removed": [],
        "hierarchies_changed": [],
        "other_added": [],
        "other_removed": [],
        "other_changed": [],
    }

    # Map TMDL keywords to their plural result-dict prefix
    keyword_plurals: dict[str, str] = {
        "measure": "measures",
        "column": "columns",
        "partition": "partitions",
        "hierarchy": "hierarchies",
    }

    all_keys = set(base_entities) | set(head_entities)
    for key in sorted(all_keys):
        keyword, _, name = key.partition("/")
        plural = keyword_plurals.get(keyword, "other")
        added_key = f"{plural}_added"
        removed_key = f"{plural}_removed"
        changed_key = f"{plural}_changed"

        if key not in base_entities:
            result[added_key].append(name)
        elif key not in head_entities:
            result[removed_key].append(name)
        else:
            b = _strip_lineage_tags(base_entities[key])
            h = _strip_lineage_tags(head_entities[key])
            if b != h:
                result[changed_key].append(name)

    # Remove empty other_* lists to keep the output clean
    for k in ("other_added", "other_removed", "other_changed"):
        if not result[k]:
            del result[k]

    return result


def _parse_table_entities(text: str) -> dict[str, str]:
    """Parse a table TMDL file into {keyword/name: text_block} entries.

    Entities (measure, column, hierarchy, partition) start at exactly one
    tab of indentation inside the table declaration.
    """
    entities: dict[str, str] = {}
    lines = text.splitlines(keepends=True)
    current_key: str | None = None
    current_lines: list[str] = []

    for line in lines:
        # Entity declaration: starts with exactly one tab, not two
        if line.startswith("\t") and not line.startswith("\t\t"):
            stripped = line[1:]  # remove the leading tab
            keyword = stripped.split()[0] if stripped.split() else ""
            if keyword in _TABLE_ENTITY_KEYWORDS:
                # Save the previous block
                if current_key is not None:
                    entities[current_key] = "".join(current_lines)
                name = _extract_entity_name(keyword, stripped)
                current_key = f"{keyword}/{name}"
                current_lines = [line]
                continue

        if current_key is not None:
            current_lines.append(line)

    if current_key is not None:
        entities[current_key] = "".join(current_lines)

    return entities
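A standalone run of the same block-splitting idea on a tiny made-up tab-indented table file (name extraction is simplified here; the real parser delegates to `_extract_entity_name`):

```python
KEYWORDS = {"measure", "column", "hierarchy", "partition"}

text = (
    "table Sales\n"
    "\tmeasure 'Total Revenue' = SUM(Sales[Amount])\n"
    "\t\tformatString: #,0\n"
    "\tcolumn Amount\n"
    "\t\tdataType: int64\n"
)

entities: dict[str, str] = {}
current_key = None
current_lines: list[str] = []
for line in text.splitlines(keepends=True):
    # An entity declaration sits at exactly one tab of indentation
    if line.startswith("\t") and not line.startswith("\t\t"):
        stripped = line[1:]
        keyword = stripped.split()[0] if stripped.split() else ""
        if keyword in KEYWORDS:
            if current_key is not None:
                entities[current_key] = "".join(current_lines)
            # Simplified name extraction: quoted name or first token
            rest = stripped[len(keyword):].strip()
            if rest.startswith("'"):
                name = rest[1:rest.find("'", 1)]
            else:
                name = rest.split()[0]
            current_key = f"{keyword}/{name}"
            current_lines = [line]
            continue
    if current_key is not None:
        current_lines.append(line)
if current_key is not None:
    entities[current_key] = "".join(current_lines)

print(sorted(entities))
```

Deeper lines (two tabs or more) stay attached to the entity above them, which is what keeps a column's sub-properties inside its block.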


def _extract_entity_name(keyword: str, declaration: str) -> str:
    """Extract the entity name from a TMDL declaration line (no leading tab)."""
    # e.g. "measure 'Total Revenue' = ..." -> "Total Revenue"
    # e.g. "column ProductID" -> "ProductID"
    # e.g. "partition Sales = m" -> "Sales"
    rest = declaration[len(keyword):].strip()
    if rest.startswith("'"):
        end = rest.find("'", 1)
        return rest[1:end] if end > 0 else rest[1:]
    # Take the first token, stopping at '=' or whitespace
    token = re.split(r"[\s=]", rest)[0]
    return token.strip("'\"") if token else rest
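The three docstring examples above can be verified directly; this restates the helper inline so it runs standalone:

```python
import re


def extract_entity_name(keyword: str, declaration: str) -> str:
    # Same logic as _extract_entity_name above
    rest = declaration[len(keyword):].strip()
    if rest.startswith("'"):
        end = rest.find("'", 1)
        return rest[1:end] if end > 0 else rest[1:]
    token = re.split(r"[\s=]", rest)[0]
    return token.strip("'\"") if token else rest


print(extract_entity_name("measure", "measure 'Total Revenue' = SUM(Sales[Amt])"))
print(extract_entity_name("column", "column ProductID"))
print(extract_entity_name("partition", "partition Sales = m"))
```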


# ---------------------------------------------------------------------------
# Relationship diffing
# ---------------------------------------------------------------------------


def _diff_relationships(base_def: Path, head_def: Path) -> dict[str, list[str]]:
    base_rels = _parse_relationships(_read_tmdl(base_def / "relationships.tmdl"))
    head_rels = _parse_relationships(_read_tmdl(head_def / "relationships.tmdl"))

    all_keys = set(base_rels) | set(head_rels)
    added: list[str] = []
    removed: list[str] = []
    changed: list[str] = []

    for key in sorted(all_keys):
        if key not in base_rels:
            added.append(key)
        elif key not in head_rels:
            removed.append(key)
        elif _strip_lineage_tags(base_rels[key]) != _strip_lineage_tags(head_rels[key]):
            changed.append(key)

    return {"added": added, "removed": removed, "changed": changed}


def _parse_relationships(text: str) -> dict[str, str]:
    """Parse relationships.tmdl into {from -> to: text_block} entries."""
    if not text.strip():
        return {}

    blocks: dict[str, str] = {}
    current_lines: list[str] = []
    in_rel = False

    for line in text.splitlines(keepends=True):
        if line.startswith("relationship "):
            if in_rel and current_lines:
                _save_relationship(current_lines, blocks)
            current_lines = [line]
            in_rel = True
        elif in_rel:
            current_lines.append(line)

    if in_rel and current_lines:
        _save_relationship(current_lines, blocks)

    return blocks


def _save_relationship(lines: list[str], blocks: dict[str, str]) -> None:
    """Extract the semantic key from a relationship block and store it."""
    from_col = ""
    to_col = ""
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("fromColumn:"):
            from_col = stripped.split(":", 1)[1].strip()
        elif stripped.startswith("toColumn:"):
            to_col = stripped.split(":", 1)[1].strip()
    if from_col or to_col:
        key = f"{from_col} -> {to_col}"
        blocks[key] = "".join(lines)
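The relationship key is purely semantic (`fromColumn -> toColumn`), so a regenerated relationship GUID in the declaration line never registers as a change. A minimal standalone run of the key extraction on a made-up block:

```python
text = (
    "relationship a1b2c3\n"
    "\tfromColumn: Sales.ProductID\n"
    "\ttoColumn: Product.ProductID\n"
)

from_col = to_col = ""
for line in text.splitlines():
    stripped = line.strip()
    if stripped.startswith("fromColumn:"):
        from_col = stripped.split(":", 1)[1].strip()
    elif stripped.startswith("toColumn:"):
        to_col = stripped.split(":", 1)[1].strip()

key = f"{from_col} -> {to_col}"
print(key)
```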


# ---------------------------------------------------------------------------
# Model property diffing
# ---------------------------------------------------------------------------


def _diff_model(base_def: Path, head_def: Path) -> dict[str, list[str]]:
    base_props = _parse_model_props(_read_tmdl(base_def / "model.tmdl"))
    head_props = _parse_model_props(_read_tmdl(head_def / "model.tmdl"))

    changed: list[str] = []
    all_keys = set(base_props) | set(head_props)
    for key in sorted(all_keys):
        b_val = base_props.get(key)
        h_val = head_props.get(key)
        if b_val != h_val:
            changed.append(f"{key}: {b_val!r} -> {h_val!r}")

    return {"changed_properties": changed}


def _parse_model_props(text: str) -> dict[str, str]:
    """Extract key: value properties at 1-tab indent from model.tmdl."""
    props: dict[str, str] = {}
    for line in text.splitlines():
        if line.startswith("\t") and not line.startswith("\t\t") and ":" in line:
            key, _, val = line[1:].partition(":")
            props[key.strip()] = val.strip()
    return props
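Only properties at exactly one tab of indentation count; anything deeper (annotations, nested blocks) is skipped. The same loop run standalone on a tiny made-up model.tmdl:

```python
text = (
    "model Model\n"
    "\tculture: en-US\n"
    "\tdefaultPowerBIDataSourceVersion: powerBI_V3\n"
    "\t\tannotation ignored: yes\n"
)

props: dict[str, str] = {}
for line in text.splitlines():
    # Same filter as _parse_model_props: one tab, not two, with a colon
    if line.startswith("\t") and not line.startswith("\t\t") and ":" in line:
        key, _, val = line[1:].partition(":")
        props[key.strip()] = val.strip()

print(props)
```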

929  src/pbi_cli/core/visual_backend.py  Normal file

@@ -0,0 +1,929 @@

"""Pure-function backend for PBIR visual operations.

Mirrors ``report_backend.py`` but focuses on individual visual CRUD.
Every function takes a ``Path`` to the definition folder and returns
plain Python dicts suitable for ``format_result()``.
"""

from __future__ import annotations

import json
import re
import secrets
from pathlib import Path
from typing import Any

from pbi_cli.core.errors import PbiCliError, VisualTypeError
from pbi_cli.core.pbir_models import (
    SUPPORTED_VISUAL_TYPES,
    VISUAL_TYPE_ALIASES,
)
from pbi_cli.core.pbir_path import get_visual_dir, get_visuals_dir

# ---------------------------------------------------------------------------
# JSON helpers (same as report_backend)
# ---------------------------------------------------------------------------


def _read_json(path: Path) -> dict[str, Any]:
    return json.loads(path.read_text(encoding="utf-8"))


def _write_json(path: Path, data: dict[str, Any]) -> None:
    path.write_text(
        json.dumps(data, indent=2, ensure_ascii=False) + "\n",
        encoding="utf-8",
    )


def _generate_name() -> str:
    return secrets.token_hex(10)
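The two JSON helpers round-trip cleanly (2-space indent, UTF-8, trailing newline); a quick self-contained check against a temp directory:

```python
import json
import tempfile
from pathlib import Path

# Mirror of _write_json / _read_json above, run against a temp file
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "visual.json"
    payload = {"name": "abc123", "position": {"x": 50}}
    path.write_text(
        json.dumps(payload, indent=2, ensure_ascii=False) + "\n",
        encoding="utf-8",
    )
    data = json.loads(path.read_text(encoding="utf-8"))

print(data)
```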


# ---------------------------------------------------------------------------
# Template loading
# ---------------------------------------------------------------------------

# Data role mappings for each visual type
VISUAL_DATA_ROLES: dict[str, list[str]] = {
    # Original 9
    "barChart": ["Category", "Y", "Legend"],
    "lineChart": ["Category", "Y", "Legend"],
    "card": ["Values"],
    "tableEx": ["Values"],
    "pivotTable": ["Rows", "Values", "Columns"],
    "slicer": ["Values"],
    "kpi": ["Indicator", "Goal", "TrendLine"],
    "gauge": ["Y", "MaxValue"],
    "donutChart": ["Category", "Y", "Legend"],
    # v3.1.0 additions
    "columnChart": ["Category", "Y", "Legend"],
    "areaChart": ["Category", "Y", "Legend"],
    "ribbonChart": ["Category", "Y", "Legend"],
    "waterfallChart": ["Category", "Y", "Breakdown"],
    "scatterChart": ["Details", "X", "Y", "Size", "Legend"],
    "funnelChart": ["Category", "Y"],
    "multiRowCard": ["Values"],
    "treemap": ["Category", "Values"],
    "cardNew": ["Fields"],
    "stackedBarChart": ["Category", "Y", "Legend"],
    "lineStackedColumnComboChart": ["Category", "ColumnY", "LineY", "Legend"],
    # v3.4.0 additions
    "cardVisual": ["Data"],
    "actionButton": [],
    # v3.5.0 additions
    "clusteredColumnChart": ["Category", "Y", "Legend"],
    "clusteredBarChart": ["Category", "Y", "Legend"],
    "textSlicer": ["Values"],
    "listSlicer": ["Values"],
    # v3.6.0 additions
    "image": [],
    "shape": [],
    "textbox": [],
    "pageNavigator": [],
    "advancedSlicerVisual": ["Values"],
    # v3.8.0 additions
    "azureMap": ["Category", "Size"],
}

# Roles that should default to Measure references (not Column)
MEASURE_ROLES: frozenset[str] = frozenset({
    "Y", "Values", "Fields",  # "Fields" is used by cardNew only
    "Indicator", "Goal",
    # v3.1.0 additions
    "ColumnY", "LineY", "X", "Size",
    # v3.4.0 additions
    "Data",
    # v3.8.0 additions
    "MaxValue",
})

# User-friendly role aliases to PBIR role names
ROLE_ALIASES: dict[str, dict[str, str]] = {
    # Original 9
    "barChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "lineChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "card": {"field": "Values", "value": "Values"},
    "tableEx": {"value": "Values", "column": "Values"},
    "pivotTable": {"row": "Rows", "value": "Values", "column": "Columns"},
    "slicer": {"value": "Values", "field": "Values"},
    "kpi": {
        "indicator": "Indicator",
        "value": "Indicator",
        "goal": "Goal",
        "trend_line": "TrendLine",
        "trend": "TrendLine",
    },
    "gauge": {
        "value": "Y",
        "max": "MaxValue",
        "max_value": "MaxValue",
        "target": "MaxValue",
    },
    "donutChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    # v3.1.0 additions
    "columnChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "areaChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "ribbonChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "waterfallChart": {"category": "Category", "value": "Y", "breakdown": "Breakdown"},
    "scatterChart": {
        "x": "X", "y": "Y", "detail": "Details", "size": "Size", "legend": "Legend",
        "value": "Y",
    },
    "funnelChart": {"category": "Category", "value": "Y"},
    "multiRowCard": {"field": "Values", "value": "Values"},
    "treemap": {"category": "Category", "value": "Values"},
    "cardNew": {"field": "Fields", "value": "Fields"},
    "stackedBarChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "lineStackedColumnComboChart": {
        "category": "Category",
        "column": "ColumnY",
        "line": "LineY",
        "legend": "Legend",
        "value": "ColumnY",
    },
    # v3.4.0 additions
    "cardVisual": {"field": "Data", "value": "Data"},
    "actionButton": {},
    # v3.5.0 additions
    "clusteredColumnChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "clusteredBarChart": {"category": "Category", "value": "Y", "legend": "Legend"},
    "textSlicer": {"value": "Values", "field": "Values"},
    "listSlicer": {"value": "Values", "field": "Values"},
    # v3.6.0 additions
    "image": {},
    "shape": {},
    "textbox": {},
    "pageNavigator": {},
    "advancedSlicerVisual": {"value": "Values", "field": "Values"},
    # v3.8.0 additions
    "azureMap": {"category": "Category", "value": "Size", "size": "Size"},
}


def _resolve_visual_type(user_type: str) -> str:
    """Resolve a user-provided visual type to a PBIR visualType."""
    if user_type in SUPPORTED_VISUAL_TYPES:
        return user_type
    resolved = VISUAL_TYPE_ALIASES.get(user_type)
    if resolved is not None:
        return resolved
    raise VisualTypeError(user_type)
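The resolution order is: exact PBIR name first, then alias lookup, then error. A sketch with a hypothetical subset of the alias table (the real tables live in `pbir_models`; `ValueError` stands in for `VisualTypeError` here):

```python
# Hypothetical subset of the real tables in pbir_models (illustration only)
SUPPORTED = {"barChart", "donutChart", "tableEx"}
ALIASES = {"bar": "barChart", "donut": "donutChart", "table": "tableEx"}


def resolve(user_type: str) -> str:
    # Exact PBIR names pass through; aliases are mapped; the rest fail loudly
    if user_type in SUPPORTED:
        return user_type
    resolved = ALIASES.get(user_type)
    if resolved is not None:
        return resolved
    raise ValueError(f"Unsupported visual type: {user_type}")


print(resolve("bar"), resolve("donutChart"))
```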


def _load_template(visual_type: str) -> str:
    """Load a visual template as a raw string (contains placeholders)."""
    import importlib.resources

    templates_pkg = importlib.resources.files("pbi_cli.templates.visuals")
    template_file = templates_pkg / f"{visual_type}.json"
    return template_file.read_text(encoding="utf-8")


def _build_visual_json(
    template_str: str,
    name: str,
    x: float,
    y: float,
    width: float,
    height: float,
    z: int = 0,
    tab_order: int = 0,
) -> dict[str, Any]:
    """Fill placeholders in a template string and return the parsed JSON."""
    filled = (
        template_str
        .replace("__VISUAL_NAME__", name)
        .replace("__X__", str(x))
        .replace("__Y__", str(y))
        .replace("__WIDTH__", str(width))
        .replace("__HEIGHT__", str(height))
        .replace("__Z__", str(z))
        .replace("__TAB_ORDER__", str(tab_order))
    )
    return json.loads(filled)
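Placeholder substitution happens on the raw string before parsing, so numeric placeholders must sit unquoted in the template. A run against a hypothetical minimal template with the same placeholder names:

```python
import json

# Hypothetical minimal template (the real ones ship in pbi_cli.templates.visuals)
template_str = (
    '{"name": "__VISUAL_NAME__", '
    '"position": {"x": __X__, "y": __Y__, "width": __WIDTH__, '
    '"height": __HEIGHT__, "z": __Z__, "tabOrder": __TAB_ORDER__}}'
)

filled = (
    template_str
    .replace("__VISUAL_NAME__", "a1b2c3")
    .replace("__X__", str(50))
    .replace("__Y__", str(100))
    .replace("__WIDTH__", str(400))
    .replace("__HEIGHT__", str(300))
    .replace("__Z__", str(2))
    .replace("__TAB_ORDER__", str(2))
)
data = json.loads(filled)
print(data["position"])
```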


# ---------------------------------------------------------------------------
# Default positions and sizes per visual type
# ---------------------------------------------------------------------------

DEFAULT_SIZES: dict[str, tuple[float, float]] = {
    # Original 9
    "barChart": (400, 300),
    "lineChart": (400, 300),
    "card": (200, 120),
    "tableEx": (500, 350),
    "pivotTable": (500, 350),
    "slicer": (200, 300),
    "kpi": (250, 150),
    "gauge": (300, 250),
    "donutChart": (350, 300),
    # v3.1.0 additions
    "columnChart": (400, 300),
    "areaChart": (400, 300),
    "ribbonChart": (400, 300),
    "waterfallChart": (450, 300),
    "scatterChart": (400, 350),
    "funnelChart": (350, 300),
    "multiRowCard": (300, 200),
    "treemap": (400, 300),
    "cardNew": (200, 120),
    "stackedBarChart": (400, 300),
    "lineStackedColumnComboChart": (500, 300),
    # v3.4.0 additions -- sizes from a real Desktop export
    "cardVisual": (217, 87),
    "actionButton": (51, 22),
    # v3.5.0 additions
    "clusteredColumnChart": (400, 300),
    "clusteredBarChart": (400, 300),
    "textSlicer": (200, 50),
    "listSlicer": (200, 300),
    # v3.6.0 additions (sizing from a real HR Analysis Desktop export)
    "image": (200, 150),
    "shape": (300, 200),
    "textbox": (300, 100),
    "pageNavigator": (120, 400),
    "advancedSlicerVisual": (280, 280),
    # v3.8.0 additions
    "azureMap": (500, 400),
}


# ---------------------------------------------------------------------------
# Visual CRUD operations
# ---------------------------------------------------------------------------


def visual_list(
    definition_path: Path, page_name: str
) -> list[dict[str, Any]]:
    """List all visuals on a page."""
    visuals_dir = definition_path / "pages" / page_name / "visuals"
    if not visuals_dir.is_dir():
        return []

    results: list[dict[str, Any]] = []
    for vdir in sorted(visuals_dir.iterdir()):
        if not vdir.is_dir():
            continue
        vfile = vdir / "visual.json"
        if not vfile.exists():
            continue
        data = _read_json(vfile)

        # Group container: has a "visualGroup" key instead of "visual"
        if "visualGroup" in data and "visual" not in data:
            results.append({
                "name": data.get("name", vdir.name),
                "visual_type": "group",
                "x": 0,
                "y": 0,
                "width": 0,
                "height": 0,
            })
            continue

        pos = data.get("position", {})
        visual_config = data.get("visual", {})
        results.append({
            "name": data.get("name", vdir.name),
            "visual_type": visual_config.get("visualType", "unknown"),
            "x": pos.get("x", 0),
            "y": pos.get("y", 0),
            "width": pos.get("width", 0),
            "height": pos.get("height", 0),
        })

    return results


def visual_get(
    definition_path: Path, page_name: str, visual_name: str
) -> dict[str, Any]:
    """Get detailed information about a visual."""
    visual_dir = get_visual_dir(definition_path, page_name, visual_name)
    vfile = visual_dir / "visual.json"

    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    pos = data.get("position", {})
    visual_config = data.get("visual", {})
    query_state = visual_config.get("query", {}).get("queryState", {})

    # Extract a bindings summary
    bindings: list[dict[str, Any]] = []
    for role, state in query_state.items():
        projections = state.get("projections", [])
        for proj in projections:
            field = proj.get("field", {})
            query_ref = proj.get("queryRef", "")
            bindings.append({
                "role": role,
                "query_ref": query_ref,
                "field": _summarize_field(field),
            })

    return {
        "name": data.get("name", visual_name),
        "visual_type": visual_config.get("visualType", "unknown"),
        "x": pos.get("x", 0),
        "y": pos.get("y", 0),
        "width": pos.get("width", 0),
        "height": pos.get("height", 0),
        "bindings": bindings,
        "is_hidden": data.get("isHidden", False),
    }


def visual_add(
    definition_path: Path,
    page_name: str,
    visual_type: str,
    name: str | None = None,
    x: float | None = None,
    y: float | None = None,
    width: float | None = None,
    height: float | None = None,
) -> dict[str, Any]:
    """Add a new visual to a page."""
    # Validate that the page exists
    page_dir = definition_path / "pages" / page_name
    if not page_dir.is_dir():
        raise PbiCliError(f"Page '{page_name}' not found.")

    resolved_type = _resolve_visual_type(visual_type)
    visual_name = name or _generate_name()

    # Defaults
    default_w, default_h = DEFAULT_SIZES.get(resolved_type, (400, 300))
    final_x = x if x is not None else 50
    final_y = y if y is not None else _next_y_position(definition_path, page_name)
    final_w = width if width is not None else default_w
    final_h = height if height is not None else default_h

    # Determine z-order
    z = _next_z_order(definition_path, page_name)

    # Load and fill the template
    template_str = _load_template(resolved_type)
    visual_data = _build_visual_json(
        template_str,
        name=visual_name,
        x=final_x,
        y=final_y,
        width=final_w,
        height=final_h,
        z=z,
        tab_order=z,
    )

    # Write to disk
    visual_dir = get_visuals_dir(definition_path, page_name) / visual_name
    visual_dir.mkdir(parents=True, exist_ok=True)
    _write_json(visual_dir / "visual.json", visual_data)

    return {
        "status": "created",
        "name": visual_name,
        "visual_type": resolved_type,
        "page": page_name,
        "x": final_x,
        "y": final_y,
        "width": final_w,
        "height": final_h,
    }


def visual_update(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    x: float | None = None,
    y: float | None = None,
    width: float | None = None,
    height: float | None = None,
    hidden: bool | None = None,
) -> dict[str, Any]:
    """Update a visual's position, size, or visibility."""
    visual_dir = get_visual_dir(definition_path, page_name, visual_name)
    vfile = visual_dir / "visual.json"

    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    pos = data.get("position", {})

    if x is not None:
        pos["x"] = x
    if y is not None:
        pos["y"] = y
    if width is not None:
        pos["width"] = width
    if height is not None:
        pos["height"] = height
    data["position"] = pos

    if hidden is not None:
        data["isHidden"] = hidden

    _write_json(vfile, data)

    return {
        "status": "updated",
        "name": visual_name,
        "page": page_name,
        "position": {
            "x": pos.get("x", 0),
            "y": pos.get("y", 0),
            "width": pos.get("width", 0),
            "height": pos.get("height", 0),
        },
    }
|
||||
|
||||
|
||||
def visual_set_container(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    border_show: bool | None = None,
    background_show: bool | None = None,
    title: str | None = None,
) -> dict[str, Any]:
    """Set container-level properties (border, background, title) on a visual.

    Only the keyword arguments that are provided (not None) are updated.
    Other ``visualContainerObjects`` keys are preserved unchanged.

    The ``visualContainerObjects`` key is separate from ``visual.objects`` --
    it controls the container chrome (border, background, header title) rather
    than the visual's own formatting.
    """
    visual_dir = get_visual_dir(definition_path, page_name, visual_name)
    visual_json_path = visual_dir / "visual.json"
    if not visual_json_path.exists():
        raise PbiCliError(
            f"Visual '{visual_name}' not found on page '{page_name}'."
        )

    data = _read_json(visual_json_path)
    visual = data.get("visual")
    if visual is None:
        raise PbiCliError(
            f"Visual '{visual_name}' has invalid JSON -- missing 'visual' key."
        )

    if border_show is None and background_show is None and title is None:
        return {
            "status": "no-op",
            "visual": visual_name,
            "page": page_name,
            "border_show": None,
            "background_show": None,
            "title": None,
        }

    vco: dict[str, Any] = dict(visual.get("visualContainerObjects", {}))

    def _bool_entry(value: bool) -> list[dict[str, Any]]:
        return [{
            "properties": {
                "show": {
                    "expr": {"Literal": {"Value": str(value).lower()}}
                }
            }
        }]

    if border_show is not None:
        vco = {**vco, "border": _bool_entry(border_show)}
    if background_show is not None:
        vco = {**vco, "background": _bool_entry(background_show)}
    if title is not None:
        vco = {**vco, "title": [{
            "properties": {
                "text": {
                    "expr": {"Literal": {"Value": f"'{title}'"}}
                }
            }
        }]}

    updated_visual = {**visual, "visualContainerObjects": vco}
    _write_json(visual_json_path, {**data, "visual": updated_visual})

    return {
        "status": "updated",
        "visual": visual_name,
        "page": page_name,
        "border_show": border_show,
        "background_show": background_show,
        "title": title,
    }

def visual_delete(
    definition_path: Path, page_name: str, visual_name: str
) -> dict[str, Any]:
    """Delete a visual from a page."""
    visual_dir = get_visual_dir(definition_path, page_name, visual_name)

    if not visual_dir.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    _rmtree(visual_dir)

    return {"status": "deleted", "name": visual_name, "page": page_name}

def visual_bind(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    bindings: list[dict[str, Any]],
) -> dict[str, Any]:
    """Bind semantic model fields to visual data roles.

    Each binding dict should have:
    - ``role``: Data role (e.g. "category", "value", "row")
    - ``field``: Field reference in ``Table[Column]`` notation
    - ``measure``: (optional) bool, force treat as measure

    Roles are resolved through ``ROLE_ALIASES`` to the actual PBIR role name.
    Measure vs Column is determined by the resolved role: value/field/indicator/goal
    roles default to Measure; category/row/legend default to Column.
    """
    visual_dir = get_visual_dir(definition_path, page_name, visual_name)
    vfile = visual_dir / "visual.json"

    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    visual_config = data.get("visual", {})
    visual_type = visual_config.get("visualType", "")
    query = visual_config.setdefault("query", {})
    query_state = query.setdefault("queryState", {})

    # Collect existing Commands From/Select to merge (fix: don't overwrite)
    from_entities: dict[str, dict[str, Any]] = {}
    select_items: list[dict[str, Any]] = []
    _collect_existing_commands(query, from_entities, select_items)

    role_map = ROLE_ALIASES.get(visual_type, {})
    applied: list[dict[str, str]] = []

    for binding in bindings:
        user_role = binding["role"].lower()
        field_ref = binding["field"]
        force_measure = binding.get("measure", False)

        # Resolve role alias
        pbir_role = role_map.get(user_role, binding["role"])

        # Parse Table[Column]
        table, column = _parse_field_ref(field_ref)

        # Determine measure vs column: explicit flag, or role-based heuristic
        is_measure = force_measure or pbir_role in MEASURE_ROLES

        # Track source alias for Commands block (use full name to avoid collisions)
        source_alias = table.replace(" ", "_").lower() if table else "t"
        from_entities[source_alias] = {
            "Name": source_alias,
            "Entity": table,
            "Type": 0,
        }

        # Build queryState projection (uses Entity directly, matching Desktop)
        query_ref = f"{table}.{column}"
        if is_measure:
            field_expr: dict[str, Any] = {
                "Measure": {
                    "Expression": {"SourceRef": {"Entity": table}},
                    "Property": column,
                }
            }
        else:
            field_expr = {
                "Column": {
                    "Expression": {"SourceRef": {"Entity": table}},
                    "Property": column,
                }
            }

        projection = {
            "field": field_expr,
            "queryRef": query_ref,
            "nativeQueryRef": column,
        }

        # Add to query state
        role_state = query_state.setdefault(pbir_role, {"projections": []})
        role_state["projections"].append(projection)

        # Build Commands select item (uses Source alias)
        if is_measure:
            cmd_field_expr: dict[str, Any] = {
                "Measure": {
                    "Expression": {"SourceRef": {"Source": source_alias}},
                    "Property": column,
                }
            }
        else:
            cmd_field_expr = {
                "Column": {
                    "Expression": {"SourceRef": {"Source": source_alias}},
                    "Property": column,
                }
            }
        select_items.append({
            **cmd_field_expr,
            "Name": query_ref,
        })

        applied.append({
            "role": pbir_role,
            "field": field_ref,
            "query_ref": query_ref,
        })

    # Set the semantic query Commands block (merges with existing)
    if from_entities and select_items:
        query["Commands"] = [{
            "SemanticQueryDataShapeCommand": {
                "Query": {
                    "Version": 2,
                    "From": list(from_entities.values()),
                    "Select": select_items,
                }
            }
        }]

    data["visual"] = visual_config
    _write_json(vfile, data)

    return {
        "status": "bound",
        "name": visual_name,
        "page": page_name,
        "bindings": applied,
    }

# ---------------------------------------------------------------------------
# Internal helpers
# ---------------------------------------------------------------------------

_FIELD_REF_PATTERN = re.compile(r"^(.+)\[(.+)\]$")


def _parse_field_ref(ref: str) -> tuple[str, str]:
    """Parse ``Table[Column]`` (or ``Table[Measure]``) notation.

    Returns (table, column).
    """
    match = _FIELD_REF_PATTERN.match(ref.strip())
    if match:
        table = match.group(1).strip()
        column = match.group(2).strip()
        return table, column

    raise PbiCliError(
        f"Invalid field reference '{ref}'. Expected 'Table[Column]' format."
    )
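As a quick standalone illustration (not part of the module), the same pattern splits a reference like `Sales[Total Amount]` into its table and column parts; note the regex requires a table prefix, so a bare `[Measure]` does not match and would raise:

```python
import re

# Same expression as _FIELD_REF_PATTERN above.
pattern = re.compile(r"^(.+)\[(.+)\]$")

m = pattern.match("Sales[Total Amount]".strip())
table, column = m.group(1).strip(), m.group(2).strip()
print(table, column)  # Sales Total Amount

# A bare measure reference has no table prefix, so there is no match.
print(pattern.match("[Measure]"))  # None
```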


def _summarize_field(field: dict[str, Any]) -> str:
    """Produce a human-readable summary of a query field expression."""
    for kind in ("Column", "Measure"):
        if kind in field:
            item = field[kind]
            source_ref = item.get("Expression", {}).get("SourceRef", {})
            # queryState uses Entity, Commands uses Source
            source = source_ref.get("Entity", source_ref.get("Source", "?"))
            prop = item.get("Property", "?")
            if kind == "Measure":
                return f"{source}.[{prop}]"
            return f"{source}.{prop}"
    return str(field)
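The Entity-vs-Source fallback is the subtle part here: queryState expressions carry `Entity`, while Commands expressions carry `Source`. A minimal standalone sketch of the same logic, using made-up field dicts:

```python
# Standalone re-statement of the summary logic above (illustrative only).
def summarize(field):
    for kind in ("Column", "Measure"):
        if kind in field:
            item = field[kind]
            ref = item.get("Expression", {}).get("SourceRef", {})
            source = ref.get("Entity", ref.get("Source", "?"))
            prop = item.get("Property", "?")
            return f"{source}.[{prop}]" if kind == "Measure" else f"{source}.{prop}"
    return str(field)

# queryState-style Column (Entity form) and Commands-style Measure (Source form).
col = {"Column": {"Expression": {"SourceRef": {"Entity": "Sales"}}, "Property": "Region"}}
mea = {"Measure": {"Expression": {"SourceRef": {"Source": "sales"}}, "Property": "Total"}}
print(summarize(col))  # Sales.Region
print(summarize(mea))  # sales.[Total]
```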


def _collect_existing_commands(
    query: dict[str, Any],
    from_entities: dict[str, dict[str, Any]],
    select_items: list[dict[str, Any]],
) -> None:
    """Extract existing From entities and Select items from Commands block."""
    for cmd in query.get("Commands", []):
        sq = cmd.get("SemanticQueryDataShapeCommand", {}).get("Query", {})
        for entity in sq.get("From", []):
            name = entity.get("Name", "")
            if name:
                from_entities[name] = entity
        select_items.extend(sq.get("Select", []))


def _next_y_position(definition_path: Path, page_name: str) -> float:
    """Calculate the next y position to avoid overlap with existing visuals."""
    visuals_dir = definition_path / "pages" / page_name / "visuals"
    if not visuals_dir.is_dir():
        return 50

    max_bottom = 50.0
    for vdir in visuals_dir.iterdir():
        if not vdir.is_dir():
            continue
        vfile = vdir / "visual.json"
        if not vfile.exists():
            continue
        try:
            data = _read_json(vfile)
            pos = data.get("position", {})
            bottom = pos.get("y", 0) + pos.get("height", 0)
            if bottom > max_bottom:
                max_bottom = bottom
        except (json.JSONDecodeError, KeyError):
            continue

    return max_bottom + 20
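The stacking arithmetic, standalone: the next y starts 20px below the lowest existing bottom edge (`y + height`), with a floor of 50 for an empty page. Using two hypothetical positions:

```python
# Hypothetical positions as read from visual.json files on one page.
positions = [{"y": 50, "height": 200}, {"y": 270, "height": 150}]

# Lowest bottom edge, floored at 50; next visual starts 20px below it.
max_bottom = max([50.0] + [p["y"] + p["height"] for p in positions])
next_y = max_bottom + 20
print(next_y)  # 440
```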


def _next_z_order(definition_path: Path, page_name: str) -> int:
    """Determine the next z-order value for a new visual."""
    visuals_dir = definition_path / "pages" / page_name / "visuals"
    if not visuals_dir.is_dir():
        return 0

    max_z = -1
    for vdir in visuals_dir.iterdir():
        if not vdir.is_dir():
            continue
        vfile = vdir / "visual.json"
        if not vfile.exists():
            continue
        try:
            data = _read_json(vfile)
            z = data.get("position", {}).get("z", 0)
            if z > max_z:
                max_z = z
        except (json.JSONDecodeError, KeyError):
            continue

    return max_z + 1


def _rmtree(path: Path) -> None:
    """Recursively remove a directory tree."""
    if path.is_dir():
        for child in path.iterdir():
            _rmtree(child)
        path.rmdir()
    else:
        path.unlink()


# ---------------------------------------------------------------------------
# Visual Calculations (Phase 7)
# ---------------------------------------------------------------------------


def visual_calc_add(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    calc_name: str,
    expression: str,
    role: str = "Y",
) -> dict[str, Any]:
    """Add a visual calculation to a data role's projections.

    Appends a NativeVisualCalculation projection to queryState[role].projections[].
    If the role does not exist in queryState, creates it with an empty projections list.
    If a calc with the same Name already exists in that role, replaces it (idempotent).

    Returns {"status": "added", "visual": visual_name, "name": calc_name,
    "role": role, "expression": expression}.
    Raises PbiCliError if visual.json not found.
    """
    vfile = get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    visual_config = data.setdefault("visual", {})
    query = visual_config.setdefault("query", {})
    query_state = query.setdefault("queryState", {})
    role_state = query_state.setdefault(role, {"projections": []})
    projections: list[dict[str, Any]] = role_state.setdefault("projections", [])

    new_proj: dict[str, Any] = {
        "field": {
            "NativeVisualCalculation": {
                "Language": "dax",
                "Expression": expression,
                "Name": calc_name,
            }
        },
        "queryRef": "select",
        "nativeQueryRef": calc_name,
    }

    # Replace existing calc with same name (idempotent), else append
    updated = False
    new_projections: list[dict[str, Any]] = []
    for proj in projections:
        nvc = proj.get("field", {}).get("NativeVisualCalculation", {})
        if nvc.get("Name") == calc_name:
            new_projections.append(new_proj)
            updated = True
        else:
            new_projections.append(proj)

    if not updated:
        new_projections.append(new_proj)

    role_state["projections"] = new_projections
    _write_json(vfile, data)

    return {
        "status": "added",
        "visual": visual_name,
        "name": calc_name,
        "role": role,
        "expression": expression,
    }


def visual_calc_list(
    definition_path: Path,
    page_name: str,
    visual_name: str,
) -> list[dict[str, Any]]:
    """List all visual calculations across all roles.

    Returns list of {"name": ..., "expression": ..., "role": ..., "query_ref": "select"}.
    Returns [] if no NativeVisualCalculation projections found.
    """
    vfile = get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    query_state = data.get("visual", {}).get("query", {}).get("queryState", {})

    results: list[dict[str, Any]] = []
    for role, state in query_state.items():
        for proj in state.get("projections", []):
            nvc = proj.get("field", {}).get("NativeVisualCalculation")
            if nvc is not None:
                results.append({
                    "name": nvc.get("Name", ""),
                    "expression": nvc.get("Expression", ""),
                    "role": role,
                    "query_ref": proj.get("queryRef", "select"),
                })

    return results


def visual_calc_delete(
    definition_path: Path,
    page_name: str,
    visual_name: str,
    calc_name: str,
) -> dict[str, Any]:
    """Delete a visual calculation by name.

    Searches all roles' projections for NativeVisualCalculation with Name==calc_name.
    Raises PbiCliError if not found.
    Returns {"status": "deleted", "visual": visual_name, "name": calc_name}.
    """
    vfile = get_visual_dir(definition_path, page_name, visual_name) / "visual.json"
    if not vfile.exists():
        raise PbiCliError(f"Visual '{visual_name}' not found on page '{page_name}'.")

    data = _read_json(vfile)
    query_state = (
        data.get("visual", {}).get("query", {}).get("queryState", {})
    )

    found = False
    for role, state in query_state.items():
        projections: list[dict[str, Any]] = state.get("projections", [])
        new_projections = [
            proj for proj in projections
            if proj.get("field", {}).get("NativeVisualCalculation", {}).get("Name") != calc_name
        ]
        if len(new_projections) < len(projections):
            state["projections"] = new_projections
            found = True

    if not found:
        raise PbiCliError(
            f"Visual calculation '{calc_name}' not found in visual '{visual_name}'."
        )

    _write_json(vfile, data)
    return {"status": "deleted", "visual": visual_name, "name": calc_name}


@@ -52,6 +52,7 @@ def cli(ctx: click.Context, json_output: bool, connection: str | None) -> None:

def _register_commands() -> None:
    """Lazily import and register all command groups."""
    from pbi_cli.commands.advanced import advanced
    from pbi_cli.commands.bookmarks import bookmarks
    from pbi_cli.commands.calc_group import calc_group
    from pbi_cli.commands.calendar import calendar
    from pbi_cli.commands.column import column

@@ -59,6 +60,8 @@ def _register_commands() -> None:
    from pbi_cli.commands.database import database
    from pbi_cli.commands.dax import dax
    from pbi_cli.commands.expression import expression
    from pbi_cli.commands.filters import filters
    from pbi_cli.commands.format_cmd import format_cmd
    from pbi_cli.commands.hierarchy import hierarchy
    from pbi_cli.commands.measure import measure
    from pbi_cli.commands.model import model

@@ -66,12 +69,14 @@ def _register_commands() -> None:
    from pbi_cli.commands.perspective import perspective
    from pbi_cli.commands.relationship import relationship
    from pbi_cli.commands.repl_cmd import repl
    from pbi_cli.commands.report import report
    from pbi_cli.commands.security import security_role
    from pbi_cli.commands.setup_cmd import setup
    from pbi_cli.commands.skills_cmd import skills
    from pbi_cli.commands.table import table
    from pbi_cli.commands.trace import trace
    from pbi_cli.commands.transaction import transaction
    from pbi_cli.commands.visual import visual

    cli.add_command(setup)
    cli.add_command(connect)

@@ -96,6 +101,11 @@ def _register_commands() -> None:
    cli.add_command(advanced)
    cli.add_command(repl)
    cli.add_command(skills)
    cli.add_command(report)
    cli.add_command(visual)
    cli.add_command(filters)
    cli.add_command(format_cmd)
    cli.add_command(bookmarks)


_register_commands()

src/pbi_cli/preview/__init__.py (1 line, new file)
@@ -0,0 +1 @@
"""Live preview server for PBIR reports."""

src/pbi_cli/preview/renderer.py (487 lines, new file)
@@ -0,0 +1,487 @@
"""PBIR JSON to HTML/SVG renderer.

Renders a simplified structural preview of PBIR report pages.
Not pixel-perfect Power BI rendering -- shows layout, visual types,
and field bindings for validation before opening in Desktop.
"""

from __future__ import annotations

import json
from html import escape
from pathlib import Path
from typing import Any

def render_report(definition_path: Path) -> str:
    """Render a full PBIR report as an HTML page."""
    report_data = _read_json(definition_path / "report.json")
    theme = report_data.get("themeCollection", {}).get("baseTheme", {}).get("name", "Default")

    pages_html = []
    pages_dir = definition_path / "pages"
    if pages_dir.is_dir():
        page_order = _get_page_order(definition_path)
        page_dirs = sorted(
            [d for d in pages_dir.iterdir() if d.is_dir() and (d / "page.json").exists()],
            key=lambda d: page_order.index(d.name) if d.name in page_order else 9999,
        )
        for page_dir in page_dirs:
            pages_html.append(_render_page(page_dir))

    pages_content = "\n".join(pages_html) if pages_html else "<p class='empty'>No pages in report</p>"

    return _HTML_TEMPLATE.replace("{{THEME}}", escape(theme)).replace(
        "{{PAGES}}", pages_content
    )


def render_page(definition_path: Path, page_name: str) -> str:
    """Render a single page as HTML."""
    page_dir = definition_path / "pages" / page_name
    if not page_dir.is_dir():
        return f"<p>Page '{escape(page_name)}' not found</p>"
    return _render_page(page_dir)

# ---------------------------------------------------------------------------
# Internal renderers
# ---------------------------------------------------------------------------

_VISUAL_COLORS: dict[str, str] = {
    # Original 9
    "barChart": "#4472C4",
    "lineChart": "#ED7D31",
    "card": "#A5A5A5",
    "tableEx": "#5B9BD5",
    "pivotTable": "#70AD47",
    "slicer": "#FFC000",
    "kpi": "#00B050",
    "gauge": "#7030A0",
    "donutChart": "#FF6B6B",
    # v3.1.0 additions
    "columnChart": "#4472C4",
    "areaChart": "#ED7D31",
    "ribbonChart": "#9DC3E6",
    "waterfallChart": "#548235",
    "scatterChart": "#FF0000",
    "funnelChart": "#0070C0",
    "multiRowCard": "#595959",
    "treemap": "#833C00",
    "cardNew": "#767171",
    "stackedBarChart": "#2E75B6",
    "lineStackedColumnComboChart": "#C55A11",
    # v3.4.0 additions
    "cardVisual": "#767171",
    "actionButton": "#E8832A",
    # v3.5.0 additions
    "clusteredColumnChart": "#4472C4",
    "clusteredBarChart": "#4472C4",
    "textSlicer": "#FFC000",
    "listSlicer": "#FFC000",
    # v3.6.0 additions
    "image": "#9E480E",
    "shape": "#7F7F7F",
    "textbox": "#404040",
    "pageNavigator": "#00B0F0",
    "advancedSlicerVisual": "#FFC000",
    # v3.8.0 additions
    "azureMap": "#0078D4",
}

_VISUAL_ICONS: dict[str, str] = {
    # Original 9
    "barChart": "▌▌▌",
    "lineChart": "➚",
    "card": "■",
    "tableEx": "▦",
    "pivotTable": "▩",
    "slicer": "☰",
    "kpi": "▲",
    "gauge": "⏱",
    "donutChart": "◉",
    # v3.1.0 additions
    "columnChart": "▐▐▐",
    "areaChart": "▲",
    "ribbonChart": "▶",
    "waterfallChart": "↕",
    "scatterChart": "⋅⋅⋅",
    "funnelChart": "▼",
    "multiRowCard": "▤▤",
    "treemap": "▣",
    "cardNew": "□",
    "stackedBarChart": "▌▌",
    "lineStackedColumnComboChart": "▐➚",
    # v3.4.0 additions
    "cardVisual": "■",
    "actionButton": "▶",
    # v3.5.0 additions
    "clusteredColumnChart": "▐▐▐",
    "clusteredBarChart": "▌▌▌",
    "textSlicer": "☰",
    "listSlicer": "☰",
    # v3.6.0 additions
    "image": "●",
    "shape": "▲",
    "textbox": "◻",
    "pageNavigator": "►",
    "advancedSlicerVisual": "☰",
    # v3.8.0 additions
    "azureMap": "◆",
}

def _render_page(page_dir: Path) -> str:
    """Render a single page directory as HTML."""
    page_data = _read_json(page_dir / "page.json")
    display_name = page_data.get("displayName", page_dir.name)
    width = page_data.get("width", 1280)
    height = page_data.get("height", 720)
    name = page_data.get("name", page_dir.name)

    visuals_html = []
    visuals_dir = page_dir / "visuals"
    if visuals_dir.is_dir():
        for vdir in sorted(visuals_dir.iterdir()):
            if not vdir.is_dir():
                continue
            vfile = vdir / "visual.json"
            if vfile.exists():
                visuals_html.append(_render_visual(vfile))

    visuals_content = "\n".join(visuals_html)
    if not visuals_content:
        visuals_content = "<div class='empty-page'>Empty page</div>"

    # Scale factor for the preview (fit to ~900px wide container)
    scale = min(900 / width, 1.0)

    return f"""
<div class="page" data-page="{escape(name)}">
  <h2 class="page-title">{escape(display_name)}</h2>
  <div class="page-canvas" style="width:{width}px;height:{height}px;transform:scale({scale:.3f});transform-origin:top left;">
{visuals_content}
  </div>
</div>
"""

def _render_visual(vfile: Path) -> str:
    """Render a single visual.json as an HTML element."""
    data = _read_json(vfile)
    pos = data.get("position", {})
    x = pos.get("x", 0)
    y = pos.get("y", 0)
    w = pos.get("width", 200)
    h = pos.get("height", 150)
    z = pos.get("z", 0)

    visual_config = data.get("visual", {})
    vtype = visual_config.get("visualType", "unknown")
    name = data.get("name", "")
    hidden = data.get("isHidden", False)

    color = _VISUAL_COLORS.get(vtype, "#888")
    icon = _VISUAL_ICONS.get(vtype, "?")

    # Extract bound fields
    bindings = _extract_bindings(visual_config)
    bindings_html = ""
    if bindings:
        items = "".join(f"<li>{escape(b)}</li>" for b in bindings)
        bindings_html = f"<ul class='bindings'>{items}</ul>"

    opacity = "0.4" if hidden else "1"

    return f"""
<div class="visual" style="left:{x}px;top:{y}px;width:{w}px;height:{h}px;z-index:{z};border-color:{color};opacity:{opacity};" data-visual="{escape(name)}" data-type="{escape(vtype)}">
  <div class="visual-header" style="background:{color};">
    <span class="visual-icon">{icon}</span>
    <span class="visual-type">{escape(vtype)}</span>
  </div>
  <div class="visual-body">
{_render_visual_content(vtype, w, h, bindings)}
{bindings_html}
  </div>
</div>
"""

def _render_visual_content(vtype: str, w: float, h: float, bindings: list[str]) -> str:
    """Render simplified chart preview content."""
    body_h = h - 30  # header height

    if vtype == "barChart":
        bars = ""
        num_bars = min(len(bindings), 5) if bindings else 4
        bar_w = max(w / (num_bars * 2), 15)
        for i in range(num_bars):
            bar_h = body_h * (0.3 + 0.5 * ((i * 37 + 13) % 7) / 7)
            bars += (
                f'<rect x="{i * bar_w * 2 + bar_w/2}" y="{body_h - bar_h}" '
                f'width="{bar_w}" height="{bar_h}" fill="#4472C4" opacity="0.7"/>'
            )
        return f'<svg class="chart-svg" viewBox="0 0 {w} {body_h}">{bars}</svg>'

    if vtype == "lineChart":
        points = []
        num_points = 6
        for i in range(num_points):
            px = (w / (num_points - 1)) * i
            py = body_h * (0.2 + 0.6 * ((i * 47 + 23) % 11) / 11)
            points.append(f"{px},{py}")
        polyline = f'<polyline points="{" ".join(points)}" fill="none" stroke="#ED7D31" stroke-width="3"/>'
        return f'<svg class="chart-svg" viewBox="0 0 {w} {body_h}">{polyline}</svg>'

    if vtype == "card":
        label = bindings[0] if bindings else "Measure"
        return f'<div class="card-value">123.4K</div><div class="card-label">{escape(label)}</div>'

    if vtype in ("tableEx", "pivotTable"):
        cols = bindings[:5] if bindings else ["Col 1", "Col 2", "Col 3"]
        header = "".join(f"<th>{escape(c)}</th>" for c in cols)
        rows = ""
        for r in range(3):
            cells = "".join("<td>...</td>" for _ in cols)
            rows += f"<tr>{cells}</tr>"
        return f'<table class="data-table"><thead><tr>{header}</tr></thead><tbody>{rows}</tbody></table>'

    return f'<div class="unknown-visual">{escape(vtype)}</div>'

def _extract_bindings(visual_config: dict[str, Any]) -> list[str]:
    """Extract field binding names from visual configuration."""
    bindings: list[str] = []
    query_state = visual_config.get("query", {}).get("queryState", {})

    for role, state in query_state.items():
        for proj in state.get("projections", []):
            ref = proj.get("queryRef", "")
            if ref:
                bindings.append(ref)

    # Also check Commands-based bindings
    for cmd in visual_config.get("query", {}).get("Commands", []):
        sq = cmd.get("SemanticQueryDataShapeCommand", {}).get("Query", {})
        sources = {s["Name"]: s["Entity"] for s in sq.get("From", [])}
        for sel in sq.get("Select", []):
            name = sel.get("Name", "")
            if name:
                # Try to make it readable
                for kind in ("Column", "Measure"):
                    if kind in sel:
                        src = sel[kind].get("Expression", {}).get("SourceRef", {}).get("Source", "")
                        prop = sel[kind].get("Property", "")
                        table = sources.get(src, src)
                        bindings.append(f"{table}[{prop}]")
                        break

    return bindings

def _get_page_order(definition_path: Path) -> list[str]:
    """Read page order from pages.json."""
    pages_json = definition_path / "pages" / "pages.json"
    if not pages_json.exists():
        return []
    try:
        data = json.loads(pages_json.read_text(encoding="utf-8"))
        return data.get("pageOrder", [])
    except (json.JSONDecodeError, KeyError):
        return []
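A standalone sketch of the ordering rule used in `render_report` above: pages found on disk are sorted by their index in `pageOrder`, and any page missing from `pageOrder` sinks to the end via the 9999 fallback key (page names here are hypothetical):

```python
# pageOrder from pages.json; on_disk is what the pages/ directory contains.
page_order = ["overview", "detail"]
on_disk = ["detail", "scratch", "overview"]

# Same sort key as render_report: index in pageOrder, else 9999.
ordered = sorted(on_disk, key=lambda n: page_order.index(n) if n in page_order else 9999)
print(ordered)  # ['overview', 'detail', 'scratch']
```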


def _read_json(path: Path) -> dict[str, Any]:
    return json.loads(path.read_text(encoding="utf-8"))


# ---------------------------------------------------------------------------
# HTML template
# ---------------------------------------------------------------------------

_HTML_TEMPLATE = """<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>pbi-cli Report Preview</title>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
    font-family: 'Segoe UI', system-ui, sans-serif;
    background: #1e1e1e;
    color: #e0e0e0;
    padding: 20px;
}
.header {
    display: flex;
    align-items: center;
    gap: 12px;
    margin-bottom: 24px;
    padding-bottom: 12px;
    border-bottom: 1px solid #333;
}
.header h1 {
    font-size: 18px;
    font-weight: 600;
    color: #f2c811;
}
.header .badge {
    background: #333;
    color: #aaa;
    padding: 2px 8px;
    border-radius: 4px;
    font-size: 12px;
}
.header .theme {
    margin-left: auto;
    font-size: 13px;
    color: #888;
}
.page {
    margin-bottom: 40px;
}
.page-title {
    font-size: 16px;
    font-weight: 500;
    margin-bottom: 12px;
    color: #ccc;
    padding-left: 4px;
}
.page-canvas {
    position: relative;
    background: #2d2d2d;
    border: 1px solid #444;
    border-radius: 4px;
    overflow: hidden;
}
.visual {
    position: absolute;
    border: 2px solid;
    border-radius: 4px;
    background: #252525;
    overflow: hidden;
    display: flex;
    flex-direction: column;
}
.visual-header {
    padding: 4px 8px;
    font-size: 11px;
    font-weight: 600;
    color: white;
    display: flex;
    align-items: center;
    gap: 6px;
    min-height: 24px;
}
.visual-icon { font-size: 14px; }
.visual-type { text-transform: capitalize; }
.visual-body {
    flex: 1;
    padding: 6px;
    overflow: hidden;
    display: flex;
    flex-direction: column;
}
.chart-svg { width: 100%; height: 100%; }
.card-value {
    font-size: 28px;
    font-weight: 700;
    text-align: center;
    color: #f2c811;
    padding: 10px 0 4px;
}
.card-label {
    font-size: 11px;
    text-align: center;
    color: #888;
}
.data-table {
    width: 100%;
    border-collapse: collapse;
    font-size: 11px;
}
.data-table th {
    background: #333;
    padding: 4px 6px;
    text-align: left;
    border-bottom: 1px solid #555;
    color: #ccc;
}
.data-table td {
    padding: 3px 6px;
    border-bottom: 1px solid #333;
    color: #999;
}
.bindings {
    list-style: none;
    margin-top: 4px;
    font-size: 10px;
    color: #777;
}
.bindings li::before {
    content: "\\2192 ";
    color: #555;
}
.empty { color: #666; text-align: center; padding: 40px; }
.empty-page {
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    color: #555;
    font-size: 14px;
}
.unknown-visual {
    display: flex;
    align-items: center;
    justify-content: center;
    height: 100%;
    color: #666;
    font-style: italic;
}
.ws-status {
    position: fixed;
    bottom: 12px;
    right: 12px;
    padding: 4px 10px;
    border-radius: 12px;
    font-size: 11px;
    background: #333;
}
.ws-status.connected { color: #4caf50; }
.ws-status.disconnected { color: #f44336; }
</style>
</head>
<body>
<div class="header">
  <h1>pbi-cli Report Preview</h1>
  <span class="badge">STRUCTURAL</span>
  <span class="theme">Theme: {{THEME}}</span>
</div>
{{PAGES}}
<div class="ws-status disconnected" id="ws-status">disconnected</div>
<script>
(function() {
    var wsUrl = 'ws://' + location.hostname + ':' + (parseInt(location.port) + 1);
    var status = document.getElementById('ws-status');
    function connect() {
        var ws = new WebSocket(wsUrl);
        ws.onopen = function() {
            status.textContent = 'live';
            status.className = 'ws-status connected';
        };
        ws.onmessage = function(e) {
            if (e.data === 'reload') location.reload();
        };
        ws.onclose = function() {
            status.textContent = 'disconnected';
            status.className = 'ws-status disconnected';
            setTimeout(connect, 2000);
        };
    }
    connect();
})();
</script>
</body>
</html>"""
126
src/pbi_cli/preview/server.py
Normal file
|
|
@ -0,0 +1,126 @@
|
|||
"""HTTP + WebSocket server for PBIR live preview.
|
||||
|
||||
Architecture:
|
||||
- HTTP server on ``port`` serves the rendered HTML
|
||||
- WebSocket server on ``port + 1`` pushes "reload" on file changes
|
||||
- File watcher polls the definition folder and triggers broadcasts
|
||||
|
||||
Uses only stdlib ``http.server`` + optional ``websockets`` library.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import http.server
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
|
||||
def start_preview_server(
|
||||
definition_path: Path,
|
||||
port: int = 8080,
|
||||
) -> dict[str, Any]:
|
||||
"""Start the preview server (blocking).
|
||||
|
||||
Returns a status dict (only reached if server is stopped).
|
||||
"""
|
||||
# Check for websockets dependency
|
||||
try:
|
||||
import websockets # type: ignore[import-untyped]
|
||||
except ImportError:
|
||||
return {
|
||||
"status": "error",
|
||||
"message": (
|
||||
"Preview requires the 'websockets' package. "
|
||||
"Install with: pip install pbi-cli-tool[preview]"
|
||||
),
|
||||
}
|
||||
|
||||
from pbi_cli.preview.renderer import render_report
|
||||
from pbi_cli.preview.watcher import PbirWatcher
|
||||
|
||||
ws_port = port + 1
|
||||
ws_clients: set[Any] = set()
|
||||
|
||||
# --- WebSocket server ---
|
||||
async def ws_handler(websocket: Any) -> None:
|
||||
ws_clients.add(websocket)
|
||||
try:
|
||||
async for _ in websocket:
|
||||
pass # Keep connection alive
|
||||
finally:
|
||||
ws_clients.discard(websocket)
|
||||
|
||||
async def broadcast_reload() -> None:
|
||||
if ws_clients:
|
||||
msg = "reload"
|
||||
await asyncio.gather(
|
||||
*[c.send(msg) for c in ws_clients],
|
||||
return_exceptions=True,
|
||||
)
|
||||
|
||||
loop: asyncio.AbstractEventLoop | None = None
|
||||
|
||||
def on_file_change() -> None:
|
||||
"""Called by the watcher when files change."""
|
||||
if loop is not None:
|
||||
asyncio.run_coroutine_threadsafe(broadcast_reload(), loop)
|
||||
|
||||
# --- HTTP server ---
|
||||
class PreviewHandler(http.server.BaseHTTPRequestHandler):
|
||||
def do_GET(self) -> None:
|
||||
try:
|
||||
html = render_report(definition_path)
|
||||
self.send_response(200)
|
||||
self.send_header("Content-Type", "text/html; charset=utf-8")
|
||||
self.send_header("Cache-Control", "no-cache, no-store, must-revalidate")
|
||||
self.end_headers()
|
||||
self.wfile.write(html.encode("utf-8"))
|
||||
except Exception as e:
|
||||
self.send_response(500)
|
||||
self.send_header("Content-Type", "text/plain")
|
||||
self.end_headers()
|
||||
self.wfile.write(f"Error: {e}".encode())
|
||||
|
||||
def log_message(self, format: str, *args: Any) -> None:
|
||||
pass # Suppress default logging
|
||||
|
||||
# --- Start everything ---
|
||||
import click
|
||||
|
||||
click.echo("Starting preview server...", err=True)
|
||||
click.echo(f" HTTP: http://localhost:{port}", err=True)
|
||||
click.echo(f" WebSocket: ws://localhost:{ws_port}", err=True)
|
||||
click.echo(f" Watching: {definition_path}", err=True)
|
||||
click.echo(" Press Ctrl+C to stop.", err=True)
|
||||
|
||||
# Start file watcher in background thread
|
||||
watcher = PbirWatcher(definition_path, on_change=on_file_change)
|
||||
watcher_thread = threading.Thread(target=watcher.start, daemon=True)
|
||||
watcher_thread.start()
|
||||
|
||||
# Start HTTP server in background thread
|
||||
httpd = http.server.HTTPServer(("127.0.0.1", port), PreviewHandler)
|
||||
http_thread = threading.Thread(target=httpd.serve_forever, daemon=True)
|
||||
http_thread.start()
|
||||
|
||||
# Run WebSocket server on main thread's event loop
|
||||
async def run_ws() -> None:
|
||||
nonlocal loop
|
||||
loop = asyncio.get_running_loop()
|
||||
async with websockets.serve(ws_handler, "127.0.0.1", ws_port):
|
||||
await asyncio.Future() # Run forever
|
||||
|
||||
try:
|
||||
asyncio.run(run_ws())
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
finally:
|
||||
watcher.stop()
|
||||
httpd.shutdown()
|
||||
|
||||
return {
|
||||
"status": "stopped",
|
||||
"message": "Preview server stopped.",
|
||||
}
|
||||
63
src/pbi_cli/preview/watcher.py
Normal file
|
|
@ -0,0 +1,63 @@
|
|||
"""File watcher for PBIR report changes.
|
||||
|
||||
Uses polling (stat-based) to avoid external dependencies.
|
||||
Notifies a callback when any JSON file in the definition folder changes.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import time
|
||||
from collections.abc import Callable
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
class PbirWatcher:
|
||||
"""Poll-based file watcher for PBIR definition folders."""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
definition_path: Path,
|
||||
on_change: Callable[[], None],
|
||||
interval: float = 0.5,
|
||||
) -> None:
|
||||
self.definition_path = definition_path
|
||||
self.on_change = on_change
|
||||
self.interval = interval
|
||||
self._running = False
|
||||
self._snapshot: dict[str, float] = {}
|
||||
|
||||
def _take_snapshot(self) -> dict[str, float]:
|
||||
"""Capture mtime of all JSON files."""
|
||||
snap: dict[str, float] = {}
|
||||
if not self.definition_path.is_dir():
|
||||
return snap
|
||||
for f in self.definition_path.rglob("*.json"):
|
||||
try:
|
||||
snap[str(f)] = f.stat().st_mtime
|
||||
except OSError:
|
||||
continue
|
||||
return snap
|
||||
|
||||
def _detect_changes(self) -> bool:
|
||||
"""Compare current state to last snapshot."""
|
||||
current = self._take_snapshot()
|
||||
changed = current != self._snapshot
|
||||
self._snapshot = current
|
||||
return changed
|
||||
|
||||
def start(self) -> None:
|
||||
"""Start watching (blocking). Call stop() from another thread to exit."""
|
||||
self._running = True
|
||||
self._snapshot = self._take_snapshot()
|
||||
|
||||
while self._running:
|
||||
time.sleep(self.interval)
|
||||
if self._detect_changes():
|
||||
try:
|
||||
self.on_change()
|
||||
except Exception:
|
||||
pass # Don't crash the watcher
|
||||
|
||||
def stop(self) -> None:
|
||||
"""Signal the watcher to stop."""
|
||||
self._running = False
|
||||
|
|
@ -50,6 +50,31 @@ pbi database import-tmdl ./model-tmdl/
|
|||
pbi database export-tmsl
|
||||
```
|
||||
|
||||
## TMDL Diff (Compare Snapshots)
|
||||
|
||||
Compare two TMDL export folders to see what changed between snapshots.
|
||||
Useful for CI/CD pipelines ("what did this PR change in the model?").
|
||||
|
||||
```bash
|
||||
# Compare two exports
|
||||
pbi database diff-tmdl ./model-before/ ./model-after/
|
||||
|
||||
# JSON output for CI/CD scripting
|
||||
pbi --json database diff-tmdl ./baseline/ ./current/
|
||||
```
|
||||
|
||||
Returns a structured summary:
|
||||
- **tables**: added, removed, and changed tables with per-table entity diffs
|
||||
(measures, columns, partitions, hierarchies added/removed/changed)
|
||||
- **relationships**: added, removed, and changed relationships
|
||||
- **model**: changed model-level properties (e.g. culture, default Power BI data source version)
|
||||
- **summary**: total counts of all changes
|
||||
|
||||
LineageTag-only changes (GUID regeneration without real edits) are automatically
|
||||
filtered out to avoid false positives.
|
||||
|
||||
No connection to Power BI Desktop is needed -- works on exported folders.
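In a pipeline you typically want to fail the build when the diff is non-empty. A minimal sketch of consuming the `--json` output, assuming only what the sections above state -- a top-level `summary` object of change counts (the sample keys are illustrative):

```python
import json


def total_changes(diff_json: str) -> int:
    """Sum all counts in the diff's "summary" section."""
    diff = json.loads(diff_json)
    return sum(diff.get("summary", {}).values())


# Hypothetical sample output -- key names are illustrative, not the exact schema.
sample = '{"summary": {"tables_changed": 2, "measures_added": 3, "relationships_removed": 0}}'
print(total_changes(sample))  # 5
```

A CI job can pipe `pbi --json database diff-tmdl ./baseline/ ./current/` into this and exit non-zero when the total is above zero.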
|
||||
|
||||
## Database Operations
|
||||
|
||||
```bash
|
||||
|
|
|
|||
137
src/pbi_cli/skills/power-bi-filters/SKILL.md
Normal file
|
|
@ -0,0 +1,137 @@
|
|||
---
|
||||
name: Power BI Filters
|
||||
description: >
|
||||
Add, remove, and manage page-level and visual-level filters on Power BI PBIR
|
||||
reports using pbi-cli. Invoke this skill whenever the user mentions "filter",
|
||||
"TopN filter", "top 10", "bottom 5", "relative date filter", "last 30 days",
|
||||
"categorical filter", "include values", "exclude values", "clear filters",
|
||||
"slicer filter", "page filter", "visual filter", or wants to restrict which
|
||||
data appears on a page or in a specific visual.
|
||||
tools: pbi-cli
|
||||
---
|
||||
|
||||
# Power BI Filters Skill
|
||||
|
||||
Add and manage filters on PBIR report pages and visuals. Filters are stored
|
||||
in the `filterConfig` section of `page.json` (page-level) or `visual.json`
|
||||
(visual-level). No Power BI Desktop connection is needed.
|
||||
|
||||
## Listing Filters
|
||||
|
||||
```bash
|
||||
# List all filters on a page
|
||||
pbi filters list --page page_abc123
|
||||
|
||||
# List filters on a specific visual
|
||||
pbi filters list --page page_abc123 --visual visual_def456
|
||||
```
|
||||
|
||||
Returns each filter's name, type, field, and scope (page or visual).
|
||||
|
||||
## Categorical Filters
|
||||
|
||||
Include or exclude specific values from a column:
|
||||
|
||||
```bash
|
||||
# Include only East and West regions
|
||||
pbi filters add-categorical --page page1 \
|
||||
--table Sales --column Region \
|
||||
--values "East" "West"
|
||||
```
|
||||
|
||||
The filter appears in the page's `filterConfig.filters` array. Power BI
|
||||
evaluates it as an IN-list against the specified column.
|
||||
|
||||
## TopN Filters
|
||||
|
||||
Show only the top (or bottom) N items ranked by a measure:
|
||||
|
||||
```bash
|
||||
# Top 10 products by revenue
|
||||
pbi filters add-topn --page page1 \
|
||||
--table Product --column Name \
|
||||
--n 10 \
|
||||
--order-by-table Sales --order-by-column Revenue
|
||||
|
||||
# Bottom 5 by quantity (ascending)
|
||||
pbi filters add-topn --page page1 \
|
||||
--table Product --column Name \
|
||||
--n 5 \
|
||||
--order-by-table Sales --order-by-column Quantity \
|
||||
--direction Bottom
|
||||
```
|
||||
|
||||
The `--table` and `--column` define which dimension to filter (the rows you
|
||||
want to keep). The `--order-by-table` and `--order-by-column` define the
|
||||
measure used for ranking. These can be different tables -- for example,
|
||||
filtering Product names by Sales revenue.
|
||||
|
||||
Direction defaults to `Top` (descending -- highest N). Use `--direction Bottom`
|
||||
for ascending (lowest N).
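The cross-table case can be pictured as a small descriptor pairing the filtered dimension with the ranking measure. A sketch -- the key names here are hypothetical and do NOT mirror the exact PBIR `filterConfig` schema:

```python
def topn_filter(
    table: str,
    column: str,
    n: int,
    order_by_table: str,
    order_by_column: str,
    direction: str = "Top",
) -> dict:
    """Build an illustrative TopN filter descriptor (hypothetical shape)."""
    if direction not in ("Top", "Bottom"):
        raise ValueError("direction must be 'Top' or 'Bottom'")
    return {
        "type": "TopN",
        "filter_on": f"{table}[{column}]",                   # the rows to keep
        "order_by": f"{order_by_table}[{order_by_column}]",  # the ranking measure
        "n": n,
        "direction": direction,
    }


print(topn_filter("Product", "Name", 10, "Sales", "Revenue"))
```

Note the two tables differ: `Product[Name]` is filtered, `Sales[Revenue]` ranks.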
|
||||
|
||||
## Relative Date Filters
|
||||
|
||||
Filter by a rolling window relative to today:
|
||||
|
||||
```bash
|
||||
# Last 30 days
|
||||
pbi filters add-relative-date --page page1 \
|
||||
--table Calendar --column Date \
|
||||
--period days --count 30 --direction last
|
||||
|
||||
# Next 7 days
|
||||
pbi filters add-relative-date --page page1 \
|
||||
--table Calendar --column Date \
|
||||
--period days --count 7 --direction next
|
||||
```
|
||||
|
||||
Period options: `days`, `weeks`, `months`, `quarters`, `years`.
|
||||
Direction: `last` (past) or `next` (future).
|
||||
|
||||
## Visual-Level Filters
|
||||
|
||||
Add a filter to a specific visual instead of the whole page by including
|
||||
`--visual`:
|
||||
|
||||
```bash
|
||||
pbi filters add-categorical --page page1 --visual vis_abc \
|
||||
--table Sales --column Channel \
|
||||
--values "Online"
|
||||
```
|
||||
|
||||
## Removing Filters
|
||||
|
||||
```bash
|
||||
# Remove a specific filter by name
|
||||
pbi filters remove --page page1 --name filter_abc123
|
||||
|
||||
# Clear ALL filters from a page
|
||||
pbi filters clear --page page1
|
||||
```
|
||||
|
||||
Filter names are auto-generated unique IDs. Use `pbi filters list` to find
|
||||
the name of the filter you want to remove.
|
||||
|
||||
## Workflow: Set Up Dashboard Filters
|
||||
|
||||
```bash
|
||||
# 1. Add a date filter to the overview page
|
||||
pbi filters add-relative-date --page overview \
|
||||
--table Calendar --column Date \
|
||||
--period months --count 12 --direction last
|
||||
|
||||
# 2. Add a TopN filter to show only top customers
|
||||
pbi filters add-topn --page overview \
|
||||
--table Customer --column Name \
|
||||
--n 20 \
|
||||
--order-by-table Sales --order-by-column Revenue
|
||||
|
||||
# 3. Verify
|
||||
pbi filters list --page overview
|
||||
```
|
||||
|
||||
## JSON Output
|
||||
|
||||
```bash
|
||||
pbi --json filters list --page page1
|
||||
```
|
||||
151
src/pbi_cli/skills/power-bi-pages/SKILL.md
Normal file
|
|
@ -0,0 +1,151 @@
|
|||
---
|
||||
name: Power BI Pages
|
||||
description: >
|
||||
Manage Power BI report pages and bookmarks -- add, remove, configure, and lay
|
||||
out pages in PBIR reports using pbi-cli. Invoke this skill whenever the user
|
||||
mentions "add page", "new page", "delete page", "page layout", "page size",
|
||||
"page background", "hide page", "show page", "drillthrough", "page order",
|
||||
"page visibility", "page settings", "page navigation", "bookmark", "create
|
||||
bookmark", "save bookmark", "delete bookmark", or wants to manage bookmarks
|
||||
that capture page-level state. Also invoke when the user asks about drillthrough
|
||||
configuration or pageBinding.
|
||||
tools: pbi-cli
|
||||
---
|
||||
|
||||
# Power BI Pages Skill
|
||||
|
||||
Manage pages in PBIR reports. Pages are folders inside `definition/pages/`
|
||||
containing a `page.json` file and a `visuals/` directory. No Power BI Desktop
|
||||
connection is needed.
|
||||
|
||||
## Listing and Inspecting Pages
|
||||
|
||||
```bash
|
||||
# List all pages with display names, order, and visibility
|
||||
pbi report list-pages
|
||||
|
||||
# Get full details of a specific page
|
||||
pbi report get-page page_abc123
|
||||
```
|
||||
|
||||
`get-page` returns:
|
||||
- `name`, `display_name`, `ordinal` (sort order)
|
||||
- `width`, `height` (canvas size in pixels)
|
||||
- `display_option` (e.g. `"FitToPage"`)
|
||||
- `visual_count` -- how many visuals on the page
|
||||
- `is_hidden` -- whether the page is hidden in the navigation pane
|
||||
- `page_type` -- `"Default"` or `"Drillthrough"`
|
||||
- `filter_config` -- page-level filter configuration (if any)
|
||||
- `visual_interactions` -- custom visual interaction rules (if any)
|
||||
- `page_binding` -- drillthrough parameter definition (if drillthrough page)
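The fields above are derived from `page.json`. A minimal sketch of pulling such a summary from a parsed page dict -- the raw key names (`displayName`, `visibility`, `pageBinding`) are assumptions about the PBIR schema, shown for illustration only:

```python
from typing import Any


def summarise_page(page_json: dict[str, Any], visual_count: int) -> dict[str, Any]:
    """Reduce a parsed page.json dict to the summary fields get-page reports."""
    return {
        "display_name": page_json.get("displayName"),
        "width": page_json.get("width"),
        "height": page_json.get("height"),
        # Assumed sentinel value for hidden pages.
        "is_hidden": page_json.get("visibility") == "HiddenInViewMode",
        # A pageBinding marks a drillthrough page; regular pages omit it.
        "is_drillthrough": page_json.get("pageBinding") is not None,
        "visual_count": visual_count,
    }


page = {"displayName": "Overview", "width": 1280, "height": 720}
print(summarise_page(page, visual_count=4))
```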
|
||||
|
||||
## Adding Pages
|
||||
|
||||
```bash
|
||||
# Add with display name (folder name auto-generated)
|
||||
pbi report add-page --display-name "Executive Overview"
|
||||
|
||||
# Custom folder name and canvas size
|
||||
pbi report add-page --display-name "Details" --name detail_page \
|
||||
--width 1920 --height 1080
|
||||
```
|
||||
|
||||
Default canvas size is 1280x720 (standard 16:9). Common alternatives:
|
||||
- 1920x1080 -- Full HD
|
||||
- 1280x960 -- 4:3
|
||||
- Custom dimensions for mobile or dashboard layouts
|
||||
|
||||
## Deleting Pages
|
||||
|
||||
```bash
|
||||
# Delete a page and all its visuals
|
||||
pbi report delete-page page_abc123
|
||||
```
|
||||
|
||||
This removes the entire page folder including all visual subdirectories.
|
||||
|
||||
## Page Background
|
||||
|
||||
```bash
|
||||
# Set a solid background colour
|
||||
pbi report set-background page_abc123 --color "#F5F5F5"
|
||||
```
|
||||
|
||||
## Page Visibility
|
||||
|
||||
Control whether a page appears in the report navigation pane:
|
||||
|
||||
```bash
|
||||
# Hide a page (useful for drillthrough or tooltip pages)
|
||||
pbi report set-visibility page_abc123 --hidden
|
||||
|
||||
# Show a hidden page
|
||||
pbi report set-visibility page_abc123 --visible
|
||||
```
|
||||
|
||||
## Bookmarks
|
||||
|
||||
Bookmarks capture page-level state (filters, visibility, scroll position).
|
||||
They live in `definition/bookmarks/`:
|
||||
|
||||
```bash
|
||||
# List all bookmarks in the report
|
||||
pbi bookmarks list
|
||||
|
||||
# Get details of a specific bookmark
|
||||
pbi bookmarks get "My Bookmark"
|
||||
|
||||
# Add a new bookmark
|
||||
pbi bookmarks add "Executive View"
|
||||
|
||||
# Delete a bookmark
|
||||
pbi bookmarks delete "Old Bookmark"
|
||||
|
||||
# Toggle bookmark visibility
|
||||
pbi bookmarks set-visibility "Draft View" --hidden
|
||||
```
|
||||
|
||||
## Drillthrough Pages
|
||||
|
||||
Drillthrough pages have a `pageBinding` field in `page.json` that defines the
|
||||
drillthrough parameter. When you call `get-page` on a drillthrough page, the
|
||||
`page_binding` field returns the full binding definition including parameter
|
||||
name, bound filter, and field expression. Regular pages return `null`.
|
||||
|
||||
To create a drillthrough page, add a page and then configure it as drillthrough
|
||||
in Power BI Desktop (PBIR drillthrough configuration is not yet supported via
|
||||
CLI -- the CLI can read and report on drillthrough configuration).
|
||||
|
||||
## Workflow: Set Up Report Pages
|
||||
|
||||
```bash
|
||||
# 1. Add pages in order
|
||||
pbi report add-page --display-name "Overview" --name overview
|
||||
pbi report add-page --display-name "Sales Detail" --name sales_detail
|
||||
pbi report add-page --display-name "Regional Drillthrough" --name region_drill
|
||||
|
||||
# 2. Hide the drillthrough page from navigation
|
||||
pbi report set-visibility region_drill --hidden
|
||||
|
||||
# 3. Set backgrounds
|
||||
pbi report set-background overview --color "#FAFAFA"
|
||||
|
||||
# 4. Verify the setup
|
||||
pbi report list-pages
|
||||
```
|
||||
|
||||
## Path Resolution
|
||||
|
||||
Page commands inherit the report path from the parent `pbi report` group:
|
||||
|
||||
1. Explicit: `pbi report --path ./MyReport.Report list-pages`
|
||||
2. Auto-detect: walks up from CWD looking for `*.Report/definition/`
|
||||
3. From `.pbip`: finds sibling `.Report` folder from `.pbip` file
|
||||
|
||||
## JSON Output
|
||||
|
||||
```bash
|
||||
pbi --json report list-pages
|
||||
pbi --json report get-page page_abc123
|
||||
pbi --json bookmarks list
|
||||
```
|
||||
169
src/pbi_cli/skills/power-bi-report/SKILL.md
Normal file
|
|
@ -0,0 +1,169 @@
|
|||
---
|
||||
name: Power BI Report
|
||||
description: >
|
||||
Scaffold, validate, preview, and manage Power BI PBIR report projects using
|
||||
pbi-cli. Invoke this skill whenever the user mentions "create report", "new
|
||||
report", "PBIR", "scaffold", "validate report", "report structure", "preview
|
||||
report", "report info", "reload Desktop", "convert report", ".pbip project",
|
||||
"report project", or wants to understand the PBIR folder format, set up a new
|
||||
report from scratch, or work with the report as a whole. For specific tasks,
|
||||
see also: power-bi-visuals (charts, binding), power-bi-pages (page management),
|
||||
power-bi-themes (themes, formatting), power-bi-filters (page/visual filters).
|
||||
tools: pbi-cli
|
||||
---
|
||||
|
||||
# Power BI Report Skill
|
||||
|
||||
Manage Power BI PBIR report projects at the top level -- scaffolding, validation,
|
||||
preview, and Desktop integration. No connection to Power BI Desktop is needed
|
||||
for most operations.
|
||||
|
||||
## PBIR Format
|
||||
|
||||
PBIR (Enhanced Report Format) stores reports as a folder of JSON files:
|
||||
|
||||
```
|
||||
MyReport.Report/
|
||||
definition.pbir # dataset reference
|
||||
definition/
|
||||
version.json # PBIR version
|
||||
report.json # report settings, theme
|
||||
pages/
|
||||
pages.json # page order
|
||||
page_abc123/
|
||||
page.json # page settings
|
||||
visuals/
|
||||
visual_def456/
|
||||
visual.json # visual type, position, bindings
|
||||
```
|
||||
|
||||
Each file has a public JSON schema from Microsoft for validation.
|
||||
PBIR is GA as of January 2026 and the default format in Desktop since March 2026.
|
||||
|
||||
## Creating a Report
|
||||
|
||||
```bash
|
||||
# Scaffold a new report project
|
||||
pbi report create ./MyProject --name "Sales Report"
|
||||
|
||||
# With dataset reference
|
||||
pbi report create ./MyProject --name "Sales" --dataset-path "../Sales.Dataset"
|
||||
```
|
||||
|
||||
This creates the full folder structure with `definition.pbir`, `report.json`,
|
||||
`version.json`, and an empty `pages/` directory.
|
||||
|
||||
## Report Info and Validation
|
||||
|
||||
```bash
|
||||
# Show report metadata summary (pages, theme, dataset)
|
||||
pbi report info
|
||||
pbi report info --path ./MyReport.Report
|
||||
|
||||
# Validate report structure and JSON files
|
||||
pbi report validate
|
||||
```
|
||||
|
||||
Validation checks:
|
||||
- Required files exist (`definition.pbir`, `report.json`, `version.json`)
|
||||
- All JSON files parse without errors
|
||||
- Schema URLs are present and consistent
|
||||
- Page references in `pages.json` match actual page folders
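The last check above can be sketched as a small cross-reference: every page named in `pages.json` must have a matching folder. The `pageOrder` key is an assumption about the `pages.json` schema:

```python
import json
from pathlib import Path


def check_page_refs(definition: Path) -> list[str]:
    """Return page names listed in pages.json with no matching folder."""
    pages_dir = definition / "pages"
    listed = json.loads((pages_dir / "pages.json").read_text()).get("pageOrder", [])
    folders = {p.name for p in pages_dir.iterdir() if p.is_dir()}
    return [name for name in listed if name not in folders]
```

Running it against a definition folder where `pages.json` lists `page_b` but only `page_a/` exists would return `['page_b']`.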
|
||||
|
||||
## Preview
|
||||
|
||||
Start a live HTML preview of the report layout:
|
||||
|
||||
```bash
|
||||
pbi report preview
|
||||
```
|
||||
|
||||
Opens a browser showing all pages with visual placeholders, types, positions,
|
||||
and data bindings. The preview auto-refreshes when files change.
|
||||
|
||||
Requires the `preview` optional dependency: `pip install pbi-cli-tool[preview]`
|
||||
|
||||
## Desktop Integration
|
||||
|
||||
```bash
|
||||
# Trigger Power BI Desktop to reload the current report
|
||||
pbi report reload
|
||||
```
|
||||
|
||||
Power BI Desktop's Developer Mode auto-detects TMDL changes but not PBIR
|
||||
changes. This command sends a keyboard shortcut to the Desktop window to
|
||||
trigger a reload. Requires the `reload` optional dependency: `pip install pbi-cli-tool[reload]`
|
||||
|
||||
## Convert
|
||||
|
||||
```bash
|
||||
# Convert a .Report folder into a distributable .pbip project
|
||||
pbi report convert ./MyReport.Report --output ./distributable/
|
||||
```
|
||||
|
||||
## Path Resolution
|
||||
|
||||
All report commands auto-detect the `.Report` folder:
|
||||
|
||||
1. Explicit: `pbi report --path ./MyReport.Report info`
|
||||
2. Auto-detect: walks up from CWD looking for `*.Report/definition/`
|
||||
3. From `.pbip`: finds sibling `.Report` folder from `.pbip` file
|
||||
|
||||
## Workflow: Build a Complete Report
|
||||
|
||||
This workflow uses commands from multiple skills:
|
||||
|
||||
```bash
|
||||
# 1. Scaffold report (this skill)
|
||||
pbi report create . --name "SalesDashboard" --dataset-path "../SalesModel.Dataset"
|
||||
|
||||
# 2. Add pages (power-bi-pages skill)
|
||||
pbi report add-page --display-name "Overview" --name overview
|
||||
pbi report add-page --display-name "Details" --name details
|
||||
|
||||
# 3. Add visuals (power-bi-visuals skill)
|
||||
pbi visual add --page overview --type card --name revenue_card
|
||||
pbi visual add --page overview --type bar --name sales_by_region
|
||||
|
||||
# 4. Bind data (power-bi-visuals skill)
|
||||
pbi visual bind revenue_card --page overview --field "Sales[Total Revenue]"
|
||||
pbi visual bind sales_by_region --page overview \
|
||||
--category "Geo[Region]" --value "Sales[Amount]"
|
||||
|
||||
# 5. Apply theme (power-bi-themes skill)
|
||||
pbi report set-theme --file brand-colors.json
|
||||
|
||||
# 6. Validate (this skill)
|
||||
pbi report validate
|
||||
```
|
||||
|
||||
## Combining Model and Report
|
||||
|
||||
pbi-cli covers both the semantic model layer and the report layer:
|
||||
|
||||
```bash
|
||||
# Model layer (requires pbi connect)
|
||||
pbi connect
|
||||
pbi measure create Sales "Total Revenue" "SUM(Sales[Amount])"
|
||||
|
||||
# Report layer (no connection needed)
|
||||
pbi report create . --name "Sales"
|
||||
pbi visual add --page overview --type card --name rev_card
|
||||
pbi visual bind rev_card --page overview --field "Sales[Total Revenue]"
|
||||
```
|
||||
|
||||
## Related Skills
|
||||
|
||||
| Skill | When to use |
|
||||
|-------|-------------|
|
||||
| **power-bi-visuals** | Add, bind, update, delete visuals |
|
||||
| **power-bi-pages** | Add, remove, configure pages and bookmarks |
|
||||
| **power-bi-themes** | Themes, conditional formatting |
|
||||
| **power-bi-filters** | Page and visual filters |
|
||||
|
||||
## JSON Output
|
||||
|
||||
```bash
|
||||
pbi --json report info
|
||||
pbi --json report validate
|
||||
```
|
||||
137
src/pbi_cli/skills/power-bi-themes/SKILL.md
Normal file
|
|
@ -0,0 +1,137 @@
|
|||
---
|
||||
name: Power BI Themes
|
||||
description: >
|
||||
Apply, inspect, and compare Power BI report themes and conditional formatting
|
||||
rules using pbi-cli. Invoke this skill whenever the user mentions "theme",
|
||||
"colours", "colors", "branding", "dark mode", "corporate theme", "styling",
|
||||
"conditional formatting", "colour scale", "gradient", "data bars",
|
||||
"background colour", "formatting rules", "visual formatting", or wants to
|
||||
change the overall look-and-feel of a report or apply data-driven formatting
|
||||
to specific visuals.
|
||||
tools: pbi-cli
|
||||
---
|
||||
|
||||
# Power BI Themes Skill
|
||||
|
||||
Manage report-wide themes and per-visual conditional formatting. No Power BI
|
||||
Desktop connection is needed.
|
||||
|
||||
## Applying a Theme
|
||||
|
||||
Power BI themes are JSON files that define colours, fonts, and visual defaults
|
||||
for the entire report. Apply one with:
|
||||
|
||||
```bash
|
||||
pbi report set-theme --file corporate-theme.json
|
||||
```
|
||||
|
||||
This copies the theme file into the report's `StaticResources/RegisteredResources/`
|
||||
folder and updates `report.json` to reference it. The theme takes effect when
|
||||
the report is opened in Power BI Desktop.
|
||||
|
||||
## Inspecting the Current Theme
|
||||
|
||||
```bash
|
||||
pbi report get-theme
|
||||
```
|
||||
|
||||
Returns:
|
||||
- `base_theme` -- the built-in theme name (e.g. `"CY24SU06"`)
|
||||
- `custom_theme` -- custom theme name if one is applied (or `null`)
|
||||
- `theme_data` -- full JSON of the custom theme file (if it exists)
|
||||
|
||||
## Comparing Themes
|
||||
|
||||
Before applying a new theme, preview what would change:
|
||||
|
||||
```bash
|
||||
pbi report diff-theme --file proposed-theme.json
|
||||
```
|
||||
|
||||
Returns:
|
||||
- `current` / `proposed` -- display names
|
||||
- `added` -- keys in proposed but not current
|
||||
- `removed` -- keys in current but not proposed
|
||||
- `changed` -- keys present in both but with different values
|
||||
|
||||
This helps catch unintended colour changes before committing.
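The top-level comparison can be sketched with plain set operations over the theme keys. Real themes nest (e.g. `visualStyles`), so the actual diff is presumably recursive; this flat version just shows the added/removed/changed split:

```python
def diff_theme(current: dict, proposed: dict) -> dict:
    """Flat key-level diff between two theme dicts."""
    return {
        "added": sorted(proposed.keys() - current.keys()),
        "removed": sorted(current.keys() - proposed.keys()),
        "changed": sorted(
            k for k in current.keys() & proposed.keys() if current[k] != proposed[k]
        ),
    }


old = {"name": "Default", "background": "#FFFFFF"}
new = {"name": "Acme", "background": "#FFFFFF", "tableAccent": "#1B365D"}
print(diff_theme(old, new))
# {'added': ['tableAccent'], 'removed': [], 'changed': ['name']}
```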
|
||||
|
||||
## Theme JSON Structure
|
||||
|
||||
A Power BI theme JSON file typically contains:
|
||||
|
||||
```json
|
||||
{
|
||||
"name": "Corporate Brand",
|
||||
"dataColors": ["#0078D4", "#00BCF2", "#FFB900", "#D83B01", "#8661C5", "#00B294"],
|
||||
"background": "#FFFFFF",
|
||||
"foreground": "#252423",
|
||||
"tableAccent": "#0078D4",
|
||||
"visualStyles": { ... }
|
||||
}
|
||||
```
|
||||
|
||||
Key sections:
|
||||
- `dataColors` -- palette for data series (6-12 colours recommended)
|
||||
- `background` / `foreground` -- page and text defaults
|
||||
- `tableAccent` -- header colour for tables and matrices
|
||||
- `visualStyles` -- per-visual-type overrides (font sizes, padding, etc.)
|
||||
|
||||
See [Microsoft theme documentation](https://learn.microsoft.com/power-bi/create-reports/desktop-report-themes) for the full schema.
|
||||
|
||||
## Conditional Formatting
|
||||
|
||||
Apply data-driven formatting to individual visuals:
|
||||
|
||||
```bash
|
||||
# Gradient background (colour scale from min to max)
|
||||
pbi format background-gradient visual_abc --page page1 \
|
||||
--table Sales --column Revenue \
|
||||
--min-color "#FFFFFF" --max-color "#0078D4"
|
||||
|
||||
# Rules-based background (specific value triggers a colour)
|
||||
pbi format background-conditional visual_abc --page page1 \
|
||||
--table Sales --column Status --value "Critical" --color "#FF0000"
|
||||
|
||||
# Measure-driven background (a DAX measure returns the colour)
|
||||
pbi format background-measure visual_abc --page page1 \
|
||||
--table Sales --measure "Status Color"
|
||||
|
||||
# Inspect current formatting rules
|
||||
pbi format get visual_abc --page page1
|
||||
|
||||
# Clear all formatting rules on a visual
|
||||
pbi format clear visual_abc --page page1
|
||||
```
|
||||
|
||||
## Workflow: Brand a Report
|
||||
|
||||
```bash
|
||||
# 1. Create the theme file
|
||||
cat > brand-theme.json << 'EOF'
|
||||
{
|
||||
"name": "Acme Corp",
|
||||
"dataColors": ["#1B365D", "#5B8DB8", "#E87722", "#00A3E0", "#6D2077", "#43B02A"],
|
||||
"background": "#F8F8F8",
|
||||
"foreground": "#1B365D",
|
||||
"tableAccent": "#1B365D"
|
||||
}
|
||||
EOF
|
||||
|
||||
# 2. Preview the diff against the current theme
|
||||
pbi report diff-theme --file brand-theme.json
|
||||
|
||||
# 3. Apply it
|
||||
pbi report set-theme --file brand-theme.json
|
||||
|
||||
# 4. Verify
|
||||
pbi report get-theme
|
||||
```
|
||||
|
||||
## JSON Output
|
||||
|
||||
```bash
|
||||
pbi --json report get-theme
|
||||
pbi --json report diff-theme --file proposed.json
|
||||
pbi --json format get vis1 --page p1
|
||||
```
|
||||
223
src/pbi_cli/skills/power-bi-visuals/SKILL.md
Normal file
|
|
@ -0,0 +1,223 @@
|
---
name: Power BI Visuals
description: >
  Add, configure, bind data to, and bulk-manage visuals on Power BI PBIR report
  pages using pbi-cli. Invoke this skill whenever the user mentions "add a chart",
  "bar chart", "line chart", "card", "KPI", "gauge", "scatter", "table visual",
  "matrix", "slicer", "combo chart", "bind data", "visual type", "visual layout",
  "resize visuals", "bulk update visuals", "bulk delete", "visual calculations",
  or wants to place, move, bind, or remove any visual on a report page. Also invoke
  when the user asks what visual types are supported or how to connect a visual to
  their data model.
tools: pbi-cli
---

# Power BI Visuals Skill

Create and manage visuals on PBIR report pages. No Power BI Desktop connection
is needed -- these commands operate directly on JSON files.

## Adding Visuals

```bash
# Add by alias (pbi-cli resolves it to the PBIR type)
pbi visual add --page page_abc123 --type bar
pbi visual add --page page_abc123 --type card --name "Revenue Card"

# Custom position and size (pixels)
pbi visual add --page page_abc123 --type scatter \
  --x 50 --y 400 --width 600 --height 350

# Named visual for easy reference
pbi visual add --page page_abc123 --type combo --name sales_combo
```

Each visual is created as a folder containing a `visual.json` file inside the page's
`visuals/` directory. The template includes the correct schema URL and queryState
roles for the chosen type.

## Binding Data

Visuals start empty. Use `visual bind` with `Table[Column]` notation to connect
them to your semantic model. The bind options vary by visual type -- see the
type table below.

```bash
# Bar chart: category axis + value
pbi visual bind mybar --page p1 \
  --category "Geography[Region]" --value "Sales[Revenue]"

# Card: single field
pbi visual bind mycard --page p1 --field "Sales[Total Revenue]"

# Matrix: rows + values + optional column
pbi visual bind mymatrix --page p1 \
  --row "Product[Category]" --value "Sales[Amount]" --value "Sales[Quantity]"

# Scatter: X, Y, detail, optional size and legend
pbi visual bind myscatter --page p1 \
  --x "Sales[Quantity]" --y "Sales[Revenue]" --detail "Product[Name]"

# Combo chart: category + column series + line series
pbi visual bind mycombo --page p1 \
  --category "Calendar[Month]" --column "Sales[Revenue]" --line "Sales[Margin]"

# KPI: indicator + goal + trend axis
pbi visual bind mykpi --page p1 \
  --indicator "Sales[Revenue]" --goal "Sales[Target]" --trend "Calendar[Date]"

# Gauge: value + max/target
pbi visual bind mygauge --page p1 \
  --value "Sales[Revenue]" --max "Sales[Target]"
```

Binding uses ROLE_ALIASES to translate friendly option names like `--value` into the
PBIR role name (e.g. `Y`, `Values`, `Data`). Whether a Measure or Column reference is
created is inferred from the role: value/indicator/goal/max roles create Measure
references, while category/row/detail roles create Column references. Override with
the `--measure` flag if needed.
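That translation can be pictured roughly like this (the alias table and helper below are assumptions for illustration, not pbi-cli's actual ROLE_ALIASES contents):

```python
# Assumed alias table for illustration; the real ROLE_ALIASES may differ.
ROLE_ALIASES = {
    "value": "Y",
    "category": "Category",
    "field": "Values",
    "row": "Rows",
    "indicator": "Indicator",
    "goal": "Goal",
}
# Roles whose bound field becomes a Measure reference; everything else is a Column.
MEASURE_ROLES = {"value", "indicator", "goal", "max"}

def resolve(option: str) -> tuple[str, str]:
    """Map a friendly CLI option name to (PBIR role, reference kind)."""
    kind = "Measure" if option in MEASURE_ROLES else "Column"
    return ROLE_ALIASES[option], kind

print(resolve("value"))     # → ('Y', 'Measure')
print(resolve("category"))  # → ('Category', 'Column')
```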

## Inspecting and Updating

```bash
# List all visuals on a page
pbi visual list --page page_abc123

# Get full details of one visual
pbi visual get visual_def456 --page page_abc123

# Move, resize, or toggle visibility
pbi visual update vis1 --page p1 --width 600 --height 400
pbi visual update vis1 --page p1 --x 100 --y 200
pbi visual update vis1 --page p1 --hidden
pbi visual update vis1 --page p1 --visible

# Delete a visual
pbi visual delete visual_def456 --page page_abc123
```

## Container Properties

Set a border, background, or title on the visual container itself:

```bash
pbi visual set-container vis1 --page p1 --background "#F0F0F0"
pbi visual set-container vis1 --page p1 --border-color "#CCCCCC" --border-width 2
pbi visual set-container vis1 --page p1 --title "Sales by Region"
```

## Visual Calculations

Add DAX calculations that run inside the visual's scope:

```bash
pbi visual calc-add vis1 --page p1 --role Values \
  --name "RunningTotal" --expression "RUNNINGSUM([Revenue])"

pbi visual calc-list vis1 --page p1
pbi visual calc-delete vis1 --page p1 --name "RunningTotal"
```

## Bulk Operations

Operate on many visuals at once by filtering with `--type` or `--name-pattern`:

```bash
# Find visuals matching criteria
pbi visual where --page overview --type barChart
pbi visual where --page overview --type kpi --y-min 300

# Bind the same fields to ALL bar charts on a page
pbi visual bulk-bind --page overview --type barChart \
  --category "Date[Month]" --value "Sales[Revenue]"

# Resize all KPI cards
pbi visual bulk-update --page overview --type kpi --width 250 --height 120

# Hide all visuals matching a pattern
pbi visual bulk-update --page overview --name-pattern "Temp_*" --hidden

# Delete all placeholders
pbi visual bulk-delete --page overview --name-pattern "Placeholder_*"
```

Filter options for `where`, `bulk-bind`, `bulk-update`, and `bulk-delete`:

- `--type` -- PBIR visual type or alias (e.g. `barChart`, `bar`)
- `--name-pattern` -- fnmatch glob on the visual name (e.g. `Chart_*`)
- `--x-min`, `--x-max`, `--y-min`, `--y-max` -- position bounds (pixels)

All bulk commands require at least `--type` or `--name-pattern` to prevent
accidental mass operations.
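The `--name-pattern` globs follow Python's `fnmatch` semantics; a quick standalone illustration (the visual names are made up):

```python
from fnmatch import fnmatch

# Hypothetical visual names on a page; only those matching the glob are selected.
names = ["Temp_Card", "Temp_Chart", "SalesTrend", "Placeholder_1"]
selected = [n for n in names if fnmatch(n, "Temp_*")]
print(selected)  # → ['Temp_Card', 'Temp_Chart']
```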

## Supported Visual Types (32)

### Charts

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| bar                | barChart                     | --category, --value, --legend                 |
| line               | lineChart                    | --category, --value, --legend                 |
| column             | columnChart                  | --category, --value, --legend                 |
| area               | areaChart                    | --category, --value, --legend                 |
| ribbon             | ribbonChart                  | --category, --value, --legend                 |
| waterfall          | waterfallChart               | --category, --value, --breakdown              |
| stacked_bar        | stackedBarChart              | --category, --value, --legend                 |
| clustered_bar      | clusteredBarChart            | --category, --value, --legend                 |
| clustered_column   | clusteredColumnChart         | --category, --value, --legend                 |
| scatter            | scatterChart                 | --x, --y, --detail, --size, --legend          |
| funnel             | funnelChart                  | --category, --value                           |
| combo              | lineStackedColumnComboChart  | --category, --column, --line, --legend        |
| donut / pie        | donutChart                   | --category, --value, --legend                 |
| treemap            | treemap                      | --category, --value                           |

### Cards and KPIs

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| card               | card                         | --field                                       |
| card_visual        | cardVisual                   | --field (modern card)                         |
| card_new           | cardNew                      | --field                                       |
| multi_row_card     | multiRowCard                 | --field                                       |
| kpi                | kpi                          | --indicator, --goal, --trend                  |
| gauge              | gauge                        | --value, --max / --target                     |

### Tables

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| table              | tableEx                      | --value                                       |
| matrix             | pivotTable                   | --row, --value, --column                      |

### Slicers

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| slicer             | slicer                       | --field                                       |
| text_slicer        | textSlicer                   | --field                                       |
| list_slicer        | listSlicer                   | --field                                       |
| advanced_slicer    | advancedSlicerVisual         | --field (tile/image slicer)                   |

### Maps

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| azure_map / map    | azureMap                     | --category, --size                            |

### Decorative and Navigation

| Alias              | PBIR Type                    | Bind Options                                  |
|--------------------|------------------------------|-----------------------------------------------|
| action_button      | actionButton                 | (no data binding)                             |
| image              | image                        | (no data binding)                             |
| shape              | shape                        | (no data binding)                             |
| textbox            | textbox                      | (no data binding)                             |
| page_navigator     | pageNavigator                | (no data binding)                             |

## JSON Output

All commands support `--json` for agent consumption:

```bash
pbi --json visual list --page overview
pbi --json visual get vis1 --page overview
pbi --json visual where --page overview --type barChart
```
1
src/pbi_cli/templates/__init__.py
Normal file

@@ -0,0 +1 @@
"""PBIR visual and report templates."""

1
src/pbi_cli/templates/visuals/__init__.py
Normal file

@@ -0,0 +1 @@
"""PBIR visual templates."""

19
src/pbi_cli/templates/visuals/actionButton.json
Normal file

@@ -0,0 +1,19 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "actionButton",
    "objects": {},
    "visualContainerObjects": {},
    "drillFilterOtherVisuals": true
  },
  "howCreated": "InsertVisualButton"
}

24
src/pbi_cli/templates/visuals/advancedSlicerVisual.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "advancedSlicerVisual",
    "query": {
      "queryState": {
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/areaChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "areaChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/azureMap.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "azureMap",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Size": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/barChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "barChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/card.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "card",
    "query": {
      "queryState": {
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/cardNew.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "cardNew",
    "query": {
      "queryState": {
        "Fields": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

29
src/pbi_cli/templates/visuals/cardVisual.json
Normal file

@@ -0,0 +1,29 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "cardVisual",
    "query": {
      "queryState": {
        "Data": {
          "projections": []
        }
      },
      "sortDefinition": {
        "sort": [],
        "isDefaultSort": true
      }
    },
    "objects": {},
    "visualContainerObjects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/clusteredBarChart.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "clusteredBarChart",
    "query": {
      "queryState": {
        "Category": {"projections": []},
        "Y": {"projections": []},
        "Legend": {"projections": []}
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/clusteredColumnChart.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "clusteredColumnChart",
    "query": {
      "queryState": {
        "Category": {"projections": []},
        "Y": {"projections": []},
        "Legend": {"projections": []}
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/columnChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "columnChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/donutChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "donutChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/funnelChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "funnelChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/gauge.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "gauge",
    "query": {
      "queryState": {
        "Y": {
          "projections": []
        },
        "MaxValue": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

19
src/pbi_cli/templates/visuals/image.json
Normal file

@@ -0,0 +1,19 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "image",
    "objects": {},
    "visualContainerObjects": {},
    "drillFilterOtherVisuals": true
  },
  "howCreated": "InsertVisualButton"
}

30
src/pbi_cli/templates/visuals/kpi.json
Normal file

@@ -0,0 +1,30 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "kpi",
    "query": {
      "queryState": {
        "Indicator": {
          "projections": []
        },
        "Goal": {
          "projections": []
        },
        "TrendLine": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/lineChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "lineChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

30
src/pbi_cli/templates/visuals/lineStackedColumnComboChart.json
Normal file

@@ -0,0 +1,30 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "lineStackedColumnComboChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "ColumnY": {
          "projections": []
        },
        "LineY": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

22
src/pbi_cli/templates/visuals/listSlicer.json
Normal file

@@ -0,0 +1,22 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "listSlicer",
    "query": {
      "queryState": {
        "Values": {"projections": [], "active": true}
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/multiRowCard.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "multiRowCard",
    "query": {
      "queryState": {
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

19
src/pbi_cli/templates/visuals/pageNavigator.json
Normal file

@@ -0,0 +1,19 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "pageNavigator",
    "objects": {},
    "visualContainerObjects": {},
    "drillFilterOtherVisuals": true
  },
  "howCreated": "InsertVisualButton"
}

27
src/pbi_cli/templates/visuals/pivotTable.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "pivotTable",
    "query": {
      "queryState": {
        "Rows": {
          "projections": []
        },
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/ribbonChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "ribbonChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

30
src/pbi_cli/templates/visuals/scatterChart.json
Normal file

@@ -0,0 +1,30 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "scatterChart",
    "query": {
      "queryState": {
        "Details": {
          "projections": []
        },
        "X": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

19
src/pbi_cli/templates/visuals/shape.json
Normal file

@@ -0,0 +1,19 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "shape",
    "objects": {},
    "visualContainerObjects": {},
    "drillFilterOtherVisuals": true
  },
  "howCreated": "InsertVisualButton"
}

24
src/pbi_cli/templates/visuals/slicer.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "slicer",
    "query": {
      "queryState": {
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/stackedBarChart.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "stackedBarChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

24
src/pbi_cli/templates/visuals/tableEx.json
Normal file

@@ -0,0 +1,24 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "tableEx",
    "query": {
      "queryState": {
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

22
src/pbi_cli/templates/visuals/textSlicer.json
Normal file

@@ -0,0 +1,22 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "textSlicer",
    "query": {
      "queryState": {
        "Values": {"projections": []}
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

17
src/pbi_cli/templates/visuals/textbox.json
Normal file

@@ -0,0 +1,17 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "textbox",
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}

27
src/pbi_cli/templates/visuals/treemap.json
Normal file

@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "treemap",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Values": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}
|
||||
27 src/pbi_cli/templates/visuals/waterfallChart.json Normal file
@@ -0,0 +1,27 @@
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/item/report/definition/visualContainer/2.7.0/schema.json",
  "name": "__VISUAL_NAME__",
  "position": {
    "x": __X__,
    "y": __Y__,
    "z": __Z__,
    "height": __HEIGHT__,
    "width": __WIDTH__,
    "tabOrder": __TAB_ORDER__
  },
  "visual": {
    "visualType": "waterfallChart",
    "query": {
      "queryState": {
        "Category": {
          "projections": []
        },
        "Y": {
          "projections": []
        }
      }
    },
    "objects": {},
    "drillFilterOtherVisuals": true
  }
}
185 src/pbi_cli/utils/desktop_reload.py Normal file
@@ -0,0 +1,185 @@
"""Trigger Power BI Desktop to reload the current report.

Implements a fallback chain:
1. pywin32 (if installed): find window, send keyboard shortcut
2. PowerShell: use Add-Type + SendKeys via subprocess
3. Manual: print instructions for the user

Power BI Desktop's Developer Mode auto-detects file changes in TMDL but
not in PBIR. This module bridges the gap by programmatically triggering
a reload from the CLI.
"""

from __future__ import annotations

import subprocess
import sys
from typing import Any


def reload_desktop() -> dict[str, Any]:
    """Attempt to reload the current report in Power BI Desktop.

    Tries methods in order of reliability. Returns a dict with
    ``status``, ``method``, and ``message``.
    """
    # Method 1: pywin32
    result = _try_pywin32()
    if result is not None:
        return result

    # Method 2: PowerShell
    result = _try_powershell()
    if result is not None:
        return result

    # Method 3: manual instructions
    return {
        "status": "manual",
        "method": "instructions",
        "message": (
            "Could not auto-reload Power BI Desktop. "
            "Please press Ctrl+Shift+F5 in Power BI Desktop to refresh, "
            "or close and reopen the report file."
        ),
    }


def _try_pywin32() -> dict[str, Any] | None:
    """Try to use pywin32 to send a reload shortcut to PBI Desktop."""
    try:
        import win32api  # type: ignore[import-untyped]
        import win32con  # type: ignore[import-untyped]
        import win32gui  # type: ignore[import-untyped]
    except ImportError:
        return None

    hwnd = _find_pbi_window_pywin32()
    if hwnd == 0:
        return {
            "status": "error",
            "method": "pywin32",
            "message": "Power BI Desktop window not found. Is it running?",
        }

    try:
        # Bring window to foreground
        win32gui.SetForegroundWindow(hwnd)

        # Send Ctrl+Shift+F5 (common refresh shortcut)
        VK_CONTROL = 0x11
        VK_SHIFT = 0x10
        VK_F5 = 0x74

        win32api.keybd_event(VK_CONTROL, 0, 0, 0)
        win32api.keybd_event(VK_SHIFT, 0, 0, 0)
        win32api.keybd_event(VK_F5, 0, 0, 0)
        win32api.keybd_event(VK_F5, 0, win32con.KEYEVENTF_KEYUP, 0)
        win32api.keybd_event(VK_SHIFT, 0, win32con.KEYEVENTF_KEYUP, 0)
        win32api.keybd_event(VK_CONTROL, 0, win32con.KEYEVENTF_KEYUP, 0)

        return {
            "status": "success",
            "method": "pywin32",
            "message": "Sent Ctrl+Shift+F5 to Power BI Desktop.",
        }
    except Exception as e:
        return {
            "status": "error",
            "method": "pywin32",
            "message": f"Failed to send keystrokes: {e}",
        }


def _find_pbi_window_pywin32() -> int:
    """Find Power BI Desktop's main window handle via pywin32."""
    import win32gui  # type: ignore[import-untyped]

    result = 0

    def callback(hwnd: int, _: Any) -> bool:
        nonlocal result
        if win32gui.IsWindowVisible(hwnd):
            title = win32gui.GetWindowText(hwnd)
            if "Power BI Desktop" in title:
                result = hwnd
                return False  # Stop enumeration
        return True

    try:
        win32gui.EnumWindows(callback, None)
    except Exception:
        pass

    return result


def _try_powershell() -> dict[str, Any] | None:
    """Try to use PowerShell to activate PBI Desktop and send keystrokes."""
    if sys.platform != "win32":
        return None

    ps_script = """
Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName Microsoft.VisualBasic

$pbi = Get-Process -Name "PBIDesktop" -ErrorAction SilentlyContinue | Select-Object -First 1
if (-not $pbi) {
    $pbi = Get-Process -Name "PBIDesktopStore" -ErrorAction SilentlyContinue | Select-Object -First 1
}

if (-not $pbi) {
    Write-Output "NOT_FOUND"
    exit 0
}

# Activate the window
[Microsoft.VisualBasic.Interaction]::AppActivate($pbi.Id)
Start-Sleep -Milliseconds 500

# Send Ctrl+Shift+F5
[System.Windows.Forms.SendKeys]::SendWait("^+{F5}")
Write-Output "OK"
"""

    try:
        proc = subprocess.run(
            ["powershell", "-NoProfile", "-NonInteractive", "-Command", ps_script],
            capture_output=True,
            text=True,
            timeout=10,
        )
        output = proc.stdout.strip()

        if output == "NOT_FOUND":
            return {
                "status": "error",
                "method": "powershell",
                "message": "Power BI Desktop process not found. Is it running?",
            }
        if output == "OK":
            return {
                "status": "success",
                "method": "powershell",
                "message": "Sent Ctrl+Shift+F5 to Power BI Desktop via PowerShell.",
            }

        return {
            "status": "error",
            "method": "powershell",
            "message": f"Unexpected output: {output}",
        }
    except FileNotFoundError:
        return None  # PowerShell not available
    except subprocess.TimeoutExpired:
        return {
            "status": "error",
            "method": "powershell",
            "message": "PowerShell command timed out.",
        }
    except Exception as e:
        return {
            "status": "error",
            "method": "powershell",
            "message": f"PowerShell error: {e}",
        }
326 src/pbi_cli/utils/desktop_sync.py Normal file
@@ -0,0 +1,326 @@
"""Close and reopen Power BI Desktop to sync PBIR file changes.

Power BI Desktop does not auto-detect PBIR file changes on disk.
When pbi-cli writes to report JSON files while Desktop has the .pbip
open, Desktop's in-memory state overwrites CLI changes on save.

This module uses a safe **save-first-then-rewrite** pattern:

1. Snapshot recently modified PBIR files (our changes)
2. Close Desktop WITH save (preserves user's unsaved modeling work)
3. Re-apply our PBIR snapshots (Desktop's save overwrote them)
4. Reopen Desktop with the .pbip file

This preserves both the user's in-progress Desktop work (measures,
relationships, etc.) AND our report-layer changes (filters, visuals, etc.).

Requires pywin32.
"""

from __future__ import annotations

import os
import subprocess
import time
from pathlib import Path
from typing import Any


def sync_desktop(
    pbip_hint: str | Path | None = None,
    definition_path: str | Path | None = None,
) -> dict[str, Any]:
    """Close Desktop (with save), re-apply PBIR changes, and reopen.

    *pbip_hint* narrows the search to a specific .pbip file.
    *definition_path* is the PBIR definition folder; recently modified
    files here are snapshotted before Desktop saves and restored after.
    """
    try:
        import win32con  # type: ignore[import-untyped]  # noqa: F401
        import win32gui  # type: ignore[import-untyped]  # noqa: F401
    except ImportError:
        return {
            "status": "manual",
            "method": "instructions",
            "message": (
                "pywin32 is not installed. Install with: pip install pywin32\n"
                "Then pbi-cli can auto-sync Desktop after report changes.\n"
                "For now: save in Desktop, close, reopen the .pbip file."
            ),
        }

    info = _find_desktop_process(pbip_hint)
    if info is None:
        return {
            "status": "skipped",
            "method": "pywin32",
            "message": "Power BI Desktop is not running. No sync needed.",
        }

    hwnd = info["hwnd"]
    pbip_path = info["pbip_path"]
    pid = info["pid"]

    # Step 1: Snapshot our PBIR changes (files modified in the last 5 seconds)
    snapshots = _snapshot_recent_changes(definition_path)

    # Step 2: Close Desktop WITH save (Enter = Save button)
    close_err = _close_with_save(hwnd, pid)
    if close_err is not None:
        return close_err

    # Step 3: Re-apply our PBIR changes (Desktop's save overwrote them)
    restored = _restore_snapshots(snapshots)

    # Step 4: Reopen
    reopen_result = _reopen_pbip(pbip_path)
    if restored:
        reopen_result["restored_files"] = restored
    return reopen_result


# ---------------------------------------------------------------------------
# Snapshot / Restore
# ---------------------------------------------------------------------------

def _snapshot_recent_changes(
    definition_path: str | Path | None,
    max_age_seconds: float = 5.0,
) -> dict[Path, bytes]:
    """Read files modified within *max_age_seconds* under *definition_path*."""
    if definition_path is None:
        return {}

    defn = Path(definition_path)
    if not defn.is_dir():
        return {}

    now = time.time()
    snapshots: dict[Path, bytes] = {}

    for fpath in defn.rglob("*.json"):
        try:
            age = now - fpath.stat().st_mtime
            if age <= max_age_seconds:
                snapshots[fpath] = fpath.read_bytes()
        except OSError:
            continue

    return snapshots


def _restore_snapshots(snapshots: dict[Path, bytes]) -> list[str]:
    """Write snapshotted file contents back to disk."""
    restored: list[str] = []
    for fpath, content in snapshots.items():
        try:
            fpath.write_bytes(content)
            restored.append(fpath.name)
        except OSError:
            continue
    return restored


# ---------------------------------------------------------------------------
# Desktop process discovery
# ---------------------------------------------------------------------------

def _find_desktop_process(
    pbip_hint: str | Path | None,
) -> dict[str, Any] | None:
    """Find the PBI Desktop window, its PID, and the .pbip file it has open."""
    import win32gui  # type: ignore[import-untyped]
    import win32process  # type: ignore[import-untyped]

    hint_stem = None
    if pbip_hint is not None:
        hint_stem = Path(pbip_hint).stem.lower()

    matches: list[dict[str, Any]] = []

    def callback(hwnd: int, _: Any) -> bool:
        if not win32gui.IsWindowVisible(hwnd):
            return True
        title = win32gui.GetWindowText(hwnd)
        if not title:
            return True

        _, pid = win32process.GetWindowThreadProcessId(hwnd)
        cmd_info = _get_process_info(pid)
        if cmd_info is None:
            return True

        exe_path = cmd_info.get("exe", "")
        if "pbidesktop" not in exe_path.lower():
            return True

        pbip_path = cmd_info.get("pbip")
        if pbip_path is None:
            return True

        if hint_stem is not None:
            if hint_stem not in Path(pbip_path).stem.lower():
                return True

        matches.append({
            "hwnd": hwnd,
            "pid": pid,
            "title": title,
            "exe_path": exe_path,
            "pbip_path": pbip_path,
        })
        return True

    try:
        win32gui.EnumWindows(callback, None)
    except Exception:
        pass

    return matches[0] if matches else None


def _get_process_info(pid: int) -> dict[str, str] | None:
    """Get exe path and .pbip file from a process command line via wmic."""
    try:
        out = subprocess.check_output(
            [
                "wmic", "process", "where", f"ProcessId={pid}",
                "get", "ExecutablePath,CommandLine", "/format:list",
            ],
            text=True,
            stderr=subprocess.DEVNULL,
            timeout=5,
        )
    except Exception:
        return None

    result: dict[str, str] = {}
    for line in out.strip().split("\n"):
        line = line.strip()
        if line.startswith("ExecutablePath="):
            result["exe"] = line[15:]
        elif line.startswith("CommandLine="):
            cmd = line[12:]
            for part in cmd.split('"'):
                part = part.strip()
                if part.lower().endswith(".pbip"):
                    result["pbip"] = part
                    break

    return result if "exe" in result else None


# ---------------------------------------------------------------------------
# Close with save
# ---------------------------------------------------------------------------

def _close_with_save(hwnd: int, pid: int) -> dict[str, Any] | None:
    """Close Desktop via WM_CLOSE and click Save in the dialog.

    Returns an error dict on failure, or None on success.
    """
    import win32con  # type: ignore[import-untyped]
    import win32gui  # type: ignore[import-untyped]

    win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)
    time.sleep(2)

    # Accept the save dialog (Enter = Save, which is the default button)
    _accept_save_dialog()

    # Wait for process to exit (up to 20 seconds -- saving can take time)
    for _ in range(40):
        if not _process_alive(pid):
            return None
        time.sleep(0.5)

    return {
        "status": "error",
        "method": "pywin32",
        "message": (
            "Power BI Desktop did not close within 20 seconds. "
            "Please save and close manually, then reopen the .pbip file."
        ),
    }


def _accept_save_dialog() -> None:
    """Find and accept the save dialog by pressing Enter (Save is default).

    After WM_CLOSE, Power BI Desktop shows a dialog:
        [Save] [Don't Save] [Cancel]
    'Save' is the default focused button, so Enter clicks it.
    """
    import win32gui  # type: ignore[import-untyped]

    dialog_found = False

    def callback(hwnd: int, _: Any) -> bool:
        nonlocal dialog_found
        if win32gui.IsWindowVisible(hwnd):
            title = win32gui.GetWindowText(hwnd)
            if title == "Microsoft Power BI Desktop":
                dialog_found = True
        return True

    try:
        win32gui.EnumWindows(callback, None)
    except Exception:
        pass

    if not dialog_found:
        return

    try:
        shell = _get_wscript_shell()
        activated = shell.AppActivate("Microsoft Power BI Desktop")
        if activated:
            time.sleep(0.3)
            # Enter = Save (the default button)
            shell.SendKeys("{ENTER}")
    except Exception:
        pass


# ---------------------------------------------------------------------------
# Reopen / utilities
# ---------------------------------------------------------------------------

def _reopen_pbip(pbip_path: str) -> dict[str, Any]:
    """Launch the .pbip file with the system default handler."""
    try:
        os.startfile(pbip_path)  # type: ignore[attr-defined]
        return {
            "status": "success",
            "method": "pywin32",
            "message": f"Desktop synced: {Path(pbip_path).name}",
            "file": pbip_path,
        }
    except Exception as e:
        return {
            "status": "error",
            "method": "pywin32",
            "message": f"Failed to reopen: {e}. Open manually: {pbip_path}",
        }


def _process_alive(pid: int) -> bool:
    """Check if a process is still running."""
    try:
        out = subprocess.check_output(
            ["tasklist", "/FI", f"PID eq {pid}", "/NH"],
            text=True,
            stderr=subprocess.DEVNULL,
            timeout=3,
        )
        return str(pid) in out
    except Exception:
        return False


def _get_wscript_shell() -> Any:
    """Get a WScript.Shell COM object for SendKeys."""
    import win32com.client  # type: ignore[import-untyped]

    return win32com.client.Dispatch("WScript.Shell")
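The snapshot/restore halves of the save-first-then-rewrite flow (steps 1 and 3) can be exercised without Desktop or pywin32, since they are pure file operations keyed on an mtime window. A minimal round-trip sketch of the same logic (the `snapshot_recent` and `restore` names are hypothetical stand-ins, not the module's private helpers):

```python
import tempfile
import time
from pathlib import Path

def snapshot_recent(defn: Path, max_age_seconds: float = 5.0) -> dict[Path, bytes]:
    # Capture the bytes of every *.json modified within the age window.
    now = time.time()
    out: dict[Path, bytes] = {}
    for p in defn.rglob("*.json"):
        if now - p.stat().st_mtime <= max_age_seconds:
            out[p] = p.read_bytes()
    return out

def restore(snapshots: dict[Path, bytes]) -> list[str]:
    # Write the captured bytes back; return the restored file names.
    for p, content in snapshots.items():
        p.write_bytes(content)
    return sorted(p.name for p in snapshots)

# Round-trip demo in a temporary folder, simulating the sync sequence.
with tempfile.TemporaryDirectory() as d:
    f = Path(d) / "visual.json"
    f.write_text('{"x": 1}', encoding="utf-8")  # our CLI change (just written)
    snaps = snapshot_recent(Path(d))            # step 1: snapshot recent changes
    f.write_text('{"x": 0}', encoding="utf-8")  # Desktop's save clobbers it
    restored = restore(snaps)                   # step 3: re-apply our snapshot
    final = f.read_text(encoding="utf-8")
```

The 5-second window is the module's heuristic for "files we just wrote"; anything the user touched earlier belongs to Desktop's save and is deliberately left alone.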
313 tests/test_bookmark_backend.py Normal file
@@ -0,0 +1,313 @@
"""Tests for pbi_cli.core.bookmark_backend."""

from __future__ import annotations

import json
from pathlib import Path

import pytest

from pbi_cli.core.bookmark_backend import (
    SCHEMA_BOOKMARK,
    SCHEMA_BOOKMARKS_METADATA,
    bookmark_add,
    bookmark_delete,
    bookmark_get,
    bookmark_list,
    bookmark_set_visibility,
)
from pbi_cli.core.errors import PbiCliError

# ---------------------------------------------------------------------------
# Fixtures
# ---------------------------------------------------------------------------


@pytest.fixture()
def definition_path(tmp_path: Path) -> Path:
    """Return a temporary PBIR definition folder."""
    defn = tmp_path / "MyReport.Report" / "definition"
    defn.mkdir(parents=True)
    return defn


# ---------------------------------------------------------------------------
# bookmark_list
# ---------------------------------------------------------------------------


def test_bookmark_list_no_bookmarks_dir(definition_path: Path) -> None:
    """bookmark_list returns [] when the bookmarks directory does not exist."""
    result = bookmark_list(definition_path)
    assert result == []


def test_bookmark_list_empty_items(definition_path: Path) -> None:
    """bookmark_list returns [] when bookmarks.json has an empty items list."""
    bm_dir = definition_path / "bookmarks"
    bm_dir.mkdir()
    index = {"$schema": SCHEMA_BOOKMARKS_METADATA, "items": []}
    (bm_dir / "bookmarks.json").write_text(json.dumps(index), encoding="utf-8")

    result = bookmark_list(definition_path)
    assert result == []


# ---------------------------------------------------------------------------
# bookmark_add
# ---------------------------------------------------------------------------


def test_bookmark_add_creates_directory(definition_path: Path) -> None:
    """bookmark_add creates the bookmarks/ directory when it does not exist."""
    bookmark_add(definition_path, "Q1 View", "page_abc")

    assert (definition_path / "bookmarks").is_dir()


def test_bookmark_add_creates_index_file(definition_path: Path) -> None:
    """bookmark_add creates bookmarks.json with the correct schema."""
    bookmark_add(definition_path, "Q1 View", "page_abc")

    index_file = definition_path / "bookmarks" / "bookmarks.json"
    assert index_file.exists()
    data = json.loads(index_file.read_text(encoding="utf-8"))
    assert data["$schema"] == SCHEMA_BOOKMARKS_METADATA


def test_bookmark_add_returns_status_dict(definition_path: Path) -> None:
    """bookmark_add returns a status dict with the expected keys and values."""
    result = bookmark_add(definition_path, "Q1 View", "page_abc", name="mybookmark")

    assert result["status"] == "created"
    assert result["name"] == "mybookmark"
    assert result["display_name"] == "Q1 View"
    assert result["target_page"] == "page_abc"


def test_bookmark_add_writes_individual_file(definition_path: Path) -> None:
    """bookmark_add writes a .bookmark.json file with the correct structure."""
    bookmark_add(definition_path, "Sales View", "page_sales", name="bm_sales")

    bm_file = definition_path / "bookmarks" / "bm_sales.bookmark.json"
    assert bm_file.exists()
    data = json.loads(bm_file.read_text(encoding="utf-8"))
    assert data["$schema"] == SCHEMA_BOOKMARK
    assert data["name"] == "bm_sales"
    assert data["displayName"] == "Sales View"
    assert data["explorationState"]["activeSection"] == "page_sales"
    assert data["explorationState"]["version"] == "1.3"


def test_bookmark_add_auto_generates_20char_name(definition_path: Path) -> None:
    """bookmark_add generates a 20-character hex name when no name is given."""
    result = bookmark_add(definition_path, "Auto Name", "page_xyz")

    assert len(result["name"]) == 20
    assert all(c in "0123456789abcdef" for c in result["name"])


def test_bookmark_add_uses_explicit_name(definition_path: Path) -> None:
    """bookmark_add uses the caller-supplied name."""
    result = bookmark_add(definition_path, "Named", "page_x", name="custom_id")

    assert result["name"] == "custom_id"


# ---------------------------------------------------------------------------
# bookmark_list after add
# ---------------------------------------------------------------------------


def test_bookmark_list_after_add_returns_one(definition_path: Path) -> None:
    """bookmark_list returns exactly one entry after a single bookmark_add."""
    bookmark_add(definition_path, "Q1 View", "page_q1", name="bm01")

    results = bookmark_list(definition_path)
    assert len(results) == 1
    assert results[0]["name"] == "bm01"
    assert results[0]["display_name"] == "Q1 View"
    assert results[0]["active_section"] == "page_q1"


def test_bookmark_list_after_two_adds_returns_two(definition_path: Path) -> None:
    """bookmark_list returns two entries after two bookmark_add calls."""
    bookmark_add(definition_path, "View A", "page_a", name="bm_a")
    bookmark_add(definition_path, "View B", "page_b", name="bm_b")

    results = bookmark_list(definition_path)
    assert len(results) == 2
    names = {r["name"] for r in results}
    assert names == {"bm_a", "bm_b"}


# ---------------------------------------------------------------------------
# bookmark_get
# ---------------------------------------------------------------------------


def test_bookmark_get_returns_full_data(definition_path: Path) -> None:
    """bookmark_get returns the complete bookmark JSON dict."""
    bookmark_add(definition_path, "Full View", "page_full", name="bm_full")

    data = bookmark_get(definition_path, "bm_full")
    assert data["name"] == "bm_full"
    assert data["displayName"] == "Full View"
    assert "$schema" in data


def test_bookmark_get_raises_for_unknown_name(definition_path: Path) -> None:
    """bookmark_get raises PbiCliError when the bookmark name does not exist."""
    with pytest.raises(PbiCliError, match="not found"):
        bookmark_get(definition_path, "nonexistent_bm")


# ---------------------------------------------------------------------------
# bookmark_delete
# ---------------------------------------------------------------------------


def test_bookmark_delete_removes_file(definition_path: Path) -> None:
    """bookmark_delete removes the .bookmark.json file from disk."""
    bookmark_add(definition_path, "Temp", "page_temp", name="bm_temp")
    bm_file = definition_path / "bookmarks" / "bm_temp.bookmark.json"
    assert bm_file.exists()

    bookmark_delete(definition_path, "bm_temp")

    assert not bm_file.exists()


def test_bookmark_delete_removes_from_index(definition_path: Path) -> None:
    """bookmark_delete removes the name from the bookmarks.json items list."""
    bookmark_add(definition_path, "Temp", "page_temp", name="bm_del")
    bookmark_delete(definition_path, "bm_del")

    index_file = definition_path / "bookmarks" / "bookmarks.json"
    index = json.loads(index_file.read_text(encoding="utf-8"))
    names_in_index = [i.get("name") for i in index.get("items", [])]
    assert "bm_del" not in names_in_index


def test_bookmark_delete_raises_for_unknown_name(definition_path: Path) -> None:
    """bookmark_delete raises PbiCliError when the bookmark does not exist."""
    with pytest.raises(PbiCliError, match="not found"):
        bookmark_delete(definition_path, "no_such_bookmark")


def test_bookmark_list_after_delete_returns_n_minus_one(definition_path: Path) -> None:
    """bookmark_list returns one fewer item after a delete."""
    bookmark_add(definition_path, "Keep", "page_keep", name="bm_keep")
    bookmark_add(definition_path, "Remove", "page_remove", name="bm_remove")

    bookmark_delete(definition_path, "bm_remove")

    results = bookmark_list(definition_path)
    assert len(results) == 1
    assert results[0]["name"] == "bm_keep"


# ---------------------------------------------------------------------------
# bookmark_set_visibility
# ---------------------------------------------------------------------------


def test_bookmark_set_visibility_hide_sets_display_mode(definition_path: Path) -> None:
    """set_visibility with hidden=True writes display.mode='hidden' on singleVisual."""
    bookmark_add(definition_path, "Hide Test", "page_a", name="bm_hide")

    result = bookmark_set_visibility(
        definition_path, "bm_hide", "page_a", "visual_x", hidden=True
    )

    assert result["status"] == "updated"
    assert result["hidden"] is True

    bm_file = definition_path / "bookmarks" / "bm_hide.bookmark.json"
    data = json.loads(bm_file.read_text(encoding="utf-8"))
    single = (
        data["explorationState"]["sections"]["page_a"]
        ["visualContainers"]["visual_x"]["singleVisual"]
    )
    assert single["display"] == {"mode": "hidden"}


def test_bookmark_set_visibility_show_removes_display_key(definition_path: Path) -> None:
    """set_visibility with hidden=False removes the display key from singleVisual."""
    bookmark_add(definition_path, "Show Test", "page_b", name="bm_show")

    # First hide it, then show it
    bookmark_set_visibility(
        definition_path, "bm_show", "page_b", "visual_y", hidden=True
    )
    bookmark_set_visibility(
        definition_path, "bm_show", "page_b", "visual_y", hidden=False
    )

    bm_file = definition_path / "bookmarks" / "bm_show.bookmark.json"
    data = json.loads(bm_file.read_text(encoding="utf-8"))
    single = (
        data["explorationState"]["sections"]["page_b"]
        ["visualContainers"]["visual_y"]["singleVisual"]
    )
    assert "display" not in single


def test_bookmark_set_visibility_creates_path_if_absent(definition_path: Path) -> None:
    """set_visibility creates the sections/visualContainers path if not present."""
    bookmark_add(definition_path, "New Path", "page_c", name="bm_newpath")

    # The bookmark was created without any sections; the function should create them.
    bookmark_set_visibility(
        definition_path, "bm_newpath", "page_c", "visual_z", hidden=True
    )

    bm_file = definition_path / "bookmarks" / "bm_newpath.bookmark.json"
    data = json.loads(bm_file.read_text(encoding="utf-8"))
    assert "page_c" in data["explorationState"]["sections"]
    assert "visual_z" in (
        data["explorationState"]["sections"]["page_c"]["visualContainers"]
    )


def test_bookmark_set_visibility_preserves_existing_single_visual_keys(
    definition_path: Path,
) -> None:
    """set_visibility keeps existing singleVisual keys (e.g. visualType)."""
    bookmark_add(definition_path, "Preserve", "page_d", name="bm_preserve")

    # Pre-populate a singleVisual with a visualType key via set_visibility helper
    bookmark_set_visibility(
        definition_path, "bm_preserve", "page_d", "visual_w", hidden=False
    )

    # Manually inject a visualType into the singleVisual
    bm_file = definition_path / "bookmarks" / "bm_preserve.bookmark.json"
    raw = json.loads(bm_file.read_text(encoding="utf-8"))
    raw["explorationState"]["sections"]["page_d"]["visualContainers"]["visual_w"][
        "singleVisual"
    ]["visualType"] = "barChart"
    bm_file.write_text(json.dumps(raw, indent=2), encoding="utf-8")

    # Now hide and verify visualType is retained
    bookmark_set_visibility(
        definition_path, "bm_preserve", "page_d", "visual_w", hidden=True
    )

    updated = json.loads(bm_file.read_text(encoding="utf-8"))
    single = (
        updated["explorationState"]["sections"]["page_d"]
        ["visualContainers"]["visual_w"]["singleVisual"]
    )
    assert single["visualType"] == "barChart"
    assert single["display"] == {"mode": "hidden"}


def test_bookmark_set_visibility_raises_for_unknown_bookmark(
    definition_path: Path,
) -> None:
    """set_visibility raises PbiCliError when the bookmark does not exist."""
    with pytest.raises(PbiCliError, match="not found"):
        bookmark_set_visibility(
            definition_path, "nonexistent", "page_x", "visual_x", hidden=True
        )
317
tests/test_bulk_backend.py
Normal file
317
tests/test_bulk_backend.py
Normal file
|
|
@ -0,0 +1,317 @@
"""Tests for pbi_cli.core.bulk_backend.

All tests use a minimal in-memory PBIR directory tree with a single page
containing 5 visuals of mixed types.
"""

from __future__ import annotations

import json
from pathlib import Path
from typing import Any

import pytest

from pbi_cli.core.bulk_backend import (
    visual_bulk_bind,
    visual_bulk_delete,
    visual_bulk_update,
    visual_where,
)
from pbi_cli.core.visual_backend import visual_add, visual_get

# ---------------------------------------------------------------------------
# Fixtures
# ---------------------------------------------------------------------------


def _write_json(path: Path, data: dict[str, Any]) -> None:
    path.write_text(json.dumps(data, indent=2), encoding="utf-8")


@pytest.fixture
def multi_visual_page(tmp_path: Path) -> Path:
    """PBIR definition folder with one page containing 5 visuals.

    Layout:
      - BarChart_1  barChart  x=0    y=0
      - BarChart_2  barChart  x=440  y=0
      - BarChart_3  barChart  x=880  y=0
      - Card_1      card      x=0    y=320
      - KPI_1       kpi       x=440  y=320

    Returns the ``definition/`` path.
    """
    definition = tmp_path / "definition"
    definition.mkdir()

    _write_json(
        definition / "version.json",
        {"version": "2.0.0"},
    )
    _write_json(
        definition / "report.json",
        {
            "$schema": "...",
            "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
            "layoutOptimization": "None",
        },
    )

    pages_dir = definition / "pages"
    pages_dir.mkdir()
    _write_json(
        pages_dir / "pages.json",
        {"pageOrder": ["test_page"], "activePageName": "test_page"},
    )

    page_dir = pages_dir / "test_page"
    page_dir.mkdir()
    _write_json(
        page_dir / "page.json",
        {
            "name": "test_page",
            "displayName": "Test Page",
            "displayOption": "FitToPage",
            "width": 1280,
            "height": 720,
            "ordinal": 0,
        },
    )

    visuals_dir = page_dir / "visuals"
    visuals_dir.mkdir()

    # Add 5 visuals via visual_add to get realistic JSON files
    visual_add(definition, "test_page", "bar", name="BarChart_1", x=0, y=0, width=400, height=300)
    visual_add(definition, "test_page", "bar", name="BarChart_2", x=440, y=0, width=400, height=300)
    visual_add(definition, "test_page", "bar", name="BarChart_3", x=880, y=0, width=400, height=300)
    visual_add(definition, "test_page", "card", name="Card_1", x=0, y=320, width=200, height=120)
    visual_add(definition, "test_page", "kpi", name="KPI_1", x=440, y=320, width=250, height=150)

    return definition


# ---------------------------------------------------------------------------
# visual_where -- filter by type
# ---------------------------------------------------------------------------


def test_where_by_type_returns_correct_subset(multi_visual_page: Path) -> None:
    """visual_where with visual_type='barChart' returns only the 3 bar charts."""
    result = visual_where(multi_visual_page, "test_page", visual_type="barChart")

    assert len(result) == 3
    assert all(v["visual_type"] == "barChart" for v in result)


def test_where_by_alias_resolves_type(multi_visual_page: Path) -> None:
    """visual_where accepts user-friendly alias 'bar' and resolves it."""
    result = visual_where(multi_visual_page, "test_page", visual_type="bar")

    assert len(result) == 3


def test_where_by_type_card(multi_visual_page: Path) -> None:
    result = visual_where(multi_visual_page, "test_page", visual_type="card")

    assert len(result) == 1
    assert result[0]["name"] == "Card_1"


def test_where_no_filter_returns_all(multi_visual_page: Path) -> None:
    """visual_where with no filters returns all 5 visuals."""
    result = visual_where(multi_visual_page, "test_page")

    assert len(result) == 5


def test_where_by_name_pattern(multi_visual_page: Path) -> None:
    """visual_where with name_pattern='BarChart_*' returns 3 matching visuals."""
    result = visual_where(multi_visual_page, "test_page", name_pattern="BarChart_*")

    assert len(result) == 3
    assert all(v["name"].startswith("BarChart_") for v in result)


def test_where_by_x_max(multi_visual_page: Path) -> None:
    """visual_where with x_max=400 returns visuals at x=0 only (left column)."""
    result = visual_where(multi_visual_page, "test_page", x_max=400)

    names = {v["name"] for v in result}
    assert "BarChart_1" in names
    assert "Card_1" in names
    assert "BarChart_3" not in names


def test_where_by_y_min(multi_visual_page: Path) -> None:
    """visual_where with y_min=300 returns only visuals below y=300."""
    result = visual_where(multi_visual_page, "test_page", y_min=300)

    names = {v["name"] for v in result}
    assert "Card_1" in names
    assert "KPI_1" in names
    assert "BarChart_1" not in names


def test_where_type_and_position_combined(multi_visual_page: Path) -> None:
    """Combining type and x_max narrows results correctly."""
    result = visual_where(
        multi_visual_page, "test_page", visual_type="barChart", x_max=400
    )

    assert len(result) == 1
    assert result[0]["name"] == "BarChart_1"


def test_where_nonexistent_type_returns_empty(multi_visual_page: Path) -> None:
    result = visual_where(multi_visual_page, "test_page", visual_type="lineChart")

    assert result == []


# ---------------------------------------------------------------------------
# visual_bulk_bind
# ---------------------------------------------------------------------------


def test_bulk_bind_applies_to_all_matching(multi_visual_page: Path) -> None:
    """visual_bulk_bind applies bindings to all 3 bar charts."""
    result = visual_bulk_bind(
        multi_visual_page,
        "test_page",
        visual_type="barChart",
        bindings=[{"role": "category", "field": "Date[Month]"}],
    )

    assert result["bound"] == 3
    assert set(result["visuals"]) == {"BarChart_1", "BarChart_2", "BarChart_3"}

    # Verify the projection was written to each visual
    for name in result["visuals"]:
        vfile = multi_visual_page / "pages" / "test_page" / "visuals" / name / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        projections = data["visual"]["query"]["queryState"]["Category"]["projections"]
        assert len(projections) == 1
        assert projections[0]["nativeQueryRef"] == "Month"


def test_bulk_bind_with_name_pattern(multi_visual_page: Path) -> None:
    """visual_bulk_bind with name_pattern restricts to matching visuals only."""
    result = visual_bulk_bind(
        multi_visual_page,
        "test_page",
        visual_type="barChart",
        bindings=[{"role": "value", "field": "Sales[Revenue]"}],
        name_pattern="BarChart_1",
    )

    assert result["bound"] == 1
    assert result["visuals"] == ["BarChart_1"]


def test_bulk_bind_returns_zero_when_no_match(multi_visual_page: Path) -> None:
    result = visual_bulk_bind(
        multi_visual_page,
        "test_page",
        visual_type="lineChart",
        bindings=[{"role": "value", "field": "Sales[Revenue]"}],
    )

    assert result["bound"] == 0
    assert result["visuals"] == []


# ---------------------------------------------------------------------------
# visual_bulk_update
# ---------------------------------------------------------------------------


def test_bulk_update_sets_height_for_all_matching(multi_visual_page: Path) -> None:
    """visual_bulk_update resizes all bar charts."""
    result = visual_bulk_update(
        multi_visual_page,
        "test_page",
        where_type="barChart",
        set_height=250,
    )

    assert result["updated"] == 3

    for name in result["visuals"]:
        info = visual_get(multi_visual_page, "test_page", name)
        assert info["height"] == 250


def test_bulk_update_hides_by_name_pattern(multi_visual_page: Path) -> None:
    result = visual_bulk_update(
        multi_visual_page,
        "test_page",
        where_name_pattern="BarChart_*",
        set_hidden=True,
    )

    assert result["updated"] == 3
    for name in result["visuals"]:
        info = visual_get(multi_visual_page, "test_page", name)
        assert info["is_hidden"] is True


def test_bulk_update_requires_at_least_one_setter(multi_visual_page: Path) -> None:
    """visual_bulk_update raises ValueError when no set_* arg provided."""
    with pytest.raises(ValueError, match="At least one set_"):
        visual_bulk_update(
            multi_visual_page,
            "test_page",
            where_type="barChart",
        )


# ---------------------------------------------------------------------------
# visual_bulk_delete
# ---------------------------------------------------------------------------


def test_bulk_delete_removes_matching_visuals(multi_visual_page: Path) -> None:
    result = visual_bulk_delete(
        multi_visual_page,
        "test_page",
        where_type="barChart",
    )

    assert result["deleted"] == 3
    assert set(result["visuals"]) == {"BarChart_1", "BarChart_2", "BarChart_3"}

    # Confirm folders are gone
    visuals_dir = multi_visual_page / "pages" / "test_page" / "visuals"
    remaining = {d.name for d in visuals_dir.iterdir() if d.is_dir()}
    assert "BarChart_1" not in remaining
    assert "Card_1" in remaining
    assert "KPI_1" in remaining


def test_bulk_delete_by_name_pattern(multi_visual_page: Path) -> None:
    result = visual_bulk_delete(
        multi_visual_page,
        "test_page",
        where_name_pattern="BarChart_*",
    )

    assert result["deleted"] == 3


def test_bulk_delete_requires_filter(multi_visual_page: Path) -> None:
    """visual_bulk_delete raises ValueError when no filter given."""
    with pytest.raises(ValueError, match="Provide at least"):
        visual_bulk_delete(multi_visual_page, "test_page")


def test_bulk_delete_returns_zero_when_no_match(multi_visual_page: Path) -> None:
    result = visual_bulk_delete(
        multi_visual_page,
        "test_page",
        where_type="lineChart",
    )

    assert result["deleted"] == 0
    assert result["visuals"] == []
563
tests/test_filter_backend.py
Normal file
@@ -0,0 +1,563 @@
|
|||
"""Tests for pbi_cli.core.filter_backend.
|
||||
|
||||
Covers filter_list, filter_add_categorical, filter_remove, and filter_clear
|
||||
for both page-level and visual-level scopes.
|
||||
|
||||
A ``sample_report`` fixture builds a minimal valid PBIR folder in tmp_path.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
import pytest
|
||||
|
||||
from pbi_cli.core.errors import PbiCliError
|
||||
from pbi_cli.core.filter_backend import (
|
||||
filter_add_categorical,
|
||||
filter_add_relative_date,
|
||||
filter_add_topn,
|
||||
filter_clear,
|
||||
filter_list,
|
||||
filter_remove,
|
||||
)
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Schema constants used only for fixture JSON
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
_SCHEMA_PAGE = (
|
||||
"https://developer.microsoft.com/json-schemas/"
|
||||
"fabric/item/report/definition/page/1.0.0/schema.json"
|
||||
)
|
||||
_SCHEMA_VISUAL_CONTAINER = (
|
||||
"https://developer.microsoft.com/json-schemas/"
|
||||
"fabric/item/report/definition/visualContainer/2.7.0/schema.json"
|
||||
)
|
||||
_SCHEMA_VISUAL_CONFIG = (
|
||||
"https://developer.microsoft.com/json-schemas/"
|
||||
"fabric/item/report/definition/visualConfiguration/2.3.0/schema.json"
|
||||
)
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _write(path: Path, data: dict[str, Any]) -> None:
|
||||
path.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")
|
||||
|
||||
|
||||
def _read(path: Path) -> dict[str, Any]:
|
||||
return json.loads(path.read_text(encoding="utf-8")) # type: ignore[return-value]
|
||||
|
||||
|
||||
def _make_page(definition_path: Path, page_name: str, display_name: str = "Overview") -> Path:
|
||||
"""Create a minimal page folder and return the page dir."""
|
||||
page_dir = definition_path / "pages" / page_name
|
||||
page_dir.mkdir(parents=True, exist_ok=True)
|
||||
_write(page_dir / "page.json", {
|
||||
"$schema": _SCHEMA_PAGE,
|
||||
"name": page_name,
|
||||
"displayName": display_name,
|
||||
"displayOption": "FitToPage",
|
||||
"width": 1280,
|
||||
"height": 720,
|
||||
"ordinal": 0,
|
||||
})
|
||||
return page_dir
|
||||
|
||||
|
||||
def _make_visual(page_dir: Path, visual_name: str) -> Path:
|
||||
"""Create a minimal visual folder and return the visual dir."""
|
||||
visual_dir = page_dir / "visuals" / visual_name
|
||||
visual_dir.mkdir(parents=True, exist_ok=True)
|
||||
_write(visual_dir / "visual.json", {
|
||||
"$schema": _SCHEMA_VISUAL_CONTAINER,
|
||||
"name": visual_name,
|
||||
"position": {"x": 0, "y": 0, "width": 400, "height": 300, "z": 0, "tabOrder": 0},
|
||||
"visual": {
|
||||
"$schema": _SCHEMA_VISUAL_CONFIG,
|
||||
"visualType": "barChart",
|
||||
"query": {"queryState": {}},
|
||||
"objects": {},
|
||||
},
|
||||
})
|
||||
return visual_dir
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Fixtures
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def definition_path(tmp_path: Path) -> Path:
|
||||
"""Return a minimal PBIR definition folder with one page and one visual."""
|
||||
defn = tmp_path / "MyReport.Report" / "definition"
|
||||
defn.mkdir(parents=True)
|
||||
_write(defn / "report.json", {"name": "MyReport"})
|
||||
page_dir = _make_page(defn, "page_overview", "Overview")
|
||||
_make_visual(page_dir, "visual_abc123")
|
||||
return defn
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_list
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_list_empty_page(definition_path: Path) -> None:
|
||||
"""filter_list returns empty list when no filterConfig exists on a page."""
|
||||
result = filter_list(definition_path, "page_overview")
|
||||
assert result == []
|
||||
|
||||
|
||||
def test_filter_list_empty_visual(definition_path: Path) -> None:
|
||||
"""filter_list returns empty list when no filterConfig exists on a visual."""
|
||||
result = filter_list(definition_path, "page_overview", visual_name="visual_abc123")
|
||||
assert result == []
|
||||
|
||||
|
||||
def test_filter_list_with_filters(definition_path: Path) -> None:
|
||||
"""filter_list returns the filters after one is added."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Region", ["North", "South"]
|
||||
)
|
||||
result = filter_list(definition_path, "page_overview")
|
||||
assert len(result) == 1
|
||||
assert result[0]["type"] == "Categorical"
|
||||
|
||||
|
||||
def test_filter_list_missing_file(definition_path: Path) -> None:
|
||||
"""filter_list raises PbiCliError when the page does not exist."""
|
||||
with pytest.raises(PbiCliError):
|
||||
filter_list(definition_path, "nonexistent_page")
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_add_categorical (page scope)
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_add_categorical_page_returns_status(definition_path: Path) -> None:
|
||||
"""filter_add_categorical returns the expected status dict."""
|
||||
result = filter_add_categorical(
|
||||
definition_path, "page_overview", "financials", "Country", ["Canada", "France"]
|
||||
)
|
||||
assert result["status"] == "added"
|
||||
assert result["type"] == "Categorical"
|
||||
assert result["scope"] == "page"
|
||||
assert "name" in result
|
||||
|
||||
|
||||
def test_filter_add_categorical_page_persisted(definition_path: Path) -> None:
|
||||
"""Added filter appears in the page.json file with correct structure."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "financials", "Country", ["Canada", "France"]
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
data = _read(page_json)
|
||||
filters = data["filterConfig"]["filters"]
|
||||
assert len(filters) == 1
|
||||
f = filters[0]
|
||||
assert f["type"] == "Categorical"
|
||||
assert f["howCreated"] == "User"
|
||||
|
||||
|
||||
def test_filter_add_categorical_json_structure(definition_path: Path) -> None:
|
||||
"""The filter body has correct Version, From, and Where structure."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "financials", "Country", ["Canada", "France"]
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
f = _read(page_json)["filterConfig"]["filters"][0]
|
||||
|
||||
assert f["filter"]["Version"] == 2
|
||||
|
||||
from_entries = f["filter"]["From"]
|
||||
assert len(from_entries) == 1
|
||||
assert from_entries[0]["Name"] == "f"
|
||||
assert from_entries[0]["Entity"] == "financials"
|
||||
assert from_entries[0]["Type"] == 0
|
||||
|
||||
where = f["filter"]["Where"]
|
||||
assert len(where) == 1
|
||||
in_clause = where[0]["Condition"]["In"]
|
||||
assert len(in_clause["Values"]) == 2
|
||||
assert in_clause["Values"][0][0]["Literal"]["Value"] == "'Canada'"
|
||||
assert in_clause["Values"][1][0]["Literal"]["Value"] == "'France'"
|
||||
|
||||
|
||||
def test_filter_add_categorical_alias_from_table_name(definition_path: Path) -> None:
|
||||
"""Source alias uses the first character of the table name, lowercased."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Product", ["Widget"]
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
f = _read(page_json)["filterConfig"]["filters"][0]
|
||||
alias = f["filter"]["From"][0]["Name"]
|
||||
assert alias == "s"
|
||||
source_ref = f["filter"]["Where"][0]["Condition"]["In"]["Expressions"][0]
|
||||
assert source_ref["Column"]["Expression"]["SourceRef"]["Source"] == "s"
|
||||
|
||||
|
||||
def test_filter_add_categorical_custom_name(definition_path: Path) -> None:
|
||||
"""filter_add_categorical uses the provided name when given."""
|
||||
result = filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Region", ["East"], name="myfilter123"
|
||||
)
|
||||
assert result["name"] == "myfilter123"
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
f = _read(page_json)["filterConfig"]["filters"][0]
|
||||
assert f["name"] == "myfilter123"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_add_categorical (visual scope)
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_add_categorical_visual_scope(definition_path: Path) -> None:
|
||||
"""filter_add_categorical adds a visual filter with scope='visual' and no howCreated."""
|
||||
result = filter_add_categorical(
|
||||
definition_path,
|
||||
"page_overview",
|
||||
"financials",
|
||||
"Segment",
|
||||
["SMB"],
|
||||
visual_name="visual_abc123",
|
||||
)
|
||||
assert result["scope"] == "visual"
|
||||
|
||||
visual_json = (
|
||||
definition_path / "pages" / "page_overview" / "visuals" / "visual_abc123" / "visual.json"
|
||||
)
|
||||
f = _read(visual_json)["filterConfig"]["filters"][0]
|
||||
assert "howCreated" not in f
|
||||
assert f["type"] == "Categorical"
|
||||
|
||||
|
||||
def test_filter_list_visual_after_add(definition_path: Path) -> None:
|
||||
"""filter_list on a visual returns the added filter."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Year", ["2024"],
|
||||
visual_name="visual_abc123",
|
||||
)
|
||||
result = filter_list(definition_path, "page_overview", visual_name="visual_abc123")
|
||||
assert len(result) == 1
|
||||
assert result[0]["type"] == "Categorical"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_remove
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_remove_removes_by_name(definition_path: Path) -> None:
|
||||
"""filter_remove deletes the correct filter and leaves others intact."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Region", ["East"], name="filter_a"
|
||||
)
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Product", ["Widget"], name="filter_b"
|
||||
)
|
||||
result = filter_remove(definition_path, "page_overview", "filter_a")
|
||||
assert result == {"status": "removed", "name": "filter_a"}
|
||||
|
||||
remaining = filter_list(definition_path, "page_overview")
|
||||
assert len(remaining) == 1
|
||||
assert remaining[0]["name"] == "filter_b"
|
||||
|
||||
|
||||
def test_filter_remove_raises_for_unknown_name(definition_path: Path) -> None:
|
||||
"""filter_remove raises PbiCliError when the filter name does not exist."""
|
||||
with pytest.raises(PbiCliError, match="not found"):
|
||||
filter_remove(definition_path, "page_overview", "does_not_exist")
|
||||
|
||||
|
||||
def test_filter_remove_visual(definition_path: Path) -> None:
|
||||
"""filter_remove works on visual-level filters."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Year", ["2024"],
|
||||
visual_name="visual_abc123", name="vis_filter_x",
|
||||
)
|
||||
result = filter_remove(
|
||||
definition_path, "page_overview", "vis_filter_x", visual_name="visual_abc123"
|
||||
)
|
||||
assert result["status"] == "removed"
|
||||
assert filter_list(definition_path, "page_overview", visual_name="visual_abc123") == []
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_clear
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_clear_removes_all(definition_path: Path) -> None:
|
||||
"""filter_clear removes every filter and returns the correct count."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Region", ["East"], name="f1"
|
||||
)
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Product", ["Widget"], name="f2"
|
||||
)
|
||||
result = filter_clear(definition_path, "page_overview")
|
||||
assert result == {"status": "cleared", "removed": 2, "scope": "page"}
|
||||
assert filter_list(definition_path, "page_overview") == []
|
||||
|
||||
|
||||
def test_filter_clear_empty_page(definition_path: Path) -> None:
|
||||
"""filter_clear on a page with no filters returns removed=0."""
|
||||
result = filter_clear(definition_path, "page_overview")
|
||||
assert result["removed"] == 0
|
||||
assert result["scope"] == "page"
|
||||
|
||||
|
||||
def test_filter_clear_visual_scope(definition_path: Path) -> None:
|
||||
"""filter_clear on a visual uses scope='visual'."""
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Year", ["2024"],
|
||||
visual_name="visual_abc123",
|
||||
)
|
||||
result = filter_clear(definition_path, "page_overview", visual_name="visual_abc123")
|
||||
assert result["scope"] == "visual"
|
||||
assert result["removed"] == 1
|
||||
assert filter_list(definition_path, "page_overview", visual_name="visual_abc123") == []
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Edge cases
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_list_no_filter_config_key(definition_path: Path) -> None:
|
||||
"""filter_list gracefully returns [] when filterConfig key is absent."""
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
data = _read(page_json)
|
||||
data.pop("filterConfig", None)
|
||||
_write(page_json, data)
|
||||
assert filter_list(definition_path, "page_overview") == []
|
||||
|
||||
|
||||
def test_multiple_adds_accumulate(definition_path: Path) -> None:
|
||||
"""Each call to filter_add_categorical appends rather than replaces."""
|
||||
for i in range(3):
|
||||
filter_add_categorical(
|
||||
definition_path, "page_overview", "Sales", "Region", [f"Region{i}"],
|
||||
name=f"filter_{i}",
|
||||
)
|
||||
result = filter_list(definition_path, "page_overview")
|
||||
assert len(result) == 3
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_add_topn
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_add_topn_returns_status(definition_path: Path) -> None:
|
||||
"""filter_add_topn returns the expected status dict."""
|
||||
result = filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=3, order_by_table="financials", order_by_column="Sales",
|
||||
)
|
||||
assert result["status"] == "added"
|
||||
assert result["type"] == "TopN"
|
||||
assert result["scope"] == "page"
|
||||
assert result["n"] == 3
|
||||
assert result["direction"] == "Top"
|
||||
assert "name" in result
|
||||
|
||||
|
||||
def test_filter_add_topn_persisted(definition_path: Path) -> None:
|
||||
"""filter_add_topn writes a TopN filter entry to page.json."""
|
||||
filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=3, order_by_table="financials", order_by_column="Sales",
|
||||
name="topn_test",
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
data = _read(page_json)
|
||||
filters = data["filterConfig"]["filters"]
|
||||
assert len(filters) == 1
|
||||
f = filters[0]
|
||||
assert f["type"] == "TopN"
|
||||
assert f["name"] == "topn_test"
|
||||
assert f["howCreated"] == "User"
|
||||
|
||||
|
||||
def test_filter_add_topn_subquery_structure(definition_path: Path) -> None:
|
||||
"""The TopN filter has the correct Subquery/From/Where structure."""
|
||||
filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=5, order_by_table="financials", order_by_column="Sales",
|
||||
name="topn_struct",
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
f = _read(page_json)["filterConfig"]["filters"][0]["filter"]
|
||||
|
||||
assert f["Version"] == 2
|
||||
assert len(f["From"]) == 2
|
||||
|
||||
subquery_entry = f["From"][0]
|
||||
assert subquery_entry["Name"] == "subquery"
|
||||
assert subquery_entry["Type"] == 2
|
||||
query = subquery_entry["Expression"]["Subquery"]["Query"]
|
||||
assert query["Top"] == 5
|
||||
|
||||
where = f["Where"][0]["Condition"]["In"]
|
||||
assert "Table" in where
|
||||
assert where["Table"]["SourceRef"]["Source"] == "subquery"
|
||||
|
||||
|
||||
def test_filter_add_topn_direction_bottom(definition_path: Path) -> None:
|
||||
"""direction='Bottom' produces PBI Direction=1 in the OrderBy."""
|
||||
filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=3, order_by_table="financials", order_by_column="Profit",
|
||||
direction="Bottom", name="topn_bottom",
|
||||
)
|
||||
page_json = definition_path / "pages" / "page_overview" / "page.json"
|
||||
f = _read(page_json)["filterConfig"]["filters"][0]["filter"]
|
||||
query = f["From"][0]["Expression"]["Subquery"]["Query"]
|
||||
assert query["OrderBy"][0]["Direction"] == 1
|
||||
|
||||
|
||||
def test_filter_add_topn_invalid_direction(definition_path: Path) -> None:
|
||||
"""filter_add_topn raises PbiCliError for an unknown direction."""
|
||||
with pytest.raises(PbiCliError):
|
||||
filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=3, order_by_table="financials", order_by_column="Sales",
|
||||
direction="Middle",
|
||||
)
|
||||
|
||||
|
||||
def test_filter_add_topn_visual_scope(definition_path: Path) -> None:
|
||||
"""filter_add_topn adds a visual filter with scope='visual' and no howCreated."""
|
||||
result = filter_add_topn(
|
||||
definition_path, "page_overview",
|
||||
table="financials", column="Country",
|
||||
n=3, order_by_table="financials", order_by_column="Sales",
|
||||
visual_name="visual_abc123",
|
||||
)
|
||||
assert result["scope"] == "visual"
|
||||
visual_json = (
|
||||
definition_path / "pages" / "page_overview" / "visuals" / "visual_abc123" / "visual.json"
|
||||
)
|
||||
f = _read(visual_json)["filterConfig"]["filters"][0]
|
||||
assert "howCreated" not in f
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# filter_add_relative_date
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_filter_add_relative_date_returns_status(definition_path: Path) -> None:
    """filter_add_relative_date returns the expected status dict."""
    result = filter_add_relative_date(
        definition_path, "page_overview",
        table="financials", column="Date",
        amount=3, time_unit="months",
    )
    assert result["status"] == "added"
    assert result["type"] == "RelativeDate"
    assert result["scope"] == "page"
    assert result["amount"] == 3
    assert result["time_unit"] == "months"


def test_filter_add_relative_date_persisted(definition_path: Path) -> None:
    """filter_add_relative_date writes a RelativeDate entry to page.json."""
    filter_add_relative_date(
        definition_path, "page_overview",
        table="financials", column="Date",
        amount=3, time_unit="months",
        name="reldate_test",
    )
    page_json = definition_path / "pages" / "page_overview" / "page.json"
    data = _read(page_json)
    filters = data["filterConfig"]["filters"]
    assert len(filters) == 1
    f = filters[0]
    assert f["type"] == "RelativeDate"
    assert f["name"] == "reldate_test"
    assert f["howCreated"] == "User"


def test_filter_add_relative_date_between_structure(definition_path: Path) -> None:
    """The RelativeDate filter uses a Between/DateAdd/DateSpan/Now structure."""
    filter_add_relative_date(
        definition_path, "page_overview",
        table="financials", column="Date",
        amount=3, time_unit="months",
    )
    page_json = definition_path / "pages" / "page_overview" / "page.json"
    f = _read(page_json)["filterConfig"]["filters"][0]["filter"]

    assert f["Version"] == 2
    between = f["Where"][0]["Condition"]["Between"]
    assert "LowerBound" in between
    assert "UpperBound" in between

    # UpperBound is DateSpan(Now(), days)
    upper = between["UpperBound"]["DateSpan"]
    assert "Now" in upper["Expression"]
    assert upper["TimeUnit"] == 0  # days

    # LowerBound: DateSpan(DateAdd(DateAdd(Now(), +1, days), -amount, time_unit), days)
    lower_date_add = between["LowerBound"]["DateSpan"]["Expression"]["DateAdd"]
    assert lower_date_add["Amount"] == -3
    assert lower_date_add["TimeUnit"] == 2  # months
    inner = lower_date_add["Expression"]["DateAdd"]
    assert inner["Amount"] == 1
    assert inner["TimeUnit"] == 0  # days


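The nested condition asserted above can be sketched as a small builder. This is a hypothetical helper reconstructed from the assertions, not the real pbi_cli code; the `TIME_UNITS` codes (days=0, months=2, years=3) are inferred from these tests.

```python
from __future__ import annotations

from typing import Any

# Assumption: unit codes inferred from the test assertions above.
TIME_UNITS = {"days": 0, "months": 2, "years": 3}


def relative_date_condition(amount: int, time_unit: str) -> dict[str, Any]:
    """Sketch the Between/DateAdd/DateSpan/Now condition for 'last N units'."""
    unit = TIME_UNITS[time_unit]
    now = {"Now": {}}
    return {
        "Between": {
            # Lower bound: step one day past now, go back N units, snap to days.
            "LowerBound": {"DateSpan": {
                "Expression": {"DateAdd": {
                    "Expression": {"DateAdd": {
                        "Expression": now, "Amount": 1, "TimeUnit": 0,
                    }},
                    "Amount": -amount,
                    "TimeUnit": unit,
                }},
                "TimeUnit": 0,
            }},
            # Upper bound: now, snapped to days.
            "UpperBound": {"DateSpan": {"Expression": now, "TimeUnit": 0}},
        }
    }
```
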
def test_filter_add_relative_date_time_unit_years(definition_path: Path) -> None:
    """time_unit='years' maps to TimeUnit=3 in the DateAdd."""
    filter_add_relative_date(
        definition_path, "page_overview",
        table="financials", column="Date",
        amount=1, time_unit="years",
    )
    page_json = definition_path / "pages" / "page_overview" / "page.json"
    f = _read(page_json)["filterConfig"]["filters"][0]["filter"]
    lower_span = f["Where"][0]["Condition"]["Between"]["LowerBound"]["DateSpan"]
    lower_date_add = lower_span["Expression"]["DateAdd"]
    assert lower_date_add["TimeUnit"] == 3  # years


def test_filter_add_relative_date_invalid_unit(definition_path: Path) -> None:
    """filter_add_relative_date raises PbiCliError for an unknown time_unit."""
    with pytest.raises(PbiCliError):
        filter_add_relative_date(
            definition_path, "page_overview",
            table="financials", column="Date",
            amount=3, time_unit="quarters",
        )


def test_filter_add_relative_date_visual_scope(definition_path: Path) -> None:
    """filter_add_relative_date adds a visual filter with no howCreated key."""
    result = filter_add_relative_date(
        definition_path, "page_overview",
        table="financials", column="Date",
        amount=7, time_unit="days",
        visual_name="visual_abc123",
    )
    assert result["scope"] == "visual"
    visual_json = (
        definition_path / "pages" / "page_overview" / "visuals" / "visual_abc123" / "visual.json"
    )
    f = _read(visual_json)["filterConfig"]["filters"][0]
    assert "howCreated" not in f
tests/test_format_backend.py (new file, 631 lines)
@@ -0,0 +1,631 @@
"""Tests for pbi_cli.core.format_backend.

Covers format_get, format_clear, format_background_gradient,
format_background_measure, and format_background_conditional against a
minimal in-memory PBIR directory tree.
"""

from __future__ import annotations

import json
from pathlib import Path
from typing import Any

import pytest

from pbi_cli.core.errors import PbiCliError
from pbi_cli.core.format_backend import (
    format_background_conditional,
    format_background_gradient,
    format_background_measure,
    format_clear,
    format_get,
)

# ---------------------------------------------------------------------------
# Fixture helpers
# ---------------------------------------------------------------------------

PAGE_NAME = "overview"
VISUAL_NAME = "test_visual"
FIELD_PROFIT = "Sum(financials.Profit)"
FIELD_SALES = "Sum(financials.Sales)"


def _write_json(path: Path, data: dict[str, Any]) -> None:
    path.write_text(json.dumps(data, indent=2), encoding="utf-8")


def _minimal_visual_json() -> dict[str, Any]:
    """Return a minimal visual.json with two bound fields and no objects."""
    return {
        "$schema": "https://developer.microsoft.com/json-schemas/"
        "fabric/item/report/definition/visual/1.0.0/schema.json",
        "visual": {
            "visualType": "tableEx",
            "query": {
                "queryState": {
                    "Values": {
                        "projections": [
                            {
                                "field": {
                                    "Column": {
                                        "Expression": {"SourceRef": {"Entity": "financials"}},
                                        "Property": "Profit",
                                    }
                                },
                                "queryRef": FIELD_PROFIT,
                                "active": True,
                            },
                            {
                                "field": {
                                    "Column": {
                                        "Expression": {"SourceRef": {"Entity": "financials"}},
                                        "Property": "Sales",
                                    }
                                },
                                "queryRef": FIELD_SALES,
                                "active": True,
                            },
                        ]
                    }
                }
            },
        },
    }


@pytest.fixture
def report_with_visual(tmp_path: Path) -> Path:
    """Build a minimal PBIR definition folder with one page containing one visual.

    Returns the ``definition/`` path accepted by all format_* functions.

    Layout::

        <tmp_path>/
            definition/
                version.json
                report.json
                pages/
                    pages.json
                    overview/
                        page.json
                        visuals/
                            test_visual/
                                visual.json
    """
    definition = tmp_path / "definition"
    definition.mkdir()

    _write_json(
        definition / "version.json",
        {
            "$schema": "https://developer.microsoft.com/json-schemas/"
            "fabric/item/report/definition/versionMetadata/1.0.0/schema.json",
            "version": "1.0.0",
        },
    )
    _write_json(
        definition / "report.json",
        {
            "$schema": "https://developer.microsoft.com/json-schemas/"
            "fabric/item/report/definition/report/1.0.0/schema.json",
            "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
            "layoutOptimization": "Disabled",
        },
    )

    pages_dir = definition / "pages"
    pages_dir.mkdir()
    _write_json(
        pages_dir / "pages.json",
        {
            "$schema": "https://developer.microsoft.com/json-schemas/"
            "fabric/item/report/definition/pagesMetadata/1.0.0/schema.json",
            "pageOrder": [PAGE_NAME],
        },
    )

    page_dir = pages_dir / PAGE_NAME
    page_dir.mkdir()
    _write_json(
        page_dir / "page.json",
        {
            "$schema": "https://developer.microsoft.com/json-schemas/"
            "fabric/item/report/definition/page/1.0.0/schema.json",
            "name": PAGE_NAME,
            "displayName": "Overview",
            "displayOption": "FitToPage",
            "width": 1280,
            "height": 720,
            "ordinal": 0,
        },
    )

    visuals_dir = page_dir / "visuals"
    visuals_dir.mkdir()

    visual_dir = visuals_dir / VISUAL_NAME
    visual_dir.mkdir()
    _write_json(visual_dir / "visual.json", _minimal_visual_json())

    return definition


# ---------------------------------------------------------------------------
# Helper to read saved visual.json directly
# ---------------------------------------------------------------------------


def _read_visual(definition: Path) -> dict[str, Any]:
    path = (
        definition / "pages" / PAGE_NAME / "visuals" / VISUAL_NAME / "visual.json"
    )
    return json.loads(path.read_text(encoding="utf-8"))


# ---------------------------------------------------------------------------
# 1. format_get on a fresh visual returns empty objects
# ---------------------------------------------------------------------------


def test_format_get_fresh_visual_returns_empty_objects(report_with_visual: Path) -> None:
    """format_get returns an empty objects dict on a visual with no formatting."""
    result = format_get(report_with_visual, PAGE_NAME, VISUAL_NAME)

    assert result["visual"] == VISUAL_NAME
    assert result["objects"] == {}


# ---------------------------------------------------------------------------
# 2. format_background_gradient adds an entry to objects.values
# ---------------------------------------------------------------------------


def test_format_background_gradient_adds_entry(report_with_visual: Path) -> None:
    """format_background_gradient creates objects.values with one entry."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1


# ---------------------------------------------------------------------------
# 3. Gradient entry has correct FillRule.linearGradient2 structure
# ---------------------------------------------------------------------------


def test_format_background_gradient_correct_structure(report_with_visual: Path) -> None:
    """Gradient entry contains the expected FillRule.linearGradient2 keys."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    fill_rule_expr = (
        entry["properties"]["backColor"]["solid"]["color"]["expr"]["FillRule"]
    )
    # The FillRule expression node nests the rule itself under a second
    # "FillRule" key, hence the double indexing below.
    assert "linearGradient2" in fill_rule_expr["FillRule"]
    linear = fill_rule_expr["FillRule"]["linearGradient2"]
    assert "min" in linear
    assert "max" in linear
    assert "nullColoringStrategy" in linear


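For reference, the gradient entry the two tests above walk can be sketched as below. This is an assumed shape reconstructed from the assertions; the `Input` node and the literal colour values are illustrative guesses not pinned down by these tests, and the real pbi_cli backend may emit more keys.

```python
# Hypothetical sketch of one objects.values gradient entry; "Input" and the
# colour literals are assumptions, only the asserted keys are test-verified.
def gradient_entry(query_ref: str, table: str, column: str) -> dict:
    return {
        "properties": {
            "backColor": {"solid": {"color": {"expr": {
                "FillRule": {
                    # Assumed: the field driving the gradient.
                    "Input": {"Aggregation": {
                        "Expression": {"Column": {
                            "Expression": {"SourceRef": {"Entity": table}},
                            "Property": column,
                        }},
                        "Function": 0,  # Sum
                    }},
                    # The rule itself, nested under a second "FillRule" key.
                    "FillRule": {"linearGradient2": {
                        "min": {"color": {"expr": {"Literal": {"Value": "'#FFFFFF'"}}}},
                        "max": {"color": {"expr": {"Literal": {"Value": "'#118DFF'"}}}},
                        "nullColoringStrategy": {
                            "strategy": {"expr": {"Literal": {"Value": "'asZero'"}}}
                        },
                    }},
                }
            }}}}
        },
        "selector": {"metadata": query_ref},
    }
```
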
# ---------------------------------------------------------------------------
# 4. Gradient entry selector.metadata matches field_query_ref
# ---------------------------------------------------------------------------


def test_format_background_gradient_selector_metadata(report_with_visual: Path) -> None:
    """Gradient entry selector.metadata equals the supplied field_query_ref."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    assert entry["selector"]["metadata"] == FIELD_PROFIT


# ---------------------------------------------------------------------------
# 5. format_background_measure adds an entry to objects.values
# ---------------------------------------------------------------------------


def test_format_background_measure_adds_entry(report_with_visual: Path) -> None:
    """format_background_measure creates objects.values with one entry."""
    format_background_measure(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        measure_table="financials",
        measure_property="Conditional Formatting Sales",
        field_query_ref=FIELD_SALES,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1


# ---------------------------------------------------------------------------
# 6. Measure entry has correct Measure expression structure
# ---------------------------------------------------------------------------


def test_format_background_measure_correct_structure(report_with_visual: Path) -> None:
    """Measure entry contains the expected Measure expression keys."""
    format_background_measure(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        measure_table="financials",
        measure_property="Conditional Formatting Sales",
        field_query_ref=FIELD_SALES,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    measure_expr = entry["properties"]["backColor"]["solid"]["color"]["expr"]
    assert "Measure" in measure_expr
    assert measure_expr["Measure"]["Property"] == "Conditional Formatting Sales"
    assert measure_expr["Measure"]["Expression"]["SourceRef"]["Entity"] == "financials"


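The measure-driven variant asserted above binds backColor straight to a colour-returning measure. A minimal sketch of that properties node, reconstructed from the assertions (the function name is hypothetical, not the real backend helper):

```python
# Hypothetical sketch: bind backColor directly to a measure that returns a
# colour string. Only the keys asserted in the test above are guaranteed.
def measure_back_color(table: str, measure: str) -> dict:
    return {
        "backColor": {"solid": {"color": {"expr": {
            "Measure": {
                "Expression": {"SourceRef": {"Entity": table}},
                "Property": measure,
            }
        }}}}
    }
```
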
# ---------------------------------------------------------------------------
# 7. Applying gradient twice (same field_query_ref) replaces, not duplicates
# ---------------------------------------------------------------------------


def test_format_background_gradient_idempotent(report_with_visual: Path) -> None:
    """Applying gradient twice on same field replaces the existing entry."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1


# ---------------------------------------------------------------------------
# 8. Applying gradient for different field_query_ref creates second entry
# ---------------------------------------------------------------------------


def test_format_background_gradient_different_fields(report_with_visual: Path) -> None:
    """Two different field_query_refs produce two entries in objects.values."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Sales",
        field_query_ref=FIELD_SALES,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 2
    refs = {e["selector"]["metadata"] for e in values}
    assert refs == {FIELD_PROFIT, FIELD_SALES}


# ---------------------------------------------------------------------------
# 9. format_clear sets objects to {}
# ---------------------------------------------------------------------------


def test_format_clear_sets_empty_objects(report_with_visual: Path) -> None:
    """format_clear sets visual.objects to an empty dict."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )
    format_clear(report_with_visual, PAGE_NAME, VISUAL_NAME)

    data = _read_visual(report_with_visual)
    assert data["visual"]["objects"] == {}


# ---------------------------------------------------------------------------
# 10. format_clear after gradient clears the entries
# ---------------------------------------------------------------------------


def test_format_clear_removes_values(report_with_visual: Path) -> None:
    """format_clear removes objects.values that were set by gradient."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )
    result = format_clear(report_with_visual, PAGE_NAME, VISUAL_NAME)

    assert result["status"] == "cleared"
    data = _read_visual(report_with_visual)
    assert "values" not in data["visual"]["objects"]


# ---------------------------------------------------------------------------
# 11. format_get after gradient returns non-empty objects
# ---------------------------------------------------------------------------


def test_format_get_after_gradient_returns_objects(report_with_visual: Path) -> None:
    """format_get returns non-empty objects after a gradient rule is applied."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )

    result = format_get(report_with_visual, PAGE_NAME, VISUAL_NAME)

    assert result["visual"] == VISUAL_NAME
    assert result["objects"] != {}
    assert len(result["objects"]["values"]) == 1


# ---------------------------------------------------------------------------
# 12. format_get on missing visual raises PbiCliError
# ---------------------------------------------------------------------------


def test_format_get_missing_visual_raises(report_with_visual: Path) -> None:
    """format_get raises PbiCliError when the visual folder does not exist."""
    with pytest.raises(PbiCliError, match="not found"):
        format_get(report_with_visual, PAGE_NAME, "nonexistent_visual")


# ---------------------------------------------------------------------------
# 13. gradient + measure on different fields: objects.values has 2 entries
# ---------------------------------------------------------------------------


def test_gradient_and_measure_different_fields(report_with_visual: Path) -> None:
    """A gradient on one field and a measure rule on another yield two entries."""
    format_background_gradient(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        input_table="financials",
        input_column="Profit",
        field_query_ref=FIELD_PROFIT,
    )
    format_background_measure(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        measure_table="financials",
        measure_property="Conditional Formatting Sales",
        field_query_ref=FIELD_SALES,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 2


# ---------------------------------------------------------------------------
# 14. format_background_measure with same field replaces existing entry
# ---------------------------------------------------------------------------


def test_format_background_measure_replaces_existing(report_with_visual: Path) -> None:
    """Applying measure rule twice on same field replaces the existing entry."""
    format_background_measure(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        measure_table="financials",
        measure_property="CF Sales v1",
        field_query_ref=FIELD_SALES,
    )
    format_background_measure(
        report_with_visual,
        PAGE_NAME,
        VISUAL_NAME,
        measure_table="financials",
        measure_property="CF Sales v2",
        field_query_ref=FIELD_SALES,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1
    prop = values[0]["properties"]["backColor"]["solid"]["color"]["expr"]["Measure"]["Property"]
    assert prop == "CF Sales v2"


# ---------------------------------------------------------------------------
# 15. format_clear returns correct status dict
# ---------------------------------------------------------------------------


def test_format_clear_return_value(report_with_visual: Path) -> None:
    """format_clear returns the expected status dictionary."""
    result = format_clear(report_with_visual, PAGE_NAME, VISUAL_NAME)

    assert result == {"status": "cleared", "visual": VISUAL_NAME}


# ---------------------------------------------------------------------------
# format_background_conditional
# ---------------------------------------------------------------------------

FIELD_UNITS = "Sum(financials.Units Sold)"


def test_format_background_conditional_adds_entry(report_with_visual: Path) -> None:
    """format_background_conditional creates an entry in objects.values."""
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials",
        input_column="Units Sold",
        threshold=100000,
        color_hex="#12239E",
        field_query_ref=FIELD_UNITS,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1


def test_format_background_conditional_correct_structure(report_with_visual: Path) -> None:
    """Conditional entry has Conditional.Cases with ComparisonKind and color."""
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials",
        input_column="Units Sold",
        threshold=100000,
        color_hex="#12239E",
        field_query_ref=FIELD_UNITS,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    cond_expr = entry["properties"]["backColor"]["solid"]["color"]["expr"]["Conditional"]
    assert "Cases" in cond_expr
    case = cond_expr["Cases"][0]
    comparison = case["Condition"]["Comparison"]
    assert comparison["ComparisonKind"] == 2  # gt
    assert comparison["Right"]["Literal"]["Value"] == "100000D"
    assert case["Value"]["Literal"]["Value"] == "'#12239E'"


def test_format_background_conditional_selector_metadata(report_with_visual: Path) -> None:
    """Conditional entry selector.metadata equals the supplied field_query_ref."""
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials",
        input_column="Units Sold",
        threshold=100000,
        color_hex="#12239E",
        field_query_ref=FIELD_UNITS,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    assert entry["selector"]["metadata"] == FIELD_UNITS


def test_format_background_conditional_default_field_query_ref(
    report_with_visual: Path,
) -> None:
    """When field_query_ref is omitted, it defaults to 'Sum(table.column)'."""
    result = format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials",
        input_column="Units Sold",
        threshold=100000,
        color_hex="#12239E",
    )

    assert result["field"] == "Sum(financials.Units Sold)"


def test_format_background_conditional_replaces_existing(report_with_visual: Path) -> None:
    """Applying conditional twice on same field_query_ref replaces the entry."""
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials", input_column="Units Sold",
        threshold=100000, color_hex="#FF0000",
        field_query_ref=FIELD_UNITS,
    )
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials", input_column="Units Sold",
        threshold=50000, color_hex="#00FF00",
        field_query_ref=FIELD_UNITS,
    )

    data = _read_visual(report_with_visual)
    values = data["visual"]["objects"]["values"]
    assert len(values) == 1
    case = values[0]["properties"]["backColor"]["solid"]["color"]["expr"]["Conditional"]["Cases"][0]
    assert case["Value"]["Literal"]["Value"] == "'#00FF00'"


def test_format_background_conditional_comparison_lte(report_with_visual: Path) -> None:
    """comparison='lte' maps to ComparisonKind=5."""
    format_background_conditional(
        report_with_visual, PAGE_NAME, VISUAL_NAME,
        input_table="financials", input_column="Units Sold",
        threshold=10000, color_hex="#AABBCC",
        comparison="lte",
        field_query_ref=FIELD_UNITS,
    )

    data = _read_visual(report_with_visual)
    entry = data["visual"]["objects"]["values"][0]
    kind = (
        entry["properties"]["backColor"]["solid"]["color"]["expr"]
        ["Conditional"]["Cases"][0]["Condition"]["Comparison"]["ComparisonKind"]
    )
    assert kind == 5  # lte


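The comparison-string mapping the tests above pin down (gt=2, lte=5) can be sketched as a small lookup plus condition builder. Hypothetical names; the real table lives in pbi_cli.core.format_backend and likely covers more operators. The `D` suffix on the literal matches the `"100000D"` value asserted earlier.

```python
# Assumption: only the two codes asserted in these tests are included here.
COMPARISON_KINDS = {"gt": 2, "lte": 5}


def comparison_condition(kind: str, threshold: int) -> dict:
    """Sketch the Comparison condition node for a conditional backColor rule."""
    if kind not in COMPARISON_KINDS:
        raise ValueError(f"unknown comparison: {kind!r}")
    return {
        "Comparison": {
            "ComparisonKind": COMPARISON_KINDS[kind],
            # PBIR encodes decimal literals as strings with a 'D' suffix.
            "Right": {"Literal": {"Value": f"{threshold}D"}},
        }
    }
```
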
def test_format_background_conditional_invalid_comparison(
    report_with_visual: Path,
) -> None:
    """An unknown comparison string raises PbiCliError."""
    with pytest.raises(PbiCliError):
        format_background_conditional(
            report_with_visual, PAGE_NAME, VISUAL_NAME,
            input_table="financials", input_column="Units Sold",
            threshold=100, color_hex="#000000",
            comparison="between",
        )
tests/test_hardening.py (new file, 214 lines)
@@ -0,0 +1,214 @@
"""Tests for PBIR report layer hardening fixes."""

from __future__ import annotations

import json
from pathlib import Path

import pytest

from pbi_cli.core.errors import PbiCliError
from pbi_cli.core.pbir_path import _find_from_pbip
from pbi_cli.core.report_backend import report_convert, report_create
from pbi_cli.core.visual_backend import (
    visual_add,
    visual_bind,
)


def _write(path: Path, data: dict) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")


@pytest.fixture
def report_with_page(tmp_path: Path) -> Path:
    """Build a minimal PBIR report with one page."""
    defn = tmp_path / "Test.Report" / "definition"
    defn.mkdir(parents=True)
    _write(defn / "version.json", {"$schema": "...", "version": "2.0.0"})
    _write(defn / "report.json", {
        "$schema": "...",
        "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
        "layoutOptimization": "None",
    })
    _write(defn / "pages" / "pages.json", {
        "$schema": "...",
        "pageOrder": ["test_page"],
        "activePageName": "test_page",
    })
    page_dir = defn / "pages" / "test_page"
    page_dir.mkdir(parents=True)
    _write(page_dir / "page.json", {
        "$schema": "...",
        "name": "test_page",
        "displayName": "Test Page",
        "displayOption": "FitToPage",
        "width": 1280,
        "height": 720,
    })
    (page_dir / "visuals").mkdir()
    return defn


# ---------------------------------------------------------------------------
# Fix #1: Measure detection via role heuristic
# ---------------------------------------------------------------------------

class TestMeasureDetection:
    def test_value_role_creates_measure_ref(self, report_with_page: Path) -> None:
        """--value bindings should produce Measure references, not Column."""
        visual_add(report_with_page, "test_page", "bar_chart", name="chart1")
        visual_bind(
            report_with_page, "test_page", "chart1",
            bindings=[{"role": "value", "field": "Sales[Amount]"}],
        )
        vfile = report_with_page / "pages" / "test_page" / "visuals" / "chart1" / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        proj = data["visual"]["query"]["queryState"]["Y"]["projections"][0]
        assert "Measure" in proj["field"]
        assert "Column" not in proj["field"]

    def test_category_role_creates_column_ref(self, report_with_page: Path) -> None:
        """--category bindings should produce Column references."""
        visual_add(report_with_page, "test_page", "bar_chart", name="chart2")
        visual_bind(
            report_with_page, "test_page", "chart2",
            bindings=[{"role": "category", "field": "Date[Year]"}],
        )
        vfile = report_with_page / "pages" / "test_page" / "visuals" / "chart2" / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        proj = data["visual"]["query"]["queryState"]["Category"]["projections"][0]
        assert "Column" in proj["field"]
        assert "Measure" not in proj["field"]

    def test_field_role_on_card_creates_measure(self, report_with_page: Path) -> None:
        """--field on card should be a Measure (Values role is the correct Desktop key)."""
        visual_add(report_with_page, "test_page", "card", name="card1")
        visual_bind(
            report_with_page, "test_page", "card1",
            bindings=[{"role": "field", "field": "Sales[Revenue]"}],
        )
        vfile = report_with_page / "pages" / "test_page" / "visuals" / "card1" / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        proj = data["visual"]["query"]["queryState"]["Values"]["projections"][0]
        assert "Measure" in proj["field"]

    def test_explicit_measure_flag_override(self, report_with_page: Path) -> None:
        """Explicit measure=True forces Measure even on category role."""
        visual_add(report_with_page, "test_page", "bar_chart", name="chart3")
        visual_bind(
            report_with_page, "test_page", "chart3",
            bindings=[{"role": "category", "field": "Sales[Calc]", "measure": True}],
        )
        vfile = report_with_page / "pages" / "test_page" / "visuals" / "chart3" / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        proj = data["visual"]["query"]["queryState"]["Category"]["projections"][0]
        assert "Measure" in proj["field"]


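The role heuristic these four tests pin down — value-like roles default to Measure references, others to Column, with an explicit `measure` flag overriding either way — can be sketched as below. Hypothetical helper; the real logic lives in pbi_cli.core.visual_backend, and the `MEASURE_ROLES` set is an assumption inferred from the tests.

```python
from __future__ import annotations

# Assumption inferred from the tests: "value" and "field" default to Measure.
MEASURE_ROLES = {"value", "field"}


def field_ref(role: str, table: str, column: str, measure: bool | None = None) -> dict:
    """Sketch the projection field node, choosing Measure vs Column by role."""
    is_measure = measure if measure is not None else role in MEASURE_ROLES
    kind = "Measure" if is_measure else "Column"
    return {kind: {"Expression": {"SourceRef": {"Entity": table}}, "Property": column}}
```
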
# ---------------------------------------------------------------------------
# Fix #2: visual_bind merges with existing bindings
# ---------------------------------------------------------------------------

class TestBindMerge:
    def test_second_bind_preserves_first(self, report_with_page: Path) -> None:
        """Calling bind twice should keep all bindings."""
        visual_add(report_with_page, "test_page", "bar_chart", name="merged")

        # First bind: category
        visual_bind(
            report_with_page, "test_page", "merged",
            bindings=[{"role": "category", "field": "Date[Year]"}],
        )

        # Second bind: value
        visual_bind(
            report_with_page, "test_page", "merged",
            bindings=[{"role": "value", "field": "Sales[Amount]"}],
        )

        vfile = report_with_page / "pages" / "test_page" / "visuals" / "merged" / "visual.json"
        data = json.loads(vfile.read_text(encoding="utf-8"))
        query = data["visual"]["query"]

        # Both roles should have projections
        assert len(query["queryState"]["Category"]["projections"]) == 1
        assert len(query["queryState"]["Y"]["projections"]) == 1

        # Commands block should have both From entities
        cmds = query["Commands"][0]["SemanticQueryDataShapeCommand"]["Query"]
        from_names = {e["Entity"] for e in cmds["From"]}
        assert "Date" in from_names
        assert "Sales" in from_names

        # Commands Select should have both fields
        assert len(cmds["Select"]) == 2


# ---------------------------------------------------------------------------
# Fix #3: Table names with spaces
# ---------------------------------------------------------------------------

class TestFieldRefParsing:
    def test_table_with_spaces(self, report_with_page: Path) -> None:
        """Table[Column] notation should work with spaces in table name."""
        visual_add(report_with_page, "test_page", "bar_chart", name="spaces")
        result = visual_bind(
            report_with_page, "test_page", "spaces",
            bindings=[{"role": "category", "field": "Sales Table[Region Name]"}],
        )
        assert result["bindings"][0]["query_ref"] == "Sales Table.Region Name"

    def test_simple_names(self, report_with_page: Path) -> None:
        """Standard Table[Column] still works."""
        visual_add(report_with_page, "test_page", "bar_chart", name="simple")
        result = visual_bind(
            report_with_page, "test_page", "simple",
            bindings=[{"role": "category", "field": "Date[Year]"}],
        )
        assert result["bindings"][0]["query_ref"] == "Date.Year"

    def test_invalid_format_raises(self, report_with_page: Path) -> None:
        """Missing brackets should raise PbiCliError."""
        visual_add(report_with_page, "test_page", "card", name="bad")
        with pytest.raises(PbiCliError, match="Table\\[Column\\]"):
            visual_bind(
                report_with_page, "test_page", "bad",
                bindings=[{"role": "field", "field": "JustAName"}],
            )


# ---------------------------------------------------------------------------
|
||||
# Fix #4: _find_from_pbip guard
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestPbipGuard:
|
||||
def test_nonexistent_dir_returns_none(self, tmp_path: Path) -> None:
|
||||
result = _find_from_pbip(tmp_path / "does_not_exist")
|
||||
assert result is None
|
||||
|
||||
def test_file_instead_of_dir_returns_none(self, tmp_path: Path) -> None:
|
||||
f = tmp_path / "afile.txt"
|
||||
f.write_text("x")
|
||||
result = _find_from_pbip(f)
|
||||
assert result is None
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Fix #9: report_convert overwrite guard
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestConvertGuard:
|
||||
def test_convert_blocks_overwrite(self, tmp_path: Path) -> None:
|
||||
"""Second convert without --force should raise."""
|
||||
report_create(tmp_path, "MyReport")
|
||||
# First convert works (pbip already exists from create, so it should block)
|
||||
with pytest.raises(PbiCliError, match="already exists"):
|
||||
report_convert(tmp_path, force=False)
|
||||
|
||||
def test_convert_force_allows_overwrite(self, tmp_path: Path) -> None:
|
||||
"""--force should allow overwriting existing .pbip."""
|
||||
report_create(tmp_path, "MyReport")
|
||||
result = report_convert(tmp_path, force=True)
|
||||
assert result["status"] == "converted"
|
||||
411  tests/test_pbir_path.py  Normal file
@@ -0,0 +1,411 @@
"""Tests for pbi_cli.core.pbir_path."""

from __future__ import annotations

import json
from pathlib import Path

import pytest

from pbi_cli.core.errors import ReportNotFoundError
from pbi_cli.core.pbir_path import (
    get_page_dir,
    get_pages_dir,
    get_visual_dir,
    get_visuals_dir,
    resolve_report_path,
    validate_report_structure,
)

# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

_REPORT_JSON = json.dumps({
    "$schema": "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/report/1.0.0/schema.json"
})
_VERSION_JSON = json.dumps({
    "$schema": "https://developer.microsoft.com/json-schemas/"
    "fabric/item/report/definition/version/1.0.0/schema.json",
    "version": "1.0.0",
})


def scaffold_valid_pbir(root: Path, report_name: str = "MyReport") -> Path:
    """Create a minimal valid PBIR structure under *root*.

    Returns the ``definition/`` path so tests can use it directly.

    Structure created::

        root/
            MyReport.Report/
                definition/
                    report.json
                    version.json
    """
    definition = root / f"{report_name}.Report" / "definition"
    definition.mkdir(parents=True)
    (definition / "report.json").write_text(_REPORT_JSON)
    (definition / "version.json").write_text(_VERSION_JSON)
    return definition


def add_page(definition: Path, page_name: str, *, with_page_json: bool = True) -> Path:
    """Add a page directory inside *definition*/pages/ and return the page dir."""
    page_dir = definition / "pages" / page_name
    page_dir.mkdir(parents=True, exist_ok=True)
    if with_page_json:
        (page_dir / "page.json").write_text(json.dumps({"name": page_name}))
    return page_dir


def add_visual(
    definition: Path,
    page_name: str,
    visual_name: str,
    *,
    with_visual_json: bool = True,
) -> Path:
    """Add a visual directory and return the visual dir."""
    visual_dir = definition / "pages" / page_name / "visuals" / visual_name
    visual_dir.mkdir(parents=True, exist_ok=True)
    if with_visual_json:
        (visual_dir / "visual.json").write_text(json.dumps({"name": visual_name}))
    return visual_dir


# ---------------------------------------------------------------------------
# resolve_report_path -- explicit path variants
# ---------------------------------------------------------------------------


def test_resolve_explicit_definition_folder(tmp_path: Path) -> None:
    """Pointing directly at the definition/ folder resolves correctly."""
    definition = scaffold_valid_pbir(tmp_path)

    result = resolve_report_path(explicit_path=str(definition))

    assert result == definition.resolve()


def test_resolve_explicit_report_folder(tmp_path: Path) -> None:
    """Pointing at the .Report/ folder resolves to its definition/ child."""
    definition = scaffold_valid_pbir(tmp_path)
    report_folder = definition.parent  # MyReport.Report/

    result = resolve_report_path(explicit_path=str(report_folder))

    assert result == definition.resolve()


def test_resolve_explicit_parent_folder(tmp_path: Path) -> None:
    """Pointing at the folder containing .Report/ resolves correctly."""
    definition = scaffold_valid_pbir(tmp_path)

    # tmp_path contains MyReport.Report/
    result = resolve_report_path(explicit_path=str(tmp_path))

    assert result == definition.resolve()


def test_resolve_explicit_not_found(tmp_path: Path) -> None:
    """An explicit path with no PBIR content raises ReportNotFoundError."""
    empty_dir = tmp_path / "not_a_report"
    empty_dir.mkdir()

    with pytest.raises(ReportNotFoundError):
        resolve_report_path(explicit_path=str(empty_dir))


def test_resolve_explicit_nonexistent_path(tmp_path: Path) -> None:
    """A path that does not exist on disk raises ReportNotFoundError."""
    ghost = tmp_path / "ghost_folder"

    with pytest.raises(ReportNotFoundError):
        resolve_report_path(explicit_path=str(ghost))


# ---------------------------------------------------------------------------
# resolve_report_path -- CWD walk-up detection
# ---------------------------------------------------------------------------


def test_resolve_walkup_from_cwd(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> None:
    """Walk-up detection finds .Report/definition when CWD is a child dir."""
    definition = scaffold_valid_pbir(tmp_path)
    nested_cwd = tmp_path / "deep" / "nested"
    nested_cwd.mkdir(parents=True)
    monkeypatch.chdir(nested_cwd)

    result = resolve_report_path()

    assert result == definition.resolve()


def test_resolve_walkup_from_report_root(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> None:
    """Walk-up detection works when CWD is already the project root."""
    definition = scaffold_valid_pbir(tmp_path)
    monkeypatch.chdir(tmp_path)

    result = resolve_report_path()

    assert result == definition.resolve()


# ---------------------------------------------------------------------------
# resolve_report_path -- .pbip sibling detection
# ---------------------------------------------------------------------------


def test_resolve_pbip_sibling(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> None:
    """A sibling .pbip file guides resolution when no .Report is in the walk-up."""
    # Place .pbip in an isolated directory (no .Report parent chain)
    project_dir = tmp_path / "workspace"
    project_dir.mkdir()
    (project_dir / "MyReport.pbip").write_text("{}")
    definition = project_dir / "MyReport.Report" / "definition"
    definition.mkdir(parents=True)
    (definition / "report.json").write_text(_REPORT_JSON)
    (definition / "version.json").write_text(_VERSION_JSON)
    monkeypatch.chdir(project_dir)

    result = resolve_report_path()

    assert result == definition.resolve()


def test_resolve_no_report_anywhere_raises(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> None:
    """When no PBIR structure is discoverable, ReportNotFoundError is raised."""
    monkeypatch.chdir(tmp_path)

    with pytest.raises(ReportNotFoundError):
        resolve_report_path()


# ---------------------------------------------------------------------------
# get_pages_dir
# ---------------------------------------------------------------------------


def test_get_pages_dir_creates_missing_dir(tmp_path: Path) -> None:
    """get_pages_dir creates the pages/ folder when it does not exist."""
    definition = scaffold_valid_pbir(tmp_path)
    pages = definition / "pages"
    assert not pages.exists()

    result = get_pages_dir(definition)

    assert result == pages
    assert pages.is_dir()


def test_get_pages_dir_idempotent(tmp_path: Path) -> None:
    """get_pages_dir does not raise when pages/ already exists."""
    definition = scaffold_valid_pbir(tmp_path)
    (definition / "pages").mkdir()

    result = get_pages_dir(definition)

    assert result.is_dir()


# ---------------------------------------------------------------------------
# get_page_dir
# ---------------------------------------------------------------------------


def test_get_page_dir_returns_correct_path(tmp_path: Path) -> None:
    """get_page_dir returns the expected path without creating it."""
    definition = scaffold_valid_pbir(tmp_path)

    result = get_page_dir(definition, "SalesOverview")

    assert result == definition / "pages" / "SalesOverview"


def test_get_page_dir_does_not_create_dir(tmp_path: Path) -> None:
    """get_page_dir is a pure path computation -- it must not create the directory."""
    definition = scaffold_valid_pbir(tmp_path)

    result = get_page_dir(definition, "NonExistentPage")

    assert not result.exists()


# ---------------------------------------------------------------------------
# get_visuals_dir
# ---------------------------------------------------------------------------


def test_get_visuals_dir_creates_missing_dirs(tmp_path: Path) -> None:
    """get_visuals_dir creates pages/<page>/visuals/ when missing."""
    definition = scaffold_valid_pbir(tmp_path)
    visuals = definition / "pages" / "Page1" / "visuals"
    assert not visuals.exists()

    result = get_visuals_dir(definition, "Page1")

    assert result == visuals
    assert visuals.is_dir()


def test_get_visuals_dir_idempotent(tmp_path: Path) -> None:
    """get_visuals_dir does not raise when the visuals/ dir already exists."""
    definition = scaffold_valid_pbir(tmp_path)
    visuals = definition / "pages" / "Page1" / "visuals"
    visuals.mkdir(parents=True)

    result = get_visuals_dir(definition, "Page1")

    assert result.is_dir()


# ---------------------------------------------------------------------------
# get_visual_dir
# ---------------------------------------------------------------------------


def test_get_visual_dir_returns_correct_path(tmp_path: Path) -> None:
    """get_visual_dir returns the expected nested path without creating it."""
    definition = scaffold_valid_pbir(tmp_path)

    result = get_visual_dir(definition, "Page1", "BarChart01")

    assert result == definition / "pages" / "Page1" / "visuals" / "BarChart01"


def test_get_visual_dir_does_not_create_dir(tmp_path: Path) -> None:
    """get_visual_dir is a pure path computation -- it must not create the directory."""
    definition = scaffold_valid_pbir(tmp_path)

    result = get_visual_dir(definition, "Page1", "Ghost")

    assert not result.exists()


# ---------------------------------------------------------------------------
# validate_report_structure
# ---------------------------------------------------------------------------


def test_validate_valid_structure_no_pages(tmp_path: Path) -> None:
    """A minimal valid structure with no pages produces no errors."""
    definition = scaffold_valid_pbir(tmp_path)

    errors = validate_report_structure(definition)

    assert errors == []


def test_validate_valid_structure_with_pages_and_visuals(tmp_path: Path) -> None:
    """A fully valid structure with pages and visuals produces no errors."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "Page1")
    add_visual(definition, "Page1", "Visual01")

    errors = validate_report_structure(definition)

    assert errors == []


def test_validate_missing_report_json(tmp_path: Path) -> None:
    """Absence of report.json is reported as an error."""
    definition = scaffold_valid_pbir(tmp_path)
    (definition / "report.json").unlink()

    errors = validate_report_structure(definition)

    assert any("report.json" in e for e in errors)


def test_validate_missing_version_json(tmp_path: Path) -> None:
    """Absence of version.json is reported as an error."""
    definition = scaffold_valid_pbir(tmp_path)
    (definition / "version.json").unlink()

    errors = validate_report_structure(definition)

    assert any("version.json" in e for e in errors)


def test_validate_page_missing_page_json(tmp_path: Path) -> None:
    """A page directory that lacks page.json is flagged."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "BadPage", with_page_json=False)

    errors = validate_report_structure(definition)

    assert any("BadPage" in e and "page.json" in e for e in errors)


def test_validate_visual_missing_visual_json(tmp_path: Path) -> None:
    """A visual directory that lacks visual.json is flagged."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "Page1")
    add_visual(definition, "Page1", "BrokenVisual", with_visual_json=False)

    errors = validate_report_structure(definition)

    assert any("BrokenVisual" in e and "visual.json" in e for e in errors)


def test_validate_nonexistent_dir(tmp_path: Path) -> None:
    """A definition path that does not exist on disk returns an error."""
    ghost = tmp_path / "does_not_exist" / "definition"

    errors = validate_report_structure(ghost)

    assert len(errors) == 1
    assert "does not exist" in errors[0].lower() or str(ghost) in errors[0]


def test_validate_multiple_errors_reported(tmp_path: Path) -> None:
    """Both report.json and version.json missing are returned together."""
    definition = scaffold_valid_pbir(tmp_path)
    (definition / "report.json").unlink()
    (definition / "version.json").unlink()

    errors = validate_report_structure(definition)

    assert len(errors) == 2
    messages = " ".join(errors)
    assert "report.json" in messages
    assert "version.json" in messages


def test_validate_multiple_page_errors(tmp_path: Path) -> None:
    """Each page missing page.json produces a separate error entry."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "PageA", with_page_json=False)
    add_page(definition, "PageB", with_page_json=False)

    errors = validate_report_structure(definition)

    page_errors = [e for e in errors if "page.json" in e]
    assert len(page_errors) == 2


def test_validate_multiple_visual_errors(tmp_path: Path) -> None:
    """Each visual missing visual.json produces a separate error entry."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "Page1")
    add_visual(definition, "Page1", "Vis1", with_visual_json=False)
    add_visual(definition, "Page1", "Vis2", with_visual_json=False)

    errors = validate_report_structure(definition)

    visual_errors = [e for e in errors if "visual.json" in e]
    assert len(visual_errors) == 2


def test_validate_valid_page_with_no_visuals_dir(tmp_path: Path) -> None:
    """A page with no visuals/ sub-directory is still valid."""
    definition = scaffold_valid_pbir(tmp_path)
    add_page(definition, "Page1")
    # No visuals/ directory created -- that is fine

    errors = validate_report_structure(definition)

    assert errors == []
297  tests/test_pbir_validators.py  Normal file
@@ -0,0 +1,297 @@
"""Tests for enhanced PBIR validators."""

from __future__ import annotations

import json
from pathlib import Path

import pytest

from pbi_cli.core.pbir_validators import (
    validate_bindings_against_model,
    validate_report_full,
)


def _write(path: Path, data: dict) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")


@pytest.fixture
def valid_report(tmp_path: Path) -> Path:
    """Build a minimal valid PBIR report for validation tests."""
    defn = tmp_path / "Test.Report" / "definition"
    defn.mkdir(parents=True)

    _write(defn / "version.json", {"$schema": "...", "version": "1.0.0"})
    _write(defn / "report.json", {
        "$schema": "...",
        "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
        "layoutOptimization": "Disabled",
    })

    page_dir = defn / "pages" / "page1"
    page_dir.mkdir(parents=True)
    _write(page_dir / "page.json", {
        "$schema": "...",
        "name": "page1",
        "displayName": "Page One",
        "displayOption": "FitToPage",
        "width": 1280,
        "height": 720,
        "ordinal": 0,
    })
    _write(defn / "pages" / "pages.json", {
        "$schema": "...",
        "pageOrder": ["page1"],
    })

    vis_dir = page_dir / "visuals" / "vis1"
    vis_dir.mkdir(parents=True)
    _write(vis_dir / "visual.json", {
        "$schema": "...",
        "name": "vis1",
        "position": {"x": 0, "y": 0, "width": 400, "height": 300},
        "visual": {"visualType": "barChart", "query": {}, "objects": {}},
    })

    return defn


class TestValidateReportFull:
    def test_valid_report_is_valid(self, valid_report: Path) -> None:
        result = validate_report_full(valid_report)
        assert result["valid"] is True
        assert result["summary"]["errors"] == 0

    def test_valid_report_has_no_warnings(self, valid_report: Path) -> None:
        result = validate_report_full(valid_report)
        assert result["summary"]["warnings"] == 0

    def test_nonexistent_dir(self, tmp_path: Path) -> None:
        result = validate_report_full(tmp_path / "nope")
        assert result["valid"] is False
        assert result["summary"]["errors"] >= 1

    def test_missing_report_json(self, valid_report: Path) -> None:
        (valid_report / "report.json").unlink()
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("report.json" in e["message"] for e in result["errors"])

    def test_missing_version_json(self, valid_report: Path) -> None:
        (valid_report / "version.json").unlink()
        result = validate_report_full(valid_report)
        assert result["valid"] is False

    def test_invalid_json_syntax(self, valid_report: Path) -> None:
        (valid_report / "report.json").write_text("{bad json", encoding="utf-8")
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("Invalid JSON" in e["message"] for e in result["errors"])

    def test_missing_theme_collection(self, valid_report: Path) -> None:
        _write(valid_report / "report.json", {
            "$schema": "...",
            "layoutOptimization": "Disabled",
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("themeCollection" in e["message"] for e in result["errors"])

    def test_missing_layout_optimization(self, valid_report: Path) -> None:
        _write(valid_report / "report.json", {
            "$schema": "...",
            "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("layoutOptimization" in e["message"] for e in result["errors"])

    def test_page_missing_required_fields(self, valid_report: Path) -> None:
        _write(valid_report / "pages" / "page1" / "page.json", {
            "$schema": "...",
            "name": "page1",
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("displayName" in e["message"] for e in result["errors"])
        assert any("displayOption" in e["message"] for e in result["errors"])

    def test_page_invalid_display_option(self, valid_report: Path) -> None:
        _write(valid_report / "pages" / "page1" / "page.json", {
            "$schema": "...",
            "name": "page1",
            "displayName": "P1",
            "displayOption": "InvalidOption",
            "width": 1280,
            "height": 720,
        })
        result = validate_report_full(valid_report)
        assert any("Unknown displayOption" in w["message"] for w in result["warnings"])

    def test_visual_missing_position(self, valid_report: Path) -> None:
        vis_path = valid_report / "pages" / "page1" / "visuals" / "vis1" / "visual.json"
        _write(vis_path, {
            "$schema": "...",
            "name": "vis1",
            "visual": {"visualType": "barChart"},
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("position" in e["message"] for e in result["errors"])

    def test_visual_missing_name(self, valid_report: Path) -> None:
        vis_path = valid_report / "pages" / "page1" / "visuals" / "vis1" / "visual.json"
        _write(vis_path, {
            "$schema": "...",
            "position": {"x": 0, "y": 0, "width": 100, "height": 100},
            "visual": {"visualType": "card"},
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("name" in e["message"] for e in result["errors"])


class TestPageOrderConsistency:
    def test_phantom_page_in_order(self, valid_report: Path) -> None:
        _write(valid_report / "pages" / "pages.json", {
            "$schema": "...",
            "pageOrder": ["page1", "ghost_page"],
        })
        result = validate_report_full(valid_report)
        assert any("ghost_page" in w["message"] for w in result["warnings"])

    def test_unlisted_page_info(self, valid_report: Path) -> None:
        page2 = valid_report / "pages" / "page2"
        page2.mkdir(parents=True)
        _write(page2 / "page.json", {
            "$schema": "...",
            "name": "page2",
            "displayName": "Page Two",
            "displayOption": "FitToPage",
            "width": 1280,
            "height": 720,
        })
        result = validate_report_full(valid_report)
        assert any("page2" in i["message"] and "not listed" in i["message"] for i in result["info"])


class TestVisualNameUniqueness:
    def test_duplicate_visual_names(self, valid_report: Path) -> None:
        vis2_dir = valid_report / "pages" / "page1" / "visuals" / "vis2"
        vis2_dir.mkdir(parents=True)
        _write(vis2_dir / "visual.json", {
            "$schema": "...",
            "name": "vis1",  # Duplicate of vis1
            "position": {"x": 0, "y": 0, "width": 100, "height": 100},
            "visual": {"visualType": "card"},
        })
        result = validate_report_full(valid_report)
        assert result["valid"] is False
        assert any("Duplicate visual name" in e["message"] for e in result["errors"])


class TestBindingsAgainstModel:
    def test_valid_binding_passes(self, valid_report: Path) -> None:
        vis_path = valid_report / "pages" / "page1" / "visuals" / "vis1" / "visual.json"
        _write(vis_path, {
            "$schema": "...",
            "name": "vis1",
            "position": {"x": 0, "y": 0, "width": 400, "height": 300},
            "visual": {
                "visualType": "barChart",
                "query": {
                    "Commands": [{
                        "SemanticQueryDataShapeCommand": {
                            "Query": {
                                "Version": 2,
                                "From": [{"Name": "s", "Entity": "Sales", "Type": 0}],
                                "Select": [{
                                    "Column": {
                                        "Expression": {"SourceRef": {"Source": "s"}},
                                        "Property": "Region",
                                    },
                                    "Name": "s.Region",
                                }],
                            }
                        }
                    }],
                },
            },
        })
        model = [{"name": "Sales", "columns": [{"name": "Region"}], "measures": []}]
        findings = validate_bindings_against_model(valid_report, model)
        assert len(findings) == 0

    def test_invalid_binding_warns(self, valid_report: Path) -> None:
        vis_path = valid_report / "pages" / "page1" / "visuals" / "vis1" / "visual.json"
        _write(vis_path, {
            "$schema": "...",
            "name": "vis1",
            "position": {"x": 0, "y": 0, "width": 400, "height": 300},
            "visual": {
                "visualType": "barChart",
                "query": {
                    "Commands": [{
                        "SemanticQueryDataShapeCommand": {
                            "Query": {
                                "Version": 2,
                                "From": [{"Name": "s", "Entity": "Sales", "Type": 0}],
                                "Select": [{
                                    "Column": {
                                        "Expression": {"SourceRef": {"Source": "s"}},
                                        "Property": "NonExistent",
                                    },
                                    "Name": "s.NonExistent",
                                }],
                            }
                        }
                    }],
                },
            },
        })
        model = [{"name": "Sales", "columns": [{"name": "Region"}], "measures": []}]
        findings = validate_bindings_against_model(valid_report, model)
        assert len(findings) == 1
        assert findings[0].level == "warning"
        assert "NonExistent" in findings[0].message

    def test_measure_binding(self, valid_report: Path) -> None:
        vis_path = valid_report / "pages" / "page1" / "visuals" / "vis1" / "visual.json"
        _write(vis_path, {
            "$schema": "...",
            "name": "vis1",
            "position": {"x": 0, "y": 0, "width": 400, "height": 300},
            "visual": {
                "visualType": "card",
                "query": {
                    "Commands": [{
                        "SemanticQueryDataShapeCommand": {
                            "Query": {
                                "Version": 2,
                                "From": [{"Name": "s", "Entity": "Sales", "Type": 0}],
                                "Select": [{
                                    "Measure": {
                                        "Expression": {"SourceRef": {"Source": "s"}},
                                        "Property": "Total Revenue",
                                    },
                                    "Name": "s.Total Revenue",
                                }],
                            }
                        }
                    }],
                },
            },
        })
        model = [{"name": "Sales", "columns": [], "measures": [{"name": "Total Revenue"}]}]
        findings = validate_bindings_against_model(valid_report, model)
        assert len(findings) == 0

    def test_no_commands_is_ok(self, valid_report: Path) -> None:
        findings = validate_bindings_against_model(
            valid_report,
            [{"name": "Sales", "columns": [], "measures": []}],
        )
        assert len(findings) == 0
198  tests/test_preview.py  Normal file
@@ -0,0 +1,198 @@
"""Tests for PBIR preview renderer and file watcher."""

from __future__ import annotations

import json
import threading
import time
from pathlib import Path

import pytest

from pbi_cli.preview.renderer import render_page, render_report
from pbi_cli.preview.watcher import PbirWatcher


def _write(path: Path, data: dict) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")


@pytest.fixture
def preview_report(tmp_path: Path) -> Path:
    """Build a PBIR report suitable for preview rendering."""
    defn = tmp_path / "Test.Report" / "definition"
    defn.mkdir(parents=True)

    _write(defn / "report.json", {
        "$schema": "...",
        "themeCollection": {"baseTheme": {"name": "CY24SU06"}},
        "layoutOptimization": "Disabled",
    })
    _write(defn / "version.json", {"$schema": "...", "version": "1.0.0"})
    _write(defn / "pages" / "pages.json", {
        "$schema": "...",
        "pageOrder": ["overview"],
    })

    page_dir = defn / "pages" / "overview"
    page_dir.mkdir(parents=True)
    _write(page_dir / "page.json", {
        "$schema": "...",
        "name": "overview",
        "displayName": "Executive Overview",
        "displayOption": "FitToPage",
        "width": 1280,
        "height": 720,
        "ordinal": 0,
    })

    # Bar chart visual
    bar_dir = page_dir / "visuals" / "bar1"
    bar_dir.mkdir(parents=True)
    _write(bar_dir / "visual.json", {
        "$schema": "...",
        "name": "bar1",
        "position": {"x": 50, "y": 50, "width": 400, "height": 300, "z": 0},
        "visual": {
            "visualType": "barChart",
            "query": {
                "queryState": {
                    "Category": {"projections": [{"queryRef": "g.Region", "field": {}}]},
                    "Y": {"projections": [{"queryRef": "s.Amount", "field": {}}]},
                },
            },
        },
    })

    # Card visual
    card_dir = page_dir / "visuals" / "card1"
    card_dir.mkdir(parents=True)
    _write(card_dir / "visual.json", {
        "$schema": "...",
        "name": "card1",
        "position": {"x": 500, "y": 50, "width": 200, "height": 120, "z": 1},
        "visual": {
            "visualType": "card",
            "query": {
                "queryState": {
                    "Fields": {"projections": [{"queryRef": "s.Revenue", "field": {}}]},
                },
            },
        },
    })

    return defn


class TestRenderReport:
    def test_renders_html(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "<!DOCTYPE html>" in html

    def test_includes_theme(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "CY24SU06" in html

    def test_includes_page_title(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "Executive Overview" in html

    def test_includes_visual_types(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "barChart" in html
        assert "card" in html

    def test_includes_bar_chart_svg(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "<rect" in html  # Bar chart renders SVG rects

    def test_includes_card_value(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "card-value" in html

    def test_includes_binding_refs(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "g.Region" in html or "s.Amount" in html

    def test_includes_websocket_script(self, preview_report: Path) -> None:
        html = render_report(preview_report)
        assert "WebSocket" in html

    def test_empty_report(self, tmp_path: Path) -> None:
        defn = tmp_path / "Empty.Report" / "definition"
        defn.mkdir(parents=True)
        _write(defn / "report.json", {
            "$schema": "...",
            "themeCollection": {"baseTheme": {"name": "Default"}},
            "layoutOptimization": "Disabled",
        })
        html = render_report(defn)
        assert "No pages" in html
|
||||
|
||||
class TestRenderPage:
|
||||
def test_renders_single_page(self, preview_report: Path) -> None:
|
||||
html = render_page(preview_report, "overview")
|
||||
assert "Executive Overview" in html
|
||||
assert "barChart" in html
|
||||
|
||||
def test_page_not_found(self, preview_report: Path) -> None:
|
||||
html = render_page(preview_report, "nonexistent")
|
||||
assert "not found" in html
|
||||
|
||||
|
||||
class TestPbirWatcher:
|
||||
def test_detects_file_change(self, preview_report: Path) -> None:
|
||||
changes: list[bool] = []
|
||||
|
||||
def on_change() -> None:
|
||||
changes.append(True)
|
||||
|
||||
watcher = PbirWatcher(preview_report, on_change, interval=0.1)
|
||||
|
||||
# Start watcher in background
|
||||
thread = threading.Thread(target=watcher.start, daemon=True)
|
||||
thread.start()
|
||||
|
||||
# Wait for initial snapshot
|
||||
time.sleep(0.3)
|
||||
|
||||
# Modify a file
|
||||
report_json = preview_report / "report.json"
|
||||
data = json.loads(report_json.read_text(encoding="utf-8"))
|
||||
data["layoutOptimization"] = "Mobile"
|
||||
report_json.write_text(json.dumps(data), encoding="utf-8")
|
||||
|
||||
# Wait for detection
|
||||
time.sleep(0.5)
|
||||
watcher.stop()
|
||||
thread.join(timeout=2)
|
||||
|
||||
assert len(changes) >= 1
|
||||
|
||||
def test_no_false_positives(self, preview_report: Path) -> None:
|
||||
changes: list[bool] = []
|
||||
|
||||
def on_change() -> None:
|
||||
changes.append(True)
|
||||
|
||||
watcher = PbirWatcher(preview_report, on_change, interval=0.1)
|
||||
thread = threading.Thread(target=watcher.start, daemon=True)
|
||||
thread.start()
|
||||
|
||||
# Wait without changing anything
|
||||
time.sleep(0.5)
|
||||
watcher.stop()
|
||||
thread.join(timeout=2)
|
||||
|
||||
assert len(changes) == 0
|
||||
|
||||
def test_stop_terminates(self, preview_report: Path) -> None:
|
||||
watcher = PbirWatcher(preview_report, lambda: None, interval=0.1)
|
||||
thread = threading.Thread(target=watcher.start, daemon=True)
|
||||
thread.start()
|
||||
time.sleep(0.2)
|
||||
watcher.stop()
|
||||
thread.join(timeout=2)
|
||||
assert not thread.is_alive()
|
||||
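The watcher tests above only exercise behaviour: start in a thread, edit a file, expect the callback, and expect silence when nothing changes. `PbirWatcher`'s implementation is not included in this diff; a stdlib-only sketch of the same polling pattern (a hypothetical `PollingWatcher`, not pbi-cli's class) could look like:

```python
import threading
import time
from pathlib import Path
from typing import Callable


class PollingWatcher:
    """Poll a directory tree and invoke a callback when any file changes.

    Illustrative sketch only -- not pbi-cli's actual PbirWatcher.
    """

    def __init__(self, root: Path, on_change: Callable[[], None], interval: float = 0.1) -> None:
        self.root = root
        self.on_change = on_change
        self.interval = interval
        self._stop = threading.Event()

    def _snapshot(self) -> dict[str, tuple[float, int]]:
        # Record (mtime, size) per file; a change in either signals an edit.
        return {
            str(p): (p.stat().st_mtime, p.stat().st_size)
            for p in sorted(self.root.rglob("*"))
            if p.is_file()
        }

    def start(self) -> None:
        last = self._snapshot()
        while not self._stop.is_set():
            time.sleep(self.interval)
            current = self._snapshot()
            if current != last:
                last = current
                self.on_change()

    def stop(self) -> None:
        self._stop.set()
```

Polling keeps the sketch cross-platform and dependency-free; a production watcher would more likely use OS file notifications (e.g. the `watchdog` package) to avoid the latency/CPU trade-off that `interval` imposes.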
1112 tests/test_report_backend.py (Normal file)
File diff suppressed because it is too large.

109 tests/test_skill_triggering.py (Normal file)
@@ -0,0 +1,109 @@
"""Skill triggering evaluation -- verify prompts match expected skills.
|
||||
|
||||
This is NOT a pytest test. Run directly:
|
||||
python tests/test_skill_triggering.py
|
||||
|
||||
Uses keyword-based scoring to simulate which skill description best matches
|
||||
each user prompt, without requiring an LLM call.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import importlib.resources
|
||||
import re
|
||||
|
||||
import yaml
|
||||
|
||||
|
||||
def _load_skills() -> dict[str, str]:
|
||||
"""Load all skill names and descriptions from bundled skills."""
|
||||
skills_pkg = importlib.resources.files("pbi_cli.skills")
|
||||
skills: dict[str, str] = {}
|
||||
for item in skills_pkg.iterdir():
|
||||
if item.is_dir() and (item / "SKILL.md").is_file():
|
||||
content = (item / "SKILL.md").read_text(encoding="utf-8")
|
||||
match = re.match(r"^---\n(.*?)\n---", content, re.DOTALL)
|
||||
if match:
|
||||
fm = yaml.safe_load(match.group(1))
|
||||
skills[item.name] = fm.get("description", "").lower()
|
||||
return skills
|
||||
|
||||
|
||||
def _score_prompt(prompt: str, description: str) -> int:
|
||||
"""Score how well a prompt matches a skill description using word overlap."""
|
||||
prompt_words = set(re.findall(r"[a-z]+", prompt.lower()))
|
||||
desc_words = set(re.findall(r"[a-z]+", description))
|
||||
# Weight longer matching words higher (domain terms matter more)
|
||||
score = 0
|
||||
for word in prompt_words & desc_words:
|
||||
if len(word) >= 5:
|
||||
score += 3
|
||||
elif len(word) >= 3:
|
||||
score += 1
|
||||
return score
|
||||
|
||||
|
||||
def _find_best_skill(prompt: str, skills: dict[str, str]) -> str:
|
||||
"""Find the skill with the highest keyword overlap score."""
|
||||
scores = {name: _score_prompt(prompt, desc) for name, desc in skills.items()}
|
||||
return max(scores, key=lambda k: scores[k])
|
||||
|
||||
|
||||
# Test cases: (prompt, expected_skill)
|
||||
TEST_CASES: list[tuple[str, str]] = [
|
||||
# power-bi-visuals
|
||||
("Add a bar chart to the overview page showing sales by region", "power-bi-visuals"),
|
||||
("I need to bind Sales[Revenue] to the value field on my KPI visual", "power-bi-visuals"),
|
||||
("What visual types does pbi-cli support? I need a scatter plot", "power-bi-visuals"),
|
||||
("Resize all the card visuals on the dashboard page to 200x120", "power-bi-visuals"),
|
||||
# power-bi-pages
|
||||
("Add a new page called Regional Detail to my report", "power-bi-pages"),
|
||||
("Hide the drillthrough page from the navigation bar", "power-bi-pages"),
|
||||
("Create a bookmark for the current executive view", "power-bi-pages"),
|
||||
# power-bi-themes
|
||||
("Apply our corporate brand colours to the entire report", "power-bi-themes"),
|
||||
(
|
||||
"I want conditional formatting on the revenue column green for high red for low",
|
||||
"power-bi-themes",
|
||||
),
|
||||
("Compare this new theme JSON against what is currently applied", "power-bi-themes"),
|
||||
# power-bi-filters
|
||||
("Filter the overview page to show only the top 10 products by revenue", "power-bi-filters"),
|
||||
("Add a date filter for the last 30 days on the Sales page", "power-bi-filters"),
|
||||
("What filters are currently on my dashboard page", "power-bi-filters"),
|
||||
# power-bi-report
|
||||
("Create a new PBIR report project for our sales dashboard", "power-bi-report"),
|
||||
("Validate the report structure to make sure everything is correct", "power-bi-report"),
|
||||
("Start the preview server so I can see the layout", "power-bi-report"),
|
||||
# Should NOT trigger report skills
|
||||
("Create a measure called Total Revenue equals SUM of Sales Amount", "power-bi-modeling"),
|
||||
("Export the semantic model to TMDL for version control", "power-bi-deployment"),
|
||||
("Set up row-level security for regional managers", "power-bi-security"),
|
||||
]
|
||||
|
||||
|
||||
def main() -> None:
|
||||
skills = _load_skills()
|
||||
passed = 0
|
||||
failed = 0
|
||||
|
||||
print(f"Testing {len(TEST_CASES)} prompts against {len(skills)} skills\n")
|
||||
print(f"{'#':<3} {'Result':<6} {'Expected':<22} {'Got':<22} Prompt")
|
||||
print("-" * 100)
|
||||
|
||||
for i, (prompt, expected) in enumerate(TEST_CASES, 1):
|
||||
got = _find_best_skill(prompt, skills)
|
||||
ok = got == expected
|
||||
status = "PASS" if ok else "FAIL"
|
||||
if ok:
|
||||
passed += 1
|
||||
else:
|
||||
failed += 1
|
||||
short_prompt = prompt[:45] + "..." if len(prompt) > 45 else prompt
|
||||
print(f"{i:<3} {status:<6} {expected:<22} {got:<22} {short_prompt}")
|
||||
|
||||
print(f"\n{passed}/{len(TEST_CASES)} passed, {failed} failed")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
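The scoring heuristic in that evaluation script can be exercised standalone. The sketch below restates the word-overlap idea in a self-contained form; the two skill descriptions are hypothetical placeholders, not the real bundled SKILL.md text:

```python
import re


def score(prompt: str, description: str) -> int:
    """Word-overlap score: matching words of >= 5 chars count 3, >= 3 chars count 1."""
    prompt_words = set(re.findall(r"[a-z]+", prompt.lower()))
    desc_words = set(re.findall(r"[a-z]+", description.lower()))
    return sum(3 if len(w) >= 5 else 1 for w in prompt_words & desc_words if len(w) >= 3)


# Hypothetical skill descriptions for illustration (not the bundled SKILL.md text).
SKILLS = {
    "power-bi-visuals": "add bind resize and style visuals such as bar charts cards and kpis",
    "power-bi-filters": "add list and remove page and visual filter definitions including top n and relative date filters",
}

prompt = "Filter the overview page to show only the top 10 products"
best = max(SKILLS, key=lambda name: score(prompt, SKILLS[name]))
```

Here "filter" (long, so weighted 3) plus "page" and "top" push `power-bi-filters` ahead of `power-bi-visuals`, which shares no words with the prompt; long domain terms dominating the score is exactly why the weighting by word length exists.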
386 tests/test_tmdl_diff.py (Normal file)
@@ -0,0 +1,386 @@
"""Tests for pbi_cli.core.tmdl_diff."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
import pytest
|
||||
|
||||
from pbi_cli.core.errors import PbiCliError
|
||||
from pbi_cli.core.tmdl_diff import diff_tmdl_folders
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Fixture helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
_MODEL_TMDL = """\
|
||||
model Model
|
||||
\tculture: en-US
|
||||
\tdefaultPowerBIDataSourceVersion: powerBI_V3
|
||||
\tsourceQueryCulture: en-US
|
||||
|
||||
ref table Sales
|
||||
ref cultureInfo en-US
|
||||
"""
|
||||
|
||||
_RELATIONSHIPS_TMDL = """\
|
||||
relationship abc-def-111
|
||||
\tlineageTag: xyz
|
||||
\tfromColumn: Sales.ProductID
|
||||
\ttoColumn: Product.ProductID
|
||||
|
||||
relationship abc-def-222
|
||||
\tfromColumn: Sales.CustomerID
|
||||
\ttoColumn: Customer.CustomerID
|
||||
"""
|
||||
|
||||
_SALES_TMDL = """\
|
||||
table Sales
|
||||
\tlineageTag: tbl-001
|
||||
|
||||
\tmeasure 'Total Revenue' = SUM(Sales[Amount])
|
||||
\t\tformatString: "$#,0"
|
||||
\t\tlineageTag: msr-001
|
||||
|
||||
\tcolumn Amount
|
||||
\t\tdataType: decimal
|
||||
\t\tlineageTag: col-001
|
||||
\t\tsummarizeBy: sum
|
||||
\t\tsourceColumn: Amount
|
||||
|
||||
\tpartition Sales = m
|
||||
\t\tmode: import
|
||||
\t\tsource
|
||||
\t\t\tlet
|
||||
\t\t\t Source = Csv.Document(...)
|
||||
\t\t\tin
|
||||
\t\t\t Source
|
||||
"""
|
||||
|
||||
_DATE_TMDL = """\
|
||||
table Date
|
||||
\tlineageTag: tbl-002
|
||||
|
||||
\tcolumn Date
|
||||
\t\tdataType: dateTime
|
||||
\t\tlineageTag: col-002
|
||||
\t\tsummarizeBy: none
|
||||
\t\tsourceColumn: Date
|
||||
"""
|
||||
|
||||
# Inline TMDL snippets reused across multiple tests
|
||||
_NEW_MEASURE_SNIPPET = (
|
||||
"\n\tmeasure 'YTD Revenue'"
|
||||
" = CALCULATE([Total Revenue], DATESYTD('Date'[Date]))"
|
||||
"\n\t\tlineageTag: msr-new\n"
|
||||
)
|
||||
_TOTAL_REVENUE_BLOCK = (
|
||||
"\n\tmeasure 'Total Revenue' = SUM(Sales[Amount])"
|
||||
'\n\t\tformatString: "$#,0"'
|
||||
"\n\t\tlineageTag: msr-001\n"
|
||||
)
|
||||
_NEW_COL_SNIPPET = (
|
||||
"\n\tcolumn Region"
|
||||
"\n\t\tdataType: string"
|
||||
"\n\t\tsummarizeBy: none"
|
||||
"\n\t\tsourceColumn: Region\n"
|
||||
)
|
||||
_AMOUNT_COL_BLOCK = (
|
||||
"\n\tcolumn Amount"
|
||||
"\n\t\tdataType: decimal"
|
||||
"\n\t\tlineageTag: col-001"
|
||||
"\n\t\tsummarizeBy: sum"
|
||||
"\n\t\tsourceColumn: Amount\n"
|
||||
)
|
||||
_NEW_REL_SNIPPET = (
|
||||
"\nrelationship abc-def-999"
|
||||
"\n\tfromColumn: Sales.RegionID"
|
||||
"\n\ttoColumn: Region.ID\n"
|
||||
)
|
||||
_TRIMMED_RELS = (
|
||||
"relationship abc-def-111"
|
||||
"\n\tfromColumn: Sales.ProductID"
|
||||
"\n\ttoColumn: Product.ProductID\n"
|
||||
)
|
||||
_REL_222_BASE = (
|
||||
"relationship abc-def-222"
|
||||
"\n\tfromColumn: Sales.CustomerID"
|
||||
"\n\ttoColumn: Customer.CustomerID"
|
||||
)
|
||||
_REL_222_CHANGED = (
|
||||
"relationship abc-def-222"
|
||||
"\n\tfromColumn: Sales.CustomerID"
|
||||
"\n\ttoColumn: Customer.CustomerID"
|
||||
"\n\tcrossFilteringBehavior: bothDirections"
|
||||
)
|
||||
|
||||
|
||||
def _make_tmdl_folder(
|
||||
root: Path,
|
||||
*,
|
||||
model_text: str = _MODEL_TMDL,
|
||||
relationships_text: str = _RELATIONSHIPS_TMDL,
|
||||
tables: dict[str, str] | None = None,
|
||||
) -> Path:
|
||||
"""Create a minimal TMDL folder under root and return its path."""
|
||||
if tables is None:
|
||||
tables = {"Sales": _SALES_TMDL, "Date": _DATE_TMDL}
|
||||
root.mkdir(parents=True, exist_ok=True)
|
||||
(root / "model.tmdl").write_text(model_text, encoding="utf-8")
|
||||
(root / "database.tmdl").write_text("database\n\tcompatibilityLevel: 1600\n", encoding="utf-8")
|
||||
(root / "relationships.tmdl").write_text(relationships_text, encoding="utf-8")
|
||||
tables_dir = root / "tables"
|
||||
tables_dir.mkdir()
|
||||
for name, text in tables.items():
|
||||
(tables_dir / f"{name}.tmdl").write_text(text, encoding="utf-8")
|
||||
return root
|
||||
|
||||
|
||||
def _make_semantic_model_folder(
|
||||
root: Path,
|
||||
**kwargs: Any,
|
||||
) -> Path:
|
||||
"""Create a SemanticModel-layout folder (definition/ subdirectory)."""
|
||||
root.mkdir(parents=True, exist_ok=True)
|
||||
defn_dir = root / "definition"
|
||||
defn_dir.mkdir()
|
||||
_make_tmdl_folder(defn_dir, **kwargs)
|
||||
(root / ".platform").write_text("{}", encoding="utf-8")
|
||||
return root
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Tests
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
class TestDiffTmdlFolders:
|
||||
def test_identical_folders_returns_no_changes(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head")
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is False
|
||||
assert result["summary"]["tables_added"] == 0
|
||||
assert result["summary"]["tables_removed"] == 0
|
||||
assert result["summary"]["tables_changed"] == 0
|
||||
|
||||
def test_lineage_tag_only_change_is_not_reported(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
changed_sales = _SALES_TMDL.replace("tbl-001", "NEW-TAG").replace("msr-001", "NEW-MSR")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": changed_sales, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is False
|
||||
|
||||
def test_table_added(self, tmp_path: Path) -> None:
|
||||
product_tmdl = "table Product\n\tlineageTag: tbl-003\n\n\tcolumn ID\n\t\tdataType: int64\n"
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": _SALES_TMDL, "Date": _DATE_TMDL, "Product": product_tmdl},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is True
|
||||
assert "Product" in result["tables"]["added"]
|
||||
assert result["tables"]["removed"] == []
|
||||
|
||||
def test_table_removed(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head", tables={"Sales": _SALES_TMDL})
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert "Date" in result["tables"]["removed"]
|
||||
|
||||
def test_measure_added(self, tmp_path: Path) -> None:
|
||||
modified_sales = _SALES_TMDL + _NEW_MEASURE_SNIPPET
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": modified_sales, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is True
|
||||
sales_diff = result["tables"]["changed"]["Sales"]
|
||||
assert "YTD Revenue" in sales_diff["measures_added"]
|
||||
|
||||
def test_measure_removed(self, tmp_path: Path) -> None:
|
||||
stripped_sales = _SALES_TMDL.replace(_TOTAL_REVENUE_BLOCK, "")
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": stripped_sales, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
sales_diff = result["tables"]["changed"]["Sales"]
|
||||
assert "Total Revenue" in sales_diff["measures_removed"]
|
||||
|
||||
def test_measure_expression_changed(self, tmp_path: Path) -> None:
|
||||
modified_sales = _SALES_TMDL.replace(
|
||||
"measure 'Total Revenue' = SUM(Sales[Amount])",
|
||||
"measure 'Total Revenue' = SUMX(Sales, Sales[Amount] * Sales[Qty])",
|
||||
)
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": modified_sales, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
sales_diff = result["tables"]["changed"]["Sales"]
|
||||
assert "Total Revenue" in sales_diff["measures_changed"]
|
||||
|
||||
def test_column_added(self, tmp_path: Path) -> None:
|
||||
modified_sales = _SALES_TMDL + _NEW_COL_SNIPPET
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": modified_sales, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
sales_diff = result["tables"]["changed"]["Sales"]
|
||||
assert "Region" in sales_diff["columns_added"]
|
||||
|
||||
def test_column_removed(self, tmp_path: Path) -> None:
|
||||
stripped = _SALES_TMDL.replace(_AMOUNT_COL_BLOCK, "")
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
tables={"Sales": stripped, "Date": _DATE_TMDL},
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
sales_diff = result["tables"]["changed"]["Sales"]
|
||||
assert "Amount" in sales_diff["columns_removed"]
|
||||
|
||||
def test_relationship_added(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(
|
||||
tmp_path / "head",
|
||||
relationships_text=_RELATIONSHIPS_TMDL + _NEW_REL_SNIPPET,
|
||||
)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert "Sales.RegionID -> Region.ID" in result["relationships"]["added"]
|
||||
|
||||
def test_relationship_removed(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head", relationships_text=_TRIMMED_RELS)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert "Sales.CustomerID -> Customer.CustomerID" in result["relationships"]["removed"]
|
||||
|
||||
def test_relationship_changed(self, tmp_path: Path) -> None:
|
||||
changed_rels = _RELATIONSHIPS_TMDL.replace(_REL_222_BASE, _REL_222_CHANGED)
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head", relationships_text=changed_rels)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert "Sales.CustomerID -> Customer.CustomerID" in result["relationships"]["changed"]
|
||||
|
||||
def test_model_property_changed(self, tmp_path: Path) -> None:
|
||||
changed_model = _MODEL_TMDL.replace("culture: en-US", "culture: fr-FR")
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head", model_text=changed_model)
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["summary"]["model_changed"] is True
|
||||
assert any("culture" in p for p in result["model"]["changed_properties"])
|
||||
|
||||
def test_semantic_model_layout(self, tmp_path: Path) -> None:
|
||||
"""Handles the SemanticModel folder layout (definition/ subdirectory)."""
|
||||
base = _make_semantic_model_folder(tmp_path / "MyModel.SemanticModel.base")
|
||||
head = _make_semantic_model_folder(tmp_path / "MyModel.SemanticModel.head")
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is False
|
||||
|
||||
def test_missing_base_folder_raises(self, tmp_path: Path) -> None:
|
||||
head = _make_tmdl_folder(tmp_path / "head")
|
||||
with pytest.raises(PbiCliError, match="Base folder not found"):
|
||||
diff_tmdl_folders(str(tmp_path / "nonexistent"), str(head))
|
||||
|
||||
def test_missing_head_folder_raises(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
with pytest.raises(PbiCliError, match="Head folder not found"):
|
||||
diff_tmdl_folders(str(base), str(tmp_path / "nonexistent"))
|
||||
|
||||
def test_result_keys_present(self, tmp_path: Path) -> None:
|
||||
base = _make_tmdl_folder(tmp_path / "base")
|
||||
head = _make_tmdl_folder(tmp_path / "head")
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert "base" in result
|
||||
assert "head" in result
|
||||
assert "changed" in result
|
||||
assert "summary" in result
|
||||
assert "tables" in result
|
||||
assert "relationships" in result
|
||||
assert "model" in result
|
||||
|
||||
def test_no_relationships_file(self, tmp_path: Path) -> None:
|
||||
"""Handles missing relationships.tmdl gracefully."""
|
||||
base = _make_tmdl_folder(tmp_path / "base", relationships_text="")
|
||||
head = _make_tmdl_folder(tmp_path / "head", relationships_text="")
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["relationships"] == {"added": [], "removed": [], "changed": []}
|
||||
|
||||
def test_backtick_fenced_measure_parsed_correctly(self, tmp_path: Path) -> None:
|
||||
"""Backtick-triple fenced multi-line measures are parsed without errors."""
|
||||
backtick_sales = (
|
||||
"table Sales\n"
|
||||
"\tlineageTag: tbl-001\n"
|
||||
"\n"
|
||||
"\tmeasure CY_Orders = ```\n"
|
||||
"\t\t\n"
|
||||
"\t\tCALCULATE ( [#Orders] , YEAR('Date'[Date]) = YEAR(TODAY()) )\n"
|
||||
"\t\t```\n"
|
||||
"\t\tformatString: 0\n"
|
||||
"\t\tlineageTag: msr-backtick\n"
|
||||
"\n"
|
||||
"\tcolumn Amount\n"
|
||||
"\t\tdataType: decimal\n"
|
||||
"\t\tlineageTag: col-001\n"
|
||||
"\t\tsummarizeBy: sum\n"
|
||||
"\t\tsourceColumn: Amount\n"
|
||||
)
|
||||
base = _make_tmdl_folder(tmp_path / "base", tables={"Sales": backtick_sales})
|
||||
head = _make_tmdl_folder(tmp_path / "head", tables={"Sales": backtick_sales})
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is False
|
||||
|
||||
def test_backtick_fenced_measure_expression_changed(self, tmp_path: Path) -> None:
|
||||
"""A changed backtick-fenced measure expression is detected."""
|
||||
base_tmdl = (
|
||||
"table Sales\n"
|
||||
"\tlineageTag: tbl-001\n"
|
||||
"\n"
|
||||
"\tmeasure CY_Orders = ```\n"
|
||||
"\t\tCALCULATE ( [#Orders] , YEAR('Date'[Date]) = YEAR(TODAY()) )\n"
|
||||
"\t\t```\n"
|
||||
"\t\tlineageTag: msr-backtick\n"
|
||||
)
|
||||
head_tmdl = base_tmdl.replace(
|
||||
"CALCULATE ( [#Orders] , YEAR('Date'[Date]) = YEAR(TODAY()) )",
|
||||
"CALCULATE ( [#Orders] , 'Date'[Year] = YEAR(TODAY()) )",
|
||||
)
|
||||
base = _make_tmdl_folder(tmp_path / "base", tables={"Sales": base_tmdl})
|
||||
head = _make_tmdl_folder(tmp_path / "head", tables={"Sales": head_tmdl})
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is True
|
||||
assert "CY_Orders" in result["tables"]["changed"]["Sales"]["measures_changed"]
|
||||
|
||||
def test_variation_stays_inside_column_block(self, tmp_path: Path) -> None:
|
||||
"""Variation blocks at 1-tab indent are part of their parent column."""
|
||||
tmdl_with_variation = (
|
||||
"table Date\n"
|
||||
"\tlineageTag: tbl-date\n"
|
||||
"\n"
|
||||
"\tcolumn Date\n"
|
||||
"\t\tdataType: dateTime\n"
|
||||
"\t\tlineageTag: col-date\n"
|
||||
"\t\tsummarizeBy: none\n"
|
||||
"\t\tsourceColumn: Date\n"
|
||||
"\n"
|
||||
"\tvariation Variation\n"
|
||||
"\t\tisDefault\n"
|
||||
"\t\trelationship: abc-def-123\n"
|
||||
"\t\tdefaultHierarchy: LocalDateTable.Date Hierarchy\n"
|
||||
)
|
||||
base = _make_tmdl_folder(tmp_path / "base", tables={"Date": tmdl_with_variation})
|
||||
head = _make_tmdl_folder(tmp_path / "head", tables={"Date": tmdl_with_variation})
|
||||
result = diff_tmdl_folders(str(base), str(head))
|
||||
assert result["changed"] is False
|
||||
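The lineageTag behaviour exercised by `test_lineage_tag_only_change_is_not_reported` can be illustrated standalone. The sketch below shows one way such filtering might work (a stdlib illustration of the idea, not `pbi_cli.core.tmdl_diff`'s actual implementation): drop `lineageTag:` lines from both sides before comparing, so tag-only regeneration noise never registers as a diff while real changes still do.

```python
import re


def strip_lineage_tags(tmdl: str) -> str:
    """Remove lineageTag lines so tag-only regenerations compare as identical."""
    return "\n".join(
        line for line in tmdl.splitlines() if not re.match(r"\s*lineageTag:", line)
    )


base = "table Sales\n\tlineageTag: tbl-001\n\n\tcolumn Amount\n\t\tdataType: decimal\n"
tag_only = base.replace("tbl-001", "NEW-TAG")
real_change = base.replace("dataType: decimal", "dataType: int64")

tag_only_differs = strip_lineage_tags(base) != strip_lineage_tags(tag_only)
real_differs = strip_lineage_tags(base) != strip_lineage_tags(real_change)
```

Normalising both inputs before diffing, rather than post-filtering the diff output, is the simpler design: every downstream comparison (tables, measures, relationships) gets the false-positive suppression for free.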
1067 tests/test_visual_backend.py (Normal file)
File diff suppressed because it is too large.

340 tests/test_visual_calc.py (Normal file)
@@ -0,0 +1,340 @@
"""Tests for visual calculation functions in pbi_cli.core.visual_backend.
|
||||
|
||||
Covers visual_calc_add, visual_calc_list, visual_calc_delete against a minimal
|
||||
in-memory PBIR directory tree.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
import pytest
|
||||
|
||||
from pbi_cli.core.errors import PbiCliError
|
||||
from pbi_cli.core.visual_backend import (
|
||||
visual_calc_add,
|
||||
visual_calc_delete,
|
||||
visual_calc_list,
|
||||
)
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Fixture helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def _write_json(path: Path, data: dict[str, Any]) -> None:
|
||||
path.write_text(json.dumps(data, indent=2), encoding="utf-8")
|
||||
|
||||
|
||||
def _read_json(path: Path) -> dict[str, Any]:
|
||||
return json.loads(path.read_text(encoding="utf-8"))
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def visual_on_page(tmp_path: Path) -> tuple[Path, str, str]:
|
||||
"""Build a minimal PBIR definition folder with one page and one visual.
|
||||
|
||||
Returns (definition_path, page_name, visual_name).
|
||||
|
||||
The visual has a minimal barChart structure with an empty Y queryState role.
|
||||
"""
|
||||
definition = tmp_path / "definition"
|
||||
definition.mkdir()
|
||||
|
||||
pages_dir = definition / "pages"
|
||||
pages_dir.mkdir()
|
||||
|
||||
page_dir = pages_dir / "test_page"
|
||||
page_dir.mkdir()
|
||||
|
||||
visuals_dir = page_dir / "visuals"
|
||||
visuals_dir.mkdir()
|
||||
|
||||
visual_dir = visuals_dir / "myvisual"
|
||||
visual_dir.mkdir()
|
||||
|
||||
_write_json(
|
||||
visual_dir / "visual.json",
|
||||
{
|
||||
"name": "myvisual",
|
||||
"position": {"x": 0, "y": 0, "width": 400, "height": 300, "z": 0},
|
||||
"visual": {
|
||||
"visualType": "barChart",
|
||||
"query": {
|
||||
"queryState": {
|
||||
"Y": {
|
||||
"projections": [
|
||||
{
|
||||
"field": {
|
||||
"Measure": {
|
||||
"Expression": {"SourceRef": {"Entity": "Sales"}},
|
||||
"Property": "Amount",
|
||||
}
|
||||
},
|
||||
"queryRef": "Sales.Amount",
|
||||
"nativeQueryRef": "Amount",
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
},
|
||||
},
|
||||
)
|
||||
|
||||
return definition, "test_page", "myvisual"
|
||||
|
||||
|
||||
def _vfile(definition: Path, page: str, visual: str) -> Path:
|
||||
return definition / "pages" / page / "visuals" / visual / "visual.json"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# 1. visual_calc_add -- adds projection to role
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_visual_calc_add_appends_projection(
|
||||
visual_on_page: tuple[Path, str, str],
|
||||
) -> None:
|
||||
"""visual_calc_add appends a NativeVisualCalculation projection to the role."""
|
||||
definition, page, visual = visual_on_page
|
||||
|
||||
visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sum of Sales])")
|
||||
|
||||
data = _read_json(_vfile(definition, page, visual))
|
||||
projections = data["visual"]["query"]["queryState"]["Y"]["projections"]
|
||||
# Original measure projection plus the new calc
|
||||
assert len(projections) == 2
|
||||
last = projections[-1]
|
||||
assert "NativeVisualCalculation" in last["field"]
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# 2. Correct NativeVisualCalculation structure
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_visual_calc_add_correct_structure(
|
||||
visual_on_page: tuple[Path, str, str],
|
||||
) -> None:
|
||||
"""Added projection has correct NativeVisualCalculation fields."""
|
||||
definition, page, visual = visual_on_page
|
||||
|
||||
visual_calc_add(
|
||||
definition, page, visual, "Running sum", "RUNNINGSUM([Sum of Sales])", role="Y"
|
||||
)
|
||||
|
||||
data = _read_json(_vfile(definition, page, visual))
|
||||
projections = data["visual"]["query"]["queryState"]["Y"]["projections"]
|
||||
nvc_proj = next(
|
||||
p for p in projections if "NativeVisualCalculation" in p.get("field", {})
|
||||
)
|
||||
nvc = nvc_proj["field"]["NativeVisualCalculation"]
|
||||
|
||||
assert nvc["Language"] == "dax"
|
||||
assert nvc["Expression"] == "RUNNINGSUM([Sum of Sales])"
|
||||
assert nvc["Name"] == "Running sum"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# 3. queryRef is "select", nativeQueryRef equals calc_name
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_visual_calc_add_query_refs(
|
||||
visual_on_page: tuple[Path, str, str],
|
||||
) -> None:
|
||||
"""queryRef is always 'select' and nativeQueryRef equals the calc name."""
|
||||
definition, page, visual = visual_on_page
|
||||
|
||||
visual_calc_add(definition, page, visual, "My Calc", "RANK()")
|
||||
|
||||
data = _read_json(_vfile(definition, page, visual))
|
||||
projections = data["visual"]["query"]["queryState"]["Y"]["projections"]
|
||||
nvc_proj = next(
|
||||
p for p in projections if "NativeVisualCalculation" in p.get("field", {})
|
||||
)
|
||||
|
||||
assert nvc_proj["queryRef"] == "select"
|
||||
assert nvc_proj["nativeQueryRef"] == "My Calc"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# 4. visual_calc_list returns [] before any calcs added
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_visual_calc_list_empty_before_add(
|
||||
visual_on_page: tuple[Path, str, str],
|
||||
) -> None:
|
||||
"""visual_calc_list returns an empty list when no calcs have been added."""
|
||||
definition, page, visual = visual_on_page
|
||||
|
||||
result = visual_calc_list(definition, page, visual)
|
||||
|
||||
assert result == []
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# 5. visual_calc_list returns 1 item after add
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
|
||||
def test_visual_calc_list_one_after_add(
|
||||
visual_on_page: tuple[Path, str, str],
|
||||
) -> None:
|
||||
"""visual_calc_list returns exactly one item after adding one calculation."""
|
||||
definition, page, visual = visual_on_page
|
||||
|
||||
visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sales])")
|
||||
result = visual_calc_list(definition, page, visual)
|
||||
|
||||
assert len(result) == 1
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
# 6. visual_calc_list returns correct name/expression/role
# ---------------------------------------------------------------------------


def test_visual_calc_list_correct_fields(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """visual_calc_list returns correct name, expression, role, and query_ref."""
    definition, page, visual = visual_on_page

    visual_calc_add(
        definition, page, visual, "Running sum", "RUNNINGSUM([Sales])", role="Y"
    )
    result = visual_calc_list(definition, page, visual)

    assert len(result) == 1
    item = result[0]
    assert item["name"] == "Running sum"
    assert item["expression"] == "RUNNINGSUM([Sales])"
    assert item["role"] == "Y"
    assert item["query_ref"] == "select"


# ---------------------------------------------------------------------------
# 7. visual_calc_add is idempotent (same name replaces, not duplicates)
# ---------------------------------------------------------------------------


def test_visual_calc_add_idempotent(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """Adding a calc with an existing name replaces it rather than duplicating it."""
    definition, page, visual = visual_on_page

    visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sales])")
    visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Revenue])")

    result = visual_calc_list(definition, page, visual)

    # Still exactly one NativeVisualCalculation named "Running sum"
    running_sum_items = [r for r in result if r["name"] == "Running sum"]
    assert len(running_sum_items) == 1
    assert running_sum_items[0]["expression"] == "RUNNINGSUM([Revenue])"


# ---------------------------------------------------------------------------
# 8. visual_calc_add to non-existent role creates the role
# ---------------------------------------------------------------------------


def test_visual_calc_add_creates_new_role(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """Adding a calc to a role that does not exist creates that role."""
    definition, page, visual = visual_on_page

    visual_calc_add(definition, page, visual, "My Rank", "RANK()", role="Values")

    data = _read_json(_vfile(definition, page, visual))
    assert "Values" in data["visual"]["query"]["queryState"]
    projections = data["visual"]["query"]["queryState"]["Values"]["projections"]
    assert len(projections) == 1
    assert "NativeVisualCalculation" in projections[0]["field"]


# ---------------------------------------------------------------------------
# 9. Two different calcs: list returns 2
# ---------------------------------------------------------------------------


def test_visual_calc_add_two_calcs_list_returns_two(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """Adding two distinct calcs results in two items returned by calc-list."""
    definition, page, visual = visual_on_page

    visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sales])")
    visual_calc_add(definition, page, visual, "Rank", "RANK()")

    result = visual_calc_list(definition, page, visual)

    assert len(result) == 2
    names = {r["name"] for r in result}
    assert names == {"Running sum", "Rank"}


# ---------------------------------------------------------------------------
# 10. visual_calc_delete removes the projection
# ---------------------------------------------------------------------------


def test_visual_calc_delete_removes_projection(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """visual_calc_delete removes the named NativeVisualCalculation projection."""
    definition, page, visual = visual_on_page

    visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sales])")
    visual_calc_delete(definition, page, visual, "Running sum")

    data = _read_json(_vfile(definition, page, visual))
    projections = data["visual"]["query"]["queryState"]["Y"]["projections"]
    nvc_projections = [
        p for p in projections if "NativeVisualCalculation" in p.get("field", {})
    ]
    assert nvc_projections == []


# ---------------------------------------------------------------------------
# 11. visual_calc_delete raises PbiCliError for unknown name
# ---------------------------------------------------------------------------


def test_visual_calc_delete_raises_for_unknown_name(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """visual_calc_delete raises PbiCliError when the calc name does not exist."""
    definition, page, visual = visual_on_page

    with pytest.raises(PbiCliError, match="not found"):
        visual_calc_delete(definition, page, visual, "Nonexistent Calc")


# ---------------------------------------------------------------------------
# 12. visual_calc_list after delete returns N-1
# ---------------------------------------------------------------------------


def test_visual_calc_list_after_delete_returns_n_minus_one(
    visual_on_page: tuple[Path, str, str],
) -> None:
    """visual_calc_list returns N-1 items after deleting one of N calcs."""
    definition, page, visual = visual_on_page

    visual_calc_add(definition, page, visual, "Running sum", "RUNNINGSUM([Sales])")
    visual_calc_add(definition, page, visual, "Rank", "RANK()")

    assert len(visual_calc_list(definition, page, visual)) == 2

    visual_calc_delete(definition, page, visual, "Running sum")
    result = visual_calc_list(definition, page, visual)

    assert len(result) == 1
    assert result[0]["name"] == "Rank"