Beyond the Azure Portal: Why Custom Dashboards Matter
Azure Cost Management’s built-in views cover the fundamentals — trend charts, service breakdowns, and budget tracking. But the moment you need to combine cost data with business KPIs, display team-specific views behind corporate SSO, or embed cost metrics into an existing monitoring stack, the portal’s built-in capabilities reach their ceiling. Custom dashboards bridge this gap by pulling Azure cost data into platforms that your organization already uses for decision-making.
The integration paths vary depending on your target platform, data volume, and update frequency requirements. This guide walks through the major approaches: REST API integration for web applications and custom portals, webhook-based feeds for real-time alerting systems, exported data for data warehouse integration, and embedded Power BI for portal-grade reporting within your own applications.
Feeding Custom Dashboards via the Query API
The Cost Management Query API is the most direct path from Azure cost data to a custom dashboard. It returns JSON responses that any web application, internal tool, or dashboard framework can consume.
API Integration Pattern
A typical integration follows this flow:
- Your dashboard backend authenticates to Azure AD using a service principal or managed identity
- It sends a POST request to the Query API with the desired scope, timeframe, grouping, and filters
- The response contains columns (schema) and rows (data) arrays
- The backend transforms this into the format your dashboard framework expects
- The frontend renders charts and tables from the transformed data
POST https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.CostManagement/query?api-version=2025-03-01
{
  "type": "ActualCost",
  "timeframe": "MonthToDate",
  "dataset": {
    "granularity": "Daily",
    "aggregation": {
      "totalCost": { "name": "PreTaxCost", "function": "Sum" }
    },
    "grouping": [
      { "name": "ServiceName", "type": "Dimension" }
    ]
  }
}
The response schema is predictable: the first element of each row corresponds to the first column definition, and so on. Parse it into your application’s data model and render it with any charting library — Chart.js, D3, Highcharts, or your framework’s native components.
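As a concrete illustration of that parsing step, the following Python sketch zips the column names with each row to produce record dictionaries. The sample payload mirrors the columns/rows shape of a Query API response; the specific values are made up for the example.

```python
# Transform a Cost Management Query API response into row dictionaries.
# The columns/rows arrays sit under "properties" in the API response;
# the sample values below are illustrative only.
def rows_to_records(response: dict) -> list[dict]:
    names = [col["name"] for col in response["properties"]["columns"]]
    return [dict(zip(names, row)) for row in response["properties"]["rows"]]

sample = {
    "properties": {
        "columns": [
            {"name": "PreTaxCost", "type": "Number"},
            {"name": "ServiceName", "type": "String"},
            {"name": "UsageDate", "type": "Number"},
        ],
        "rows": [
            [12.34, "Virtual Machines", 20250101],
            [5.67, "Storage", 20250101],
        ],
    }
}

records = rows_to_records(sample)
print(records[0]["ServiceName"])  # Virtual Machines
```

From here, `records` drops directly into whatever shape your charting library expects.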
Caching Strategy
Cost data in Azure refreshes every 4 to 8 hours. Hitting the Query API on every dashboard page load wastes query processing unit (QPU) quota and adds unnecessary latency. Implement server-side caching with a 4-hour TTL (time to live) to match the data refresh cycle. Store the cached response in Redis, a local file, or your application database. Invalidate the cache when users explicitly request a refresh.
The API enforces rate limits: 12 QPU per 10 seconds, 60 per minute, and 600 per hour. A dashboard serving 50 concurrent users without caching would quickly exhaust these limits. Caching transforms the pattern from per-user queries to periodic background refreshes that serve all users from the cached result.
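A minimal sketch of that caching pattern in Python, using an in-process dictionary for brevity (a real multi-instance deployment would use Redis or a database, as noted above; the key and fetch function here are illustrative):

```python
import time

# Minimal in-process cache with a 4-hour TTL, matching Azure's cost data
# refresh cycle. In production, back this with Redis or a database so
# every dashboard instance shares one cached result.
CACHE_TTL_SECONDS = 4 * 60 * 60
_cache: dict[str, tuple[float, object]] = {}

def get_cached(key: str, fetch, ttl: int = CACHE_TTL_SECONDS):
    """Return the cached value for key, calling fetch() only on miss or expiry."""
    now = time.time()
    entry = _cache.get(key)
    if entry and now - entry[0] < ttl:
        return entry[1]
    value = fetch()          # e.g. the Query API call shown earlier
    _cache[key] = (now, value)
    return value

calls = 0
def fake_query():
    global calls
    calls += 1
    return {"total": 42}

get_cached("sub1-MonthToDate", fake_query)
get_cached("sub1-MonthToDate", fake_query)   # served from cache, no second call
print(calls)  # 1
```

Fifty concurrent users now produce one upstream query per 4-hour window instead of fifty per page load.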
Authentication for Automated Access
# Service principal authentication for dashboard backend
$tenantId = "your-tenant-id"
$clientId = "your-service-principal-id"
$clientSecret = "your-client-secret"
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://management.azure.com/"
}
$tokenResponse = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Method POST -Body $tokenBody
$token = $tokenResponse.access_token
The service principal needs the Cost Management Reader role on the scope it queries. For dashboards covering multiple subscriptions, assign the role at the management group level to avoid per-subscription role assignments.
Building Dashboards with Exported Data
For dashboards that need historical data beyond 13 months, or that serve high traffic volumes that would exceed API rate limits, exported data provides a decoupled architecture with no API dependencies at query time.
Data Pipeline Architecture
- Scheduled exports deposit cost data in a storage account (Parquet format)
- A processing layer (Azure Function, Data Factory, or Logic App) reads new exports and loads them into a database
- The dashboard queries the database rather than the Cost Management API
Database options include Azure SQL for traditional relational queries, Azure Data Explorer for KQL-based analytical queries, Cosmos DB for globally distributed dashboards, or even a simple SQLite file for internal tools with modest traffic.
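For the SQLite end of that spectrum, the loading step can be sketched as follows. This assumes the export rows have already been parsed out of the Parquet partitions (e.g. with pyarrow); the table name and columns are hypothetical stand-ins for your actual export fields.

```python
import sqlite3

# Hypothetical target schema for an internal-tool dashboard; adjust the
# column names to match your actual export fields.
conn = sqlite3.connect(":memory:")  # use a file path for a real tool
conn.execute("""
    CREATE TABLE IF NOT EXISTS cost_daily (
        charge_date TEXT,
        service     TEXT,
        billed_cost REAL,
        PRIMARY KEY (charge_date, service)
    )
""")

def load_rows(rows):
    # Upsert so re-processing a re-delivered export partition is idempotent
    conn.executemany(
        """INSERT INTO cost_daily VALUES (?, ?, ?)
           ON CONFLICT(charge_date, service)
           DO UPDATE SET billed_cost = excluded.billed_cost""",
        rows,
    )
    conn.commit()

# Rows would come from the parsed export partitions
load_rows([("2025-01-01", "Virtual Machines", 12.34),
           ("2025-01-01", "Storage", 5.67)])
load_rows([("2025-01-01", "Storage", 6.00)])  # re-delivered partition

total = conn.execute("SELECT SUM(billed_cost) FROM cost_daily").fetchone()[0]
print(total)
```

The upsert keyed on date and service is what makes repeated export deliveries safe to process.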
Azure Function as Data Processor
An Event Grid-triggered Azure Function that fires when new export files land in storage provides the processing step:
# Azure Function (PowerShell) - Process cost export
param($eventGridEvent, $TriggerMetadata)

$blobUrl = $eventGridEvent.data.url
if ($blobUrl -match "manifest\.json$") {
    # Download the manifest over HTTPS; assumes the Function's managed
    # identity has Storage Blob Data Reader on the export storage account
    $storageToken = (Get-AzAccessToken -ResourceUrl "https://storage.azure.com/").Token
    $manifest = Invoke-RestMethod -Uri $blobUrl -Headers @{
        Authorization  = "Bearer $storageToken"
        "x-ms-version" = "2021-08-06"
    }
    foreach ($blob in $manifest.blobs) {
        # Process each partition file
        # Load into your target database
        Write-Host "Processing: $($blob.blobName) ($($blob.dataRowCount) rows)"
    }
}
Embedding Power BI in Custom Applications
Power BI Embedded provides the richest visualization experience within custom applications. Instead of building chart components from scratch, embed fully interactive Power BI reports that connect to Azure cost data.
Setup Pattern
- Create Power BI reports using the Cost Management connector or exported data
- Publish reports to the Power BI service
- Use the Power BI Embedded API to generate embed tokens
- Render the report in your web application using the Power BI JavaScript SDK
This approach delivers publication-quality visualizations with cross-filtering, drill-through, and export capabilities — all within your application’s UI. Users interact with cost data without needing Power BI licenses or portal access.
Row-Level Security for Multi-Tenant Dashboards
When building a dashboard that serves multiple teams, each seeing only their own costs, Power BI’s row-level security (RLS) filters the data based on the authenticated user’s identity. Define RLS roles that filter on resource group, subscription, or tag values, and the same report securely shows different data to different users.
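To apply an RLS role when embedding, the effective identity is passed in the embed token request. The sketch below builds that request body in Python; the `accessLevel`/`identities` shape follows the Power BI REST API, while the username, role name, and dataset id are placeholders for illustration.

```python
# Build the body for a Power BI "Generate Token" call with an RLS identity.
# Role name, dataset id, and username below are hypothetical examples.
def embed_token_request(username: str, dataset_id: str, roles: list[str]) -> dict:
    return {
        "accessLevel": "View",
        "identities": [{
            "username": username,      # matched by the RLS filter expression
            "roles": roles,            # RLS roles defined in the report
            "datasets": [dataset_id],
        }],
    }

body = embed_token_request("team-a@contoso.com", "dataset-guid", ["TeamFilter"])
# POST this body to:
# https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/GenerateToken
print(body["identities"][0]["roles"])  # ['TeamFilter']
```

The backend generates a token per user session, so each team's browser only ever receives a token scoped to its own RLS role.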
Grafana Integration
Teams already running Grafana for infrastructure monitoring often prefer adding cost panels to their existing dashboards over adopting a separate tool. The integration typically runs through Azure Data Explorer or Azure Monitor as a data source.
ADX-Based Approach
Export cost data to ADX, then configure the Azure Data Explorer data source in Grafana. Write KQL queries that return time-series data matching Grafana’s expected format:
// Grafana-compatible time series for daily costs
CostExports
| where ChargePeriodStart >= $__timeFrom and ChargePeriodStart <= $__timeTo
| summarize Cost = sum(BilledCost) by bin(ChargePeriodStart, 1d)
| order by ChargePeriodStart asc
The $__timeFrom and $__timeTo macros resolve to the dashboard's time picker values, making the cost panels respond to the same time range controls as the performance panels.
Azure Monitor Metrics Approach
For simpler cost visibility without ADX infrastructure, push cost summary metrics to Azure Monitor custom metrics via the Metrics Ingestion API. An Azure Function that runs daily queries the Cost Management API for key metrics (total daily cost, cost by top services) and publishes them as custom metrics. Grafana's Azure Monitor data source then reads these metrics alongside standard infrastructure telemetry.
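The daily publishing step can be sketched as follows. The `time`/`data`/`baseData` envelope follows the Azure Monitor custom metrics ingestion format (POST to `https://{region}.monitoring.azure.com{resourceId}/metrics`); the metric name, namespace, and dimension are illustrative choices.

```python
from datetime import datetime, timezone

# Build a custom-metric payload for the Azure Monitor metrics ingestion
# endpoint. Metric name, namespace, and dimension names are examples.
def daily_cost_metric(service: str, cost: float) -> dict:
    return {
        "time": datetime.now(timezone.utc).isoformat(),
        "data": {
            "baseData": {
                "metric": "DailyCost",
                "namespace": "CostManagement",
                "dimNames": ["ServiceName"],
                "series": [{
                    "dimValues": [service],
                    "min": cost, "max": cost, "sum": cost, "count": 1,
                }],
            }
        },
    }

payload = daily_cost_metric("Virtual Machines", 123.45)
print(payload["data"]["baseData"]["metric"])  # DailyCost
```

Once published, the metric appears under the chosen namespace and can be charted in Grafana next to CPU and memory panels.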
Webhook-Based Real-Time Integration
Budget alerts and anomaly alerts support webhook action types that send HTTP POST requests to any endpoint. This enables real-time cost event integration into systems that Cost Management does not natively support.
Common Webhook Targets
- Slack/Teams — Post budget threshold notifications to team channels via incoming webhook URLs
- PagerDuty/Opsgenie — Route critical budget alerts through incident management systems with escalation policies
- Custom dashboards — Update a "budget status" indicator on an internal dashboard in near-real-time
- Jira/ServiceNow — Automatically create cost review tickets when thresholds are crossed
The webhook payload includes the alert details, threshold values, current spending amount, and scope information. Configure the webhook in an action group, attach the action group to a budget, and every threshold breach triggers the integration automatically.
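A receiving endpoint then reshapes that payload for its target system. The Python sketch below formats a Slack incoming-webhook message from a budget alert; the field names in the `alert` dictionary are illustrative, so inspect a real alert delivery to confirm the exact schema for your alert type.

```python
import json

# Format a Slack incoming-webhook message from a budget alert payload.
# The alert field names here are assumed for illustration.
def slack_message(alert: dict) -> dict:
    return {
        "text": (
            f":warning: Budget '{alert['budgetName']}' is at "
            f"{alert['spentAmount']:.2f} of {alert['budgetAmount']:.2f} "
            f"({alert['thresholdPercent']}% threshold crossed) on {alert['scope']}"
        )
    }

alert = {
    "budgetName": "prod-monthly",
    "spentAmount": 8200.0,
    "budgetAmount": 10000.0,
    "thresholdPercent": 80,
    "scope": "/subscriptions/xxxx",
}
body = json.dumps(slack_message(alert))  # POST this to the Slack webhook URL
print(slack_message(alert)["text"])
```

The same pattern applies to PagerDuty or Jira targets; only the outgoing payload shape changes.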
Building a Cost API Gateway
For organizations with multiple internal consumers of cost data, building a thin API gateway in front of the Cost Management API standardizes access patterns and enforces consistent caching.
# Simplified cost API endpoint (Azure Function HTTP trigger, PowerShell 7+)
param($Request, $TriggerMetadata)

$scope     = $Request.Query.scope ?? "/subscriptions/default-sub-id"
$timeframe = $Request.Query.timeframe ?? "MonthToDate"
$groupBy   = $Request.Query.groupBy ?? "ServiceName"

# Check cache first. $cacheTable is a CloudTable, e.g. from
# (Get-AzStorageTable -Name "CostCache" -Context $ctx).CloudTable.
# Slashes are not valid in table row keys, so sanitize the key.
$cacheKey = ("$scope-$timeframe-$groupBy" -replace '[/\\#?]', '_')
$cached = Get-AzTableRow -Table $cacheTable -PartitionKey "queries" -RowKey $cacheKey
if ($cached -and (Get-Date) -lt $cached.Expiry) {
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = 200
        Body       = $cached.Data
    })
    return
}

# Query Cost Management API ($token acquired as shown earlier)
$body = @{
    type      = "ActualCost"
    timeframe = $timeframe
    dataset   = @{
        granularity = "Daily"
        aggregation = @{ totalCost = @{ name = "PreTaxCost"; function = "Sum" } }
        grouping    = @(@{ name = $groupBy; type = "Dimension" })
    }
} | ConvertTo-Json -Depth 5

$result = Invoke-RestMethod -Uri "https://management.azure.com$scope/providers/Microsoft.CostManagement/query?api-version=2025-03-01" `
    -Method POST -Headers @{ Authorization = "Bearer $token" } -Body $body -ContentType "application/json"

# Cache the result with a 4-hour expiry, then return it
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = 200
    Body       = ($result | ConvertTo-Json -Depth 10)
})
This pattern decouples dashboard consumers from the Azure API. Internal teams hit your cost API endpoint with simple parameters, and the gateway handles authentication, caching, rate limit management, and response formatting. Adding a new dashboard or consumer requires no new role assignments or API configuration: just point it at your gateway.
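From a consumer's perspective, calling the gateway reduces to building one URL. The Python sketch below does exactly that; the hostname and path are hypothetical, and the parameter names match the Function sketch above rather than any official Azure endpoint.

```python
from urllib.parse import urlencode

# Hypothetical consumer of an internal cost API gateway: the host, path,
# and parameter names are assumptions matching the gateway sketch.
def gateway_url(base: str, scope: str, timeframe: str, group_by: str) -> str:
    query = urlencode({"scope": scope, "timeframe": timeframe, "groupBy": group_by})
    return f"{base}/api/costs?{query}"

url = gateway_url(
    "https://cost-gateway.internal.contoso.com",
    "/subscriptions/1234", "MonthToDate", "ServiceName",
)
print(url)
```

No bearer tokens, no Azure SDK, no role assignment: the gateway owns all of that on the consumer's behalf.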
Choosing the Right Integration Approach
| Approach | Best For | Data Freshness | Effort |
|---|---|---|---|
| Query API direct | Low-traffic internal tools | 4-8 hours | Low |
| Query API + cache | Multi-user dashboards | 4-8 hours | Medium |
| Export + database | High-traffic, historical analysis | Daily | Medium-High |
| Power BI Embedded | Rich interactive reports | Daily refresh | Medium |
| Grafana + ADX | Ops teams with existing Grafana | Daily | Medium |
| Webhooks | Event-driven notifications | Real-time (on alert) | Low |
Start with the simplest approach that meets your requirements. A cached Query API integration takes a day to build and covers most custom dashboard needs. Scale to the export-based architecture when data volume, historical depth, or traffic demands outgrow the API-based approach. The beauty of Azure's cost data ecosystem is that all paths lead to the same underlying data — switching approaches later does not require re-engineering your cost analysis, just the delivery mechanism.
For more details, refer to the official documentation: What is Microsoft Cost Management.