Why Twelve Months of Cost Data Is Not Enough
Most organizations start paying attention to Azure costs around the time a budget gets blown. Someone opens Cost Analysis in the portal, sees a spike in the current month, compares it against last month, and declares a crisis. What they often lack is the longer view — the trajectory that shows whether this month’s spike is truly anomalous or just the continuation of a gradual upward trend that has been building for two years.
The Azure portal displays up to 13 months of cost data. That is enough for short-term troubleshooting but falls well short of what serious financial planning requires. Forecasting next year’s Azure budget based on a single year of history is like predicting the weather from one season of data — you will miss the patterns that only reveal themselves across multiple cycles of growth, optimization, and seasonal variation.
This guide covers how to build a long-term cost history that stretches back years instead of months, and how to use that history to generate forecasts that actually hold up when finance asks pointed questions during budget planning season.
Understanding Data Retention in Azure Cost Management
Azure stores cost data at different retention depths depending on how you access it. Knowing these limits is fundamental to planning your data strategy.
| Access Method | Cost & Usage Data | Reservation Transactions | Price Sheet |
|---|---|---|---|
| Azure Portal (Cost Analysis) | Up to 13 months | Up to 13 months | Up to 13 months |
| Exports REST API | Up to 7 years | Up to 7 years | EA: 25 months; MCA: 13 months |
| Cost Details API | Up to 13 months | — | — |
The critical insight here is the gap between portal access and REST API access. While the portal limits you to 13 months, the Exports REST API can retrieve cost data going back up to seven years for Enterprise Agreement and Microsoft Customer Agreement accounts. This means the raw data exists — you just need to extract it and store it somewhere accessible for analysis.
Cost data becomes available within 8 to 24 hours for EA and MCA accounts. Pay-as-you-go accounts may take up to 72 hours. All charges are finalized within 72 hours after the billing period ends.
Setting Up Scheduled Exports for Continuous History
The single most important action for building long-term cost history is enabling scheduled exports. Once configured, Azure automatically deposits cost data files into a storage account on a daily or monthly schedule, creating a growing archive that persists indefinitely in your own storage.
Configuring Exports in the Portal
Navigate to Cost Management → Exports in the left navigation pane. Click Add to create a new export with these settings:
- Export type: “Cost and usage details (FOCUS)” — this uses the FinOps Open Cost and Usage Specification format which combines actual and amortized costs into a single dataset
- Frequency: Daily (month-to-date) for granular tracking, plus Monthly (last month) for complete period snapshots
- Format: Parquet with Snappy compression — significantly smaller file sizes than CSV, faster query performance, and schema enforcement that prevents data quality issues
- File partitioning: Enabled by default, splits files to stay under 1 GB with a manifest.json for metadata
The export writes files to your storage account in a structured path: Container/Directory/ExportName/[YYYYMMDD-YYYYMMDD]/[RunID]/. Each run creates a new subfolder, and with overwrite enabled (the default for daily exports), the latest run replaces the previous day’s data for the current month.
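Given that layout, locating a month's files means deriving the `[YYYYMMDD-YYYYMMDD]` segment from the calendar month. A minimal sketch (the `export_period_folder` helper is hypothetical, not part of any Azure SDK):

```python
from datetime import date, timedelta

def export_period_folder(year: int, month: int) -> str:
    """Build the [YYYYMMDD-YYYYMMDD] path segment for a billing month."""
    first = date(year, month, 1)
    next_month = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
    last = next_month - timedelta(days=1)  # last calendar day of the month
    return f"{first:%Y%m%d}-{last:%Y%m%d}"

print(export_period_folder(2026, 2))  # -> 20260201-20260228
```

Prefixing this with the container, directory, and export name gives the folder to scan for the latest run's files.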
Backfilling Historical Data via REST API
Scheduled exports only capture data going forward. To backfill historical months and build the multi-year baseline you need for forecasting, use the Exports REST API to trigger exports for specific past periods:
```powershell
# Backfill 3 years of monthly cost data
$subscriptionId = "your-subscription-id"
$exportName     = "monthly-cost-backfill"
$scope          = "/subscriptions/$subscriptionId"
$apiVersion     = "2023-11-01"
$token   = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token
$headers = @{ Authorization = "Bearer $token"; "Content-Type" = "application/json" }
$exportUri = "https://management.azure.com$scope/providers/Microsoft.CostManagement/exports/$exportName"

# Iterate month by month: update the export definition with a custom time
# period, then trigger a run for that period
$startDate = [datetime]"2023-04-01"
$endDate   = [datetime]"2026-03-01"
while ($startDate -lt $endDate) {
    $monthEnd = $startDate.AddMonths(1).AddDays(-1)
    $exportBody = @{
        properties = @{
            schedule = @{ status = "Inactive" }
            format   = "Csv"
            deliveryInfo = @{
                destination = @{
                    resourceId     = "/subscriptions/$subscriptionId/resourceGroups/rg-finops/providers/Microsoft.Storage/storageAccounts/stfinopshistory"
                    container      = "cost-exports"
                    rootFolderPath = "backfill"
                }
            }
            definition = @{
                type       = "ActualCost"
                timeframe  = "Custom"
                timePeriod = @{
                    from = $startDate.ToString("yyyy-MM-ddT00:00:00Z")
                    to   = $monthEnd.ToString("yyyy-MM-ddT23:59:59Z")
                }
            }
        }
    } | ConvertTo-Json -Depth 10

    # Create/update the export with this month's period, then execute it
    Invoke-RestMethod -Uri "${exportUri}?api-version=$apiVersion" -Method PUT -Headers $headers -Body $exportBody | Out-Null
    Invoke-RestMethod -Uri "$exportUri/run?api-version=$apiVersion" -Method POST -Headers $headers | Out-Null
    Write-Host "Exported: $($startDate.ToString('yyyy-MM'))"
    $startDate = $startDate.AddMonths(1)
}
```
This script iterates through each month, points the export's custom time period at that month, and triggers an execution. The resulting files land in your storage account where they accumulate into a complete historical record.
Using the Built-In Forecast Feature
Before building custom forecasting models, it is worth understanding what Azure provides out of the box. The Cost Analysis forecast is available in both smart views and customizable views when you select area or stacked column chart types.
How the Forecast Algorithm Works
The built-in forecast uses a time-series linear regression model. The lookback window adjusts based on how far forward you are forecasting:
- Forecast period up to 28 days: uses 28 days of historical data
- Forecast period between 29 and 90 days: uses the same number of historical days as the forecast period
- Forecast period beyond 90 days: caps the lookback at 90 days
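Condensed into code, the lookback selection above is a simple rule (illustrative only; the service applies these rules internally):

```python
def forecast_lookback_days(forecast_days: int) -> int:
    """Historical window the built-in model uses for a given forecast horizon."""
    if forecast_days <= 28:
        return 28                      # short horizons always learn from 28 days
    return min(forecast_days, 90)      # match the horizon, capped at 90 days

print(forecast_lookback_days(60))   # -> 60
print(forecast_lookback_days(180))  # -> 90
```

So a 60-day forecast learns from 60 days of history, while a 180-day forecast still learns from only the most recent 90.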
The model adjusts for events like reservation purchases that create temporary cost spikes, stabilizing the forecast within a few days. However, it requires at least 90 days of historical data to produce a reasonably accurate annual forecast. New subscriptions or recently restructured billing scopes may not have enough history for meaningful predictions.
The forecast can project up to one year into the future, displayed as a shaded confidence interval alongside your actual cost line in the accumulated costs view.
Forecast REST API for Programmatic Access
The Cost Management Forecast API lets you retrieve forecast data programmatically:
```http
POST https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.CostManagement/forecast?api-version=2025-03-01

{
  "type": "ActualCost",
  "dataset": {
    "aggregation": {
      "totalCost": { "name": "Cost", "function": "Sum" }
    },
    "granularity": "Monthly"
  },
  "includeActualCost": false,
  "includeFreshPartialCost": false,
  "timePeriod": {
    "from": "2026-04-01T00:00:00+00:00",
    "to": "2026-12-31T23:59:59+00:00"
  },
  "timeframe": "Custom"
}
```
The response contains rows with PreTaxCost, UsageDate, CostStatus (set to “Forecast”), and Currency. You can request forecasts using Usage, ActualCost, or AmortizedCost types, and aggregate by Cost, CostUSD, PreTaxCost, or PreTaxCostUSD.
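Turning that response into a single projected total is a small post-processing step. The sketch below assumes the columns-plus-rows envelope used by Cost Management query responses, and the figures are made-up sample data, not real API output:

```python
# Sum a Forecast API response into one projected total (sample data only)
sample_response = {
    "properties": {
        "columns": [
            {"name": "PreTaxCost"}, {"name": "UsageDate"},
            {"name": "CostStatus"}, {"name": "Currency"},
        ],
        "rows": [
            [12500.0, 20260401, "Forecast", "USD"],
            [13100.0, 20260501, "Forecast", "USD"],
            [13700.0, 20260601, "Forecast", "USD"],
        ],
    }
}

cols = [c["name"] for c in sample_response["properties"]["columns"]]
cost_i, status_i = cols.index("PreTaxCost"), cols.index("CostStatus")
total = sum(row[cost_i] for row in sample_response["properties"]["rows"]
            if row[status_i] == "Forecast")
print(f"Projected total: {total:,.0f} USD")  # -> Projected total: 39,300 USD
```

Resolving column indices by name rather than position keeps the code working if the API returns columns in a different order.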
Building Advanced Forecasts with KQL in Azure Data Explorer
The built-in forecast works for high-level projections, but finance teams usually need more control: custom seasonality detection, trend decomposition, and the ability to model scenarios like planned infrastructure changes. This is where Azure Data Explorer and KQL shine.
Preparing the Data Pipeline
The pipeline starts with your scheduled exports. Cost data arrives in your storage account as Parquet or CSV files. Azure Data Explorer can ingest directly from Azure Blob Storage using data connections or the .ingest command. The FinOps toolkit also provides pre-built ingestion pipelines that handle schema mapping and deduplication.
Creating a Time Series from Cost Data
The make-series operator converts tabular cost data into time-series arrays that KQL’s forecasting functions can consume:
```kusto
// Build daily cost time series for the last 2 years
let startDate = datetime(2024-04-01);
let endDate = datetime(2026-04-01);
CostExports
| where ChargePeriodStart between (startDate .. endDate)
| make-series DailyCost = sum(BilledCost) default=0
    on ChargePeriodStart from startDate to endDate step 1d
| render timechart
```
The default=0 parameter fills missing dates with zero cost instead of leaving gaps, which is essential for the forecasting functions to work correctly.
Forecasting with Series Decomposition
The series_decompose_forecast() function is the workhorse for KQL-based cost forecasting. It breaks the time series into seasonal components, trend, and residual noise, then projects the trend and seasonal pattern forward:
```kusto
// Forecast next 90 days of Azure spend
let startDate = datetime(2024-04-01);
let endDate = datetime(2026-07-01); // Extends 90 days past today
let forecastPoints = 90;
CostExports
| where ChargePeriodStart between (startDate .. datetime(2026-04-01))
| make-series DailyCost = sum(BilledCost) default=0
    on ChargePeriodStart from startDate to endDate step 1d
| extend ForecastCost = series_decompose_forecast(DailyCost, forecastPoints)
| render timechart
```
The function automatically detects seasonality (weekly patterns of lower weekend spending, monthly billing cycles) and extracts the underlying trend. The forecastPoints parameter controls how many future data points to predict — these are appended to the end of the series.
Forecasting by Service Category
Aggregate forecasts hide important dynamics. Different services grow at different rates, and the useful forecast is the one that tells you which service is driving projected growth:
```kusto
// Per-service monthly forecast
let startDate = datetime(2024-04-01);
let endDate = datetime(2026-10-01);
let forecastMonths = 6;
CostExports
| where ChargePeriodStart between (startDate .. datetime(2026-04-01))
| make-series MonthlyCost = sum(BilledCost) default=0
    on ChargePeriodStart from startDate to endDate step 30d
    by ServiceCategory
| extend Forecast = series_decompose_forecast(MonthlyCost, forecastMonths)
| render timechart
```
This query produces separate forecast lines for each service category — Compute, Storage, Networking, Databases — letting you see that while total spend might grow 15 percent, compute is growing at 25 percent while storage is flat.
Visualizing Long-Term History with Power BI
For stakeholders who prefer visual dashboards over KQL queries, the Azure Cost Management Power BI connector provides direct access to cost data without requiring manual exports.
Connecting Power BI to Cost Management
In Power BI Desktop, go to Get Data → Azure → Azure Cost Management. Select your scope type (Enterprise Agreement enrollment number or Microsoft Customer Agreement billing profile) and configure the connection:
- Set the Number of months to control how far back data is loaded. Set to 0 for a custom date range under 31 days.
- Authenticate with Azure AD OAuth 2.0 using an account that has at least Cost Management Reader permissions on the billing scope.
The connector exposes tables including Usage details, Usage details amortized, Reservation usage summary, Budgets, and Price sheets. For long-term trend visualizations, the Usage details amortized table provides the cleanest view because it distributes reservation costs evenly across their terms.
Connector Limitations
The Cost Management connector works well for mid-size environments but has practical limits: it caps at approximately $5 million of raw cost details and 1 million rows per request. Organizations with large-scale Azure footprints should use exports to a data lake and connect Power BI via DirectQuery to Azure Data Explorer or Microsoft Fabric for the full dataset.
Refreshing the connector more than twice daily is not recommended because the underlying data only updates every 4 to 8 hours.
The FinOps Toolkit for Centralized Cost Processing
If building a custom export-to-ADX pipeline feels like reinventing the wheel, the Microsoft FinOps toolkit provides an open-source alternative. Available on GitHub at microsoft/finops-toolkit, the toolkit includes pre-built components for centralizing and analyzing cost data at scale.
The toolkit’s most relevant components for long-term forecasting:
- FinOps hubs — A centralized data processing layer that ingests cost exports, normalizes them into the FOCUS format, and makes them available for analysis through connected services
- Power BI starter kits — Pre-built reports with trend visualizations, budget tracking, and commitment utilization dashboards
- Data lake connectivity guides — Instructions for connecting Azure Data Explorer, Microsoft Fabric, or Azure Synapse Analytics to the hub’s data store
The toolkit is built primarily in PowerShell and Bicep, deploys via standard Azure Resource Manager templates, and supports the FOCUS 1.0 specification for cost data standardization. For organizations managing costs across multiple subscriptions or management groups, it provides the data foundation that makes multi-year forecasting practical without building everything from scratch.
Translating History into Actionable Forecasts
Having two years of cost data in a queryable store is valuable, but the real payoff comes from how you use that history to inform financial decisions.
Setting Evidence-Based Budgets
Instead of budgeting based on last quarter plus an arbitrary growth percentage, use your historical data to calculate the actual compound monthly growth rate for each service category. A service that grew at 3 percent monthly for the past 18 months is unlikely to suddenly flatten without a specific intervention. Multiply the current monthly cost by the growth factor across 12 months, add a buffer for planned new projects, and you have a defensible budget number that finance can trust because it is grounded in real data.
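As a sketch of that arithmetic (all spend figures are illustrative, and the helper names are not from any library):

```python
def cmgr(first_month_cost: float, last_month_cost: float, months: int) -> float:
    """Compound monthly growth rate observed over `months` intervals."""
    return (last_month_cost / first_month_cost) ** (1 / months) - 1

def annual_budget(current_monthly: float, monthly_growth: float,
                  buffer: float = 0.10) -> float:
    """Sum 12 projected months at the observed growth rate, plus a
    buffer for planned new projects."""
    total = sum(current_monthly * (1 + monthly_growth) ** m for m in range(1, 13))
    return total * (1 + buffer)

# Spend grew from $10,000 to $17,024/month over 18 months -> ~3% monthly
growth = cmgr(10_000, 17_024, 18)
budget = annual_budget(17_024, growth)
print(f"CMGR: {growth:.1%}, next-year budget: ${budget:,.0f}")
```

The point is not the specific numbers but the method: the growth rate is measured from history rather than asserted, so every figure in the budget can be traced back to actual billing data.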
Identifying Seasonal Patterns
Many organizations have predictable cost cycles tied to business events: higher compute usage during month-end financial processing, reduced spending during holiday periods, burst capacity for annual testing or migration projects. These patterns are invisible in 13 months of data but become obvious in a multi-year view. Once identified, seasonal patterns let you set variable monthly budgets that account for expected fluctuations rather than triggering false alarms every time predictable spending peaks occur.
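One simple way to extract those cycles is a per-month seasonal index: average each calendar month across years, then normalize against the overall mean. The sketch below uses synthetic history with a December spike (the `seasonal_indices` helper is illustrative, not from any library):

```python
from collections import defaultdict

def seasonal_indices(monthly_costs):
    """Map calendar month -> seasonal index (1.0 = an average month).
    monthly_costs is {(year, month): cost} spanning multiple years."""
    by_month = defaultdict(list)
    for (_, month), cost in monthly_costs.items():
        by_month[month].append(cost)
    month_avgs = {m: sum(v) / len(v) for m, v in by_month.items()}
    overall = sum(month_avgs.values()) / len(month_avgs)
    return {m: avg / overall for m, avg in month_avgs.items()}

# Two years of synthetic history: flat $100k months with a December spike
history = {(y, m): 200_000 if m == 12 else 100_000
           for y in (2024, 2025) for m in range(1, 13)}
indices = seasonal_indices(history)
# Multiply a baseline monthly budget by each index to get variable budgets
```

With two or more years of data, an index near 1.85 for December tells you to budget that month hot rather than page the on-call when the predictable spike arrives.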
Modeling What-If Scenarios
With a sufficiently long baseline, you can model the financial impact of planned changes. If you are migrating a workload from VMs to containers, overlay the VM cost trend against the projected container cost to estimate when the migration reaches break-even. If you are expanding into a new region, use per-region cost history from existing deployments to estimate the incremental spend.
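The break-even estimate reduces to comparing cumulative spend lines. A minimal sketch with illustrative figures (a flat one-time migration cost and constant monthly run rates are simplifying assumptions):

```python
def break_even_month(vm_monthly: float, container_monthly: float,
                     migration_cost: float, horizon: int = 36):
    """First month in which cumulative savings cover the one-time
    migration cost, or None if it never happens within the horizon."""
    saved = 0.0
    for month in range(1, horizon + 1):
        saved += vm_monthly - container_monthly  # monthly run-rate delta
        if saved >= migration_cost:
            return month
    return None

print(break_even_month(vm_monthly=8_000, container_monthly=5_000,
                       migration_cost=30_000))  # -> 10
```

In practice you would feed the VM side from your historical trend (including its growth rate) rather than a constant, which is exactly what the multi-year baseline makes possible.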
Validating Optimization Impact
Every optimization initiative — right-sizing VMs, purchasing reservations, implementing auto-shutdown policies — should produce a measurable change in cost trajectory. Long-term history lets you draw a clear before-and-after line: here is the trend before the optimization, here is the trend after, and here is the quantified savings over time. When the next budget review asks whether the cloud optimization program is delivering value, you have the numbers to answer definitively.
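The before-and-after comparison can be as simple as fitting a linear trend to daily costs on each side of the change date and comparing slopes. A sketch on synthetic data (real analysis would use your exported daily costs):

```python
def slope(ys):
    """Least-squares slope of y against its index: cost change per day."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Synthetic daily costs: rising ~$5/day before the optimization,
# declining ~$1/day after it
before = [1000 + 5 * d for d in range(90)]
after = [1200 - 1 * d for d in range(90)]
print(f"before: {slope(before):+.1f}/day, after: {slope(after):+.1f}/day")
# -> before: +5.0/day, after: -1.0/day
```

The slope delta, multiplied out over the budget period, is the quantified savings figure that survives scrutiny in a budget review.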
Building the Practice Step by Step
Start with scheduled exports. This requires minimal effort — a few minutes in the portal — and immediately begins building the historical record you need. Even if you do not analyze the data for months, having it available is the prerequisite for everything else.
Next, backfill your available history through the REST API. Depending on your account type, you may be able to recover up to seven years of data. Store it in the same structure as your scheduled exports so that queries treat historical and current data identically.
Then choose your analysis platform. For teams already comfortable with KQL, Azure Data Explorer provides the most powerful forecasting capabilities natively. For teams that prefer visual tools, Power BI with the Cost Management connector or the FinOps toolkit’s starter kits gets results with less query-writing. For large enterprises, the FinOps hub architecture centralizes data from hundreds of subscriptions into a single analyzable store.
Whatever path you choose, the underlying principle is the same: cloud cost forecasting improves with more data, and the best time to start collecting that data was years ago. The second best time is right now. Every month of cost data you capture today becomes a data point that improves next year’s forecasts and strengthens the case for every optimization initiative your team proposes.
For more details, refer to the official documentation: What is Microsoft Cost Management.