Automate Azure Cost Exports Using Scheduled Exports: A Practical Azure FinOps Guide

The Problem with Manual Cost Reporting

Every month, someone on the FinOps team opens the Azure portal, navigates to Cost Analysis, adjusts the date range, exports a CSV, drops it into a spreadsheet, and emails it to finance. This process works until it does not — someone forgets, the date range is wrong, a new subscription is missed, or the person who runs the report goes on vacation and nobody knows the routine.

Scheduled exports eliminate this fragility. Azure Cost Management can automatically deposit cost data files into a storage account on a fixed schedule, creating a reliable data pipeline that operates without human intervention. Once configured, the exports run daily, capture every dollar across the configured scope, and produce files that downstream systems can ingest for dashboards, chargeback calculations, and budget tracking.

This guide covers every aspect of setting up and managing automated cost exports: the different dataset types, scheduling options, file formats, storage requirements, infrastructure-as-code deployment, and the automation patterns that turn raw exported data into actionable reports.

Understanding Export Dataset Types

Azure Cost Management supports seven distinct dataset types, each serving a different analytical purpose. Choosing the right one determines what cost data lands in your storage account.

  • ActualCost — Usage charges and purchases as invoiced. Best for invoice reconciliation and actual spend tracking.
  • AmortizedCost — Costs with reservation and savings plan purchases spread across their terms. Best for chargeback and daily cost allocation.
  • FocusCost — Combined actual and amortized data in the FOCUS specification format. Best for standardized multi-cloud reporting with reduced storage.
  • PriceSheet — Your organization's negotiated Azure pricing. Best for rate analysis and discount validation.
  • ReservationDetails — Current reservation inventory. Best for commitment tracking.
  • ReservationRecommendations — Purchase recommendations based on usage. Best for commitment planning.
  • ReservationTransactions — Reservation purchases, exchanges, and refunds. Best for a commitment audit trail.

For most organizations, the FocusCost export is the best starting point. It uses the FinOps Open Cost and Usage Specification (FOCUS) format, which combines actual and amortized cost perspectives into a single dataset. This reduces storage costs and processing complexity compared to maintaining separate ActualCost and AmortizedCost exports. The FOCUS format is also designed for cross-vendor compatibility, making it the right choice if you manage costs across Azure, AWS, and GCP.
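To make the "two perspectives in one dataset" point concrete, here is a minimal sketch that sums billed versus effective cost per day from FOCUS-style rows. The rows are synthetic and use only a small subset of FOCUS 1.0 columns (BilledCost, EffectiveCost, ChargePeriodStart); inspect your own export for the full schema.

```python
import csv, io
from collections import defaultdict

# Synthetic sample rows using a subset of FOCUS 1.0 columns. In a FocusCost
# export, BilledCost reflects invoiced charges while EffectiveCost spreads
# upfront commitment purchases (reservations, savings plans) across their term.
sample = """ChargePeriodStart,ServiceName,BilledCost,EffectiveCost
2026-04-01,Virtual Machines,120.00,95.00
2026-04-01,Storage,10.00,10.00
2026-04-02,Virtual Machines,0.00,95.00
"""

def cost_by_day(focus_csv: str) -> dict:
    """Sum billed and effective cost per charge day."""
    totals = defaultdict(lambda: {"billed": 0.0, "effective": 0.0})
    for row in csv.DictReader(io.StringIO(focus_csv)):
        day = totals[row["ChargePeriodStart"]]
        day["billed"] += float(row["BilledCost"])
        day["effective"] += float(row["EffectiveCost"])
    return dict(totals)

print(cost_by_day(sample))
```

With separate ActualCost and AmortizedCost exports, the same comparison would require joining two datasets; FOCUS puts both numbers on every row.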

Configuring Your First Scheduled Export

Navigate to Cost Management → Exports in the Azure portal. The experience walks you through the configuration with clear defaults for each option.

Scope Selection

The scope determines what costs are captured. You can create exports at these levels:

  • Subscription — All resources within a single subscription
  • Resource group — A specific resource group only
  • Management group — Aggregates up to 3,000 subscriptions (Enterprise Agreement only, limited to Usage/CSV format)
  • Billing account — All subscriptions under an EA enrollment or MCA billing account
  • Department — EA department scope

For comprehensive coverage, a billing account-level export captures everything in a single file. For team-level chargeback, subscription or resource group exports provide the granularity teams need to review their own costs.

Schedule and Frequency

Each dataset type supports different scheduling options. For cost and usage data, the most common configurations are:

  • Daily (month-to-date) — Runs every day, producing a file containing all charges from the first of the current month through the current day. Each day’s export overwrites the previous day’s file by default, so you always have one complete, current dataset for the month.
  • Monthly (last month) — Runs once after the close of each month, producing a complete and finalized snapshot. This is the file you want for permanent archival and month-end reporting.

An important behavioral detail: during the first five days of each month, daily exports run twice per day instead of once. This extra run ensures that charges from the previous month, which can take up to 72 hours to finalize, are captured accurately.

All scheduling uses UTC time. The API does not convert local time zones, so plan accordingly if your financial calendar follows a specific timezone.

File Format and Compression

Two format choices are available: CSV and Parquet. The decision depends on your downstream processing infrastructure.

CSV is universally compatible. Every tool, script, and spreadsheet application can read it. It supports Gzip compression, which typically reduces file sizes by 80 to 90 percent. For organizations where Excel users need to open cost files directly, CSV is the pragmatic choice.

Parquet is the performance-optimized option. It uses columnar storage with strong typing and optional Snappy compression, producing smaller files than CSV for the same data while enabling dramatically faster query performance in tools like Azure Data Explorer, Synapse, or Fabric. If your cost data flows into an analytical platform rather than spreadsheets, Parquet is the clear winner.
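The compression figures above are easy to sanity-check: cost exports are highly repetitive (the same subscription IDs, service names, and tags appear on every row), which is exactly the kind of data gzip handles well. This small stdlib-only sketch compresses a synthetic CSV in that shape and prints the ratio.

```python
import csv, gzip, io

# Build a synthetic month of cost rows; the repeated GUIDs and service names
# stand in for the redundancy found in real export files.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Date", "SubscriptionId", "ServiceName", "Cost"])
for day in range(1, 29):
    for svc in ["Virtual Machines", "Storage", "Azure SQL Database"]:
        writer.writerow([f"2026-04-{day:02d}",
                         "00000000-0000-0000-0000-000000000000", svc, "1.23"])

raw = buf.getvalue().encode()
packed = gzip.compress(raw)
ratio = len(packed) / len(raw)
print(f"raw={len(raw)}B gzip={len(packed)}B ratio={ratio:.2f}")
```

Real export files, with more columns and more varied values, compress somewhat less dramatically, but the 80 to 90 percent reduction cited above is typical.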

Storage Account Setup and Security

Exports write to an Azure Storage account blob container. The storage account must meet several requirements.

Basic Requirements

  • The account must be configured for blob or file storage
  • The export service needs write permissions on the storage account
  • The storage account must not be a destination in an object replication rule
  • The “Permitted scope for copy operations” setting must allow copies from any storage account

Storage Firewall Configuration

For organizations that restrict storage account access through network firewalls, exports support secure delivery with a few additional steps. The export creates a system-assigned managed identity, and Cost Management assigns the Storage Blob Data Contributor role to that identity, scoped to the destination container.

You must enable the firewall exception that allows trusted Azure services to access the storage account. The user creating the export also needs the Microsoft.Authorization/roleAssignments/write and Microsoft.Authorization/permissions/read permissions to set up the role assignment. Firewall-protected exports work only for same-tenant storage accounts.

File Structure in Storage

Exported files follow a consistent folder structure:

{Container}/{RootFolder}/{ExportName}/{YYYYMMDD-YYYYMMDD}/{RunID}/
  ├── manifest.json
  ├── part0.csv
  ├── part1.csv
  └── ...

File partitioning is always enabled and cannot be disabled. Each partition stays under 1 GB uncompressed. The manifest.json file contains metadata including the total byte count, row count, blob count, export configuration, and an array describing each partition file — essential for programmatic consumers that need to discover all data files in a given run.
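A minimal sketch of manifest-driven discovery follows. The trimmed manifest below mirrors the shape we have seen in real exports, but treat the exact field names as an assumption and inspect the manifest.json your own export produces.

```python
import json

# A trimmed manifest.json as the export service writes it alongside the data
# files (field names are an assumption for illustration).
manifest_text = """{
  "byteCount": 7340032,
  "dataRowCount": 125000,
  "blobCount": 2,
  "blobs": [
    {"blobName": "finops/daily-actual-cost/20260401-20260430/run1/part0.csv",
     "byteCount": 4194304},
    {"blobName": "finops/daily-actual-cost/20260401-20260430/run1/part1.csv",
     "byteCount": 3145728}
  ]
}"""

def partition_files(manifest_json: str) -> list:
    """Return every partition blob listed in a run's manifest."""
    manifest = json.loads(manifest_json)
    return [blob["blobName"] for blob in manifest.get("blobs", [])]

for name in partition_files(manifest_text):
    print(name)
```

Listing the container and globbing for part*.csv also works, but the manifest is authoritative: it tells you exactly which files belong to this run, which matters when overwrite behavior leaves older runs alongside newer ones.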

Deploying Exports as Infrastructure-as-Code

Creating exports through the portal works for initial setup, but managing exports across dozens of subscriptions demands an infrastructure-as-code approach. Azure provides a Bicep resource type specifically for this.

Bicep Template for a Daily Export

targetScope = 'subscription'

param storageAccountId string
param containerName string = 'cost-exports'
param exportName string = 'daily-actual-cost'

resource costExport 'Microsoft.CostManagement/exports@2025-03-01' = {
  name: exportName
  scope: subscription()
  identity: {
    type: 'SystemAssigned'
  }
  location: 'centralus'
  properties: {
    format: 'Parquet'
    compressionMode: 'snappy'
    dataOverwriteBehavior: 'OverwritePreviousReport'
    partitionData: true
    definition: {
      type: 'FocusCost'
      timeframe: 'MonthToDate'
      dataSet: {
        granularity: 'Daily'
      }
    }
    deliveryInfo: {
      destination: {
        type: 'AzureBlob'
        container: containerName
        resourceId: storageAccountId
        rootFolderPath: 'finops'
      }
    }
    schedule: {
      status: 'Active'
      recurrence: 'Daily'
      recurrencePeriod: {
        from: '2026-04-01T00:00:00Z'
        to: '2027-04-01T00:00:00Z'
      }
    }
  }
}

Deploy this template across multiple subscriptions using a management group-level deployment that loops through each subscription, and you have consistent export configuration everywhere without manual portal work.

Creating Exports via the REST API

# Create a daily FOCUS export via REST API
$subscriptionId = "your-subscription-id"
$exportName = "daily-focus-export"
$token = (Get-AzAccessToken -ResourceUrl https://management.azure.com).Token

$body = @{
    properties = @{
        format = "Parquet"
        compressionMode = "snappy"
        dataOverwriteBehavior = "OverwritePreviousReport"
        partitionData = $true
        definition = @{
            type = "FocusCost"
            timeframe = "MonthToDate"
            dataSet = @{ granularity = "Daily" }
        }
        deliveryInfo = @{
            destination = @{
                type = "AzureBlob"
                container = "cost-exports"
                resourceId = "/subscriptions/$subscriptionId/resourceGroups/rg-finops/providers/Microsoft.Storage/storageAccounts/stfinopsexports"
                rootFolderPath = "cost-data"
            }
        }
        schedule = @{
            status = "Active"
            recurrence = "Daily"
            recurrencePeriod = @{
                from = "2026-04-01T00:00:00Z"
                to = "2027-04-01T00:00:00Z"
            }
        }
    }
} | ConvertTo-Json -Depth 10

$uri = "https://management.azure.com/subscriptions/$subscriptionId/providers/Microsoft.CostManagement/exports/${exportName}?api-version=2025-03-01"
$headers = @{ Authorization = "Bearer $token"; "Content-Type" = "application/json" }
Invoke-RestMethod -Uri $uri -Method PUT -Headers $headers -Body $body

On-Demand Export Execution and Backfilling

Scheduled exports capture data going forward. When you need to fill in historical gaps or rerun a specific period, the Execute API triggers an immediate export run:

# Execute an existing export for a specific historical period
$uri = "https://management.azure.com/subscriptions/$subscriptionId/providers/Microsoft.CostManagement/exports/${exportName}/run?api-version=2025-03-01"
$body = @{
    timePeriod = @{
        from = "2026-01-01T00:00:00Z"
        to = "2026-01-31T23:59:59Z"
    }
} | ConvertTo-Json

Invoke-RestMethod -Uri $uri -Method POST -Headers $headers -Body $body

The portal also supports this through "Export selected dates": pick a specific month and rerun it without creating a new one-time export. Historical data limits apply: cost and usage data can be re-exported up to 13 months back through the portal and up to 7 years back through the REST API.
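Backfilling a longer history means looping over whole months. This small helper builds the from/to timestamp pairs in the same shape as the timePeriod body used in the execute call above; it is an illustrative sketch, not part of any Azure SDK.

```python
import calendar
from datetime import date, datetime, timezone

def month_ranges(start: date, months: int) -> list:
    """Return (from, to) ISO-8601 UTC timestamp pairs covering whole months."""
    ranges = []
    year, month = start.year, start.month
    for _ in range(months):
        last_day = calendar.monthrange(year, month)[1]
        frm = datetime(year, month, 1, tzinfo=timezone.utc)
        to = datetime(year, month, last_day, 23, 59, 59, tzinfo=timezone.utc)
        ranges.append((frm.isoformat().replace("+00:00", "Z"),
                       to.isoformat().replace("+00:00", "Z")))
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return ranges

for frm, to in month_ranges(date(2026, 1, 1), 3):
    print(frm, "->", to)
```

Feed each pair into an execute request in sequence; running the months one at a time keeps individual runs small and makes a failed month easy to retry.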

Processing Exported Data Automatically

Raw exports in a storage account are only half the solution. The real value comes from automatically processing those files into dashboards, databases, or notification systems.

Event Grid Triggers

Configure an Event Grid subscription on the storage account to fire on BlobCreated events. Filter the event subject to match your export container path. This triggers an Azure Function or Logic App immediately when new cost data arrives.
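The subject filtering described above can also be enforced defensively inside the handler. This sketch checks a BlobCreated event against the export container before processing; the event payload is trimmed to the fields the check needs, and the storage account name is the hypothetical one used in the REST example earlier.

```python
import json

# A trimmed Event Grid BlobCreated event, as delivered to a Function or
# Logic App.
event_text = """{
  "eventType": "Microsoft.Storage.BlobCreated",
  "subject": "/blobServices/default/containers/cost-exports/blobs/cost-data/daily-focus-export/20260401-20260430/run1/part0.csv",
  "data": {"url": "https://stfinopsexports.blob.core.windows.net/cost-exports/cost-data/daily-focus-export/20260401-20260430/run1/part0.csv"}
}"""

def should_process(event: dict, container: str = "cost-exports") -> bool:
    """Accept only BlobCreated events for blobs inside the export container."""
    prefix = f"/blobServices/default/containers/{container}/blobs/"
    return (event.get("eventType") == "Microsoft.Storage.BlobCreated"
            and event.get("subject", "").startswith(prefix))

event = json.loads(event_text)
print(should_process(event), event["data"]["url"])
```

Doing the same prefix filtering in the Event Grid subscription itself is still worthwhile, since it avoids invoking the handler at all for unrelated blobs.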

Azure Function Processing Pattern

A common pattern uses a blob-triggered Azure Function that:

  1. Reads the manifest.json file to discover all partition files in the run
  2. Downloads and processes each partition (CSV parsing or Parquet reading)
  3. Aggregates the data into summary tables grouped by team, service, or environment
  4. Writes results to a database, sends summary emails, or posts alerts to a Teams channel when thresholds are exceeded

Processing the manifest first is essential because partitioned exports can produce multiple files per run, and the number of partitions varies based on data volume.
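Step 3 of the pattern above can be sketched in a few lines. The per-team grouping column and the flattened layout below are assumptions for illustration; real exports carry resource tags in a single JSON-encoded tags column that you would parse first.

```python
import csv, io
from collections import defaultdict

# Synthetic partition rows, pre-flattened to a Team column for illustration.
rows_csv = """ServiceName,Team,Cost
Virtual Machines,platform,410.50
Storage,platform,22.10
Azure SQL Database,payments,310.00
Virtual Machines,payments,95.25
"""

def summarize_by_team(partition_csv: str) -> dict:
    """Roll cost rows up into a total per team."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(partition_csv)):
        totals[row["Team"]] += float(row["Cost"])
    return dict(totals)

summary = summarize_by_team(rows_csv)
print(summary)
```

In a real function you would run this over every partition listed in the manifest and merge the partial totals before writing the summary out.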

Ingestion into Azure Data Explorer

For organizations using Azure Data Explorer as their cost analytics platform, create a data connection directly from the storage account to an ADX database. ADX monitors the blob container for new files and ingests them automatically. Combined with a table’s ingestion mapping (CSV or Parquet schema mapping), this creates a fully automated pipeline from Cost Management to a queryable analytical database.

Managing Exports at Scale

Organizations with dozens or hundreds of subscriptions need systematic approaches to export management. A few practices make this manageable.

Centralized vs. Distributed Exports

The choice between a single billing account-level export and per-subscription exports depends on your organizational structure. A billing account export captures everything in one place, simplifying ingestion but producing very large files. Per-subscription exports let teams own their cost data independently but require coordinated setup and monitoring.

A balanced approach uses billing account-level exports for centralized FinOps reporting and subscription-level exports for team self-service dashboards. The two are not mutually exclusive.

Monitoring Export Health

Exports can fail silently — a storage account runs out of space, permissions get revoked, or an export schedule expires. Use the execution history API to check run status:

# Check export run history for failures
$uri = "https://management.azure.com/subscriptions/$subscriptionId/providers/Microsoft.CostManagement/exports/${exportName}/runHistory?api-version=2025-03-01"
$history = Invoke-RestMethod -Uri $uri -Headers $headers
$history.value | ForEach-Object {
    [PSCustomObject]@{
        RunId = $_.properties.runId
        Status = $_.properties.status
        SubmittedTime = $_.properties.submittedTime
        FileName = $_.properties.fileName
    }
} | Format-Table -AutoSize

Build a scheduled check (Azure Function on a timer trigger, or even a simple Logic App) that queries the execution history daily and sends an alert if the last run did not succeed or if no run occurred in the expected window.
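The decision logic for that scheduled check is simple enough to sketch. The entries below are shaped like the runHistory rows printed above (submittedTime plus status); the "Completed" status value and the 24-hour window are assumptions to adjust against your own run history.

```python
from datetime import datetime, timedelta, timezone

def export_is_healthy(runs: list, now: datetime,
                      max_age_hours: int = 24) -> bool:
    """Flag an export as unhealthy if its newest run failed or is stale."""
    if not runs:
        return False
    latest = max(runs, key=lambda r: r["submittedTime"])
    age = now - datetime.fromisoformat(latest["submittedTime"])
    return latest["status"] == "Completed" and age <= timedelta(hours=max_age_hours)

now = datetime(2026, 4, 2, 9, 0, tzinfo=timezone.utc)
runs = [
    {"submittedTime": "2026-04-01T08:05:00+00:00", "status": "Completed"},
    {"submittedTime": "2026-04-02T08:05:00+00:00", "status": "Completed"},
]
print(export_is_healthy(runs, now))
```

Wire the unhealthy branch to an email or Teams webhook and the silent-failure modes listed above stop being silent.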

Export Scope Considerations

Management group-level exports aggregate costs across all child subscriptions but carry several limitations: they only support the Usage export type (not FOCUS, AmortizedCost, or ReservationTransactions), output CSV without compression, and do not support multiple currencies. They work exclusively with Enterprise Agreement billing. If these limitations are acceptable, a single management group export covering up to 3,000 subscriptions is the most efficient way to capture comprehensive cost data.

Cost Export Best Practices

Several practices distinguish a reliable export setup from one that quietly degrades over time.

Use FOCUS Format When Possible

The FocusCost export type reduces the number of exports you need to maintain from two (ActualCost plus AmortizedCost) to one. It follows the open FOCUS specification, which standardizes column names and semantics across cloud providers. If you plan to add AWS or GCP cost data to your analysis later, starting with FOCUS avoids a painful schema migration.

Prefer Parquet Over CSV

Parquet files are smaller, faster to query, and enforce data types natively. The only reasons to choose CSV are direct human consumption (opening in Excel) or management group-level exports where Parquet is not supported. For everything else, Parquet with Snappy compression delivers the best combination of storage efficiency and query performance.

Set Up Both Daily and Monthly Exports

Daily exports with overwrite enabled give you a rolling near-real-time view of month-to-date costs. Monthly exports produce the finalized, immutable snapshot that serves as the system of record for each billing period. Run both against the same storage account in different folders — the daily export supports operational monitoring while the monthly export supports financial reporting and audit.

Extend Export Recurrence Periods

Exports have a recurrence period with a start and end date. When the end date passes, the export stops running silently. Set the recurrence end date far into the future (three years is reasonable) and add a calendar reminder to review and extend it before expiry. Alternatively, use the Bicep template with a parameterized end date and redeploy annually as part of your infrastructure maintenance cycle.
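The calendar reminder can be backed up by code: the same scheduled check that monitors run health can compare each export's recurrence end date to a review window. A minimal sketch, with a hypothetical 90-day warning threshold:

```python
from datetime import datetime, timezone

def needs_extension(recurrence_to: str, now: datetime,
                    warn_days: int = 90) -> bool:
    """True when the export's recurrence end date is inside the review window."""
    end = datetime.fromisoformat(recurrence_to.replace("Z", "+00:00"))
    return (end - now).days <= warn_days

now = datetime(2026, 4, 1, tzinfo=timezone.utc)
print(needs_extension("2027-04-01T00:00:00Z", now))  # a year out: no action yet
print(needs_extension("2026-05-15T00:00:00Z", now))  # inside the window: extend
```

Feed it the schedule.recurrencePeriod.to value from each export's GET response and alert on any True result.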

Tag Your Storage Account

The storage account receiving cost exports incurs its own costs for storage and transactions. Tag it with a FinOps or cost reporting identifier so that the storage cost of your cost reporting infrastructure is tracked and justified separately from application workloads. The irony of losing visibility into your cost visibility infrastructure is worth preventing.

Automated exports form the foundation of any mature FinOps practice. They replace manual processes, create historical records that become more valuable over time, and feed the downstream analytics that turn cost data into cost decisions. The setup effort is measured in minutes; the return compounds every day the exports run.
