How to automate Azure cost exports using scheduled exports

Introduction to Azure Cost Exports

Managing cloud costs at scale requires more than manual checks in the Azure portal. Scheduled exports in Azure Cost Management automatically push cost and usage data to Azure Blob Storage on a recurring basis, giving you a reliable pipeline for downstream analytics, compliance auditing, and FinOps workflows. Whether you need daily snapshots of month-to-date spending or monthly summaries for chargeback reports, exports eliminate the manual effort of downloading CSV files and ensure consistent, timely data delivery.

This guide walks through every aspect of setting up automated cost exports, from portal configuration to CLI commands and REST API calls, so you can build a fully automated cost data pipeline for your organization.

Understanding Export Types and Dataset Options

Azure Cost Management supports several dataset types for exports, each serving a different analytical purpose. Choosing the right one ensures you capture the data you actually need without unnecessary bloat.

  • ActualCost — Shows costs as they appear on your invoice, including one-time purchases like reservations charged in full during the purchase month.
  • AmortizedCost — Spreads reservation and savings plan purchases evenly across the commitment term, making it easier to attribute costs to consuming resources.
  • FOCUS — Uses the FinOps Open Cost and Usage Specification format, combining actual and amortized costs into a single export with a standardized schema designed for multi-cloud cost analysis.
  • Usage — Legacy format primarily maintained for backward compatibility.
  • PriceSheet — Exports your negotiated pricing for all meters.
  • ReservationDetails — Shows utilization records for each reservation.
  • ReservationRecommendations — Provides purchase recommendations based on your usage patterns.
  • ReservationTransactions — Records reservation purchase and refund events.

For most FinOps workflows, start with ActualCost for invoice reconciliation and AmortizedCost for internal chargeback. If you are building a multi-cloud cost platform, FOCUS provides the best schema compatibility.

File Formats and Storage Organization

Exports support two output formats: CSV and Parquet. Each has compression options that significantly affect storage costs and query performance.

Format    Compression   Best For                                   Typical Size Reduction
CSV       None          Simple scenarios, Excel analysis           Baseline
CSV       Gzip          Storage savings with CSV compatibility     ~80%
Parquet   Snappy        Large-scale analytics, Power BI, Fabric    ~90%
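Actual savings depend on the shape of your data, since cost exports are highly repetitive (the same meter names, resource groups, and subscription IDs repeat on every row). As a quick sanity check, this sketch compresses a synthetic cost CSV with Python's standard gzip module; the rows are made up, so the exact ratio is illustrative only:

```python
import gzip

# Build a synthetic month-to-date cost CSV (repetitive, like real export data).
header = "Date,MeterName,ResourceGroupName,Quantity,CostInBillingCurrency\n"
rows = [f"2025-01-{d:02d},Standard D2s v3,rg-app-prod,24.0,2.112\n"
        for d in range(1, 31)]
raw = (header + "".join(rows) * 500).encode("utf-8")  # pad to a realistic size

compressed = gzip.compress(raw)
ratio = 1 - len(compressed) / len(raw)
print(f"raw: {len(raw):,} bytes, gzip: {len(compressed):,} bytes, saved {ratio:.0%}")
```

Repetitive billing data routinely compresses far better than generic text, which is why the table's estimates are as high as they are.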

File partitioning is always enabled. Large exports are automatically split into chunks smaller than 1 GB, with a manifest.json file that describes all parts. The directory structure follows a predictable pattern that makes it easy to build automated pipelines:

StorageContainer/
  StorageDirectory/
    ExportName/
      [YYYYMMDD-YYYYMMDD]/
        [RunID]/
          manifest.json
          part0.csv
          part1.csv

The date range folder uses the export period boundaries, and the RunID ensures each execution creates a unique folder. For daily exports with file overwrite enabled, the previous day’s file is replaced with an updated version that includes any late-arriving charges.
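Because the layout is deterministic, downstream pipelines can compute where a run's files will land before reading anything from storage. A minimal sketch, pure string assembly with no Azure SDK (the run ID shown is a placeholder GUID):

```python
from datetime import date

def export_blob_prefix(directory: str, export_name: str,
                       period_start: date, period_end: date,
                       run_id: str) -> str:
    """Build the blob prefix for one export run, following the
    StorageDirectory/ExportName/[YYYYMMDD-YYYYMMDD]/[RunID]/ layout above."""
    date_range = f"{period_start:%Y%m%d}-{period_end:%Y%m%d}"
    return f"{directory}/{export_name}/{date_range}/{run_id}/"

prefix = export_blob_prefix("daily-actual", "DailyActualCost",
                            date(2025, 1, 1), date(2025, 1, 31),
                            "c5a1b2d3-0000-0000-0000-000000000000")
print(prefix)
# daily-actual/DailyActualCost/20250101-20250131/c5a1b2d3-0000-0000-0000-000000000000/
```

A pipeline would then list blobs under this prefix, read manifest.json first, and process the part files it enumerates.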

Creating Scheduled Exports in the Azure Portal

The portal provides a guided wizard that walks you through export configuration in four steps. This approach is ideal for initial setup or when you need to create a small number of exports.

Step 1: Navigate to Exports

Sign in to the Azure portal and search for Cost Management. Select your billing scope — this can be a subscription, resource group, management group, or billing account. In the left menu, click Exports, then click + Create.

Step 2: Configure the Basics

Choose from predefined templates or select Create your own export for full control. Templates include common configurations like monthly actual cost exports or daily amortized cost exports. Custom exports let you specify:

  • Export name (unique within the scope)
  • Up to 10 datasets per export definition
  • Frequency: One-time, Daily, Weekly, or Monthly
  • Date range for the export schedule

Step 3: Set the Destination

Select the storage account, container, and directory path. The storage account must be in the same Microsoft Entra tenant as the export scope. For firewall-protected storage accounts, you must enable the Allow trusted Microsoft services to access this storage account exception and use API version 2023-08-01 or later. The storage account also needs its Permitted scope for copy operations setting configured to allow copies from any storage account.

Step 4: Review and Create

Verify all settings and click Create. The first export run typically starts within a few hours. You can also trigger an immediate run by selecting the export and clicking Run now.

Automating Exports with Azure CLI

For infrastructure-as-code workflows and automated provisioning, the Azure CLI provides full control over export creation and management. Here is a complete example that creates the prerequisite resources and configures a daily export.

# Create a resource group for cost management resources
az group create \
  --name rg-costmanagement \
  --location eastus

# Create a storage account for export data
az storage account create \
  --resource-group rg-costmanagement \
  --name stcostexports2025 \
  --sku Standard_LRS \
  --kind StorageV2 \
  --min-tls-version TLS1_2

# Create a blob container
az storage container create \
  --name cost-exports \
  --account-name stcostexports2025

# Create a daily actual cost export
az costmanagement export create \
  --name DailyActualCost \
  --type ActualCost \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000" \
  --storage-account-id "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-costmanagement/providers/Microsoft.Storage/storageAccounts/stcostexports2025" \
  --storage-container cost-exports \
  --storage-directory daily-actual \
  --timeframe MonthToDate \
  --recurrence Daily \
  --recurrence-period from="2025-01-01T00:00:00Z" to="2025-12-31T00:00:00Z" \
  --schedule-status Active

You can manage existing exports with additional commands:

# List all exports at subscription scope
az costmanagement export list \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000"

# Trigger an immediate export run
az costmanagement export execute \
  --name DailyActualCost \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000"

# View export execution history
az costmanagement export execution-history list \
  --name DailyActualCost \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000"

# Delete an export
az costmanagement export delete \
  --name DailyActualCost \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000"

Creating Exports with the REST API

The Exports REST API provides the most granular control, including column selection and advanced scheduling options. This is the preferred approach for enterprise automation and Terraform/Bicep deployments.

PUT https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.CostManagement/exports/DailyActualCost?api-version=2023-08-01

{
  "properties": {
    "schedule": {
      "status": "Active",
      "recurrence": "Daily",
      "recurrencePeriod": {
        "from": "2025-01-01T00:00:00Z",
        "to": "2025-12-31T00:00:00Z"
      }
    },
    "format": "Csv",
    "deliveryInfo": {
      "destination": {
        "resourceId": "/subscriptions/{subscriptionId}/resourceGroups/rg-costmanagement/providers/Microsoft.Storage/storageAccounts/stcostexports2025",
        "container": "cost-exports",
        "rootFolderPath": "daily-actual"
      }
    },
    "definition": {
      "type": "ActualCost",
      "timeframe": "MonthToDate",
      "dataSet": {
        "granularity": "Daily",
        "configuration": {
          "columns": [
            "Date",
            "MeterId",
            "MeterName",
            "MeterCategory",
            "ResourceId",
            "ResourceName",
            "ResourceType",
            "ResourceLocation",
            "ResourceGroupName",
            "SubscriptionId",
            "SubscriptionName",
            "Quantity",
            "CostInBillingCurrency",
            "UnitPrice",
            "Tags"
          ]
        }
      }
    }
  }
}

Specifying individual columns in the configuration.columns array reduces file size and processing time by excluding fields you do not need. If you omit the columns property entirely, all available columns are included.

Provisioning Exports with Bicep

For infrastructure-as-code deployments, you can define cost exports as Bicep resources. This ensures your export configuration is version-controlled and repeatable.

resource costExport 'Microsoft.CostManagement/exports@2023-08-01' = {
  name: 'DailyActualCost'
  scope: subscription()
  properties: {
    schedule: {
      status: 'Active'
      recurrence: 'Daily'
      recurrencePeriod: {
        from: '2025-01-01T00:00:00Z'
        to: '2025-12-31T00:00:00Z'
      }
    }
    format: 'Csv'
    deliveryInfo: {
      destination: {
        resourceId: storageAccount.id
        container: 'cost-exports'
        rootFolderPath: 'daily-actual'
      }
    }
    definition: {
      type: 'ActualCost'
      timeframe: 'MonthToDate'
      dataSet: {
        granularity: 'Daily'
      }
    }
  }
}

Handling Firewall-Protected Storage Accounts

Many organizations restrict their storage accounts with virtual network rules and firewalls. Cost Management exports can still write to these accounts, but additional configuration is required.

  1. On the storage account, navigate to Networking and ensure Allow trusted Microsoft services to access this storage account is checked under Exceptions.
  2. Use the Exports API version 2023-08-01 or later — older API versions do not support system-assigned managed identity authentication.
  3. Assign the Storage Blob Data Contributor role to the Cost Management service principal on the storage account.
  4. Set the Permitted scope for copy operations to allow copies from any storage account.

Monitoring Export Health and Execution History

After creating exports, establish a monitoring routine to ensure data is flowing reliably. Export failures are often silent — they do not generate Azure Monitor alerts by default.

# Check execution history for errors
az costmanagement export execution-history list \
  --name DailyActualCost \
  --scope "subscriptions/00000000-0000-0000-0000-000000000000" \
  --query "[].{Type:executionType, Status:status, Start:processingStartTime, End:processingEndTime, File:fileName}" \
  --output table

Common failure reasons include storage account access issues, expired SAS tokens (if using SAS-based authentication), and scope changes that invalidate the export. Set up an Azure Logic App or Function that periodically checks the execution history API and sends alerts when runs fail or row counts drop unexpectedly.
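The alerting logic itself is simple once you have the execution history in hand. This sketch filters a list of run records for anything that did not complete; the field names follow the run objects returned by the execution history API, and the alert sink is left as a stub:

```python
def find_failed_runs(runs: list[dict]) -> list[str]:
    """Return human-readable alerts for export runs that did not complete."""
    alerts = []
    for run in runs:
        props = run.get("properties", {})
        if props.get("status") != "Completed":
            alerts.append(f"Export run {run.get('name')} "
                          f"status={props.get('status')} "
                          f"error={props.get('error', {}).get('message', 'n/a')}")
    return alerts

# Sample data shaped like an execution history response:
history = [
    {"name": "run-01", "properties": {"status": "Completed"}},
    {"name": "run-02", "properties": {"status": "Failed",
                                      "error": {"message": "Storage access denied"}}},
]
for alert in find_failed_runs(history):
    print(alert)  # wire this to email, Teams, or an Azure Monitor alert instead
```

Run this logic on a timer (a daily Azure Function works well) so a broken export is caught within a day rather than at month-end close.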

Building a Pipeline: Exports to Analytics

Raw cost data in blob storage becomes valuable when connected to analytics tools. A common pipeline architecture connects exports to Power BI, Microsoft Fabric, or Azure Data Explorer for interactive analysis.

Power BI Direct Connect

Power BI offers two paths. The Azure Cost Management connector in Power BI Desktop pulls data directly from the Cost Management API for a billing scope, with no export required. To report on export files instead, point Power BI at your storage account with the Azure Blob Storage connector, select the export container, and combine the partitioned CSV or Parquet files in Power Query.

Microsoft Fabric Integration

Create a Fabric Lakehouse and use a data pipeline to ingest Parquet exports. This approach scales to hundreds of subscriptions and years of historical data. Schedule the pipeline to run after your daily export completes.

Azure Data Factory

For complex transformations — such as joining cost data with CMDB records, tag normalization, or currency conversion — use Azure Data Factory pipelines to process export files and load them into a dedicated analytics database.

Common Pitfalls and Troubleshooting

Even well-configured exports can encounter issues. Understanding the most common problems helps you resolve them quickly.

  • Delayed data for new subscriptions — New subscriptions may take up to 48 hours before export data becomes available. Wait at least two days before investigating missing data.
  • Double runs at month boundaries — Daily exports run twice per day during the first five days of each month to capture late-arriving charges from the previous billing period. This is expected behavior.
  • Export runs take up to 24 hours — Large exports, especially at management group scope, can take a full day to complete. Do not assume failure until 24 hours have passed.
  • Scope limitations — Management groups support a maximum of 3,000 subscriptions per export. If you exceed this limit, split into multiple management group scopes.
  • Historical data limits — The portal supports up to 13 months of historical data in exports. For longer lookback periods, use the REST API, which supports up to 7 years of cost and usage data.
  • UTC timezone requirement — The recurrencePeriod dates in the API must be in UTC. The API does not perform timezone conversion, and using local times can cause exports to run at unexpected times.
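The UTC pitfall in particular is easy to trip over when generating schedules programmatically. This sketch converts a local wall-clock datetime into the UTC timestamp form used in recurrencePeriod, with only the standard library:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_recurrence_timestamp(local_dt: datetime, tz_name: str) -> str:
    """Convert a naive local datetime to the 'YYYY-MM-DDTHH:MM:SSZ' UTC form
    expected in the export schedule's recurrencePeriod."""
    aware = local_dt.replace(tzinfo=ZoneInfo(tz_name))
    return aware.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%SZ")

# Midnight on January 1 in US Eastern time is 05:00 UTC:
print(to_recurrence_timestamp(datetime(2025, 1, 1, 0, 0), "America/New_York"))
# 2025-01-01T05:00:00Z
```

Passing local times through unchanged would shift the schedule by your UTC offset, which is exactly the surprise the bullet above warns about.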

Best Practices for Production Deployments

Follow these recommendations to build a robust, cost-effective export pipeline:

  1. Use Parquet with Snappy compression for any export feeding analytics workloads. The storage savings alone can be significant for large organizations.
  2. Create separate exports for actual and amortized costs rather than relying on a single export type. Each serves a different analytical purpose.
  3. Set export scope at the management group level to aggregate costs across subscriptions in a single dataset, reducing the number of exports to manage.
  4. Implement lifecycle management policies on the storage account to automatically move old export data to cool or archive tiers, or delete it after your retention period.
  5. Monitor export executions programmatically — do not rely on manual checks. Build an alert system that flags failed or zero-row exports.
  6. Version your export configurations using Bicep or Terraform so changes are tracked and environments stay consistent.
  7. Use the FOCUS format if you plan to integrate with multi-cloud cost tools like the FinOps toolkit or third-party platforms.
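For practice 4, a storage lifecycle management policy handles aging automatically. The sketch below tiers export blobs to cool after 90 days and deletes them after 400; the rule name, prefix, and day thresholds are examples to adapt to your own retention policy:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "age-out-cost-exports",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "cost-exports/daily-actual" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 90 },
            "delete": { "daysAfterModificationGreaterThan": 400 }
          }
        }
      }
    }
  ]
}
```

Apply the policy to the export storage account via the portal's Lifecycle management blade or az storage account management-policy create.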

Conclusion

Automated cost exports are the foundation of any serious Azure FinOps practice. They transform cost visibility from a reactive, portal-based activity into a proactive, data-driven pipeline. By configuring scheduled exports with the right dataset types, file formats, and destination storage, you create a reliable stream of cost intelligence that feeds dashboards, alerts, chargeback systems, and anomaly detection workflows. Start with a daily ActualCost export at your management group scope, add Parquet formatting for analytics efficiency, and build monitoring around the execution history API to ensure continuous data flow.

For more details, refer to the official documentation: What is Microsoft Cost Management.
