Connect your cloud and SaaS billing data so Costory can ingest, normalize, and surface costs in the Cost Explorer. Unlike native billing consoles, Costory gives platform engineers a single schema for reporting multi-cloud costs in the context of usage and event data, with no pipelines to build or maintain. Once a datasource is added, Costory starts ingesting automatically. All data is mapped to Standard Columns, so you can group and filter costs by service, account, region, resource, and tags across all your providers in a single view.

Supported providers
Click a provider to jump to the setup steps.
AWS
GCP
Azure
Custom Billing Format (FOCUS)
Confluent
Aiven
Cursor
Anthropic
GCP CUD Metadata
Datadog (billing)
Elastic Cloud
Snowflake
Before you start
- A Costory account. You need an existing account to connect datasources.
- Admin billing access to the cloud provider(s) you want to connect (e.g., AWS Billing console, GCP Billing Admin, Azure Cost Management). See Your permissions below for details.
- Terraform is optional. Every provider can be connected via the Costory UI. Terraform examples are provided for infrastructure-as-code workflows.
- Using Terraform? Generate a Costory API token under your org name (top-right) > API Tokens. Full provider reference: Costory Terraform Provider.
- Read-only access only. Costory never writes to your cloud accounts or modifies your infrastructure. See Security for details.
- Unlimited datasources. Connect as many AWS accounts, GCP projects, Azure subscriptions, and SaaS providers as you need within one Costory organization.
- Already have a billing export? If you already export CUR 2.0 (AWS), BigQuery billing (GCP), or Cost Management data (Azure) for another tool, you can reuse it. No need to create a second export. Just grant Costory read access to the existing bucket, dataset, or container.
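If you follow the Terraform path, the API token mentioned above is passed to the provider. A minimal sketch; the provider source address and attribute names here are assumptions, so verify them against the Costory Terraform Provider reference:

```hcl
# Hypothetical provider wiring -- confirm the source address and
# attribute names in the Costory Terraform Provider reference.
terraform {
  required_providers {
    costory = {
      source = "costory/costory" # assumed registry address
    }
  }
}

variable "costory_api_token" {
  type      = string
  sensitive = true # generated under your org name (top-right) > API Tokens
}

provider "costory" {
  api_token = var.costory_api_token # attribute name is an assumption
}
```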
Your permissions
To perform the setup, you need the following permissions on your cloud account (separate from the read-only access Costory uses afterwards):
| Provider | Permissions you need |
|---|---|
| AWS | iam:CreateRole, s3:CreateBucket, Billing console access (to create CUR exports) |
| GCP | BigQuery Admin on the billing dataset, Billing Admin |
| Azure | Cost Management Contributor, Storage Account Contributor |
Historical data backfill
Each provider has different backfill capabilities. Costory ingests whatever history is available in the export:
| Provider | Backfill depth | How |
|---|---|---|
| AWS | Up to 12 months | Automatic via aws:createdBy tag backfill. For longer periods, open an AWS Support request. |
| GCP | Current + previous month | GCP only backfills 2 months by default. |
| Azure | Up to 13 months | Use Export selected dates in the portal (1-month chunks) or the Terraform run_backfill variable. |
| SaaS providers | Up to 12 months | Automatic on first sync (depends on provider data retention). |
| Elastic Cloud | ~6 months on first sync | Product-line costs per project from the Elastic Cloud Billing API; incremental sync afterward. |
What happens after setup
Once a datasource is connected, Costory automatically normalizes your billing data into Standard Columns and runs Feature Engineering to detect, merge, and clean up your cost allocation tags across all providers. Within minutes (or up to 12 hours for the first AWS export), your costs are ready to explore in the Cost Explorer. Follow the Quickstart guide to build your first dashboard, set up alerts, and start tracking cost trends.
Next steps
Connect Usage Metrics
Connect Events
FAQ
What permissions does Costory need?
| Provider | Permissions |
|---|---|
| AWS | s3:ListBucket, s3:GetObject on the CUR export bucket (via federated IAM role) |
| GCP | roles/bigquery.dataViewer, roles/bigquery.metadataViewer on the billing dataset |
| Azure | Read + List on the storage container (via SAS token) |
| Datadog | usage_read, billing_read, ci_visibility_read (optional), timeseries_query (optional) |
| Elastic Cloud | Elastic Cloud Organization API key with access to the Billing API (see Elastic Cloud setup) |
| Confluent | Billing read access on the API key |
| Aiven | organization:billing:read on the application user |
| Cursor | Admin API key (organization-scoped, not billing-specific) |
| Anthropic | Admin API key (organization-scoped, not billing-specific) |
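For AWS, the two S3 permissions in the table can be expressed as a read-only policy attached to the federated role. A sketch using the standard Terraform AWS provider; the bucket name is illustrative, and the role's web-identity trust configuration is specific to your setup and omitted here:

```hcl
# Read-only policy for the CUR export bucket; attach it to the
# federated IAM role that Costory assumes. Bucket name is illustrative.
data "aws_iam_policy_document" "costory_read" {
  statement {
    actions   = ["s3:ListBucket"]
    resources = ["arn:aws:s3:::my-cur-export-bucket"]
  }
  statement {
    actions   = ["s3:GetObject"]
    resources = ["arn:aws:s3:::my-cur-export-bucket/*"]
  }
}

resource "aws_iam_policy" "costory_read" {
  name   = "costory-cur-read"
  policy = data.aws_iam_policy_document.costory_read.json
}
```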
Can I connect at the organization level instead of per-account?
- AWS: Yes. Connect a single CUR export from your management (payer) account to cover all member accounts in the organization.
- GCP: Yes. The billing export covers all projects under the billing account automatically.
- Azure: Exports can be scoped at the Management Group or Billing Account level, but this is often restricted by customer agreements. Per-subscription exports are the most common setup.
Where does Costory store my data?
How do I rotate or update credentials?
- AWS: No rotation needed. Costory uses a federated IAM role (web identity), so there are no long-lived credentials to manage.
- GCP: No rotation needed. Access is granted via a GCP service account.
- Azure: You can update the SAS token in-place from the Costory UI or via Terraform. The default Terraform config sets a 900-day expiry, so no action is needed for roughly two and a half years.
- SaaS providers (Datadog, Confluent, Aiven, Cursor, Anthropic, Elastic Cloud): Create a new datasource with the new API key, then delete the old one. No data gap or re-ingestion occurs.
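For the Azure case, a SAS token refresh can be sketched with the standard azurerm data source. The storage account reference and dates are illustrative; only Read and List are granted, matching the access Costory needs:

```hcl
# Read+List SAS for the export container, with the ~900-day expiry
# mentioned above. Account reference and dates are illustrative.
data "azurerm_storage_account_sas" "costory" {
  connection_string = azurerm_storage_account.export.primary_connection_string
  https_only        = true

  resource_types {
    service   = false
    container = true
    object    = true
  }
  services {
    blob  = true
    queue = false
    table = false
    file  = false
  }

  start  = "2025-01-01T00:00:00Z"
  expiry = "2027-06-19T00:00:00Z" # roughly 900 days later

  permissions {
    read    = true
    list    = true
    write   = false
    delete  = false
    add     = false
    create  = false
    update  = false
    process = false
    tag     = false
    filter  = false
  }
}
```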
Can I delete a datasource? What happens to historical data?
Yes. Delete the datasource from the Costory UI or, if you created it with Terraform, run terraform destroy. All ingested data for that datasource is permanently deleted. If you need to restore it, contact us.
How do I fully remove Costory from my cloud accounts?
- AWS: Delete the S3 bucket, IAM role, and CUR export, or revert the CloudFormation stack if you used that path.
- GCP: Revoke the BigQuery dataset access granted to the Costory service account.
- Azure: Delete the storage account, Cost Management exports, and resource group.
If you set everything up via Terraform, terraform destroy handles all cloud-side and Costory-side cleanup automatically.
Can I reuse an existing billing export?
- AWS: Reuse your existing CUR 2.0 export (Parquet format). Skip the bucket/export creation and only create the federated IAM role so Costory can read the data.
- GCP: There is only one BigQuery billing export table per billing account, so Costory always uses the same table as any other tool.
- Azure: Reuse your existing Cost Management export (Parquet format). Skip the export creation and only generate a SAS token for Costory.
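Granting read access to an existing GCP billing dataset can be sketched with the standard Terraform Google provider. The dataset ID and the Costory service-account address below are placeholders; use the values shown in the Costory UI:

```hcl
# Grant Costory's service account read access to the existing
# billing export dataset. Dataset ID and member are placeholders.
resource "google_bigquery_dataset_iam_member" "costory_data_viewer" {
  dataset_id = "billing_export"
  role       = "roles/bigquery.dataViewer"
  member     = "serviceAccount:ingest@costory-example.iam.gserviceaccount.com" # hypothetical
}

resource "google_bigquery_dataset_iam_member" "costory_metadata_viewer" {
  dataset_id = "billing_export"
  role       = "roles/bigquery.metadataViewer"
  member     = "serviceAccount:ingest@costory-example.iam.gserviceaccount.com" # hypothetical
}
```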
Does Costory require IP allowlisting or special network rules?
What cloud costs does the setup create?
- AWS: S3 storage for CUR exports. Use overwrite mode (enabled by default) to keep storage flat.
- GCP: BigQuery storage for the billing export table.
- Azure: Blob storage for the export container.
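Keeping S3 storage flat relies on overwrite versioning on the CUR export. A sketch using the classic aws_cur_report_definition Terraform resource (CUR 2.0 exports are created through AWS Data Exports, so adapt as needed); names and region are illustrative:

```hcl
# CUR export with overwrite versioning: each delivery replaces the
# previous one, so the bucket's storage footprint stays flat.
resource "aws_cur_report_definition" "costory" {
  report_name                = "costory-cur"
  time_unit                  = "HOURLY"
  format                     = "Parquet"
  compression                = "Parquet"
  additional_schema_elements = ["RESOURCES"]
  s3_bucket                  = "my-cur-export-bucket"
  s3_prefix                  = "cur"
  s3_region                  = "us-east-1"
  report_versioning          = "OVERWRITE_REPORT" # overwrite mode
}
```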
How do I monitor ingestion health?
- Check the status of each datasource.
- See the last successful sync timestamp.
- Force a manual refresh if needed.
What if my first AWS CUR export doesn't arrive?
Historical data is backfilled automatically via the aws:createdBy tag (up to 12 months); for longer periods, open an AWS Support request.
What if my connection fails?
Still having trouble?
