
Supported providers
Click a provider to jump to the setup steps.
AWS
CUR 2.0 exports via Terraform or manual setup. ~5 min.
GCP
BigQuery billing export with detailed usage costs. <5 min.
Azure
Azure Cost Management exports with SAS access. ~5 min.
Custom Billing Format (FOCUS)
Import FOCUS tables from BigQuery. <5 min.
Confluent
Connect Confluent Cloud invoice data. <5 min.
Aiven
Sync Aiven billing via service account. <5 min.
Cursor
Admin API key for usage charges. <5 min.
Anthropic
Admin API key for API and subscription usage. <5 min.
GCP CUD Metadata
Commitments metadata from BigQuery exports. <5 min.
Datadog (billing)
Pull Datadog usage and billing estimates. <5 min.
Before you start
- A Costory account. You need an existing account to connect datasources.
- Admin billing access to the cloud provider(s) you want to connect (e.g., AWS Billing console, GCP Billing Admin, Azure Cost Management). See Your permissions below for details.
- Terraform is optional. Every provider can be connected via the Costory UI. Terraform examples are provided for infrastructure-as-code workflows.
- Using Terraform? Generate a Costory API token under your org name (top-right) > API Tokens. Full provider reference: Costory Terraform Provider.
- Read-only access only. Costory never writes to your cloud accounts or modifies your infrastructure. See Security for details.
- Unlimited datasources. Connect as many AWS accounts, GCP projects, Azure subscriptions, and SaaS providers as you need within one Costory organization.
- Already have a billing export? If you already export CUR 2.0 (AWS), BigQuery billing (GCP), or Cost Management data (Azure) for another tool, you can reuse it. No need to create a second export. Just grant Costory read access to the existing bucket, dataset, or container.
Your permissions
To perform the setup, you need the following permissions on your cloud account (separate from the read-only access Costory uses afterwards):
| Provider | Permissions you need |
|---|---|
| AWS | iam:CreateRole, s3:CreateBucket, Billing console access (to create CUR exports) |
| GCP | BigQuery Admin on the billing dataset, Billing Admin |
| Azure | Cost Management Contributor, Storage Account Contributor |
Historical data backfill
Each provider has different backfill capabilities. Costory ingests whatever history is available in the export:
| Provider | Backfill depth | How |
|---|---|---|
| AWS | Up to 12 months | Automatic via aws:createdBy tag backfill. For longer periods, open an AWS Support request. |
| GCP | Current + previous month | GCP only backfills 2 months by default. |
| Azure | Up to 13 months | Use Export selected dates in the portal (1-month chunks) or the Terraform run_backfill variable. |
| SaaS providers | Up to 12 months | Automatic on first sync (depends on provider data retention). |
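For the Azure row above, the month-by-month backfill can be scripted. The sketch below computes one-month windows and prints the corresponding az costmanagement export create commands for review rather than running them; the scope, storage account ID, and container name are placeholders you must replace.

```shell
# Sketch: build one-month Azure Cost Management exports to backfill history.
# All resource identifiers below are placeholders; commands are echoed, not run.
SCOPE="/subscriptions/00000000-0000-0000-0000-000000000000"
ACCOUNT_ID="/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-billing/providers/Microsoft.Storage/storageAccounts/billingexports"
CONTAINER="cost-exports"

for i in $(seq 1 3); do   # extend to 13 for the full backfill window
  # First day of the month i months ago, and the last day of that month.
  from=$(date -u -d "$(date -u +%Y-%m-01) -$i month" +%Y-%m-01)
  to=$(date -u -d "$from +1 month -1 day" +%Y-%m-%d)
  echo az costmanagement export create \
    --name "backfill-$from" --scope "$SCOPE" --type ActualCost \
    --storage-account-id "$ACCOUNT_ID" --storage-container "$CONTAINER" \
    --timeframe Custom \
    --time-period from="${from}T00:00:00Z" to="${to}T23:59:59Z"
done
```

Remove the echo once you have confirmed the scope and storage identifiers; each command creates one export covering one calendar month.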
What happens after setup
Once a datasource is connected, Costory automatically normalizes your billing data into Standard Columns and runs Feature Engineering to detect, merge, and clean up your cost allocation tags across all providers. Within minutes (or up to 12 hours for the first AWS export), your costs are ready to explore in the Cost Explorer. Follow the Quickstart guide to build your first dashboard, set up alerts, and start tracking cost trends.
Next steps
Connect Usage Metrics
Overlay usage data on your costs for unit economics and root-cause analysis.
Connect Events
Link deploys, incidents, and other engineering events for full context.
FAQ
What permissions does Costory need?
Costory requires read-only access. Here is the minimal permission set per provider:
Costory never writes to your cloud accounts. See Security for details.
| Provider | Permissions |
|---|---|
| AWS | s3:ListBucket, s3:GetObject on the CUR export bucket (via federated IAM role) |
| GCP | roles/bigquery.dataViewer, roles/bigquery.metadataViewer on the billing dataset |
| Azure | Read + List on the storage container (via SAS token) |
| Datadog | usage_read, billing_read, ci_visibility_read (optional), timeseries_query (optional) |
| Confluent | Billing read access on the API key |
| Aiven | organization:billing:read on the application user |
| Cursor | Admin API key (organization-scoped, not billing-specific) |
| Anthropic | Admin API key (organization-scoped, not billing-specific) |
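The AWS row above (s3:ListBucket on the bucket, s3:GetObject on its objects) can be expressed as a minimal IAM policy. This is a sketch: the bucket name and role name are placeholders, and the apply step is commented out so you can review it first.

```shell
# Sketch of the read-only S3 policy Costory's federated role needs.
# "my-cur-bucket" is a placeholder for your CUR export bucket.
cat > costory-read-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-cur-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-cur-bucket/*"
    }
  ]
}
EOF
# Validate the JSON locally before attaching it.
python3 -m json.tool costory-read-policy.json > /dev/null && echo "policy OK"
# Attach to the role created during setup (role name is an example):
# aws iam put-role-policy --role-name CostoryReadRole \
#   --policy-name costory-cur-read --policy-document file://costory-read-policy.json
```

Note that ListBucket applies to the bucket ARN itself while GetObject applies to the /* object ARN; mixing these up is a common cause of AccessDenied errors.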
Can I connect at the organization level instead of per-account?
- AWS: Yes. Connect a single CUR export from your management (payer) account to cover all member accounts in the organization.
- GCP: Yes. The billing export covers all projects under the billing account automatically.
- Azure: Exports can be scoped at the Management Group or Billing Account level, but this is often restricted by customer agreements. Per-subscription exports are the most common setup.
Where does Costory store my data?
Costory stores ingested billing data in GCP EU by default. If you need US residency, contact us to update your configuration. For full details on compliance and certifications, see Security.
How do I rotate or update credentials?
- AWS: No rotation needed. Costory uses a federated IAM role (web identity), so there are no long-lived credentials to manage.
- GCP: No rotation needed. Access is granted via a GCP service account.
- Azure: You can update the SAS token in-place from the Costory UI or via Terraform. The default Terraform config sets a 900-day expiry, so no action is needed for about two and a half years.
- SaaS providers (Datadog, Confluent, Aiven, Cursor, Anthropic): Create a new datasource with the new API key, then delete the old one. No data gap or re-ingestion occurs.
Can I delete a datasource? What happens to historical data?
Yes, you can delete a datasource from the Costory UI or via terraform destroy. All ingested data for that datasource is permanently deleted. If you need to restore it, contact us.
How do I fully remove Costory from my cloud accounts?
Delete the datasource in Costory first, then clean up the cloud-side resources:
- AWS: Delete the S3 bucket, IAM role, and CUR export, or revert the CloudFormation stack if you used that path.
- GCP: Revoke the BigQuery dataset access granted to the Costory service account.
- Azure: Delete the storage account, Cost Management exports, and resource group.
If you used Terraform, terraform destroy handles all cloud-side and Costory-side cleanup automatically.
Can I reuse an existing billing export?
Yes. If you already have a billing export for another tool, you do not need a second one:
- AWS: Reuse your existing CUR 2.0 export (Parquet format). Skip the bucket/export creation and only create the federated IAM role so Costory can read the data.
- GCP: There is only one BigQuery billing export table per billing account, so Costory always uses the same table as any other tool.
- Azure: Reuse your existing Cost Management export (Parquet format). Skip the export creation and only generate a SAS token for Costory.
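For the GCP case, granting Costory read access to an existing billing dataset comes down to adding a READER entry for its service account to the dataset's access list. The sketch below builds that entry locally; the service account address is hypothetical (copy the real one from the Costory UI), and the bq apply step is commented out because bq update --source replaces the access list, so you should merge this entry into the output of bq show first.

```shell
# Sketch: a READER access entry for Costory's service account.
# The address is hypothetical -- copy the real one from the Costory UI.
cat > access.json <<'EOF'
{
  "access": [
    {"role": "READER", "userByEmail": "ingest@costory-prod.iam.gserviceaccount.com"}
  ]
}
EOF
python3 -m json.tool access.json > /dev/null && echo "access OK"
# Merge this entry into the existing ACL (bq show --format=prettyjson
# my_project:billing_export), then apply:
# bq update --source access.json my_project:billing_export
```

Dataset-level access keeps the grant scoped to the billing export only, rather than the whole project.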
Does Costory require IP allowlisting or special network rules?
No. Costory accesses your data over the public internet using the credentials you provide (IAM role, SAS token, or API key). No IP allowlisting, firewall rules, or private connectivity setup is needed on your side.
What cloud costs does the setup create?
The cloud-side costs are negligible. You only pay for storage (and minor egress) on the export bucket or dataset:
- AWS: S3 storage for CUR exports. Use overwrite mode (enabled by default) to keep storage flat.
- GCP: BigQuery storage for the billing export table.
- Azure: Blob storage for the export container.
How do I monitor ingestion health?
Use the Data Health dashboard in Costory (Settings > Data health) to:
- Check the status of each datasource.
- See the last successful sync timestamp.
- Force a manual refresh if needed.
What if my first AWS CUR export doesn't arrive?
The first CUR export can take up to 12 hours. Costory sends you an email when data is received. If nothing arrives after 12 hours, contact AWS Support to force-trigger the export. This is rare but can happen.
A common pitfall is forgetting to backfill historical data: AWS only exports going forward. Use the aws:createdBy tag backfill (up to 12 months) or open an AWS Support request for longer periods.
What if my connection fails?
Costory validates the connection when you add a datasource, both via the UI and Terraform. If credentials or permissions are misconfigured (e.g., wrong IAM role trust, invalid SAS token), you’ll receive an immediate error message before any sync is attempted.
Still having trouble?
If your question isn’t answered here, contact us. We typically respond within a few hours.
