I migrated manually created GCP workload identity pools and GitHub repositories into Terraform so they could be managed consistently in code. The main friction point was provider-specific import ID formats.

Basic import

terraform import module.my-app.github_repository.repo my-repo-name

Import blocks (Terraform 1.5+)

For team workflows, I prefer import blocks because they’re reviewable in code:

import {
  to = module.my_app.github_repository.repo
  id = "my-repo-name"
}

Run:

terraform plan
terraform apply

After a successful apply, remove the import block (or keep it briefly if you want an audit trail in git history).
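Terraform 1.5+ can also generate starter configuration for resources named in import blocks, which saves writing the HCL by hand. A sketch (the output filename is arbitrary):

```shell
# Write generated HCL for resources referenced by import blocks to generated.tf
terraform plan -generate-config-out=generated.tf
```

The generated code is a starting point; expect to tidy it up before committing.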

For resources inside modules, the address includes the full module path.
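For instance, a repository managed by a module instanced with for_each would be addressed like this (the module name `repos` and the key `api` are made up for illustration):

```hcl
import {
  # Hypothetical module "repos" using for_each; the instance key goes in brackets
  to = module.repos["api"].github_repository.repo
  id = "api-repo-name"
}
```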

Resources with special characters

Quote the resource address if it has square brackets:

terraform import 'module.my_module.google_iam_workload_identity_pool_provider.provider["my-key"]' \
    my-pool/my-provider

Finding resource IDs

For GCP resources, use gcloud to find the right IDs:

gcloud iam workload-identity-pools providers list \
    --workload-identity-pool=my-pool \
    --location=global \
    --project=my-project
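Adding a --format flag prints just the full resource names, which map directly onto the provider's import IDs (same made-up pool and project as above):

```shell
# Print only the full resource names, one per line
gcloud iam workload-identity-pools providers list \
    --workload-identity-pool=my-pool \
    --location=global \
    --project=my-project \
    --format="value(name)"
```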

After importing

Run plan to see what doesn’t match:

terraform plan

Usually some settings differ from your config: update the config to match reality, then plan again until it shows no changes.
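If you want to script that loop, terraform plan's -detailed-exitcode flag distinguishes a clean plan from one with pending changes:

```shell
# Exits 0 when the plan is clean, 2 while changes are still pending, 1 on error
terraform plan -detailed-exitcode
```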

Before large migrations, back up state first:

terraform state pull > state-backup-$(date +%F).tfstate
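If an import goes sideways, that snapshot can be pushed back (the filename below is a hypothetical example of what the backup command produces; `-force` may be needed if the state serial has since moved on):

```shell
# Restore a previously pulled state snapshot
terraform state push state-backup-2024-01-15.tfstate
```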

Inspect state

List what’s in the state:

terraform state list | grep github_repository

Show a specific resource:

terraform state show module.my-app.github_repository.repo

Warning: Don’t mess with state unless you know what you’re doing. Always back up first.

Terraform import vs Terraformer

For importing a handful of resources, terraform import is fine. But if you’re migrating dozens or hundreds of existing resources, look at Terraformer instead. It can automatically generate both the Terraform configuration files and the corresponding state for entire GCP projects, AWS accounts, or Kubernetes clusters.

I’ve only used Terraformer once for a complete GCP project migration, and whilst it saved hours of manual work, you still need to review and clean up the generated code. It creates a file per resource which can be messy, but it’s better than writing everything by hand.
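For reference, a Terraformer run against a GCP project looks roughly like this (the resource list here is illustrative; `terraformer import google list` shows what your version supports):

```shell
# Generate HCL and state for selected resource types in one project
terraformer import google \
    --resources=gcs,iam \
    --projects=my-project \
    --regions=global
```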

For my use case (a few specific resources), manual import was cleaner and let me organise the code exactly how I wanted it.

Why import instead of recreating?

You might wonder why not just delete these resources and recreate them in Terraform. For resources like GitHub repos or GCP IAM bindings, recreating means:

  • Losing commit history and issues (GitHub repos)
  • Breaking existing service account credentials (IAM)
  • Requiring DNS updates or service disruption

Importing lets you adopt existing infrastructure without downtime or data loss.