Prompt a model connected to the Terraform MCP server
This topic provides example prompts for helping you write Terraform configuration after installing the Terraform model context protocol (MCP) server. Refer to Deploy the Terraform MCP server for instructions on how to deploy the server.
Note
This feature is currently in beta. Do not use beta functionality in production environments.
Overview
To trigger the MCP server tools, ask your LLM questions about your provider configuration or general questions about Terraform providers and modules. Note that for your LLM to use the server, your questions must pertain to provider configuration from the registry. The LLM uses other sources, such as content from the internet, to respond to general questions about Terraform configuration or requests to generate code.
Requirements
Verify that you have deployed the Terraform MCP server and configured your client. Refer to Deploy the Terraform MCP server for instructions.
Prompt your MCP client
Your model may still decide to use other resources to formulate responses, but you can use the following guidelines to help you create prompts that are more likely to be routed through the server:
- Include the names of specific providers or provider modules. For example, use `google_compute_disk` in your prompt instead of "the compute disk".
- Use the full Terraform resource address in your query. Refer to Resource Address Reference in the Terraform CLI documentation for additional information about resource addresses.
- Ask for the latest information about the provider or module.
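To illustrate the second guideline, the sketch below shows the `TYPE.NAME` form of a resource address that a prompt can reference. The address `google_compute_disk.boot` and its argument values are hypothetical placeholders:

```hcl
# A resource address has the form TYPE.NAME, optionally prefixed by a
# module path. A prompt can then reference the address directly, e.g.:
#   "What are the latest arguments for google_compute_disk.boot?"
resource "google_compute_disk" "boot" {
  name = "boot-disk"     # placeholder value
  zone = "us-central1-a" # placeholder value
}
```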
Examples
Use the following prompts as a guide when writing your own configuration. The example responses were generated with GPT-4.1 in GitHub Copilot on VS Code; your responses may vary.
Get AI-related resources from the Google provider
In the following example, the server obtains a list of Google provider document IDs so that you can decide which resources to create:
Prompt
I need help understanding what resources are available
in the Google provider that are for AI
Request
The server sends the following request payload to the registry:
```json
{
  "providerDataType": "resources",
  "providerName": "google",
  "providerNamespace": "hashicorp",
  "providerVersion": "latest",
  "serviceSlug": "ai"
}
```
Response
The Google Terraform provider offers several resources related to AI, primarily under the Vertex AI and Document AI services. Here are the main AI-related resources available:
**Vertex AI Resources**
- `vertex_ai_dataset`
- `vertex_ai_deployment_resource_pool`
- `vertex_ai_endpoint`
- `vertex_ai_endpoint_iam`
- `vertex_ai_feature_group`
- `vertex_ai_featurestore`
- `vertex_ai_index`
- `vertex_ai_metadata_store`
- `vertex_ai_tensorboard`

**Document AI Resources**
- `document_ai_processor`
- `document_ai_warehouse_document_schema`
These resources allow you to manage datasets, endpoints, feature stores, indexes, metadata, and more for AI/ML workflows on Google Cloud, as well as Document AI processors and schemas.
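After a discovery prompt like this, a natural follow-up is to ask the model to generate configuration for one of the listed resources. The sketch below shows what that might look like for a Vertex AI dataset. The resource type in the google provider is `google_vertex_ai_dataset`, and the argument values are placeholder assumptions to verify against the provider documentation:

```hcl
resource "google_vertex_ai_dataset" "example" {
  # display_name and metadata_schema_uri are the core arguments;
  # the values below are placeholders, not tested configuration
  display_name        = "example-dataset"
  metadata_schema_uri = "gs://google-cloud-aiplatform/schema/dataset/metadata/image_1.0.0.yaml"
  region              = "us-central1"
}
```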
Get information about storage buckets in Azure
The following prompt first retrieves a list of provider documentation IDs, then fetches the appropriate documentation for the use case:
I need help setting up storage buckets in the azure provider
Request
The server sends the following request payloads to the registry:
```json
{
  "providerDataType": "resources",
  "providerName": "azurerm",
  "providerNamespace": "hashicorp",
  "providerVersion": "latest",
  "serviceSlug": "storage_account"
}
```

```json
{
  "providerDocID": "8957547"
}
```

```json
{
  "providerDocID": "8957555"
}
```
Response
The server retrieves several responses from each tool, including example configurations from the registry. The following response is one of the examples:
Available Documentation (top matches) for resources in Terraform provider hashicorp/azurerm version: 4.29.0
# azurerm_storage_account
Manages an Azure Storage Account.
## Example Usage
```hcl
resource "azurerm_resource_group" "example" {
  name     = "example-resources"
  location = "West Europe"
}

resource "azurerm_storage_account" "example" {
  name                     = "storageaccountname"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "GRS"

  tags = {
    environment = "staging"
  }
}
```
To set up storage buckets in Azure using the Terraform azurerm provider, use the `azurerm_storage_account` resource. Here's a quick guide and example configuration:
1. Terraform Resource for Azure Storage Account
```hcl
resource "azurerm_resource_group" "example" {
  name     = "example-resources"
  location = "West Europe"
}

resource "azurerm_storage_account" "example" {
  name                     = "storageaccountname" # must be globally unique, lowercase
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard" # or "Premium"
  account_replication_type = "GRS"      # or "LRS", "RAGRS", "ZRS", etc.

  tags = {
    environment = "staging"
  }
}
```
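Note that Azure has no resource named "bucket"; the closest analog to an S3 bucket is a blob container inside a storage account. If a container is the goal, you can follow up with a prompt for `azurerm_storage_container`. A minimal sketch, assuming the container attaches to the storage account defined above (newer azurerm versions may prefer `storage_account_id` over `storage_account_name`; verify against the provider documentation):

```hcl
resource "azurerm_storage_container" "example" {
  name                  = "content"
  storage_account_name  = azurerm_storage_account.example.name
  container_access_type = "private" # no anonymous access
}
```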
Get help creating an IBM VM
The following prompt gets a list of modules:
Is there a module to help with setting up an IBM VM?
Request
The server sends the following request payload:
```json
{
  "moduleQuery": "ibm vm"
}
```
Response
The client prints the following response using the context acquired from the MCP server:
Yes, there is a Terraform module to help with setting up an IBM VM:
- IBM VM Module
  - ID: intel/ibm-vm/intel/1.2.2
  - Description: Intel Cloud Optimization Module - IBM VM
  - Module Version: 1.2.2
  - Namespace: intel
  - Source: GitHub - intel/terraform-intel-ibm-vm
You can use this module to set up and manage IBM VMs with Terraform.
If you need more details or usage examples, let me know.
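To consume the module that the response identifies, reference its registry source in a `module` block. The source and version below come from the response above; input variables are intentionally omitted because the module's actual interface is documented on its Terraform Registry page:

```hcl
module "ibm_vm" {
  source  = "intel/ibm-vm/intel"
  version = "1.2.2"

  # Set the module's input variables here. They are not listed in the
  # response above, so consult the module's registry documentation
  # (or ask the MCP-connected model) for the required inputs.
}
```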