Deploy the Terraform MCP server
The Terraform Model Context Protocol (MCP) server enables AI models to generate Terraform configuration using up-to-date information from the Terraform Registry. This page explains how to install, configure, and integrate the MCP server with your AI client.
Note: This feature is currently in beta. Do not use beta functionality in production environments.
Overview
The Terraform MCP server is a specialized service that provides AI models with access to current Terraform provider documentation and module information. You can deploy the server to the following environments:
- Local deployment: Run the server on your workstation using `stdio` mode for direct communication through standard input/output.
- Remote deployment: Run the server on a remote instance using `streamable-http` mode for network-based communication.
Installation methods
Choose from three installation options based on your environment and preferences:
| Method | Best for | Requirements |
|---|---|---|
| Docker | Most users, consistent environments | Docker Engine v20.10.21+ or Docker Desktop v4.14.0+. Refer to the Docker documentation for installation instructions. |
| Compiled binary | Lightweight deployments, specific OS needs | Compatible operating system |
| Source installation | Development, customization | Go development environment |
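If you choose the Docker method, you can confirm that your Docker Engine version meets the minimum requirement before you continue:

```shell-session
$ docker --version
```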
Run in Docker
Docker provides the most reliable and consistent way to run the Terraform MCP server across different environments.
- Start Docker on your system.
- Integrate with your AI client:
  - Verify Visual Studio Code is installed.
  - Verify the GitHub Copilot extension is installed and chats are configured to `Agent` mode.
  - Verify MCP support is enabled. Refer to the VS Code MCP documentation for more information.
To use the MCP server in all workspaces, add the following configuration to your user settings JSON file:
{ "mcp": { "servers": { "terraform": { "command": "docker", "args": [ "run", "-i", "--rm", "hashicorp/terraform-mcp-server" ] } } } }
Alternatively, to use the server in a specific workspace, create an `mcp.json` file with the following configuration in your workspace's `.vscode` directory:

```json
{
  "servers": {
    "terraform": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "hashicorp/terraform-mcp-server"]
    }
  }
}
```
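Optionally, you can pull the image ahead of time so the first chat request does not wait on the download. In the configurations above, the `-i` flag keeps stdin open so your client can exchange MCP messages with the container over standard input/output, and `--rm` removes the container when the session ends.

```shell-session
$ docker pull hashicorp/terraform-mcp-server
```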
Verify the integration by opening the chat interface and selecting Agent from the mode settings.
Click the tools icon to verify that Terraform MCP server tools appear in the available tools list.
Run the compiled binary
The compiled binary option provides a lightweight installation without Docker dependencies. This method is ideal when you want to minimize resource usage or work in environments with restricted container access.
To download the binary for your operating system and architecture, visit the release library.
Add the following configuration to your client settings. Replace `/path/to/terraform-mcp-server` with the actual path to your downloaded binary.

```json
{
  "mcp": {
    "servers": {
      "terraform": {
        "command": "/path/to/terraform-mcp-server",
        "args": ["stdio"]
      }
    }
  }
}
```
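Before wiring the binary into a client, you can smoke-test it from a shell. MCP servers in `stdio` mode exchange JSON-RPC 2.0 messages over standard input/output, so piping a minimal `initialize` request to the binary should produce a JSON response. The client name and protocol version shown here are illustrative values:

```shell-session
$ echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' \
  | /path/to/terraform-mcp-server stdio
```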
Install from source
Installing from source gives you access to the latest features and allows for customization. This method requires a Go development environment.
Install the latest stable release.
```shell-session
$ go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@latest
```
Alternatively, you can install the development version on `main`:

```shell-session
$ go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@main
```
After installation, add the following configuration to your client.
{ "mcp": { "servers": { "terraform": { "command": "/path/to/terraform-mcp-server", "args": ["stdio"] } } } }
Replace `/path/to/terraform-mcp-server` with the actual path to the installed binary. The binary location depends on your Go installation and `GOPATH` configuration. Use `which terraform-mcp-server` to find the installed binary path.
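Alternatively, because `go install` places binaries in `$GOBIN`, or in `$GOPATH/bin` when `GOBIN` is unset, you can check the default location directly:

```shell-session
$ ls "$(go env GOPATH)/bin/terraform-mcp-server"
```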
Start the server
To start the server, use the `terraform-mcp-server` CLI and specify the transport protocol you want to use. Refer to the transport protocols reference for more information.
Start the server in `stdio` mode:

```shell-session
$ terraform-mcp-server stdio [--log-file /path/to/log]
```
Run the following command on the local instance to start the server in `streamable-http` mode:

```shell-session
$ terraform-mcp-server streamable-http \
    [--transport-port 8080] \
    [--transport-host 127.0.0.1] \
    [--mcp-endpoint /mcp] \
    [--log-file /path/to/log]
```
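With the defaults shown above, you can check that the endpoint is reachable from the instance. This sketch assumes the server follows the standard MCP streamable HTTP transport, in which clients POST JSON-RPC 2.0 messages and accept either JSON or server-sent events in response; the client name and protocol version are illustrative values:

```shell-session
$ curl -s http://127.0.0.1:8080/mcp \
    -H "Content-Type: application/json" \
    -H "Accept: application/json, text/event-stream" \
    -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
```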
Instead of setting values manually, you can also use the supported environment variables. Refer to the environment variables reference for details.
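For example, a startup script might configure the transport through the environment rather than flags. The variable names below are illustrative placeholders only; use the names listed in the environment variables reference:

```shell-session
# Illustrative variable names; consult the environment variables
# reference for the names the server actually reads.
$ export TRANSPORT_HOST=127.0.0.1
$ export TRANSPORT_PORT=8080
$ terraform-mcp-server streamable-http
```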
Next steps
- Begin prompting your AI model about Terraform configurations. Refer to Prompt an AI model for guidance on effective prompting techniques.
- The server provides access to up-to-date provider documentation.
- Ask for help with specific Terraform resources and modules.
- Explore advanced configuration options for your specific deployment needs.