Vault
Benchmark Vault performance
Operating Vault efficiently in support of your use cases requires that you can accurately measure its performance. Ideally, you benchmark in environments that resemble your production use cases to produce realistic results.
Challenge
You need to measure Vault performance in a meaningful way, and in an environment that also resembles that of your intended use case with respect to compute resources.
The Vault server under test should have the same auth methods and secrets engines enabled, along with example secrets, leases, and token data present to accurately simulate your use cases.
Solution
HashiCorp provides the open source utility vault-benchmark to help you measure Vault performance at a granular level using several of the available auth methods and secrets engines.
You can use vault-benchmark as a command line interface, as a Docker image, or as a Kubernetes workload to match the infrastructure you're using for Vault.
Personas
The end-to-end scenario and lab sandbox described in this tutorial involves one persona.
The persona is a Vault operator with privileged permissions to enable and disable auth methods and secrets engines. You perform all the tasks in the hands-on scenario as this persona.
Prerequisites
You need the following resources to complete the hands-on scenario based on whether you'll use the Vault community edition with CLI, Docker, Kubernetes, or HCP Vault Dedicated.
To complete the lab sandbox using CLI versions of Vault and vault-benchmark, you need the following:
Vault binary installed and in your system PATH.
vault-benchmark binary installed and in your system PATH.
Versions used for this tutorial
This tutorial was last tested 05 Mar 2026 on macOS using the following software versions.
$ sw_vers --productVersion
26.3
$ vault version
Vault v1.21.0 (818ca8b3575ea937ca48b640baf35e1b2ede1833), built 2025-10-21T19:33:18Z
$ vault-benchmark version
vault-benchmark v0.3.0
$ docker version --format '{{.Server.Version}}'
29.2.1
$ curl --version | head -n 1 | awk '{print $2}'
8.7.1
$ jq --version
jq-1.8.1
$ helm version --short
v4.1.1+g5caf004
$ minikube version
minikube version: v1.38.1
commit: c93a4cb9311efc66b90d33ea03f75f2c4120e9b0
Lab setup
The setup for this tutorial's lab sandbox is different depending on whether you want to experiment with vault-benchmark in the CLI with a dev mode server, in the CLI with a Vault Dedicated server, in Docker, or in Kubernetes.
Create local lab sandbox home
You can create a temporary directory to hold all the content needed for this lab sandbox and then assign its path to an environment variable for later reference.
Open a terminal, and create the directory /tmp/learn-vault-benchmark.

$ mkdir /tmp/learn-vault-benchmark

Export the lab sandbox directory path as the value of the HC_LEARN_LAB environment variable.

$ export HC_LEARN_LAB=/tmp/learn-vault-benchmark
Now choose the lab setup workflow that matches the environment you want to use for this lab sandbox.
Run a Vault dev mode server as a background process from your terminal session to follow the self-hosted Vault workflow in this lab sandbox.
Open a terminal and start a Vault dev server with root as the initial root token value.

$ vault server \
    -dev \
    -dev-root-token-id root \
    -dev-tls \
    > "$HC_LEARN_LAB"/vault-server.log 2>&1 &

The Vault dev server defaults to running at 127.0.0.1:8200. The server logs to the file vault-server.log in the lab sandbox working directory, and gets automatically initialized and unsealed.

Tail the server log to learn about important environment variables you need to export before you can use Vault.

$ tail -n 15 "$HC_LEARN_LAB"/vault-server.log | head -n 5

Example output:

You may need to set the following environment variables:

    $ export VAULT_ADDR='https://127.0.0.1:8200'
    $ export VAULT_CACERT='/var/folders/4_/_10tk5gn3rd80q2q_nqj_7280000gn/T/vault-tls2551239917/vault-ca.pem'

Export the environment variable for the vault CLI to address the Vault server.

$ export VAULT_ADDR='https://127.0.0.1:8200'

Export the environment variable for the client to use the Vault server CA certificate.

$ export VAULT_CACERT='/var/folders/4_/_10tk5gn3rd80q2q_nqj_7280000gn/T/vault-tls2551239917/vault-ca.pem'

Export an environment variable for the vault CLI to authenticate with the Vault server.

$ export VAULT_TOKEN=root

Check Vault status.

$ vault status

Example output:

Key             Value
---             -----
Seal Type       shamir
Initialized     true
Sealed          false
Total Shares    1
Threshold       1
Version         1.21.0
Build Date      2025-10-21T19:33:18Z
Storage Type    inmem
Cluster Name    vault-cluster-4f99e7e8
Cluster ID      898011b6-f959-64a9-53fe-31965fa32f91
HA Enabled      false

The Vault server is ready for you to proceed with the lab sandbox.
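As an optional cross-check, you can query Vault's sys/health API endpoint directly. This is a sketch, not part of the official lab steps; it relies on the VAULT_ADDR and VAULT_CACERT values exported above, plus the curl and jq tools from the prerequisites.

```shell
# Confirm the dev server reports itself initialized and unsealed by
# querying the sys/health endpoint and selecting the two relevant fields.
curl --silent --cacert "$VAULT_CACERT" "$VAULT_ADDR/v1/sys/health" \
  | jq '{initialized, sealed}'
```

For a healthy dev server, initialized should be true and sealed should be false, matching the vault status output above.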
Now that you've established a working lab, you're ready to explore vault-benchmark and its configuration.
Explore vault-benchmark
Your goal for this section is to explore vault-benchmark using a terminal session. Resources for diving deeper are provided at the section's conclusion.
Check your vault-benchmark version.
$ vault-benchmark version

Example output:

vault-benchmark v0.3.0

You can get help to discover available commands.

$ vault-benchmark --help

Example output:

Usage: vault-benchmark <command> [args]

Command list:
    run       Run vault-benchmark test(s)
    review    Review previous test results

This lab sandbox focuses on your use of the run command. Get help for the run command.

$ vault-benchmark run --help

Usage: vault-benchmark run [options]

  This command will run a vault-benchmark test.

  Run a vault-benchmark test with a configuration file:

      $ vault-benchmark run -config=/etc/vault-benchmark/test.hcl

  For a full list of examples, please see the documentation.

Command Options:

  -annotate=<string>
      Comma-separated name=value pairs include in bench_running prometheus
      metric. Try name 'testname' for dashboard example.

  -audit_path=<string>
      Path to file for audit log.

  -ca_pem_file=<string>
      Path to PEM encoded CA file to verify external Vault. This can also be
      specified via the VAULT_CACERT environment variable.

  -cleanup
      Cleanup benchmark artifacts after run. The default is false.

  -cluster_json=<string>
      Path to cluster.json file

  -config=<string>
      Path to a vault-benchmark test configuration file.

  -debug
      Run vault-benchmark in Debug mode. The default is false.

  -disable_http2
      Force HTTP/1.1 The default is false.

  -duration=<duration>
      Test Duration. The default is 10s.

  -log_level=<string>
      Level to emit logs. Options are: INFO, WARN, DEBUG, TRACE. The default
      is INFO. This can also be specified via the VAULT_BENCHMARK_LOG_LEVEL
      environment variable.

  -pprof_interval=<duration>
      Collection interval for vault debug pprof profiling.

  -random_mounts
      Use random mount names. The default is true.

  -report_mode=<string>
      Reporting Mode. Options are: terse, verbose, json. The default is terse.

  -rps=<int>
      Requests per second. Setting to 0 means as fast as possible.

  -vault_addr=<string>
      Target Vault API Address. The default is http://127.0.0.1:8200. This can
      also be specified via the VAULT_ADDR environment variable.

  -vault_namespace=<string>
      Vault Namespace to create test mounts. This can also be specified via
      the VAULT_NAMESPACE environment variable.

  -vault_token=<string>
      Vault Token to be used for test setup. This can also be specified via
      the VAULT_TOKEN environment variable.

  -workers=<int>
      Number of workers The default is 10.
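The run command accepts flags for many of the same settings a configuration file provides. The following is a sketch of one possible invocation, not an official lab step: it uses only flags shown in the help output above, and the config path assumes the lab layout used later in this tutorial.

```shell
# Run a shorter benchmark, cap requests per second, and emit JSON output,
# using flags documented in `vault-benchmark run --help`.
vault-benchmark run \
  -config="$HC_LEARN_LAB"/vault-benchmark-config.hcl \
  -duration=10s \
  -rps=500 \
  -report_mode=json
```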
Configure benchmark
Your goal for this section is to configure vault-benchmark for a basic benchmark run.
Vault Benchmark is configured with a HashiCorp Configuration Language (HCL) file. You can use the example configuration shown in the Usage documentation in this lab sandbox.
Here is the entire example configuration file with detailed explanation of each configuration entry.
vault-benchmark-config.hcl
vault_addr = "https://127.0.0.1:8200"
vault_token = "root"
vault_namespace="root"
duration = "30s"
cleanup = true

test "approle_auth" "approle_logins" {
  weight = 50
  config {
    role {
      role_name = "benchmark-role"
      token_ttl="2m"
    }
  }
}

test "kvv2_write" "static_secret_writes" {
  weight = 50
  config {
    numkvs = 100
    kvsize = 100
  }
}
Lines 1-5 are global parameters:
vault_addr: the full URL plus port for the Vault server to benchmark.
vault_token: the literal token value for a token with capabilities to enable and manage secrets engines and auth methods. This lab sandbox uses the initial root token value, but you should not use a root token this way in production.
vault_namespace: the name of the Enterprise namespace to use for the benchmark.
duration: how long to run the benchmark (here, 30 seconds).
cleanup: whether to remove all resources created by the benchmark run.
Lines 7-15 and 17-23 define the two tests that make up this benchmark configuration: an approle_auth test and a kvv2_write test, with the percentage of requests split evenly between the two.
The first test, beginning at line 7, is for AppRole auth method logins. Note that the first parameter is weight. You can think of this as a percentage of the entire workload: the benchmark performs AppRole logins 50% of the time throughout the entire run.
You can configure a test in its config stanza. In this test, role_name specifies the auth method role name benchmark-role. Each token Vault issues after an AppRole login with the benchmark-role is configured with a token_ttl value of 2 minutes to specify the token's time-to-live (TTL). You can learn more about available parameters in the AppRole auth method benchmark documentation.
The second test, beginning at line 17, is for Key/Value version 2 secrets engine writes. Its weight setting takes the remaining 50% of benchmark operations. It also uses numkvs to specify that 100 key/value secrets be written, and kvsize to limit each value to 100 bytes. Available parameters are documented in the KV v1 and KV v2 secrets engine benchmark documentation.
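Because the weights describe how the workload is split, it can be worth confirming they total 100 before a run. The following is a minimal sketch; it assumes the config file has been written to the lab directory as shown later in this section, and that weight lines match the format shown above.

```shell
# Sum every `weight = N` line in the benchmark config; the two tests
# defined here should total 100.
awk -F'=' '/^[[:space:]]*weight/ { sum += $2 } END { print sum }' \
  "$HC_LEARN_LAB"/vault-benchmark-config.hcl
```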
You can learn more about all the tests in the vault-benchmark test documentation.
Now that you've reviewed the example configuration, you need to change it a bit to function with the Vault cluster you're using.
In the lab setup section, you set the VAULT_ADDR and VAULT_TOKEN (and VAULT_NAMESPACE for Vault Dedicated) environment variables for communicating to your Vault cluster.
Values from environment variables override anything specified in the vault-benchmark configuration file.
With this in mind, you can set the vault_addr, vault_token, and vault_namespace values to an empty string. Their values come from the environment variables you set.
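Before running, you can sanity-check that the environment variables vault-benchmark falls back on are actually set. This is a small POSIX shell sketch, not an official lab step; add VAULT_NAMESPACE to the list if you're targeting Vault Dedicated.

```shell
# Warn about any unset environment variable that vault-benchmark would
# otherwise read in place of the empty config-file values.
for v in VAULT_ADDR VAULT_TOKEN; do
  eval "val=\${$v}"    # portable indirect expansion
  if [ -z "$val" ]; then
    echo "warning: $v is not set" >&2
  fi
done
```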
These steps apply equally to self-hosted Vault, Docker, and Vault Dedicated.
Return to your terminal session and write the configuration file to /tmp/learn-vault-benchmark/vault-benchmark-config.hcl.

$ cat > "$HC_LEARN_LAB"/vault-benchmark-config.hcl << EOF
vault_addr = ""
vault_token = ""
vault_namespace=""
duration = "30s"
cleanup = true

test "approle_auth" "approle_logins" {
  weight = 50
  config {
    role {
      role_name = "benchmark-role"
      token_ttl="2m"
    }
  }
}

test "kvv2_write" "static_secret_writes" {
  weight = 50
  config {
    numkvs = 100
    kvsize = 100
  }
}
EOF
You've configured vault-benchmark with two tests, and you're ready to run them.
Run benchmark
Your goal for this section is to run vault-benchmark with the configuration that you just created.
Run the benchmark.
$ vault-benchmark run \
-config="$HC_LEARN_LAB"/vault-benchmark-config.hcl
Example output:
2026-03-05T12:04:17.912-0500 [INFO] vault-benchmark: setting up targets
2026-03-05T12:04:20.042-0500 [INFO] vault-benchmark: starting benchmarks: duration=30s
2026-03-05T12:04:50.044-0500 [INFO] vault-benchmark: cleaning up targets
2026-03-05T12:05:05.590-0500 [INFO] vault-benchmark: benchmark complete
Target: https://127.0.0.1:8200
op count rate throughput mean 95th% 99th% successRatio
approle_logins 210721 7023.818619 7023.695679 720.46µs 1.639838ms 2.775216ms 100.00%
static_secret_writes 209678 6989.393786 6989.032447 685.226µs 1.55472ms 2.704063ms 100.00%
Review the output
The benchmark output is tabular by default. This example is from the self-hosted dev mode server, but the output will be similar for Vault in any environment.
2026-03-05T12:05:42.312-0500 [INFO] vault-benchmark: setting up targets
2026-03-05T12:05:44.376-0500 [INFO] vault-benchmark: starting benchmarks: duration=30s
2026-03-05T12:06:14.377-0500 [INFO] vault-benchmark: cleaning up targets
2026-03-05T12:06:29.601-0500 [INFO] vault-benchmark: benchmark complete
Target: https://127.0.0.1:8200
op count rate throughput mean 95th% 99th% successRatio
approle_logins 205869 6862.296969 6862.225774 734.673µs 1.716825ms 2.77376ms 100.00%
static_secret_writes 206253 6875.167739 6875.069110 698.976µs 1.623511ms 2.740157ms 100.00%
The first 4 lines are log outputs from vault-benchmark which describe its current actions.
The 5th line shows the benchmark target URL, and the 6th line represents headings for the metric data.
The last two lines show the metric data, one row for each of the two benchmark tests you configured:
op: the test name.
count: the number of tests completed during the benchmark duration.
rate: the true operations per second rate of all tests of this type.
throughput: the operations per second rate of all successful tests of this type.
mean: the mean time per operation.
95th%: the 95th percentile time per operation.
99th%: the 99th percentile time per operation.
successRatio: the percentage of successful tests of this type. When tests are not successful, you should consult the Vault operational and audit logs for details on the unsuccessful tests.
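If you save a run's output to a file, you can pull out just the columns you care about. The following is a sketch that assumes a hypothetical results.txt capture of the report shown above; successRatio is the final column of each test row.

```shell
# Print each test's name and its success ratio from a saved report.
awk '/^(approle_logins|static_secret_writes)/ { print $1, $NF }' results.txt
```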
Cleanup
Follow the steps for the environment you used in the lab sandbox to clean up.
Stop the Vault dev mode server.
$ pkill vault

Remove the lab sandbox directory.

$ rm -rf "$HC_LEARN_LAB"

Unset environment variables.

$ unset VAULT_ADDR VAULT_CACERT VAULT_TOKEN

Remove cached Vault token.

$ rm -f ~/.vault-token
Next steps
You learned the basics around the Vault Benchmark tool, including how to configure and run a benchmark. You also learned about the default benchmark output, and available documentation resources for Vault Benchmark.
To dive deeper into Vault performance, consider reviewing the HashiCorp Well-Architected Framework guidance on identifying metrics, along with the performance tuning and production hardening documentation.


