parameterized Block

Placement: job -> parameterized

A parameterized job is used to encapsulate a set of work that can be carried out on various inputs much like a function definition. When the parameterized block is added to a job, the job acts as a function to the cluster as a whole.

The parameterized block allows job operators to configure a job that carries out a particular action, define its resource requirements, and control how the tasks within the job retrieve their inputs and configuration.

To invoke a parameterized job, use nomad job dispatch or the equivalent HTTP API. When dispatching against a parameterized job, an opaque payload and metadata may be injected into the job. These inputs act like arguments to a function: the job consumes them to change its behavior without exposing the implementation details to the caller.

To that end, tasks within the job can add a dispatch_payload block that defines where on the filesystem this payload is written. An example payload would be a task's JSON configuration.

Further, certain metadata may be marked as required when dispatching a job so that it can be injected directly into a task's arguments using interpolation. An example of this would be to require a run ID key that could be used to look up the work the job is supposed to do from a management service or database.

Each time a job is dispatched, a unique job ID is generated. This allows a caller to track the status of the job, much like a future or promise in some programming languages. The dispatched job cannot be updated after dispatching; to update the job definition you need to update the parent job.
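As a sketch of what a dispatch looks like over the HTTP API, the request is a PUT to /v1/job/&lt;job_id&gt;/dispatch with the opaque payload base64-encoded in the body alongside a Meta map. The helper below only builds the request path and body; the job ID, payload contents, and email address are made-up illustration values:

```python
import base64
import json

MAX_PAYLOAD_BYTES = 16 * 1024  # dispatch payloads are capped at 16 KiB


def build_dispatch_request(job_id: str, payload: bytes, meta: dict) -> tuple:
    """Build the path and JSON body for Nomad's job dispatch endpoint."""
    if len(payload) > MAX_PAYLOAD_BYTES:
        raise ValueError("payload exceeds the 16 KiB dispatch limit")
    body = {
        # Nomad expects the payload base64-encoded in the JSON body.
        "Payload": base64.b64encode(payload).decode("ascii"),
        "Meta": meta,
    }
    return f"/v1/job/{job_id}/dispatch", json.dumps(body)


path, body = build_dispatch_request(
    "video-encode",
    b'{"input": "video.mp4"}',
    {"dispatcher_email": "ops@example.com"},  # made-up value
)
```

Sending this request (or running nomad job dispatch) returns the unique ID of the dispatched child job, which the caller can then poll for status.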

job "docs" {
  parameterized {
    payload       = "required"
    meta_required = ["dispatcher_email"]
    meta_optional = ["pager_email"]
  }
}

See the multiregion documentation for additional considerations when dispatching parameterized jobs.

parameterized Requirements

  • The job's scheduler type must be batch or sysbatch.

parameterized Parameters

  • meta_optional (array<string>: nil) - Specifies the set of metadata keys that may be provided when dispatching against the job.

  • meta_required (array<string>: nil) - Specifies the set of metadata keys that must be provided when dispatching against the job.

  • payload (string: "optional") - Specifies the requirement of providing a payload when dispatching against the parameterized job. The maximum size of a payload is 16 KiB. The options for this field are:

    • "optional" - A payload is optional when dispatching against the job.

    • "required" - A payload must be provided when dispatching against the job.

    • "forbidden" - A payload is forbidden when dispatching against the job.

parameterized Examples

The following non-runnable examples illustrate parameterized jobs:

Required Inputs

This example shows a parameterized job that requires both a payload and metadata:

job "video-encode" {
  # ...

  type = "batch"

  parameterized {
    payload       = "required"
    meta_required = ["dispatcher_email"]
  }

  group "encode" {
    # ...

    task "ffmpeg" {
      driver = "exec"

      config {
        command = "ffmpeg-wrapper"

        # When dispatched, the payload is written to a file that is then read by
        # the created task upon startup
        args = ["-config=${NOMAD_TASK_DIR}/config.json"]
      }

      dispatch_payload {
        file = "config.json"
      }
    }
  }
}

Metadata Interpolation

job "email-blast" {
  # ...

  type = "batch"

  parameterized {
    payload       = "forbidden"
    meta_required = ["CAMPAIGN_ID"]
  }

  group "emails" {
    # ...

    task "emailer" {
      driver = "exec"

      config {
        command = "emailer"

        # The campaign ID is interpolated and injected into the task's
        # arguments
        args = ["-campaign=${NOMAD_META_CAMPAIGN_ID}"]
      }
    }
  }
}
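Required metadata also reaches the task as NOMAD_META_&lt;key&gt; environment variables, so instead of interpolating the value into its arguments, the emailer process could read the campaign ID itself. A minimal sketch (the campaign ID value is made up, and the environment is simulated here rather than populated by Nomad):

```python
import os


def campaign_id() -> str:
    """Read the dispatched CAMPAIGN_ID from the environment Nomad populates."""
    try:
        return os.environ["NOMAD_META_CAMPAIGN_ID"]
    except KeyError:
        raise RuntimeError("CAMPAIGN_ID was not supplied at dispatch time")


# Simulate the environment a dispatched task would see:
os.environ["NOMAD_META_CAMPAIGN_ID"] = "summer-sale"  # made-up value
print(campaign_id())  # prints "summer-sale"
```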
