Job B needs the container image tag that Job A built. You set a variable in Job A. Job B cannot see it. You search the documentation, find three different reference syntaxes, and none of them have a worked example that matches your pipeline structure.
Output variables are the primary mechanism for passing data between jobs and stages in Azure DevOps YAML pipelines, but the reference path syntax is verbose and unforgiving. The wrong step name, a missing dependsOn, or a dependencies vs. stageDependencies mixup produces an empty string with no error. For a multi-stage release pipeline, a broken output variable reference means a deployment targeting the wrong artifact or a condition that silently never fires.
This article covers:
- The exact task.setvariable syntax that produces an output variable, and why the step name: field is mandatory.
- The three reference path syntaxes: same-stage cross-job, cross-stage, and matrix leg.
- The two consumption patterns: variable mapping vs. direct condition reference.
- The fan-out pattern for multiple downstream jobs reading from one producer.
- A decision framework for when output variables are the wrong tool and artifacts are the right one.
Producing an Output Variable
task.setvariable with isOutput=true
An output variable is produced by writing a logging command to standard output during a pipeline step. The Azure DevOps agent intercepts lines matching the ##vso[...] pattern and acts on them.
The flag that matters is isOutput=true. Without it, the variable is scoped to the current job and invisible to any dependencies or stageDependencies object in downstream jobs. With it, the variable is promoted to the job’s output variable namespace, where downstream jobs can read it during their initialization phase.
# Bash: without isOutput=true — variable is job-scoped, invisible to downstream jobs
echo "##vso[task.setvariable variable=localVar]local-value"
# Bash: with isOutput=true — promoted to output variable, readable via dependencies
SHORT_SHA=$(git rev-parse --short HEAD)
IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
# PowerShell: same distinction applies
Write-Host "##vso[task.setvariable variable=localVar]local-value"
# PowerShell: promoted output variable
$shortSha = (git rev-parse --short HEAD)
$imageTag = "$($env:BUILD_BUILDID)-$shortSha"
Write-Host "##vso[task.setvariable variable=imageTag;isOutput=true]$imageTag"
The value is always stored and returned as a string. Numeric or boolean values must be serialized by the producer and deserialized by the consumer — there is no type system for output variables.
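For instance, a producer can serialize a count and a pass/fail flag as plain strings. A minimal bash sketch, where the step name checkTests and the variable names are illustrative:

```shell
# Producer step (name: checkTests): serialize non-string values as strings
TEST_COUNT=42
TESTS_PASSED=true   # shell has no boolean type; "true" is already a string
echo "##vso[task.setvariable variable=testCount;isOutput=true]$TEST_COUNT"
echo "##vso[task.setvariable variable=testsPassed;isOutput=true]$TESTS_PASSED"
```

The consumer deserializes by string comparison, for example eq(dependencies.JobA.outputs['checkTests.testsPassed'], 'true'), because the stored value is the string 'true', not a boolean.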
The task.setvariable command accepts three flags that can be combined:
| Flag | Effect | Notes |
|---|---|---|
| isOutput=true | Promotes variable to cross-job output namespace | Required for cross-job access |
| issecret=true | Masks value in log output | Automatically masks in all downstream jobs |
| isreadonly=true | Prevents downstream redefinition | Rarely needed; use with caution |
The maximum variable value length is 32,766 characters — the Windows environment variable size limit. Exceeding it triggers a ##[warning] in the logs and causes truncation. For values larger than a few KB, use pipeline artifacts instead.
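A producer can guard against silent truncation before emitting the variable. A minimal bash sketch, where MAX is an illustrative conservative threshold below the cap:

```shell
# Guard: refuse to emit values that would be truncated downstream.
# MAX is an assumption: a conservative threshold below the 32,766-character cap.
VALUE="some-computed-value"
MAX=32000
if [ "${#VALUE}" -gt "$MAX" ]; then
  echo "##vso[task.logissue type=error]Value is ${#VALUE} chars; too large for an output variable. Publish an artifact instead."
  exit 1
fi
echo "##vso[task.setvariable variable=bigValue;isOutput=true]$VALUE"
```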
The Step name: Requirement
The step that writes an output variable must have a name: field. The step name is a required component of every cross-job reference path:
dependencies.JobName.outputs['stepName.varName']
Without name:, the output variable is set within the job but the reference path has no valid step name. The expression evaluates to empty string — no error, no warning.
The name: field is distinct from displayName:. The name: value is the identifier used in reference paths. The displayName: value is the label shown in the pipeline UI. Only name: appears in the output variable reference.
jobs:
- job: BuildJob
pool:
vmImage: ubuntu-latest
steps:
- script: |
SHORT_SHA=$(git rev-parse --short HEAD)
IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
name: computeTag # This identifier appears in the reference path
displayName: "Compute container image tag" # UI label only — not part of any reference
The downstream reference uses computeTag (the name: value), not Compute container image tag:
dependencies.BuildJob.outputs['computeTag.imageTag']
Use camelCase for step names. The reference path is case-sensitive, and a casing mismatch on any component produces an empty string. Standardize on a convention and apply it consistently to avoid silent failures.
Output Variables and Secrets
When you combine isOutput=true and issecret=true, the value is added to the pipeline’s log-scrubbing engine. This masking carries over automatically to all downstream jobs in the same pipeline run. If a downstream job maps the variable and prints it, the value will be masked in the logs.
# Secret output variable — masked in logs automatically across the entire run
echo "##vso[task.setvariable variable=apiToken;isOutput=true;issecret=true]$API_TOKEN"
While log masking is automatic, secret variables are not automatically mapped to environment variables in scripts. You must still map them explicitly in the downstream job’s variables: block to use them:
# In the consuming job: map the secret to make it accessible
variables:
myToken: $[ dependencies.JobA.outputs['stepName.apiToken'] ]
Do not use output variables to propagate long-lived credentials. Use variable groups with Azure Key Vault integration or dedicated secret retrieval tasks for anything that should remain protected outside the context of a single run.
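For reference, consuming a Key Vault-backed variable group looks like the following sketch; the group name and secret name are assumptions for illustration:

```yaml
variables:
  - group: prod-keyvault-secrets   # variable group linked to Azure Key Vault (illustrative name)
steps:
  - script: |
      # Secret values are masked in logs but must be mapped into env explicitly
      echo "Calling API with token (value masked in logs)"
    env:
      API_TOKEN: $(apiToken)       # 'apiToken' is a hypothetical secret in the group
```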
Cross-Job Output Variable References (Same Stage)
The Reference Path Anatomy
The full reference path for a same-stage cross-job output variable is:
$[ dependencies.ProducerJobName.outputs['stepName.varName'] ]
Four components, all required:
- $[ ] — the runtime expression wrapper. This evaluates during “Initialize Job”, not at template expansion time. Using ${{ }} here is a compile-time expression that does not have access to runtime data — it produces the literal string instead of the variable value.
- dependencies — the same-stage dependency context object. This is not stageDependencies — that object is for cross-stage references only.
- ProducerJobName — the job: key of the producing job, exactly as written in the YAML. Not the displayName:.
- outputs['stepName.varName'] — the accessor combining the step name and variable name with a dot separator, in a single-quoted string.
All four components are case-sensitive. A casing mismatch on any one of them produces empty string with no diagnostic.
Diagram: Output Variable Propagation Flow
This diagram visualizes how the reference syntax changes depending on whether you are accessing a variable within the same stage or across different stages.
Visual Notes:
- Green Path (Same Stage): Uses the dependencies object. The path is shorter because the stage name is implied.
- Blue Path (Cross-Stage): Uses the stageDependencies object. The path requires the StageName segment.
- Mandatory Link: Both paths require a dependsOn declaration to populate the respective dependency context.
# $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
# │ │ │ │ │ │
# │ │ │ │ │ └── variable name (isOutput=true)
# │ │ │ │ └── step name: (not displayName:)
# │ │ │ └── outputs accessor
# │ │ └── job: key from producer
# │ └── same-stage dependencies object
# └── runtime expression wrapper
The dependsOn Requirement
The consuming job must declare dependsOn: ProducerJobName. Without it, the dependencies object for the producing job is not populated, and the reference resolves to empty string.
dependsOn serves two functions simultaneously: it establishes execution order (consumer runs after producer completes) and grants the consumer access to the producer’s output variable data. You cannot separate these two effects — dependsOn is always required for both.
jobs:
- job: BuildJob
pool:
vmImage: ubuntu-latest
steps:
- script: |
IMAGE_TAG="${BUILD_BUILDID}-$(git rev-parse --short HEAD)"
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
name: computeTag
displayName: "Compute and set image tag"
- job: DeployJob
dependsOn: BuildJob # Required: establishes ordering AND populates dependencies object
pool:
vmImage: ubuntu-latest
variables:
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Deploying image: $(imageTag)"
docker pull myregistry.azurecr.io/myapp:$(imageTag)
displayName: "Deploy tagged image"
A job can declare dependsOn on multiple jobs and access output variables from any of them in the same variables: block.
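A sketch of a consumer with two producers, assuming a hypothetical ScanJob whose runScan step emits a verdict output variable:

```yaml
- job: ReportJob
  dependsOn:
    - BuildJob                     # populates dependencies.BuildJob
    - ScanJob                      # populates dependencies.ScanJob (hypothetical job)
  pool:
    vmImage: ubuntu-latest
  variables:
    imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
    scanVerdict: $[ dependencies.ScanJob.outputs['runScan.verdict'] ]
  steps:
    - script: echo "Image $(imageTag), scan verdict $(scanVerdict)"
      displayName: "Summarize build and scan"
```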
The Two Consumption Patterns
Pattern 1 — Variable Mapping: Map the output variable into the consuming job’s variables: block. All steps in the job can then access it via $(varName) macro expansion. This is required when the value reaches a script, task input, or any step-level field.
Pattern 2 — Direct Condition Reference: Use the full $[ dependencies... ] reference directly in a condition: field. No variable mapping is needed when the value only controls whether a job or step executes.
# Pattern 1: Variable mapping — required for script and task access
jobs:
- job: DeployJob
dependsOn: BuildJob
pool:
vmImage: ubuntu-latest
variables:
# Map into job scope so steps can read it via $(imageTag)
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Deploying: $(imageTag)"
docker pull myregistry.azurecr.io/myapp:$(imageTag)
displayName: "Pull and deploy image"
# Pattern 2: Direct condition reference — sufficient when value only drives a condition
jobs:
- job: NotifyJob
dependsOn: BuildJob
# No variables mapping needed — reference evaluates inline in condition:
condition: eq(dependencies.BuildJob.outputs['computeTag.imageTag'], 'latest')
pool:
vmImage: ubuntu-latest
steps:
- script: echo "Latest tag detected — triggering downstream notification"
displayName: "Notify on latest tag"
The $[ ] runtime expression syntax is only evaluated in two contexts: condition: fields and variables: block values. It does not work inside step inputs, script bodies, or task parameter fields. If you write $(imageTag) in a script and it resolves to the literal string, check that the variable mapping exists in the job’s variables: block — the $[ ] expression must be the bridge.
Cross-Stage Output Variable References
stageDependencies vs. dependencies
Within a stage, use dependencies — the object contains jobs from the same stage.
Across stages, use stageDependencies — the object contains jobs from other stages that the current stage depends on.
Using dependencies to reference a job in another stage resolves to empty string. No parse error, no warning. It is one of the most common sources of silent failures in multi-stage pipelines.
| | dependencies | stageDependencies |
|---|---|---|
| Scope | Jobs in the same stage | Jobs in other stages |
| Syntax | dependencies.JobName.outputs['step.var'] | stageDependencies.StageName.JobName.outputs['step.var'] |
| When to use | Same-stage cross-job reference | Cross-stage reference |
| Common mistake | Used for cross-stage (wrong) | Forgetting the stage name segment |
# Wrong: using dependencies for a cross-stage reference — silent empty string
variables:
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ] # BuildJob is in a different stage
# Correct: using stageDependencies with the stage name included
variables:
imageTag: $[ stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag'] ]
The Full Cross-Stage Reference Path
The full syntax for a cross-stage output variable reference is:
$[ stageDependencies.ProducerStageName.ProducerJobName.outputs['stepName.varName'] ]
Five components: the $[ ] wrapper, stageDependencies, the stage’s stage: key value, the job’s job: key value, and the outputs['stepName.varName'] accessor.
The consuming stage must declare dependsOn: ProducerStageName. Without it, stageDependencies does not contain the producer stage’s data — the reference resolves to empty string.
stages:
# Stage 1: Build the image and emit the tag as an output variable
- stage: BuildStage
displayName: "Build"
jobs:
- job: BuildJob
pool:
vmImage: ubuntu-latest
steps:
- script: |
SHORT_SHA=$(git rev-parse --short HEAD)
IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
name: computeTag # Required identifier in downstream references
displayName: "Compute image tag"
# Stage 2: Read the tag from Stage 1 and verify the image
- stage: TestStage
displayName: "Integration Tests"
dependsOn: BuildStage # Required for stageDependencies.BuildStage access
jobs:
- job: IntegrationTests
pool:
vmImage: ubuntu-latest
variables:
# stageDependencies: stage name, then job name, then outputs accessor
imageTag: $[ stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Verifying image: $(imageTag)"
if [ -z "$(imageTag)" ]; then
echo "##vso[task.logissue type=error]imageTag is empty — check BuildStage dependsOn"
exit 1
fi
displayName: "Verify image tag is populated"
# Stage 3: Deploy using the tag from Stage 1 — accessed directly, not via Stage 2
- stage: DeployStage
displayName: "Deploy"
dependsOn:
- BuildStage # Direct access to BuildStage's output variables
- TestStage # Execution dependency — ensures tests passed first
condition: and(succeeded('BuildStage'), succeeded('TestStage'))
jobs:
- job: Deploy
pool:
vmImage: ubuntu-latest
variables:
imageTag: $[ stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Deploying image: $(imageTag)"
displayName: "Deploy $(imageTag)"
Propagating Variables Through an Intermediate Stage
stageDependencies does not provide transitive access. If Stage C depends only on Stage B, and Stage B depends on Stage A, Stage C cannot read Stage A’s output variables — they are not in Stage C’s stageDependencies object.
Two solutions:
Solution 1 — Direct multi-stage dependsOn: Declare Stage C as depending on both Stage A and Stage B. This gives Stage C direct access to Stage A’s outputs without any re-export. This is the simpler approach and the one to use when Stage C logically needs both stages to complete before it runs.
stages:
- stage: StageA
jobs:
- job: JobA
pool:
vmImage: ubuntu-latest
steps:
- script: |
echo "##vso[task.setvariable variable=buildId;isOutput=true]$(Build.BuildId)"
name: setId
displayName: "Set build ID"
- stage: StageB
dependsOn: StageA
jobs:
- job: JobB
pool:
vmImage: ubuntu-latest
steps:
- script: echo "Running integration checks"
- stage: StageC
dependsOn:
- StageA # Direct dependency — populates stageDependencies.StageA
- StageB # Execution dependency — StageC runs after StageB completes
jobs:
- job: JobC
pool:
vmImage: ubuntu-latest
variables:
# Access StageA's output directly — no re-export needed
buildId: $[ stageDependencies.StageA.JobA.outputs['setId.buildId'] ]
steps:
- script: echo "Build ID: $(buildId)"
Solution 2 — Re-export: Stage B reads Stage A’s output variable via stageDependencies, assigns it to a local variable, and a step in Stage B emits it with isOutput=true under the same or a new name. Stage C then reads Stage B’s re-emitted variable. Use this only when Stage C must not depend directly on Stage A for architectural reasons — it adds complexity and a fragile dependency chain.
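A minimal sketch of the re-export, reusing the StageA/JobA/setId names from the example above:

```yaml
- stage: StageB
  dependsOn: StageA
  jobs:
    - job: JobB
      pool:
        vmImage: ubuntu-latest
      variables:
        # Read StageA's output into this job's scope
        upstreamId: $[ stageDependencies.StageA.JobA.outputs['setId.buildId'] ]
      steps:
        - script: |
            # Re-emit under a named step so StageC can read it from StageB
            echo "##vso[task.setvariable variable=buildId;isOutput=true]$(upstreamId)"
          name: relayId
          displayName: "Re-export build ID"
# StageC (dependsOn: StageB) then reads:
#   $[ stageDependencies.StageB.JobB.outputs['relayId.buildId'] ]
```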
Matrix Job Output Variables
The Leg Name Component
When the producer is a matrix job, each leg runs as a distinct copy of the job with its own output variable namespace. The leg name — the key in the strategy.matrix block — is a required component of the reference path.
Full syntax for a matrix output variable reference:
$[ dependencies.MatrixJobName.outputs['legName.stepName.varName'] ]
The leg name is the exact matrix key as defined in the strategy.matrix block, case-sensitive.
jobs:
- job: TestMatrix
strategy:
matrix:
Node18: # Leg name — appears verbatim in the outputs accessor
nodeVersion: '18.x'
Node20:
nodeVersion: '20.x'
Node22:
nodeVersion: '22.x'
pool:
vmImage: ubuntu-latest
steps:
- task: NodeTool@0
inputs:
versionSpec: $(nodeVersion)
- script: |
# Run tests and emit pass/fail as an output variable for this leg
TEST_RESULT="passed"
echo "##vso[task.setvariable variable=testResult;isOutput=true]$TEST_RESULT"
name: runTests
displayName: "Run tests"
# Aggregation job reads each leg by its exact name
- job: Aggregate
dependsOn: TestMatrix
pool:
vmImage: ubuntu-latest
variables:
# Leg name ('Node18', 'Node20', 'Node22') prefixes the step name in the accessor
node18Result: $[ dependencies.TestMatrix.outputs['Node18.runTests.testResult'] ]
node20Result: $[ dependencies.TestMatrix.outputs['Node20.runTests.testResult'] ]
node22Result: $[ dependencies.TestMatrix.outputs['Node22.runTests.testResult'] ]
steps:
- script: |
echo "Node 18: $(node18Result)"
echo "Node 20: $(node20Result)"
echo "Node 22: $(node22Result)"
displayName: "Aggregate test results"
Dots (.) and hyphens (-) in matrix leg names break the output variable reference path. The path accessor outputs['legName.stepName.varName'] uses dots as separators — a dot in the leg name creates an extra path segment that does not resolve to a valid property. Use underscores only in leg names.
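One way to avoid invalid leg names is to sanitize them at template-expansion time with the compile-time replace() expression function. A sketch, assuming a nodeVersions parameter holding raw values such as 'node-18.x':

```yaml
strategy:
  matrix:
    ${{ each v in parameters.nodeVersions }}:
      # 'node-18.x' becomes 'node_18_x', a leg name that is valid in reference paths
      ${{ replace(replace(v, '.', '_'), '-', '_') }}:
        nodeSpec: ${{ v }}
```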
When Leg Names Are Dynamic
When the matrix is generated at compile time via ${{ each }}, the leg names are known during template expansion. The downstream job’s variables: block can use a matching ${{ each }} loop to generate the output variable mappings alongside the matrix definition, keeping both in sync.
parameters:
- name: nodeVersions
type: object
default:
- version: Node18
spec: '18.x'
- version: Node20
spec: '20.x'
- version: Node22
spec: '22.x'
jobs:
- job: TestMatrix
strategy:
matrix:
${{ each node in parameters.nodeVersions }}:
${{ node.version }}:
nodeSpec: ${{ node.spec }}
pool:
vmImage: ubuntu-latest
steps:
- task: NodeTool@0
inputs:
versionSpec: $(nodeSpec)
- script: echo "##vso[task.setvariable variable=testResult;isOutput=true]passed"
name: runTests
displayName: "Run tests"
- job: Aggregate
dependsOn: TestMatrix
pool:
vmImage: ubuntu-latest
variables:
# Generate one variable mapping per matrix leg — stays in sync with the matrix definition
${{ each node in parameters.nodeVersions }}:
${{ node.version }}Result: $[ dependencies.TestMatrix.outputs['${{ node.version }}.runTests.testResult'] ]
steps:
- ${{ each node in parameters.nodeVersions }}:
- script: echo "${{ node.version }}: $(${{ node.version }}Result)"
displayName: "Result: ${{ node.version }}"
When leg names are not known at compile time — for example, when a runtime matrix is populated from a previous step’s output — individual output variable references are not possible. The reference path requires a literal leg name; there is no runtime-evaluated accessor for dynamic leg names. For runtime matrix scenarios, use pipeline artifacts to pass data between jobs instead.
When Output Variables Are the Wrong Tool
Size and Complexity Limits
Output variables are for scalar values: build IDs, image tags, environment URLs, pass/fail flags. For anything larger or more structured, use pipeline artifacts.
| Criterion | Use an output variable | Use a pipeline artifact |
|---|---|---|
| Data size | Under ~32 KB | Over ~32 KB |
| Data type | String (ID, tag, flag, URL) | Structured data (JSON, binary, file) |
| Persistence | Current run only | Persists after run completion |
| Cross-pipeline access | Not supported | Supported |
| Downstream consumer count | Any number of jobs | Any number of jobs or pipelines |
A variable value that exceeds 32,766 characters triggers a ##[warning] in the logs and causes truncation. There is no exception — the truncated value is silently used by consumers. If your variable value is a JSON object, an array, or a file path list, serialize it to an artifact file instead.
Multi-Consumer Fan-Out
Multiple downstream jobs can read from the same output variable. Each consumer declares dependsOn: ProducerJob and maps the same reference independently. There is no fan-out limit.
jobs:
- job: BuildJob
pool:
vmImage: ubuntu-latest
steps:
- script: |
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$(Build.BuildId)"
name: computeTag
displayName: "Compute image tag"
# Three parallel jobs all reading the same output variable
- job: DeployEastUS
dependsOn: BuildJob
pool:
vmImage: ubuntu-latest
variables:
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: echo "Deploying $(imageTag) to East US"
- job: DeployWestEU
dependsOn: BuildJob
pool:
vmImage: ubuntu-latest
variables:
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: echo "Deploying $(imageTag) to West Europe"
- job: NotifySlack
dependsOn: BuildJob
pool:
vmImage: ubuntu-latest
variables:
imageTag: $[ dependencies.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: echo "Image $(imageTag) built — Slack notification sent"
Output variables are immutable within a run — each consumer reads the same value that the producer set. For very wide fan-outs (10+ consumers) where the variable value is larger than a few hundred bytes, publishing the value as a pipeline artifact reduces the load on the pipeline service’s variable store.
Cross-Pipeline Communication
Output variables are scoped to a single pipeline run. A separate pipeline run triggered via resources: cannot read output variables from the triggering pipeline.
For the common scenario of passing an image tag from a build pipeline to a release pipeline, the reliable pattern is a tag file artifact:
- Build pipeline writes the image tag to a small text file and publishes it as a pipeline artifact named image-metadata
- Release pipeline declares the build pipeline as a resources entry and downloads the artifact in its first job
- The release job reads the tag from the file: TAG=$(cat $(Pipeline.Workspace)/image-metadata/imagetag.txt)
This pattern survives pipeline re-runs, works across pipeline boundaries, and leaves a durable record of which tag was deployed and when — none of which output variables provide.
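The consuming pipeline's side of this pattern can be sketched as follows; the source pipeline name, resource alias, and artifact layout are assumptions:

```yaml
resources:
  pipelines:
    - pipeline: buildPipeline        # local alias for the resource (illustrative)
      source: myapp-build            # name of the producing pipeline (illustrative)
      trigger: true                  # run this pipeline when the build completes
jobs:
  - job: Release
    pool:
      vmImage: ubuntu-latest
    steps:
      - download: buildPipeline      # downloads the pipeline resource's artifacts
        artifact: image-metadata
      - script: |
          TAG=$(cat "$(Pipeline.Workspace)/buildPipeline/image-metadata/imagetag.txt")
          echo "Releasing tag: $TAG"
        displayName: "Read tag from artifact"
```

Note that an artifact downloaded from a pipeline resource lands under $(Pipeline.Workspace)/&lt;alias&gt;/&lt;artifact&gt;, so the read path includes the resource alias.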
Hands-On Example: Build-to-Deploy Pipeline with Image Tag Propagation
Scenario: A three-stage pipeline (Build → Integration Tests → Deploy) passes a container image tag from the Build stage to the Deploy stage. The Build job computes the tag from the build ID and short commit SHA, then pushes the image. The Integration Tests stage verifies the tagged image is accessible before Deploy proceeds. The Deploy stage uses the exact tag from the Build stage — not latest.
Prerequisites:
- An Azure DevOps pipeline with access to a container registry
- A container registry service connection named myregistry-sc
- Basic familiarity with multi-stage YAML syntax
Step 1: Build stage — compute and emit the output variable
trigger:
- main
stages:
# Stage 1: Build the image and emit the tag as an output variable
- stage: BuildStage
displayName: "Build"
jobs:
- job: BuildJob
displayName: "Build and push image"
pool:
vmImage: ubuntu-latest
steps:
- checkout: self
- script: |
SHORT_SHA=$(git rev-parse --short HEAD)
IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
echo "Image tag: $IMAGE_TAG"
# isOutput=true promotes this variable to the cross-stage output namespace
# name: 'computeTag' is required — it becomes part of every downstream reference path
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
name: computeTag # Required: stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag']
displayName: "Compute image tag"
- task: Docker@2
displayName: "Build and push image"
inputs:
command: buildAndPush
containerRegistry: myregistry-sc
repository: myapp
dockerfile: Dockerfile
# Use the tag set in the previous step — $(computeTag.imageTag) within same job
tags: $(computeTag.imageTag)
Step 2: Integration Tests stage — read and verify
# Stage 2: Read the tag from BuildStage and verify the image before deploying
- stage: TestStage
displayName: "Integration Tests"
dependsOn: BuildStage # Required: populates stageDependencies.BuildStage
jobs:
- job: IntegrationTests
displayName: "Verify image"
pool:
vmImage: ubuntu-latest
variables:
# Map from BuildStage into this job's local variable scope
# Syntax: stageDependencies.StageName.JobName.outputs['stepName.varName']
imageTag: $[ stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Testing with image tag: $(imageTag)"
# Guard: fail early if the output variable did not propagate
if [ -z "$(imageTag)" ]; then
echo "##vso[task.logissue type=error]imageTag is empty — check BuildStage dependsOn declaration"
exit 1
fi
echo "Image $(imageTag) confirmed accessible"
displayName: "Verify image tag is populated"
Step 3: Deploy stage — read from BuildStage directly
# Stage 3: Deploy using the tag from BuildStage — accessed directly, not relayed through TestStage
- stage: DeployStage
displayName: "Deploy"
dependsOn:
- BuildStage # Direct dependency gives access to BuildStage's output variables
- TestStage # Execution dependency — ensures integration tests passed
condition: and(succeeded('BuildStage'), succeeded('TestStage'))
jobs:
- job: Deploy
displayName: "Deploy to production"
pool:
vmImage: ubuntu-latest
variables:
# Access BuildStage's output directly — TestStage does not need to re-export it
imageTag: $[ stageDependencies.BuildStage.BuildJob.outputs['computeTag.imageTag'] ]
steps:
- script: |
echo "Deploying image: $(imageTag)"
if [ -z "$(imageTag)" ]; then
echo "##vso[task.logissue type=error]imageTag is empty — check DeployStage dependsOn"
exit 1
fi
# az containerapp update --image myregistry.azurecr.io/myapp:$(imageTag) ...
echo "Deployment of $(imageTag) complete"
displayName: "Deploy image $(imageTag)"
Verification
Run the pipeline and check each stage:
- BuildStage log: the computeTag step output shows Image tag: 12345-a1b2c3d. The Initialize Job log for downstream stages shows the variable in the outputs section.
- TestStage log: $(imageTag) resolves to 12345-a1b2c3d, not the literal string or empty.
- DeployStage log: $(imageTag) matches the value from BuildStage — not latest, not a hardcoded tag.
To confirm the reference is required: remove BuildStage from the Deploy stage’s dependsOn list and re-run. The $(imageTag) in the Deploy stage resolves to empty string, proving the dependency declaration is what populates the stageDependencies object.
Best Practices and Optimization
Always set name: on the step that writes an output variable. Make the name descriptive and stable. Renaming it later breaks every downstream reference silently. Treat the step name as a stable identifier.
Use the variable mapping pattern to bridge into script scope. The $[ dependencies... ] expression is evaluated only in condition: and variables: contexts. Map it into a job-level variables: entry, then access it via $(varName) in steps.
Validate that the output variable is non-empty before using it. A missing dependsOn produces an empty string, not an error. Add a guard step:
if [ -z "$(imageTag)" ]; then
echo "##vso[task.logissue type=error]imageTag is empty — check dependsOn declaration"
exit 1
fi
Or use it as a defensive condition:
condition: and(succeeded(), ne(dependencies.BuildJob.outputs['computeTag.imageTag'], ''))
Declare dependsOn explicitly on every stage and job that needs an output variable. Transitive access does not propagate stageDependencies data — only stages you directly depend on appear in your stageDependencies object.
Keep output variable values small. The value is serialized and stored as part of the pipeline run record. Use it for scalar identifiers (IDs, tags, flags, short strings). Serialize large structured data to a file and publish it as a pipeline artifact.
Map secret output variables explicitly. While log masking is automatic and global for secret variable values, they are not automatically mapped to environment variables. Map them in the variables: block of the consuming job to use them in scripts.
Troubleshooting Common Issues
Output variable reference resolves to empty string
Check all four prerequisites in order:
- The producing step has a name: field — without it, the reference path has no valid step identifier.
- The consuming job or stage declares dependsOn: on the producer — without it, the dependencies / stageDependencies object is not populated.
- The reference uses stageDependencies for cross-stage references and dependencies for same-stage references.
- All path components (job name, step name, variable name) match the YAML definitions exactly, including casing.
Add a diagnostic step that dumps the available dependency data:
# Dump all available output variables for debugging.
# convertToJson evaluates only inside a runtime expression, not macro syntax,
# so map it into a variable first, then print the mapped value from a script:
variables:
  depsJson: $[ convertToJson(dependencies) ]
steps:
  - script: |
      echo "Dependencies context:"
      echo '$(depsJson)'
Output variable is visible within the producing job but not in the consuming job
The isOutput=true flag was omitted from the task.setvariable command. Without it, the variable is job-scoped and does not appear in the dependencies or stageDependencies object.
# Wrong: job-scoped only
echo "##vso[task.setvariable variable=imageTag]$IMAGE_TAG"
# Correct: promoted to output variable namespace
echo "##vso[task.setvariable variable=imageTag;isOutput=true]$IMAGE_TAG"
Cross-stage output variable is empty despite correct stageDependencies syntax
The consuming stage’s dependsOn: references the stage by displayName: instead of the YAML stage: key. All dependency declarations and reference paths require the stage: key value.
# Wrong: using displayName in dependsOn
dependsOn: "Build and Push" # This is the displayName: value — fails
# Correct: using the stage: key value
dependsOn: BuildStage # This is the stage: key value — works
A matrix leg’s output variable is inaccessible — invalid characters in leg name
The matrix leg name was generated from a property containing hyphens or dots (e.g., node-18.x). The reference accessor outputs['legName.stepName.varName'] treats dots as path separators. Use underscores only in matrix leg names.
Key Takeaways
- An output variable requires three things: the ##vso[task.setvariable ...] logging command with isOutput=true, a name: field on the producing step, and successful step execution.
- Consuming jobs need dependsOn: on the producer for both execution ordering and access to the dependencies or stageDependencies object.
- Use dependencies for same-stage cross-job references and stageDependencies for cross-stage references — using the wrong object produces an empty string silently.
- The variable mapping pattern (variables: myVar: $[ dependencies... ]) is required to make an output variable available to script steps via $(myVar).
- When data is large or structured, use pipeline artifacts — output variables are for small scalar values (IDs, tags, flags, status strings).
Next Steps:
- Audit your multi-stage pipelines for any dependencies references used in cross-stage contexts — replace with stageDependencies.
- Read the Dynamic Matrix Strategies article for patterns combining matrix leg names with output variable references.
- Read the Debugging Silent Pipeline Failures article for techniques to confirm root causes of empty variables.
