The Ultimate Guide to Azure DevOps YAML Expressions

Apr 29, 2026

Your pipeline condition evaluates to false. You know the variable is set. The logs show it. Yet the step is skipped. No error. No warning. Just silence.

This failure pattern is the most common source of confusion in Azure DevOps pipelines. The root cause is almost always the same: you used the wrong expression syntax for the phase at which the value exists. Azure DevOps pipelines evaluate three distinct syntaxes — ${{ }}, $[ ], and $( ) — at three different points in the pipeline lifecycle. Mixing them up produces conditions that are always true, always false, or invisibly broken. Most documentation treats the three syntaxes as interchangeable footnotes. They are not.

This guide covers the exact phase at which each syntax evaluates, which variables are visible at compile time versus runtime, how to refactor broken conditions, and how to recognize the four most common expression failure patterns before they reach production.


The Three Expression Syntaxes at a Glance

Syntax Reference

| Syntax | Name | Evaluation Phase | Valid Contexts | Variable Sources | Primary Use Case |
|---|---|---|---|---|---|
| ${{ expression }} | Template expression | Phase 2 — Compile | if, each, else, template parameters, any YAML key or value | parameters, static variables | Structural modifications: conditional jobs, loop-generated stages, template inclusion |
| $[ expression ] | Runtime expression | Phase 4 — Initialize | condition:, variables: mappings | All pipeline variables, output variables, variable groups | Behavioral decisions: step conditions, output variable mapping |
| $(variableName) | Macro | Phase 5 — Execute | Task inputs, script arguments, variable values | Queue-time variables, variable groups, task.setvariable values | String substitution in task arguments |

All three syntaxes appear valid in a YAML file. None of them produces a parse error when used in the wrong context. The pipeline simply evaluates the expression against the variable store that exists at that phase — and if the variable is not there yet, the expression silently resolves to an empty string or false.

Here is a minimal pipeline that uses all three syntaxes in the same file:

# azure-pipelines.yml
parameters:
  - name: runSecurityScan
    type: boolean
    default: false

variables:
  - name: buildConfig
    value: Release

stages:
  # ${{ }} — compile-time: decides whether this stage exists in the plan at all
  - ${{ if parameters.runSecurityScan }}:
    - stage: SecurityScan
      displayName: Security Scan
      jobs:
        - job: Scan
          steps:
            - script: echo "Running security scan"

  - stage: Build
    jobs:
      - job: BuildApp
        steps:
          - script: dotnet build --configuration $(buildConfig)
            # $(buildConfig) — macro: substituted just before dotnet.exe receives the argument
            displayName: Build

          - bash: |
              echo "##vso[task.setvariable variable=imageTag;isOutput=true]$(Build.BuildId)"
            name: setTag
            displayName: Set image tag output variable

  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: DeployApp
        variables:
          # $[ ] — runtime: reads the output variable after Build stage finishes
          IMAGE_TAG: $[ stageDependencies.Build.BuildApp.outputs['setTag.imageTag'] ]
        steps:
          # condition: values are runtime expressions (Phase 4) — only deploy from main branch
          - script: echo "Deploying image tag $(IMAGE_TAG)"
            displayName: Deploy
            condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')

Why the Distinction Matters

Using ${{ if variables['Build.SourceBranch'] }} to conditionally include a deployment job is a common mistake. The condition always evaluates to false because Build.SourceBranch does not exist during YAML parsing — it is a predefined pipeline variable that the server populates in Phase 4, long after the compiled execution plan is locked in.

The pipeline accepts this YAML without complaint. There is no error. The deployment job simply never appears in the execution plan, and the run log offers no hint as to why.

The fix is a one-character change in the delimiter:

# BROKEN: Build.SourceBranch is not available at compile time (Phase 2)
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
  - job: Deploy
    # ...

# FIXED: Use a runtime condition evaluated at Phase 4
- job: Deploy
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
  # ...

Both lines look almost identical. One works. One silently removes the job from the plan every time.


Evaluation Order: The Exact Sequence

The Five-Phase Pipeline Lifecycle

Understanding when each syntax fires requires understanding when each phase runs.

Phase 1 — Parse. The Azure DevOps service reads your YAML file from the repository. Any - template: references trigger additional HTTP fetches for each referenced file. Each nested template that references another template adds another round-trip.

Phase 2 — Compile. The service evaluates all ${{ }} compile-time expressions against the data available right now: parameters values, statically declared variables, and built-in compile-time functions. The result is the Expanded YAML — a flat, fully resolved execution plan with no template references or conditional blocks remaining. This is the document the queue receives. Once the Expanded YAML is produced, no compile-time expression will ever run again for this pipeline run.

Phase 3 — Queue. The Expanded YAML is submitted to the pipeline queue. No expressions are evaluated here. The plan is immutable at this point.

Phase 4 — Initialize. An agent is provisioned and the job begins. The service hydrates the variable store: predefined pipeline variables (Build.SourceBranch, Build.BuildId, etc.), variable group values, and queue-time overrides are all populated. Runtime expressions ($[ ]) in condition: fields and variables: mappings evaluate against this fully populated store.

Phase 5 — Execute. Tasks run in sequence. Immediately before each task receives its inputs, macro substitution ($( )) replaces every $(varName) token with the current string value of that variable. Variables set by prior tasks via ##vso[task.setvariable] are visible to macros in subsequent steps.

Phase 1: Parse         → Template files fetched, syntax validated
Phase 2: Compile       → ${{ }} expressions evaluated → Expanded YAML produced
Phase 3: Queue         → Plan submitted, immutable
Phase 4: Initialize    → Variables hydrated → $[ ] expressions evaluated
Phase 5: Execute       → Tasks run → $( ) macros expanded per-task

Diagram: The Azure DevOps Pipeline Lifecycle and Expression Evaluation Order

This diagram visualizes the Azure DevOps Pipeline Lifecycle and Expression Evaluation Order. It maps the three expression syntaxes (${{ }}, $[ ], $( )) to the specific phases where they are evaluated, highlighting the critical “compile-time vs. runtime” boundary that governs variable availability and pipeline structure.


Visual Notes:

  • The Structural Boundary: Decisions made in the orange zone (Phase 2) change the structure of the pipeline (stages/jobs). Decisions made in the blue zone (Phases 4-5) change the behavior or values within that fixed structure.
  • The Availability Gap: Variables like Build.SourceBranch do not exist in the orange zone. If you use ${{ if }} to test them, the result is always false because the data hasn’t arrived yet.

The “Golden Rule” follows directly from this sequence: you cannot use data from a later phase in an earlier phase. ${{ }} cannot read a value that only exists in Phase 4. $[ ] cannot alter the structure of the pipeline that was fixed in Phase 2.

What Variables Exist at Compile Time

The compile-time variable store is small and static. Only two sources contribute to it:

# AVAILABLE at compile time (${{ }})
parameters:
  - name: environment        # parameters are always available at compile time
    type: string
    default: dev
  - name: regions
    type: object
    default:
      - name: eastus
        short: eus

variables:
  - name: buildConfig        # static YAML-defined variables are available
    value: Release
  - name: registryName
    value: myacr.azurecr.io

# NOT AVAILABLE at compile time — these fail silently in ${{ }} expressions
# Build.SourceBranch         → predefined pipeline variable (Phase 4)
# Build.BuildId              → predefined pipeline variable (Phase 4)
# System.AccessToken         → predefined pipeline variable (Phase 4)
# myVar (from variable group) → variable group values (Phase 4)
# outputVar (from prior job)  → set during Phase 5, never available at Phase 2

A common mistake is using ${{ if eq(variables['Build.Reason'], 'PullRequest') }} to add a PR-specific validation job. Build.Reason is a predefined variable populated in Phase 4. At Phase 2, variables['Build.Reason'] resolves to an empty string, eq('', 'PullRequest') evaluates to false, and the job never appears in the plan — even on pull requests.
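The same check works once it is moved to the phase where Build.Reason actually exists (the job and script names below are illustrative):

```yaml
# BROKEN (Phase 2): variables['Build.Reason'] is '' at compile time
# - ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
#   - job: PrValidation

# FIXED (Phase 4): the runtime condition sees the populated variable
- job: PrValidation
  condition: eq(variables['Build.Reason'], 'PullRequest')
  steps:
    - script: ./run-pr-checks.sh
```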

Variable Precedence and Override Order

When the same variable name appears in multiple sources, Azure DevOps applies this precedence order (later entries win):

| Priority | Source | Expression Access |
|---|---|---|
| 1 (lowest) | Static YAML variables: declaration | ${{ }}, $[ ], $( ) |
| 2 | Variable group | $[ ], $( ) only |
| 3 | Queue-time override (pipeline UI or API) | $[ ], $( ) only |
| 4 | task.setvariable (non-output) | $( ) in subsequent steps only |
| 5 (highest) | task.setvariable with isOutput=true | $[ dependencies... ] in subsequent jobs |

A variable group value at priority 2 overrides the static YAML declaration at priority 1 at runtime, but the static YAML value is still what the compile-time ${{ }} expression sees. If you use ${{ variables.buildConfig }} in a template and a variable group also defines buildConfig, the template expression uses the YAML-defined value, but $(buildConfig) in a script uses the variable group value.
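A sketch of that split, assuming a variable group named release-vars (hypothetical) that also defines buildConfig:

```yaml
variables:
  - group: release-vars        # hypothetically defines buildConfig: Debug
  - name: buildConfig
    value: Release

steps:
  - script: |
      echo "compile time saw: ${{ variables.buildConfig }}"  # always the YAML value, Release
      echo "run time saw: $(buildConfig)"                    # per the precedence table, the group value
```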

Mark variables that must not change with readonly: true to block queue-time overrides:

variables:
  - name: serviceConnection
    value: prod-arm-connection
    readonly: true   # Queue-time override rejected; pipeline fails with validation error

Compile-Time Expressions — ${{ }}

Structural Modification

Compile-time expressions are the only mechanism that can alter the structure of the pipeline — adding or removing jobs, stages, and steps before the execution plan is finalized.

${{ if }}, ${{ elseif }}, and ${{ else }} evaluate a boolean expression and include or exclude the YAML subtree that follows. The excluded subtree is completely absent from the Expanded YAML; it consumes no queue time, no agent time, and produces no skipped-step entries in the log.

${{ each item in collection }} iterates over a parameter array or object to generate repeated structure. Each iteration renders a copy of the template body with ${{ item.property }} tokens replaced by the current item’s values.

# Conditionally add a security scan job based on a boolean parameter
parameters:
  - name: runSecurityScan
    type: boolean
    default: false

jobs:
  - job: Build
    steps:
      - script: dotnet build

  - ${{ if parameters.runSecurityScan }}:
    - job: SecurityScan
      dependsOn: Build
      steps:
        - task: MicrosoftSecurityDevOps@1

# Generate one deployment stage per environment from an object parameter
parameters:
  - name: environments
    type: object
    default:
      - name: dev
        serviceConnection: dev-arm
        resourceGroup: rg-myapp-dev
      - name: prod
        serviceConnection: prod-arm
        resourceGroup: rg-myapp-prod

stages:
  - ${{ each env in parameters.environments }}:
    - stage: Deploy_${{ env.name }}
      displayName: Deploy to ${{ env.name }}
      jobs:
        - deployment: DeployApp
          environment: ${{ env.name }}
          strategy:
            runOnce:
              deploy:
                steps:
                  - task: AzureRmWebAppDeployment@4
                    inputs:
                      ConnectedServiceName: ${{ env.serviceConnection }}
                      ResourceGroupName: ${{ env.resourceGroup }}

Template Inclusion

${{ if }} controls which template files are included in the pipeline. Removing a - template: line via a compile-time conditional eliminates the template’s entire subtree from the plan, including all of its steps, jobs, and variables. This is the correct mechanism for optional feature injection — not a condition: on individual steps.

# Conditional template inclusion based on a boolean parameter
parameters:
  - name: publishArtifact
    type: boolean
    default: true

steps:
  - template: steps/build.yml

  - ${{ if eq(parameters.publishArtifact, true) }}:
    - template: steps/publish.yml
      parameters:
        artifactName: drop

# Typed object parameter passing a list of environments into a stage template
parameters:
  - name: deployConfig
    type: object
    default:
      environments:
        - name: staging
          slot: staging
        - name: production
          slot: production
      approvalRequired: true

stages:
  - template: stages/deploy-all.yml
    parameters:
      config: ${{ parameters.deployConfig }}

Type checking on parameters is strict at parse time. If you declare a parameter as type: boolean and pass a string, the pipeline fails immediately during Phase 1 with Parameter 'runSecurityScan' expected type Boolean. This early failure surfaces misconfigurations before any agent time is consumed.
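A sketch of a call that trips the type check (the template path and parameter are illustrative):

```yaml
# steps/scan.yml (hypothetical) declares:
#   parameters:
#     - name: runSecurityScan
#       type: boolean
steps:
  - template: steps/scan.yml
    parameters:
      runSecurityScan: "maybe"   # rejected at parse time: expected type Boolean
```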

Limitations

Compile-time expressions cannot read predefined pipeline variables, variable group values, or any value produced by a running task. The inputs to ${{ }} expressions are fixed at the moment the YAML file is read from the repository.

Nesting ${{ each }} inside another ${{ each }} is valid but must be managed carefully. Azure DevOps caps template nesting at 100 levels. When a deeply nested loop generates a large number of jobs, you can hit the 20 MB parsing memory limit before reaching the nesting cap. The error "Maximum object size exceeded" or "Pipeline memory budget exceeded" appears when the server cannot hold the expanded document in memory. If you reach this limit, flatten the outer loop by moving the inner logic to a dedicated template file, or move the branching logic into a script task that runs on the agent.
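One way to flatten is to keep only the outer loop in the root file and delegate the inner loop to a template (the file name and parameter names below are illustrative):

```yaml
# azure-pipelines.yml — outer loop only
stages:
  - ${{ each env in parameters.environments }}:
    - template: stages/deploy-regions.yml   # inner ${{ each region }} loop lives here
      parameters:
        environmentName: ${{ env.name }}
        regions: ${{ parameters.regions }}
```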

Expressions cap at 21,000 characters. Generating a massive join() call or chaining many format() calls inside a single expression will produce an Exceeded max expression length error. Split long expressions across multiple variables.
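Splitting across variables can look like this sketch (parameter names are illustrative); a template expression later in the file can build on a variable declared earlier:

```yaml
variables:
  - name: tagPrefix
    value: ${{ format('{0}-{1}', parameters.team, parameters.app) }}
  - name: imageTag
    # Builds on tagPrefix instead of repeating the whole format() chain
    value: ${{ format('{0}-{1}', variables.tagPrefix, parameters.version) }}
```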


Runtime Expressions — $[ ]

Conditions

condition: on a step, job, or stage accepts a runtime expression. Azure DevOps evaluates the expression at Phase 4 after the variable store is fully hydrated. The expression must return a boolean; any non-boolean return value is coerced.

The default condition for every step, job, and stage is succeeded(), which is equivalent to writing condition: succeeded() explicitly. This means a step only runs if all prior steps in the job passed. Override it when a step must run regardless of upstream status:

# Run only on main branch AND only if the upstream job succeeded
- job: Deploy
  dependsOn: Build
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
    - script: echo "Deploying to production"

# Always run a cleanup step even if prior steps failed
- script: docker system prune -f
  displayName: Cleanup Docker
  condition: always()

# Run a notification step whether the prior steps succeeded or failed
- script: ./notify.sh
  displayName: Notify Slack
  condition: succeededOrFailed()

The full set of condition functions available at runtime:

| Function | Description |
|---|---|
| succeeded() | True if all previous dependencies succeeded |
| succeededOrFailed() | True if dependencies succeeded or failed (but not canceled) |
| failed() | True if any previous dependency failed |
| always() | True regardless of prior status, including cancellation |
| canceled() | True if the pipeline was canceled |
| eq(a, b) | True if a equals b (case-insensitive for strings) |
| ne(a, b) | True if a does not equal b |
| gt(a, b) | True if a is greater than b |
| and(a, b, ...) | True if all arguments are true |
| or(a, b, ...) | True if any argument is true |
| not(a) | Inverts the boolean |
| contains(string, substring) | True if string contains substring |
| startsWith(string, prefix) | True if string starts with prefix |
| endsWith(string, suffix) | True if string ends with suffix |
| in(value, a, b, ...) | True if value matches any listed item |
| coalesce(a, b, ...) | Returns the first non-empty, non-null value |

(Note: containsValue() is also available but is primarily effective with compile-time arrays passed via parameters; runtime variables evaluate as strings, making contains() or in() more appropriate.)

Variable Mapping Between Jobs

Output variables cross job boundaries through a $[ dependencies... ] reference. The referencing job must declare dependsOn: to establish the dependency — without it, the variable is not available, and the reference resolves to an empty string.

jobs:
  - job: BuildApp
    steps:
      # Publish the image tag as an output variable
      - bash: |
          SHORT_SHA=$(git rev-parse --short HEAD)
          IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
          echo "Image tag: ${IMAGE_TAG}"
          echo "##vso[task.setvariable variable=imageTag;isOutput=true]${IMAGE_TAG}"
        name: setTag                      # step name is the key in the output reference path
        displayName: Compute and publish image tag

  - job: DeployApp
    dependsOn: BuildApp                   # required: establishes the dependency
    variables:
      # Map the output variable into this job's variable store
      IMAGE_TAG: $[ dependencies.BuildApp.outputs['setTag.imageTag'] ]
    steps:
      - script: echo "Deploying image $(IMAGE_TAG)"
        displayName: Deploy
        # Also usable in conditions
        condition: and(succeeded(), ne(variables['IMAGE_TAG'], ''))

For output variables that cross stage boundaries, the reference path uses stageDependencies:

stages:
  - stage: Build
    jobs:
      - job: BuildApp
        steps:
          - bash: echo "##vso[task.setvariable variable=imageTag;isOutput=true]$(Build.BuildId)"
            name: setTag

  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: DeployApp
        variables:
          # stageDependencies.<StageName>.<JobName>.outputs['<stepName>.<varName>']
          IMAGE_TAG: $[ stageDependencies.Build.BuildApp.outputs['setTag.imageTag'] ]
        steps:
          - script: docker pull myregistry.azurecr.io/myapp:$(IMAGE_TAG)

Common Pitfalls

The Truthiness Trap. The string 'false' is truthy in runtime expressions. A variable set to the string value false does not evaluate as boolean false — it evaluates as a non-empty string, which is truthy. This produces conditions that always fire even when you intend them to be off:

# BROKEN: the string 'false' is truthy — this step always runs
variables:
  - name: deployToProd
    value: false
steps:
  - script: ./deploy-prod.sh
    condition: variables['deployToProd']  # evaluates to 'false' (string) = truthy

# FIXED: compare explicitly against the string 'true'
  - script: ./deploy-prod.sh
    condition: eq(variables['deployToProd'], 'true')

Missing dependsOn. If Job B references dependencies.JobA.outputs['stepName.varName'] but does not declare dependsOn: JobA, the output variable resolves to an empty string. Azure DevOps does not warn about this. The dependsOn: and the dependencies.* reference must name the same job with the same casing.

Undefined variable references. $[ variables['unsetVar'] ] returns an empty string, not an error. A condition like eq(variables['tier'], 'production') silently evaluates to eq('', 'production') = false if tier was never set. Use coalesce(variables['tier'], 'dev') to supply a default.
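The defaulting pattern in a variables mapping (the variable name tier is illustrative):

```yaml
variables:
  # Falls back to 'dev' when the queue-time variable 'tier' was never set
  TIER: $[ coalesce(variables['tier'], 'dev') ]

steps:
  - script: echo "Deploying tier $(TIER)"
```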

succeeded() vs. succeededOrFailed(). A step conditioned on succeeded() does not run if an earlier step in the job failed. If you have a notification or cleanup step that must run even after a failure, use succeededOrFailed() — or always() if it must also run after cancellation.


Macro Expressions — $( )

How Macros Work

Macro substitution is the simplest of the three mechanisms. Immediately before Azure DevOps delivers a task’s inputs to the agent, it performs a string find-and-replace on every input value: $(varName) becomes the current string value of varName.

The substitution reads from the variable store as it exists at the moment that specific task begins. This means macros can read values set by task.setvariable in any prior step in the same job — something runtime expressions ($[ ]) cannot do, because those evaluate once at job initialization before any steps run.
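A sketch of the set-then-read pattern within one job (the URL and variable name are illustrative):

```yaml
steps:
  - bash: echo "##vso[task.setvariable variable=apiUrl]https://myapp.example.test"
    displayName: Set apiUrl for later steps

  # The macro below expands when this task starts, so it sees the value written
  # by the previous step — a $[ ] mapping evaluated at job start would not.
  - script: curl --fail "$(apiUrl)/health"
    displayName: Health check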

steps:
  - task: DotNetCoreCLI@2
    inputs:
      command: publish
      # At task start, each macro below is replaced with its current string value:
      #   $(buildConfig)                    → Release
      #   $(Build.ArtifactStagingDirectory) → /home/vsts/work/1/a
      #   $(Build.BuildId)                  → 42
      # (Placed here rather than inside the folded scalar — a # inside a block
      # scalar is literal text, not a comment, and would become part of the arguments.)
      arguments: >
        --configuration $(buildConfig)
        --output $(Build.ArtifactStagingDirectory)/$(Build.BuildId)
    displayName: Publish application

If a macro references a variable that does not exist at execution time, the literal token is passed to the task unchanged. A task receiving $(undefinedVar) as an argument gets the string $(undefinedVar). The task usually accepts this silently and produces wrong output rather than failing. To diagnose this, add a diagnostic step before the failing task and check whether the variable appears in the environment:

- bash: |
    # Dump all variables to the log to identify undefined macros
    printenv | sort           # Linux/macOS agents
  displayName: Diagnostic variable dump (Linux)
  condition: always()

- pwsh: |
    Get-ChildItem Env: | Sort-Object Name | Format-Table -AutoSize
  displayName: Diagnostic variable dump (Windows)
  condition: always()

Remove these diagnostic steps before merging to the main branch.

Secrets and Security

Secret variables are masked in logs — any log line containing the secret value is replaced with ***. Macro substitution still delivers the plaintext value to the task; the masking is a log post-processing step, not an access control mechanism.

Passing secrets via macro expansion to inline scripts creates an injection risk. When a secret value contains shell metacharacters (spaces, quotes, semicolons), the script can break or behave unexpectedly. Use the env: block to pass secrets to scripts as environment variables instead:

# AVOID: macro expansion in a script argument
- script: ./deploy.sh --token $(MY_SECRET_TOKEN)

# PREFER: environment variable via env: block
- bash: ./deploy.sh --token "$DEPLOY_TOKEN"
  env:
    DEPLOY_TOKEN: $(MY_SECRET_TOKEN)
    # MY_SECRET_TOKEN is now an env var; the shell sees $DEPLOY_TOKEN, not the literal token value

The env: block approach prevents the secret from appearing in the process argument list (visible to other processes via /proc/<pid>/cmdline on Linux) and avoids shell word-splitting issues entirely.
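The argument-list exposure is easy to demonstrate on a Linux agent. In this sketch a sleep stands in for a deploy script that was handed a secret as a command-line argument:

```shell
#!/usr/bin/env bash
# A long-running child process standing in for `./deploy.sh --token <secret>`
sleep 3 &
pid=$!

# Any local process (or user) can read the child's full argument list:
args=$(tr '\0' ' ' < "/proc/$pid/cmdline")
echo "visible to any process: $args"

kill "$pid" 2>/dev/null
```

Values passed via env: never appear in /proc/<pid>/cmdline, which is why the env: block is the safer channel.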

Never add a diagnostic printenv or variable dump step to a pipeline that has secret variables configured. Even though the values appear as *** in the log, the presence of the step creates an opportunity for the masked value to leak if the masking logic misses an encoding variant (e.g., URL-encoded or base64-wrapped forms of the secret).


Choosing the Right Syntax: A Decision Framework

Decision Tree

Is the decision structural — does it add or remove jobs, stages, or steps?
  YES → ${{ if }} / ${{ each }}
  NO  ↓

Does the value only exist after a prior job or step runs (output variable)?
  YES → $[ dependencies.X.outputs['step.var'] ] in variables: block
  NO  ↓

Does the decision control whether a step/job runs or is skipped?
  YES → condition: $[ expr ] or condition: eq(variables['flag'], 'true')
  NO  ↓

Is this a string value being passed to a task argument or script?
  YES → $(variableName) macro
  NO  ↓

Is this a template parameter value used in a template file?
  YES → ${{ parameters.name }}

Quick-Reference Cheat Sheet

| Use Case | Correct Syntax | Common Mistake | Why the Mistake Fails |
|---|---|---|---|
| Conditionally include a job | ${{ if parameters.flag }}: | condition: variables['flag'] | condition: skips but does not remove the job; the job still appears in the plan and costs queue time |
| Conditionally skip a step | condition: eq(variables['flag'], 'true') | ${{ if variables['flag'] }} | Predefined/runtime variables are not available at compile time; the if always resolves to false |
| Generate N deployment stages | ${{ each env in parameters.envs }}: | A matrix: strategy | matrix generates parallel jobs within one stage; ${{ each }} generates separate named stages with separate approvals |
| Read a prior job's output variable | $[ dependencies.JobA.outputs['step.var'] ] in variables: | $(dependencies.JobA.outputs['step.var']) | Macro syntax does not support the dependencies object; the token passes through literally |
| String interpolation in a script | $(Build.BuildId) | $[ variables['Build.BuildId'] ] | $[ ] is not valid in a script: value; it is only valid in condition: and variables: |
| Prevent the string 'false' from being truthy | eq(variables['flag'], 'true') | variables['flag'] as a bare condition | Non-empty strings, including 'false', are truthy in runtime expression context |
| Pass a secret to a script securely | MYVAR: $(SECRET) in an env: block | script: ./run.sh $(SECRET) | A macro in an argument exposes the value in the process argument list and risks word-splitting |

Debugging Expressions with Expanded YAML

Using the Expanded YAML View

After a pipeline run starts, navigate to the run’s Logs tab. Select the Initialize job step for any job. Near the top of that step’s log, Azure DevOps writes the Expanded YAML — the fully compiled, flat execution plan that was produced in Phase 2.

The Expanded YAML is the most reliable debugging tool for compile-time expression problems. If a job you expected to appear is missing from the Expanded YAML, a ${{ if }} evaluated to false during Phase 2. The job was never part of the plan, which is why no “skipped” entry appears in the run graph. The absence is the signal.

A typical debugging workflow:

  1. Open the run that produced the unexpected behavior.
  2. Click the job that should have contained the missing step or the job that itself is missing.
  3. Open Initialize job → scan for the Expanded YAML block.
  4. Search the Expanded YAML for the job or step name. If it is absent, the ${{ if }} evaluated to false.
  5. Identify which variable the if depends on, and check whether that variable could possibly exist at compile time. If it is a predefined variable or variable group value, move the logic to a condition: field.

Diagnostic Variable Dump Pattern

When macro substitution produces unexpected values — the wrong string, a literal $(varName) token, or an empty value — a variable dump step inserted before the failing task shows exactly what is in the runtime variable store:

steps:
  # --- Temporary diagnostic step — remove before merging ---
  - bash: printenv | sort
    displayName: "[DEBUG] All environment variables (Linux)"
    condition: always()

  - pwsh: Get-ChildItem Env: | Sort-Object Name | Format-Table Name, Value -AutoSize
    displayName: "[DEBUG] All environment variables (Windows)"
    condition: always()
  # --- End diagnostic ---

  - task: AzureWebApp@1
    inputs:
      appName: $(APP_NAME)      # If APP_NAME is not in the dump, $(APP_NAME) passes literally

The [DEBUG] prefix in displayName makes these steps easy to locate and remove before the PR is merged.


Hands-On Example: Multi-Stage Pipeline with All Three Syntaxes

Scenario: A .NET application pipeline that builds the application, optionally runs a security scan on pull requests, publishes an artifact, and deploys to a named environment. The deployment job reads an output variable from the build job to select the correct container image tag.

Prerequisites:

  • An Azure DevOps project with dev and prod environments defined
  • A YAML pipeline file at the repository root
  • A self-hosted or Microsoft-hosted agent pool

# azure-pipelines.yml
trigger:
  branches:
    include:
      - main

# PR validation: the pr: trigger applies to GitHub-hosted repositories;
# for Azure Repos, configure a branch policy on main instead
pr:
  branches:
    include:
      - main

parameters:
  # runSecurityScan: boolean parameter controls whether SecurityScan stage exists in the plan
  - name: runSecurityScan
    type: boolean
    default: false
  # deployEnvironments: object parameter drives ${{ each }} stage generation
  - name: deployEnvironments
    type: object
    default:
      - name: dev
        serviceConnection: dev-arm-connection
        slot: staging
      - name: prod
        serviceConnection: prod-arm-connection
        slot: production

variables:
  - name: buildConfiguration
    value: Release
  - name: dotnetVersion
    value: '8.0.x'

stages:
  # ── Stage 1: Build ──────────────────────────────────────────────────────────
  - stage: Build
    displayName: Build
    jobs:
      - job: BuildApp
        displayName: Build Application
        pool:
          vmImage: ubuntu-latest
        steps:
          - task: UseDotNet@2
            inputs:
              version: $(dotnetVersion)       # $( ) macro: resolved at task execution time
            displayName: Use .NET $(dotnetVersion)

          - task: DotNetCoreCLI@2
            inputs:
              command: build
              arguments: --configuration $(buildConfiguration) --no-restore
            displayName: Build

          - task: DotNetCoreCLI@2
            inputs:
              command: publish
              arguments: >
                --configuration $(buildConfiguration)
                --output $(Build.ArtifactStagingDirectory)
            displayName: Publish artifact

          # Set an output variable that downstream stages will consume
          - bash: |
              SHORT_SHA=$(git rev-parse --short HEAD)
              IMAGE_TAG="${BUILD_BUILDID}-${SHORT_SHA}"
              echo "Image tag: ${IMAGE_TAG}"
              echo "##vso[task.setvariable variable=imageTag;isOutput=true]${IMAGE_TAG}"
            name: setTag                      # step name is the key in the output reference path
            displayName: Compute image tag

          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: $(Build.ArtifactStagingDirectory)
              artifactName: drop
            displayName: Publish to pipeline

  # ── Stage 2: Security Scan (compile-time conditional) ────────────────────────
  # ${{ if }} — this entire stage is absent from the Expanded YAML when runSecurityScan is false
  - ${{ if eq(parameters.runSecurityScan, true) }}:
    - stage: SecurityScan
      displayName: Security Scan
      dependsOn: Build
      jobs:
        - job: Scan
          displayName: Run SAST and Dependency Scan
          pool:
            vmImage: ubuntu-latest
          steps:
            - task: MicrosoftSecurityDevOps@1
              displayName: Microsoft Security DevOps

  # ── Stages 3+: Deploy (generated via ${{ each }}) ─────────────────────────
  # ${{ each }} — generates one stage per entry in deployEnvironments
  - ${{ each env in parameters.deployEnvironments }}:
    - stage: Deploy_${{ env.name }}
      displayName: Deploy to ${{ env.name }}
      # If security scan ran, depend on it; otherwise depend on Build
      ${{ if eq(parameters.runSecurityScan, true) }}:
        dependsOn:
          - Build
          - SecurityScan
      ${{ else }}:
        dependsOn:
          - Build
      jobs:
        - deployment: DeployApp
          displayName: Deploy Application
          pool:
            vmImage: ubuntu-latest
          environment: ${{ env.name }}
          variables:
            # $[ ] runtime expression — reads Build stage output variable
            IMAGE_TAG: $[ stageDependencies.Build.BuildApp.outputs['setTag.imageTag'] ]
          strategy:
            runOnce:
              deploy:
                steps:
                  - download: current
                    artifact: drop

                  # Runtime condition (condition: fields are evaluated at runtime) — only deploy prod from the main branch
                  - task: AzureRmWebAppDeployment@4
                    displayName: Deploy to ${{ env.name }}
                    condition: >
                      and(
                        succeeded(),
                        or(
                          ne('${{ env.name }}', 'prod'),
                          eq(variables['Build.SourceBranch'], 'refs/heads/main')
                        )
                      )
                    inputs:
                      ConnectedServiceName: ${{ env.serviceConnection }}
                      WebAppName: myapp-${{ env.name }}
                      DeployToSlotOrASEFlag: true
                      ResourceGroupName: rg-myapp-${{ env.name }}
                      SlotName: ${{ env.slot }}
                      Package: $(Pipeline.Workspace)/drop/*.zip
                      # $(IMAGE_TAG) — $( ) macro resolves at task execution time
                      AdditionalArguments: -imageTag $(IMAGE_TAG)

Verification steps:

  1. Run the pipeline with runSecurityScan: false. Open the Initialize job log for Deploy_dev and inspect the Expanded YAML. The SecurityScan stage should be absent; the Deploy_dev and Deploy_prod stages should both be present.
  2. Run the pipeline with runSecurityScan: true. The Expanded YAML should now include the SecurityScan stage, and the deploy stages should list both Build and SecurityScan in their dependsOn.
  3. After the BuildApp job completes, expand the setTag step log and confirm the IMAGE_TAG value was written. In the DeployApp job, check the Initialize job log to confirm IMAGE_TAG was mapped from the stage dependency.
  4. In the Deploy_prod deployment, check the condition evaluation log to confirm that ne('prod', 'prod') evaluates to false, leaving eq(variables['Build.SourceBranch'], 'refs/heads/main') as the deciding term — prod deployments are gated to the main branch.

Best Practices and Optimization

Use typed parameters for all compile-time branching. Declaring type: boolean or type: string with values: [dev, staging, prod] makes misuse fail at parse time with a clear error message. An untyped object parameter accepts anything, including malformed input that produces a runtime failure hours later.
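In YAML, the practice above looks like this (the targetEnvironment parameter is illustrative, not from the pipeline earlier in this guide):

```yaml
parameters:
  - name: runSecurityScan
    type: boolean                    # anything other than true/false fails at parse time
    default: false
  - name: targetEnvironment          # hypothetical parameter for illustration
    type: string
    default: dev
    values: [dev, staging, prod]     # any other value is rejected with a clear parse error
```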

Prefer condition: over ${{ if }} when the decision can be deferred to runtime. Compile-time conditionals remove structure from the plan; runtime conditions preserve it. A job that was removed by ${{ if }} produces no audit trail in the run history. A job that was skipped by condition: appears in the run graph as skipped, which is visible to everyone reviewing the pipeline history.
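Side by side, the two forms of the same decision might look like this (deployDocs is an illustrative name; note the compile-time form reads a parameter, while the runtime form compares a variable against the string 'true'):

```yaml
# Removed from the plan entirely — leaves no trace in the run history
- ${{ if eq(parameters.deployDocs, true) }}:
  - job: PublishDocs

# Preserved in the plan — appears as "skipped" in the run graph when false
- job: PublishDocs
  condition: eq(variables['deployDocs'], 'true')
```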

Name output variables explicitly and namespace them. imageTag is ambiguous across jobs. build.imageTag or publish.containerImageTag is unambiguous. The step name: used in the output reference path (dependencies.Job.outputs['stepName.varName']) is case-sensitive; a mismatch produces a silent empty-string resolution.
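A namespaced producer step might look like the following sketch (the variable and step names are illustrative):

```yaml
- bash: |
    TAG="1.2.3"
    # Namespaced, unambiguous output variable name
    echo "##vso[task.setvariable variable=containerImageTag;isOutput=true]${TAG}"
  name: publish        # downstream path: <job>.outputs['publish.containerImageTag']
  displayName: Publish container image tag
```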

Avoid nesting ${{ each }} more than two levels deep. Each additional nesting level multiplies the template expansion work and pushes toward the 20 MB parsing memory limit. If you need three-level nesting, the outer loop almost always belongs in an orchestrator pipeline that calls inner pipelines via - pipeline: resource triggers.

Move complex branching logic to scripts. Five levels of nested ${{ if }} that recalculate the same value in different contexts add to pipeline initialization time. A single script that runs on the agent and writes its output via task.setvariable is faster, more readable, and easier to unit-test.
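For example, the branching below computes a value once on the agent instead of repeating it in nested compile-time conditionals (deployTier is an illustrative variable name, not from the pipeline above):

```yaml
# One script step computes the value once; later steps read $(deployTier)
- bash: |
    if [[ "$(Build.SourceBranch)" == "refs/heads/main" ]]; then
      TIER="prod"
    else
      TIER="dev"
    fi
    echo "##vso[task.setvariable variable=deployTier]${TIER}"
  displayName: Compute deploy tier
</imports>
```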

Lock production-critical variables with readonly: true. Variables that gate privileged deployments must not be overridable at queue time. A misconfigured queue-time override on serviceConnection or deployEnvironment can redirect a deployment to the wrong target with no compile-time or runtime error.
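In the variables block, the lock looks like this (the value is illustrative):

```yaml
variables:
  - name: serviceConnection
    value: prod-service-connection   # illustrative value
    readonly: true                   # queue-time overrides are rejected
```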


Troubleshooting Common Issues

Issue: Compile-time if block never executes despite the variable being set

The variable is a variable group value, a queue-time variable, or a predefined pipeline variable that is not available in templates. Variable group and queue-time values never exist during Phase 2, and only the subset of predefined variables marked "available in templates" in the predefined variables reference can be read there. The reliable fix is to move the condition to a runtime condition: field:

# BROKEN
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
  - job: Deploy

# FIXED
- job: Deploy
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')

Issue: Output variable from a prior job is empty in the consuming job

Check two things in order. First, confirm dependsOn: JobA is declared on the consuming job. Second, confirm the step name: in the producing job matches the name used in the reference path exactly — including casing. The reference dependencies.JobA.outputs['setTag.imageTag'] requires the step to have name: setTag, not name: SetTag or name: set-tag.
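A minimal producer/consumer pair that satisfies both checks (job names and the tag value are illustrative):

```yaml
- job: JobA
  steps:
    - bash: echo "##vso[task.setvariable variable=imageTag;isOutput=true]1.2.3"
      name: setTag                   # casing must match the reference below exactly

- job: JobB
  dependsOn: JobA                    # required — without it the reference silently resolves empty
  variables:
    IMAGE_TAG: $[ dependencies.JobA.outputs['setTag.imageTag'] ]
  steps:
    - bash: echo "Tag is $(IMAGE_TAG)"
```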

Issue: $(variableName) is passed literally to the task instead of being expanded

The variable was not set before the task executed, or the variable name is misspelled. Add the diagnostic variable dump step immediately before the failing task to confirm whether the variable name appears in the environment. If it is absent, trace backward to the step that should have set it and confirm the ##vso[task.setvariable variable=exactName]value command ran and the variable name matches.
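A minimal form of such a dump step is a single bash line (pipeline variables are surfaced as environment variables with dots replaced by underscores and names upper-cased):

```yaml
- bash: env | sort
  displayName: Dump environment variables
```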

Issue: ${{ each }} generates duplicate stage names, causing the parse error "Stage name 'Deploy_prod' has already been defined"

Two items in the iterated collection produce the same rendered string. Add a disambiguating field to the stage name:

# BROKEN: two envs with the same tier name produce identical stage names
- stage: Deploy_${{ env.name }}

# FIXED: include region abbreviation or tier to ensure uniqueness
- stage: Deploy_${{ env.region }}_${{ env.name }}

Issue: A condition: succeeded() job runs even though its upstream dependency was skipped

For job and stage dependencies, succeeded() treats a skipped dependency as a success, so the dependent work still runs. If it must run only when the dependency actually executed and succeeded, check the dependency result explicitly, for example in(dependencies.JobA.result, 'Succeeded', 'SucceededWithIssues'). If it should run even when upstream failed, use succeededOrFailed() or always() — but be explicit about which behavior you intend.
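An explicit dependency-result check is the most unambiguous way to express the intent (JobA and JobB are illustrative names):

```yaml
- job: JobB
  dependsOn: JobA
  # Runs only when JobA actually executed and succeeded — a skipped JobA does not count
  condition: in(dependencies.JobA.result, 'Succeeded', 'SucceededWithIssues')
```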

Issue: Parse error "The slicing expansion resulted in 257 jobs"

A matrix: strategy or ${{ each }} loop generated more than 256 jobs in a single stage. Azure DevOps caps jobs per stage at 256. Split the work across multiple stages, each with a subset of the matrix legs:

- stage: BatchA
  jobs:
    - ${{ each item in parameters.firstHalf }}:
      - job: Process_${{ item.name }}
        steps:
          - script: ./process.sh ${{ item.name }}

- stage: BatchB
  dependsOn: BatchA
  jobs:
    - ${{ each item in parameters.secondHalf }}:
      - job: Process_${{ item.name }}
        steps:
          - script: ./process.sh ${{ item.name }}

Key Takeaways

  1. The three syntaxes evaluate at different phases: ${{ }} at parse time (Phase 2), $[ ] at initialization (Phase 4), $( ) immediately before each task (Phase 5). Using the wrong syntax for the variable’s availability phase produces a silent failure, not an error.
  2. Variable group values and runtime-populated variables are never available at compile time, and only the subset of predefined variables marked "available in templates" can be read there. Reliable compile-time branching uses parameters: or static variables: declarations.
  3. The Expanded YAML view under Initialize job is the primary debugging tool for compile-time expressions. A job absent from the Expanded YAML was removed by a ${{ if }} that evaluated to false.
  4. Runtime boolean conditions must use explicit comparisons like eq(variables['flag'], 'true') because the string 'false' is truthy.
  5. Output variable references require an exact match on the step name: field and a dependsOn: declaration on the consuming job. Missing either produces a silent empty string.
  6. The maximum expression length is 21,000 characters, parsing memory is 20 MB, and template nesting caps at 100 levels.

Sources