# Agent file format

The compiler expects markdown files with YAML front matter similar to gh-aw:

```markdown
---
name: "name for this agent"
description: "One line description for this agent"
target: standalone # Optional: "standalone" (default), "1es", "job", or "stage". See docs/targets.md.
engine: copilot # Engine identifier. Defaults to copilot. Currently only 'copilot' (GitHub Copilot CLI) is supported.
# engine: # Alternative object format (with additional options)
#   id: copilot
#   model: claude-opus-4.7
#   timeout-minutes: 30
workspace: repo # Optional: "root", "repo" (alias: "self"), or a checked-out repository alias. If not specified, defaults to "root" when no additional repositories are listed in `repos:`, and to "repo" when one or more additional repos are checked out. See "Workspace Defaults" below.
pool: AZS-1ES-L-MMS-ubuntu-22.04 # Agent pool name (string format). Defaults to AZS-1ES-L-MMS-ubuntu-22.04.
# pool: # Alternative object format (required for 1ES if specifying os)
#   name: AZS-1ES-L-MMS-ubuntu-22.04
#   os: linux # Operating system: "linux" or "windows". Defaults to "linux".
repos: # compact repository declarations (replaces repositories: + checkout:)
  - my-org/my-repo # shorthand: alias="my-repo", type=git, ref=refs/heads/main, checkout=true
  - reponame=my-org/another-repo # shorthand with explicit alias
  - name: my-org/templates # object form for full control
    ref: refs/heads/release/2.x
    checkout: false # declared as resource only, not checked out by the agent
tools: # optional tool configuration
  bash: ["cat", "ls", "grep"] # explicit bash allow-list; when omitted, all bash tools are allowed (unrestricted)
  edit: true # enable file editing tool (default: true)
  cache-memory: true # persistent memory across runs (see docs/tools.md)
  # cache-memory: # Alternative object format (with options)
  #   allowed-extensions: [.md, .json]
  azure-devops: true # first-class ADO MCP integration (see docs/tools.md)
  # azure-devops: # Alternative object format (with scoping)
  #   toolsets: [repos, wit]
  #   allowed: [wit_get_work_item]
  #   org: myorg
runtimes: # optional runtime configuration (language environments)
  lean: true # Lean 4 theorem prover (see docs/runtimes.md)
  # lean: # Alternative object format (with toolchain pinning)
  #   toolchain: "leanprover/lean4:v4.29.1"
  # python: true # Python runtime -- auto-installs via UsePythonVersion@0 (see docs/runtimes.md)
  # python: # Alternative object format (pin version, configure internal feed)
  #   version: "3.12"
  #   feed-url: "https://pkgs.dev.azure.com/myorg/_packaging/myfeed/pypi/simple/"
  # node: true # Node.js runtime -- auto-installs via NodeTool@0 (see docs/runtimes.md)
  # node: # Alternative object format (pin version, configure internal feed)
  #   version: "22.x"
  #   feed-url: "https://pkgs.dev.azure.com/ORG/PROJECT/_packaging/FEED/npm/registry/"
  # dotnet: true # .NET runtime -- auto-installs via UseDotNet@2 (see docs/runtimes.md)
  # dotnet: # Alternative object format (pin version, configure internal feed via nuget.config)
  #   version: "8.0.x" # use "global.json" to pin from the repo's global.json
  #   feed-url: "https://pkgs.dev.azure.com/myorg/_packaging/myfeed/nuget/v3/index.json"
# env: # RESERVED: workflow-level environment variables (not yet implemented)
#   CUSTOM_VAR: "value"
mcp-servers:
  my-custom-tool: # containerized MCP server (requires container field)
    container: "node:20-slim"
    entrypoint: "node"
    entrypoint-args: ["path/to/mcp-server.js"]
    allowed:
      - custom_function_1
      - custom_function_2
safe-outputs: # optional per-tool configuration for safe outputs
  create-work-item:
    work-item-type: Task
    assignee: "user@example.com"
    tags:
      - automated
      - agent-created
    artifact-link: # optional: link work item to repository branch
      enabled: true
      branch: main
on: # trigger configuration (unified under on: key)
  schedule: daily around 14:00 # fuzzy schedule - see docs/schedule-syntax.md
  pipeline:
    name: "Build Pipeline" # source pipeline name
    project: "OtherProject" # optional: project name if different
    branches: # optional: branches to trigger on
      - main
      - release/*
    filters: # optional runtime filters (compiled to gate step)
      source-pipeline: "Build*"
      time-window:
        start: "09:00"
        end: "17:00"
  pr: # PR trigger
    branches:
      include: [main]
    paths:
      include: [src/*]
    filters: # runtime PR filters (compiled to gate step)
      title: "*[review]*"
      author:
        include: ["alice@corp.com"]
      draft: false
      labels:
        any-of: ["run-agent"]
      source-branch: "feature/*"
      target-branch: "main"
      commit-message: "*[skip-agent]*"
      changed-files:
        include: ["src/**/*.rs"]
        min-changes: 5
        max-changes: 100
      time-window:
        start: "09:00"
        end: "17:00"
      build-reason:
        include: [PullRequest]
      expression: "eq(variables['Custom.Flag'], 'true')" # raw ADO condition
steps: # inline steps before agent runs (same job, generate context)
  - bash: echo "Preparing context for agent"
    displayName: "Prepare context"
post-steps: # inline steps after agent runs (same job, process artifacts)
  - bash: echo "Processing agent outputs"
    displayName: "Post-steps"
setup: # separate job BEFORE agentic task
  - bash: echo "Setup job step"
    displayName: "Setup step"
teardown: # separate job AFTER safe outputs processing
  - bash: echo "Teardown job step"
    displayName: "Teardown step"
network: # optional network policy (standalone target only)
  allowed: # allowed host patterns and/or ecosystem identifiers
    - python # ecosystem identifier -- expands to Python/PyPI domains
    - "*.mycompany.com" # raw domain pattern
  blocked: # blocked host patterns or ecosystems (removes from allow list)
    - "evil.example.com"
permissions: # optional ADO access token configuration
  read: my-read-arm-connection # ARM service connection for read-only ADO access (Stage 1 agent)
  write: my-write-arm-connection # ARM service connection for write ADO access (Stage 3 executor only)
parameters: # optional ADO runtime parameters (surfaced in UI when queuing a run)
  - name: clearMemory
    displayName: "Clear agent memory"
    type: boolean
    default: false
---

## Build and Test

Build the project and run all tests...
```

## Workspace Defaults

The `workspace:` field controls which directory the agent runs in. When it is not set explicitly, the compiler chooses a default based on which repositories are checked out (entries in `repos:` with `checkout: true`, which is the default):

- If no additional repositories are checked out (i.e. only the pipeline's own repository is checked out via the implicit `self`), `workspace:` defaults to `root` and the agent runs in the root of the pipeline's working directory.
- If one or more additional repositories are checked out, `workspace:` defaults to `repo` and the agent runs inside the trigger repository's directory.

Set `workspace:` explicitly to `root`, `repo` (alias `self`), or a specific checked-out repository alias to override this behavior.
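The default-resolution rule above can be sketched in a few lines; `default_workspace` is a hypothetical helper for illustration, not the compiler's actual API:

```python
def default_workspace(repos):
    """Resolve the workspace: default from the repos: list.

    `repos` is a list of entry dicts; `checkout` defaults to True,
    matching the repos: semantics described in this document.
    """
    additional_checked_out = [r for r in repos if r.get("checkout", True)]
    # "root" when only the implicit self repo is checked out, "repo" otherwise.
    return "repo" if additional_checked_out else "root"
```

A resource-only entry (`checkout: false`) does not count as checked out, so it leaves the default at `root`.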

The `repos:` field provides a compact way to declare additional repository resources and control which ones the agent checks out. It replaces the legacy `repositories:` + `checkout:` pair.

Each entry can be:

| Form | Syntax | Description |
| --- | --- | --- |
| Shorthand | `- org/repo` | Alias derived from the last segment; `type=git`, `ref=refs/heads/main`, `checkout=true` |
| Shorthand with alias | `- alias=org/repo` | Explicit alias before the `=` |
| Object | `- name: org/repo` | Full control over all fields |

Object fields:

| Field | Default | Description |
| --- | --- | --- |
| `name` | (required) | Full `org/repo` name (maps to ADO `name:`) |
| `alias` | last segment of `name` | Repository alias (maps to ADO `repository:`) |
| `type` | `git` | ADO repository resource type |
| `ref` | `refs/heads/main` | Branch or tag reference |
| `checkout` | `true` | Whether the agent job clones this repo |

Three repos, all checked out (most common case):

```yaml
repos:
  - my-org/tools
  - my-org/schemas
  - my-org/docs
```

Mixed: two checked out, one resource-only (used by templates):

```yaml
repos:
  - my-org/tools
  - my-org/schemas
  - name: my-org/pipeline-templates
    checkout: false
```

Custom ref and explicit alias:

```yaml
repos:
  - name: my-org/docs
    alias: docs-v2
    ref: refs/heads/release/2.x
```
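The shorthand normalization described in the tables above can be sketched as follows; `parse_repo_entry` is a hypothetical helper showing the expansion rules, not the compiler's internals:

```python
def parse_repo_entry(entry):
    """Expand a repos: entry (string shorthand or object) into full object form."""
    if isinstance(entry, str):
        if "=" in entry:                        # "alias=org/repo" shorthand
            alias, name = entry.split("=", 1)
        else:                                   # "org/repo" shorthand
            name = entry
            alias = name.rsplit("/", 1)[-1]     # alias from last path segment
        entry = {"name": name, "alias": alias}
    # Fill defaults for the object form (see the field table above).
    entry.setdefault("alias", entry["name"].rsplit("/", 1)[-1])
    entry.setdefault("type", "git")
    entry.setdefault("ref", "refs/heads/main")
    entry.setdefault("checkout", True)
    return entry
```

For example, `parse_repo_entry("reponame=my-org/another-repo")` yields an entry with alias `reponame`, while the plain `"my-org/my-repo"` form derives the alias `my-repo`.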

The legacy `repositories:` + `checkout:` fields are auto-converted to `repos:` by the `repos_unified` codemod. On the next `ado-aw` compile, any source that still uses the legacy fields is rewritten in place to the new shape: each `repositories:` entry becomes a `repos:` entry, with `checkout: false` added for entries that weren't listed under `checkout:`. Mixing the legacy fields with an existing `repos:` block is rejected; pick one shape.

The compiler validates filter configurations at compile time and will emit errors for impossible or conflicting combinations:

| Condition | Severity | Message |
| --- | --- | --- |
| `min-changes` > `max-changes` | Error | No PR can satisfy both constraints |
| `time-window.start` = `time-window.end` | Error | Zero-width window never matches |
| Same value in `author.include` and `author.exclude` | Error | Conflicting include/exclude |
| Same value in `build-reason.include` and `build-reason.exclude` | Error | Conflicting include/exclude |
| Label in both `labels.any-of` and `labels.none-of` | Error | Label both required and blocked |
| Label in both `labels.all-of` and `labels.none-of` | Error | Label both required and blocked |
| Empty `labels` filter (no `any-of`/`all-of`/`none-of`) | Warning | No label checks applied |

Errors cause compilation to fail. Fix the conflicting filter configuration before recompiling.
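A subset of these checks can be sketched as follows; `validate_filters` is a hypothetical illustration of the table's rules (the `build-reason` check, analogous to the `author` one, is omitted for brevity):

```python
def validate_filters(f):
    """Collect compile-time errors and warnings for a filters: mapping."""
    errors, warnings = [], []
    cf = f.get("changed-files", {})
    if cf.get("min-changes", 0) > cf.get("max-changes", float("inf")):
        errors.append("No PR can satisfy both constraints")
    tw = f.get("time-window", {})
    if tw and tw.get("start") == tw.get("end"):
        errors.append("Zero-width window never matches")
    author = f.get("author", {})
    if set(author.get("include", [])) & set(author.get("exclude", [])):
        errors.append("Conflicting include/exclude")
    labels = f.get("labels")
    if labels is not None:
        any_of = set(labels.get("any-of", []))
        all_of = set(labels.get("all-of", []))
        none_of = set(labels.get("none-of", []))
        if (any_of | all_of) & none_of:
            errors.append("Label both required and blocked")
        if not (any_of or all_of or none_of):
            warnings.append("No label checks applied")
    return errors, warnings
```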

Time windows use half-open intervals: `[start, end)`. A window of `start: "09:00"`, `end: "17:00"` matches from 09:00 up to but not including 17:00. A build triggered at exactly 17:00 UTC will not match.

Overnight windows are supported: `start: "22:00"`, `end: "06:00"` matches from 22:00 through midnight to 05:59.

All times are evaluated in UTC.
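The half-open and overnight semantics can be expressed compactly; `in_time_window` below is a hypothetical helper, not the gate step's actual code (lexicographic comparison of zero-padded `"HH:MM"` strings matches chronological order):

```python
def in_time_window(now, start, end):
    """Half-open [start, end) check on "HH:MM" strings, evaluated in UTC."""
    if start <= end:
        return start <= now < end      # same-day window
    return now >= start or now < end   # overnight window, wraps past midnight
```

So `in_time_window("17:00", "09:00", "17:00")` is false (the end is exclusive), while `in_time_window("23:30", "22:00", "06:00")` is true.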

The `changed-files` filter checks the list of files modified in the PR. If the PR has no changed files (an empty diff) and an `include` pattern is set, the filter will not match. An exclude-only filter (no `include`) with no changed files passes vacuously, since no excluded files are present.
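These include/exclude semantics, including the vacuous-pass case, can be sketched as follows. This is an illustration only: it uses `fnmatch`, whose `*` crosses path separators, as a stand-in for the compiler's real glob matching of patterns like `src/**/*.rs`.

```python
from fnmatch import fnmatch

def changed_files_match(files, include=None, exclude=None):
    """Evaluate a changed-files filter over a PR's modified-file list."""
    if include is not None:
        # An empty diff can never satisfy an include pattern.
        if not any(fnmatch(f, p) for f in files for p in include):
            return False
    if exclude is not None:
        # Exclude-only with no files passes vacuously: nothing excluded is present.
        if any(fnmatch(f, p) for f in files for p in exclude):
            return False
    return True
```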

The `expression` field on `pr.filters` and `pipeline.filters` is an advanced, unsafe escape hatch. Its value is inserted verbatim into the agent job's ADO `condition:` field, so it can reference any ADO pipeline variable, including secrets. The compiler validates against `##vso[` injection and `${{` template markers, but otherwise trusts the value. Only use it when the built-in filters are insufficient.

The filter gate step uses `System.AccessToken` for self-cancellation (a PATCH to the builds REST API) and for PR metadata retrieval. This requires that:

1. "Allow scripts to access the OAuth token" is enabled on the pipeline definition in ADO (Project Settings -> Pipelines -> Settings).
2. The pipeline's build service account has permission to cancel builds.

If the token is unavailable, the gate step logs a warning and the build completes as “Succeeded” (with the agent job skipped via condition) rather than “Cancelled”.
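The self-cancellation call can be sketched as building a PATCH against the Azure DevOps Builds API. This is an assumption-laden illustration: the URL shape and `{"status": "cancelling"}` body follow the public Build REST API, but the gate step's actual request (and its `api-version`) may differ.

```python
import json
import urllib.request

def cancel_build_request(collection_uri, project, build_id, token):
    """Build (but do not send) the self-cancellation PATCH request.

    collection_uri: e.g. the System.CollectionUri value, with trailing slash.
    token: the System.AccessToken OAuth token.
    """
    url = (f"{collection_uri}{project}/_apis/build/builds/"
           f"{build_id}?api-version=7.1")
    return urllib.request.Request(
        url,
        data=json.dumps({"status": "cancelling"}).encode(),
        method="PATCH",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    # A caller would urlopen() this and, on failure, log a warning and fall
    # back to skipping the agent job via condition, as described above.
```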