CI/CD for Azure Data Factory and Synapse with Azure DevOps: The Complete Enterprise Guide

In my previous post on CI/CD with GitHub, I covered how to deploy ADF pipelines using GitHub Actions. That works great for personal projects and small teams. But in enterprise environments — especially at consulting firms like Capgemini, Accenture, and Deloitte — the standard tool is Azure DevOps.

Why? Because enterprises already pay for Azure DevOps through their Microsoft licensing, it integrates tightly with Azure Boards for project management, and security teams prefer keeping everything within the Microsoft ecosystem.

This post covers the complete Azure DevOps CI/CD workflow for ADF and Synapse — from creating your DevOps organization to building multi-stage YAML pipelines that deploy across Dev, UAT, and Production with approval gates.

If you already read the GitHub post, the core concepts (ARM templates, parameterization, service principals) are identical. Only the pipeline syntax and platform differ.

Table of Contents

  • Azure DevOps vs GitHub: When to Use Which
  • Azure DevOps Components You Need to Know
  • The End-to-End Workflow
  • Step 1: Create an Azure DevOps Organization and Project
  • Step 2: Create an Azure Repos Git Repository
  • Step 3: Connect ADF/Synapse to Azure Repos
  • Step 4: The Branching and Pull Request Workflow
  • Step 5: Publish and Generate ARM Templates
  • Step 6: Create a Service Connection
  • Step 7: Create Environment-Specific Parameter Files
  • Step 8: Build the CI/CD Pipeline (YAML)
  • Step 9: Pre/Post Deployment Scripts
  • Step 10: Add Approval Gates
  • The Complete Multi-Stage YAML Pipeline
  • Classic Release Pipelines vs YAML Pipelines
  • Variable Groups and Library
  • Branch Policies and Pull Request Rules
  • Connecting ADF to Azure DevOps (Step-by-Step UI Walkthrough)
  • Connecting Synapse to Azure DevOps (Step-by-Step UI Walkthrough)
  • Troubleshooting Common DevOps CI/CD Issues
  • Real-World Enterprise Patterns
  • Interview Questions
  • Wrapping Up

Azure DevOps vs GitHub: When to Use Which

Aspect | Azure DevOps | GitHub
------ | ------------ | ------
Cost | Free for up to 5 users; included in enterprise Microsoft licensing | Free for public repos; paid for private team features
Project management | Azure Boards (built-in Kanban, sprints, work items) | GitHub Issues + Projects (lighter)
Git hosting | Azure Repos | GitHub Repos
CI/CD | Azure Pipelines (YAML or Classic) | GitHub Actions
Artifact management | Azure Artifacts (NuGet, npm, Maven) | GitHub Packages
Test management | Azure Test Plans | Third-party integrations
Enterprise preference | Most large enterprises, consulting firms | Startups, open-source, smaller teams
Azure integration | Tighter (same Microsoft ecosystem) | Strong (via GitHub Actions azure/* actions)

Bottom line: Use Azure DevOps when your company already uses it or requires it. Use GitHub for personal projects and open-source. The CI/CD concepts are identical — only the YAML syntax differs.

Azure DevOps Components You Need to Know

Component | What It Does | Equivalent in GitHub
--------- | ------------ | --------------------
Azure Repos | Git repository hosting | GitHub Repos
Azure Pipelines | CI/CD pipeline execution | GitHub Actions
Azure Boards | Project management (sprints, work items) | GitHub Issues
Azure Artifacts | Package management | GitHub Packages
Azure Test Plans | Test case management | No direct equivalent

For ADF/Synapse CI/CD, you primarily use Azure Repos (to store pipeline code) and Azure Pipelines (to deploy ARM templates).

The End-to-End Workflow

Developer works in Dev ADF/Synapse (connected to Azure Repos)
  |
  |-- Creates feature branch in ADF Studio
  |-- Builds/modifies pipelines, datasets, linked services
  |-- Changes auto-saved as JSON to Azure Repos
  |
  |-- Creates Pull Request in Azure DevOps
  |-- Team reviews code changes
  |-- PR approved and merged to main branch
  |
  |-- Developer clicks Publish in ADF Studio
  |-- ARM templates generated in adf_publish branch
  |
  v
Azure Pipeline triggers (watches adf_publish branch)
  |
  |-- Stage 1: Deploy to UAT
  |   |-- Stop UAT triggers
  |   |-- Deploy ARM template with UAT parameters
  |   |-- Start UAT triggers
  |   |-- Run smoke tests
  |
  |-- Manual Approval Gate (team lead approves)
  |
  |-- Stage 2: Deploy to Production
  |   |-- Stop Prod triggers
  |   |-- Deploy ARM template with Prod parameters
  |   |-- Start Prod triggers
  |   |-- Verify deployment

Step 1: Create an Azure DevOps Organization and Project

Create Organization

  1. Go to dev.azure.com
  2. Sign in with your Microsoft account (or Azure AD account)
  3. Click New organization (if you do not have one)
  4. Organization name: naveen-devops (or your company name)
  5. Region: Canada Central (closest to you)
  6. Click Continue

Create Project

  1. Click + New project
  2. Project name: DataPlatform (or your project name)
  3. Visibility: Private
  4. Version control: Git
  5. Work item process: Agile (or Scrum — does not matter for CI/CD)
  6. Click Create

You now have a DevOps project with Repos, Pipelines, Boards, and Artifacts.

Step 2: Create an Azure Repos Git Repository

Your project comes with a default repository. You can use it or create a new one:

  1. Go to Repos > Files in the left sidebar
  2. If using the default repo, it is named after your project
  3. To create a new repo: Repos > + New repository
  4. Name: adf-pipelines (for ADF) or synapse-pipelines (for Synapse)
  5. Add a README: Yes
  6. Click Create

Repository Structure (After ADF Connection)

Once ADF/Synapse is connected, the repo will contain:

adf-pipelines/
  |-- pipeline/
  |   |-- PL_Copy_SqlToADLS.json
  |   |-- PL_Copy_SqlToADLS_Parquet_WithAudit.json
  |   |-- PL_IncrementalLoad.json
  |   |-- PL_Cleanup_OldFiles.json
  |-- dataset/
  |   |-- DS_SqlDB_Metadata.json
  |   |-- DS_SqlDB_SourceTable.json
  |   |-- DS_ADLS_Sink_Parquet.json
  |-- linkedService/
  |   |-- LS_AzureSqlDB.json
  |   |-- LS_ADLS_Gen2.json
  |   |-- LS_KeyVault.json
  |-- trigger/
  |   |-- TR_Daily_2AM.json
  |-- integrationRuntime/
  |   |-- IR_SelfHosted.json
  |-- publish_config.json (or factory/ for ADF)

Step 3: Connect ADF/Synapse to Azure Repos

For Azure Data Factory

  1. Open ADF Studio (adf.azure.com)
  2. Click Manage tab (wrench icon)
  3. Click Git configuration under Source control
  4. Click Configure
  5. Select Azure DevOps Git as the repository type
  6. Configure:
     • Azure DevOps organization: naveen-devops
     • Project: DataPlatform
     • Repository: adf-pipelines
     • Collaboration branch: main
     • Publish branch: adf_publish (auto-created when you first publish)
     • Root folder: /
     • Import existing resources: Yes (imports current pipelines into the repo)
  7. Click Apply

For Azure Synapse

  1. Open Synapse Studio
  2. Click Manage tab
  3. Click Git configuration
  4. Select Azure DevOps Git
  5. Same configuration as ADF (different repo name if desired)
  6. Publish branch: workspace_publish
  7. Click Apply

What Changes in the UI

After connecting:
  • A branch dropdown appears in the top toolbar
  • You see your current branch (e.g., main)
  • A Publish button appears (replacing the old Save All)
  • Changes are auto-saved to the Git branch, not directly to the live service

Step 4: The Branching and Pull Request Workflow

Creating a Feature Branch

  1. In ADF/Synapse Studio, click the branch dropdown
  2. Click + New branch
  3. Name: feature/incremental-load (descriptive name)
  4. Base: main
  5. Make your changes (create/edit pipelines, datasets)
  6. Changes are auto-committed to your feature branch

Creating a Pull Request

  1. Click the branch dropdown > Create pull request
  2. This opens Azure DevOps in your browser
  3. Configure:
     • Title: “Add incremental load pipeline with watermark pattern”
     • Description: Explain what changed and why
     • Reviewers: Add your team members
     • Work items: Link to Azure Boards work items (optional)
  4. Click Create

Reviewing a Pull Request

Reviewers see:
  • Every JSON file that changed (pipeline definitions, dataset configs)
  • A line-by-line diff showing exactly what was added, modified, or removed
  • Inline comments that reviewers can add on specific changes

After review:
  1. The reviewer clicks Approve
  2. The author clicks Complete > Complete merge
  3. Choose Squash merge (recommended: creates a clean single commit)
  4. Delete the feature branch (keeps the repo clean)

Step 5: Publish and Generate ARM Templates

After merging to main:

  1. Switch to the main branch in ADF/Synapse Studio
  2. Click Publish
  3. ADF/Synapse validates all resources
  4. If valid, it generates ARM templates and pushes them to the publish branch:
     • ADF: adf_publish branch
     • Synapse: workspace_publish branch

Publish Branch Contents

For ADF:

adf_publish/
  |-- ARMTemplateForFactory.json              (all resource definitions)
  |-- ARMTemplateParametersForFactory.json    (parameterized values)
  |-- linkedTemplates/                         (if factory is large, split into files)
  |   |-- ArmTemplate_0.json
  |   |-- ArmTemplate_1.json
  |   |-- ArmTemplateParameters_master.json

For Synapse:

workspace_publish/
  |-- TemplateForWorkspace.json
  |-- TemplateParametersForWorkspace.json

These ARM templates are the deployment artifacts that the CI/CD pipeline picks up.
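The generated templates are plain JSON, so you can inspect them before wiring up the pipeline. The sketch below is illustrative: it uses a trimmed-down inline stand-in for ARMTemplateForFactory.json (the real file is produced when you click Publish) and lists the parameter names that your environment files in Step 7 must cover.

```python
import json

# Trimmed-down, hypothetical stand-in for ARMTemplateForFactory.json;
# the real file is generated by ADF on Publish.
template_text = """
{
    "parameters": {
        "factoryName": {"type": "string"},
        "LS_AzureSqlDB_connectionString": {"type": "secureString"},
        "LS_ADLS_Gen2_properties_typeProperties_url": {"type": "string"}
    },
    "resources": []
}
"""

template = json.loads(template_text)

# These are the knobs your uat/prod parameter files can override.
for name, spec in sorted(template["parameters"].items()):
    print(f"{name} ({spec['type']})")
```

Anything typed secureString (connection strings, keys) is exactly what the environment-specific parameter files in the next step should override.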

Step 6: Create a Service Connection

Azure Pipelines needs permission to deploy to your Azure subscription. This uses a Service Connection (which creates a Service Principal behind the scenes).

  1. In Azure DevOps, go to Project Settings (bottom-left gear icon)
  2. Click Service connections under Pipelines
  3. Click New service connection
  4. Select Azure Resource Manager > Next
  5. Select Service principal (automatic) > Next
  6. Configure:
     • Scope level: Subscription
     • Subscription: Select your Azure subscription
     • Resource group: leave blank (for subscription-level access)
     • Service connection name: azure-subscription-connection
     • Grant access permission to all pipelines: Check
  7. Click Save

This creates a Service Principal in your Azure AD with Contributor access to the subscription. The pipeline uses this to deploy ARM templates.

Granting Additional Permissions

The Service Principal needs the Contributor role on the target resource groups. The automatic setup above grants subscription-level access, which lets it deploy anywhere in the subscription. For production, restrict it to specific resource groups:

# Grant Contributor on a specific resource group only
az role assignment create \
    --assignee <service-principal-id> \
    --role Contributor \
    --scope /subscriptions/<sub-id>/resourceGroups/rg-prod

Step 7: Create Environment-Specific Parameter Files

Create parameter override files for each environment. Keep the source of truth in your main branch, but note that the deployment pipeline triggers on and checks out adf_publish — either commit the cicd/ folder to that branch as well, or add an explicit checkout of main to the pipeline so the files exist at deploy time.

Create a folder cicd/ in your repo:

adf-pipelines/
  |-- cicd/
  |   |-- uat.parameters.json
  |   |-- prod.parameters.json
  |   |-- pre-deploy.ps1
  |   |-- post-deploy.ps1
  |-- pipeline/
  |-- dataset/
  |-- linkedService/

uat.parameters.json:

{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "adf-dataplatform-uat"
        },
        "LS_AzureSqlDB_connectionString": {
            "value": "integrated security=False;data source=sql-dataplatform-uat.database.windows.net;initial catalog=AdventureWorksLT-uat;encrypt=True"
        },
        "LS_ADLS_Gen2_properties_typeProperties_url": {
            "value": "https://uatdatalake.dfs.core.windows.net"
        },
        "LS_KeyVault_properties_typeProperties_baseUrl": {
            "value": "https://kv-dataplatform-uat.vault.azure.net/"
        }
    }
}

prod.parameters.json:

{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "adf-dataplatform-prod"
        },
        "LS_AzureSqlDB_connectionString": {
            "value": "integrated security=False;data source=sql-dataplatform-prod.database.windows.net;initial catalog=AdventureWorksLT-prod;encrypt=True"
        },
        "LS_ADLS_Gen2_properties_typeProperties_url": {
            "value": "https://proddatalake.dfs.core.windows.net"
        },
        "LS_KeyVault_properties_typeProperties_baseUrl": {
            "value": "https://kv-dataplatform-prod.vault.azure.net/"
        }
    }
}

Important: Never put passwords in parameter files. Use Key Vault references in your linked services, and each environment’s Key Vault has its own secrets.
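That rule is easy to enforce in CI before any deployment. The sketch below is a hypothetical helper (check_parameter_files is not part of ADF or Azure DevOps): it compares two parameter files so UAT and Prod never drift apart, and rejects values that look like inline credentials. The dict literals stand in for json.load on the files in cicd/.

```python
# Hypothetical pre-deployment sanity check. The dicts stand in for
# json.load(open("cicd/uat.parameters.json")) and the prod equivalent.
uat = {"parameters": {
    "factoryName": {"value": "adf-dataplatform-uat"},
    "LS_KeyVault_properties_typeProperties_baseUrl":
        {"value": "https://kv-dataplatform-uat.vault.azure.net/"},
}}
prod = {"parameters": {
    "factoryName": {"value": "adf-dataplatform-prod"},
    "LS_KeyVault_properties_typeProperties_baseUrl":
        {"value": "https://kv-dataplatform-prod.vault.azure.net/"},
}}

def check_parameter_files(uat: dict, prod: dict) -> list[str]:
    problems = []
    # Both environments should override the same set of parameters.
    if set(uat["parameters"]) != set(prod["parameters"]):
        problems.append("uat and prod parameter files have drifted apart")
    # Reject values that look like inline credentials.
    for env_name, env in (("uat", uat), ("prod", prod)):
        for name, entry in env["parameters"].items():
            value = str(entry.get("value", "")).lower()
            if "password=" in value or "accountkey=" in value:
                problems.append(f"{env_name}: {name} embeds a credential")
    return problems

print(check_parameter_files(uat, prod))  # [] means safe to deploy
```

A short script step running a check like this at the top of each stage fails fast, before triggers are stopped or anything is deployed.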

Step 8: Build the CI/CD Pipeline (YAML)

Create a file azure-pipelines.yml in your repo:

Basic Single-Stage Pipeline (Deploy to UAT)

trigger:
  branches:
    include:
      - adf_publish

pool:
  vmImage: 'ubuntu-latest'

variables:
  - name: azureSubscription
    value: 'azure-subscription-connection'
  - name: subscriptionId
    value: 'your-subscription-id'
  - name: resourceGroupUAT
    value: 'rg-dataplatform-uat'
  - name: adfNameUAT
    value: 'adf-dataplatform-uat'

stages:
  - stage: DeployToUAT
    displayName: 'Deploy to UAT'
    jobs:
      - deployment: DeployADF
        displayName: 'Deploy ADF to UAT'
        environment: 'UAT'
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self

                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Deploy ARM Template'
                  inputs:
                    deploymentScope: 'Resource Group'
                    azureResourceManagerConnection: '$(azureSubscription)'
                    subscriptionId: '$(subscriptionId)'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '$(resourceGroupUAT)'
                    location: 'Canada Central'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Build.SourcesDirectory)/ARMTemplateForFactory.json'
                    csmParametersFile: '$(Build.SourcesDirectory)/cicd/uat.parameters.json'
                    deploymentMode: 'Incremental'

Key YAML Concepts

Concept | What It Does
------- | ------------
trigger | Which branch triggers the pipeline
pool | Which VM image runs the pipeline (ubuntu-latest)
variables | Reusable values across the pipeline
stages | Major phases (UAT, Prod)
jobs | Units of work within a stage
deployment | Special job type that tracks environments
environment | Named target (UAT, Production) with approval gates
steps | Individual tasks (checkout, deploy, script)

Step 9: Pre/Post Deployment Scripts

Pre-Deployment: Stop Triggers

Create cicd/pre-deploy.ps1:

param(
    [string]$ResourceGroupName,
    [string]$DataFactoryName
)

Write-Host "Stopping triggers in $DataFactoryName..."

$triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName

foreach ($trigger in $triggers) {
    if ($trigger.RuntimeState -eq "Started") {
        Write-Host "Stopping trigger: $($trigger.Name)"
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName `
            -DataFactoryName $DataFactoryName -Name $trigger.Name -Force
    }
}

Write-Host "All triggers stopped."

Post-Deployment: Start Triggers

Create cicd/post-deploy.ps1:

param(
    [string]$ResourceGroupName,
    [string]$DataFactoryName
)

Write-Host "Starting triggers in $DataFactoryName..."

$triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName

foreach ($trigger in $triggers) {
    if ($trigger.RuntimeState -eq "Stopped") {
        Write-Host "Starting trigger: $($trigger.Name)"
        Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName `
            -DataFactoryName $DataFactoryName -Name $trigger.Name -Force
    }
}

Write-Host "All triggers started."

Adding Scripts to the Pipeline

steps:
  - task: AzurePowerShell@5
    displayName: 'Stop Triggers'
    inputs:
      azureSubscription: '$(azureSubscription)'
      ScriptPath: '$(Build.SourcesDirectory)/cicd/pre-deploy.ps1'
      ScriptArguments: '-ResourceGroupName $(resourceGroupUAT) -DataFactoryName $(adfNameUAT)'
      azurePowerShellVersion: 'LatestVersion'

  - task: AzureResourceManagerTemplateDeployment@3
    displayName: 'Deploy ARM Template'
    inputs:
      # ... (same as before)

  - task: AzurePowerShell@5
    displayName: 'Start Triggers'
    inputs:
      azureSubscription: '$(azureSubscription)'
      ScriptPath: '$(Build.SourcesDirectory)/cicd/post-deploy.ps1'
      ScriptArguments: '-ResourceGroupName $(resourceGroupUAT) -DataFactoryName $(adfNameUAT)'
      azurePowerShellVersion: 'LatestVersion'

Step 10: Add Approval Gates

Create Environments with Approvals

  1. Go to Pipelines > Environments
  2. Click New environment
  3. Name: Production
  4. Resource: None (we just need the approval gate)
  5. Click Create
  6. Click the three dots on the Production environment > Approvals and checks
  7. Click + Add check > Approvals
  8. Add approvers (yourself, team lead, or a group)
  9. Set instructions: “Verify UAT deployment was successful before approving production”
  10. Click Create

Now when the pipeline reaches the Production stage, it pauses and waits for approval.

The Complete Multi-Stage YAML Pipeline

Here is the full production-ready pipeline:

trigger:
  branches:
    include:
      - adf_publish

pool:
  vmImage: 'ubuntu-latest'

variables:
  - name: azureSubscription
    value: 'azure-subscription-connection'
  - name: subscriptionId
    value: 'your-subscription-id'

stages:
  # ========== UAT DEPLOYMENT ==========
  - stage: DeployToUAT
    displayName: 'Deploy to UAT'
    variables:
      - name: resourceGroup
        value: 'rg-dataplatform-uat'
      - name: adfName
        value: 'adf-dataplatform-uat'
      - name: parameterFile
        value: 'cicd/uat.parameters.json'
    jobs:
      - deployment: DeployUAT
        displayName: 'Deploy ADF to UAT'
        environment: 'UAT'
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self

                - task: AzurePowerShell@5
                  displayName: 'Stop UAT Triggers'
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptPath: '$(Build.SourcesDirectory)/cicd/pre-deploy.ps1'
                    ScriptArguments: >-
                      -ResourceGroupName $(resourceGroup)
                      -DataFactoryName $(adfName)
                    azurePowerShellVersion: 'LatestVersion'

                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Deploy ARM Template to UAT'
                  inputs:
                    deploymentScope: 'Resource Group'
                    azureResourceManagerConnection: '$(azureSubscription)'
                    subscriptionId: '$(subscriptionId)'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '$(resourceGroup)'
                    location: 'Canada Central'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Build.SourcesDirectory)/ARMTemplateForFactory.json'
                    csmParametersFile: '$(Build.SourcesDirectory)/$(parameterFile)'
                    deploymentMode: 'Incremental'

                - task: AzurePowerShell@5
                  displayName: 'Start UAT Triggers'
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptPath: '$(Build.SourcesDirectory)/cicd/post-deploy.ps1'
                    ScriptArguments: >-
                      -ResourceGroupName $(resourceGroup)
                      -DataFactoryName $(adfName)
                    azurePowerShellVersion: 'LatestVersion'

  # ========== PRODUCTION DEPLOYMENT ==========
  - stage: DeployToProd
    displayName: 'Deploy to Production'
    dependsOn: DeployToUAT
    condition: succeeded()
    variables:
      - name: resourceGroup
        value: 'rg-dataplatform-prod'
      - name: adfName
        value: 'adf-dataplatform-prod'
      - name: parameterFile
        value: 'cicd/prod.parameters.json'
    jobs:
      - deployment: DeployProd
        displayName: 'Deploy ADF to Production'
        environment: 'Production'   # Requires manual approval
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self

                - task: AzurePowerShell@5
                  displayName: 'Stop Prod Triggers'
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptPath: '$(Build.SourcesDirectory)/cicd/pre-deploy.ps1'
                    ScriptArguments: >-
                      -ResourceGroupName $(resourceGroup)
                      -DataFactoryName $(adfName)
                    azurePowerShellVersion: 'LatestVersion'

                - task: AzureResourceManagerTemplateDeployment@3
                  displayName: 'Deploy ARM Template to Prod'
                  inputs:
                    deploymentScope: 'Resource Group'
                    azureResourceManagerConnection: '$(azureSubscription)'
                    subscriptionId: '$(subscriptionId)'
                    action: 'Create Or Update Resource Group'
                    resourceGroupName: '$(resourceGroup)'
                    location: 'Canada Central'
                    templateLocation: 'Linked artifact'
                    csmFile: '$(Build.SourcesDirectory)/ARMTemplateForFactory.json'
                    csmParametersFile: '$(Build.SourcesDirectory)/$(parameterFile)'
                    deploymentMode: 'Incremental'

                - task: AzurePowerShell@5
                  displayName: 'Start Prod Triggers'
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptPath: '$(Build.SourcesDirectory)/cicd/post-deploy.ps1'
                    ScriptArguments: >-
                      -ResourceGroupName $(resourceGroup)
                      -DataFactoryName $(adfName)
                    azurePowerShellVersion: 'LatestVersion'

Classic Release Pipelines vs YAML Pipelines

Azure DevOps offers two ways to build CI/CD:

Feature | Classic (UI-based) | YAML (Code-based)
------- | ------------------ | -----------------
Definition | Drag-and-drop in the browser | Code in a YAML file
Version controlled | No (stored in DevOps only) | Yes (YAML file in your repo)
Reviewable | Cannot PR the pipeline definition | Pipeline changes go through PRs
Reusable | Copy/paste between projects | Templates and includes
Enterprise preference | Legacy projects | New projects (recommended)

Recommendation: Use YAML pipelines. They are version-controlled, reviewable in PRs, and the industry standard for new projects. Classic pipelines still work but Microsoft is investing in YAML going forward.

Variable Groups and Library

For sensitive values (subscription IDs, service principal secrets), use Variable Groups instead of hardcoding in YAML:

  1. Go to Pipelines > Library
  2. Click + Variable group
  3. Name: adf-deployment-vars
  4. Add variables:
     • subscriptionId: your subscription ID
     • adfNameUAT: adf-dataplatform-uat
     • adfNameProd: adf-dataplatform-prod
     • resourceGroupUAT: rg-dataplatform-uat
     • resourceGroupProd: rg-dataplatform-prod
  5. Mark sensitive values as secret (padlock icon)
  6. Click Save

Reference in YAML:

variables:
  - group: adf-deployment-vars

Linking to Azure Key Vault

For secrets, link the variable group directly to Key Vault:

  1. In the variable group, toggle Link secrets from an Azure key vault
  2. Select your Key Vault
  3. Select which secrets to import
  4. Secrets are fetched at pipeline runtime — never stored in DevOps

Branch Policies and Pull Request Rules

Protect the main branch:

  1. Go to Repos > Branches
  2. Click the three dots next to main > Branch policies
  3. Configure:
     • Require a minimum number of reviewers: 1 (or more)
     • Check for linked work items: Optional
     • Check for comment resolution: Required
     • Limit merge types: Squash merge only (keeps history clean)
     • Build validation: Add a build pipeline that validates the JSON files

This ensures no untested or unreviewed code reaches the main branch.
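The build-validation policy needs something to run. One possible approach (validate_adf_repo is an assumed helper script, not a Microsoft-provided tool) is a short Python check that walks the repo layout from Step 2 and fails the PR if any dataset references a linked service that does not exist — a common cause of the "linked service reference not found" deployment error. The demo builds a throwaway repo so the script is runnable as-is:

```python
import json
import tempfile
from pathlib import Path

def validate_adf_repo(repo_root: str) -> list[str]:
    """Check that every dataset references an existing linked service.

    Assumes the dataset/ and linkedService/ folders that ADF creates
    in the repo (see Step 2).
    """
    root = Path(repo_root)
    linked_services = {
        json.loads(p.read_text())["name"]
        for p in (root / "linkedService").glob("*.json")
    }
    errors = []
    for p in sorted((root / "dataset").glob("*.json")):
        dataset = json.loads(p.read_text())
        ref = dataset["properties"]["linkedServiceName"]["referenceName"]
        if ref not in linked_services:
            errors.append(f"{p.name}: unknown linked service '{ref}'")
    return errors

# Demo against a throwaway repo containing one bad reference.
repo = Path(tempfile.mkdtemp())
(repo / "linkedService").mkdir()
(repo / "dataset").mkdir()
(repo / "linkedService" / "LS_AzureSqlDB.json").write_text(
    json.dumps({"name": "LS_AzureSqlDB"}))
(repo / "dataset" / "DS_Bad.json").write_text(json.dumps(
    {"name": "DS_Bad",
     "properties": {"linkedServiceName": {"referenceName": "LS_Missing"}}}))
print(validate_adf_repo(str(repo)))
```

Run it from a single script step in a small build pipeline, exit non-zero when the returned list is non-empty, and attach that pipeline as the branch policy's build validation.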

Connecting ADF to Azure DevOps (Step-by-Step UI Walkthrough)

  1. Open ADF Studio at adf.azure.com
  2. Click Manage (wrench icon) in the left sidebar
  3. Under Source control, click Git configuration
  4. Click Configure
  5. In the dialog:
     • Repository type: Azure DevOps Git
     • Azure Active Directory: your tenant (auto-detected)
     • Azure DevOps Account: naveen-devops
     • Project name: DataPlatform
     • Repository name: adf-pipelines
     • Collaboration branch: main
     • Publish branch: adf_publish
     • Root folder: /
     • Import existing resources to repository: checked
  6. Click Apply
  7. ADF imports all existing pipelines, datasets, linked services, and triggers as JSON files into the repo

After connecting, you will see:
  • A branch selector dropdown in the top bar
  • Your pipelines listed in the Author tab (same as before)
  • Every change you make auto-saved to the current Git branch

Connecting Synapse to Azure DevOps (Step-by-Step UI Walkthrough)

  1. Open Synapse Studio
  2. Click Manage in the left sidebar
  3. Click Git configuration
  4. Click Configure
  5. In the dialog:
     • Repository type: Azure DevOps Git
     • Azure Active Directory: your tenant
     • Azure DevOps Account: naveen-devops
     • Project name: DataPlatform
     • Repository name: synapse-pipelines
     • Collaboration branch: main
     • Publish branch: workspace_publish
     • Root folder: /
     • Import existing resources: checked
  6. Click Apply

Synapse difference: The publish branch is workspace_publish (not adf_publish), and the ARM template files are named TemplateForWorkspace.json instead of ARMTemplateForFactory.json. Update your YAML pipeline accordingly.

Troubleshooting Common DevOps CI/CD Issues

“Service connection does not have permission”

The Service Principal does not have Contributor role on the target resource group.

Fix: Go to the resource group’s Access Control (IAM) and add the Service Principal as Contributor.

“Template validation failed: linked service reference not found”

The ARM template references a linked service that does not exist in the target environment.

Fix: Ensure all linked services are included in the ARM template. Check that the parameter file overrides the correct linked service properties.

“Trigger cannot be started after deployment”

The trigger references a pipeline that failed to deploy.

Fix: Check the deployment logs for earlier errors. Fix the failed resources first, then restart triggers.

“YAML pipeline not triggering on adf_publish branch”

The trigger branch filter might not match.

Fix: Ensure the YAML file is in the main branch (or wherever your pipeline definition lives) and the trigger includes adf_publish:

trigger:
  branches:
    include:
      - adf_publish

“Publish button is grayed out”

You are not on the collaboration branch (main).

Fix: Switch to the main branch in ADF Studio, then click Publish.

“Parameter file path not found”

The YAML references a path that does not exist in the checked-out repo.

Fix: Verify the parameter file path matches the actual repo structure. With checkout: self, the repo lands in $(Build.SourcesDirectory), so build file paths from that variable.

Real-World Enterprise Patterns

Pattern 1: Separate Pipelines for Infrastructure and ADF

Pipeline 1 (Terraform/Bicep): Deploys Azure resources
  - Storage accounts, SQL databases, Key Vaults, ADF workspace itself

Pipeline 2 (ARM templates): Deploys ADF content
  - Pipelines, datasets, linked services, triggers
  - Depends on Pipeline 1 completing first

Pattern 2: Multiple Teams, Multiple Repos

Repo: adf-ingestion (Team A)
  - Pipelines for data ingestion from source systems

Repo: adf-transformation (Team B)
  - Pipelines for data transformation and modeling

Both deploy to the same ADF workspace via separate CI/CD pipelines

Pattern 3: Hotfix Pipeline

main branch: Normal development flow
hotfix branch: Emergency fixes that bypass UAT

Hotfix pipeline:
  - Deploys directly to Production (with senior approval)
  - Used only for critical production issues
  - Changes are back-merged to main afterward

Pattern 4: Rollback Strategy

If production deployment fails:
  1. Stop all triggers in Production
  2. Re-deploy the PREVIOUS ARM template from the last successful run
  3. Start triggers
  4. Investigate and fix in Dev

Azure DevOps keeps all previous pipeline runs and artifacts,
making rollback straightforward.

Interview Questions

Q: How do you set up CI/CD for Azure Data Factory using Azure DevOps? A: Connect the Dev ADF workspace to an Azure Repos Git repository. Developers work in feature branches, create PRs for review, merge to main, and click Publish to generate ARM templates in the adf_publish branch. An Azure Pipeline watches that branch and deploys the ARM templates to UAT (then Production after approval) using environment-specific parameter files.

Q: What is the difference between the collaboration branch and the publish branch? A: The collaboration branch (main) contains the source JSON files for all ADF resources — this is where developers merge their feature branches. The publish branch (adf_publish) contains the generated ARM templates that are used for deployment. You edit in the collaboration branch and deploy from the publish branch.

Q: How do you handle different configurations across environments? A: ADF automatically parameterizes linked service connection strings in the ARM template. You create environment-specific parameter files (uat.parameters.json, prod.parameters.json) that override these values during deployment. Secrets are stored in Azure Key Vault, with each environment having its own Key Vault.

Q: What is a Service Connection in Azure DevOps? A: A Service Connection stores the credentials (Service Principal) that Azure Pipelines uses to authenticate with Azure for deployments. It is created in Project Settings and referenced in the YAML pipeline. It eliminates the need to store Azure credentials in the pipeline code.

Q: Why do you stop triggers before deployment? A: To prevent pipelines from running during deployment. A trigger that fires while resources are being updated could run a half-deployed pipeline, causing failures or data corruption. Triggers are stopped in the pre-deployment script and restarted in the post-deployment script.

Q: What is the difference between Classic and YAML pipelines in Azure DevOps? A: Classic pipelines are built using a drag-and-drop UI in the browser. YAML pipelines are defined in code files stored in your repository. YAML is recommended because it is version-controlled, reviewable in PRs, and reusable across projects. Classic pipelines still work but are considered legacy for new projects.

Q: How do you implement approval gates for production deployment? A: Create an Environment called “Production” in Azure DevOps Pipelines, add Approvals and checks, and configure required approvers. When the YAML pipeline’s deployment job targets this environment, it pauses and waits for the designated approvers to approve before proceeding.

Q: How do you handle rollbacks in ADF CI/CD? A: Azure DevOps retains all previous pipeline run artifacts. To roll back, re-run the previous successful pipeline or manually redeploy the previous ARM template version. Always stop triggers before the rollback and restart them afterward.

Wrapping Up

CI/CD with Azure DevOps follows the same principles as GitHub Actions — ARM templates, parameter files, service principals, and trigger management. The main differences are the YAML syntax, Variable Groups for secrets, and the built-in Environments feature for approval gates.

If your company uses Azure DevOps, this is the workflow you will follow. If you already understand the GitHub Actions approach from my previous post, picking up Azure DevOps is straightforward — the concepts are identical.

The key takeaway: Whether you use GitHub or Azure DevOps, the underlying process is the same — connect Dev to Git, work in feature branches, publish ARM templates, deploy with CI/CD. Master the process, and the tool is interchangeable.

Related posts:
  • CI/CD with GitHub
  • What is Azure Data Factory?
  • ADF vs Synapse Comparison
  • Azure Fundamentals (IAM, Subscriptions)
  • Top 15 ADF Interview Questions

If this guide helped you set up CI/CD, share it with your team. Questions? Drop a comment below.


Naveen Vuppula is a Senior Data Engineering Consultant and app developer based in Ontario, Canada. He writes about Python, SQL, AWS, Azure, and everything data engineering at DriveDataScience.com.
