Azure Data Factory Training: Designing and Implementing Data Integration Solutions

Course 1231

  • Duration: 3 days
  • Labs: Yes
  • Language: English
  • Level: Intermediate

This course covers all key aspects of the Azure Data Factory v2 platform.

It is ideal for architects, developers, administrators, IT managers, and anyone else who wants to make the best possible use of this Azure service. Key areas covered include ADF v2 architecture, UI-based and automated data movement mechanisms, 10+ data transformation approaches, control-flow activities, reuse options, operational best practices, and a multi-tiered approach to ADF security.

Special attention is paid to the Azure services most commonly used with ADF v2 solutions, including Azure Data Lake Storage Gen2, Azure SQL Database, Azure Databricks, Azure Key Vault, Azure Functions, and a few others.

Six hands-on, instructor-led labs are included with the course, allowing students to practice applying ADF v2 concepts and preparing them for real-world Azure data integration projects.

Prerequisites: Microsoft Azure Fundamentals Training (AZ-900T00) or equivalent experience.

Azure Data Factory Training: Designing and Implementing Data Integration Solutions Delivery Methods

  • After-course instructor coaching benefit
  • Hands-on labs included

Azure Data Factory Training: Designing and Implementing Data Integration Solutions Course Benefits

  • Build end-to-end ETL and ELT solutions using Azure Data Factory v2
  • Architect, develop, and deploy sophisticated, high-performance, easy-to-maintain, and secure pipelines that integrate data from a variety of Azure and non-Azure data sources
  • Apply the latest DevOps best practices available for the ADF v2 platform

Azure Data Factory Training Outline

  • Historical background: SSIS, ADF v1, other ETL/ELT tools
  • Key capabilities and benefits of ADF v2
  • Recent feature updates and enhancements
  • Connectors: Azure services, databases, NoSQL, files, generic protocols, services & apps, custom
  • Pipelines
  • Activities: data movement, data transformation, control flow
  • Datasets: source, sink
  • Integration Runtimes: Azure, Self-Hosted, Azure-SSIS
  • Creating an ADF v2 instance (see the sketch after this list)
  • Creating a pipeline and associated activities
  • Executing the pipeline
  • Monitoring execution
  • Reviewing results
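
To ground these first steps, here is a minimal sketch of provisioning a factory programmatically with the Azure Python SDK (azure-identity and azure-mgmt-datafactory). The subscription ID, resource group, and factory name are placeholders, not values from the course.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import Factory

    SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
    RESOURCE_GROUP = "adf-training-rg"     # placeholder name
    FACTORY_NAME = "adf-training-factory"  # placeholder name

    # Authenticate and build the management client.
    adf_client = DataFactoryManagementClient(DefaultAzureCredential(),
                                             SUBSCRIPTION_ID)

    # Provision the Data Factory instance itself.
    factory = adf_client.factories.create_or_update(
        RESOURCE_GROUP, FACTORY_NAME, Factory(location="eastus")
    )
    print(f"Provisioned factory: {factory.name} ({factory.provisioning_state})")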

Copying Tools and SDKs

  • Copy Data Tool/Wizard
  • Copy activity
  • SDKs: Python, .NET (see the sketch after this list)
  • Automation: PowerShell, REST API, ARM Templates
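
As a sketch of the SDK route (reusing the adf_client, RESOURCE_GROUP, and FACTORY_NAME from the earlier example), the snippet below defines a Copy activity between two pre-existing blob datasets, publishes the pipeline, and starts an on-demand run. The dataset and pipeline names are assumed.

    from azure.mgmt.datafactory.models import (
        BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
    )

    copy_activity = CopyActivity(
        name="CopyBlobToBlob",
        inputs=[DatasetReference(reference_name="InputDataset",
                                 type="DatasetReference")],
        outputs=[DatasetReference(reference_name="OutputDataset",
                                  type="DatasetReference")],
        source=BlobSource(),
        sink=BlobSink(),
    )

    # Publish the pipeline, then trigger a run.
    adf_client.pipelines.create_or_update(
        RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline",
        PipelineResource(activities=[copy_activity]),
    )
    run = adf_client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline", parameters={}
    )
    print(f"Started pipeline run: {run.run_id}")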

Copying Considerations

  • File formats: Avro, binary, delimited, JSON, ORC, Parquet
  • Data store support matrix
  • Write behavior: append, upsert, overwrite, write with custom logic
  • Schema and data type mapping (see the example after this list)
  • Fault tolerance options
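
For illustration, here are two of these considerations as they appear in a Copy activity's JSON definition, shown as a Python dict. The column names, the variable name, and the ErrorLogStorage linked service are hypothetical.

    # Fragment of a Copy activity definition (its typeProperties), as a dict.
    copy_type_properties = {
        # Explicit schema and data type mapping via the translator block.
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                {"source": {"name": "cust_id"},   "sink": {"name": "CustomerId"}},
                {"source": {"name": "cust_name"}, "sink": {"name": "CustomerName"}},
            ],
        },
        # Fault tolerance: skip rows that fail conversion and log them to
        # blob storage instead of failing the whole copy.
        "enableSkipIncompatibleRow": True,
        "redirectIncompatibleRowSettings": {
            "linkedServiceName": "ErrorLogStorage",  # hypothetical
            "path": "copy-errors",
        },
    }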

Transformation with Mapping Data Flows

  • Introduction to mapping data flows
  • Data flow canvas
  • Debug mode
  • Dealing with schema drift
  • Expression builder & language (examples after this list)
  • Transformation types: Aggregate, Alter row, Conditional split, Derived column, Exists, Filter, Flatten, Join, Lookup, New branch, Pivot, Select, Sink, Sort, Source, Surrogate key, Union, Unpivot, Window
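
For a flavor of the expression language, here is a short sketch: the first two lines are Derived column expressions and the third is a Filter condition. Column names are hypothetical.

    fullName = concat(firstName, ' ', lastName)
    unitPrice = iif(isNull(price), toDecimal(0), toDecimal(price))
    year(orderDate) >= 2020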

Transformation with External Services

  • Databricks: Notebook, Jar, Python (see the sketch after this list)
  • HDInsight: Hive, Pig, MapReduce, Streaming, Spark
  • Azure Machine Learning service
  • SQL Stored procedures
  • Azure Data Lake Analytics U-SQL
  • Custom activities with .NET or R
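
A hedged sketch of invoking one external service, a Databricks Notebook activity, continuing the earlier SDK example and assuming a pre-configured Databricks linked service; the linked service name, notebook path, and parameter are hypothetical.

    from azure.mgmt.datafactory.models import (
        DatabricksNotebookActivity, LinkedServiceReference, PipelineResource,
    )

    notebook_activity = DatabricksNotebookActivity(
        name="TransformWithDatabricks",
        notebook_path="/Shared/transform-orders",    # hypothetical notebook
        base_parameters={"run_date": "2024-01-01"},  # passed to the notebook
        linked_service_name=LinkedServiceReference(
            reference_name="AzureDatabricksLS",      # hypothetical name
            type="LinkedServiceReference",
        ),
    )

    adf_client.pipelines.create_or_update(
        RESOURCE_GROUP, FACTORY_NAME, "DatabricksPipeline",
        PipelineResource(activities=[notebook_activity]),
    )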

Control Flow

  • Purpose of activity dependencies: branching and chaining (see the sketch after this list)
  • Activity dependency conditions: succeeded, failed, skipped, completed
  • Control flow activities: Append Variable, Azure Function, Execute Pipeline, Filter, ForEach, Get Metadata, If Condition, Lookup, Set Variable, Until, Wait, Web
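
The sketch below shows dependency conditions in practice: a Web activity that calls a hypothetical alerting endpoint only if the earlier copy activity fails.

    from azure.mgmt.datafactory.models import ActivityDependency, WebActivity

    alert_on_failure = WebActivity(
        name="NotifyOnFailure",
        method="POST",
        url="https://example.com/alert",    # hypothetical webhook
        body={"message": "Copy failed"},
        depends_on=[
            ActivityDependency(activity="CopyBlobToBlob",
                               dependency_conditions=["Failed"]),
        ],
    )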

Operational Best Practices

  • Debugging
  • Monitoring: visual, Azure Monitor, SDKs, runtime-specific best practices (see the sketch after this list)
  • Scheduling execution with triggers: event-based, schedule, tumbling window
  • Performance, scalability, tuning
  • Common troubleshooting scenarios in activities, connectors, data flows and integration runtimes
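
A monitoring sketch in the same SDK style: poll a pipeline run's status and list its activity runs. It reuses the `run` object returned by create_run in the earlier copy example.

    from datetime import datetime, timedelta, timezone
    from azure.mgmt.datafactory.models import RunFilterParameters

    # Check the overall pipeline run status.
    pipeline_run = adf_client.pipeline_runs.get(
        RESOURCE_GROUP, FACTORY_NAME, run.run_id
    )
    print(f"Pipeline run status: {pipeline_run.status}")

    # List the individual activity runs within a time window.
    now = datetime.now(timezone.utc)
    activity_runs = adf_client.activity_runs.query_by_pipeline_run(
        RESOURCE_GROUP, FACTORY_NAME, run.run_id,
        RunFilterParameters(last_updated_after=now - timedelta(hours=1),
                            last_updated_before=now + timedelta(hours=1)),
    )
    for activity_run in activity_runs.value:
        print(activity_run.activity_name, activity_run.status)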

Source Control and CI/CD

  • Quick introduction to source control with Git
  • Integration with GitHub and Azure DevOps platforms
  • Environment management: Development, QA, Production
  • Iterative development best practices
  • Continuous Integration (CI) pipelines
  • Continuous Delivery (CD) pipelines

Reuse Options

  • Templates: out-of-the-box and organizational
  • Parameters
  • Naming convention

ADF Security

  • Data movement security
  • Azure Key Vault (see the sketch after this list)
  • Self-hosted IR considerations
  • IP address blocks
  • Managed identity
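
To illustrate the Key Vault tier of this approach, here is a hedged sketch of an Azure SQL linked service whose connection string is resolved from Key Vault at runtime, so no credential is stored in the factory. It assumes the SDK accepts a Key Vault reference for the connection string, as the service's JSON schema does; all names are hypothetical.

    from azure.mgmt.datafactory.models import (
        AzureKeyVaultSecretReference, AzureSqlDatabaseLinkedService,
        LinkedServiceReference, LinkedServiceResource,
    )

    sql_ls = AzureSqlDatabaseLinkedService(
        # The secret is fetched from Key Vault through a Key Vault linked
        # service ("AzureKeyVaultLS", hypothetical) when the pipeline runs.
        connection_string=AzureKeyVaultSecretReference(
            store=LinkedServiceReference(
                reference_name="AzureKeyVaultLS",
                type="LinkedServiceReference",
            ),
            secret_name="sql-connection-string",
        )
    )

    adf_client.linked_services.create_or_update(
        RESOURCE_GROUP, FACTORY_NAME, "AzureSqlLS",
        LinkedServiceResource(properties=sql_ls),
    )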

Course FAQs

What is Azure Data Factory v2?

Azure Data Factory (ADF) v2 is an Azure data integration service that lets you create data-driven workflows to orchestrate and automate data movement and transformation across cloud, on-premises, and hybrid environments.

Do I need prior Azure experience before taking this course?

While the course is designed to bring students from zero ADF v2 expertise to an intermediate or even advanced level of knowledge, Microsoft Azure Fundamentals Training (AZ-900T00) or equivalent experience is expected.
