DP-750: Implement data engineering solutions using Azure Databricks

Course ID: DP-750
Exam Code: DP-750
Duration: 4 Days
Training Fee: HK$14000
Private in-house training

Apart from public, instructor-led classes, we also offer private in-house training for organizations based on their needs. Call us at +852 2116 3328 or email us at [email protected] for more details.

Skills Covered
  • Set up and configure an Azure Databricks environment
  • Secure and govern Unity Catalog objects in Azure Databricks
  • Prepare and process data with Azure Databricks
  • Deploy and maintain data pipelines and workloads with Azure Databricks
Target Audience
The target audience is data engineers who have fundamental knowledge of data analytics concepts, a basic understanding of cloud storage, and familiarity with data organization principles. They should be comfortable working with SQL and have experience using Python, including notebooks, for data engineering tasks.
Prerequisites
Learners are expected to have a good understanding of Azure Databricks workspaces and Unity Catalog, along with familiarity with data access patterns and core data engineering and data warehouse concepts. In addition, they should have foundational knowledge of Azure security, including Microsoft Entra ID, and be familiar with Git version control fundamentals.
Course Modules

Module 1: Set up and configure an Azure Databricks environment
 
Build a solid foundation in Azure Databricks by understanding its architecture, integrations, compute options, and data organization capabilities. Learn how Azure Databricks provides a unified platform for data engineering, analytics, and AI workloads in the cloud.
 

  • Explore Azure Databricks
  • Understand Azure Databricks architecture
  • Understand Azure Databricks integrations
  • Select and configure compute in Azure Databricks
  • Create and organize objects in Unity Catalog
       

Module 2: Secure and govern Unity Catalog objects in Azure Databricks

Unity Catalog provides centralized governance and security for data assets in Azure Databricks. This module explores how to secure Unity Catalog objects through access control strategies, fine-grained permissions, credential management, and authentication mechanisms.

You'll learn how to implement table- and schema-level security, enforce row and column filtering, securely access secrets from Azure Key Vault, and authenticate data access using service principals and managed identities.
     

  • Secure Unity Catalog objects
  • Govern Unity Catalog objects
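
The row filtering and column masking covered in this module can be illustrated conceptually. In Azure Databricks these policies are declared in SQL as row filter and column mask functions attached to tables; the plain-Python sketch below (all data and rule names are hypothetical) only shows the kind of logic such policies apply at query time.

```python
# Conceptual sketch of row filtering and column masking, as applied by
# Unity Catalog policies. All names and rules here are hypothetical
# illustrations; in Azure Databricks these are declared in SQL.

ROWS = [
    {"region": "EMEA", "email": "ana@example.com", "amount": 120},
    {"region": "APAC", "email": "bo@example.com", "amount": 80},
]

def row_filter(row, user_region):
    # Row filter: users only see rows for their own region.
    return row["region"] == user_region

def mask_email(value, is_admin):
    # Column mask: non-admins see a redacted email.
    return value if is_admin else "***@***"

def apply_policies(rows, user_region, is_admin):
    # Apply the row filter first, then mask sensitive columns.
    return [
        {**r, "email": mask_email(r["email"], is_admin)}
        for r in rows
        if row_filter(r, user_region)
    ]

visible = apply_policies(ROWS, user_region="EMEA", is_admin=False)
```

Here a non-admin EMEA user sees only the EMEA row, with the email column redacted.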
       

Module 3: Prepare and process data with Azure Databricks

Master the essential skills to build robust, scalable data engineering solutions with Azure Databricks and Unity Catalog. Learn to design effective data models, ingest data from diverse sources, transform raw data into analytics-ready formats, and ensure data quality across your lakehouse architecture.
     

  • Design and implement data modeling with Azure Databricks
  • Ingest data into Unity Catalog
  • Cleanse, transform, and load data into Unity Catalog
  • Implement and manage data quality constraints with Azure Databricks
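
As a taste of the data quality topic in this module: in Azure Databricks, quality constraints are typically declared on pipeline tables as expectations, but the underlying idea, named predicates that pass or quarantine each row, can be sketched in plain Python (constraint names and rules below are hypothetical examples, not the platform's API).

```python
# Minimal sketch of data quality constraints: each constraint is a named
# predicate; rows failing any constraint are quarantined rather than
# loaded. Constraint names and rules are hypothetical examples.

CONSTRAINTS = {
    "valid_id": lambda r: r.get("id") is not None,
    "non_negative_amount": lambda r: r.get("amount", 0) >= 0,
}

def validate(rows):
    # Split rows into those passing all constraints and those quarantined,
    # recording which constraints each quarantined row violated.
    passed, quarantined = [], []
    for row in rows:
        failures = [name for name, check in CONSTRAINTS.items() if not check(row)]
        (quarantined if failures else passed).append((row, failures))
    return passed, quarantined

rows = [
    {"id": 1, "amount": 10},
    {"id": None, "amount": 5},
    {"id": 2, "amount": -3},
]
passed, quarantined = validate(rows)
```

Recording the violated constraint names alongside each quarantined row makes it possible to report on which rules fail most often, mirroring how expectation metrics surface in pipeline monitoring.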
       

Module 4: Deploy and maintain data pipelines and workloads with Azure Databricks

Master the complete lifecycle of building, deploying, and maintaining production-ready data pipelines in Azure Databricks, from design and orchestration to monitoring and optimization.
     

  • Design and implement data pipelines with Azure Databricks
  • Implement Lakeflow Jobs with Azure Databricks
  • Implement development lifecycle processes in Azure Databricks
  • Monitor, troubleshoot, and optimize workloads in Azure Databricks
       

Exam & Certification

Microsoft Certified: Azure Databricks Data Engineer Associate

Demonstrate expertise in integrating and modeling data, building and deploying optimized pipelines, and troubleshooting and maintaining workloads in Azure Databricks.

  • Level: Intermediate
  • Product: Azure Databricks
  • Role: Data Engineer
  • Subject: Data engineering