Implementing a Lakehouse with Microsoft Fabric

Master lakehouse implementation with Microsoft Fabric and gain the skills to engineer end-to-end analytics solutions with Apache Spark and Delta Lake.

Essential Skills Gained

  • Design robust lakehouses using Microsoft Fabric
  • Implement data processing with Apache Spark
  • Manage data with Delta Lake tables
  • Optimize data ingestion using Dataflows Gen2 and pipelines

Format

  • Instructor-led
  • 1 day with lectures and hands-on labs.

Audience

  • Data Engineers
  • Data Analysts
  • Business Intelligence Professionals
  • Data Architects

Description

This course builds your foundational data engineering skills on Microsoft Fabric, focusing on the Lakehouse concept. You will explore the capabilities of Apache Spark for distributed data processing and learn essential techniques for data management, versioning, and reliability by working with Delta Lake tables. You will also explore data ingestion and orchestration using Dataflows Gen2 and Data Factory pipelines. A combination of lectures and hands-on exercises prepares you to work with lakehouses in Microsoft Fabric.

Upcoming Course Dates

October 6, 2025

9:00 AM - 5:00 PM

Virtual: Online - America/Chicago

$695

December 15, 2025

9:00 AM - 5:00 PM

Virtual: Online - America/Los_Angeles

$695

Course Outline

Introduction to end-to-end analytics using Microsoft Fabric

  1. Explore end-to-end analytics with Microsoft Fabric

  2. Data teams and Microsoft Fabric

  3. Enable and use Microsoft Fabric

Get started with lakehouses in Microsoft Fabric

  1. Explore the Microsoft Fabric Lakehouse

  2. Work with Microsoft Fabric Lakehouses

  3. Explore and transform data in a lakehouse

  4. Create and ingest data with a Microsoft Fabric Lakehouse
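
For a sense of the hands-on work in this module, here is a minimal sketch of ingesting a file from a lakehouse's Files area into a managed table using PySpark in a Fabric notebook. The file path, schema options, and table name (sales_orders) are illustrative assumptions, not part of the course materials.

    # Read a CSV file uploaded to the lakehouse's Files area into a Spark dataframe.
    # The path is hypothetical; point it at your own uploaded file.
    df = (spark.read.format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .load("Files/raw/orders.csv"))

    # Save the dataframe as a managed table in the lakehouse.
    # Fabric stores managed lakehouse tables in Delta format by default.
    df.write.mode("overwrite").saveAsTable("sales_orders")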

Use Apache Spark in Microsoft Fabric

  1. Prepare to use Apache Spark

  2. Run Spark code

  3. Work with data in a Spark dataframe

  4. Work with data using Spark SQL

  5. Visualize data in a Spark notebook

  6. Analyze data with Apache Spark
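
As a small taste of the notebook exercises in this module, the sketch below aggregates data with the dataframe API and then expresses the same query in Spark SQL. The table and column names carry over from the hypothetical example above.

    from pyspark.sql import functions as F

    # Dataframe API: total revenue per product category (column names are illustrative).
    orders = spark.read.table("sales_orders")
    summary = (orders
               .groupBy("category")
               .agg(F.sum("amount").alias("total_revenue"))
               .orderBy(F.desc("total_revenue")))
    summary.show()

    # Spark SQL: register a temporary view and run the equivalent query.
    orders.createOrReplaceTempView("orders_view")
    spark.sql("""
        SELECT category, SUM(amount) AS total_revenue
        FROM orders_view
        GROUP BY category
        ORDER BY total_revenue DESC
    """).show()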

Work with Delta Lake tables in Microsoft Fabric

  1. Understand Delta Lake

  2. Create delta tables

  3. Work with delta tables in Spark

  4. Use delta tables with streaming data

  5. Use delta tables in Apache Spark
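
To show what working with delta tables looks like in practice, here is a minimal sketch that updates a managed delta table through the DeltaTable API and then queries an earlier version of it. The table name and column values are hypothetical, and the time-travel syntax assumes a Fabric Spark runtime with a recent Delta Lake version.

    from delta.tables import DeltaTable

    # Managed lakehouse tables are delta tables, so they can be updated in place.
    tbl = DeltaTable.forName(spark, "sales_orders")
    tbl.update(condition="category = 'Unknown'",
               set={"category": "'Uncategorized'"})

    # Inspect the table's transaction history recorded by Delta Lake.
    spark.sql("DESCRIBE HISTORY sales_orders").show()

    # Time travel: query the table as it existed at an earlier version.
    spark.sql("SELECT * FROM sales_orders VERSION AS OF 0").show()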

Ingest Data with Dataflows Gen2 in Microsoft Fabric

  1. Understand Dataflows (Gen2) in Microsoft Fabric

  2. Explore Dataflows (Gen2) in Microsoft Fabric

  3. Integrate Dataflows (Gen2) and Pipelines in Microsoft Fabric

  4. Create and use a Dataflow (Gen2) in Microsoft Fabric

Use Data Factory pipelines in Microsoft Fabric

  1. Understand pipelines

  2. Use the Copy Data activity

  3. Use pipeline templates

  4. Run and monitor pipelines

  5. Ingest data with a pipeline

Your Team Has Unique Training Needs

Your team deserves training as unique as they are. Let us tailor this course to your needs at no extra cost.