1-Day Instructor-Led Training
Official Google Content led by Google Authorised Instructors
After-Course Instructor-Coaching Included
Build Batch Data Pipelines on Google Cloud
Course 1491
- Duration: 1 day
- Language: English
- Level: Intermediate
In this intermediate course, you will learn to design, build, and optimise robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting.
Get hands-on practice implementing pipelines with Dataflow (Apache Beam) and Serverless for Apache Spark (Dataproc Serverless), and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. A basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.
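For a flavour of the hands-on work, here is a minimal sketch of a batch pipeline written with the Apache Beam Python SDK. The bucket paths and the three-field CSV layout are hypothetical placeholders, not part of the official labs; the same code can be run locally or submitted to Dataflow by passing the appropriate runner options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # With no options this uses the local DirectRunner; pass --runner=DataflowRunner,
    # --project, --region, and --temp_location to execute the same code on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRawFiles" >> beam.io.ReadFromText("gs://example-bucket/raw/orders-*.csv")
            | "ParseCsv" >> beam.Map(lambda line: line.split(","))
            | "KeepCompleteRows" >> beam.Filter(lambda fields: len(fields) == 3)
            | "FormatOutput" >> beam.Map(",".join)
            | "WriteStaged" >> beam.io.WriteToText("gs://example-bucket/staged/orders")
        )


if __name__ == "__main__":
    run()
```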
Batch Data Pipelines on Cloud Delivery Methods
In-Person
Online
Upskill your whole team by bringing Private Team Training to your facility.
Batch Data Pipelines on Cloud Course Information
In this course, you will learn how to:
- Determine whether batch data pipelines are the correct choice for your business use case.
- Design and build scalable batch data pipelines for high-volume ingestion and transformation.
- Implement data quality controls within batch pipelines to ensure data integrity.
- Orchestrate, manage, and monitor batch data pipeline workflows, implementing error handling and observability using logging and monitoring tools.
Prerequisites:
- Basic proficiency with Data Warehousing and ETL/ELT concepts
- Basic proficiency in SQL
- Basic programming knowledge (Python recommended)
- Familiarity with the gcloud CLI and the Google Cloud console
- Familiarity with core Google Cloud concepts and services
Batch Data Pipelines on Cloud Course Outline
Module 1) When to choose batch data pipelines
- You will learn the critical role of a data engineer in developing and maintaining batch data pipelines, understand their core components and lifecycle, and analyse common challenges in batch data processing. You'll also identify key Google Cloud services that address these challenges.
Module 2) Design and build batch data pipelines
- You will design scalable batch data pipelines for high-volume data ingestion and transformation. You'll also optimise batch jobs for high throughput and cost-efficiency using various resource management and performance tuning techniques.
Module 3) Control data quality in batch data pipelines
- You will develop data validation rules and cleansing logic to ensure data quality within batch pipelines. You'll also implement strategies for managing schema evolution and performing data deduplication in large datasets.
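As a rough illustration of the kind of logic this module covers, the sketch below applies a hypothetical validation rule and a key-based deduplication step inside a Beam batch pipeline; the record fields and the rule itself are placeholders, not the official lab content.

```python
import apache_beam as beam


def is_valid(record):
    # Hypothetical validation rule: require a non-empty id and a non-negative amount.
    return bool(record.get("id")) and record.get("amount", -1) >= 0


with beam.Pipeline() as p:
    records = p | "CreateRecords" >> beam.Create([
        {"id": "a1", "amount": 10.0},
        {"id": "a1", "amount": 10.0},   # exact duplicate
        {"id": "",   "amount": 5.0},    # fails validation
    ])
    (
        records
        | "DropInvalid" >> beam.Filter(is_valid)
        | "KeyById" >> beam.Map(lambda r: (r["id"], r))
        | "GroupById" >> beam.GroupByKey()
        | "KeepFirstPerKey" >> beam.Map(lambda kv: next(iter(kv[1])))  # deduplicate by id
        | "Print" >> beam.Map(print)
    )
```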
Module 4) Orchestrate and monitor batch data pipelines
- You will orchestrate complex batch data pipeline workflows for efficient scheduling and lineage tracking. You'll also implement robust error handling, monitoring, and observability for batch data pipelines.
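Cloud Composer (managed Apache Airflow) is the typical orchestration layer for this kind of workflow. Below is a minimal sketch of a daily DAG with ordered tasks and simple retry-based error handling; the DAG id and task commands are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_batch_pipeline",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # run the batch workflow once a day
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_files",
        bash_command="echo 'copy raw files into the staging bucket'",
    )
    transform = BashOperator(
        task_id="run_batch_transform",
        bash_command="echo 'launch the Dataflow batch job'",
        retries=2,                      # simple error handling: retry failed runs
    )
    validate = BashOperator(
        task_id="validate_output",
        bash_command="echo 'run data quality checks on the output tables'",
    )

    # Task ordering defines the workflow's execution sequence and lineage.
    ingest >> transform >> validate
```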
Need Help Finding The Right Training Solution?
Our training advisors are here for you.
Batch Data Pipelines on Cloud FAQs
This course is designed for Data Engineers and Data Analysts.
The course includes 4 modules, 4 labs, and 5 classroom activities.
The skills taught align with real enterprise use cases such as:
- Data warehousing and analytics pipelines
- Large-scale ETL processing
- Periodic reporting and business intelligence workflows
- Cost-optimised data processing architectures
Key services include:
- Cloud Storage – data ingestion and staging
- BigQuery – large-scale data analytics
- Dataflow – batch data processing using Apache Beam
- Cloud Composer (introductory concepts) – workflow orchestration
- IAM – security and access control concepts for data pipelines
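To give a concrete sense of how two of these services fit together, here is a minimal sketch that batch-loads a CSV file from Cloud Storage into a BigQuery table with the google-cloud-bigquery client library; the bucket, project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

# Uses Application Default Credentials for the active Google Cloud project.
client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the table schema from the file
)

# Hypothetical Cloud Storage path and BigQuery table id.
load_job = client.load_table_from_uri(
    "gs://example-bucket/staged/orders-2024-01-01.csv",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the batch load job completes
```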