Job Description
Job Title:  Senior Data Platform Engineer (m/f/x)
Posting Start Date:  29/10/2025
Job Description: 

For our headquarters in Munich we are looking for a

 

Senior Data Platform Engineer (m/f/x)

The Position

In this role you will build and maintain our data platform infrastructure to enable the efficient delivery of high-quality data solutions. The focus is on developer experience, platform reliability, automation, and self-service rather than direct pipeline development. Combining DevOps with data engineering, you will create scalable, maintainable infrastructure and tooling and take part in digitization projects.

Roles and Responsibilities

  • Platform Operations & Automation: Design and maintain data platform infrastructure using Azure, Snowflake, dbt Cloud, and Airflow. Implement Infrastructure as Code with Terraform, manage multi-environment workflows (DEV/UAT/PRD), and develop custom tools for platform management and resource provisioning
  • Developer Experience: Build self-service capabilities and internal tooling (common libraries, shared code, reusable components) to enable 30+ data engineers. Create code generation tools, streamline CI/CD pipelines for 100+ repositories, and establish development environment configuration guidelines, code quality gates, and coding standards
  • Monitoring & Reliability: Implement centralized monitoring, logging, and alerting across the platform stack.
  • Documentation & Knowledge: Create tutorials and runbooks, promote platform standards, help onboard new data engineers, and actively share knowledge across our internal data engineering community. Help grow our team through mentoring and by promoting best practices.
  • Communicate regularly with team members and other stakeholders, conveying results clearly and efficiently

Personal Skills and Professional Experience

  • Several years of IT experience, including solid experience in data engineering, DevOps, or platform engineering roles
  • Strong expertise in SQL, Python, Linux, Docker, Git, and cloud providers (Azure preferred)
  • Understanding of Data Warehousing, Data Pipelines, ETL/ELT, and engineering practices like Infrastructure as Code and CI/CD
  • Excellent communicator and collaborator with internal and external teams; open-minded and unbiased toward specific technologies
  • Ability to comprehend new technologies quickly, with intellectual curiosity and integrity
  • Strategic thinker with a clear vision for data practices and engineering; proactive and results-driven
  • Bachelor’s degree in Information Technology, Business Administration, Data Science, or related field