We are working with a global leader in the pharmaceutical and life sciences industry, renowned for their commitment to innovation and improving patient outcomes. This German multinational, with 60,000+ employees in 66 countries, offers a unique opportunity to transform the world through healthcare, life sciences, and performance materials.

Our client is strengthening their cloud-driven data and analytics landscape and is looking for a Technical Lead with deep expertise in Data Warehousing, Data Lakes, Snowflake, and dbt to guide the technical direction of their modern data platform.

In this role, you will lead the end-to-end lifecycle of cloud data products, from architecture and modeling to pipeline development, performance optimization, and automation. You will act as a hands-on Subject Matter Expert, mentoring developers and ensuring the delivery of scalable, secure, and high-quality data solutions that power analytics, reporting, and AI use cases.

This is a high-impact position for someone who enjoys designing cloud-native data platforms, driving technical excellence, and shaping data engineering best practices in a rapidly evolving environment.
Job Responsibilities:
Own the end-to-end design of cloud data products
Define and enforce standards for data quality, security, GxP compliance, and CI/CD
Document decisions through Design Decision Records (DDRs)
Oversee implementation across internal and external teams
Provide deep expertise in Snowflake performance tuning, dbt pipeline design, SQL optimization, and Python-based data processing
Mentor engineers at all levels, coordinate cross-team dependencies, and communicate risks, trade-offs, and progress clearly to stakeholders
Plan and scope engineering initiatives
Balance feature delivery with technical debt
Drive CI/CD adoption using Azure DevOps and automated data testing
Contribute to team growth through hiring, onboarding, and knowledge sharing
Requirements:
Master’s degree in Computer Science, Information Technology, Engineering, or a related field
8+ years of experience as a Snowflake Developer, including data modeling (Data Vault 2.0, OLTP, OLAP)
Expert-level knowledge of SQL and hands-on experience in Python for data transformation and automation
Strong understanding of Data Warehouse and Data Lake concepts, architectures, and design patterns
Practical experience building and managing data pipelines using dbt
Familiarity with performance tuning techniques in Snowflake, including query optimization
Hands-on experience with CI/CD pipelines, ideally with Azure DevOps
Exposure to Infrastructure as Code tools (Terraform, CloudFormation, Pulumi, etc.)
Experience working in regulated industries or structured delivery environments
Strong communication skills and the ability to influence technical direction
What we offer:
Pension plan
Life and Accident Insurance
Health Insurance
Restaurant Vouchers
Employee and Family Psychological Support Program
LinkedIn Learning Subscription: access to over 15,000 courses with certification