The project focuses on building a cloud-based logistics analytics platform that provides standardized, high-quality reporting and insights across complex, global supply chain processes. The solution is designed to support data-driven decision-making by making logistics data transparent, reliable, and easy to consume for business users across multiple organizational units. Built on modern cloud technologies, the platform combines scalable data warehousing, automated data pipelines, and strong data governance to transform large volumes of logistics data into trusted KPIs and analytical datasets. It plays a critical role in improving efficiency, visibility, and operational performance across the end-to-end logistics value stream.
Job Responsibilities:
Implement business logic in the Data Warehouse in accordance with functional and technical specifications
Perform light business analysis to ensure data is delivered accurately and in a meaningful, business-relevant way
Translate business requirements into scalable and efficient data models
Design, build, and maintain ETL pipelines using Azure Data Factory
Optimize data loading processes and tune query performance
Collaborate closely with senior stakeholders on the customer side to refine requirements and propose data-driven improvements
Actively contribute ideas and solutions to improve data architecture and pipeline reliability
Requirements:
3+ years of experience working with Microsoft Azure and readiness to work primarily with SQL (up to 80% of the time)
Proven experience in database development using MS SQL Server / T-SQL
Ability to write highly performant SQL and implement performance optimization techniques
Hands-on experience with version control systems (e.g., Git)
Solid understanding of data warehousing concepts and dimensional modeling
Experience working in an Agile development environment
Practical knowledge of creating and maintaining Azure DevOps and Azure Data Factory pipelines
Experience developing robust data pipelines using dbt
Nice to have:
Experience with Databricks
Background in Supply Chain & Logistics projects
Familiarity with SAP MM data structures
What we offer:
Flexible working format: remote, office-based, or a mix of both
Competitive salary and a solid compensation package
Personalized career growth
Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
Active tech communities with regular knowledge sharing