Responsibilities:
Being responsible for at-scale infrastructure design, build, and deployment, with a focus on distributed systems
Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies
Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards
Driving the creation of reusable artifacts
Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation
Working closely with analysts/data scientists to understand the impact on downstream data models
Writing efficient and well-organized software to ship products in an iterative, continual release environment
Contributing to and promoting good software engineering practices across the team
Communicating clearly and effectively to technical and non-technical audiences
Defining data retention policies
Monitoring performance and advising on any necessary infrastructure changes
Requirements:
3+ years’ experience with Azure Data Factory and Databricks
5+ years’ experience with data engineering or backend/fullstack software development
Strong SQL skills
Python scripting proficiency
Experience with data transformation tools: Databricks and Spark
Experience in structuring and modelling data in both relational and non-relational forms
Experience with CI/CD tooling
Working knowledge of Git
English level: B2–C1
Nice to have:
Experience with Azure Event Hubs, CosmosDB, Spark Streaming, Airflow