As a Data Engineer, you will design and develop data pipelines and ETL processes, build and maintain data models, and collaborate with data scientists and cross-functional teams to deliver reliable, high-quality data solutions.
Job Responsibility:
Design and develop data pipelines and ETL processes to ingest, transform, and load data from various sources into data warehouses or data lakes
Build and maintain data models and data structures to support analytical and reporting needs
Work with data scientists and analytics teams to help deploy data solutions and machine learning workflows
Collaborate with cross-functional teams to integrate data engineering solutions into client applications and systems
Support optimization and tuning of data pipelines and systems for performance and scalability
Help ensure data quality and integrity throughout the data lifecycle
Stay current with industry trends and best practices in data engineering and apply them to improve data processing efficiency
Participate in project and client meetings and support successful delivery of solutions
Follow best practices, security guidelines, and compliance requirements
Requirements:
Bachelor’s degree in Computer Science, Information Technology, or related field (or equivalent experience)
2–4 years of experience with data platforms, data warehouses, data lakes, and ETL processes
Hands-on experience with some of the following: Azure Data Factory (ADF), Spark, Databricks, Python, Azure Synapse, Azure Data Lake Storage (ADLS), Azure Functions
Exposure to performance tuning and optimization in Databricks or Synapse (preferred, not required)
Familiarity with Delta Lake concepts such as delta tables, schema evolution, or SCD (nice to have)
Understanding of data concepts such as data governance, metadata, or data quality
Experience working with SQL / T-SQL
Familiarity with Git/GitHub and modern development practices
Knowledge of data modeling and data integration patterns
Strong analytical and problem-solving skills
Good communication and collaboration skills
Ability to work with clients and internal teams to deliver solutions
What we offer:
competitive pay with a performance-based bonus
generous paid time off
flexible and affordable benefits program designed to help you be and stay well, including: medical, dental & vision coverage, flexible spending accounts, health reimbursement account, and a 401(k) plan with a company match
the opportunity to work alongside enthusiastic and energetic teammates in a dynamic and thriving environment