Data Engineers are responsible for designing, building, and maintaining the systems and processes that collect, store, and analyze data. They gather data from various sources and integrate it into a cohesive system. Responsibilities include designing and implementing ETL (Extract, Transform, Load) processes to clean and prepare data for analysis; creating and maintaining databases to store large volumes of data efficiently; building scalable data pipelines to ensure smooth data flow and accessibility; working with data scientists, analysts, and other stakeholders to understand data needs and provide solutions; and continuously monitoring data systems and optimizing them for better performance and scalability.
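The ETL workflow described above can be sketched minimally in Python. All function names and the in-memory "source" and "warehouse" here are illustrative assumptions for clarity, not part of any specific stack named in this posting:

```python
# Minimal ETL sketch: extract raw records, transform (clean and normalize),
# then load them into a destination store. Real pipelines would read from
# databases or APIs and write to a warehouse; dicts stand in for both here.

def extract(source):
    """Extract raw records from a source (here, a list of dicts)."""
    return list(source)

def transform(records):
    """Clean and prepare records: drop rows missing an id, normalize names."""
    cleaned = []
    for row in records:
        if row.get("id") is None:
            continue  # discard incomplete records
        cleaned.append({
            "id": row["id"],
            "name": row.get("name", "").strip().lower(),
        })
    return cleaned

def load(records, store):
    """Load cleaned records into a destination keyed by id."""
    for row in records:
        store[row["id"]] = row
    return store

raw = [
    {"id": 1, "name": "  Alice "},
    {"id": None, "name": "ghost"},   # incomplete record, dropped in transform
    {"id": 2, "name": "Bob"},
]
warehouse = load(transform(extract(raw)), {})
```

In production, orchestration tools such as Apache Airflow schedule and monitor each of these stages as separate tasks, which is what the pipeline-tool requirement below refers to.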
Job Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases
Provide thought leadership by recommending the most appropriate technologies and solutions
Demonstrate proficiency in coding skills to efficiently move solutions into production
Collaborate seamlessly across diverse technical stacks
Develop and deliver detailed presentations to effectively communicate complex technical concepts
Generate comprehensive solution documentation
Adhere to Agile practices throughout the solution development process
Design, build, and deploy databases and data stores to support organizational requirements
Ensure data quality, consistency, and governance across multiple sources
Requirements:
6-8 years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects
2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions
Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server)
Experience with data pipeline tools (e.g., Apache Airflow, Luigi, Prefect)
Proficiency with at least one programming language (Python, Java, or Scala)
Experience with cloud platforms (AWS, Azure, or GCP) and their data services (e.g., Redshift, BigQuery, Snowflake, Databricks)
Familiarity with data modeling, warehousing, and schema design
Understanding of data governance, privacy, and security best practices
Advanced English required
Nice to have:
Production experience with core data platforms
Strong understanding of data integration technologies
Professional written and verbal communication skills for conveying complex technical concepts
Knowledge of infrastructure-as-code tools
Experience with streaming technologies (e.g., Kafka, Kinesis)