We are looking for a skilled Data Engineer to join our growing data team. You’ll work on designing, developing, and maintaining scalable data pipelines and infrastructure that enable data-driven decisions across the organization. You’ll collaborate closely with senior engineers, product managers, and analysts to deliver reliable, high-quality data solutions. This role is ideal for someone who is self-sufficient and technically strong, but still looking to grow in architectural ownership and stakeholder engagement.
Job Responsibilities:
Design, build, and maintain efficient, reusable, and reliable data pipelines
Collaborate with senior engineers on architecture and design decisions
Ensure data quality, governance, and reliability across multiple data sources
Implement and optimize ETL/ELT processes for analytics and business intelligence
Support integration of new data sources and help maintain data consistency
Participate in code reviews and contribute to team standards and best practices
Continuously improve data processes and look for automation opportunities
Requirements:
Strong experience in SQL, data modeling, and at least one programming language such as Python
Solid hands-on experience with AWS data services (e.g., S3, Redshift, Lambda, ECS, EMR)
Understanding of distributed data processing and workflow orchestration tools (e.g., Airflow)
Hands-on experience with data transformation and modeling tools (e.g., dbt, SQLMesh) and best practices for modular, testable data models
Experience building and maintaining ETL/ELT pipelines and data warehouses
Ability to take ownership of assigned projects and deliver high-quality work
Eagerness to learn from senior engineers and grow toward architectural responsibility
Strong problem-solving, collaboration, and communication skills
Passion for building scalable, reliable, and efficient data systems
Nice to have:
Experience with modern data observability, monitoring, or data quality tools
Exposure to containerization and deployment tools such as Docker or Kubernetes
Experience working with real-time data streaming technologies
Demonstrated initiative in improving data platform efficiency or cost optimization within cloud environments