At Valtech, we don’t just talk about transformation. We make it happen. Our people are the heart of our success, and we foster a workplace where everyone has the support to thrive, grow and innovate.
Job Responsibilities:
Design and build data pipelines: Develop scalable Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) workflows for batch and streaming data processing.
Manage data infrastructure: Maintain and optimize data storage solutions, such as data lakes (Cloud Storage) and data warehouses (BigQuery), to ensure scalability, performance, and reliability.
Ensure data quality: Implement processes to guarantee the accuracy, integrity, and security of data throughout its lifecycle.
Collaborate with teams: Work closely with data scientists, analysts, and software engineers to understand their data needs and deliver robust data solutions.
Support machine learning: Operationalize machine learning models by building and maintaining the data pipelines that feed and integrate with Google's Vertex AI platform.
Automate processes: Implement automation for manual data tasks and workflows to improve efficiency.
Troubleshoot issues: Monitor and debug data systems and pipelines to ensure uninterrupted data delivery.
Requirements:
Education: A Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Experience: A minimum of 3 years of experience as a Data Engineer.
Programming expertise: Proficiency in Python and SQL; experience with other languages such as Java or Scala is a plus.
GCP technologies: Hands-on experience with core GCP services is essential. Key services include:
Data Processing: Dataflow, Dataproc, and Pub/Sub.
Storage and Warehousing: BigQuery, Cloud Storage, and Cloud SQL.
Orchestration: Cloud Composer (powered by Apache Airflow).
Data fundamentals: Deep understanding of data modeling, data warehousing, and distributed systems.
Big Data technologies: Experience with open-source tools like Apache Spark or Apache Beam.
Testing and DevOps: Familiarity with CI/CD pipelines (e.g., Cloud Build), Git, and automated testing.
Analytical skills: Strong problem-solving and analytical thinking abilities.
Communication: Excellent communication skills to explain complex technical concepts to both technical and non-technical stakeholders.
Nice to have:
Experience with Agile methodologies and consulting.
Certification: A Google Cloud Professional Data Engineer certification can strengthen a candidate's profile.
What we offer:
Flexibility, with hybrid work options (country-dependent).
Learning and development, with access to cutting-edge tools, training and industry experts.