Are you a passionate Java developer with a strong interest in data and in building robust, high-performance systems? We’re looking for a Data Engineer with deep Java expertise to design, develop, and maintain scalable ETL pipelines and data solutions in a fast-paced, innovative environment. This is not just a data role: it’s for a software engineer who loves data. If you thrive on turning complex data challenges into elegant, maintainable code using Java 11, Spring Boot, and cloud-native tools, this is your next big step. You’ll join a collaborative, agile team shaping the future of our data infrastructure, with real impact on business decisions, product innovation, and system scalability.
Job Responsibilities:
Design, develop, and deploy high-performance ETL pipelines using Java 11 and modern frameworks
Build and maintain RESTful microservices using Spring Boot, ensuring scalability and reliability
Write clean, efficient, and well-documented SQL scripts and contribute to data modeling and schema design
Collaborate across teams to integrate data from diverse sources into unified, actionable systems
Use Gradle, Maven, Git (GitHub), and TFS for build management, CI/CD pipelines, and version control
Apply design patterns and best practices to create modular, testable, and maintainable code
Work in cloud environments (AWS) to deploy and manage data infrastructure
Participate in all phases of the SDLC: analysis, coding, testing (including Mockito), debugging, and release
Requirements:
Strong hands-on experience with Java 11, Spring Boot, RESTful APIs, and JdbcTemplate
Proven experience in ETL development and data pipeline engineering using Java
Solid understanding of SQL scripting, database design, and data modeling
Experience with IDEs like IntelliJ IDEA or Eclipse, and build tools like Maven and Gradle
Familiarity with testing frameworks (e.g., Mockito) and version control (Git)
Exposure to cloud platforms (AWS) and containerization concepts (Docker/Kubernetes — a plus)
A problem-solving mindset, strong analytical skills, and a passion for clean, efficient code
Nice to have:
Experience with Vertica or other analytical databases
Knowledge of data warehousing, streaming data, or big data tools (e.g., Apache Spark)