We are looking for an experienced Computer Programmer to develop, integrate, and optimize data processing systems. The role involves software development, ETL processes, and system integrations in an agile environment.
Job Responsibilities:
Work collaboratively in agile teams, participating in scrum meetings and sprint planning
Gather and document customer requirements using JIRA
Develop and optimize data processing systems using Hadoop, Hive, PySpark, and Scala Spark
Maintain user provisioning and security within the Hadoop environment
Implement ETL functions, unit testing, and data validation (a brief illustrative sketch follows this list)
Troubleshoot and resolve software defects, performance bottlenecks, and security issues
Manage source code versions using GitHub/GitLab
Prepare technical documentation, reports, and stakeholder presentations
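For context, a minimal PySpark sketch of the kind of ETL and data-validation work described above might look like the following; the paths, column names, and checks are hypothetical illustrations and are not taken from the posting.

# Illustrative sketch only: paths, column names, and checks are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw records from a hypothetical source location
orders = spark.read.parquet("/data/raw/orders")

# Transform: drop rows missing the key and derive a date column
cleaned = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
)

# Validate: fail the job if nothing survives cleansing
if cleaned.count() == 0:
    raise ValueError("Data validation failed: no valid rows after cleansing")

# Load: write the curated dataset to a hypothetical target location
cleaned.write.mode("overwrite").parquet("/data/curated/orders")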
Requirements:
5+ years of experience in software development and Big Data technologies
Proficiency in Hadoop, Hive, PySpark, Scala Spark, and ETL processes
Strong understanding of software architecture, security, and performance tuning
Experience with agile methodologies and GitLab/GitHub
Excellent problem-solving and communication skills