This position involves designing and implementing complex algorithms, building complex data engineering pipelines, and delivering Scala solutions on Hadoop (Spark). The professional will collaborate with technical leads, project managers, data analysts, and other stakeholders throughout the project lifecycle while managing multiple parallel activities and solving analytical challenges.
Job Responsibility:
Designing and contributing to the implementation of complex algorithms
Working on complex data engineering pipelines
Implementing Scala solutions on Hadoop (Spark); development of Elasticsearch components may also be required (see the sketch after this list)
Working closely with Project Managers, Technical Leads, Data Analysts, server and web developers, QA engineers, and Product Owners throughout the project lifecycle
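By way of illustration, a minimal Scala sketch of the kind of Spark pipeline work described above; the dataset paths and column names (events, userId, amount, eventDate) are hypothetical placeholders, not details from this posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal Spark pipeline sketch: read raw events, aggregate spend per
// user per day, and write the result back out as partitioned Parquet.
object DailySpendPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-spend-pipeline")
      .getOrCreate()

    // Hypothetical input location on HDFS
    val events = spark.read.parquet("hdfs:///data/raw/events")

    val dailySpend = events
      .filter(col("amount") > 0)                    // drop refunds and zero-value rows
      .groupBy(col("userId"), col("eventDate"))
      .agg(sum("amount").as("totalSpend"),
           count(lit(1)).as("txnCount"))

    // Hypothetical curated output location, partitioned by date
    dailySpend.write
      .mode("overwrite")
      .partitionBy("eventDate")
      .parquet("hdfs:///data/curated/daily_spend")

    spark.stop()
  }
}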
Requirements:
Strong Java development skills
Developing, iterating, and productionising complex data models
Scala, Spark
Big Data/Hadoop/Parallel Processing
Excellent communication skills
Ability to navigate a complex organization
Ability to be self-directed/motivated
Keen to learn new skills and approaches
Inquisitive nature
Excellent organization skills and capability to manage multiple parallel activities