An expert with 4-5 years of experience in the Hadoop ecosystem and cloud platforms (AWS/Azure), relational data stores, data integration and ETL techniques, XML, Python, and Spark.
Requirements:
4-5 years of experience with the Hadoop ecosystem and cloud platforms (AWS/Azure)
Experience working with in-memory computing using R, Python, Spark, PySpark, Kafka, and Scala
Experience in parsing and shredding XML and JSON, shell scripting, and SQL
Experience working with the Hadoop ecosystem - HDFS, Hive
Experience working with the AWS ecosystem - S3, EMR, EC2, Lambda, CloudFormation, CloudWatch, SNS/SQS
Experience with Azure – Azure Data Factory (ADF)
Experience working with SQL and NoSQL databases
Experience designing and developing data sourcing routines using typical data quality functions: standardization, transformation, rationalization, linking, and matching (see the sketch after this list for an illustration)
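The sketch below is a minimal, hypothetical illustration of the kind of data quality routine this last requirement describes, written in PySpark since the listing names Spark and PySpark. The column names, sample records, and the email-based matching rule are assumptions for illustration, not part of the listing.

```python
# Hypothetical sketch: standardize raw records, then link/match them
# against a reference set. All data and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-quality-sketch").getOrCreate()

# Raw source records with inconsistent formatting (made-up data).
raw = spark.createDataFrame(
    [("  Jane DOE ", "jane.doe@EXAMPLE.com"), ("John Smith", "j.smith@example.com ")],
    ["name", "email"],
)

# Standardization: trim whitespace and normalize case.
standardized = raw.select(
    F.initcap(F.trim("name")).alias("name"),
    F.lower(F.trim("email")).alias("email"),
)

# Reference records to link against (made-up data).
reference = spark.createDataFrame(
    [("jane.doe@example.com", 1001), ("j.smith@example.com", 1002)],
    ["email", "customer_id"],
)

# Linking/matching: join on the standardized email key.
matched = standardized.join(reference, on="email", how="left")
matched.show(truncate=False)
```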