An expert with 4-5 years of experience in the Hadoop ecosystem and cloud platforms (AWS/Azure), relational data stores, data integration techniques, XML, Python, Spark, and ETL.
Requirements:
4-5 years of experience with the Hadoop ecosystem and cloud platforms (AWS/Azure)
Experience working with in-memory computing using R, Python, Spark, PySpark, Kafka, and Scala
Experience in parsing and shredding XML and JSON, shell scripting, and SQL (see the first sketch after this list)
Experience working with the Hadoop ecosystem (HDFS, Hive)
Experience working with the AWS ecosystem (S3, EMR, EC2, Lambda, CloudFormation, CloudWatch, SNS/SQS)
Experience with Azure Data Factory (ADF)
Experience working with SQL and NoSQL databases
Experience designing and developing data sourcing routines that apply typical data quality functions: standardization, transformation, rationalization, linking, and matching (see the second sketch below)
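
To illustrate the JSON-shredding requirement above, here is a minimal PySpark sketch that flattens nested JSON into tabular columns. The input path and the field names (order_id, customer, items, sku, qty) are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: "shredding" nested JSON into flat columns with PySpark.
# Path and schema below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("json-shredding").getOrCreate()

# Read newline-delimited JSON; Spark infers the nested schema.
orders = spark.read.json("s3://example-bucket/orders/")  # hypothetical path

# Promote nested struct fields to top-level columns and explode an
# array of line items into one row per item.
flat = (
    orders
    .select(
        col("order_id"),
        col("customer.name").alias("customer_name"),
        explode(col("items")).alias("item"),
    )
    .select("order_id", "customer_name", "item.sku", "item.qty")
)

flat.show()
```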
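For the data quality requirement, a minimal plain-Python sketch of a standardize-then-match (record linkage) routine; the record fields and the matching rule are illustrative assumptions, not part of the role description.

```python
# Minimal sketch: standardize names, then link records that agree on the
# standardized name and postal code. Fields and rule are hypothetical.
import re

def standardize(name: str) -> str:
    """Upper-case, strip punctuation, and collapse whitespace."""
    name = re.sub(r"[^\w\s]", "", name.upper())
    return re.sub(r"\s+", " ", name).strip()

def match(a: dict, b: dict) -> bool:
    """Link two records when standardized name and zip code agree."""
    return (
        standardize(a["name"]) == standardize(b["name"])
        and a["zip"] == b["zip"]
    )

source = {"name": "Acme, Inc. ", "zip": "10001"}
target = {"name": "ACME INC", "zip": "10001"}
print(match(source, target))  # True: records link after standardization
```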