Job Description:
• Work as a Data Architect/Senior Data Engineer to design and develop cost-effective and reliable data solutions on a cloud platform such as Azure or AWS.
• Understand client requirements and convert them into technical solutions leveraging cloud capabilities and modern technologies.
• Contribute to the company's internal innovation projects by conducting proofs of concept and developing frameworks using state-of-the-art technologies.
• Prior experience developing data pipelines using PySpark, SQL, and Python.
• Good understanding of Snowflake and Azure cloud services such as Synapse, Azure Databricks, Azure Data Factory, and ADLS.
• Prior understanding of ETL and ELT concepts and principles, especially when migrating data from a legacy system to a modern cloud platform.
• Any experience in real-time data processing using PySpark, Python, and Kafka is an advantage.
• Work as an individual contributor, spending 70% of the time writing code in different languages, frameworks, and technology stacks.
• Familiarity with emerging technologies such as GenAI, Snowflake Cortex, and Databricks.
Job Responsibilities:
Work as a Data Architect/Senior Data Engineer to design and develop cost-effective and reliable data solutions on a cloud platform such as Azure or AWS
Work as an individual contributor, spending 70% of the time writing code in different languages, frameworks, and technology stacks
Requirements:
8+ years of experience, mainly on Azure/AWS cloud platforms
SQL
Python
PySpark
Snowflake
Should be able to understand client requirements and convert them into technical solutions leveraging cloud capabilities and modern technologies
Should be able to contribute to the company's internal innovation projects by conducting proofs of concept and developing frameworks using state-of-the-art technologies
Should have prior experience developing data pipelines using PySpark, SQL, and Python
Should have a good understanding of Snowflake and Azure cloud services such as Synapse, Azure Databricks, Azure Data Factory, and ADLS
Should have a prior understanding of ETL and ELT concepts and principles, especially when migrating data from a legacy system to a modern cloud platform
Should be familiar with emerging technologies such as GenAI, Snowflake Cortex, and Databricks
Nice to have:
Any experience in real-time data processing using PySpark, Python, and Kafka is an advantage