The Senior Data Engineer role involves designing and implementing data solutions using technologies like Python, Java, and Snowflake. Candidates should have at least 4 years of experience in data engineering or analytics, with a preference for those with leadership experience. A degree is preferred, and strong communication skills are essential.
Job Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases, from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure
Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS
Develop and deliver detailed presentations to effectively communicate complex technical concepts
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process
Design, build, and deploy databases and data stores to support organizational requirements
Requirements:
4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects
2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions
Coding proficiency in languages such as Python, Java, and Scala
Undergraduate or Graduate degree preferred
Nice to have:
Production experience with core data platforms such as Snowflake, Databricks, or Azure
Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
Strong understanding of data integration technologies, including Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc
Professional written and verbal communication skills to effectively convey complex technical concepts