The Intermediate Data Engineer role involves designing and implementing data solutions using programming languages such as Python and Java. Candidates should have at least 4 years of experience in data engineering, with proficiency in Snowflake and AWS. An undergraduate or graduate degree is preferred. This position offers the opportunity to work with cutting-edge technologies in a collaborative environment.
Job Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases, spanning streaming, data lakes, analytics, and beyond, within a dynamically evolving technical stack (see the streaming sketch after this list)
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure
Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS
Develop and deliver detailed presentations to effectively communicate complex technical concepts
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process
Design, build, and deploy databases and data stores to support organizational requirements
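To ground the streaming-to-lake responsibility above, here is a minimal PySpark sketch of a Spark Structured Streaming job that reads events from a Kafka topic and lands them in an S3-backed data lake as Parquet. The broker address, topic name, and bucket paths are illustrative placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires the spark-sql-kafka package to be supplied at spark-submit time.
spark = SparkSession.builder.appName("events-to-lake").getOrCreate()

# Read the raw event stream from Kafka (broker and topic are placeholders).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers the payload as binary; cast it to a string for downstream parsing.
payloads = events.select(col("value").cast("string").alias("payload"))

# Land the stream in the lake as Parquet; the checkpoint directory lets Spark
# resume after a failure without rewriting data it has already committed.
query = (
    payloads.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/raw/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .start()
)
query.awaitTermination()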
Requirements:
4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects
2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions
Undergraduate or graduate degree preferred
Strong coding proficiency in languages such as Python, Java, and Scala
Experience collaborating across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS
Professional written and verbal communication skills
Nice to have:
Production experience in core data platforms such as Snowflake, Databricks, or Azure (see the connection sketch after this list)
Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems
Strong understanding of data integration technologies, including Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc
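As a concrete illustration of the Snowflake experience called out above, here is a minimal sketch using the snowflake-connector-python package. The account, credentials, warehouse, database, and table names are all placeholders, not details from this posting; real credentials should come from a secrets manager, not source code.

import snowflake.connector

# All connection parameters below are placeholders.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Sanity-check query against a hypothetical events table.
    cur.execute("SELECT COUNT(*) FROM events")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()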