The engineer will be part of the datastore-migration Factory team, which is responsible for performing the end-to-end datastore migration from the on-prem Data Lake to an AWS-hosted Lakehouse. This is a high-visibility and crucial project for Goldman Sachs.
Job Responsibilities:
Pipeline migration
Logic & Scheduling: refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment
Data Transfer: executing the physical migration of underlying datasets while ensuring data integrity
Stakeholder Engagement: acting as a technical liaison to internal clients, facilitating handoff and sign-off with data owners
Code Conversion: translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg (see the conversion sketch after this list)
Usage Analysis: analyzing usage to deliver the required data products
Data Reconciliation & Quality: working with reconciliation frameworks to build confidence that migrated data is functionally equivalent (see the reconciliation sketch after this list)
Working with other internal data management platforms
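To make the Code Conversion responsibility concrete, here is a minimal illustrative sketch of the kind of dialect-level rewrite involved when moving legacy HiveQL consumption queries to Snowflake. The specific functions and table name are examples chosen for illustration, not drawn from the posting, and real converters operate on a parsed SQL AST rather than string replacement:

```python
# Illustrative HiveQL-to-Snowflake rewrites (hypothetical examples, not the
# team's actual conversion rules).
LEGACY_TO_SNOWFLAKE = {
    # JSON extraction: Hive's get_json_object vs Snowflake's VARIANT path syntax
    "get_json_object(payload, '$.id')": "PARSE_JSON(payload):id::STRING",
    # Epoch seconds to timestamp
    "from_unixtime(event_ts)": "TO_TIMESTAMP(event_ts)",
    # Array aggregation
    "collect_list(symbol)": "ARRAY_AGG(symbol)",
}

def convert(sql: str) -> str:
    """Naive string-level rewrite; production converters parse the SQL AST."""
    for legacy, modern in LEGACY_TO_SNOWFLAKE.items():
        sql = sql.replace(legacy, modern)
    return sql

print(convert("SELECT get_json_object(payload, '$.id') FROM trades"))
```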
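Similarly, for Data Reconciliation & Quality, a minimal PySpark sketch of the basic checks a reconciliation framework builds on is shown below. The table names and SparkSession setup are hypothetical; production frameworks add column-level hashing, tolerance rules, and reporting on top of checks like these:

```python
# Minimal reconciliation sketch: row-count and content parity between a
# legacy table and its migrated copy (table names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("migration-recon").getOrCreate()

legacy = spark.read.table("legacy_hive.trades")    # hypothetical on-prem Hive table
migrated = spark.read.table("lakehouse.trades")    # hypothetical Iceberg table

# 1. Row-count parity: the cheapest signal that the full dataset landed.
assert legacy.count() == migrated.count(), "row counts diverge"

# 2. Content parity: the difference (with duplicates preserved) should be
#    empty if the datasets are functionally equivalent; align column order first.
cols = sorted(legacy.columns)
diff = legacy.select(cols).exceptAll(migrated.select(cols))
print(f"rows present in legacy but not in migrated: {diff.count()}")
```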
Requirements:
Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
Minimum of 3-5 years of professional hands-on coding experience in a collaborative, team-based environment
Ability to troubleshoot SQL, plus basic scripting experience
Professional proficiency in Python or Java
Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices, plus Kubernetes (K8s) deployment experience
Sophisticated understanding of Temporal Data Modeling, Schema Management, Performance Optimization, and Architectural Theory
Experience with Kafka, ANSI SQL, FTP, and Apache Spark
Data formats: JSON, Avro, Parquet
Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ