The Snowflake Developer will be responsible for designing, developing, and maintaining Snowflake environments that support data storage, processing, and analytics for BlackRock’s Global Infrastructure Partners (GIP). This role plays a critical part in building and managing scalable data architectures, robust data models, and automated data pipelines that enable efficient and reliable integration, transformation, and delivery of infrastructure fund data. The developer will work closely with cross-functional teams to translate business initiatives into technical solutions, ensuring data quality, integrity, and performance across GIP and BlackRock systems.
Job Responsibilities:
Manage and maintain GIP Snowflake accounts, including performance monitoring, cost optimization, issue resolution, and backups
Partner with BlackRock’s Snowflake infrastructure team to implement guidelines, coordinate upgrades, and integrate enterprise-wide features
Ensure compliance with data governance, security, and audit requirements
Design and develop automated data pipelines using Fivetran and other ETL tools, integrating data from GIP systems, BlackRock warehouses, APIs, and external sources
Implement robust data validation, cleaning, and transformation processes to ensure data quality, accuracy, and reliability
Monitor and tune data pipelines for performance, scalability, and cost efficiency
Design and implement logical and physical data models to support analytical and reporting needs for GIP and BlackRock partners
Create ER diagrams, data flow diagrams, and other visualizations to represent data models and workflows
Develop and enforce data modeling guidelines and standards
Optimize data models for query performance, ensuring efficient data retrieval across large datasets
Partner with business and GIP technology teams to translate business requirements into technical solutions
Train and support end users on Snowflake capabilities, data access, and reporting tools
Collaborate with vendors and cross-functional teams to troubleshoot issues, deliver enhancements, and implement new features
Requirements:
Bachelor’s degree in Computer Science, Data Science, Software Engineering, Information Systems, or related quantitative field (Master’s degree or equivalent experience preferred)
5–7 years of experience in data engineering, including data integration, modeling, optimization, and data quality
Demonstrable experience designing, developing, and maintaining data warehouses using Snowflake in a production environment
Strong SQL skills for querying, data manipulation, and performance optimization
Proficiency with modern ETL/ELT tools (e.g., Fivetran, dbt, Informatica, Talend)
Expertise in data modeling concepts (relational, dimensional, and star schema)
Proficiency in Python, Java, or Scala for data engineering tasks
Knowledge of cloud services (AWS, Azure, GCP) and data integration within cloud ecosystems
Strong problem-solving and debugging skills, with the ability to identify and resolve performance bottlenecks
Excellent communication skills, with the ability to collaborate effectively across technical and business teams
Familiarity with data governance, security, and compliance standards
Nice to have:
Experience working in financial services or private markets