Role summary: design, develop, and implement scalable batch/real-time data pipelines (ETLs) that integrate data from a variety of sources into the Data Warehouse and Data Lake. Requires proficiency in Data Lake and Data Warehouse concepts and dimensional data modeling.
Job Responsibilities:
Design, develop, and implement scalable batch/real-time data pipelines (ETLs) to integrate data from a variety of sources into the Data Warehouse and Data Lake
Design and implement data model changes that align with warehouse dimensional modeling standards
Responsible for maintenance and support of all database environments; design and develop data pipelines, workflows, and ETL solutions in both on-prem and cloud-based environments
Design and develop SQL stored procedures, functions, views, and triggers
Design, code, test, document and troubleshoot deliverables
Collaborate with others to test and resolve issues with deliverables
Maintain awareness of and ensure adherence to Zelis standards regarding privacy
Create and maintain Design documents, Source to Target mappings, unit test cases, data seeding
Perform data analysis and data quality tests, and create audits for the ETLs
Perform continuous integration and deployment (CI/CD) using Azure DevOps and Git
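The batch ETL work described above can be sketched as a minimal extract-transform-load pipeline. This is an illustrative example only, not part of the posting: it uses Python with an in-memory SQLite database as a hypothetical stand-in for the source system and warehouse, and the table names (`raw_orders`, `fact_orders`) and columns are assumptions.

```python
import sqlite3


def extract(conn):
    """Extract raw order rows from the (hypothetical) source table."""
    return conn.execute(
        "SELECT order_id, customer, amount_cents FROM raw_orders"
    ).fetchall()


def transform(rows):
    """Normalize customer names and convert cents to dollars."""
    return [
        (order_id, customer.strip().title(), amount_cents / 100.0)
        for order_id, customer, amount_cents in rows
    ]


def load(conn, rows):
    """Idempotent load into the warehouse fact table (upsert by key)."""
    conn.executemany(
        "INSERT OR REPLACE INTO fact_orders (order_id, customer, amount_usd) "
        "VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()


def run_pipeline(conn):
    """Run one batch: extract -> transform -> load."""
    load(conn, transform(extract(conn)))


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount_cents INTEGER)"
    )
    conn.execute(
        "CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY, customer TEXT, amount_usd REAL)"
    )
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "  alice ", 1250), (2, "BOB", 300)],
    )
    run_pipeline(conn)
    print(conn.execute("SELECT * FROM fact_orders ORDER BY order_id").fetchall())
    # -> [(1, 'Alice', 12.5), (2, 'Bob', 3.0)]
```

Because the load is keyed on `order_id`, re-running the batch replaces rather than duplicates rows, which is the usual goal for restartable batch ETLs.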
Requirements:
3+ Years Microsoft BI Stack (SSIS, SSRS, SSAS)
3+ Years data engineering experience to include data analysis
3+ years programming SQL objects (procedures, triggers, views, functions) in SQL Server
Experience optimizing SQL queries
Advanced understanding of T-SQL, indexes, stored procedures, triggers, functions, views, etc.
Experience designing and implementing a Data Warehouse
Working Knowledge of Azure/AWS Architecture, Data Lake
Must be detail oriented
Must work under limited supervision
Must demonstrate strong analytical skills related to data identification and mapping, and excellent oral communication skills
Must be flexible, able to multi-task, and able to work within deadlines
Must be team-oriented, but also able to work independently
Nice to have:
Experience working with an ETL tool (DBT preferred)
Working experience designing and developing data pipelines in Azure Data Factory or equivalent AWS services
Working understanding of columnar MPP cloud data warehouses such as Snowflake
Working knowledge of managing data in the Data Lake
Business analysis experience: analyzing data to inform code and drive solutions
Working knowledge of Git, Azure DevOps, Agile, Jira, and Confluence