We are looking for a highly collaborative lead data engineer who is comfortable with ambiguity and change in a fast-paced environment, partnering closely with various teams to design, build, and maintain data systems and applications. The ideal candidate is a self-starter, highly organized, with strong problem-solving and learning skills, able to work with various teams to analyze and design data storage structures and build performant, automated data pipelines that are reliable and scalable in a fast-growing data ecosystem. They will leverage a strong understanding of data modeling principles and modern data platforms to properly design and implement data pipeline solutions, and will have experience using analytic SQL and working with traditional relational databases and/or distributed systems such as AWS S3, Hadoop/Hive, and Redshift. They will provide production support while adhering to defined SLAs. We expect strong technical skills and demonstrated attention to detail; a good understanding of data security, compliance, and policies in healthcare services; a demonstrated ability to test and validate according to industry best practices; the ability to communicate effectively with peers in technology teams; and strong interpersonal, written, and oral communication skills, with the ability to work with all levels of the organization. Prior experience in the healthcare industry is a plus. The estimated salary range for this position is $126,148.63 - $163,993.22 / year, depending on experience.
Job Responsibility:
Partner closely with various teams to design, build, and maintain data systems and applications
Analyze and design data storage structures and build performant automated data pipelines that are reliable and scalable
Leverage strong understanding of data modeling principles and modern data platforms to properly design and implement data pipeline solutions
Provide production support and adhere to defined SLAs
Test and validate according to industry best practices, and communicate effectively with peers in technology teams
Work with all levels of the organization
Requirements:
Bachelor's degree in computer science or a related field
Master's degree preferred
Epic certification in the Caboodle and Clarity data models highly preferred
Must possess strong understanding of Clarity Data Models and associated databases
Proven experience working with data warehouse architecture and solutions
At least 10 years of recent experience in data engineering and end-to-end automation of data pipelines with a Bachelor's degree, or at least 8 years with a Master's
Technical skills: data warehousing, strong SQL, and Python
Strong understanding of data science and business intelligence workflows
Programming experience, ideally in Python and SQL
Experience with large-scale data warehousing and analytics projects, including using AWS and GCP technologies
Proven track record of effective written communication, both technically deep and business-savvy
Experience in production support and troubleshooting
Hands-on experience with modern data platforms (Snowflake, Redshift, etc.) required
10+ years of hands-on data warehouse design/architecture experience with AWS and Data Vault 2.0
Minimum Required Experience: 10 Years
Nice to have:
Prior experience in the healthcare industry is a plus
Helpful to have: dimensional modeling; orchestration tools such as Airflow or Control-M; containerization of applications (Docker and Kubernetes); CloudFormation and Terraform; AWS services (S3, AWS Glue, Athena, Lake Formation, DynamoDB, DMS, RDS, etc.); Agile methodology and DevOps processes; and modern data integration tools (dbt, Informatica PowerCenter, AWS Glue, or similar)
3+ years of data modeling experience and familiarity with AI/ML and BI tools a plus