We are looking for an experienced Analytics / ETL Developer with strong hands-on expertise in the Hadoop ecosystem and Python to join our team. In this role, you will design, develop, and optimize scalable data integration and processing pipelines that power analytics, business intelligence, and operational reporting. Success in this position requires solid ETL fundamentals, strong SQL proficiency, and the ability to thrive in a fast-paced, collaborative environment. You will partner closely with data architects, analysts, and business stakeholders to deliver high-quality, reliable, and efficient data solutions. If you are passionate about building robust data workflows, solving complex data engineering challenges, and driving improvements in data quality and automation, we would be excited to connect with you.
Job Responsibility:
Design, develop, and optimize scalable data integration and processing pipelines that power analytics, business intelligence, and operational reporting
Partner closely with data architects, analysts, and business stakeholders to deliver high-quality, reliable, and efficient data solutions
Requirements:
Spark
Scala
JSON
Strong SQL skills (incl. ability to write stored procedures, CTEs, and nested and complex joins)
Linux shell scripting
Python
FHIR repository experience
Healthcare background
Understanding of HIPAA/SOC2/HITRUST regulations
General understanding of the various healthcare domains relevant to data warehousing and analytics
Data warehousing concepts
Knowledge of project management methodologies and the SDLC (System Development Life Cycle), with strength in Waterfall and Rally
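To illustrate the SQL proficiency the role calls for, here is a minimal, self-contained sketch in Python using an in-memory SQLite database. The `members` and `claims` tables and all column names are hypothetical, invented purely for this example; it demonstrates a CTE that aggregates one table and a join back to another, the kind of pattern listed in the requirements (stored procedures are not shown, as SQLite does not support them).

```python
import sqlite3

# Hypothetical healthcare-flavored schema, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE members (member_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE claims  (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL);
INSERT INTO members VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO claims  VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# CTE aggregates claims per member; the outer query joins back to member details.
query = """
WITH claim_totals AS (
    SELECT member_id, SUM(amount) AS total_amount, COUNT(*) AS n_claims
    FROM claims
    GROUP BY member_id
)
SELECT m.name, t.total_amount, t.n_claims
FROM members m
JOIN claim_totals t ON t.member_id = m.member_id
ORDER BY m.name;
"""
rows = cur.execute(query).fetchall()
print(rows)  # [('Ada', 200.0, 2), ('Grace', 50.0, 1)]
conn.close()
```

In production, the same CTE-plus-join pattern carries over directly to Spark SQL or a warehouse engine; only the connection layer changes.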