Data is integral to decision-making at Noora Health. However, data about our users and operations is scattered across the multiple platforms and applications used to run our interventions. We recently started integrating data from these various sources into a centralized warehouse. You will be responsible for extending and maintaining this central data system – including integrating new data sources, building robust pipelines and transformation models, and co-designing and implementing data governance policies across all the geographies we operate in.
Job Responsibilities:
Design, develop, and maintain scalable data pipelines, ETL processes, and data warehouse structures to ensure data quality and accessibility for analytics and reporting
Collaborate with the program delivery, program design, and platforms teams to gather requirements, develop data models, and design analytics solutions that address specific business needs
Implement data governance and security measures appropriate to each region we operate in
Support data analysts and monitoring teams to build dashboards and communicate challenges and insights back to the larger team
Manage our data warehouse and keep its databases performant for querying and dashboarding
Develop and maintain data visualizations and dashboards to track key performance indicators (KPIs) and monitor business trends
Support the engineering team in refining application schemas based on internal and external stakeholder requirements
Requirements:
Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field
4+ years of proven experience in data analysis, data engineering, data management, or an equivalent field
Knowledge of data warehousing, database design, and data modeling techniques
Strong proficiency in SQL, Python, and data manipulation languages (DML)
Hands-on experience with data pipeline and ELT/ETL tools (e.g., Fivetran, Airbyte, dbt)
Familiarity with cloud-based data warehouse and data processing platforms (e.g., BigQuery, Snowflake, Databricks)
Proficiency in data visualization and reporting tools (e.g., Tableau, Metabase, Superset, PowerBI)
Strong analytical, problem-solving, and communication skills, with the ability to translate complex data into actionable insights for non-technical stakeholders
Detail-oriented, self-motivated, and able to work independently as well as collaboratively within a team