The Fabric Developer role involves designing and maintaining SQL-based data transformations and models, building Azure Data Factory (ADF) pipelines, and ensuring data quality across multiple source systems. Candidates should have strong SQL and ADF experience, a solid grounding in data architecture, and the ability to collaborate with business analysts. The position offers the opportunity to work on end-to-end data solutions in a dynamic environment.
Job Responsibility:
Design, develop, and maintain SQL-based data transformations and models
Build and manage ADF pipelines for ingesting data from multiple source systems
Develop and enhance data solutions using Microsoft Fabric components
Implement end-to-end data ingestion, transformation, and orchestration workflows
Ensure data quality, accuracy, and consistency across insurance data sources
Optimize SQL queries and pipelines for performance and scalability
Collaborate with business analysts and stakeholders on insurance reporting needs
Support data validation, reconciliation, and issue resolution activities
Participate in code reviews, deployments, and production support
Continuously improve the data architecture in line with enterprise and insurance standards
Shift time: 2–11 PM; all customer calls are video calls
Requirements:
Strong hands-on experience with SQL (T-SQL) for data analysis and transformations
Proven experience with Azure Data Factory (ADF) pipelines, triggers, and integrations
Practical knowledge of Microsoft Fabric (Lakehouse, Data Pipelines, Warehouses)
Experience building and optimizing ETL / ELT data pipelines
Solid understanding of data modeling, dimensional models, and schemas
Experience working with Azure Data Lake, OneLake, and cloud storage
Ability to handle large-scale data processing and performance tuning
Experience with Agile delivery models and DevOps practices
Strong analytical skills and ability to collaborate with data engineers and business teams
Nice to have:
Familiarity with insurance domain data (claims, policy, customer, financials)