Beacon Hill Technologies is seeking an experienced Fabric Developer (Data Engineer) for our client. This role is responsible for designing, building, and maintaining scalable, end-to-end data solutions within the Microsoft Azure ecosystem, with a strong focus on Microsoft Fabric. It supports modern data platform initiatives by developing robust ETL/ELT pipelines, implementing Medallion and Lakehouse architectures, and enabling analytics and reporting through Power BI. The Data Engineer partners with cross-functional teams to ensure data quality, security, governance, and performance, leveraging strong SQL, data modeling, and programming skills to deliver reliable, production-ready data solutions.
Job Responsibilities:
Designing, building, and maintaining scalable, end-to-end data solutions within the Microsoft Azure ecosystem, with a strong focus on Microsoft Fabric
Supporting modern data platform initiatives by developing robust ETL/ELT pipelines
Implementing Medallion and Lakehouse architectures
Enabling analytics and reporting through Power BI
Partnering with cross-functional teams to ensure data quality, security, governance, and performance while leveraging strong SQL, data modeling, and programming skills to deliver reliable, production-ready data solutions
Requirements:
5+ years of experience as a Data Engineer with proven experience delivering production-ready solutions
Demonstrated hands-on experience designing and implementing end-to-end data solutions using Microsoft Fabric, including strong knowledge of Medallion architecture
Advanced proficiency in SQL with deep expertise in data modeling techniques such as star schema and dimensional modeling
Experience developing and maintaining robust ETL/ELT pipelines using Python or Scala for large-scale data processing
Experience with PySpark
Solid understanding of Lakehouse architecture, distributed data processing systems, and modern data platform design
Experience working across the Microsoft Azure data ecosystem, including ADLS, Azure Data Factory, Synapse, and Fabric
Hands-on experience integrating data platforms with Power BI, including semantic model development
Strong understanding of data governance, security, and compliance best practices, including role-based access control (RBAC) and data masking
Experience with version control and DevOps practices, including Git and CI/CD pipelines using Azure DevOps or similar tools
Familiarity with monitoring, logging, and observability tools for managing data platforms
Experience working with Delta Lake and Parquet data formats
Bachelor's degree in Business Administration, IT, or a related field
Nice to have:
Master's degree in Business Administration, IT, or a related field