Data Engineer (MS Fabric)

GT

Location:
United Kingdom

Contract Type:
Not provided

Salary:
Not provided
Job Description:

Dreams is implementing Microsoft Dynamics 365 Finance & Operations and building a modern data and reporting layer using Microsoft Fabric and Medallion architecture. The project focuses on accelerating the extraction and transformation of ERP data into clean, business-ready datasets that support operational and financial reporting, with Excel as the primary consumption tool. The team works in a pragmatic, delivery-focused environment, closely collaborating with business analysts and end users to ensure data is reliable, accessible, and valuable from day one.

Job Responsibility:

  • Extract data from Microsoft Dynamics 365 Finance & Operations into Microsoft Fabric
  • Work directly with D365 F&O tables, data entities, and the underlying data model
  • Understand how transactions flow through the system (sales orders, inventory, finance, procurement, etc.)
  • Interpret business logic embedded in F&O rather than relying solely on exported data
  • Troubleshoot issues where the root cause sits inside F&O rather than in Fabric
  • Collaborate with our F&O functional and technical teams using the correct terminology and system understanding
  • Build and maintain data pipelines using Medallion architecture (Bronze / Silver / Gold)
  • Transform transactional ERP data into business-ready datasets
  • Expose structured datasets natively for Excel consumption
  • Support creation of simple, column-based Excel reports
  • Work closely with business analysts and end users to understand reporting needs
  • Collaborate with Dreams’ internal team within their Azure environment
  • Ensure secure and reliable access to data
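The Bronze / Silver / Gold flow described above can be sketched in miniature. The sample rows, column names, and aggregation below are hypothetical stand-ins for a D365 F&O sales order extract, not the actual ERP schema; a real pipeline would land the Bronze layer from F&O data entities into Fabric rather than build it in memory.

```python
import pandas as pd

# Bronze: raw extract landed as-is (hypothetical rows standing in for
# D365 F&O sales order lines).
bronze = pd.DataFrame({
    "SALESID": ["SO-001", "SO-001", "SO-002", None],
    "ITEMID": ["A100", "B200", "A100", "C300"],
    "QTY": [5, 3, 2, 1],
    "LINEAMOUNT": [50.0, 45.0, 20.0, 10.0],
})

# Silver: cleaned and conformed -- drop rows missing the business key,
# normalise column names.
silver = bronze.dropna(subset=["SALESID"]).rename(columns=str.lower)

# Gold: business-ready aggregate, shaped as a flat, column-based table
# ready for Excel consumption.
gold = (
    silver.groupby("salesid", as_index=False)
          .agg(total_qty=("qty", "sum"), total_amount=("lineamount", "sum"))
)
print(gold)
```

In Fabric itself, each layer would typically be a Lakehouse table, with the Gold table surfaced to Excel via a SQL analytics endpoint or a semantic model.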

Requirements:

  • 5+ years of experience in data engineering, with a strong focus on Microsoft Azure–based data platforms
  • Hands-on experience with Microsoft Fabric and D365 Finance & Operations
  • Strong data engineering background (SQL, transformations, data modeling)
  • Experience working with transactional systems (ERP, finance, operations data)
  • Solid understanding of Medallion / layered data architecture
  • Experience exposing datasets for Excel-based consumption
  • Comfortable working in a business-facing environment
  • Strong English communication skills

Nice to have:

  • Broader Azure data platform experience (Synapse, ADF, etc.)
  • Power BI experience
  • Previous involvement in ERP implementations
  • Ability to travel to the UK for initial onboarding or workshops

Additional Information:

Job Posted:
February 18, 2026

Work Type:
Remote work

Similar Jobs for Data Engineer (MS Fabric)

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice

Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in programming languages such as SQL, Python, and PySpark is essential (R or Scala nice to have)
  • Very good communication skills, including the ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Principal Competitive Technical Marketing Engineer

The Competitive Technical Marketing Engineer (TME) position plays a vital role w...
Location:
United States, Roseville

Salary:
115500.00 - 266000.00 USD / Year

Hewlett Packard Enterprise

Expiration Date:
Until further notice

Requirements:
  • BS or MS in Computer Science, Information Systems, or a related field; hands-on experience building customer networks can substitute for a non-technical degree
  • 10+ years of experience
  • In-depth understanding of protocols such as MP-BGP, BGP, IS-IS, EVPN, VxLAN, OSPF, Multicast, MPLS, STP, VLANs, IPv4, IPv6, etc.
  • Strong understanding of Data Center technologies such as Data Center Fabric/Spine architecture, EVPN-VXLAN Fabric, Data Center Interconnect (DCI) Border, Secure DCI, Multi-tier network design, IP Fabric, IPv6, storage networking
  • Excellent communication skills and comfort with public speaking
  • Ability to translate complex technical concepts to understandable language to match the level of the audience
Job Responsibility:
  • Collaborate with technical experts across a range of HPE Aruba Networking products and functional areas, including but not limited to data center and L2/L3 switching protocols (STP, QoS, BGP, OSPF, TCP/IP, IPv4, IPv6, etc.) and other networking services relevant to data center networking solutions and deployments
  • Bring up network topologies and solutions of varying complexity and compare other vendors' solutions to HPE Aruba Networking in a lab environment
  • Present competitive sessions at HPE Aruba Networking events and webinars for field, partner, and R&D engineers
  • Generate technical collateral, including testing and comparison of HPE Aruba Networking against industry vendor solutions, competitive analysis reports, third-party testing, and sales collateral; assist in the development and delivery of competitive updates when required
  • Help manage and maintain lab equipment and inventory
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion

Work Type:
Full-time

Senior Azure Data Engineer with Fabric

Senior Azure Data Engineer with Microsoft Fabric Experience
Location:
United States, Newark

Salary:
130000.00 USD / Year

Realign

Expiration Date:
Until further notice

Requirements:
  • Minimum 10+ years of experience in modern data engineering, data warehousing, and data lake technologies on cloud platforms such as Azure and Microsoft Fabric
  • Azure experience is preferred over other cloud platforms
  • Minimum 10+ years of proven experience with SQL, schema design, and dimensional data modelling
  • Solid knowledge of data warehouse best practices, development standards, and methodologies
  • Experience with ETL/ELT tools such as ADF and MS Fabric, and data warehousing technologies such as Azure Synapse, Azure SQL, and ADLS
  • Strong experience with big data tools (Microsoft Fabric, Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
  • An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment
  • Knowledge in Microsoft Fabric Data Platform
  • Expertise in designing application solution on Azure Data Platform for complex business needs
  • Serves as key technical expert engaged on designing, architecting, deploying and maintaining cloud solutions, Data Management & Analytics
Job Responsibility:
  • Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function
  • Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications
  • Build and maintain code to manage data received in heterogeneous formats (binary, ASCII) from web-based sources, internal/external databases, and flat files
  • Help build the new enterprise data warehouse and maintain the existing one
  • Design and support effective storage and retrieval of very large internal and external data sets, thinking ahead to the convergence strategy with our Azure cloud migration
  • Assess the impact of scaling up and scaling out and ensure sustained data management and data delivery performance

Work Type:
Full-time

Data Architect

We are seeking a Principal‑level Data Architect with deep expertise in enterpris...
Location:
Canada

Salary:
180000.00 - 250000.00 CAD / Year

Valtech

Expiration Date:
Until further notice

Requirements:
  • 10+ years of experience in data engineering, data architecture, or platform engineering
  • Experience designing or building enterprise data platforms on at least one of Azure, GCP, or AWS and Databricks
  • Deep expertise in SQL, Python, distributed data processing, and cloud-native data design
  • Significant experience with medallion/lakehouse architecture patterns
  • Strong knowledge of modern data platforms: Databricks, Azure Synapse, Microsoft Fabric, Delta Lake, BigQuery, etc.
  • Proven experience leading architecture across large programs and multiple concurrent projects
  • Experience with enterprise automation and integration using REST APIs
  • Strong communication skills and ability to engage confidently with senior leadership and clients
  • Experience in pre-sales, technical solutioning, or client-facing architecture leadership
Job Responsibility:
  • Design and own complex, enterprise-scale data architectures across MS Fabric, Azure, GCP, AWS, or Databricks serverless or hosted environments
  • Define and enforce architectural standards, patterns, and governance frameworks across ingestion, modeling, lineage, security, and orchestration
  • Shape AI‑enabled architecture approaches, including data foundations for ML, feature engineering, and low-latency operationalization pipelines
  • Act as a principal advisor to client technical leadership, helping shape long-term strategy, roadmaps, and modernization initiatives
  • Lead architectural direction during pre-sales cycles, including solutioning, scoping, estimation, and executive-level presentations
  • Anticipate downstream impacts of architectural decisions and maintain ownership when delivery teams or constraints require deviation from the original design
  • Architect highly available, distributed, fault‑tolerant data pipelines supporting batch and streaming workloads
  • Oversee migration and integration of complex, diverse data sources into Fabric, Azure, GCP, or Databricks platforms
  • Define medallion/lakehouse modeling patterns across Bronze/Silver/Gold zones or cloud equivalents
What we offer:
  • A comprehensive insurance plan, where you can choose the module that best suits your needs—Gold, Silver, or Bronze. The employer may contribute up to 80% of your coverage depending on the selected module. This plan includes short- and long-term disability coverage
  • Dialogue via Sun Life provides virtual healthcare services, allowing you to consult with a healthcare professional for emergencies, prescription renewals, and more. You also have access to the Employee and Family Assistance Program, as well as a complete mental health support program
  • A $500 Personal Spending Account, which can be used for healthcare reimbursements, gym memberships, public transit passes, office supplies, or contributions to your RRSP through Valtech
  • A retirement plan where Valtech will match 100% of your RRSP contributions through a Deferred Profit Sharing Plan (DPSP), up to a maximum of 4%. You can start contributing to your RRSP immediately, and to the DPSP after 3 months. The DPSP vests after 24 months of service
  • Access to a flexible vacation under Valtech's policy to support your work-life balance, with 5 days available during your probation period and a prorated amount calculated for the remainder of the year
  • Personal Technology Reimbursement – $30/month for every employee-offered on day 1
  • We close during the winter holidays and offer flexible scheduling throughout the year, so you can enjoy those sunny Friday afternoons—provided your weekly hours are completed

Work Type:
Full-time

Head of Data, Automation & AI

Knovia Group, the UK’s leading apprenticeship provider, is on a bold mission to ...
Location:
United Kingdom

Salary:
90000.00 GBP / Year

Paragon Skills

Expiration Date:
Until further notice

Requirements:
  • Degree in computer science, data science, engineering, a related field or equivalent professional training/qualifications
  • Strong CPD record, keeping abreast of the latest in data architecture, governance, cloud data platforms, advanced analytics, connected systems, and the development of AI agents
  • Proven experience in data platform strategy, AI/ML enablement, or data transformation at scale
  • 5+ years' senior experience in a data scientist, data engineer, or developer role
  • 3+ years leading a function
  • Experience in a senior data, AI, or digital transformation leadership role
  • Track record of delivering enterprise-scale data infrastructure and AI/automation initiatives
  • Strong understanding of data architecture, governance, and cloud data platforms (e.g., Snowflake, Databricks, AWS/GCP, MS Fabric)
  • Deep expertise in cloud-based data architectures (e.g., AWS, Azure, or GCP), data engineering, and MLOps
  • Familiarity with tools like Databricks, Snowflake, MLflow, Airflow, dbt, and LLM technologies would be advantageous
Job Responsibility:
  • Lead the development of a modern data platform (data lake, warehouse, pipelines, BI suite)
  • Automate and integrate end-to-end business processes using tools like Workato
  • Develop and deploy AI agents to enhance operational efficiency and learner/employer experiences
  • Enhance our analytics capabilities to better understand and serve our customers
  • Shape our internal AI capability — from staff skills to leadership development
What we offer:
  • Generous Annual Leave: 21 days, increasing with length of service, plus a holiday purchase scheme
  • Holiday Benefits: 3 Knovia Days for our operational December closure and 8 Public Bank Holidays
  • Extra Day Off: Enjoy an additional day off to celebrate your birthday
  • Paid Volunteering Leave: Up to 3 days of paid leave for volunteering opportunities and corporate conscience initiatives
  • Perkbox: Access to a wide range of lifestyle benefits and wellness tools
  • Recognition and Long Service Awards: Celebrating the milestones and contributions of our colleagues

Work Type:
Full-time

MS Fabric Data Architect

The Microsoft Fabric Solution Architect will be responsible for designing and im...
Location:
United Kingdom, London

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • Minimum 7–10 years in data architecture, engineering, or consulting roles
  • 1-2 years in Microsoft Fabric solution architecture
  • University degree required
  • BSc/MSc in Computer Science, Data Engineering, or related field preferred
  • Microsoft Fabric certification(s) strongly desirable
  • Proven expertise in designing and implementing Microsoft Fabric architectures for enterprise clients
  • Strong consulting values with the ability to interface effectively with senior stakeholders
  • Hands-on experience with SQL, ETL/ELT pipelines, and BI/analytics tools
  • Deep understanding of the data lifecycle (ingestion, storage, processing, governance, consumption)
  • Ability to link business objectives with technology strategy and execution
Job Responsibility:
  • Client Engagement & Delivery
  • Solution Design & Implementation
  • Migration & Modernisation
  • Thought Leadership & Knowledge Sharing
  • Collaboration & Leadership
  • Guiding enterprise clients through their data platform modernisation journey
  • Leveraging Microsoft Fabric’s capabilities to enable data democratisation, advanced analytics, and AI-driven insights
  • Collaborating with client partners, business stakeholders, and technology teams to define Microsoft Fabric strategies, design robust architectures, and implement best practices
What we offer:
  • Range of tailored benefits that support physical, emotional, and financial wellbeing
  • Continuous growth and development opportunities
  • Flexible work options

Data Engineering Lead

We’re looking for a Lead Data Engineer who is ready to both lead and build. This...
Salary:
Not provided

5CA

Expiration Date:
Until further notice

Requirements:
  • Located in Albania, Bosnia and Herzegovina, Bulgaria, Greece, Hungary, Kosovo, North Macedonia, Poland, Romania, Serbia, South Africa, or Tunisia
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field
  • Minimum 7 years’ experience in data engineering
  • At least 2 years of leadership experience managing or mentoring engineers
  • Proven experience in designing, building, and maintaining scalable ETL/ELT pipelines and data platforms
  • Proficiency in SQL, Python, and data modelling techniques
  • Experience with cloud-based data architectures (e.g., Azure, AWS, or GCP)
  • Strong understanding of data governance, quality frameworks, and security best practices
  • Experience integrating data from multiple sources, including APIs and third-party systems
  • Familiarity with modern data tools and technologies (e.g., Databricks, Snowflake, Apache Spark, Airflow, Microsoft Fabric)
Job Responsibility:
  • Lead the strategy, design, and implementation of scalable, high-performance data pipelines, platforms, and architectures
  • Manage the Data Engineering team’s backlog, priorities, and capacity to ensure timely delivery of high-quality outputs
  • Collaborate with data scientists, analysts, and product teams to understand data requirements and deliver fit-for-purpose solutions
  • Oversee and contribute to the development of ETL/ELT processes, data models, and data integration frameworks
  • Ensure data quality, consistency, and governance across all systems and datasets
  • Monitor and optimise data infrastructure for performance, scalability, and cost efficiency
  • Define and enforce engineering best practices, coding standards, and security guidelines
  • Evaluate and adopt new technologies, tools, and frameworks that can improve data engineering capabilities
  • Provide technical leadership and mentorship to data engineers, fostering a high-performance culture
  • Troubleshoot complex data issues and drive root cause analysis and remediation
What we offer:
  • A position at a fast-paced international company with ambitious gaming, e-commerce, and tech clients
  • A diverse and inclusive culture with people from 80+ countries, speaking 25+ languages - where we celebrate everyone's uniqueness
  • Innovative digital tools, and continuous opportunities for learning and development
  • Various learning and career development initiatives throughout the year
  • Fun employee engagement activities and participation in 5CA employee-led communities such as 5CA Connect, Pride, 5CA Gamers, Women of 5CA, to name a few

Work Type:
Full-time

Data Architect

We are seeking a highly skilled Data Architect to join our Enterprise Transforma...
Location:
Bulgaria, Sofia

Salary:
Not provided

HotSchedules Corporate

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
  • 5+ years in data architecture, database administration, or data engineering, with a focus on enterprise and business systems
  • Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra)
  • Experience with cloud platforms (e.g., AWS, Azure, GCP – BigQuery, Redshift, Snowflake)
  • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, Informatica, Talend, Azure Data Factory, Synapse)
  • Strong knowledge of APIs and automation tools (e.g., Make, Zapier, MS Power Automate)
  • Familiarity with ERP, CRM, and HRIS integrations
  • Programming skills in Python, Java, or Scala
  • Deep understanding of data governance, master data management, and security/compliance (especially GDPR)
  • Excellent analytical, problem-solving, and communication skills
Job Responsibility:
  • Design, develop, and maintain the organization’s overall data architecture to support enterprise‑wide business applications, internal reporting, and analytics
  • Create and manage conceptual, logical, and physical data models for organizational data domains (HR, Finance, Sales, Operations, etc.)
  • Define and implement data governance policies, standards, and best practices across the enterprise
  • Oversee ETL/ELT processes and pipelines for integrating data from diverse business systems (ERP, CRM, HRIS, etc.)
  • Collaborate with internal stakeholders (business teams, IT, data engineers) to align data initiatives with organizational objectives
  • Optimize performance, cost, and scalability of data warehouses and internal reporting systems
  • Evaluate and recommend tools and platforms to enhance internal data and business application efficiency
  • Ensure compliance with GDPR and other relevant data security/privacy regulations
  • Responsible for the successful design and execution of data-related programs and projects
What we offer:
  • 25+ days off, as well as a birthday day off and 4 charity days off per year
  • Flexible start and end of the working day and a hybrid working mode, combining remote and in-office work
  • Team-centric atmosphere
  • Encouraging healthy lifestyle and work-life balance including supplemental health insurance
  • New parents bonus scheme

Work Type:
Full-time