
Azure Data & Microsoft Fabric Engineer


Wells Fargo


Location:
United States, Charlotte

Contract Type:
Not provided


Salary:

119000.00 - 224000.00 USD / Year

Job Description:

Wells Fargo is seeking an Azure Data & Microsoft Fabric Engineer within Commercial and Corporate & Investment Banking Technology (CCIBT) to build, optimize, and support modern data platforms across our Azure and Microsoft Fabric environments. This role focuses on end-to-end data engineering, from ingestion and transformation to analytics, governance, and deployment, to enable scalable, secure, and high-performance data solutions.

Job Responsibility:

  • Lead complex technology initiatives including those that are companywide with broad impact
  • Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines
  • Design, code, test, debug, and document for projects and programs
  • Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors
  • Make decisions in developing standard and companywide best practices for engineering and technology solutions requiring understanding of industry best practices and new technologies, influencing and leading technology team to meet deliverables and drive new initiatives
  • Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals
  • Lead projects, teams, or serve as a peer mentor

Requirements:

  • 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • 4+ years of experience with Azure Data Factory, Azure Data Lake Storage (ADLS), notebooks, SQL analytics endpoints, and Synapse
  • 4+ years of experience with Microsoft Fabric (Lakehouse, Warehouse, Pipelines, Dataflows)
  • 4+ years of experience with Power BI semantic modeling and report performance optimization
  • 4+ years of experience with SQL, Python, and PySpark
  • 4+ years of experience with Delta Lake, medallion architecture, and data modeling
  • 4+ years of experience with Git and CI/CD workflows

Nice to have:

  • Build ETL/ELT pipelines using Azure Data Factory, Synapse, or Fabric Pipelines
  • Develop scalable data models using ADLS, PySpark / Spark SQL, Fabric Lakehouse & Warehouse
  • Implement secure data architectures using Role-Based Access Control (RBAC), Key Vault, and Private Endpoints
  • Build and maintain Fabric workloads: Lakehouse, Warehouse, Dataflows, Notebooks, Semantic Models
  • Design and integrate semantic models with Power BI for enterprise reporting
  • Optimize performance using DirectLake, partitioning strategies, and query tuning where appropriate
  • Implement CI/CD for data assets using Azure DevOps/GitHub
  • Optimize cost and performance across Azure and Fabric environments

What we offer:
  • Health benefits
  • 401(k) Plan
  • Paid time off
  • Disability benefits
  • Life insurance, critical illness insurance, and accident insurance
  • Parental leave
  • Critical caregiving leave
  • Discounts and savings
  • Commuter benefits
  • Tuition reimbursement
  • Scholarships for dependent children
  • Adoption reimbursement

Additional Information:

Job Posted:
May 04, 2026

Expiration:
May 26, 2026

Employment Type:
Full-time
Work Type:
Hybrid work


Similar Jobs for Azure Data & Microsoft Fabric Engineer

Azure Data Factory Engineer

Join our vibrant client company team as a Data Engineer! Harness your passion fo...
Location:
United States, Des Moines
Salary:
120000.00 USD / Year
Robert Half
Expiration Date
Until further notice
Requirements:
  • Bachelor's in IT or related field
  • Adept at Python, SQL, and Microsoft Fabric tools
  • Experience with Microsoft SQL trinity - Server, Azure SQL, and SSIS
Job Responsibility:
  • Dive into creative collaborations with data scientists, analysts, and stakeholders
  • Bring to life scalable data pipelines using cutting-edge tools: Azure Data Factory, PySpark, and Spark SQL
  • Be the data whisperer integrating diverse data streams into a powerful, unified platform
  • Showcase your governance skills in managing cutting-edge data storage solutions
  • Keep pulse on the heartbeat of industry trends, injecting fresh tech ideas
What we offer:
  • Competitive salary
  • Exceptional benefits
  • Free online training
  • Medical, vision, dental, and life and disability insurance
  • 401(k) plan

Senior Data Engineer

We are currently looking for a Data Engineer to join our client’s forward-thinki...
Location:
United Kingdom, London
Salary:
70000.00 - 75000.00 GBP / Year
Data Idols
Expiration Date
Until further notice
Requirements:
  • Proven experience leading data engineering initiatives in a cloud environment
  • Expertise in Azure Synapse Analytics (SQL pools, Spark, pipelines, serverless SQL)
  • Hands-on experience with Microsoft Fabric (OneLake, Lakehouse, Data Engineering/Data Science workloads)
  • Strong SQL skills (T-SQL, Synapse SQL, Lakehouse SQL)
  • Experience with Python and/or Spark for big data processing
  • Knowledge of cloud data architecture (ADLS Gen2, Delta Lake, Parquet)
  • Understanding of data governance, security, and compliance standards
Job Responsibility:
  • Lead and mentor internal and external data engineering teams to deliver best-in-class cloud solutions
  • Design, build, and optimise enterprise data platforms using Azure Synapse, Microsoft Fabric, OneLake, and lakehouse architectures
  • Modernise traditional ETL/ELT processes and transition workloads to cloud-native tools
  • Develop scalable data pipelines, Spark-based processing, and dataflows for unified analytics
  • Implement governance, security, and compliance standards across all cloud data environments
  • Collaborate with stakeholders to support high-value analytics and Power BI initiatives
  • Produce documentation, technical designs, and support materials to enable continuous improvement
What we offer:
  • Competitive salary
  • Strong benefits package
  • Hybrid working model
  • Opportunity to own and shape a modern cloud data ecosystem
  • Continuous training and professional development in Microsoft Fabric, Azure, and modern data engineering
  • Clear path for career progression within a growing data organisation
  • Supportive, innovative, and people-focused working environment

Azure Data Engineer

At LeverX, we have had the privilege of delivering over 1,500 projects for vario...
Location:
Uzbekistan, Georgia
Salary:
Not provided
LeverX
Expiration Date
Until further notice
Requirements:
  • 5+ years of experience as a Data Engineer with strong expertise in Azure services (e.g., Azure Data Factory, Azure SQL Database, Azure Synapse, Microsoft Fabric, and Azure Cosmos DB)
  • Advanced SQL skills, including complex query development, optimization, and troubleshooting
  • Strong knowledge of indexing, partitioning, and query execution plans to ensure scalability and performance
  • Proven expertise in database modeling, schema design, and normalization/denormalization strategies
  • Ability to design and optimize data architectures to support both transactional and analytical workloads
  • Proficiency in at least one programming language such as Python, C#, or Scala
  • Strong background in cloud-based data storage and processing (e.g., Azure Data Lake, Databricks, or equivalent) and data warehouse platforms (e.g., Snowflake)
  • English B2+
Job Responsibility:
  • Design, develop, and maintain efficient and scalable data architectures and workflows
  • Build and optimize SQL-based solutions for data transformation, extraction, and loading (ETL) processes
  • Collaborate closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver effective solutions
  • Manage and optimize data storage platforms, including databases, data lakes, and data warehouses
  • Troubleshoot and resolve data-related issues, ensuring accuracy, integrity, and performance across all systems
What we offer:
  • Projects in different domains: healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development initiatives, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay with the company for 4+ years
  • Market-based compensation and regular performance reviews
  • Internal expert communities and courses
  • Perks to support your growth and well-being

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure AI Foundry
Job Responsibility:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud-based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events

Data Engineer

At ANS, the Data Engineer plays a vital role in enabling data-driven decision-ma...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Basic understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of Lakehouse architecture, data warehousing principles, and data modelling
Job Responsibility:
  • Deliver high-quality data solutions by building and optimising data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud-based data sources
  • Work closely with Data Architects and Senior Data Engineers to implement technical designs and contribute to solution development
  • Collaborate with customer-side data engineers to ensure smooth integration and alignment with business requirements
  • Focus on task execution and delivery, ensuring timelines and quality standards are met
  • Follow engineering best practices including CI/CD via Azure DevOps, secure data handling using Key Vault and private endpoints, and maintain code quality
  • Participate in Agile ceremonies such as standups, sprint planning, and user story refinement
  • Document solutions clearly for internal use and knowledge sharing
  • Troubleshoot and resolve technical issues across environments and subscriptions
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
  • Contribute to the Data Engineer Guild by sharing knowledge, participating in discussions, and helping shape engineering standards and practices
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • Extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events

Technical Leader - Data Engineering

The position involves leading a team of Data Engineers and Developers, managing ...
Location:
Poland, Warsaw
Salary:
Not provided
Inetum
Expiration Date
Until further notice
Requirements:
  • Knowledge of Microsoft Azure and/or Google Cloud environments, particularly in the field of AI & Data solutions
  • Experience in building ETL/ELT processes
  • Knowledge of cloud services in the field of Data Engineering (Microsoft Fabric / Databricks / Snowflake)
  • Programming skills and knowledge of SQL / Python
  • Presales skills, i.e. preparing preliminary PoCs and presenting them to customers
  • Knowledge of English at B2+ level
Job Responsibility:
  • Lead end-to-end delivery for multi-team engagements, converting strategy into executable roadmaps with clear milestones, risk controls, and timely decisions
  • Build high-performing teams through coaching and firm standards; resolve conflicts early and develop successors to mitigate key-person risk
  • Serve as a trusted client counterpart: facilitate outcome-oriented workshops, align stakeholders, negotiate scope/change, and maintain a value-prioritized backlog
  • Drive commercial outcomes by shaping proposals/SOWs, producing defensible estimates, tracking margin/utilization/forecast accuracy and identifying compliant, value-accretive expansions
  • Provide architectural leadership and governance, documenting trade-offs and aligning with security, compliance, cost, and operability
  • Lead team of Data Engineers and Developers
  • Technical verification of Candidates
What we offer:
  • Flexible working hours
  • Hybrid work model, allowing employees to divide their time between home and modern offices in key Polish cities
  • A cafeteria system that allows employees to personalize benefits by choosing from a variety of options
  • Generous referral bonuses, offering up to PLN 6,000 for referring specialists
  • Dedicated team-building budget for online and on-site team events
  • Opportunities to participate in charitable initiatives and local sports programs
  • A supportive and inclusive work culture with an emphasis on diversity and mutual respect

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location:
United States, Brooklyn
Salary:
Not provided
Premium Health
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles; healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR, etc.), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)
Job Responsibility:
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications
What we offer:
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)

Senior Data Engineer

At Blue Margin, we are on a mission to build the go-to data platform for PE-back...
Location:
United States, Fort Collins
Salary:
110000.00 - 140000.00 USD / Year
Blue Margin
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills; ability to convey complex technical details to both engineers and business stakeholders
Job Responsibility:
  • Architect, design, and optimize large-scale data pipelines using tools like PySpark, SparkSQL, Delta Lake, and cloud-native tools
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes
What we offer:
  • Competitive pay
  • Strong benefits
  • Flexible hybrid work setup