
Databricks Engineer – Azure Fabric


Wissen

Location:
India, Bangalore South

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a highly skilled Databricks Engineer – Azure Fabric with 6–8 years of experience in data engineering to design, build, and maintain scalable data platforms on Microsoft Fabric and Azure. The ideal candidate will have strong hands‑on experience with Python, SQL, Microsoft Fabric (OneLake, Lakehouse, Data Factory), and Delta Lake, along with the ownership mindset to deliver regulatory‑grade, enterprise data solutions. This role involves close collaboration with global engineering, data, compliance, and business teams and supports advanced analytics and AI‑enabled data products.

Job Responsibility:

  • Design, build, and maintain scalable, distributed, and fault‑tolerant data pipelines on Microsoft Fabric
  • Develop lakehouse architectures using OneLake and Delta Lake, including incremental merge workflows and Change Data Feed
  • Build pipelines to ingest, normalize, transform, and publish large volumes of financial market data
  • Design and implement bitemporal data models (valid‑time and system‑time) for regulatory‑grade time‑series datasets
  • Participate in cross‑functional discussions with engineering, compliance, research, and business stakeholders globally
  • Build and maintain testing frameworks (unit, regression, UAT) for data pipelines and transformations
  • Own end‑to‑end delivery of solutions, including ingestion pipelines, QA processes, correction handling, and audit trails
  • Collaborate on shared platform services and reusable components instead of siloed implementations
  • Apply business understanding of financial reference data (equities and other asset classes)
  • Support AI enablement use cases such as AI‑assisted ingestion, anomaly detection, and semantic search over lakehouse data
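The bitemporal modeling responsibility above can be illustrated with a minimal sketch. The `PriceRecord` layout and `as_of` helper are hypothetical names for illustration only, not part of the role's actual stack; in practice this would be modeled as Delta Lake tables carrying valid-time and system-time interval columns, with the same point-in-time filter expressed in SQL.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical layout: each row carries a valid-time interval (when the
# fact was true in the real world) and a system-time interval (when the
# database believed it). Open-ended intervals use date.max.
@dataclass(frozen=True)
class PriceRecord:
    ticker: str
    price: float
    valid_from: date
    valid_to: date
    recorded_from: date
    recorded_to: date

def as_of(records, valid_time: date, system_time: date):
    """Rows true at `valid_time` according to what the system
    knew at `system_time` (a bitemporal point query)."""
    return [
        r for r in records
        if r.valid_from <= valid_time < r.valid_to
        and r.recorded_from <= system_time < r.recorded_to
    ]

# A correction: the January price for ACME was restated on Feb 10,
# closing the system-time interval of the superseded row.
history = [
    PriceRecord("ACME", 100.0, date(2025, 1, 1), date(2025, 2, 1),
                date(2025, 1, 1), date(2025, 2, 10)),   # superseded belief
    PriceRecord("ACME", 101.5, date(2025, 1, 1), date(2025, 2, 1),
                date(2025, 2, 10), date.max),           # corrected belief
]

# What did we believe on Feb 1 vs. Feb 15 about the Jan 15 price?
before = as_of(history, date(2025, 1, 15), date(2025, 2, 1))
after = as_of(history, date(2025, 1, 15), date(2025, 2, 15))
```

Because corrections close the old system-time interval instead of overwriting the row, the audit trail mentioned above falls out for free: any historical "as known on" view remains reproducible.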

Requirements:

  • 6–8 years of experience in data engineering
  • Strong proficiency in Python for data pipelines, transformations, and automation
  • Advanced SQL skills including window functions, partitioning, and time‑series query patterns
  • Hands‑on experience with Microsoft Fabric: OneLake, Fabric Data Factory pipelines, Fabric Lakehouse, Fabric Warehouse (SQL endpoint)
  • Strong working knowledge of Delta Lake: Table creation and management, Incremental merges, Z‑ordering, Change Data Feed (CDF)
  • Experience using AI‑assisted development tools (e.g., GitHub Copilot, Cursor)
  • Proficient with Git for code versioning, branching strategies, and pull‑request workflows
  • Experience working with REST APIs for data ingestion and system integration
  • Familiarity with Azure services such as Azure Data Factory, Azure SQL, Azure Key Vault, and RBAC
  • Strong ownership, problem‑solving, and collaboration skills
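The incremental-merge requirement above boils down to upsert semantics: match incoming change rows against a target table by key, update matches, insert the rest. The pure-Python `merge_upsert` below is a hypothetical sketch of those semantics only; in a Fabric Lakehouse this would be a Delta Lake `MERGE INTO`, with the change rows typically sourced from Change Data Feed.

```python
# Sketch of MERGE (upsert) semantics over an in-memory "table" keyed by ISIN.
# Matched keys are updated; unmatched keys are inserted.
def merge_upsert(target: dict, changes: list, key: str) -> dict:
    merged = dict(target)  # copy so the input table is untouched
    for row in changes:
        merged[row[key]] = row  # matched -> update, unmatched -> insert
    return merged

target = {
    "US0378331005": {"isin": "US0378331005", "name": "Apple Inc.", "px": 210.0},
}
changes = [
    {"isin": "US0378331005", "name": "Apple Inc.", "px": 212.5},      # update
    {"isin": "US5949181045", "name": "Microsoft Corp.", "px": 430.0},  # insert
]
result = merge_upsert(target, changes, key="isin")
```

The real Delta version adds what this sketch cannot: ACID guarantees, schema enforcement, and a transaction log that downstream consumers can read incrementally.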

Nice to have:

  • Experience with pandas, PySpark, or similar data processing libraries
  • Knowledge of columnar storage and time‑series analytics (e.g., ClickHouse or equivalent)
  • Familiarity with Microsoft Purview for data lineage, cataloging, and data classification
  • Understanding of bitemporal modeling for financial and regulatory datasets
  • Knowledge of financial reference data: equities, identifiers, corporate actions, index data
  • Exposure to CI/CD pipelines and automated data platform deployments

Additional Information:

Job Posted:
April 20, 2026

Employment Type:
Full-time


Similar Jobs for Databricks Engineer – Azure Fabric

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice

Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in SQL, Python, and PySpark is essential (R or Scala is nice to have)
  • Very good communication skills, including the ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Azure Data Engineer

At LeverX, we have had the privilege of delivering over 1,500 projects for vario...
Location:
Uzbekistan, Georgia

Salary:
Not provided

LeverX

Expiration Date:
Until further notice

Requirements:
  • 5+ years of experience as a Data Engineer with strong expertise in Azure services (e.g., Azure Data Factory, Azure SQL Database, Azure Synapse, Microsoft Fabric, and Azure Cosmos DB)
  • Advanced SQL skills, including complex query development, optimization, and troubleshooting
  • Strong knowledge of indexing, partitioning, and query execution plans to ensure scalability and performance
  • Proven expertise in database modeling, schema design, and normalization/denormalization strategies
  • Ability to design and optimize data architectures to support both transactional and analytical workloads
  • Proficiency in at least one programming language such as Python, C#, or Scala
  • Strong background in cloud-based data storage and processing (e.g., Azure Data Lake, Databricks, or equivalent) and data warehouse platforms (e.g., Snowflake)
  • English B2+
Job Responsibility:
  • Design, develop, and maintain efficient and scalable data architectures and workflows
  • Build and optimize SQL-based solutions for data transformation, extraction, and loading (ETL) processes
  • Collaborate closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver effective solutions
  • Manage and optimize data storage platforms, including databases, data lakes, and data warehouses
  • Troubleshoot and resolve data-related issues, ensuring accuracy, integrity, and performance across all systems
What we offer:
  • Projects in different domains: healthcare, manufacturing, e-commerce, fintech, etc
  • Projects for every taste: Startup products, enterprise solutions, research & development initiatives, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay with the company for 4+ years
  • Market-based compensation and regular performance reviews
  • Internal expert communities and courses
  • Perks to support your growth and well-being

Data & AI Impact Consultant Engineer

Data Consultant role in Data & AI Business Unit, designing and building modern d...
Location:
Belgium

Salary:
Not provided

Inetum

Expiration Date:
Until further notice

Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiarity with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver value today while preparing for tomorrow
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative beyond projects to help build Inetum
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Training and certification programs

Data & AI Impact Consultant Engineer

As a Data Consultant, you are a cornerstone of our Data & AI Business Unit – tec...
Location:
Belgium

Salary:
Not provided

Inetum

Expiration Date:
Until further notice

Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiar with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver customer- and future-oriented value
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative in client development, talent growth, or community engagement
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Continuous learning & development
  • Training and certification programs

Data Engineer

Factors Group is a leading wellness company committed to leveraging data-driven ...
Location:
Canada, Coquitlam

Salary:
100,000.00 – 120,000.00 CAD / year

Natural Factors

Expiration Date:
Until further notice

Requirements:
  • Bachelor's or Master's degree in Computer Science, or a related field
  • 5+ years of experience in data engineering or a similar role, preferably in a fast-paced, data-intensive environment
  • Proficiency in programming languages such as Python, with experience in building data pipelines using frameworks like Apache Spark
  • Strong SQL skills
  • Hands-on experience with cloud platforms such as Azure or Fabric, and related services like Databricks, Snowflake, Azure Data Factory, Fabric Data Factory
  • Familiarity with containerization and orchestration tools like Docker and Kubernetes
  • Excellent problem-solving skills and the ability to work independently and collaboratively in a dynamic team environment
  • Strong communication and interpersonal skills, with the ability to effectively convey technical concepts to non-technical stakeholders
Job Responsibility:
  • Design, build, and maintain robust, scalable data pipelines and ETL processes to ingest, transform, and store structured and unstructured data from various sources
  • Optimize data workflows for performance, reliability, and cost-effectiveness, utilizing best practices in data engineering and cloud computing
  • Collaborate with Data Analysts to understand data requirements and implement solutions to enable advanced analytics and machine learning
  • Develop and maintain data models, schemas, and metadata to ensure data integrity, consistency, and quality
  • Implement monitoring, logging, and alerting systems to proactively identify and address data pipeline issues and performance bottlenecks
  • Stay current with emerging technologies and trends in data engineering, recommending and implementing improvements to enhance our data infrastructure and processes
  • Provide technical guidance and mentorship to junior team members, fostering a culture of collaboration, learning, and innovation
What we offer:
  • Great healthcare benefits (including health and personal spending accounts)
  • Company provided RRSP
  • Vacation and wellness days - 5 weeks in total
  • Employee appreciation events and wellness sessions
  • $300 per year to spend on company products plus discounts on our products
  • Free onsite parking and bus pass reimbursement
  • Discounts on local and sustainable companies
  • Tuition reimbursement

Data Engineer with Azure Fabric

The Data Engineer will focus on developing scalable, high-performance data proce...
Location:
Romania, Cluj

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • Minimum 3–8 years in data engineering, data warehousing, or data architecture roles
  • At least 3 years of hands-on experience with Microsoft Fabric
  • BSc/MSc in Computer Science, Data Engineering, or related field
  • Proven experience in data engineering and pipeline development on Fabric, Azure and cloud-native platforms
  • Familiarity with Fabric Workflows and other orchestration tools
  • Proficiency in ETL/ELT tools such as dbt, Matillion, Talend, or equivalent
  • Strong SQL and Python (or equivalent language) skills for data manipulation and automation
  • Exposure to AI/ML workloads desirable
  • Proficiency in cloud ecosystems (specifically Azure; AWS and GCP are an advantage) and infrastructure-as-code (e.g., Terraform)
  • Knowledge of data modelling methodologies (star schemas, Data Vault, Kimball, Inmon)
Job Responsibility:
  • Client Engagement & Delivery
  • Data Pipeline Development (Batch and Streaming)
  • Fabric and Azure Architectures
  • Data Modelling & Optimisation
  • Collaboration & Best Practices
  • Quality, Governance & Security
  • Client stakeholders up to Head of Data Engineering, Chief Data Architect, and Analytics leadership
  • Delivery of high-performing, scalable, and secure data pipelines aligned to client requirements
  • High client satisfaction and successful adoption of Fabric and Azure based solutions
  • Improve data engineering practices
What we offer:
  • Smooth integration and a supportive mentor
  • Pick your working style: choose from Remote, Hybrid or Office work opportunities
  • Projects have different working hours to suit your needs
  • Sponsored certifications, trainings and top e-learning platforms
  • Private Health Insurance
  • Individual coaching sessions or joining our accredited Coaching School
  • Epic parties or themed events

Cloud Data Platform Architect

Circle K is transforming our Data Engineering and BI platform to match our busin...
Location:
United States of America, Charlotte

Salary:
Not provided

Circle K

Expiration Date:
Until further notice

Requirements:
  • 7+ years of professional experience in designing & architecting Data & Analytics solutions, with a focus on Azure-based platforms and enterprise-scale systems
  • Working experience in designing and architecting solutions that leverage Databricks and Snowflake
  • Hands-on experience with Azure Cloud services (Azure Synapse/SQL Server, ADF or close equivalents)
  • Working experience with Microsoft Power BI (Power BI Platform, AAS/Tabular) integration with Azure-based Platforms
  • Expert understanding of relational databases, Data Warehouse & Data Lake modeling techniques & concepts, ETL/ELT processing patterns, and Big Data technologies
  • Practical experience in designing systems to handle large data volumes
  • Practical experience in designing systems for large-scale data processing with a focus on Azure performance optimization and cost management
  • Working knowledge of Python, PySpark, SQL & T-SQL
  • Working experience in designing and architecting solutions that comply with data security industry standards and regulations, including RBAC, data encryption (GDPR, PCI, etc.), and monitoring
  • Microsoft Azure Certification required
Job Responsibility:
  • Designing, building, and maintaining robust data platforms and solutions on Azure
  • Optimizing data delivery and ensuring the architecture aligns with business objectives
  • Leading architectural decisions and establishing governance standards
  • Collaborating across teams to ensure seamless data flows and scalable solutions
  • Driving the adoption and usage of Azure Databricks, Snowflake, Microsoft Fabric, and Power BI in the data platform
  • Performing architectural assessments and defining solutions to produce detailed design documents
  • Providing technical direction on Azure platform services
  • Mentoring Data Engineering and Data Science teams
  • Providing technical support for platform performance tuning and optimization activities
  • Participating in the creation and maintenance of technical roadmaps

Data Engineer

Looking for stable projects with modern tech and room to grow? We are Clouds on ...
Location:
Not provided

Salary:
Not provided

Clouds on Mars Sp. z o. o.

Expiration Date:
Until further notice

Requirements:
  • 3+ years of experience in data engineering area in a consulting model
  • Experience with Azure ADF, SQL Database, Microsoft Fabric, Databricks, Synapse technologies
  • Strong understanding of data modeling techniques and data warehouse architecture
  • Ability to create, evaluate and optimize existing data pipelines
  • Strong communication and collaboration skills to work with both technical and non-technical stakeholders
  • Ownership and proactive, client-oriented approach
  • Fluency in English and Polish
  • Certificates (at least one of the below): Microsoft Certified: Azure Data Engineer Associate, Implementing a Data Warehouse, Analyzing and Visualizing Data with Power BI, Snowflake: SnowPro Core Certification, Microsoft Certified: Fabric Analytics Engineer Associate, Databricks: Data Engineer Associate/Professional
Job Responsibility:
  • Design and develop ETL pipelines for complex data transformations
  • Build and optimize data models (normalized and multidimensional) to support analytics needs
  • Develop Azure-based solutions, including tools like Databricks, Synapse Analytics, and Microsoft Fabric
  • Collaborate with architects and stakeholders to design data architectures aligned with IT strategies
  • Provide support to analysts in creating and refining data models for reporting
  • Ensure data quality and optimize the performance of data platforms
What we offer:
  • Flexible working hours
  • Private health insurance (UNUM up to PLN 350,000, advanced up to PLN 500,000)
  • Co-funding for PZU, Medicover or LuxMed medical packages
  • Multisport card
  • Access to training platforms (MS certificates, SQL BI, internal Delivery Excellence Workshops)
  • Breakfasts, fruit, snacks, and always-chilled drinks in the office
  • Anniversary gifts and quarterly integration budgets
  • Paid referral program (up to PLN 8,000)
  • “It’s good to stay with us” – PLN 2,500 voucher after 3 years with the company