Senior Azure Data Engineer

Darumatic

Location:
Australia, Sydney

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The Department of Customer Service is seeking a highly motivated Senior Data Engineer to join their team on a contract basis. This role is pivotal in driving the end-to-end data engineering strategy, utilising the latest Microsoft technologies to transform raw data into actionable insights. You will be joining a flexible and supportive environment, working to build and maintain robust data platforms. As a creative problem solver, you will bridge the gap between complex data systems and business needs, ensuring high-quality datasets support reporting, analytics, and machine learning initiatives.

Job Responsibility:

  • Drive the end-to-end data engineering strategy using Microsoft technologies
  • Build, maintain, and scale data pipelines and platforms to ensure efficient data flow
  • Design and execute ELT/ETL workflows to transform raw data from disparate systems into structured, high-quality datasets optimised for PowerBI, reporting, and machine learning
  • Work cross-functionally with Data Analysts and Business Analysts to understand database requirements, partitioning strategies, and business logic
  • Implement strong data governance and compliance measures
  • Stay abreast of industry trends and best practices to drive continuous improvement initiatives within the data engineering function
  • Utilise Excel, Power Query, and VBA to handle automation tasks and connectors where required

Requirements:

  • Minimum Bachelor’s degree in Computer Science, Software Engineering, or Information Systems
  • 5+ years of professional experience as a Data Engineer
  • Hands-on experience building scalable data pipelines and ELT/ETL workflows specifically using Microsoft Fabric to optimise datasets for PowerBI
  • Strong proficiency in SQL and at least one other programming language commonly used for data engineering (e.g., Python, Java, Scala)
  • Proven background in documentation, security, and compliance, specifically with experience in Microsoft Purview and Azure Key Vault
  • Advanced skills in Excel, including Power Query, VBA automation, and connectors
  • A creative problem solver who can think critically to resolve complex data challenges
  • Ability to stay updated with industry trends and integrate new best practices
  • Strong ability to collaborate with non-technical stakeholders and technical teams alike

Nice to have:

  • Microsoft Certified: Data Engineering on Microsoft Azure (DP-700 or equivalent)
  • Microsoft Certified: Administering Relational Databases on Microsoft Azure (DP-300)

Additional Information:

Job Posted:
January 05, 2026

Work Type:
Hybrid work

Similar Jobs for Senior Azure Data Engineer

Senior Data Engineer

The mission of the business intelligence team is to create a data-driven culture...
Location:
India, Hyderabad
Salary:
Not provided
Randstad
Expiration Date
February 28, 2026
Requirements
  • Master’s degree in Computer Science / Information Technology or related field, highly preferred
  • Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
  • Extensive experience with data lakes, ETL, and data warehouses
  • Advanced experience building data pipelines
  • Passion for building quality BI software
  • Project management and/or process improvement experience highly preferred
  • Polyglot coder with expert-level proficiency in multiple languages and tools, including Python, R, Java, SQL, relational databases, ERP systems, and DOMO or other data visualization tools (e.g., Tableau)
  • Advanced and proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure / Amazon will be considered
  • Any exposure to Kafka, Spark, and Scala will be an added advantage
  • Should demonstrate a strong understanding of OOP concepts and methodologies
Job Responsibility
  • Architect and build complex data pipelines using advanced cloud data technologies
  • Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
  • Define industry best practices for building data pipelines
  • Ensure data security, compliance, and governance standards are met
  • Partner with leadership team to define and implement agile and DevOps methodologies
  • Serve as subject matter expert and define data architecture and infrastructure requirements
  • Partner with business analysts to plan project execution including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
  • Understand data technology trends and identify opportunities to implement new technologies and provide forward-thinking recommendations
  • Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes
  • Design and implement a robust data observability process
Work Type: Fulltime

Senior Data Engineer

As a Senior Data Engineer on our Core Engineering Data Team, you will design and...
Location:
United States, Boston
Salary:
111,800.00 - 164,000.00 USD / Year
SimpliSafe
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience
  • 4+ years of experience in software engineering, data engineering, or a related field, with at least 2 years focused on data operations or data infrastructure
  • Strong knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
  • Strong SQL knowledge and experience optimizing for data warehousing technologies like AWS Athena
  • Strong knowledge of Python for use in data transformation
  • Hands-on experience with ETL/ELT, schema design, and data lake technologies
  • Hands-on experience with data orchestration tools like Dagster, Airflow, or Prefect
  • Experience with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
  • Love of data and a passion for building reliable data products
Job Responsibility
  • Collaborate with analysts, engineers, product managers, and stakeholders to design and implement solutions for Product and Engineering data workflows
  • Identify areas for improvement and contribute to centralized data platform
  • Manage data pipeline, orchestration, storage, and analytics infrastructure for Product and Engineering
  • Monitor performance and reliability of data pipelines, implementing solutions for scalability and efficiency
  • Optimize table structures to support query and usage patterns
  • Partner with producers of data across SimpliSafe to develop an understanding of data creation and meaning
  • Support data discovery, catalog, and analytics tooling
  • Implement and maintain data security measures and ensure compliance with data governance policies
  • Design and implement testing strategies and data quality validation to ensure the accuracy, reliability, and integrity of data pipelines
  • Contribute to the formation of our new team, assisting with the development of team norms, practices, and charter
What we offer
  • A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
  • A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
  • Free SimpliSafe system and professional monitoring for your home
  • Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
  • Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits
Work Type: Fulltime

Senior Data Engineer

We are looking for a Data Engineer to join our team and support with designing, ...
Location:
Not provided
Salary:
Not provided
Foundever
Expiration Date
Until further notice
Requirements
  • Minimum of 7+ years’ experience in data engineering
  • Track record of deploying and maintaining complex data systems at an enterprise level within regulated environments
  • Expertise in implementing robust data security measures, access controls, and monitoring systems
  • Proficiency in data modeling and database management
  • Strong programming skills in Python and SQL
  • Knowledge of big data technologies like Hadoop, Spark, and NoSQL databases
  • Deep experience with ETL processes and data pipeline development
  • Strong understanding of data warehousing concepts and best practices
  • Experience with cloud platforms such as AWS and Azure
  • Excellent problem-solving skills and attention to detail
Job Responsibility
  • Design and optimize complex data storage solutions, including data warehouses and data lakes
  • Develop, automate, and maintain data pipelines for efficient and scalable ETL processes
  • Ensure data quality and integrity through data validation, cleansing, and error handling
  • Collaborate with data analysts, machine learning engineers, and software engineers to deliver relevant datasets or data APIs for downstream applications
  • Implement data security measures and access controls to protect sensitive information
  • Monitor data infrastructure for performance and reliability, addressing issues promptly
  • Stay abreast of industry trends and emerging technologies in data engineering
  • Document data pipelines, processes, and best practices for knowledge sharing
  • Lead data governance and compliance efforts to meet regulatory requirements
  • Collaborate with cross-functional teams to drive data-driven decision-making within the organization
What we offer
  • Impactful work
  • Professional growth
  • Competitive compensation
  • Collaborative environment
  • Attractive salary and benefits package
  • Continuous learning and development opportunities
  • A supportive team culture with opportunities for occasional travel for training and industry events

Senior Data Engineer

The Data Engineer will build scalable pipelines and data models, implement ETL w...
Location:
United States, Fort Bragg
Salary:
Not provided
Barbaricum
Expiration Date
Until further notice
Requirements
  • Active DoD TS/SCI clearance (required or pending verification)
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience) OR CSSLP / CISSP-ISSAP
  • Strong programming skills in Python, Java, or Scala
  • Strong SQL skills
  • Familiarity with analytics languages/tools such as R
  • Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow)
  • Familiarity with cloud-based data services (e.g., AWS Redshift, Google BigQuery, Azure Data Factory)
  • Experience with data modeling, database design, and data architecture concepts
  • Strong analytical and problem-solving skills with attention to detail
  • Strong written and verbal communication skills
Job Responsibility
  • Build and maintain scalable, reliable data pipelines to collect, process, and store data from multiple sources
  • Design and implement ETL processes to support analytics, reporting, and operational needs
  • Develop and maintain data models, schemas, and standards to support enterprise data usage
  • Collaborate with data scientists, analysts, and stakeholders to understand requirements and deliver solutions
  • Analyze large datasets to identify trends, patterns, and actionable insights
  • Present findings and recommendations through dashboards, reports, and visualizations
  • Optimize database and pipeline performance for scalability and reliability across large datasets
  • Monitor and troubleshoot pipeline issues to minimize downtime and improve system resilience
  • Implement data quality checks, validation routines, and integrity controls
  • Implement security measures to protect data and systems from unauthorized access

Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Not provided
Salary:
Not provided
Awin Global
Expiration Date
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; Master’s degree preferred
  • At least 5 years’ experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g., MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
Job Responsibility
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Poland, Warsaw
Salary:
Not provided
Awin Global
Expiration Date
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; Master’s degree preferred
  • At least 5 years’ experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g., MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
  • Strong analytical and problem-solving skills with attention to data quality and accuracy
Job Responsibility
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date
Until further notice
Requirements
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience with best-practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, the principle of least privilege, classification, etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure AI Foundry
Job Responsibility
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud-based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best-practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
Work Type: Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84,835.61 - 149,076.17 USD / Year
Adtalem Global Education
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • Master’s degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • Two (2)+ years’ experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, and Vertex AI
  • Six (6)+ years’ experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in its semantic layer
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
What we offer
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Work Type: Fulltime