Data Engineer (Modern Data Stack / DataOps)

ddroidd

Location: Romania
Contract Type: Not provided

Salary: Not provided

Job Description:

We are looking for a Senior Data Engineer with experience in modern cloud data platforms. The role focuses on building, orchestrating, and maintaining reliable data pipelines and transformation workflows using a modern data stack centered around Snowflake, dbt, and Apache Airflow. This role combines data engineering expertise with DataOps practices, ensuring scalable, well-governed, and automated data workflows.

Job Responsibilities:

  • Design and maintain data pipelines and ELT workflows within a modern cloud data platform
  • Build and maintain data transformation models using dbt, including testing, documentation, and modular data modeling
  • Orchestrate and monitor workflows using Apache Airflow (see the orchestration sketch after this list)
  • Manage and optimize Snowflake data warehouse environments, including performance and cost efficiency
  • Implement DataOps practices such as CI/CD, automated testing, and deployment for data pipelines
  • Ensure data quality, reliability, and observability across the data platform
  • Collaborate with analytics, product, and engineering teams to deliver reliable datasets and data products
  • Improve monitoring, automation, and operational processes for the data platform
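
As an illustration of how these responsibilities fit together, below is a minimal orchestration sketch: an Airflow DAG that builds the dbt models in Snowflake and then runs dbt's tests. The DAG id, schedule, and project path are hypothetical placeholders rather than details from this posting, and a production setup would add retries, alerting, and credential management.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics_project"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_snowflake_daily",       # placeholder DAG id
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    tags=["dbt", "snowflake"],
) as dag:
    # Build the transformation models in Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir .",
    )

    # Run dbt's schema and data tests once the models are built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir .",
    )

    dbt_run >> dbt_test

Chaining dbt run before dbt test mirrors the testing and CI/CD expectations above: models are not considered deployed until their schema and data tests pass.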

Requirements:

  • 7+ years of experience in Data Engineering or similar roles
  • Strong hands-on experience with Snowflake (see the freshness-check sketch after this list)
  • Strong hands-on experience with Apache Airflow
  • Strong hands-on experience with dbt (data build tool)
  • Strong SQL expertise
  • Experience building and managing ELT pipelines
  • Experience with Git and collaborative development workflows
  • Familiarity with DataOps / CI/CD practices for data pipelines
  • Solid understanding of data modeling and data warehouse architecture
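
As a concrete sketch of the Snowflake and SQL expertise listed above, the snippet below runs a simple freshness check using the snowflake-connector-python package. The connection settings and the orders table are assumed placeholders, not details from this posting; a real pipeline would pull credentials from a secrets manager and run such checks from CI or an Airflow task.

import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORMING",  # placeholder warehouse
    database="ANALYTICS",      # placeholder database
    schema="MARTS",            # placeholder schema
)
try:
    cur = conn.cursor()
    # Fail loudly if the (hypothetical) orders table received no rows in
    # the last 24 hours: a simple data-quality gate for a daily load.
    cur.execute(
        "SELECT COUNT(*) FROM orders "
        "WHERE loaded_at >= DATEADD(day, -1, CURRENT_TIMESTAMP())"
    )
    (row_count,) = cur.fetchone()
    if row_count == 0:
        raise RuntimeError("orders received no rows in the last 24 hours")
    print(f"Freshness check passed: {row_count} recent rows in orders")
finally:
    conn.close()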

Nice to have:

  • Experience with Databricks platform and its ecosystem
  • Knowledge of Scala for Spark development
  • Knowledge of AWS data ecosystem
  • Familiarity with linked data concepts and JSON-LD format

What we offer:
  • Private medical insurance
  • National holidays off, even when falling on weekends
  • Loyalty leave: +1 day/year
  • Continuous professional development opportunities
  • Sports subscription programs
  • Referral bonuses for bringing in new talent
  • Meal tickets
  • Bookster subscription for reading & learning
  • Community and team-building events
  • Flexible and unlimited remote work policy

Additional Information:

Job Posted: March 21, 2026
Employment Type: Fulltime
Work Type: Remote work

Similar Jobs for Data Engineer (Modern Data Stack / DataOps)

Data Analytics Engineer

SDG Group is expanding its global Data & Analytics practice and is seeking a mot...

SDG
Location: Egypt, Cairo
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field
  • Hands-on experience in DataOps / Data Engineering
  • Strong knowledge of Databricks or Snowflake (one of the two is mandatory)
  • Proficiency in Python and SQL
  • Experience with Azure data ecosystem (ADF, ADLS, Synapse, etc.)
  • Understanding of CI/CD practices and DevOps for data
  • Knowledge of data modeling, orchestration frameworks, and monitoring tools
  • Strong analytical and troubleshooting skills
  • Eagerness to learn and grow in a global consulting environment

Job Responsibilities:
  • Design, build, and maintain scalable and reliable data pipelines following DataOps best practices
  • Work with modern cloud data stacks using Databricks (Spark, Delta Lake) or Snowflake (Snowpipe, tasks, streams)
  • Develop and optimize ETL/ELT workflows using Python, SQL, and orchestration tools
  • Work with Azure data services (ADF, ADLS, Azure SQL, Azure Functions)
  • Implement CI/CD practices using Azure DevOps or Git-based workflows
  • Ensure data quality, consistency, and governance across all delivered data solutions
  • Monitor and troubleshoot pipelines for performance and operational excellence
  • Collaborate with international teams, architects, and analytics consultants
  • Contribute to technical documentation and solution design assets

What we offer:
  • Remote working model aligned with international project needs
  • Opportunity to work on European and global engagements
  • Mentorship and growth paths within SDG Group
  • A dynamic, innovative, and collaborative environment
  • Access to world-class training and learning platforms

Consultant - Data & AI

Microsoft Industry Solutions - Global Center for Innovation and Delivery Center ...

Microsoft Corporation
Location: India, Hyderabad
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 4–10 years of experience
  • Bachelor's degree in computer science engineering or equivalent work experience
  • Higher relevant education is preferred
  • Knowledge of solution design, planning, development and deployment of complex solutions
  • One or more of the following certifications, or an equivalent industry certification, is a plus: Microsoft Certified: Azure Data Engineer Associate (DP-600) / Microsoft Certified: Azure AI Engineer Associate (AI-102) / Microsoft Certified: Azure Solution Architect Expert (AZ-305)
  • Core Data Engineering & Platform Skills
  • Hands-on experience in Data Engineering across cloud, on-prem, and hybrid environments
  • Strong foundational experience with Azure Data Services and data platform modernization initiatives
  • Hands-on exposure to data warehousing and analytics using platforms like Microsoft Fabric and Azure Synapse Analytics
  • Azure Databricks is a plus

Job Responsibilities:
  • Works as an individual contributor and key member of the Data and AI team, helping ensure timely execution of assigned deliverables with accurate estimates and work priorities, and accommodating project changes and trade-offs necessary for a successful release
  • Applies technical experience and industry-specific knowledge to develop solutions, based on an analysis of how the proposed approach affects the business objectives of customers and partners
  • Works to accelerate the value proposition of customer/partner engagements by helping to design, develop, and deploy solutions on Microsoft technologies and methodologies
  • Contributes to the overall efficacy and quality of a project team’s technical delivery within assigned engagements
  • Defines dependencies and risks that go beyond the immediate scope and timeframe for a complex project
  • Develops contingency plans, risk-mitigation implementation criteria, and alternative strategies to manage short- and long-term risks and manages technical escalations
  • Drives opportunities to expand or accelerate the adoption and consumption of cloud and Microsoft technologies
  • Collaborates, as appropriate, with peers and other teams (e.g., Sales, account-aligned team) to scale the business with existing high-stake or strategic customers, by articulating/developing value propositions of strategic Microsoft products and services
  • Aligns with innovation and digital transformation initiatives
  • Ensures the use of existing intellectual property (IP) and delivers value to customers

Staff DataOps Engineer

We are looking for a Staff DataOps / Platform Engineer to join the Data and ML p...

Doctolib
Location: France, Paris
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience after graduation as a Senior Data Platform, Senior Data Engineer or in a similar role, with a history of architecting and scaling robust data platforms
  • Extensive experience with Google Cloud Platform and a command of Kubernetes & Terraform for automated deployments
  • Authority on implementing network and IAM security best practices
  • Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery
  • Highly skilled in programming with Python, and have a solid understanding of software development principles
  • Excellent troubleshooter who excels at diagnosing and fixing data infrastructure and identifying performance bottlenecks
  • Strong communicator who can articulate complex technical concepts to both technical and non-technical audiences

Job Responsibilities:
  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization

What we offer:
  • Free comprehensive health insurance for you and your children
  • Parent Care Program: receive one additional month of leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • A works council subsidy to refund part of a sports club membership or creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card

Staff DevOps - Data Platform

We are looking for a Staff DevOps - Data Platform to join the Data and ML Platfo...

Doctolib
Location: France, Paris
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience after graduation as a Staff Data Platform Engineer, Staff Data Ops, Staff Site Reliability Engineer, or in a similar role, with a history of architecting and scaling robust data platforms
  • Extensive experience with Google Cloud Platform and a command of Kubernetes & Terraform for automated deployments
  • Authority on implementing network and IAM security best practices
  • Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery
  • Highly skilled in programming with Python, and have a solid understanding of software development principles
  • Excellent troubleshooter who excels at diagnosing and fixing data infrastructure and identifying performance bottlenecks
  • Strong communicator who can articulate complex technical concepts to both technical and non-technical audiences

Job Responsibilities:
  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization

What we offer:
  • Free comprehensive health insurance for you and your children
  • Parent Care Program: receive one additional month of leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • A works council subsidy to refund part of a sports club membership or creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card

Staff Data Ops Engineer - Platform

We are looking for a Staff Data Ops Engineer - Platform to join the Data & AI Pl...

Doctolib
Location: France, Paris
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience after graduation as a Staff Data Platform Engineer, Staff DataOps Engineer, Staff Site Reliability Engineer, or in a similar role, with a history of architecting and scaling robust data platforms
  • Extensive experience with Google Cloud Platform and a command of Kubernetes & Terraform for automated deployments
  • Authority on implementing network and IAM security best practices
  • Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery
  • Highly skilled in programming with Python, and have a solid understanding of software development principles
  • Excellent troubleshooter who excels at diagnosing and fixing data infrastructure and identifying performance bottlenecks
  • Strong communicator who can articulate complex technical concepts to both technical and non-technical audiences

Job Responsibilities:
  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization

What we offer:
  • Free comprehensive health insurance for you and your children
  • Parent Care Program: receive one additional month of leave on top of the legal parental leave
  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • A works council subsidy to refund part of a sports club membership or creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card

Software Engineer - Data Engineering

At Catawiki, data sits at the core of our decision-making, powering everything f...

Catawiki
Location: Netherlands, Amsterdam
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 3+ years of hands-on experience building and operating data systems in production
  • Fluent in Python and SQL
  • Experience with data integration tools such as Fivetran and/or Airbyte
  • Experience with CI/CD, Infrastructure as Code (e.g. Terraform), and modern DataOps practices
  • Experience with cloud platforms (GCP is a plus)
  • Familiar with parts of the data stack such as BigQuery, Pub/Sub, Dataflow, GKE, Airflow, Airbyte, FastAPI, and Prometheus
  • Experience with streaming pipelines using technologies like Kafka, Pub/Sub, Dataflow, or Apache Beam
  • Keen to learn new tools, support data platform and machine learning engineering initiatives
  • Understand the importance of data privacy and GDPR

Job Responsibilities:
  • Build and Scale Data Pipelines: Maintain and develop reliable batch and streaming pipelines that ingest data from internal systems and third-party sources into Catawiki’s data warehouse
  • Empower Data Science and AI: Maintain and enhance the tools and platforms used by Data Scientists for analysis, experimentation, model training, and model deployment
  • Protect Data and Privacy: Ensure data is stored securely and that governance, access control, and privacy standards are consistently applied across the data platform
  • Run and Evolve the Data Platform: Maintain the infrastructure that hosts our data tools and applications, keeping it scalable, stable, and cost-effective
  • Own Core Data Tooling: Self-host and operate key data engineering tools such as Airflow and Airbyte on Kubernetes
  • Keep the Lights On: Provide operational support to ensure pipelines, platforms, and tools run smoothly and reliably for teams across the business

What we offer:
  • €100 Catavoucher when you join
  • €50 Catavoucher on your birthday
  • An extra day off each year to “Pursue Your Passion”
  • Additional leave for key work anniversaries and important life events

Consultant A2 - Data & AI

Microsoft Industry Solutions - Global Center for Innovation and Delivery Center ...

Microsoft Corporation
Location: India, Hyderabad
Salary: Not provided
Employment Type: Fulltime
Expiration Date: Until further notice

Requirements:
  • 4–10 years of experience
  • Bachelor's degree in computer science engineering or equivalent work experience
  • Knowledge of solution design, planning, development and deployment of complex solutions
  • Core Data Engineering & Platform Skills
  • Hands-on experience in Data Engineering across cloud, on-prem, and hybrid environments
  • Strong foundational experience with Azure Data Services and data platform modernization initiatives
  • Hands-on exposure to data warehousing and analytics using platforms like Microsoft Fabric and Azure Synapse Analytics
  • Experience/knowledge of one or more SQL and NoSQL database systems
  • Hands-on experience building AI-powered data pipelines using ETL/ELT tools like: Azure Data Factory (ADF), SSIS, Talend, Informatica, Airflow
  • Exposure to data migrations, platform upgrades, and modernization efforts

Job Responsibilities:
  • Works as an Individual contributor and key member of the Data and AI team and helps in timely execution of assigned deliverables with accurate estimates, work priorities, and accommodates project changes and trade-offs necessary for a successful release
  • Applies technical experience and industry-specific knowledge to develop solutions, based on an analysis of how the proposed approach affects the business objectives of customers and partners
  • Works to accelerate the value proposition of customer/partner engagements by helping to design, develop, and deploy solutions on Microsoft technologies and methodologies
  • Contributes to the overall efficacy and quality of a project team’s technical delivery within assigned engagements
  • Defines dependencies and risks that go beyond the immediate scope and timeframe for a complex project
  • Develops contingency plans, risk-mitigation implementation criteria, and alternative strategies to manage short- and long-term risks and manages technical escalations
  • Drives opportunities to expand or accelerate the adoption and consumption of cloud and Microsoft technologies
  • Collaborates, as appropriate, with peers and other teams (e.g., Sales, account-aligned team) to scale the business with existing high-stake or strategic customers, by articulating/developing value propositions of strategic Microsoft products and services
  • Aligns with innovation and digital transformation initiatives
  • Ensures the use of existing intellectual property (IP) and delivers value to customers

Senior Data Engineer

At Catawiki, data sits at the core of our decision-making, powering everything f...

Catawiki
Location: Netherlands, Amsterdam
Salary: Not provided
Expiration Date: Until further notice

Requirements:
  • 3+ years of hands-on experience building and operating data systems in production
  • Fluent in Python and SQL
  • Experience with data integration tools such as Fivetran and/or Airbyte
  • Experience with CI/CD, Infrastructure as Code (e.g. Terraform), and modern DataOps practices
  • Experience with cloud platforms (GCP is a plus)
  • Familiar with parts of our data stack such as BigQuery, Pub/Sub, Dataflow, GKE, Airflow, Airbyte, FastAPI, and Prometheus
  • Experience with streaming pipelines using technologies like Kafka, Pub/Sub, Dataflow, or Apache Beam
  • Keen to learn new tools, support data platform and machine learning engineering initiatives
  • Understand the importance of data privacy and GDPR

Job Responsibilities:
  • Build and Scale Data Pipelines: Maintain and develop reliable batch and streaming pipelines that ingest data from internal systems and third-party sources into Catawiki’s data warehouse
  • Empower Data Science and AI: Maintain and enhance the tools and platforms used by Data Scientists for analysis, experimentation, model training, and model deployment
  • Protect Data and Privacy: Ensure data is stored securely and that governance, access control, and privacy standards are consistently applied across the data platform
  • Run and Evolve the Data Platform: Maintain the infrastructure that hosts our data tools and applications, keeping it scalable, stable, and cost-effective
  • Own Core Data Tooling: Self-host and operate key data engineering tools such as Airflow and Airbyte on Kubernetes
  • Keep the Lights On: Provide operational support to ensure pipelines, platforms, and tools run smoothly and reliably for teams across the business

What we offer:
  • €100 Catavoucher upon joining
  • €50 Catavoucher on each birthday
  • An extra day off each year to 'Pursue Your Passion'
  • Additional time off for significant work anniversaries (3, 5, 8, 10 years)
  • Extra leave for life’s big moments like marriage, engagements, or moving house