
Data Operations Engineer


Vattenfall

Location:
Poland, Katowice

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

BA Markets wants to professionalise and streamline its data streaming activities. A new agile team has therefore been created to build a technical platform that gives BA Markets users a hub for all required streaming services. The team will develop the service suite largely in-house but will also rely on managed vendor services such as Databricks. The streaming technology used is Kafka. Many BA Markets users are technically advanced, so the central data streaming team must constantly stay up to date to deliver state-of-the-art services. There will be constant development opportunities for the candidate, who must be comfortable working in a fast-changing, international environment. The team consists of four members in Hamburg and one in Stockholm, and is to be joined by three colleagues in Poland. This is a hybrid role supporting the Data Engineers and Software Developers with automation and simplification for the users. The vision for this role is to “Simplify the user experience”.

Job Responsibility:

  • Stream architecture, development, and deployment
  • Automate workflows and orchestrate data pipelines
  • Implement CI/CD routines
  • Implement and monitor “system health” with observability tools and data quality checks
  • Support the development of client libraries so that other teams can integrate streams into their own applications and services
  • Perform Python development
  • Develop “glue code” that roughly 95% of use cases can reuse (see the sketch after this list)
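
The “glue code” bullet above is the kind of thin, reusable wrapper the team would hand to internal users. A minimal sketch in Python, assuming the confluent-kafka client and JSON-encoded messages; the topic, group id, and handle_event callback are illustrative placeholders, not details from the posting:

    # Minimal reusable consumer wrapper: subscribe, deserialize JSON, hand records to a callback.
    # Assumes confluent-kafka (pip install confluent-kafka); all names are illustrative.
    import json
    from confluent_kafka import Consumer, KafkaError

    def consume_events(bootstrap_servers, topic, group_id, handle_event):
        consumer = Consumer({
            "bootstrap.servers": bootstrap_servers,
            "group.id": group_id,
            "auto.offset.reset": "earliest",
            "enable.auto.commit": False,  # commit only after successful processing
        })
        consumer.subscribe([topic])
        try:
            while True:
                msg = consumer.poll(1.0)
                if msg is None:
                    continue
                if msg.error():
                    if msg.error().code() == KafkaError._PARTITION_EOF:
                        continue  # reached end of a partition; not a failure
                    raise RuntimeError(msg.error())
                handle_event(json.loads(msg.value()))  # user-supplied processing step
                consumer.commit(message=msg)  # at-least-once delivery
        finally:
            consumer.close()

Disabling auto-commit and committing only after the callback returns gives at-least-once semantics, which is usually the safer default for a shared client library.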

Requirements:

  • An interest in understanding what the user needs, combined with several years of hands-on experience as a software developer with an interest in the responsibilities of a data engineer, or vice versa
  • A proactive, communicative team player
  • Fluent in English
  • Deep understanding of Kafka architecture (brokers, topics, partitions, replication), and experience with Kafka Streams, Kafka Connect, and schema registry (e.g., Confluent)
  • Proficiency in designing and managing Kafka clusters (including monitoring and scaling)
  • Hands-on experience building and maintaining real-time ETL pipelines
  • Familiarity with stream processing frameworks such as Apache Flink or Apache Spark Streaming
  • Strong skills in Python and Java, at least basic Scala, and solid SQL experience
  • Has built several CI/CD pipelines (e.g., Jenkins, GitLab CI, GitHub Actions)
  • Has used Infrastructure as Code (IaC) tools like Terraform or Ansible
  • Has performed hands-on containerization (Docker) and orchestration (Kubernetes) tasks
  • Knows monitoring tools like Grafana, the ELK stack, and Datadog
  • Ideally also knows Kafka-specific monitoring tools (e.g., Burrow, Confluent Control Center); a sketch of the consumer-lag check such tools automate follows this list
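
For the Kafka-specific monitoring mentioned above, tools like Burrow and Confluent Control Center are, at their core, tracking consumer-group lag. A rough equivalent with the confluent-kafka Python client, where the broker address, topic, and group id are illustrative placeholders:

    # Report per-partition consumer lag: committed offset vs. latest (high-watermark) offset.
    # Assumes confluent-kafka; broker, topic, and group names are illustrative.
    from confluent_kafka import Consumer, TopicPartition

    consumer = Consumer({"bootstrap.servers": "localhost:9092", "group.id": "orders-processor"})
    topic = "orders"

    metadata = consumer.list_topics(topic, timeout=10)
    partitions = [TopicPartition(topic, p) for p in metadata.topics[topic].partitions]
    committed = consumer.committed(partitions, timeout=10)

    for tp in committed:
        low, high = consumer.get_watermark_offsets(tp, timeout=10)
        # A negative committed offset means the group has not committed on this partition yet.
        lag = high - tp.offset if tp.offset >= 0 else high - low
        print(f"partition {tp.partition}: committed={tp.offset} latest={high} lag={lag}")

    consumer.close()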

Nice to have:

  • Experience with Databricks components
  • Experience with schema management best practices

What we offer:
  • Good remuneration
  • Challenging and international work environment
  • Possibility to work with some of the best in the field
  • Working in interdisciplinary teams
  • Support from committed colleagues
  • Attractive employment conditions
  • Opportunities for personal and professional development

Additional Information:

Job Posted:
December 13, 2025

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Data Operations Engineer

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location:
United States, Chicago
Salary:
130000.00 USD / Year
AKUNA CAPITAL
Expiration Date
Until further notice
Requirements
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including quickly and accurately solving math and coding problems, is an essential function of the role
Job Responsibility
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, data availability, and data correctness (a minimal example of such a check follows this list)
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
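
The automated validation suites described above come down to asserting freshness, completeness, and correctness before data is published. A minimal, framework-free sketch in Python; the thresholds and field names are illustrative assumptions, not Akuna's actual SLAs:

    # Tiny SLA-style checks on a batch before publication: freshness, completeness, correctness.
    # Thresholds and field names are illustrative assumptions.
    from datetime import datetime, timedelta, timezone

    def validate_batch(records, expected_count, max_age=timedelta(minutes=15)):
        failures = []
        now = datetime.now(timezone.utc)

        # Freshness: the newest record must be recent enough.
        newest = max(r["event_time"] for r in records)
        if now - newest > max_age:
            failures.append(f"stale data: newest record is {now - newest} old")

        # Completeness: row count must match what the upstream system reported.
        if len(records) < expected_count:
            failures.append(f"missing rows: got {len(records)}, expected {expected_count}")

        # Correctness: basic field-level assertions.
        if any(r["price"] <= 0 for r in records):
            failures.append("non-positive prices found")

        return failures  # an empty list means the batch meets the SLA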
What we offer
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
  • Fulltime

Data Engineering Support Engineer / Manager

Wissen Technology is hiring a seasoned Data Engineering Support Engineer / Manag...
Location:
India, Mumbai; Pune
Salary:
Not provided
Wissen
Expiration Date
Until further notice
Requirements
  • Bachelor of Technology or Master's degree in Computer Science, Engineering, or a related field
  • 8-12 years of work experience
  • Python, SQL
  • Familiarity with data engineering
  • Experience with AWS data and analytics services or similar cloud vendor services
  • Strong problem solving and communication skills
  • Ability to organise and prioritise work effectively
Job Responsibility
  • Incident and user management for data and analytics platform
  • Development and maintenance of a data quality framework, including anomaly detection (see the sketch after this list)
  • Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
  • Diagnostic tools implementation and automation of operational processes
  • Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
  • Support research analysts and traders with issue resolution
  • Fulltime
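
The anomaly-detection part of the data quality framework mentioned above can start as a simple statistical check on pipeline metrics. A minimal sketch in Python; the 3-sigma threshold and the daily-row-count metric are illustrative assumptions:

    # Flag today's row count as anomalous if it deviates strongly from recent history.
    # The 3-sigma threshold and the metric (daily row counts) are illustrative assumptions.
    from statistics import mean, stdev

    def is_anomalous(history, today, threshold=3.0):
        """history: row counts from previous days; today: today's row count."""
        if len(history) < 2:
            return False  # not enough history to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return today != mu
        return abs(today - mu) / sigma > threshold

    # Example: a sudden drop in loaded rows is flagged for the support engineer.
    print(is_anomalous([10200, 9950, 10480, 10100, 10320], today=4700))  # True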

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location:
United States, Brooklyn
Salary:
Not provided
Premium Health
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles
  • Healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)
Job Responsibility
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems (a minimal harmonization sketch follows this list)
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications
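
The pipeline and harmonization work above is, at its simplest, mapping differently shaped source records onto one common model. A minimal sketch in Python; the source layouts and target fields are illustrative assumptions, not Premium Health's actual schemas:

    # Normalize visit records from two hypothetical source systems into one shared model.
    # Field names on both sides are illustrative assumptions.
    from datetime import date

    def from_ehr(row):
        return {
            "patient_id": row["PatientID"],
            "visit_date": date.fromisoformat(row["VisitDate"]),
            "department": row["Dept"].strip().lower(),
        }

    def from_billing(row):
        return {
            "patient_id": row["pat_id"],
            "visit_date": date.fromisoformat(row["service_dt"]),
            "department": row["cost_center"].strip().lower(),
        }

    def harmonize(ehr_rows, billing_rows):
        # One mapping function per source keeps new systems cheap to add.
        return [from_ehr(r) for r in ehr_rows] + [from_billing(r) for r in billing_rows]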
What we offer
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)
  • Fulltime

Principal Data Engineer

PointClickCare is searching for a Principal Data Engineer who will contribute to...
Location:
United States
Salary:
183200.00 - 203500.00 USD / Year
PointClickCare
Expiration Date
Until further notice
Requirements
  • Principal Data Engineer with at least 10 years of professional experience in software or data engineering, including a minimum of 4 years focused on streaming and real-time data systems
  • Proven experience driving technical direction and mentoring engineers while delivering complex, high-scale solutions as a hands-on contributor
  • Deep expertise in streaming and real-time data technologies, including frameworks such as Apache Kafka, Flink, and Spark Streaming
  • Strong understanding of event-driven architectures and distributed systems, with hands-on experience implementing resilient, low-latency pipelines
  • Practical experience with cloud platforms (AWS, Azure, or GCP) and containerized deployments for data workloads
  • Fluency in data quality practices and CI/CD integration, including schema management, automated testing, and validation frameworks (e.g., dbt, Great Expectations)
  • Operational excellence in observability, with experience implementing metrics, logging, tracing, and alerting for data pipelines using modern tools
  • Solid foundation in data governance and performance optimization, ensuring reliability and scalability across batch and streaming environments
  • Experience with Lakehouse architectures and related technologies, including Databricks, Azure ADLS Gen2, and Apache Hudi
  • Strong collaboration and communication skills, with the ability to influence stakeholders and evangelize modern data practices within your team and across the organization
Job Responsibility
  • Lead and guide the design and implementation of scalable streaming data pipelines
  • Engineer and optimize real-time data solutions using frameworks like Apache Kafka, Flink, Spark Streaming
  • Collaborate cross-functionally with product, analytics, and AI teams to ensure data is a strategic asset
  • Advance ongoing modernization efforts, deepening adoption of event-driven architectures and cloud-native technologies
  • Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads
  • Embed data quality in processing pipelines by defining schema contracts, implementing transformation tests and data assertions, enforcing backward-compatible schema evolution, and automating checks for freshness, completeness, and accuracy across batch and streaming paths before production deployment
  • Establish robust observability for data pipelines by implementing metrics, logging, and distributed tracing for streaming jobs, defining SLAs and SLOs for latency and throughput, and integrating alerting and dashboards to enable proactive monitoring and rapid incident response (see the metrics sketch after this list)
  • Foster a culture of quality through peer reviews, providing constructive feedback and seeking input on your own work
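
The observability responsibility above is largely about emitting standard metrics from each pipeline stage so SLOs and alerts can be defined on top. A rough sketch in Python using the prometheus_client library; the metric names and the process_record step are illustrative assumptions:

    # Expose throughput and latency metrics for a streaming stage on a /metrics endpoint.
    # Metric names and the process_record step are illustrative assumptions.
    import time
    from prometheus_client import Counter, Histogram, start_http_server

    RECORDS = Counter("pipeline_records_total", "Records processed", ["outcome"])
    LATENCY = Histogram("pipeline_process_seconds", "Per-record processing time")

    def handle(record, process_record):
        start = time.monotonic()
        try:
            process_record(record)
            RECORDS.labels(outcome="ok").inc()
        except Exception:
            RECORDS.labels(outcome="error").inc()
            raise
        finally:
            LATENCY.observe(time.monotonic() - start)

    start_http_server(8000)  # scraped by Prometheus; dashboards and alerts are built on top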
What we offer
  • Benefits starting from Day 1!
  • Retirement Plan Matching
  • Flexible Paid Time Off
  • Wellness Support Programs and Resources
  • Parental & Caregiver Leaves
  • Fertility & Adoption Support
  • Continuous Development Support Program
  • Employee Assistance Program
  • Allyship and Inclusion Communities
  • Employee Recognition … and more!
  • Fulltime

Senior Data Engineering Manager

Lead and mentor data engineering team to scale data platform, establish best pra...
Location:
United States, Work at Home, Illinois
Salary:
130295.00 - 260590.00 USD / Year
CVS Health
Expiration Date
January 19, 2026
Requirements
  • Bachelor's or master's degree in Computer Science, Data Science, or related field
  • 7+ years of experience in data engineering
  • 5+ years in technical leadership or management role
  • Experience building and leading high-performing data engineering teams
  • Deep expertise with cloud platforms (AWS, Azure, or GCP)
  • Experience with big data frameworks (Apache Spark, Hadoop)
  • Experience with data warehousing solutions (Snowflake, Redshift, BigQuery)
  • Experience with workflow orchestration tools (Airflow, Dagster)
  • Solid experience with SQL, Python, PySpark
  • Excellent communication, interpersonal, and leadership skills
Job Responsibility
  • Lead, mentor, and grow team of 6-8 data engineers
  • Manage hiring, training, and professional development
  • Conduct performance reviews and provide feedback
  • Own technical vision and roadmap for data platform
  • Lead design, development, and maintenance of data pipelines and data warehouses
  • Drive best practices in data modeling, ETL/ELT processes, and data governance
  • Oversee implementation of new data technologies and architectures
  • Partner with product managers, data scientists, analysts, and other engineering teams
  • Ensure data platform meets standards for performance, security, and data quality
  • Establish culture of operational excellence including monitoring and incident response
What we offer
  • Affordable medical plan options
  • 401(k) plan with matching company contributions
  • Employee stock purchase plan
  • Wellness screenings
  • Tobacco cessation and weight management programs
  • Confidential counseling and financial coaching
  • Paid time off
  • Flexible work schedules
  • Family leave
  • Dependent care resources
  • Fulltime

Staff Data Engineer

As a Staff Data Engineer, you will be leading the architecture, design and devel...
Location:
United States; Canada, Remote
Salary:
Not provided
1Password
Expiration Date
Until further notice
Requirements
  • 8+ years of professional software engineering experience
  • Minimum of 7 years of technical engineering experience building data processing applications (batch and streaming)
  • In-depth, hands-on experience with extensible data modeling, query optimization, and work in Java, Scala, Python, and related technologies
  • Experience in data modeling across external facing product insights and business processes, such as revenue/sales operations, finance, and marketing
  • Experience with Big Data query engines such as Hive, Presto, Trino, Spark
  • Experience with data stores such as Redshift, MySQL, Postgres, Snowflake, etc.
  • Experience using Realtime technologies like Apache Kafka, Kinesis, Flink, etc.
  • Experience building scalable services on top of public cloud infrastructure like Azure, AWS, or GCP with extensive use of datastores like RDBMS, key-value stores, etc.
  • Experience leveraging distributed systems at scale, and systems knowledge of infrastructure, from bare-metal hosts to containers to networking
Job Responsibility
  • Design, develop, and automate large-scale, high-performance batch and streaming data processing systems to drive business growth and enhance product experience
  • Build data engineering strategy that supports a rapidly growing tech company and aligns with the priorities across our product strategy and internal business organizations’ desire to leverage data for more competitive advantages
  • Build scalable data pipelines using best-in-class software engineering practices
  • Develop optimal data models for storage and retrieval, meeting critical product and business requirements
  • Establish and execute short and long-term architectural roadmaps in collaboration with Analytics, Data Platform, Business Systems, Engineering, Privacy and Security
  • Lead efforts on continuous improvement to the efficiency and flexibility of the data, platform, and services
  • Mentor Analytics & Data Engineers on best practices, standards and forward-looking approaches on building robust, extensible and reusable data solutions
  • Influence and evangelize a high standard of code quality, system reliability, and performance
What we offer
  • Maternity and parental leave top-up programs
  • Generous PTO policy
  • Four company-wide wellness days
  • Company equity for all full-time employees
  • Retirement matching program
  • Free 1Password account
  • Paid volunteer days
  • Employee-led inclusion and belonging programs and ERGs
  • Peer-to-peer recognition through Bonusly
  • Fulltime

Staff Data Engineer

We’re looking for a Staff Data Engineer to own the design, scalability, and reli...
Location:
United States, San Jose
Salary:
150000.00 - 250000.00 USD / Year
Figure
Expiration Date
Until further notice
Requirements
  • Experience owning or architecting large-scale data platforms — ideally in EV, autonomous driving, or robotics fleet environments, where telemetry, sensor data, and system metrics are core to product decisions
  • Deep expertise in data engineering and architecture (data modeling, ETL orchestration, schema design, transformation frameworks)
  • Strong foundation in Python, SQL, and modern data stacks (dbt, Airflow, Kafka, Spark, BigQuery, ClickHouse, or Snowflake)
  • Experience building data quality, validation, and observability systems to detect regressions, schema drift, and missing data
  • Excellent communication skills — able to understand technical needs from domain experts (controls, perception, operations) and translate complex data patterns into clear, actionable insights for engineers and leadership
  • First-principles understanding of electrical and mechanical systems, including motors, actuators, encoders, and control loops
Job Responsibility
  • Architect and evolve Figure’s end-to-end platform data pipeline — from robot telemetry ingestion to warehouse transformation and visualization
  • Improve and maintain existing ETL/ELT pipelines for scalability, reliability, and observability
  • Detect and mitigate data regressions, schema drift, and missing data via validation and anomaly-detection frameworks (see the sketch after this list)
  • Identify and close gaps in data coverage, ensuring high-fidelity metrics coverage across releases and subsystems
  • Define the tech stack and architecture for the next generation of our data warehouse, transformation framework, and monitoring layer
  • Collaborate with robotics domain experts (controls, perception, Guardian, fall-prevention) to turn raw telemetry into structured metrics that drive engineering/business decisions
  • Partner with fleet management, operators, and leadership to design and communicate fleet-level KPIs, trends, and regressions in clear, actionable ways
  • Enable self-service access to clean, documented datasets for engineers
  • Develop tools and interfaces that make fleet data accessible and explorable for engineers without deep data backgrounds
  • Fulltime
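
Detecting schema drift and missing data, as described above, can start as a comparison of each incoming batch against an expected contract. A minimal sketch in Python with pandas; the column names, dtypes, and null-rate threshold are illustrative assumptions:

    # Compare an incoming telemetry batch against an expected schema contract.
    # Column names, dtypes, and the 5% null threshold are illustrative assumptions.
    import pandas as pd

    EXPECTED = {"robot_id": "object", "ts": "datetime64[ns]", "joint_torque": "float64"}

    def check_batch(df: pd.DataFrame, max_null_rate: float = 0.05):
        issues = []
        missing = set(EXPECTED) - set(df.columns)
        extra = set(df.columns) - set(EXPECTED)
        if missing:
            issues.append(f"missing columns: {sorted(missing)}")
        if extra:
            issues.append(f"unexpected columns (possible drift): {sorted(extra)}")
        for col, dtype in EXPECTED.items():
            if col in df.columns and str(df[col].dtype) != dtype:
                issues.append(f"{col}: dtype {df[col].dtype}, expected {dtype}")
        for col, rate in df.isna().mean().items():
            if rate > max_null_rate:
                issues.append(f"{col}: {rate:.1%} nulls exceeds {max_null_rate:.0%}")
        return issues  # an empty list means the batch matches the contract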

Senior Manager - Engineering Operations

At Mercury, we build products that help startups and growing companies manage th...
Location:
United States; Canada, San Francisco; New York; Portland
Salary:
239000.00 - 298900.00 USD / Year
Mercury
Expiration Date
Until further notice
Requirements
  • 6+ years of experience in Engineering Operations, Technical Program Management, Engineering Chief of Staff, or adjacent leadership positions in a fast-paced tech company
  • A background in software engineering, with demonstrated success in shaping Engineering-wide or company-wide strategy through operational leadership
  • Experience managing complex engineering programs and initiatives at scale, and measuring impact through data
  • Experience bringing structure to ambiguity, managing through influence, and enabling senior ICs and managers to do their best work
  • Thoughtful systems thinking—you’re able to take a holistic view of the situation or engage with the details, depending on the need of the hour
  • Communicate clearly and with care, whether facilitating a leadership offsite, drafting an all-hands update, or aligning multiple teams around shared goals
Job Responsibility
  • Drive the planning and execution of strategic initiatives across Engineering—which may cover hiring, onboarding, metrics, budgeting, and tooling—working closely with leadership to turn ideas into scalable systems
  • Design and execute programs to enable and expand AI adoption for coding assistance and in-product features
  • Partner with our Strategic Operations team to improve core processes like planning cycles and incident management to help Engineering execute effectively and adapt with scale
  • Act as orchestrator and glue for department-wide initiatives such as tech debt paydown projects, architectural investments, or improvements to the hiring process
  • Overall: build an environment of operational excellence—using data, context, and collaboration to raise the bar for how we work
What we offer
  • base salary
  • equity (stock options)
  • benefits
  • Fulltime