
Software Engineer - Data Ingestion


Coralogix

Location:
Israel, Ramat Gan


Contract Type:
Not provided


Salary:
Not provided

Job Description:

Coralogix is a modern, full-stack observability platform transforming how businesses process and understand their data. Our unique architecture powers in-stream analytics without reliance on expensive indexing or hot storage. We specialize in comprehensive monitoring of logs, metrics, traces, and security events, with features such as APM, RUM, SIEM, Kubernetes monitoring, and more, all enhancing operational efficiency and reducing observability spend by up to 70%.

We are seeking engineers with hands-on experience developing and operating distributed systems and microservices in production. The ideal candidate has a solid background in infrastructure and operations. The team builds high-throughput ingestion and processing pipelines, efficient storage on object stores, and systems for data governance, usage reporting, and routing data to multiple physical locations. You will work on cloud-native production systems written in Rust and Scala, operating at scale with Kafka, Postgres, Redis, and object storage, running on Kubernetes in a multi-cloud environment.

Job Responsibility:

  • Develop and operate distributed systems in production
  • Build Kafka-based ingestion and processing pipelines
  • Design systems for data governance, retention, deletion, and usage reporting
  • Work with Postgres and Redis, requiring solid database design and operational knowledge
  • Implement efficient persistence using column-oriented data formats and object storage

Requirements:

  • Located in Israel
  • 5+ years of software development experience
  • Production experience with large-scale Apache Kafka or comparable distributed data streaming platforms
  • Strong understanding of distributed systems, databases, and production operations
  • Experience with Scala or Rust
  • B.Sc. in Computer Science or an equivalent field

Nice to have:

  • Experience with Kafka Streams or similar frameworks
  • Experience with column-oriented data formats and large-scale analytical storage systems
  • Experience with Apache Arrow and DataFusion

Additional Information:

Job Posted:
March 04, 2026

Employment Type:
Full-time

Similar Jobs for Software Engineer - Data Ingestion

Senior Backend Engineer - Data Ingestion

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
Canada
Salary:
Not provided
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • An understanding of data replication methodologies such as CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites

Senior Backend Engineer - Data Ingestion

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
Spain
Salary:
Not provided
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • An understanding of data replication methodologies such as CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 home office setup if you’re a remote employee
  • Global Gatherings – opportunities to engage with colleagues at company-wide offsites

Software Engineer, Data Infrastructure

The Data Infrastructure team at Figma builds and operates the foundational platf...
Location:
United States, San Francisco; New York
Salary:
149000.00 - 350000.00 USD / Year
Figma
Expiration Date:
Until further notice
Requirements:
  • 5+ years of Software Engineering experience, specifically in backend or infrastructure engineering
  • Experience designing and building distributed data infrastructure at scale
  • Strong expertise in batch and streaming data processing technologies such as Spark, Flink, Kafka, or Airflow/Dagster
  • A proven track record of impact-driven problem-solving in a fast-paced environment
  • A strong sense of engineering excellence, with a focus on high-quality, reliable, and performant systems
  • Excellent technical communication skills, with experience working across both technical and non-technical counterparts
  • Experience mentoring and supporting engineers, fostering a culture of learning and technical excellence
Job Responsibility:
  • Design and build large-scale distributed data systems that power analytics, AI/ML, and business intelligence
  • Develop batch and streaming solutions to ensure data is reliable, efficient, and scalable across the company
  • Manage data ingestion, movement, and processing through core platforms like Snowflake, our ML Datalake, and real-time streaming systems
  • Improve data reliability, consistency, and performance, ensuring high-quality data for engineering, research, and business stakeholders
  • Collaborate with AI researchers, data scientists, product engineers, and business teams to understand data needs and build scalable solutions
  • Drive technical decisions and best practices for data ingestion, orchestration, processing, and storage
What we offer:
  • equity
  • health, dental & vision
  • retirement with company contribution
  • parental leave & reproductive or family planning support
  • mental health & wellness benefits
  • generous PTO
  • company recharge days
  • a learning & development stipend
  • a work from home stipend
  • cell phone reimbursement
  • Full-time

Senior Software Engineer, Core Data

As a Senior Software Engineer on our Core Data team, you will take a leading rol...
Location:
United States
Salary:
190000.00 - 220000.00 USD / Year
Pomelo Care
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience building high-quality, scalable data systems and pipelines
  • Expert-level proficiency in SQL and Python, with a deep understanding of data modeling and transformation best practices
  • Hands-on experience with dbt for data transformation and Dagster, Beam, Dataflow or similar tools for pipeline orchestration
  • Experience with modern data stack tools and cloud platforms, with a strong understanding of data warehouse design principles
  • A track record of delivering elegant and maintainable solutions to complex data problems that drive real business impact
Job Responsibility:
  • Build and maintain elegant data pipelines that orchestrate ingestion from diverse sources and normalize data for company-wide consumption
  • Lead the design and development of robust, scalable data infrastructure that enables our clinical and product teams to make data-driven decisions, using dbt, Dagster, Beam and Dataflow
  • Write clean, performant SQL and Python to transform raw data into actionable insights that power our platform
  • Architect data models and transformations that support both operational analytics and new data-driven product features
  • Mentor other engineers, providing technical guidance on data engineering best practices and thoughtful code reviews, fostering a culture of data excellence
  • Collaborate with product, clinical and analytics teams to understand data needs and ensure we are building infrastructure that unlocks the most impactful insights
  • Optimize data processing workflows for performance, reliability and cost-effectiveness
What we offer:
  • Competitive healthcare benefits
  • Generous equity compensation
  • Unlimited vacation
  • Membership in the First Round Network (a curated and confidential community with events, guides, thousands of Q&A questions, and opportunities for 1-1 mentorship)
  • Full-time

Senior Backend Engineer- Data Ingestion - (ClickPipes Platform)

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
United States
Salary:
133450.00 - 197200.00 USD / Year
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • An understanding of data replication methodologies such as CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites
  • Full-time

Sr. Software Engineer - Data Assets

6sense is on a mission to revolutionize how B2B organizations create revenue by ...
Location:
United States, San Francisco
Salary:
Not provided
6sense
Expiration Date:
Until further notice
Requirements:
  • BS/MS degree in Computer Science or related technical field or equivalent practical experience
  • Experience with Big Data technologies (Hadoop, MapReduce, Hive, Spark, Flink, Metastore, Presto, Kafka, etc.)
  • Experience with performing data analysis, data ingestion and data integration
  • Experience with ETL (Extract, Transform, Load) and architecting data systems
  • Experience with schema design, data modeling and SQL queries
  • Passionate and self-motivated about Big Data technologies
Job Responsibility:
  • Design and build data transformations efficiently and reliably for different purposes (e.g. reporting, growth analysis, multi-dimensional analysis)
  • Design and implement reliable, scalable, robust and extensible big data systems that support core products and business
  • Establish solid design and engineering best practices for engineers as well as non-technical people
What we offer:
  • generous health insurance coverage
  • life and disability insurance
  • a 401K employer matching program
  • paid holidays
  • quarterly self-care days off
  • generous paid time off (PTO)
  • stock options
  • paid parental leave
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2)+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, and Vertex AI
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in its semantic layer
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data-quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Data Engineer III

As a Data Engineer, you will play a key role in designing, developing, and maint...
Location:
India, Chennai
Salary:
Not provided
Arcadia
Expiration Date:
Until further notice
Requirements:
  • 3+ years as a Data Engineer, a data-adjacent Software Engineer, or a do-everything member of a small data team, with a focus on building and maintaining data pipelines
  • Strong Python skills, especially in the context of data orchestration
  • Strong understanding of database management and design, including experience with Snowflake or an equivalent platform
  • Proficiency in SQL
  • Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts
  • Experience with Argo, Prefect, Airflow, or similar data orchestration tools
  • Excellent problem-solving and analytical skills with a strong attention to detail
  • Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
  • Strong communication skills
Job Responsibility:
  • Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt
  • Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions
  • Design, build and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines, and command-line tools for ad-hoc data evaluation
  • Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy
  • Optimize and tune data pipelines for improved performance, scalability, and reliability
  • Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow
  • Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members
  • Implement data governance and security measures to ensure compliance with industry standards and regulations
  • Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate
What we offer:
  • Competitive compensation based on market standards
  • Flexible Leave Policy
  • An office in the heart of the city, in case you need to step in for any purpose
  • Medical insurance (1+5 family members)
  • Comprehensive coverage including an accident policy and life insurance
  • Annual performance cycle
  • Quarterly team engagement activities and rewards & recognitions
  • L&D programs to foster professional growth
  • A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency
  • Full-time