Software Engineer - Data Ingestion

Coralogix

Location:
Israel, Ramat Gan

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Coralogix is a modern, full-stack observability platform transforming how businesses process and understand their data. Our unique architecture powers in-stream analytics without relying on expensive indexing or hot storage. We specialize in comprehensive monitoring of logs, metrics, traces, and security events, with features such as APM, RUM, SIEM, and Kubernetes monitoring, all of which enhance operational efficiency and reduce observability spend by up to 70%.

We are seeking engineers with hands-on experience developing and managing distributed systems and microservices in production, backed by a solid background in infrastructure and operations. The team builds high-throughput ingestion and processing pipelines, efficient storage on object stores, and systems for data governance, usage reporting, and routing data to multiple physical locations. You will work on cloud-native production systems written in Rust and Scala that operate at scale with Kafka, Postgres, Redis, and object storage, run on Kubernetes, and span a multi-cloud environment.

Job Responsibility:

  • Develop and operate distributed systems in production
  • Build Kafka-based ingestion and processing pipelines
  • Design systems for data governance, retention, deletion, and usage reporting
  • Work with Postgres and Redis, requiring solid database design and operational knowledge
  • Implement efficient persistence using column-oriented data formats and object storage

Requirements:

  • Located in Israel
  • 5+ years of software development experience
  • Production experience with large-scale Apache Kafka or comparable distributed data streaming platforms
  • Strong understanding of distributed systems, databases, and production operations
  • Experience with Scala or Rust
  • B.Sc. in Computer Science or an equivalent field

Nice to have:

  • Experience with Kafka Streams or similar frameworks
  • Experience with column-oriented data formats and large-scale analytical storage systems
  • Experience with Apache Arrow and DataFusion

Additional Information:

Job Posted:
March 04, 2026

Employment Type:
Full-time
Similar Jobs for Software Engineer - Data Ingestion

Senior Backend Engineer - Data Ingestion

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
Canada
Salary:
Not provided
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • Understanding of data replication methodologies like CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites

Senior Backend Engineer - Data Ingestion

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
Spain
Salary:
Not provided
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • Understanding of data replication methodologies like CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – opportunities to engage with colleagues at company-wide offsites

Software Engineer, Data Infrastructure

The Data Infrastructure team at Figma builds and operates the foundational platf...
Location:
United States, San Francisco; New York
Salary:
149000.00 - 350000.00 USD / Year
Figma
Expiration Date:
Until further notice
Requirements:
  • 5+ years of Software Engineering experience, specifically in backend or infrastructure engineering
  • Experience designing and building distributed data infrastructure at scale
  • Strong expertise in batch and streaming data processing technologies such as Spark, Flink, Kafka, or Airflow/Dagster
  • A proven track record of impact-driven problem-solving in a fast-paced environment
  • A strong sense of engineering excellence, with a focus on high-quality, reliable, and performant systems
  • Excellent technical communication skills, with experience working across both technical and non-technical counterparts
  • Experience mentoring and supporting engineers, fostering a culture of learning and technical excellence
Job Responsibility:
  • Design and build large-scale distributed data systems that power analytics, AI/ML, and business intelligence
  • Develop batch and streaming solutions to ensure data is reliable, efficient, and scalable across the company
  • Manage data ingestion, movement, and processing through core platforms like Snowflake, our ML Datalake, and real-time streaming systems
  • Improve data reliability, consistency, and performance, ensuring high-quality data for engineering, research, and business stakeholders
  • Collaborate with AI researchers, data scientists, product engineers, and business teams to understand data needs and build scalable solutions
  • Drive technical decisions and best practices for data ingestion, orchestration, processing, and storage
What we offer:
  • equity
  • health, dental & vision
  • retirement with company contribution
  • parental leave & reproductive or family planning support
  • mental health & wellness benefits
  • generous PTO
  • company recharge days
  • a learning & development stipend
  • a work from home stipend
  • cell phone reimbursement
  • Employment Type: Full-time

Senior Software Engineer, Core Data

As a Senior Software Engineer on our Core Data team, you will take a leading rol...
Location:
United States
Salary:
190000.00 - 220000.00 USD / Year
Pomelo Care
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience building high-quality, scalable data systems and pipelines
  • Expert-level proficiency in SQL and Python, with a deep understanding of data modeling and transformation best practices
  • Hands-on experience with dbt for data transformation and Dagster, Beam, Dataflow or similar tools for pipeline orchestration
  • Experience with modern data stack tools and cloud platforms, with a strong understanding of data warehouse design principles
  • A track record of delivering elegant and maintainable solutions to complex data problems that drive real business impact
Job Responsibility:
  • Build and maintain elegant data pipelines that orchestrate ingestion from diverse sources and normalize data for company-wide consumption
  • Lead the design and development of robust, scalable data infrastructure that enables our clinical and product teams to make data-driven decisions, using dbt, Dagster, Beam and Dataflow
  • Write clean, performant SQL and Python to transform raw data into actionable insights that power our platform
  • Architect data models and transformations that support both operational analytics and new data-driven product features
  • Mentor other engineers, providing technical guidance on data engineering best practices and thoughtful code reviews, fostering a culture of data excellence
  • Collaborate with product, clinical and analytics teams to understand data needs and ensure we are building infrastructure that unlocks the most impactful insights
  • Optimize data processing workflows for performance, reliability and cost-effectiveness
What we offer:
  • Competitive healthcare benefits
  • Generous equity compensation
  • Unlimited vacation
  • Membership in the First Round Network (a curated and confidential community with events, guides, thousands of Q&A questions, and opportunities for 1-1 mentorship)
  • Employment Type: Full-time

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
180000.00 - 240000.00 USD / Year
Skydio
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
  • Employment Type: Full-time

Senior Backend Engineer- Data Ingestion - (ClickPipes Platform)

The ClickPipes Platform plays a critical role in driving the growth of our compa...
Location:
United States
Salary:
133450.00 - 197200.00 USD / Year
ClickHouse
Expiration Date:
Until further notice
Requirements:
  • 5+ years of relevant software development industry experience building data-intensive software solutions
  • Strong knowledge of Golang and experience with its ecosystem
  • Experience with distributed systems and microservices architecture
  • The ability to design and build robust ETL data pipelines that can handle large volumes of data reliably and efficiently
  • Understanding of data replication methodologies like CDC
  • Good knowledge of cloud-native architecture and practical experience with at least one major CSP
  • Excellent communication skills and the ability to work well within a team and across engineering teams
  • A strong problem solver with solid production debugging skills
Job Responsibility:
  • Develop and enhance integrations with various data sources including streaming platforms, databases, data lakes, and object stores
  • Continuously improve our systems based on operational metrics, customer feedback, and evolving business requirements
  • Drive technical discussions and contribute to architectural decisions that impact our platform's scalability and resilience
  • Participate in on-call rotations to ensure system reliability and respond to production incidents
What we offer:
  • Flexible work environment - ClickHouse is a globally distributed company and remote-friendly
  • Healthcare - Employer contributions towards your healthcare
  • Equity in the company - Every new team member who joins our company receives stock options
  • Time off - Flexible time off in the US, generous entitlement in other countries
  • A $500 Home office setup if you’re a remote employee
  • Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites
  • Employment Type: Full-time

Senior Software Engineer - Data Team

We’re looking for Software Engineers to join our Data Department, developers wit...
Location:
Spain, Barcelona; Madrid
Salary:
50000.00 - 70000.00 EUR / Year
Fever
Expiration Date:
Until further notice
Requirements:
  • Python Proficiency: confident working deeply in Python, understand topics like the GIL, concurrency (asyncio), generators, and decorators, care about maintainable typing and thoughtful performance optimization
  • Architecture Patterns: comfortable applying Hexagonal Architecture to keep domain logic clean and decoupled, can leverage patterns like CQRS and the Transactional Outbox to support consistency and reliability in an event-driven environment
  • Database Polyglot: strong SQL fundamentals, know how to design for performance (PostgreSQL internals, indexing strategies), understand when tools like Redis (caching) or Elasticsearch (search/aggregations) are the right fit
  • Communication: communicate clearly in English across audiences
  • Pragmatic mindset: balance quality with impact, able to make thoughtful trade-offs, deliver iteratively, and keep an eye on long-term sustainability while moving at a good pace
Job Responsibility:
  • Architect and Build: Design, implement, and maintain scalable microservices using Python (FastAPI/Django), take ownership of breaking down complex monoliths or building new services from the ground up, applying DDD principles
  • Master the Event Stream: Build robust, event-driven flows with Kafka, ensure that our events are durable, ordered, and processed idempotently, managing eventual consistency with care
  • Integrate at Scale: Design fault-tolerant integrations with third-party ecosystems (Meta Ads, Google Marketing Platform, Salesforce), navigate rate limits, retries, and circuit breakers to maintain platform stability
  • Bridge OLTP and OLAP: Work at the intersection of transactional applications and analytical data, optimize PostgreSQL for operational efficiency while designing ingestion pipelines for Snowflake and Elasticsearch, using Airflow and dbt
  • Productionize Data Capabilities: Partner closely with Data Science, Machine Learning, and Data Engineering teams to ensure seamless integration of data sources and model infrastructure
  • Elevate the Bar: Lead thorough code reviews, write RFCs for key technical decisions, and mentor mid-level engineers, champion testing strategies (unit, integration, contract testing) and advocate for clean, sustainable code architecture
What we offer:
  • Responsibility from day one and professional and personal growth
  • Opportunity to have a real impact in a high-growth global category leader
  • A compensation package consisting of base salary and the potential to earn a significant bonus for top performance
  • Stock options plan
  • 40% discount on all Fever events and experiences
  • Home office friendly
  • Health insurance and other benefits such as Flexible remuneration with a 100% tax exemption through Cobee
  • English / Spanish Lessons
  • Wellhub Membership
  • Possibility to receive in advance part of your salary by Payflow
  • Employment Type: Full-time

Sr. Software Engineer - Data Assets

6sense is on a mission to revolutionize how B2B organizations create revenue by ...
Location:
United States, San Francisco
Salary:
Not provided
6sense
Expiration Date:
Until further notice
Requirements:
  • BS/MS degree in Computer Science or related technical field or equivalent practical experience
  • Experience with Big Data technologies (Hadoop, M/R, Hive, Spark, Flink, Metastore, Presto, Kafka, etc.)
  • Experience with data analysis, data ingestion, and data integration
  • Experience with ETL (Extraction, Transformation & Loading) and architecting data systems
  • Experience with schema design, data modeling and SQL queries
  • Passionate and self-motivated about Big Data technologies
Job Responsibility:
  • Design and build data transformations efficiently and reliably for different purposes (e.g. reporting, growth analysis, multi-dimensional analysis)
  • Design and implement reliable, scalable, robust and extensible big data systems that support core products and business
  • Establish solid design and best engineering practice for engineers as well as non-technical people
What we offer:
  • generous health insurance coverage
  • life, and disability insurance
  • a 401K employer matching program
  • paid holidays
  • self-care days
  • paid time off (PTO)
  • stock options
  • paid parental leave
  • generous paid time-off
  • quarterly self-care days off
  • Employment Type: Full-time