We're looking for a Data Engineer to join our Data Platform team and build the infrastructure that powers Socket's data ecosystem. You'll design and maintain systems that handle billions of records, enable real-time analytics, and power the insights our customers rely on to secure their software supply chain. This is a high-impact role where you'll work across the stack, from ingestion pipelines to analytics APIs, ensuring data flows reliably and is accessible when teams need it.
Job Responsibilities:
Design and build scalable data pipelines that ingest, process, and transform high-volume event streams and historical data
Develop and maintain APIs that deliver analytics, trend reports, and drill-down capabilities to internal teams and external customers
Build robust infrastructure for data quality monitoring, ensuring accuracy and completeness across customer and artifact datasets
Optimize data storage and query performance using systems like ClickHouse, Kafka, NATS, and PostgreSQL to support real-time and batch use cases
Implement usage tracking, auditing, and event processing systems that provide visibility into platform behavior
Create reliable data ingestion systems for security scan results, SBOM data, and artifact metadata
Build infrastructure for outbound integrations that deliver Socket data to customer systems
Collaborate with product, security research, and engineering teams to understand data needs and deliver solutions that scale
Requirements:
5+ years of professional software engineering experience
3+ years of experience building data pipelines and infrastructure in production environments
Strong proficiency in Node.js and TypeScript for backend development
Experience with Kafka or other streaming platforms like NATS, RabbitMQ, or Kinesis in event-driven architectures
Hands-on experience with ClickHouse or other columnar/OLAP databases like BigQuery, Snowflake, DuckDB, or similar
Solid understanding of data modeling, schema design, and query optimization
Familiarity with Parquet or other cloud data lake technologies like Delta Lake or Iceberg
Experience building REST APIs and data access layers for analytics use cases
Comfort working with large-scale distributed systems and debugging performance bottlenecks
Strong ownership mindset - you take responsibility for the systems you build and ensure they're reliable
Clear communication skills - you can explain technical trade-offs to both engineers and non-technical stakeholders
Nice to have:
Experience with time-series data and real-time analytics
Familiarity with security or DevOps tooling ecosystems
Background working with SBOM formats or supply chain security concepts
Experience with data quality frameworks and observability tools
Understanding of multi-tenant architectures and data isolation patterns
What we offer:
Market-competitive salary bands
Meaningful equity program
Comprehensive health benefits for you and your family
Flexible time-off, holidays, and winter shutdown to rest & recharge