Altamira is seeking a Data Engineer to design, build, and operate high-performance data pipelines and event-driven systems supporting mission-critical platforms. This role focuses on implementing and managing Apache Kafka–based messaging architectures and integrating real-time data streams with cloud-native applications and analytics platforms. The ideal candidate brings strong experience with distributed systems, data streaming technologies, and cloud platforms, and is comfortable working in secure, high-reliability environments.
Job Responsibilities:
Design, deploy, and operate Apache Kafka clusters in classified and hybrid environments
Build and maintain reliable, scalable, and secure data streaming pipelines
Develop and optimize producers, consumers, and stream processing applications
Configure and manage topics, partitions, replication, and retention policies
Monitor, tune, and troubleshoot Kafka performance, availability, and latency
Integrate streaming platforms with databases, storage systems, and analytics tools
Implement data governance, retention, and access control policies
Automate deployment and management of streaming infrastructure
Collaborate with platform, infrastructure, and application teams to support data requirements
Support system accreditation, compliance, and security requirements
Participate in architecture design and technical planning activities
Requirements:
Active TS/SCI clearance
Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience)
Experience in data engineering, distributed systems, or backend engineering roles
Hands-on experience with Apache Kafka in production environments
Experience building and supporting real-time data pipelines
Strong proficiency in Java, Python, Scala, or similar programming languages
Experience working in AWS or hybrid cloud environments
Strong Linux systems administration and troubleshooting skills
Ability to work effectively in secure, mission-focused environments
Nice to have:
Experience with Kafka Connect, Kafka Streams, or similar frameworks
Experience with stream processing platforms such as Flink or Spark Streaming
Experience with PostgreSQL, Redis, ArangoDB, or other data platforms
Experience with object storage systems such as MinIO or S3
Familiarity with Kubernetes-based deployments
Experience implementing data security and compliance controls
Prior experience supporting DoD or Intelligence Community programs