Welcome to an exciting opportunity at Ericsson, where you'll step into the role of a Data Engineer working on high-volume, low-latency data platforms powering next-gen GenAI systems, AI agents, and enterprise copilots for Telecom OSS/BSS. We build production-grade systems that drive automation, intelligence, and decisioning at scale. Real systems. Real impact.
Job Responsibilities:
Build and operate large-scale, high-throughput data systems handling massive datasets
Develop complex, scalable solutions using Python and Java
Design and optimize distributed data pipelines using Apache Spark
Engineer low-latency, high-performance data processing systems (batch + streaming)
Work with Cassandra and OpenSearch/Elasticsearch for high availability and scale
Develop scalable backend services and REST APIs (Spring Boot-based microservices)
Work with the AWS (Kiro) / Microsoft Copilot stack
Develop MCP-based applications and integrations with enterprise systems
Build RAG pipelines across network, service, customer, and operational data
Engineer data pipelines for embeddings, vector stores, and retrieval systems
Implement end-to-end Data/MLOps pipelines using Docker, Kubernetes, Kubeflow, and CI/CD
Ensure system performance, scalability, observability, and reliability
Manage and mitigate FOSS (Free & Open Source Software) vulnerabilities using security scanning and patching practices
Requirements:
Strong Python and Java expertise with experience building production-grade systems
Hands-on experience with Apache Spark (PySpark/Scala/Java) and distributed processing
Proven experience in high-volume, low-latency system design and optimization
Strong knowledge of Cassandra, OpenSearch/Elasticsearch, and NoSQL data modeling
Experience building scalable APIs and microservices (Spring Boot)
Hands-on experience with cloud platforms (AWS or GCP)
Strong working knowledge of Docker and Kubernetes
Experience with vulnerability management and non-functional features (alarms, logging, fault management, etc.)
Good understanding of LLMs, embeddings, RAG, and GenAI data pipelines
Exposure to developer copilots / AI-assisted coding tools