
Confluent Kafka Developer


Sopra Steria


Location:
India, Bengaluru



Contract Type:
Employment contract


Salary:

Not provided

Job Description:

We are looking for a Confluent Kafka Developer to manage and support our Kafka ecosystem. This role involves supporting L2 activities, ensuring reliability and performance through monitoring, and collaborating with development teams to implement low-latency data streaming use cases. The role requires experience with Confluent Kafka administration and backend development skills in Java, .NET, or Node.js.
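For illustration only: low-latency Kafka producers are usually tuned through client configuration rather than code, and the same settings exist across the Java, .NET, and Node.js clients mentioned here. A minimal sketch of producer properties (the broker hostnames are hypothetical, and the values are common starting points, not this employer's settings):

```properties
# Hypothetical cluster endpoints; replace with the real brokers
bootstrap.servers=broker1.example.com:9092,broker2.example.com:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer

# Favor latency over throughput: send immediately, skip compression
linger.ms=0
batch.size=16384
compression.type=none

# acks=1 trades some durability for lower produce latency;
# use acks=all where message loss is unacceptable
acks=1
```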

Job Responsibility:

  • Managing and supporting our Kafka ecosystem
  • Supporting L2 activities
  • Ensuring reliability and performance via monitoring
  • Implementing data streaming use-cases with development teams
  • Adhering to internal compliance policies and procedures
  • Acting with integrity and ensuring sustainable growth

Requirements:

  • Confluent Kafka: primarily a developer role (around 70% development); administration experience (around 30%) is a plus
  • Node.js/Java/.NET: 3+ years of experience
  • 4-6 years of experience with a BE/B.Tech degree

Nice to have:

  • Experience with Java, .NET or Node.js
  • Strong collaboration with the Kafka team in ASA and suppliers

What we offer:
  • Inclusive and respectful work environment
  • Open to people with disabilities

Additional Information:

Job Posted:
April 26, 2025

Employment Type:
Full-time
Work Type:
On-site work



Similar Jobs for Confluent Kafka Developer

Kafka Technical Lead

We are looking for an experienced Confluent Kafka Administrator/Developer to ass...
Location:
India, Bengaluru
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • Confluent Kafka
  • Node/Java/.NET - >=3 Years experience
  • Strong experience with Confluent Kafka administration
  • Strong experience with developing low-latency data streaming solutions
  • Experience as a backend developer with Java, .NET or Node.js
Job Responsibility:
  • Managing and supporting our Kafka ecosystem
  • Supporting L2 activities and ensuring reliability and performance via monitoring
  • Working with development teams to implement data streaming use-cases
  • Awareness of compliance risks
  • Commitment to act with integrity and adherence to policies governing business activities
  • Compliance to protect Airbus reputation and brand

Kafka Engineer

This role is integral within our client's team, where you will be tasked with ad...
Location:
United States, Santa Clara
Salary:
Not provided
Robert Half
Expiration Date
Until further notice
Requirements:
  • Minimum of 10 years of experience as a Software Engineer
  • Strong problem-solving abilities and analytical skills
  • Solid understanding of software engineering principles and methodologies
  • Exceptional communication skills, both written and verbal
  • Strong ability to learn new technologies quickly and apply them in problem-solving
  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Prior experience in managing a team or leading a project will be considered an advantage
  • Proactive approach, with the ability to handle multiple projects simultaneously and meet deadlines
  • Familiarity with other programming languages or technologies is a plus
Job Responsibility:
  • Administering and maintaining various aspects of Confluent Kafka clusters including multi DC brokers, connectors, C3, KSQL DB, Rest Proxy, and Schema registry
  • Configuring and managing Kafka topics, RBAC, connectors, KSQL, and schema registry while adhering to security, availability, scalability, and DR standards
  • Supporting Java, Node.js, and Python based Kafka clients and microservices
  • Performing basic administration tasks of Apache Nifi (OSS)
  • Understanding user data flow requirements and designing and developing Kafka-based solutions using Confluent Kafka, Connectors, KSQL, and Nifi
  • Providing low-code data flow alternatives
  • Utilizing experience with data, cloud (AWS), and Queues Connectors in both design and configuration tasks
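The connector-management work listed above is typically driven by JSON submitted to the Kafka Connect REST API. As a sketch, using the stock FileStreamSource connector that ships with Apache Kafka (the connector name, file path, and topic below are hypothetical):

```json
{
  "name": "demo-file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/demo-input.txt",
    "topic": "demo-topic"
  }
}
```

Posted via `POST /connectors` to the Connect REST endpoint, this creates a connector that tails the file into the topic; production connectors (JDBC, queues, etc.) follow the same shape with different `config` keys.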
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in company 401(k) plan

Backend Software Developer

We are seeking a highly skilled developer with backend expertise, particularly i...
Location:
United States, Grand Rapids
Salary:
Not provided
Robert Half
Expiration Date
Until further notice
Requirements:
  • 5 to 10 years of relevant experience in backend development, preferably using .NET and C#
  • Hands-on experience with Microsoft Azure (Function Apps, Durable Functions, Logic Apps, Service Bus, Cosmos DB, Storage Accounts)
  • Hands-on experience with Confluent Kafka
  • Hands-on experience with CI/CD best practices
  • Hands-on experience with unit test case development
Job Responsibility:
  • Follow the company’s software development lifecycle to design, code, configure, test, debug, and document systems and application programs
  • Assist in preparing technical design specifications based on functional requirements and analysis documents
  • Review functional requirements, analysis, and design documents, providing constructive feedback
  • Collaborate with other developers to ensure quality, consistency, and maintainability of code
  • Participate in architecture design discussions and code reviews
  • Develop and maintain system and operational-level documentation
  • Work within the SAFe Agile framework
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in the company 401(k) plan
  • Access to top jobs
  • Competitive compensation
  • Free online training

Kafka Administration

Location:
United States
Salary:
Not provided
Thirdeye Data
Expiration Date
Until further notice
Requirements:
  • 3-5 years of hands-on experience managing Confluent Kafka clusters in production environments, preferably on-premises
  • Solid understanding of distributed systems, high availability, and failover mechanisms
  • Experience with Kafka Cluster Linking and cross-cluster replication
Job Responsibility:
  • Install, configure, and maintain Confluent Kafka clusters in an on-premises environment
  • Manage Kafka brokers, Zookeeper, Connect, Kafka Streams, and related services
  • Monitor and tune the performance of Kafka clusters to meet operational SLAs
  • Implement and manage Kafka security, including SSL/TLS, ACLs, and role-based access control (RBAC)
  • Ensure that Kafka infrastructure complies with the organization’s security policies and best practices
  • Set up monitoring and alerting for Kafka clusters using tools like Prometheus, Grafana, or Confluent Control Center
  • Analyze logs and metrics to proactively detect and resolve issues
  • Regularly perform maintenance and upgrades for Kafka components (brokers, Zookeeper, etc.)
  • Handle Kafka backup and disaster recovery procedures
  • Provide operational support and incident management for Kafka-related issues
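As a sketch of the security work described above, SSL and ACL enforcement on a broker are configured in `server.properties`. The hostname, keystore paths, passwords, and principal below are placeholders, not values from any real deployment:

```properties
# TLS listener (hypothetical hostname and keystore locations)
listeners=SSL://broker1.example.com:9093
ssl.keystore.location=/var/private/ssl/broker1.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/var/private/ssl/broker1.truststore.jks
ssl.truststore.password=changeit
ssl.client.auth=required

# ACL enforcement (ZooKeeper-based clusters; KRaft clusters use
# org.apache.kafka.metadata.authorizer.StandardAuthorizer instead)
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
super.users=User:admin
```

Individual grants are then managed with `kafka-acls.sh` or, on Confluent Platform, through RBAC role bindings.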

Applications Development Sr Programmer Analyst

Integration Services within Common Platform Engineering is responsible for devel...
Location:
Canada, Mississauga
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements:
  • Experience working in Financial Services or a large complex and/or global environment
  • Experience with the following technologies: Kafka ecosystem (Confluent distribution preferred)
  • Kubernetes and OpenShift
  • Java
  • React
  • Familiarity with SRE practices
  • Consistently demonstrates clear and concise written and verbal communication
Job Responsibility:
  • Designing and developing workflow solutions to integrate Kafka with our data governance and control platforms
  • Understanding the existing onboarding flow and working to streamline and simplify the process
  • Designing and developing developer-facing tooling to manage topics and connectors
  • Helping to deliver the SRE requirements for this stack

Data Operations Engineer

BA Markets wants to professionalise and streamline its activities with regards t...
Location:
Poland, Katowice
Salary:
Not provided
Vattenfall
Expiration Date
Until further notice
Requirements:
  • Interest in understanding user needs, with several years of hands-on experience as a software developer and an interest in the responsibilities of a data engineer, or vice versa
  • A proactive, communicative team player
  • Fluent in English
  • Deep understanding of Kafka architecture (brokers, topics, partitions, replication), and experience with Kafka Streams, Kafka Connect, and schema registry (e.g., Confluent)
  • Proficiency in designing and managing Kafka clusters (including monitoring and scaling)
  • Hands-on Experience building and maintaining real-time ETL pipelines
  • Familiarity with stream processing frameworks such as Apache Flink or Apache Spark Streaming
  • Strong skills in Python and Java, at least basic Scala, and solid SQL experience
  • Experience building several CI/CD pipelines (e.g., Jenkins, GitLab CI, GitHub Actions)
  • Experience with Infrastructure as Code (IaC) tools such as Terraform or Ansible
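As one concrete shape of the CI/CD experience listed above, a minimal GitHub Actions workflow for a Python service might look like this (the file path, job name, and project layout are hypothetical):

```yaml
# .github/workflows/ci.yml (hypothetical path and project layout)
name: ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      # Install the project's pinned dependencies, then run its test suite
      - run: pip install -r requirements.txt
      - run: pytest
```

Jenkins and GitLab CI express the same pipeline in a Jenkinsfile or `.gitlab-ci.yml`; the pipeline-as-code idea is identical across the three.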
Job Responsibility:
  • Stream architecture, development, and deployment
  • Automate workflows and orchestrate data pipelines
  • Implement CI/CD routines
  • Implement and monitor “system health” with observability tools and data quality checks
  • Support the development of client libraries so other applications and services can integrate streams into their own applications
  • Perform Python development
  • Perform “glue code” development that roughly 95% of use cases can reuse
What we offer:
  • Good remuneration
  • Challenging and international work environment
  • Possibility to work with some of the best in the field
  • Working in interdisciplinary teams
  • Support from committed colleagues
  • Attractive employment conditions
  • Opportunities for personal and professional development

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location:
Singapore
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibility:
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer:
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits

Data Engineer

This is a data engineer position - a programmer responsible for the design, deve...
Location:
India, Chennai
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements:
  • 5-8 years of experience working in data ecosystems
  • 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing 'big data' data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
Job Responsibility:
  • Ensuring high quality software development, with complete documentation and traceability
  • Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large scale financial data
  • Design and implement distributed computing solutions for risk modeling, pricing and regulatory compliance
  • Ensure efficient data storage and retrieval using Big Data
  • Implement best practices for Spark performance tuning, including partitioning, caching, and memory management
  • Maintain high code quality through testing, CI/CD pipelines and version control (Git, Jenkins)
  • Work on batch processing frameworks for Market risk analytics
  • Promoting unit/functional testing and code inspection processes
  • Work with business stakeholders and Business Analysts to understand the requirements
  • Work with other data scientists to understand and interpret complex datasets