Cyber Defense within CVS Health is seeking a highly experienced Staff Data Engineer to join and technically lead our growing team. The Staff Data Engineer will be responsible for developing and deploying security-specific data engineering solutions that follow CVS Health business and technology requirements. Additionally, this Staff Data Engineer will help define, drive, and deliver all aspects of the data engineering product development lifecycle, from solution architecture and programming through testing, implementation, and delivery of products.
Job Responsibility:
Developing and deploying security-specific data engineering solutions that follow CVS Health business and technology requirements
Defining, driving and delivering all aspects of the data engineering product development lifecycle
Driving the development and implementation of advanced data engineering pipelines and algorithms to solve complex data problems
Collaborating with multiple departments to understand business requirements, define data engineering projects, and prioritize initiatives
Defining performance metrics and evaluation methodologies for data engineering products
Contributing to rigorous testing, validation, and performance monitoring of products to ensure accuracy and reliability
Advising on the optimization and improvement of data pipelines and infrastructure
Presenting technical findings, insights, and recommendations to both technical and non-technical stakeholders
Managing team performance through regular, timely feedback
Staying up-to-date with the latest advancements in data engineering and related technologies
Requirements:
7+ years of programming experience in Python and experience with libraries such as PySpark
5+ years of experience deploying complex streaming pipelines
5+ years of experience soliciting complex requirements and managing relationships with key stakeholders
5+ years of experience independently managing deliverables
3+ years of customer interfacing experience (internal or external), demonstrating excellent ability to communicate technical ideas and results to non-technical audiences
Bachelor's degree from an accredited university or equivalent work experience (HS diploma + 4 years of relevant experience)
Nice to have:
Proficient in Databricks Delta Streaming for real-time data processing and analytics
Experience with Apache Kafka for building real-time data pipelines and streaming applications
Knowledge of Apache Flink for stream processing and complex event processing
Familiarity with Microsoft Azure services, including Azure Data Lake, Azure Databricks, and Azure Stream Analytics
Experience with Google Cloud Platform (GCP) services, such as Google BigQuery, Google Cloud Dataflow, and Google Pub/Sub
Skills in designing and implementing ETL processes using Databricks and other tools (a minimal sketch of such a pipeline follows this list)
Ability to integrate data from various sources using Kafka and Flink
Proficient in programming languages such as Python, Scala, or Java for data processing and application development
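As a rough illustration of the kind of streaming ETL work described above, and not the team's actual implementation, the following minimal PySpark Structured Streaming sketch reads JSON security events from a Kafka topic and appends them to a Delta table. The broker address, topic name, event schema, and storage paths are hypothetical placeholders, and the job assumes the Kafka and Delta Lake packages are available on the cluster.

# Minimal sketch only: stream security events from Kafka into a Delta table.
# Broker, topic, schema, and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("security-events-stream").getOrCreate()

# Illustrative payload schema for a security event
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("source_ip", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream; the message value arrives as bytes
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "security-events")            # hypothetical topic
    .load()
)

# Parse the JSON payload into typed columns
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table, with a checkpoint so the stream can recover on restart
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/security-events")  # hypothetical path
    .outputMode("append")
    .start("/tmp/delta/security_events")  # hypothetical table path
)
query.awaitTermination()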