Senior Data Engineer

CVS Health

Location:
Hartford, United States

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
101,970.00 - 203,940.00 USD / Year

Job Description:

At CVS Health, we're building a world of health around every consumer and surrounding ourselves with dedicated colleagues who are passionate about transforming health care. As the nation's leading health solutions company, we reach millions of Americans through our local presence, digital channels and more than 300,000 purpose-driven colleagues – caring for people where, when and how they choose in a way that is uniquely more connected, more convenient and more compassionate.

Job Responsibility:

  • Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery
  • Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts
  • Optimize BigQuery performance through partitioning, clustering, and query tuning
  • Implement data governance, security, and compliance best practices within GCP
  • Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing
  • Monitor and troubleshoot data pipeline performance, failures, and cost efficiency
  • Collaborate with data scientists, analysts, and software engineers to support business requirements
  • Ensure data quality, validation, and integrity using appropriate testing frameworks
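
As a rough illustration of the data-quality responsibility above, a row-level validation step in a pipeline might look like the following minimal Python sketch. The field names (member_id, claim_amount) are hypothetical examples, not CVS Health's actual schema or tooling:

```python
# Minimal data-quality check: validate records before loading them downstream.
# Field names below are hypothetical illustrations, not a real schema.

def validate_row(row: dict) -> list:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    if not row.get("member_id"):
        errors.append("member_id is missing")
    amount = row.get("claim_amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("claim_amount must be a non-negative number")
    return errors

def validate_batch(rows: list) -> tuple:
    """Split a batch into (valid_rows, rejected_rows_with_reasons)."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected
```

In practice a check like this would typically run inside a pipeline step (e.g. a Beam DoFn or an Airflow task), with rejected rows routed to a dead-letter destination for inspection.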

Requirements:

  • Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.)
  • Proficiency in SQL, Python, and Java for data processing and automation
  • Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform
  • Strong understanding of data modeling, warehousing, and distributed computing
  • Experience with real-time and batch processing architectures
  • Knowledge of CI/CD pipelines, Git, and DevOps best practices
  • Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.)
  • 5+ years of relevant work experience
  • Bachelor's degree or equivalent experience (HS diploma + 4 years relevant experience)

Nice to have:

  • Experience with machine learning pipelines on GCP (Vertex AI, AI Platform, etc.)
  • Exposure to Kafka, NiFi, or other streaming technologies
  • Experience with containerization and orchestration (Docker, Kubernetes, GKE)
  • GCP certifications (e.g., Professional Data Engineer, Associate Cloud Engineer)

What we offer:

  • Affordable medical plan options
  • 401(k) plan with matching company contributions
  • Employee stock purchase plan
  • No-cost wellness screenings
  • Tobacco cessation and weight management programs
  • Confidential counseling and financial coaching
  • Paid time off
  • Flexible work schedules
  • Family leave
  • Dependent care resources
  • Colleague assistance programs
  • Tuition assistance
  • Retiree medical access

Additional Information:

Job Posted:
October 09, 2025

Expiration:
October 27, 2025

Employment Type:
Full-time

Work Type:
Hybrid work
Job Link Share:
Welcome to CrawlJobs.com
Your Global Job Discovery Platform
At CrawlJobs.com, we simplify finding your next career opportunity by bringing job listings directly to you from all corners of the web. Using cutting-edge AI and web-crawling technologies, we gather and curate job offers from various sources across the globe, ensuring you have access to the most up-to-date job listings in one place.