CrawlJobs

GCP Data Engineer Jobs (Hybrid Work)

2 Job Offers

Data Engineer with GCP
Join WE ARE META as a Data Engineer with GCP expertise. This hybrid role in Porto requires 3+ years in data engineering, ETL, SQL, and hands-on GCP experience. Fluency in English and French is mandatory. Enjoy a welcome kit, health insurance, career growth, and a Coverflex meal card.
Location
Portugal, Porto
Salary
Not provided
We Are Meta (wearemeta.io)
Expiration Date
Until further notice
GCP Data Engineer
Seeking a certified Google Cloud Data Engineer in Pune. Design and manage Big Data pipelines using GCP tools such as BigQuery, Dataflow, and Dataproc. Build real-time systems with Kafka and support MLOps. Requires 3-5 years of experience with DevOps and containers, plus telecom domain knowledge.
Location
India, Pune
Salary
Not provided
Vodafone (vodafone.com)
Expiration Date
Until further notice
Explore the dynamic and in-demand field of GCP Data Engineer jobs, where professionals architect the foundational data systems that power modern analytics and artificial intelligence. A GCP Data Engineer specializes in designing, building, and maintaining robust data infrastructure on the Google Cloud Platform. Their core mission is to transform raw, often chaotic data into reliable, accessible, and secure information assets that drive business intelligence, machine learning models, and data-driven decision-making across an organization.

Professionals in this role are responsible for the end-to-end data lifecycle. They typically design and implement scalable data pipelines, which involves ingesting data from various sources (batch and real-time), processing it, and storing it in optimized formats. A significant part of their day-to-day work involves leveraging core GCP services such as BigQuery for data warehousing, Dataflow or Dataproc for data processing, Pub/Sub for event streaming, and Cloud Storage for data lakes. They ensure these pipelines are efficient, cost-effective, and performant through continuous monitoring and optimization.

Common responsibilities extend beyond pipeline construction. GCP Data Engineers enforce data governance, security protocols, and compliance standards. They implement critical data quality checks and validation rules to ensure the integrity of the information. Collaboration is key, as they work closely with data scientists to operationalize machine learning models, with data analysts to understand business requirements, and with other engineers to integrate data systems. Creating and maintaining clear documentation for data processes and architectures is also a standard duty.

The typical skill set for these jobs is a blend of cloud expertise, programming proficiency, and architectural thinking. Strong programming skills in Python and SQL are fundamental requirements.
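The validate-then-transform step described above can be sketched in plain Python. This is a minimal illustration, not a production pipeline: the field names and rules are hypothetical, and a real job on GCP would typically run this logic inside Dataflow and load the clean rows into a BigQuery table.

```python
from datetime import datetime

# Hypothetical quality rules: these required fields and the ISO-timestamp
# check are illustrative, not taken from any real schema.
REQUIRED_FIELDS = ("user_id", "event_type", "event_ts")

def is_valid(record: dict) -> bool:
    """Return True if the record passes the basic quality checks."""
    if any(record.get(f) in (None, "") for f in REQUIRED_FIELDS):
        return False
    try:
        datetime.fromisoformat(record["event_ts"])
    except (TypeError, ValueError):
        return False
    return True

def transform(record: dict) -> dict:
    """Normalize a raw event into the shape a warehouse table might expect."""
    ts = datetime.fromisoformat(record["event_ts"])
    return {
        "user_id": str(record["user_id"]),
        "event_type": record["event_type"].lower(),
        # A date column like this is a common choice for table partitioning.
        "event_date": ts.date().isoformat(),
    }

def run_batch(raw_records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into transformed rows and a dead-letter list."""
    good, bad = [], []
    for rec in raw_records:
        (good if is_valid(rec) else bad).append(rec)
    return [transform(r) for r in good], bad
```

Routing invalid records to a dead-letter list, rather than dropping them, preserves the evidence needed to debug upstream sources, which is the point of the data quality checks the role description mentions.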
Deep, hands-on experience with GCP data services is the defining technical competency. Knowledge of data modeling, ETL/ELT design patterns, and data orchestration tools like Cloud Composer is expected. As roles advance, familiarity with Infrastructure-as-Code (e.g., Terraform), MLOps practices for model deployment, and real-time streaming architectures becomes highly valuable. Soft skills such as problem-solving, attention to detail, and the ability to communicate complex technical concepts to non-technical stakeholders are equally important for success. For those with a passion for cloud technology and data architecture, GCP Data Engineer jobs offer a challenging and rewarding career path at the intersection of engineering and analytics, providing the critical infrastructure that turns data into a strategic asset.
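The real-time streaming architectures mentioned above usually revolve around windowed aggregation. A minimal sketch of a fixed (tumbling) window count in plain Python, with hypothetical event shapes; a production pipeline on Dataflow would use Apache Beam's windowing and watermarks to handle late-arriving data, which this sketch omits:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (key, window) using fixed, non-overlapping windows.

    `events` is an iterable of (epoch_seconds, key) pairs. Each event is
    assigned to the window whose start is its timestamp rounded down to
    the nearest window boundary.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)
        counts[(key, window_start)] += 1
    return dict(counts)
```

The rounding trick (`ts - (ts % window_secs)`) is what makes the windows non-overlapping: events at 5s and 59s land in the window starting at 0, while an event at 61s starts a new window at 60.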
