
GCP Engineer


Robert Half

Location:
United States, Washington D.C.


Contract Type:
Not provided


Salary:

Not provided

Job Description:

We are seeking a mid-level Google Cloud Platform (GCP) Engineer with strong, hands-on experience across cloud administration, automation, system integration, and application support. This role supports production GCP environments and participates in cloud migration initiatives, including on-premises to cloud and cloud-to-cloud migrations.

Job Responsibility:

  • Administer and support GCP organizations, folders, projects, and billing
  • Manage IAM roles, service accounts, and access controls using least-privilege principles
  • Configure and maintain VPC networks, firewall rules, VPNs, and hybrid connectivity
  • Monitor platform health using Cloud Monitoring, Logging, and Alerting
  • Troubleshoot production issues and perform root-cause analysis
  • Support environments across development, test, staging, and production
  • Support cloud migration initiatives, including: On-premises to Google Cloud migrations, Cloud-to-cloud migrations (e.g., AWS or Azure to GCP)
  • Assist with: Migration planning and execution, Workload and dependency analysis, Data, application, and infrastructure migrations
  • Support cutovers, post-migration stabilization, and optimization
  • Help modernize legacy workloads into cloud-native or hybrid architectures
  • Integrate GCP services with enterprise systems such as: Identity platforms (Google Workspace, Active Directory, SSO), CI/CD pipelines and automation tooling, SaaS and internal applications
  • Support API-based and event-driven integrations using REST and Pub/Sub
  • Collaborate with security, networking, and application teams
  • Assist with hybrid and multi-cloud integration patterns
  • Assist development teams with: Environment provisioning, Deployment pipelines, Performance and reliability tuning
  • Review cloud architectures for scalability, resilience, security, and cost
  • Support data platforms such as: BigQuery, Cloud Storage, Pub/Sub, Dataflow
  • Assist with data ingestion pipelines and analytics workloads
  • Understand basic data governance, access controls, and performance tuning
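The responsibilities above mention API-based and event-driven integrations using REST and Pub/Sub. As an illustrative sketch (not part of the posting): a Pub/Sub push subscription POSTs a JSON envelope to an HTTPS endpoint, with the actual payload base64-encoded in `message.data`. The function name below is hypothetical; only the envelope shape follows Pub/Sub's documented push format.

```python
import base64
import json

def decode_push_envelope(body: str) -> dict:
    """Decode a Pub/Sub push-delivery envelope.

    A push subscription POSTs JSON like:
      {"message": {"data": "<base64>", "attributes": {...},
                   "messageId": "..."}, "subscription": "..."}
    The payload itself lives base64-encoded in message.data.
    """
    envelope = json.loads(body)
    message = envelope["message"]
    payload = base64.b64decode(message.get("data", "")).decode("utf-8")
    return {
        "payload": payload,
        "attributes": message.get("attributes", {}),
        "message_id": message.get("messageId"),
    }

# Example: the kind of body a push endpoint would receive
# (project/subscription names are made up for illustration).
example = json.dumps({
    "message": {
        "data": base64.b64encode(b'{"event": "vm.created"}').decode(),
        "attributes": {"origin": "gce"},
        "messageId": "1234",
    },
    "subscription": "projects/demo/subscriptions/push-sub",
})
print(decode_push_envelope(example)["payload"])  # {"event": "vm.created"}
```

In a real integration this decoding would sit behind a Cloud Run or Cloud Functions HTTP handler; the stdlib-only version here just shows the envelope handling itself.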

Requirements:

  • 5 years of experience working with Google Cloud Platform, including the BI tools available in Google Workspace
  • An active Public Trust clearance and/or prior DOT onboarding is a strong plus
  • 3–6 years of hands-on experience working with Google Cloud Platform in production environments
  • Proven experience administering and supporting GCP services, including: IAM, VPC, Compute Engine, Cloud Storage, and monitoring, logging, and alerting
  • Hands-on experience with Infrastructure as Code and automation
  • Experience supporting migration initiatives (on-prem → cloud and/or cloud → cloud)
  • Proficiency in Python and/or shell scripting
  • Strong troubleshooting, problem-solving, and documentation skills
  • Ability to work independently while collaborating across teams
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related discipline
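The requirements combine IAM administration under least-privilege principles with Python scripting. As a hedged sketch of what such automation might look like: a small check that flags broad primitive-role grants in an IAM policy. The role names (`roles/owner`, `roles/editor`) are real GCP primitive roles, and the policy shape matches the JSON that `gcloud projects get-iam-policy PROJECT --format=json` returns; the function itself and the sample members are invented for illustration.

```python
# Flag overly broad grants in a GCP IAM policy (least-privilege review).
# Policy shape: {"bindings": [{"role": "...", "members": ["user:...", ...]}]}
BROAD_ROLES = {"roles/owner", "roles/editor"}  # primitive roles to avoid

def find_broad_bindings(policy: dict) -> list:
    """Return (role, member) pairs that violate least privilege."""
    findings = []
    for binding in policy.get("bindings", []):
        if binding["role"] in BROAD_ROLES:
            for member in binding.get("members", []):
                findings.append((binding["role"], member))
    return findings

# Sample policy with one broad grant and one narrowly scoped grant.
policy = {
    "bindings": [
        {"role": "roles/editor", "members": ["user:dev@example.com"]},
        {"role": "roles/storage.objectViewer",
         "members": ["serviceAccount:app@demo.iam.gserviceaccount.com"]},
    ]
}
for role, member in find_broad_bindings(policy):
    print(f"broad grant: {member} has {role}")
```

A script like this could run in CI against exported policies; replacing the broad grant with a predefined role such as `roles/storage.objectViewer` is the usual least-privilege fix.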

Nice to have:

  • Experience with GKE (Kubernetes) or containerized workloads
  • Experience with CI/CD pipelines (Cloud Build, GitHub Actions, Jenkins, etc.)
  • Experience with data engineering concepts on any major database platform
  • Experience supporting enterprise identity integrations
  • Familiarity with cloud security best practices
  • Google Cloud certifications
  • Experience with data engineering and analytics platforms
What we offer:
  • medical
  • vision
  • dental
  • life and disability insurance
  • company 401(k) plan
  • free online training

Additional Information:

Job Posted:
January 26, 2026


Similar Jobs for GCP Engineer

Senior Data & AI/ML Engineer - GCP Specialization Lead

We are on a bold mission to create the best software services offering in the wo...
Location:
United States, Menlo Park
Salary:
Not provided
techjays
Expiration Date:
Until further notice
Requirements:
  • GCP Services: BigQuery, Dataflow, Pub/Sub, Vertex AI
  • ML Engineering: End-to-end ML pipelines using Vertex AI / Kubeflow
  • Programming: Python & SQL
  • MLOps: CI/CD for ML, Model deployment & monitoring
  • Infrastructure-as-Code: Terraform
  • Data Engineering: ETL/ELT, real-time & batch pipelines
  • AI/ML Tools: TensorFlow, scikit-learn, XGBoost
  • Min Experience: 10+ Years
Job Responsibility:
  • Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage
  • Lead the development of ML pipelines, from feature engineering to model training and deployment using Vertex AI, AI Platform, and Kubeflow Pipelines
  • Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry
  • Define and implement data governance, lineage, monitoring, and quality frameworks
  • Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions
  • Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP
  • Contribute to building repeatable solution accelerators in Data & AI/ML
  • Work with the leadership team to align with Google Cloud Partner Program metrics
  • Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning
  • Organize and lead internal GCP AI/ML enablement sessions
What we offer:
  • Best in class packages
  • Paid holidays and flexible paid time away
  • Casual dress code & flexible working environment
  • Medical Insurance covering self & family up to 4 lakhs per person

GCP Data Engineer

We at AlgebraIT are looking for a GCP Data Engineer with 3+ years of experience ...
Location:
United States, Austin
Salary:
Not provided
AlgebraIT
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering with GCP
  • Proficiency in Python, SQL, and GCP services
  • Experience with data pipeline orchestration tools
  • Strong problem-solving abilities and attention to detail
  • Bachelor’s degree in Computer Science or related field
Job Responsibility:
  • Build and maintain scalable data pipelines using GCP tools
  • Ensure data security and governance
  • Monitor, troubleshoot, and optimize data workflows
  • Collaborate with stakeholders to gather requirements and deliver data solutions
  • Implement data quality checks and best practices
  • Develop and maintain ETL processes
  • Create detailed documentation of data processes
  • Work closely with data analysts and business teams for data alignment
  • Ensure high availability and reliability of data services
  • Stay current with GCP data technology advancements
Contract Type: Fulltime

Senior DevOps Engineer (GCP)

Our client is a global UK-based financial services and investment banking organi...
Location:
Not provided
Salary:
Not provided
N-iX
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in DevOps, Cloud Engineering, or SRE roles
  • Strong hands-on experience with Google Cloud Platform, including: GKE / Kubernetes, Cloud Run, Cloud Functions, Pub/Sub, Cloud Storage, VPC, IAM, networking, security
  • Expertise in Terraform, Helm, or other IaC tools
  • Experience building CI/CD pipelines (GitHub Actions, GitLab CI, CircleCI, Jenkins, etc.)
  • Strong understanding of containerization and orchestration: Docker, Kubernetes
  • Solid experience with monitoring, observability, and logging stacks
  • Familiarity with networking, load balancing, security hardening, and zero-trust principles
  • Experience supporting production systems in high-availability, distributed environments
  • Strong scripting skills (Python, Bash, or similar)
  • Experience working with agile engineering teams
Job Responsibility:
  • Design, implement, and maintain cloud infrastructure on Google Cloud (GKE, Cloud Run, Cloud Functions, Pub/Sub, Cloud Storage)
  • Build and optimize CI/CD pipelines (GitHub Actions, GitLab CI, Jenkins, or similar)
  • Develop infrastructure-as-code using Terraform or similar tools
  • Set up and maintain container orchestration (Kubernetes, GKE) and automated deployment workflows
  • Implement monitoring, alerting, and observability using tools such as Prometheus, Grafana, ELK/Elastic, Stackdriver, or OpenTelemetry
  • Ensure compliance with security and governance standards across all environments
  • Collaborate closely with engineering teams to ensure scalable, high-performance deployment architectures
  • Support AI/ML and GenAI workloads (Vertex AI pipelines, model hosting, GPU workloads, inference optimization)
  • Manage environment strategies, release pipelines, configuration management, and secrets management
  • Optimize cloud costs and recommend improvements for performance and reliability
What we offer:
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

GCP Cloud Engineer

Wissen Technology is hiring an experienced GCP Cloud Engineer to design, impleme...
Location:
India, Mumbai | Pune
Salary:
Not provided
Wissen
Expiration Date:
Until further notice
Requirements:
  • Strong knowledge of GCP services (GKE, Compute Engine, IAM, VPC, Cloud Storage, Cloud SQL, Cloud Functions)
  • Experience with GKE, Docker, GKE Networking, Helm
  • Hands-on experience with Azure DevOps for CI/CD pipeline automation
  • Expertise in Terraform for provisioning cloud resources
  • Proficiency in Python, Bash, or PowerShell for automation
  • Knowledge of cloud security principles, IAM, and compliance standards
  • Work experience: 8–15 years
Job Responsibility:
  • Architect, deploy, and maintain GCP cloud resources via Terraform or other automation tools
  • Implement Google Cloud Storage, Cloud SQL, and Filestore for data storage and processing needs
  • Manage and configure Cloud Load Balancers (HTTP(S), TCP/UDP, and SSL Proxy) for high availability and scalability
  • Optimize resource allocation, monitoring, and cost efficiency across GCP environments
  • Deploy, manage, and optimize workloads on Google Kubernetes Engine (GKE)
  • Work with Helm charts for microservices deployments
  • Automate scaling, rolling updates, and zero-downtime deployments
  • Deploy and manage applications on Cloud Run and Cloud Functions for scalable, serverless workloads
  • Optimize containerized applications running on Cloud Run for cost efficiency and performance
  • Design, implement, and manage CI/CD pipelines using Azure DevOps
Contract Type: Fulltime

Senior DevOps Engineer - AWS & GCP

We are seeking a passionate and experienced Senior DevOps Engineer to join our g...
Location:
India, Ahmedabad; Pune
Salary:
Not provided
Tech Holding
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience in DevOps
  • Strong hands-on experience with Terraform, Kubernetes, and GCP
  • Solid programming experience in Python or Go
  • Proficiency with CI/CD tools (e.g., Jenkins, GitLab CI, GitHub Actions, etc.)
  • Hands-on experience with monitoring and alerting tools such as Prometheus, Grafana, ELK, or similar
  • Deep understanding of networking, security, and system administration
  • A passion for developer productivity and building internal tools that empower engineering teams
  • Excellent communication, collaboration, and problem-solving skills
  • A positive and proactive approach to work and team dynamics
  • Bachelor’s degree in Computer Science, Engineering, or a related technical field
Job Responsibility:
  • Architect, build, and manage scalable, secure, and resilient infrastructure on AWS and Google Cloud Platform (GCP)
  • Automate infrastructure provisioning using Terraform
  • Manage containerized workloads with Kubernetes
  • Develop and enhance CI/CD pipelines to support fast and reliable software delivery
  • Implement monitoring and alerting solutions to ensure system performance and reliability
  • Write scripts and tools in Python or Go to streamline infrastructure operations and support developer workflows
  • Collaborate with engineering teams to improve developer experience and productivity through tooling and automation
  • Participate in troubleshooting, root cause analysis, and performance tuning
  • Ensure adherence to security, compliance, and operational best practices across environments
What we offer:
  • A culture that values flexibility, work-life balance, and employee well-being - including Work From Home Fridays
  • Competitive compensation packages and comprehensive health benefits
  • Work with a collaborative, global team of engineers who thrive on solving complex challenges
  • Exposure to multi-cloud environments (AWS, GCP, Azure) and modern DevOps tooling at scale
  • Professional growth through continuous learning, mentorship, and access to new technologies
  • Leadership that recognizes contributions and supports career advancement
  • The chance to shape DevOps best practices and directly influence company-wide engineering culture
  • A people-first environment where your ideas matter and innovation is encouraged
Contract Type: Fulltime

Lead Software Engineer Scientific Engine

Lead Software Engineer to manage a team of 4. As team lead, you will oversee: Th...
Location:
France, Paris
Salary:
Not provided
Descartes Underwriting
Expiration Date:
Until further notice
Requirements:
  • 1 year or more of technical management experience
  • Handling human interactions between tech and business
  • Experience mentoring a team of software engineers by unblocking complex situations and sharing best practices (code reviews, pair programming, etc.)
  • Scoping and defining tech priorities according to roadmap and maintenance
  • Excellent communication skills, in both formal and informal settings, and in English and French
  • 3 years of experience as a software engineer or data scientist
  • Solid knowledge of Python
  • Solid engineering background: master's degree in computer science, mathematics, physics, or earth science
  • Experience optimizing and profiling Python code
  • Experience in a microservices architecture
Job Responsibility:
  • Contribute directly to the code base, individually or in pairs
  • Organize REX (lessons-learned) sessions to share knowledge with the rest of the team
  • Ensure compliance with internal standards and practices
  • Present progress and goals
  • Contribute to the technical roadmap through architecture meetings and design documents
  • Lead and coach your engineering team to consistently deliver according to their roadmap
  • Provide expertise to help your team develop, optimize, and update software for: calculation of risk models; data collection, preparation, and visualization; export of outputs adapted to users; and testing and validation of existing solutions
What we offer:
  • Opportunity to work and learn with teams from the most prestigious schools and research labs in the world
  • Commitment from Descartes to its staff of continued learning and development (think annual seminars, training etc.)
  • Work in a collaborative & professional environment
  • Be part of an international team, passionate about diversity
  • Join a company with a true purpose – help us help our clients be more resilient towards climate risks
  • A competitive salary, bonus and benefits
  • Occasional home-office days

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
United States, Chicago
Salary:
160555.00 - 176610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
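The responsibilities above center on writing performant SQL/Python ETL code. As a hedged, self-contained sketch only (using the stdlib `sqlite3` module purely as a stand-in for a warehouse like BigQuery, with made-up table and function names), here is the shape of an extract–transform–load step with a basic data-quality check:

```python
import sqlite3

def run_etl(rows):
    """Load raw rows, reject invalid ones, aggregate into a summary table."""
    con = sqlite3.connect(":memory:")  # stand-in for a real warehouse
    con.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
    # Extract + data-quality check: drop rows with a missing user_id.
    clean = [(u, a) for (u, a) in rows if u]
    con.executemany("INSERT INTO raw_events VALUES (?, ?)", clean)
    # Transform + load: aggregate per user into a summary table.
    con.execute("""
        CREATE TABLE user_totals AS
        SELECT user_id, SUM(amount) AS total
        FROM raw_events GROUP BY user_id
    """)
    return dict(con.execute("SELECT user_id, total FROM user_totals"))

totals = run_etl([("alice", 10.0), ("alice", 5.0), ("", 99.0), ("bob", 2.5)])
print(totals)  # totals == {'alice': 15.0, 'bob': 2.5}
```

In an Airflow deployment like the one described, each stage (load, quality check, aggregate) would typically be its own task or custom operator rather than one function; the flow of the data is the same.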
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program
Contract Type: Fulltime

Intermediate / Senior Software Engineer Scientific Engine (Python)

Due to our consistent growth, we are seeking to expand our Data, Software and De...
Location:
France, Paris
Salary:
Not provided
Descartes Underwriting
Expiration Date:
Until further notice
Requirements:
  • Coaching or mentoring experience
  • Scoping and identifying solutions with business team
  • Handling human interactions between tech and business
  • Excellent communication skills, in both formal and informal settings, and in English and French
  • 3 years or more of experience as a software engineer or data scientist
  • Solid knowledge of Python
  • Solid engineering background: master's degree in computer science, mathematics, physics, or earth science
  • Experience optimizing and profiling Python code
  • Experience in a microservices architecture
  • Good knowledge of Docker
Job Responsibility:
  • Contribute directly to the code base, individually or in pairs
  • Organize REX (lessons-learned) sessions to share knowledge with the rest of the team
  • Ensure compliance with internal standards and practices
  • Present progress and goals
  • Contribute to the technical roadmap through architecture meetings and design documents
  • Coach your collaborators to consistently deliver according to their roadmap
  • Provide expertise to help your team develop, optimize, and update software for: calculation of risk models; data collection, preparation, and visualization; export of outputs adapted to users; and testing and validation of existing solutions
What we offer:
  • Opportunity to work and learn with teams from the most prestigious schools and research labs in the world, allowing you to progress towards technical excellence
  • Commitment from Descartes to its staff of continued learning and development (think annual seminars, training etc.)
  • Work in a collaborative & professional environment
  • Be part of an international team, passionate about diversity
  • Join a company with a true purpose – help us help our clients be more resilient towards climate risks
  • A competitive salary, bonus and benefits
  • Occasional home-office days