GCP Data Engineer - VOIS

Vodafone

Location:
India, Pune

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a cloud-focused data professional who will contribute to the design, development and support of scalable data solutions on Google Cloud Platform (GCP). This role is ideal for individuals who enjoy working with large datasets, cloud-native services and modern data engineering practices while collaborating closely with data engineers and platform teams to deliver reliable, secure and high-performing solutions.

Job Responsibility:

  • Write, optimise and debug complex SQL queries and procedural database logic within GCP BigQuery environments
  • Build, deploy and maintain GCP-based infrastructure using services such as Cloud Functions, App Engine and Cloud Run
  • Collaborate with data engineering teams to process and manage large datasets using BigQuery and Cloud Storage
  • Apply best practices for performance optimisation, security and scalability across cloud-based applications
  • Develop unit tests and maintain clear technical documentation to support long-term maintainability and code quality
  • Troubleshoot and resolve issues in cloud-based applications and services within production environments
  • Contribute to cloud-native development approaches and continuous improvement of data platforms
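To give a flavour of the performance-optimisation work described above, the sketch below composes a BigQuery CREATE TABLE statement that partitions by a date column and clusters by a frequently filtered column, a common way to reduce scan costs. This is illustrative only; the project, dataset, table, and column names are invented.

```python
# Illustrative sketch: building a partitioned, clustered BigQuery table
# definition. All table and column names below are hypothetical.

def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return a CREATE TABLE statement that partitions by a DATE column
    and clusters by the given columns so queries filtering on them
    scan less data."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE `{table}` (\n  {col_defs}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl(
    table="my_project.analytics.events",  # hypothetical table
    columns=[("event_date", "DATE"), ("user_id", "STRING"), ("payload", "JSON")],
    partition_col="event_date",
    cluster_cols=["user_id"],
)
print(ddl)
```

Queries that filter on the partition column (here `event_date`) then only scan the matching partitions, which is the main lever for both cost and latency in BigQuery workloads of this kind.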

Requirements:

  • Experienced in working with GCP services, particularly BigQuery, Data Fusion and Cloud Composer
  • Proficient in SQL and experienced in database development and optimisation
  • Comfortable working with both SQL and NoSQL database systems
  • Familiar with data processing frameworks and cloud-native development principles
  • Able to work collaboratively across teams and communicate technical concepts clearly
  • Committed to writing clean, testable and well-documented code

What we offer:
  • Opportunity to work on enterprise-scale cloud and data platforms within a global organisation
  • Exposure to modern GCP services and data engineering tools in real-world production environments
  • A collaborative and inclusive culture that values learning, knowledge sharing and continuous improvement
  • The chance to contribute to solutions that support large-scale business transformation initiatives

Additional Information:

Job Posted:
May 15, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for GCP Data Engineer - VOIS

Data Engineer - GCP, Python & BigQuery - VOIS

We are seeking a Data Engineer with strong expertise in GCP BigQuery, Data Fusio...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • 7–10 years of relevant experience in data engineering or data warehousing roles
  • Strong proficiency in SQL and PL/SQL, particularly within GCP BigQuery environments
  • Hands-on experience with GCP services such as BigQuery and Data Fusion, with working knowledge of other GCP components
  • Understanding of data warehousing concepts, database systems (SQL and NoSQL) and data processing frameworks
  • Exposure to Python for data processing or cloud-based development
  • Strong problem-solving, debugging and analytical skills
  • Clear communication and effective collaboration within cross-functional teams
  • Graduate or postgraduate degree in a technical discipline, preferably Engineering
Job Responsibility
  • Design, develop and maintain data solutions using GCP services such as BigQuery, Data Fusion, Cloud Storage, Cloud Functions, App Engine and Cloud Run
  • Write, debug and optimise complex SQL and PL/SQL queries to process and analyse large-scale datasets
  • Collaborate with data engineers and stakeholders to process and manage large volumes of structured data
  • Ensure best practices are followed for performance, security, scalability and reliability of cloud-based data applications
  • Develop and maintain unit tests and technical documentation to ensure code quality and long-term maintainability
  • Troubleshoot and resolve issues in cloud-based data applications and services in production environments
  • Apply cloud-native development principles and modern data engineering practices
  • Contribute to continuous improvement initiatives within data warehousing and analytics platforms
What we offer
  • Opportunity to work on enterprise-scale data platforms using modern GCP technologies
  • Exposure to global analytics programmes and shared services delivery models
  • A collaborative environment that values process improvement, quality and innovation
  • The chance to influence data reliability, performance and analytics outcomes

Data Engineer (GCP) - VOIS

We are seeking a Data Engineer with strong expertise in GCP BigQuery, Data Fusio...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • Hold a Bachelor’s or Master’s degree in Computer Science, Information Technology or a related field
  • Experienced with SQL and NoSQL databases
  • Knowledgeable in data warehousing concepts and industry best practices
  • Familiar with data integration tools and frameworks
  • Demonstrate strong problem-solving and analytical capabilities
  • Communicate clearly and collaborate effectively with diverse teams
  • Comfortable working in a dynamic, fast-paced environment
Job Responsibility
  • Design, develop and maintain data solutions using GCP services such as BigQuery, Data Fusion, Cloud Storage, Cloud Functions, App Engine and Cloud Run
  • Write, debug and optimise complex SQL and PL/SQL queries to process and analyse large-scale datasets
  • Collaborate with data engineers and stakeholders to process and manage large volumes of structured data
  • Ensure best practices are followed for performance, security, scalability and reliability of cloud-based data applications
  • Develop and maintain unit tests and technical documentation to ensure code quality and long-term maintainability
  • Troubleshoot and resolve issues in cloud-based data applications and services in production environments
  • Apply cloud-native development principles and modern data engineering practices
  • Contribute to continuous improvement initiatives within data warehousing and analytics platforms
What we offer
  • Opportunity to work on modern, cloud-native data platforms using Google Cloud technologies
  • Exposure to enterprise-scale data engineering challenges across diverse business domains
  • A collaborative environment that values continuous improvement and shared learning
  • The ability to influence data quality, performance and analytics outcomes across teams

Data Engineer (GCP) - VOIS

We are seeking a skilled Data Engineer to design, build and maintain scalable da...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • Hold a Bachelor’s or Master’s degree in Computer Science, Information Technology or a related field
  • Experienced with SQL and NoSQL databases
  • Knowledgeable in data warehousing concepts and industry best practices
  • Familiar with data integration tools and frameworks
  • Demonstrate strong problem-solving and analytical capabilities
  • Communicate clearly and collaborate effectively with diverse teams
  • Comfortable working in a dynamic, fast-paced environment
Job Responsibility
  • Design, develop and maintain scalable data pipelines and ETL processes using GCP services including BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions and Cloud Run
  • Collaborate with data scientists, analysts and business stakeholders to understand data requirements and deliver robust data solutions
  • Implement data integration solutions to ingest, process and store structured and unstructured data from multiple sources
  • Optimise and tune data pipelines for performance, reliability and cost efficiency
  • Ensure data quality through validation, cleansing and transformation processes
  • Develop and maintain data models, schemas and metadata to support analytics and reporting needs
  • Monitor, troubleshoot and resolve data pipeline issues to minimise disruption
  • Stay current with GCP technologies and best practices, recommending improvements where appropriate
  • Mentor junior data engineers and promote a collaborative, knowledge-sharing culture
What we offer
  • Opportunity to work on modern, cloud-native data platforms using Google Cloud technologies
  • Exposure to enterprise-scale data engineering challenges across diverse business domains
  • A collaborative environment that values continuous improvement and shared learning
  • The ability to influence data quality, performance and analytics outcomes across teams
Employment Type: Full-time

Data Engineer - VOIS

We are seeking a Data Engineer to design, build, and maintain scalable and relia...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • 4 to 6 years of data engineering experience with strong hands‑on exposure to big data processing
  • Skilled in distributed data processing frameworks with a solid understanding of performance optimisation
  • Experienced in designing robust, scalable, and maintainable data architectures
  • Proficient in Scala programming and advanced SQL for complex data transformations and analytical queries
  • Hands‑on with Apache Spark and big data ecosystems, including Cassandra and Redis
  • Experienced with GCP services such as BigQuery, Cloud Composer, and Cloud Storage
  • Comfortable working in multi‑cloud environments, with exposure to OCI considered an advantage
  • A collaborative communicator with strong problem‑solving skills and a focus on data quality and reliability
  • German language proficiency is desirable
Job Responsibility
  • Build, maintain, and optimise scalable and reliable data pipelines for analytics and downstream data consumption
  • Design and develop ETL/ELT processes using Scala, SQL, and Apache Spark for large‑scale data processing
  • Work with distributed systems and big data technologies to process high‑volume datasets efficiently
  • Leverage cloud‑native data services on GCP and support workloads on OCI where required
  • Ensure data quality, performance, availability, and reliability across all data pipelines
  • Collaborate with analytics, data science, and product teams to deliver clean, well‑structured, and consumable datasets
What we offer
  • Opportunities to work on large‑scale, cloud‑based data platforms with global impact
  • Exposure to complex data engineering challenges across multi‑cloud environments
  • Collaboration with diverse, cross‑functional teams across analytics, data science, and product domains
  • A supportive environment that values learning, inclusion, and continuous improvement
Employment Type: Full-time

Data Engineer - VOIS

We are seeking an experienced Data Engineering Specialist with strong hands-on e...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • Strong expertise in Databricks on GCP including Delta Lake, notebooks/jobs, Unity Catalog, and cluster policies
  • Experienced in Cloud Data Fusion design, including pipeline management, error handling, and orchestration
  • Skilled in Dataproc Spark with experience building PySpark jobs, configuring ephemeral clusters, and handling initialisation actions
  • Proficient in Python for data engineering including packaging, unit testing, type hints, and linting
  • Strong SQL skills, specifically with BigQuery including performance tuning, partitioning, and clustering
  • Familiar with GCP services such as Cloud Storage, Pub/Sub, and Cloud Composer/Airflow
  • Holds a qualification such as B.E., B.Tech, BCA, MCA, BSc, or MSc in Computer Science or a related field
Job Responsibility
  • Design and build data pipelines on GCP using Databricks (Delta Lake and Unity Catalog) for orchestration, Dataproc for Spark execution, supporting both ETL/ELT and feature engineering workloads
  • Engineer declarative, modular, and reusable pipelines in Python, following configuration-as-code principles and CI/CD practices including Git-based promotion, testing, and deployment
  • Implement and maintain data quality and observability practices using validation frameworks, logging, metrics, and alerts
  • Optimise pipeline performance, reliability, and cost through techniques such as cluster sizing, auto-termination, Z-ordering, caching, and partitioning strategies
  • Apply robust error handling, parameterisation, and triggers within Cloud Data Fusion pipelines
  • Ensure operational excellence by maintaining monitoring, performance tuning, and continuous improvements across data products and workloads
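The "robust error handling" bullet above can be made concrete with a small retry-with-backoff helper of the kind often wrapped around flaky pipeline steps. This is a generic sketch, not Cloud Data Fusion's own API; the function and step names are hypothetical.

```python
import time

# Generic sketch of retry-with-exponential-backoff for a flaky pipeline
# step. Not a Cloud Data Fusion API; purely illustrative.

def run_with_retries(step, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call `step()` up to `max_attempts` times, doubling the delay
    between attempts; re-raise the last error if every attempt fails."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))

attempts = []
def flaky_load():
    """Hypothetical load step that fails twice, then succeeds."""
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "loaded"

print(run_with_retries(flaky_load, sleep=lambda s: None))  # succeeds on the third attempt
```

The `sleep` parameter is injected so tests can skip the real delay; in production the default `time.sleep` gives 1s, 2s, 4s, ... pauses between attempts.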
What we offer
  • The opportunity to build and scale data solutions using leading GCP and Databricks technologies
  • Exposure to enterprise-level CI/CD, observability, and configuration-as-code practices
  • A collaborative environment where innovation, continuous learning, and technical excellence are encouraged
  • The chance to contribute to high-impact global data platforms
Employment Type: Full-time

Test Engineer - GCP, BigQuery & SQL - VOIS

We are seeking an experienced QA and Data Testing professional to support the qu...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • Experienced QA or Data Testing professional with hands-on expertise in Google Cloud Platform
  • Proficient in BigQuery and SQL, with a solid understanding of data warehouse concepts and testing methodologies
  • Experienced in validating complex data pipelines and large datasets in cloud environments
  • Analytical, detail-oriented, and comfortable working with structured and semi-structured data
  • Able to apply independent judgement while working effectively within collaborative delivery teams
Job Responsibility
  • Perform end-to-end testing of data pipelines built on Google Cloud Platform
  • Validate data ingestion, transformation, and loading into BigQuery tables
  • Conduct comprehensive data quality checks, including schema validation, duplicate detection, null checks, reconciliation, and transformation validation
  • Validate ETL and ELT processes across staging, curated, and analytics layers
  • Apply analytical thinking and structured problem-solving to identify, debug, and resolve data-related issues
  • Collaborate with cross-functional technology and data teams to support continuous improvement in data quality and testing practices
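The quality checks listed above (null checks, duplicate detection, reconciliation) can be sketched in plain Python. In practice these checks would typically run as SQL against BigQuery; the rows and column names here are invented for illustration.

```python
# Illustrative sketch of the data-quality checks described above, run over
# in-memory rows; in a real pipeline these become BigQuery SQL assertions.

def quality_report(rows, key_column, required_columns, expected_count=None):
    """Summarise null counts, duplicate keys, and row-count reconciliation."""
    null_counts = {
        col: sum(1 for row in rows if row.get(col) is None)
        for col in required_columns
    }
    seen, duplicates = set(), 0
    for row in rows:
        key = row[key_column]
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {
        "row_count": len(rows),
        "null_counts": null_counts,
        "duplicate_keys": duplicates,
        "reconciled": expected_count is None or len(rows) == expected_count,
    }

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},  # null check should flag this
    {"id": 2, "amount": 5.0},   # duplicate key
]
report = quality_report(rows, key_column="id",
                        required_columns=["amount"], expected_count=4)
print(report)
```

A report like this (one null, one duplicate key, failed reconciliation against the expected source count) is the kind of artefact such validation produces for a staging-to-curated comparison.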
What we offer
  • Opportunity to work on enterprise-scale cloud data platforms within a global organisation
  • Exposure to advanced data engineering and analytics environments on Google Cloud Platform
  • A collaborative and inclusive workplace that supports professional growth and continuous learning
  • The chance to contribute to data-driven decision-making that impacts Vodafone markets globally
Employment Type: Full-time

Secure By Design - VOIS

We are seeking an experienced Information Security professional to support secur...
Location: India, Pune
Salary: Not provided
Vodafone
Expiration Date: Until further notice
Requirements
  • 5–8 years of IT experience
  • Strong exposure to perimeter security, network engineering, and security management
  • Experienced in information security risk assessment across cloud platforms (AWS, Azure, GCP, Oracle), data centres, and applications
  • Knowledgeable in security principles, protocols, and technologies such as PKI, SSL, IKEv1 & v2, sandboxing, and cloud security controls
  • Familiar with global security frameworks and assurance models
  • Aware of data privacy and information protection requirements, including GDPR
  • Comfortable working in complex, cross-functional environments
  • Telco industry experience is advantageous
  • Professional certifications such as CCNA, CISM, CISSP, ISO 27001, or ISO 31000 are desirable
Job Responsibility
  • Conduct information security risk assessments across cloud, data centre, and application environments
  • Provide security architecture guidance and technical design recommendations to internal teams during the design and build phases
  • Evaluate business requirements and proposed technical designs to identify risks, define secure alternatives, and recommend optimal security solutions
  • Apply recognised security frameworks and standards such as ISO 27001, ISO 31000, NIST, CIS, SANS, and NIST SP 800-53
  • Support secure development practices aligned with OWASP 'Security by Design' principles
  • Assess and advise on perimeter security controls, including firewalls, VPNs, proxies, and network security solutions
  • Monitor and interpret the global threat landscape, including advanced persistent threats, to inform risk-based decisions
  • Create clear reports, dashboards, and presentations to communicate security posture, trends, and performance to stakeholders
  • Collaborate across teams, influencing outcomes through strong interpersonal and negotiation skills
What we offer
  • Opportunities to work on large-scale, global security initiatives within a leading telecoms environment
  • Exposure to diverse technologies, cloud platforms, and international stakeholders
  • A collaborative and inclusive workplace that values learning, innovation, and professional growth
  • The chance to influence security strategy and design decisions at an early stage
Employment Type: Full-time