Data Engineer (GCP) - VOIS

Vodafone

Location:
India, Pune

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a Data Engineer with strong expertise in GCP BigQuery, Data Fusion and SQL to design, build and support critical data platforms within our Data & Analytics Shared Services function. This role blends deep technical capability with collaborative delivery, focusing on developing scalable, secure and high-performing cloud-based data solutions that support enterprise analytics use cases.

Job Responsibility:

  • Design, develop and maintain data solutions using GCP services such as BigQuery, Data Fusion, Cloud Storage, Cloud Functions, App Engine and Cloud Run
  • Write, debug and optimise complex SQL and PL/SQL queries to process and analyse large-scale datasets
  • Collaborate with data engineers and stakeholders to process and manage large volumes of structured data
  • Ensure best practices are followed for performance, security, scalability and reliability of cloud-based data applications
  • Develop and maintain unit tests and technical documentation to ensure code quality and long-term maintainability
  • Troubleshoot and resolve issues in cloud-based data applications and services in production environments
  • Apply cloud-native development principles and modern data engineering practices
  • Contribute to continuous improvement initiatives within data warehousing and analytics platforms
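To illustrate the unit-testing and data-quality duties listed above, here is a minimal sketch of the kind of small, unit-testable transformation a pipeline step might apply before loading rows into BigQuery. This is illustrative only: the function name, field names, and cleansing rules are all hypothetical, not part of this role's actual codebase.

```python
# Illustrative sketch only. normalise_rows and the field names
# (customer_id, country) are hypothetical examples of a pre-load
# cleansing step: trim whitespace, drop rows with no customer_id,
# and deduplicate on customer_id, keeping the first occurrence.

def normalise_rows(rows):
    """Return cleaned, deduplicated rows ready for loading."""
    seen = set()
    out = []
    for row in rows:
        cid = (row.get("customer_id") or "").strip()
        if not cid or cid in seen:
            continue  # skip rows with a missing or duplicate id
        seen.add(cid)
        out.append({
            "customer_id": cid,
            "country": (row.get("country") or "").strip().upper(),
        })
    return out


def test_normalise_rows():
    raw = [
        {"customer_id": " 42 ", "country": "in"},
        {"customer_id": "42", "country": "IN"},   # duplicate id
        {"customer_id": "", "country": "GB"},     # missing id
    ]
    assert normalise_rows(raw) == [
        {"customer_id": "42", "country": "IN"},
    ]
```

Keeping transformations as small pure functions like this is one common way to satisfy the "develop and maintain unit tests" responsibility: each cleansing rule can be asserted in isolation before the data ever reaches the warehouse.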

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology or a related field
  • Experience with SQL and NoSQL databases
  • Knowledge of data warehousing concepts and industry best practices
  • Familiarity with data integration tools and frameworks
  • Strong problem-solving and analytical capabilities
  • Clear communication and effective collaboration with diverse teams
  • Comfort working in a dynamic, fast-paced environment

What we offer:
  • Opportunity to work on modern, cloud-native data platforms using Google Cloud technologies
  • Exposure to enterprise-scale data engineering challenges across diverse business domains
  • A collaborative environment that values continuous improvement and shared learning
  • The ability to influence data quality, performance and analytics outcomes across teams

Additional Information:

Job Posted:
May 04, 2026

Work Type:
Hybrid work