
Manager Data Engineer (GCP + Python)


Vodafone


Location:
Pune, India


Contract Type:
Not provided


Salary:

Not provided

Job Description:

The Data Engineer will design, develop, and maintain data solutions leveraging Google Cloud Platform (GCP) and Python. This role is critical for building scalable data pipelines, optimising performance, and ensuring data integrity across multiple systems. The individual will collaborate with cross-functional teams to deliver high-quality data products that support business decisions.

Job Responsibility:

  • Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load large datasets into GCP-based platforms (e.g., BigQuery, Cloud Storage)
  • Optimise data pipelines for performance, reliability, and scalability
  • Develop and manage data models, schemas, and storage solutions aligned with best practices
  • Leverage GCP services such as Cloud Composer, Dataflow, Pub/Sub, and Cloud Functions to build automated workflows
  • Implement data validation, cleansing, and quality checks to maintain accuracy and integrity
  • Collaborate with data scientists, analysts, and business stakeholders to define and deliver on data requirements
  • Set up monitoring systems to track pipeline performance and ensure timely delivery

Requirements:

  • Proficiency in Python for building and deploying data processing scripts
  • Strong expertise in GCP services, especially BigQuery, Cloud Storage, Cloud Functions, Cloud Composer, and Pub/Sub
  • Experience with SQL for querying and processing data
  • Familiarity with workflow orchestration tools like Apache Airflow
  • Knowledge of version control systems (e.g., Git)
  • Excellent communication and stakeholder management skills

Nice to have:

  • Familiarity with advanced GCP features and emerging cloud technologies
  • Knowledge of best practices for cost optimisation and performance tuning in cloud environments
  • Experience integrating streaming solutions such as Kafka via their Python SDKs
  • A collaborative approach to data-driven decision-making
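The streaming integration mentioned above revolves around one core pattern: acknowledge a message only after it has been processed successfully, so failures are redelivered. The sketch below is a minimal simulation of that at-least-once consume/ack pattern using an in-memory list in place of a Pub/Sub or Kafka broker; the function and variable names are hypothetical.

```python
# Illustrative sketch: the at-least-once consume/ack pattern used by streaming
# systems such as Pub/Sub or Kafka, simulated with in-memory data so it runs
# without any cloud dependencies. Names are hypothetical.
import json


def process_stream(messages: list[str], sink: list[dict]) -> list[int]:
    """Process JSON messages; return the indices that were acked.

    A message is acked only after it is parsed and written to the sink,
    mirroring how a subscriber acknowledges on success and lets failed
    messages be redelivered by the broker.
    """
    acked = []
    for i, raw in enumerate(messages):
        try:
            event = json.loads(raw)   # decode step; may fail
            sink.append(event)        # "write" to the downstream store
        except json.JSONDecodeError:
            continue                  # no ack -> broker would redeliver
        acked.append(i)               # ack only after successful handling
    return acked
```

With a real broker, the unacked message would reappear on a later pull, which is why downstream writes should be idempotent.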

Additional Information:

Job Posted:
March 05, 2026

Work Type:
Hybrid work



Similar Jobs for Manager Data Engineer (GCP + Python)

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location:
Chicago, United States
Salary:
160,555.00 - 176,610.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program
  • Full-time

Senior Data Engineer

Adtalem is a data-driven organization. The Data Engineering team builds data sol...
Location:
Lisle, United States
Salary:
84,835.61 - 149,076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, Vertex AI
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Senior Data Engineering Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • Proven work experience as a Data Engineering Architect or in a similar role, and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Data Engineer

We are looking for a seasoned Data Engineer to join our MarTech and Data Strateg...
Location:
United States
Salary:
Not provided
Zion & Zion
Expiration Date:
Until further notice
Requirements:
  • 4-6 years experience in a data engineering role
  • Significant experience with public cloud products, preferably certified as a GCP Cloud Architect / Data Engineer or equivalent on AWS
  • Experience with tools like dbt
  • Familiar with ETL/ELT and (nice to have) reverse ETL platforms
  • Experience on a cloud-based or Business Intelligence project (as a technical project manager, developer or architect)
  • DevOps capabilities (CI/CD, Infrastructure as code, Docker, etc.)
  • Experience leveraging digital data in the cloud for marketing activations
  • Excellent verbal and written communication skills and comfortable working with both marketing and technical teams
  • Client-facing experience for detailed technical specifications discussions
  • Fluent in SQL, Python or R
Job Responsibility:
  • Work with internal and external teams to design and implement technical architecture in order to facilitate the advanced activation of data
  • Interact with third-party MarTech solutions (Google Analytics, Google Marketing Platform, Data Management Platforms, Tag Management Solutions, Adobe Analytics, Clouds, Customer Data Platforms, etc.)
  • Come up with creative solutions to integrate data from a variety of sources into platforms and data warehouses
  • Work with internal teams to specify data processing pipelines (database schemas, integrity constraints, delivery throughput) for use case activations and implement them in the cloud
  • Work with internal data science team to scope and check typical machine learning and AI project requirements
  • Work with data visualization teams to design and implement tables to help power complex dashboards

Engineering Manager (Python + Machine Learning)

We are seeking a hands-on Machine Learning Engineering Manager to lead cross-fun...
Location:
Noida, India
Salary:
Not provided
AquSag Technologies
Expiration Date:
Until further notice
Requirements:
  • 9+ years of strong background in Machine Learning, NLP, and modern deep learning architectures (Transformers, LLMs)
  • Hands-on experience with frameworks such as PyTorch, TensorFlow, Hugging Face, or DeepSpeed
  • 2+ years of proven experience managing teams delivering ML/LLM models in production environments
  • Knowledge of distributed training, GPU/TPU optimization, and cloud platforms (AWS, GCP, Azure)
  • Familiarity with MLOps tools like MLflow, Kubeflow, or Vertex AI for scalable ML pipelines
  • Excellent leadership, communication, and cross-functional collaboration skills
  • Bachelor’s or Master’s in Computer Science, Engineering, or related field
Job Responsibility:
  • Lead and mentor a cross-functional team of ML engineers, data scientists, and MLOps professionals
  • Oversee the full lifecycle of LLM and ML projects — from data collection to training, evaluation, and deployment
  • Collaborate with Research, Product, and Infrastructure teams to define goals, milestones, and success metrics
  • Provide technical direction on large-scale model training, fine-tuning, and distributed systems design
  • Implement best practices in MLOps, model governance, experiment tracking, and CI/CD for ML
  • Manage compute resources, budgets, and ensure compliance with data security and responsible AI standards
  • Communicate progress, risks, and results to stakeholders and executives effectively

GCP Data Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • At least 6 years of experience as a Data Architect, including a minimum of 4 years working with GCP cloud-based infrastructure and systems
  • Deep knowledge of Google Cloud Platform and cloud computing services
  • Strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as columnar databases, relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks
  • Knowledge of modern data transformation tools (such as dbt, Dataform)
  • Knowledge of at least one orchestration and scheduling tool
  • Programming skills (SQL, Python, other scripting)
  • Tools knowledge: Git, Jira, Confluence, etc.
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
  • Stay updated with emerging trends and technologies in data engineering, recommending and implementing innovative solutions as appropriate
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • “Office as an option” model. You can choose to work remotely or in the office
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly

Intermediate / Senior Software Engineer Scientific Engine (Python)

Due to our consistent growth, we are seeking to expand our Data, Software and De...
Location:
Paris, France
Salary:
Not provided
Descartes Underwriting
Expiration Date:
Until further notice
Requirements:
  • Coaching or mentoring experience
  • Scoping and identifying solutions with the business team
  • Managing human interactions between tech and business teams
  • Excellent communication skills, in both formal and informal settings, and in English and French
  • 3 or more years of experience as a software engineer or data scientist
  • Solid knowledge of Python
  • Solid engineering background: master's degree in computer science, mathematics, physics, or earth science
  • Experience optimizing and profiling Python code
  • Experience in a microservices architecture
  • Good knowledge of Docker
Job Responsibility:
  • Contribute directly to the code base, individually, in pairs, or in larger groups
  • Organize REX (lessons-learned) sessions to share knowledge with the rest of the team
  • Ensure compliance with internal standards and practices
  • Present progress and goals
  • Contribute to the technical roadmap through architecture meetings and design documents
  • Coach your collaborators to consistently deliver according to their roadmap
  • Provide expertise to help your team develop, optimize, and update software for: calculation of risk models; data collection, preparation, and visualization; export of outputs adapted to users; testing and validation of existing solutions
What we offer:
  • Opportunity to work and learn with teams from the most prestigious schools and research labs in the world, allowing you to progress towards technical excellence
  • Commitment from Descartes to its staff of continued learning and development (think annual seminars, training etc.)
  • Work in a collaborative & professional environment
  • Be part of an international team, passionate about diversity
  • Join a company with a true purpose – help us help our clients be more resilient towards climate risks
  • A competitive salary, bonus and benefits
  • Occasional home-office days

Senior Backend Engineer / Tech Lead (Data Management)

As a Senior Backend Software Engineer at Aignostics, you work hand in hand with ...
Location:
Berlin, Germany
Salary:
Not provided
Aignostics
Expiration Date:
Until further notice
Requirements:
  • Bachelor's and/or Master's degree in a relevant field, or extensive work experience
  • 6+ years of software development experience in a data intensive environment
  • Experience leading a technical initiative ideally with cross-team impact
  • Strong background in software development ideally with Python
  • Experience with cloud providers (GCP, AWS) and their services
  • Experience with container orchestration (preferably Kubernetes)
  • Experience with database systems
  • Familiarity with CI/CD pipelines, code reviews, and other standards for maintaining code quality
  • Driven self-starter, well-organized, excellent communication skills and a strong team player
Job Responsibility:
  • Design and develop services and core libraries that enable our SaaS platform
  • Ensure reliable, high throughput access to our data for machine learning
  • Maintain and expand our data management infrastructure
  • Lead initiatives, evaluate new technologies and their integration into our current codebase
  • Eagerness to take ownership - from inception to completion - without losing focus on the business context
  • Communicate closely with our frontend and machine learning teams
  • Perform code reviews, considering readability, design and performance
What we offer:
  • Learning & Development yearly budget of 1,000€ (plus 2 L&D days)
  • Language classes and internal development programs
  • Mentoring program
  • Flexible working hours and teleworking policy
  • 30 paid vacation days per year
  • Family- and pet-friendly, with support for flexible parental leave options
  • Subsidized membership of your choice among public transport, sports and well-being
  • Social gatherings, lunches, and off-site events
  • Optional company pension scheme