
Data Engineer - PySpark India Jobs

260 Job Offers

Data engineer & mlops lead (New)
Location: India, Bangalore
Salary: Not provided
Company: Randstad
Expiration Date: Until further notice

Senior Data Engineer - Operation (New)
Location: India, Gurugram
Salary: Not provided
Company: NTT DATA
Expiration Date: Until further notice

Fid federal risk applications data engineer (New)
Location: India, Bangalore
Salary: Not provided
Company: NTT DATA
Expiration Date: Until further notice

Senior Software Engineer - Data Engineering/Generative AI (New)
Join Wells Fargo as a Senior Software Engineer specializing in Data Engineering and Generative AI in Hyderabad. You will lead complex data initiatives, designing and optimizing large-scale data pipelines using Spark, Java, and GCP. This role requires strong expertise in DWH, big data, Python, SQL...
Location: India, Hyderabad
Salary: Not provided
Company: Wells Fargo
Expiration Date: Until further notice

Lead Software Engineer - Data Engineering/Generative AI (New)
Location: India, Hyderabad
Salary: Not provided
Company: Wells Fargo
Expiration Date: Until further notice

Senior DevOps Lead Data Engineering (New)
Lead our data platform's DevOps strategy in Pune. This senior role requires 7-10 years of experience in infrastructure automation, CI/CD, and managing scalable systems on AWS/ECS. Expertise in Elastic Stack, Python/Shell scripting, and Java for tooling is essential. Drive operational excellence f...
Location: India, Pune
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

GCP Data Engineer (New)
Join our team in Pune as a GCP Data Engineer. Design and build robust data pipelines using GCP tools like BigQuery, Data Fusion, and Dataproc. Apply your expertise in Python, Spark, and cloud-native development. This role requires 2-4 years' experience, including hands-on GCP work, and offers a c...
Location: India, Pune
Salary: Not provided
Company: Vodafone
Expiration Date: Until further notice

GCP Data Engineer (New)
Join our team in Pune as a GCP Data Engineer. Design and build robust data pipelines using BigQuery, Data Fusion, Dataproc, and Cloud Composer. Leverage your expertise in Python, Spark, and cloud platforms to optimize data processing systems. This role requires 2-4 years of experience, including ...
Location: India, Pune
Salary: Not provided
Company: Vodafone
Expiration Date: Until further notice

Senior Data Engineer (New)
Location: India, Chennai; Pune
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Lead Data Engineer (New)
Lead Data Engineer role for a seasoned professional with 10+ years' experience. Expertise in Azure Data Lake and Microsoft Fabric is essential. This position is based in India, offering a key leadership opportunity in advanced data architecture.
Location: India
Salary: Not provided
Company: 3Pillar Global
Expiration Date: Until further notice

Data Engineering Lead (New)
Location: India
Salary: Not provided
Company: 3Pillar Global
Expiration Date: Until further notice

Software Engineer - Data Platform (New)
Join Addepar's Pune team as a Software Engineer for the Data Platform. You will design and build core infrastructure and tooling for a modern financial Data Lakehouse. This role requires strong data engineering skills, cloud infrastructure expertise, and experience with agile practices. Help esta...
Location: India, Pune
Salary: Not provided
Company: Addepar
Expiration Date: Until further notice

Snowflake Data Engineer (New)
Seeking an experienced Snowflake Data Engineer in Bangalore for a hybrid role. Leverage 6-9 years of expertise in Snowflake, Python, and AWS to develop robust data solutions and implement CI/CD processes. Proficiency with GitHub and strong communication are key. Certifications in Snowflake or AWS...
Location: India, Bangalore
Salary: Not provided
Company: NTT DATA
Expiration Date: Until further notice

Data Engineer - ETL/Python (New)
Location: India, Pune
Salary: Not provided
Company: Barclays
Expiration Date: Until further notice

Data Engineer (New)
Location: India, Hyderabad
Salary: Not provided
Company: Seismic
Expiration Date: Until further notice

Data Platform Engineer (New)
Seeking a Data Platform Engineer in Bengaluru to build and operationalize cloud-based data platforms. You will develop complex ETL pipelines using Python, GCP, BigQuery, and Airflow/Astronomer. This role requires strong big data expertise, agile methodology experience, and a drive for technical i...
Location: India, Bengaluru
Salary: Not provided
Company: Randstad
Expiration Date: Until further notice

Data Ops Engineer (New)
Seeking a skilled DataOps Engineer in India to operationalize healthcare data pipelines. You will design, automate, and maintain reliable ETL/ELT flows from medical devices, ensuring data quality and HIPAA compliance. Required: 8+ years' experience with Python, SQL, cloud platforms, dbt, Airflow,...
Location: India
Salary: Not provided
Company: Augusta Hitech Soft Solutions
Expiration Date: Until further notice

Data Engineer (New)
Seeking a skilled Data Engineer in Bengaluru with 2+ years of experience. You will design and build robust batch/real-time data pipelines on AWS, automating ETL/ELT workflows. Expertise in data ingestion, infrastructure optimization, and translating business requirements is key. Insurance domain ...
Location: India, Bengaluru
Salary: Not provided
Company: Randstad
Expiration Date: Until further notice

Engineering Manager II, Data & ML Systems (New)
Location: India, Hyderabad
Salary: Not provided
Company: Uber
Expiration Date: Until further notice

Senior Data Engineer (New)
Location: India, Hyderabad
Salary: Not provided
Company: Uber
Expiration Date: Until further notice

Are you a data architect with a passion for building robust, scalable systems? Your search for Data Engineer - PySpark jobs ends here. A Data Engineer specializing in PySpark plays a pivotal role in the modern data ecosystem, constructing the foundational data infrastructure that powers analytics, machine learning, and business intelligence. These professionals are the master builders of the data world, transforming raw, unstructured data into clean, reliable, and accessible information for data scientists, analysts, and business stakeholders. If you are seeking jobs where you can work with cutting-edge big data technologies to solve complex data challenges at scale, this is your domain.

In this profession, typical responsibilities revolve around the entire data pipeline lifecycle. Data Engineers design, develop, test, and maintain large-scale data processing systems. A core part of their daily work involves writing efficient, scalable code using PySpark, the Python API for Apache Spark, to perform complex ETL (Extract, Transform, Load) or ELT processes. They build and orchestrate data pipelines that ingest data from diverse sources, such as databases, APIs, and log files, into data warehouses like Snowflake or data lakes on cloud platforms like AWS, Azure, and GCP. Ensuring data quality and reliability is paramount: they implement robust data validation, monitoring, and observability frameworks to guarantee that data is accurate, timely, and trusted. They are also tasked with optimizing the performance and cost of these systems, fine-tuning Spark jobs for maximum efficiency, and automating deployment through CI/CD and Infrastructure as Code (IaC) practices.

To excel in Data Engineer - PySpark jobs, a specific and powerful skill set is required. Mastery of Python and PySpark is non-negotiable, as PySpark is the primary tool for distributed data processing. Profound knowledge of SQL is essential for data manipulation and querying. Experience with workflow orchestration tools like Apache Airflow is a common requirement for managing complex pipeline dependencies, and a deep understanding of cloud data solutions (AWS, GCP, Azure) and platforms like Databricks is highly valued. Beyond technical prowess, successful candidates possess strong problem-solving abilities to debug and optimize data flows, a keen eye for system design and architecture, and excellent collaboration skills for working with cross-functional teams, including data scientists and business analysts. They are often expected to mentor junior engineers and to help establish data engineering best practices and standards across the organization.

If you are ready to build the future of data, explore the vast array of Data Engineer - PySpark jobs available and take the next step in your impactful career.
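As a simplified illustration of the ETL work described above, here is a minimal PySpark sketch. It is not taken from any listing on this page; the bucket paths, column names, and aggregation are hypothetical placeholders chosen only to show the extract, transform, and load steps.

# Minimal PySpark ETL sketch (illustrative only; paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

# Extract: read raw JSON events from a landing zone.
raw = spark.read.json("s3a://example-bucket/landing/events/2024-01-01/")

# Transform: drop rows missing key fields, derive a date column,
# and aggregate event counts per user per day.
clean = (
    raw.dropna(subset=["user_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)
daily_counts = (
    clean.groupBy("user_id", "event_date")
         .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet for downstream analytics or a warehouse load.
(daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-bucket/warehouse/daily_event_counts/"))

spark.stop()

In practice, a job like this would typically be scheduled by an orchestrator such as Apache Airflow, with data validation checks applied before the final write.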
