Bigdata Java Spark And Pyspark Developer India Jobs

3 Job Offers

Big Data Engineering Lead
Location: India, Chennai
Salary: Not provided
Company: Citi (https://www.citi.com/)
Expiration Date: Until further notice
Big Data Engineer
Location: India, Chennai
Salary: Not provided
Company: Citi (https://www.citi.com/)
Expiration Date: Until further notice
Bigdata Java Spark And Pyspark Developer
Location: India, Pune
Salary: Not provided
Company: Citi (https://www.citi.com/)
Expiration Date: Until further notice
Explore the dynamic world of Big Data Java Spark and PySpark Developer jobs, a pivotal role at the intersection of large-scale data processing and sophisticated software engineering. Professionals in this high-demand field are responsible for designing, building, and optimizing data-intensive applications that empower organizations to extract actionable insights from massive, complex datasets. They are the architects of robust data pipelines and analytical engines that drive business intelligence, machine learning, and real-time analytics.

A Big Data Java Spark and PySpark Developer typically engages in the end-to-end lifecycle of data solutions. This begins with analyzing business requirements and translating them into technical specifications for scalable and efficient data processing systems. A core responsibility involves application design and development, where they write, test, and maintain high-quality code using Java, Scala (for Spark), and Python (for PySpark) to manipulate and analyze petabytes of data. They leverage the Apache Spark framework to perform distributed data transformations, run complex algorithms, and build both batch and real-time streaming data pipelines. Troubleshooting is a significant part of the role, requiring developers to analyze and resolve performance bottlenecks, data quality issues, and system failures to ensure optimal application performance and reliability.

Common responsibilities for individuals in these jobs extend beyond pure coding. They often serve as technical leads on medium to large-scale projects, guiding design decisions and mentoring junior team members. This includes participating in and leading code reviews, ensuring adherence to best practices and architectural standards. They are also tasked with optimizing data workflows and Spark jobs for maximum efficiency and cost-effectiveness, which involves fine-tuning configurations, partitioning strategies, and memory management. Furthermore, these developers collaborate closely with data scientists, business analysts, and other engineering teams to understand operational needs and ensure the system's stability, scalability, and maintainability throughout its entire lifecycle. They play a key role in process improvement, streamlining development and deployment pipelines, often within an Agile methodology framework.

The typical skill set for these jobs is a powerful blend of deep technical expertise and strong analytical abilities. Proficiency in core Java is fundamental, alongside advanced skills in Apache Spark, including its Core, SQL, Streaming, and MLlib libraries. Strong Python programming skills are equally critical for PySpark development. A solid understanding of distributed systems principles, data structures, and algorithms is non-negotiable. Experience with the broader Hadoop ecosystem, SQL and NoSQL databases, and cloud platforms like AWS, Azure, or GCP is highly valued. From a professional standpoint, excellent problem-solving capabilities, the ability to drive mid-size projects to completion, and effective written and oral communication skills are essential for collaborating with cross-functional teams and effecting positive technical and cultural change. A bachelor's degree in computer science or a related field, or equivalent practical experience, is a standard requirement for these transformative roles.
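To make the batch-pipeline and tuning responsibilities described above more concrete, the following is a minimal PySpark sketch of a daily aggregation job. The data paths, column names, and shuffle-partition setting are illustrative assumptions for this sketch, not details taken from any of the listed roles.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical batch job: roll up settled transactions into daily totals per account.
spark = (
    SparkSession.builder
    .appName("daily-transaction-rollup")
    .config("spark.sql.shuffle.partitions", "200")  # example tuning knob, sized to the cluster
    .getOrCreate()
)

# Source path and schema are placeholders for illustration.
txns = spark.read.parquet("s3://example-bucket/transactions/")

daily_totals = (
    txns
    .filter(F.col("status") == "SETTLED")
    .groupBy("account_id", F.to_date("event_ts").alias("txn_date"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Partition the output by date so downstream reads can prune partitions efficiently.
(daily_totals
    .repartition("txn_date")
    .write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-bucket/daily_totals/"))

spark.stop()

In practice, the same transformation logic could be expressed with the Spark Java or Scala APIs; the choice of shuffle-partition count, partition columns, and output layout is exactly the kind of tuning decision the job description refers to.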