Lead Java Developer - Data Engineering

Charles Schwab

Location:
United States, Austin, TX

Contract Type:
Not provided

Salary:
152000.00 - 168000.00 USD / Year

Job Description:

This newly created role will support the future growth of the WAS business. As a seasoned, hands-on Lead Developer of Software Development Engineering, you will enable WME to achieve consistent, predictable, high-quality delivery by implementing best practices, tools, metrics, and automation frameworks, and by providing oversight in the areas of software development and testing. You will serve as a strong and versatile hands-on technical leader, delivering high-quality solutions that meet business objectives in a flexible, collaborative, and rapidly changing environment. You will play a critical role in supporting key stakeholders across the WAS organization, which is focused on high-net-worth and ultra-high-net-worth retail investors. Supported business functions span Schwab's Wealth and Investment Solutions, including Schwab Wealth Advisory. You will work with a team of talented and highly motivated technologists who strive to make technology a strategic differentiator for Schwab's WAS business and its clients.

Job Responsibility:

  • Lead team members and work with partners to deliver the Integration Fabric platform across the full software development lifecycle - design, development, testing, and support
  • Design, implement, influence, and refine architecture designs and technologies to develop various components of an Integration Fabric
  • Implement and coach on software development best practices as the go-to development partner for the scrum team members
  • Coach team members on software development best practices and data integration patterns
  • Assist leaders with talent acquisition by identifying resource needs, interviewing candidates, and onboarding new engineers and analysts
  • Define and maintain conceptual, logical, and physical database models by working with business stakeholders and identifying optimal database specifications
  • Ensure that database design and system capabilities meet validated user requirements
  • Work with application development teams designing, developing, and enhancing database schemas, creating database views, helping with query optimization, and figuring out ways to increase application performance
  • Enhance and recommend solutions / processes for downstream impact analysis associated with data and/or model changes
  • Work with various IT departments within Schwab to perform preventive and corrective maintenance and incident management measures
  • Ensure database integrity, availability, and restorability by working with enterprise operational excellence teams
  • Evaluate the integration of new tools or technologies based on the capabilities of the existing infrastructure
  • Recommend solutions and help with functional and performance testing needs (mock data, test environments, data refresh processes)
  • Coordinate planning for the installation of operating systems, databases, applications, development upgrades, and new releases
  • Follow and develop database standards, guidelines, and best practices
  • Establish standards, controls, and procedures to ensure data integrity and security
  • Perform capacity planning

Requirements:

  • Bachelor of Science in Computer Science or a related field
  • 7+ years of experience in software development roles with a focus on software development and data products, including 4+ years of engineering lead experience
  • 3+ years of experience with Postgres database management; must additionally have experience with Oracle, SQL Server, MySQL, and/or MongoDB
  • Must have worked alongside a business vertical to establish nomenclature, data catalogs, provenance, standards, semantics, and usage of data
  • Deep knowledge and experience with query development and optimization (SQL, PL/SQL, and NoSQL)
  • Experience in performance-tuning database applications is a must
  • Strong knowledge and experience with data integration design patterns
  • Strong experience working on medium or large-scale data integration projects with OLAP/OLTP systems
  • Experience in Domain Driven Design, Microservices and Database design
  • Experience in implementing software development engineering best practices as well as DevOps, CI/CD, Scrum, and Kanban
  • Hands-on experience coding in Java, building REST based APIs, using messaging and streaming technologies, and development tools such as Bamboo, Bitbucket, and Jira
  • Hands-on experience utilizing Spring Boot, Spring Batch, Spring Cloud Data Flow and other technologies to build data integration pipelines
  • Experience writing automated unit, integration, and acceptance tests for data interfaces & data pipelines is a must
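The Spring Batch and pipeline-testing requirements above revolve around the chunk-oriented read/process/write pattern. As a rough plain-Java sketch of that pattern, with no Spring dependencies (all class and method names here are invented for illustration):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Minimal sketch of the chunk-oriented read/process/write pattern that
// Spring Batch formalizes (ItemReader -> ItemProcessor -> ItemWriter).
public class ChunkedPipeline {

    // Reads records, transforms each one, and writes them downstream in
    // fixed-size chunks so work can be committed (or retried) per chunk.
    public static List<String> run(Iterator<String> reader, int chunkSize) {
        List<String> sink = new ArrayList<>();           // stands in for an ItemWriter
        List<String> chunk = new ArrayList<>(chunkSize);
        while (reader.hasNext()) {
            // "process" step: trim and normalize the record
            chunk.add(reader.next().trim().toUpperCase());
            if (chunk.size() == chunkSize) {
                sink.addAll(chunk);                      // "write" step: flush the chunk
                chunk.clear();
            }
        }
        sink.addAll(chunk);                              // flush the final partial chunk
        return sink;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of(" a ", "b", " c ").iterator(), 2));
        // prints [A, B, C]
    }
}
```

Because the read, process, and write stages are separated, the transform step can be unit-tested on small in-memory inputs like this, which is the shape of the automated pipeline tests the role calls for.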

Nice to have:

  • Exposure to various database flavors: Postgres, Oracle, MongoDB, SQL Server, MySQL
  • Exposure to data visualization tools such as Tableau
  • Exposure to ETL tools such as Informatica Intelligent Cloud Solutions
  • Proactively detect, troubleshoot, and remediate issues affecting production applications
  • Provide clear, concise, and timely communication to affected parties during the investigation and resolution of any individual or system-wide outage
  • Responsible for ensuring the Change Implementation Management policies are adhered to for all changes deployed to Production
  • Work with development teams at the appropriate stages to ensure the support strategy guidelines are followed and new systems or projects meet the Production standards
  • Constantly update the knowledge repository, ensuring information regarding any support-related activities or issues is available and easily accessible
  • Responsible for servicing all requests for data or other activities that require access to Production systems
  • Improve self-reliance and reduce dependency on the availability of development or external team resources for the initial troubleshooting and resolution of problems
  • Experience with scheduling tools like Control-M is preferred
  • Experience with other development tools, such as GitHub, Harness, LaunchDarkly, and mabl is preferred
  • Experience with Data Catalog tools like Collibra, Informatica Enterprise Data Catalog and Informatica Data Quality is a plus
  • Experience with AWS, GCP, or other Cloud technologies is a plus
  • Wealth Management and/or Financial Services industry experience is a plus
  • Ability to quickly learn & become proficient with new technologies
  • Exceptional interpersonal skills, including teamwork, communication, and negotiation

What we offer:
  • 401(k) with company match and employee stock purchase plan
  • Paid time for vacation and volunteering, and a 28-day sabbatical after every 5 years of service for eligible positions
  • Paid parental leave and adoption/family building benefits
  • Tuition reimbursement
  • Medical, dental, and vision benefits

Additional Information:

Job Posted:
January 12, 2026

Expiration:
January 19, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Lead Java Developer - Data Engineering

Lead Data Engineer

Sparteo is an independent suite of AI-powered advertising technologies built on ...
Salary:
Not provided
Sparteo
Expiration Date
Until further notice
Requirements
  • Proficiency in distributed data systems
  • Proficient in clustering, various table types, and data types
  • Strong understanding of materialized views concepts
  • Skilled in designing table sorting keys
  • Solid programming skills in Python, Java, or Scala
  • Expertise in database technologies (SQL, NoSQL)
  • You are comfortable using AI-assisted development tools (e.g., GitHub Copilot, Tabnine)
  • Proven experience leading data teams in fast-paced environments
  • Ability to mentor junior engineers and foster a culture of growth and collaboration
  • Data-driven decision-making abilities aligned with Sparteo's focus on results and improvement
Job Responsibility
  • Data Infrastructure Design and Optimization
  • Lead the design, implementation, and optimization of data architectures to support massive data pipelines
  • Ensure the scalability, security, and performance of the data infrastructure
  • Collaborate with software and data scientists to integrate AI-driven models into data workflows
  • Leadership and Team Management
  • Manage and mentor a team of 2 data engineers, fostering a culture of continuous improvement
  • Oversee project execution and delegate responsibilities within the team
  • Guide technical decisions and promote best practices in data engineering
  • Collaboration and Cross-Functional Engagement
  • Work closely with product managers, developers, and analytics teams to define data needs and ensure alignment with business objectives
What we offer
  • A convivial and flexible working environment, with our telecommuting culture integrated into the company's organization
  • A friendly and small-sized team that you can find in our offices near Lille or in Paris
  • Social gatherings and company events organized throughout the year
  • Sparteo is experiencing significant growth both in terms of business and workforce, especially internationally
  • Additional benefits include an advantageous compensation system with non-taxable and non-mandatory overtime hours, as well as a Swile restaurant ticket card
  • Fulltime

Lead Data Engineer

As a Lead Data Engineer at Rearc, you'll play a pivotal role in establishing and...
Location
India, Bengaluru
Salary:
Not provided
Rearc
Expiration Date
Until further notice
Requirements
  • 10+ years of experience in data engineering, data architecture, or related fields
  • Extensive experience in writing and testing Java and/or Python
  • Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, DBT or AWS Glue
  • Hands-on experience with data analysis tools and libraries like Pyspark, NumPy, Pandas, or Dask
  • Proficiency with Spark and Databricks is highly desirable
  • Proven track record of leading complex data engineering projects, including designing and implementing scalable data solutions
  • Hands-on experience with ETL processes, data warehousing, and data modeling tools
  • In-depth knowledge of data integration tools and best practices
  • Strong understanding of cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery)
  • Strong strategic and analytical skills
Job Responsibility
  • Understand Requirements and Challenges: Collaborate with stakeholders to deeply understand their data requirements and challenges
  • Implement with a DataOps Mindset: Embrace a DataOps mindset and utilize modern data engineering tools and frameworks, such as Apache Airflow, Apache Spark, or similar, to build scalable and efficient data pipelines and architectures
  • Lead Data Engineering Projects: Take the lead in managing and executing data engineering projects, providing technical guidance and oversight to ensure successful project delivery
  • Mentor Data Engineers: Share your extensive knowledge and experience in data engineering with junior team members, guiding and mentoring them to foster their growth and development in the field
  • Promote Knowledge Sharing: Contribute to our knowledge base by writing technical blogs and articles, promoting best practices in data engineering, and contributing to a culture of continuous learning and innovation

Team Lead Data Engineer

Data Management Platform is the core system that receives, processes and provide...
Salary:
Not provided
Coherent Solutions
Expiration Date
Until further notice
Requirements
  • Desire and readiness to perform a team lead role and a tech lead role
  • 5+ years of experience in Java
  • Strong knowledge of algorithms and data structures
  • Readiness to deep dive into legacy codebase
  • Experience with SQL DBs
  • Solid experience with Kafka, streaming systems, microservices
  • Experience in dealing with performance and high scale systems
  • Understanding of Hadoop/Spark/big data tools
  • Analytical thinking and the ability to deeply investigate tasks and understand how system components work from the business side
  • Reliability, confidence and readiness to deal with production issues
Job Responsibility
  • Perform the team lead / people management role for 3 of our engineers: 1:1s, ensuring high motivation and retention, working with feedback, mentoring, and tech support
  • Perform the tech lead role for a mixed team of roughly 5 customer and Coherent engineers: coordination, task distribution, technical assistance
  • End-to-end development and ownership, from design to production
  • Implement high scale Big-Data solutions and contribute to our platform infrastructure and architecture
  • Research core technologies and integrations with external APIs and services
  • Work with various stakeholders: Product, Engineering, Data providers, etc.
  • Participate in off-hours Pager Duty
What we offer
  • Technical and non-technical training for professional and personal growth
  • Internal conferences and meetups to learn from industry experts
  • Support and mentorship from an experienced employee to help you grow and develop professionally
  • Internal startup incubator
  • Health insurance
  • English courses
  • Sports activities to promote a healthy lifestyle
  • Flexible work options, including remote and hybrid opportunities
  • Referral program for bringing in new talent
  • Work anniversary program and additional vacation days

Lead Java Bigdata Developer

Senior level position responsible for establishing and implementing new or revis...
Location
India, Pune
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • Bachelor's or Master's degree in Computer Science, Engineering, or related field
  • Minimum 12 years of experience in full stack development with a focus on Java
  • Extensive experience in big data technologies such as Hadoop, Spark, Kafka
  • Proven leadership experience in managing large-scale data projects
  • Strong understanding of data governance principles and practices
  • Excellent problem-solving skills and ability to innovate solutions
  • Strong communication and interpersonal skills with ability to work collaboratively
  • Ability to prioritize and manage multiple tasks effectively
Job Responsibility
  • Partner with multiple management teams to ensure appropriate integration of functions
  • Resolve variety of high impact problems/projects through evaluation of complex business processes
  • Provide expertise in area and advanced knowledge of applications programming
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging
  • Develop comprehensive knowledge of how areas of business integrate to accomplish business goals
  • Provide in-depth analysis with interpretive thinking to define issues
  • Serve as advisor or coach to mid-level developers and analysts
  • Assess risk when business decisions are made
  • Design, develop, and maintain scalable architecture using Java and full stack technologies
  • Manage big data technologies for data integration, storage, and analysis
  • Fulltime

Big Data Lead Developer

We are seeking a highly skilled and experienced Big Data Lead Developer to estab...
Location
Canada, Mississauga
Salary:
170.00 USD / Year
Citi
Expiration Date
Until further notice
Requirements
  • 6+ years of relevant experience in Big Data application development or systems analysis role
  • Experience in leading and mentoring big data engineering teams
  • Strong understanding of big data concepts, architectures, and technologies (e.g., Hadoop, PySpark, Hive, Kafka, NoSQL databases)
  • Proficiency in programming languages such as Java, Scala, or Python
  • Excellent problem-solving and analytical skills
  • Strong presentation, communication and interpersonal skills
  • Experience with data warehousing and business intelligence tools
  • Experience with data visualization and reporting
  • Knowledge of cloud-based big data platforms (e.g., AWS EMR, Azure HDInsight, Google Cloud Dataproc)
  • Proficiency in Unix/Linux environments
Job Responsibility
  • Lead and mentor a team of big data engineers, fostering a collaborative and high-performing environment
  • Provide technical guidance, code reviews, and support for professional development
  • Design and implement scalable and robust big data architectures and pipelines to handle large volumes of data from various sources
  • Evaluate and select appropriate big data technologies and tools based on project requirements and industry best practices
  • Implement and integrate these technologies into our existing infrastructure
  • Develop and optimize data processing and analysis workflows using technologies such as Spark, Hadoop, Hive, and other relevant tools
  • Implement data quality checks and ensure adherence to data governance policies and procedures
  • Continuously monitor and optimize the performance of big data systems and pipelines to ensure efficient data processing and retrieval
  • Collaborate effectively with cross-functional teams, including data scientists, business analysts, and product managers, to understand their data needs and deliver impactful solutions
  • Stay up to date with the latest advancements in big data technologies and explore new tools and techniques to improve our data infrastructure
What we offer
  • Global benefits designed to support your well-being, growth, and work-life balance
  • Fulltime

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location
Singapore, Singapore
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibility
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits
  • Fulltime

Big Data Engineering Lead

The Senior Big Data engineering lead will play a pivotal role in designing, impl...
Location
India, Chennai
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • Bachelor's or Master’s degree in Computer Science, Information Technology, or related field
  • At least 10-12 years of overall software development experience, primarily building applications that handle large-scale data volumes across ingestion, persistence, and retrieval
  • Deep understanding of big data technologies, including Hadoop, Spark, Kafka, Flink, NoSQL databases, etc.
  • Experience with big data technologies: Hadoop, Apache Spark, Python, PySpark
  • Strong programming skills in languages such as Java, Scala, or Python
  • Excellent problem-solving skills with a knack for innovative solutions
  • Strong communication and leadership abilities
  • Proven ability to manage multiple projects simultaneously and deliver results
Job Responsibility
  • Lead the design and development of a robust and scalable big data architecture handling exponential data growth while maintaining high availability and resilience
  • Design complex data transformation processes using Spark and other big data technologies using Java, Pyspark or Scala
  • Design and implement data pipelines that ensure data quality, integrity, and availability
  • Collaborate with cross-functional teams to understand business needs and translate them into technical requirements
  • Evaluate and select technologies that improve data efficiency, scalability, and performance
  • Oversee the deployment and management of big data tools and frameworks such as Hadoop, Spark, Kafka, and others
  • Provide technical guidance and mentorship to the development team and junior architects
  • Continuously assess and integrate emerging technologies and methodologies to enhance data processing capabilities
  • Optimize big data frameworks, such as Hadoop, Spark, for performance improvements and reduced processing time across distributed systems
  • Implement data governance frameworks to ensure data accuracy, consistency, and privacy across the organization, leveraging metadata management and data lineage tracking
  • Fulltime

Data Engineering Lead

Data Engineering Lead a strategic professional who stays abreast of developments...
Location
India, Pune
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java with knowledge of core spark concepts (RDDs, Dataframes, Spark Streaming, etc) and Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibility
  • Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data
  • Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting
  • Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data
  • Fulltime