Do you want to be part of an enterprise data solutions team managing over 4 petabytes of data and building the next-generation analytics platform for a leading financial firm with over $10 trillion in assets under management? At Schwab, the Operational Data Exchange (ODX) organization owns the strategy, implementation, delivery, and support of the enterprise data warehouse and emerging data platforms. We are looking for someone with a passion for data and a software engineering background specializing in data; someone with experience designing and developing REST-based APIs and microservices, along with batch processing frameworks; someone who wants to join the Data Exchange team that is actively designing and implementing enterprise data solutions, who wants to be challenged every day, and who has a passion for keeping up to date on new technologies.
Job Responsibilities:
Design, develop, and maintain scalable data streaming pipelines using Java, Spring, and AWS and GCP native compute services (Cloud Functions, Cloud Run, and GKE) and GCP storage services (Cloud Storage, Cloud SQL, and Pub/Sub)
Develop and unit test high-quality, maintainable code; partner with QA to ensure comprehensive test coverage and zero-defect production releases
Develop and modify front-end UI components using React
Build reliable batch ingestion jobs to integrate Contact Center data from multiple upstream sources into the Operational Data Exchange (ODX) database
Streamline, simplify, and performance-tune batch and streaming data loads to improve throughput and minimize latency
Collaborate closely with business stakeholders and upstream application teams to understand requirements, align on data contracts, and build trusted relationships
Work with Production Support and Platform Engineering teams to triage and resolve production issues promptly, while ensuring data security and platform reliability
Follow agile and release management best practices to ensure smooth deployments and prevent production install failures
Stay current with evolving technologies and trends; continuously learn and apply modern patterns for data engineering and streaming
Communicate effectively across technical and non-technical audiences; demonstrate ownership, adaptability, and a collaborative mindset
Requirements:
Minimum 7 years of hands-on development experience with Java, Spring, and related technologies for Spring Batch, API, and microservice applications
3+ years' experience developing for and deploying to public cloud platforms, preferably GCP
3+ years' recent experience developing front-end applications using Angular or React
Development experience with data streaming technologies such as Kafka, Kinesis, or RabbitMQ
Experience establishing best practices for code design and construction, with strong Java and SQL skills to develop, tune, and debug complex applications
Hands-on experience with Linux and shell scripting
Hands-on experience with CI/CD tools like Bamboo, Jenkins, Bitbucket, GitHub, etc.
Nice to have:
Experience working with data systems, including database schema design, retrieval, and maintenance
Data warehouse knowledge and experience; IDMC/Informatica ETL experience is a plus
Experience leading and mentoring junior team members, both onsite and offshore
Experience in managing Operational Data Stores / Exchanges
What we offer:
401(k) with company match and employee stock purchase plan
Paid time for vacation, volunteering, and 28-day sabbatical after every 5 years of service for eligible positions