Wells Fargo is seeking a seasoned Senior Software Engineer to design, develop, and deliver complex solutions within our big data, cloud, and distributed systems ecosystem. This role is critical in supporting large‑scale data platforms and cloud‑ready applications that power key banking experiences across the enterprise. The ideal candidate will work closely with cross‑functional Agile teams to build secure, scalable, and high‑performance solutions. You will support both ongoing platform enhancements and new technology initiatives while ensuring all engineering, security, and compliance standards are met.
Job Responsibilities:
Design, develop, test, debug, and document large‑scale application components and platform features
Lead moderately complex engineering initiatives across big data, cloud, and distributed system environments
Analyze technical challenges and propose solutions grounded in strong engineering principles
Partner with engineering leads and product teams to shape technical strategies and long‑term roadmaps
Serve as an escalation point for complex design, coding, and testing activities, including performance tuning
Develop models, simulations, and analytics to support platform enhancements
Create ad‑hoc data queries to validate data quality and identify defects during testing
Collaborate closely with developers, testers, architects, and end‑users to drive successful implementation
Requirements:
4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
3+ years of experience with Spark, Hadoop, and Big Data technologies
3+ years of experience with Spark SQL, Spark Streaming, and the DataFrame/Dataset APIs
3+ years working with relational databases such as SQL Server, Oracle, or MySQL
2+ years building cloud‑ready solutions on AWS, GCP, or PCF
2+ years using CI/CD tools such as Jenkins, UrbanCode Deploy, or Harness
2+ years with performance and monitoring tools such as JMeter, Blazemeter, Splunk, or AppDynamics
2+ years integrating SQL and NoSQL data sources with Spark (e.g., MS SQL Server, MongoDB)
2+ years of experience with Apache Kafka or Confluent Enterprise
Strong understanding of the Hadoop ecosystem, cloud platforms, HDFS, ETL/ELT workflows, and Unix shell scripting
Nice to have:
2+ years working with automation and testing frameworks (e.g., Selenium, JUnit, Rest Assured, Jasmine, Karma)
2+ years of microservices development experience, including REST/SOAP service design and API architecture
Experience with PCF, OpenShift, or similar cloud‑native platforms
Experience with front‑end or interface technologies such as JavaScript, HTML, or XML