We are expanding the team of engineers that owns and operates our extensive data platform underpinning the SMA business. As a member of this team, you will build, enhance, and support a wide variety of data-centric applications and tools, working closely with your team and stakeholders to ensure data arrives complete, on time, and with validated quality.
Job Responsibilities:
Build, enhance, and support a wide variety of data-centric applications and tools to generate or consume data feeds from diverse systems
Generate reporting artifacts at scale
Expand the data estate in support of our users
Work with your team and your stakeholders to ensure data arrives in a complete and timely manner, with validated quality
Requirements:
BA/BS in Computer Science or equivalent practical experience
At least 3 years of post-university experience as a full-stack engineer
Solid knowledge of programming fundamentals—algorithms, data structures, design patterns, and paradigms
Strong knowledge of Python
Solid knowledge of SQL and relational databases
Ability to troubleshoot problems in a live environment and provide real-time help with critical issues
Ability to write code that is easily understood and maintained by other team members
Willingness to keep up to date with developments in the technologies we use
Ability to communicate and work effectively in support of various business departments
Ability to work in a fast-paced, interdisciplinary environment
Nice to have:
Knowledge of Extract-Transform-Load (ETL) and big data analytics tools
Experience working in a cloud environment (AWS, Azure, GCP)
Familiarity with popular Python data analytics frameworks
Familiarity with Snowflake or other large analytics engines
Experience with data warehousing and working in environments with large-scale, complex analytics requirements
Familiarity with DAG-based job scheduling tools
Experience with cloud-native container orchestration platforms and tools
Experience working in the Financial Services industry