Perform development work and technical support for our data transformation and ETL jobs in support of a global data warehouse. Communicates results to internal customers. Requires the ability to work independently, as well as in cooperation with a variety of customers and other technical professionals.
Job Responsibilities:
Perform development work and technical support related to our data transformation and ETL jobs in support of a global data warehouse
Develop new ETL/data transformation jobs using PySpark and IBM DataStage in AWS
Enhance and support existing ETL/data transformation jobs
Explain technical solutions and resolutions to internal customers and communicate feedback to the ETL team
Perform technical code reviews for peers moving code into production
Perform and review integration testing before production migrations
Provide a high level of technical support and perform root cause analysis for problems within your area of functional responsibility
Document technical specifications from business communications
Requirements:
5+ years of ETL experience
Experience with core Python programming for data transformation
Intermediate-level PySpark skills
Strong knowledge of SQL fundamentals
IBM DataStage experience preferred
Able to write SQL code sufficient for most business requirements
Proven track record in troubleshooting ETL jobs
Proficient in developing optimization strategies for ETL processes
Basic AWS technical support skills
Run and monitor jobs via Control-M
Can create clear and concise documentation and communications
Ability to coordinate and aggressively follow up on incidents and problems
Ability to prioritize and work on multiple tasks simultaneously
Effective in cross-functional and global environments
A self-starter who can work well independently and on team projects
Experienced in analyzing business requirements
Understands data dependencies and how to schedule jobs in Control-M
Experienced working at the command line in various flavors of UNIX
Bachelor of Science in Computer Science or equivalent
5+ years of ETL and SQL experience
3+ years of Python and PySpark experience
3+ years of AWS and UNIX experience
2+ years of IBM DataStage experience
1+ years of Snowflake experience
Nice to have:
Candidates with a Snowflake background or certification will be preferred