In this position, you will work independently on the design, modeling, and implementation of enterprise data warehousing activities in support of projects and the defect remediation process. You will participate in and lead segments of the programming and configuration of database warehouses, and provide independent support to warehouse users. You will also help design, implement, and support solutions that meet business requirements for data warehouses, data marts, and operational data stores, primarily using ETL. Lastly, you will work with application architects to implement system enhancements and remediate existing code.
Job Responsibilities:
Be a strong contributor to the implementation of data integration architecture and solutions.
Conduct source system/data analysis and data profiling.
Build robust data pipelines to collect, process and compute different metrics from various financial sources, adhering to quality and development standards.
Design application architecture and technical designs, and articulate data pipeline solutions to team members.
Map data from source to target and define transformation rules that conform to the models created by data architects.
Collaborate with cross-functional team members as necessary to arrive at optimal solutions that meet data demands.
Execute unit tests of the data populated in the target data container, validating expected results and ensuring quality and accuracy (a minimal sketch follows this list).
Coordinate with business users for user acceptance testing and with the operations team for code deployment to upper environments.
Follow the change management team's stipulations on path-to-production requirements and strictly adhere to compliance and regulatory needs.
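As an illustration of the mapping, loading, and validation work described above, the following is a minimal sketch of a source-to-target ETL step with unit-style data checks. The table name, column mapping, and transformation rule are hypothetical, and sqlite3 stands in for the target RDBMS; the sketch runs with only the Python standard library.

    import sqlite3

    # Hypothetical source-to-target column mapping, as a data architect's model might define it.
    COLUMN_MAP = {"txn_id": "transaction_id", "amt": "amount_usd", "ts": "posted_at"}

    def load_target(conn, source_rows):
        """Transform source rows per the mapping and load them into the target table."""
        conn.execute("CREATE TABLE IF NOT EXISTS fact_transactions "
                     "(transaction_id INTEGER PRIMARY KEY, amount_usd REAL NOT NULL, posted_at TEXT)")
        for row in source_rows:
            target = {COLUMN_MAP[key]: value for key, value in row.items()}
            target["amount_usd"] = round(float(target["amount_usd"]), 2)  # example transformation rule
            conn.execute("INSERT INTO fact_transactions "
                         "VALUES (:transaction_id, :amount_usd, :posted_at)", target)

    def validate_target(conn, expected_count):
        """Unit-style checks: row count matches the source, and a required column has no NULLs."""
        count = conn.execute("SELECT COUNT(*) FROM fact_transactions").fetchone()[0]
        nulls = conn.execute("SELECT COUNT(*) FROM fact_transactions "
                             "WHERE amount_usd IS NULL").fetchone()[0]
        assert count == expected_count, f"row count {count} != source count {expected_count}"
        assert nulls == 0, "amount_usd contains NULLs"

    source = [{"txn_id": 1, "amt": "19.991", "ts": "2024-01-02"},
              {"txn_id": 2, "amt": "5.25", "ts": "2024-01-03"}]
    conn = sqlite3.connect(":memory:")
    load_target(conn, source)
    validate_target(conn, expected_count=len(source))

In practice, the same checks would run against the actual warehouse tables as part of the deployment path, before user acceptance testing.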
Requirements:
Bachelor's Degree in Analytics, Mathematics, Statistics or Computer Science
Hands-on experience building data solutions on big data / cloud platforms.
Experience with the Hadoop ecosystem (HDFS, Hive, Pig, HBase, HiveQL, etc.); a brief illustration follows this list.
Experience with ETL/ELT tools such as Talend, DataStage, Informatica, and Sqoop, as well as building and enhancing ETL frameworks.
RDBMS experience with at least one of Teradata, Oracle, MS SQL Server, DB2, Redshift, Snowflake, etc.
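As a brief illustration of the big data skills listed above, the sketch below uses PySpark (assumed to be installed, running in local mode) to compute a daily metric with a HiveQL-style query. In a real Hadoop deployment the data would come from HDFS or Hive tables rather than an in-memory list, and the table and column names here are hypothetical.

    from pyspark.sql import SparkSession

    # Minimal local Spark session; on a cluster this would be configured for YARN and Hive.
    spark = SparkSession.builder.master("local[1]").appName("etl-sketch").getOrCreate()

    rows = [("2024-01-02", 19.99), ("2024-01-02", 5.25), ("2024-01-03", 7.50)]
    df = spark.createDataFrame(rows, ["posted_at", "amount_usd"])
    df.createOrReplaceTempView("fact_transactions")

    # HiveQL-style aggregation: daily totals, the kind of metric computation named above.
    daily = spark.sql("SELECT posted_at, SUM(amount_usd) AS total_usd "
                      "FROM fact_transactions GROUP BY posted_at ORDER BY posted_at")
    daily.show()
    spark.stop()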