Job Description:
- Expertise in Data Architecture, Data Strategy, and Roadmap development for large, complex organizations and systems; experience implementing large-scale, end-to-end Data Management & Analytics solutions.
- Experience transforming traditional Data Warehousing approaches into Big Data-based approaches, with a proven track record of managing risk and data security.
- Expertise in dimensional modeling techniques: star and snowflake schemas, slowly changing dimensions, role-playing dimensions, dimensional hierarchies, and data classification (see the PySpark sketch after this list).
- Experience with cloud-native principles, designs, and deployments.
- Extensive experience working with and enhancing Continuous Integration (CI) and Continuous Delivery (CD) environments.
- Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, and Data Archival.
- Experience defining data migration strategies using appropriate tools.
- Ability to drive delivery in a matrixed environment, working with various internal IT partners.
- Demonstrated ability to work in a fast-paced, changing environment with short deadlines, interruptions, and multiple tasks/projects occurring simultaneously.
- Ability to work independently, with strong skills in planning, strategy, estimation, and scheduling.
- Strong problem-solving, influencing, communication, and presentation skills; a self-starter.
- Strong hands-on programming skills in PySpark.
- Experience with data processing frameworks and platforms (Kafka, Beam, Flink, SAP HANA, Hadoop, Presto, Tez, Hive, Spark, etc.).
- Hands-on experience with related/complementary open-source software platforms and languages (e.g., Java, Linux, Python, Git, Jenkins, MLOps).
- Hands-on experience with BI tools and reporting software (e.g., MS Power BI and Cognos Reporting).
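As an illustration of the hands-on PySpark and dimensional-modeling skills listed above, the sketch below applies a Type 2 slowly changing dimension update to a small customer dimension. It is a minimal sketch, not the team's actual pipeline: the table contents, the change feed, and all column names (customer_id, tier, valid_from, valid_to, is_current) are hypothetical, and concerns such as surrogate keys, late-arriving data, and incremental merges are deliberately left out.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Current state of a hypothetical customer dimension (SCD Type 2 layout).
dim = spark.createDataFrame(
    [("C001", "Silver", "2022-01-01", "2023-01-01", False),
     ("C001", "Gold",   "2023-01-01", None,         True),
     ("C002", "Silver", "2023-01-01", None,         True)],
    ["customer_id", "tier", "valid_from", "valid_to", "is_current"],
)

# Incoming changes from the source system (C001 moves to Platinum).
src = spark.createDataFrame(
    [("C001", "Platinum", "2024-06-01")],
    ["customer_id", "tier", "change_date"],
)

# Keys whose tracked attribute actually changed, with the change date.
changed = (dim.filter("is_current")
              .join(src.select("customer_id",
                               F.col("tier").alias("new_tier"),
                               "change_date"),
                    "customer_id")
              .filter(F.col("tier") != F.col("new_tier"))
              .select("customer_id", "change_date"))

# 1. Close out the currently active versions of the changed keys.
expired = (dim.join(changed, "customer_id", "left")
              .withColumn("valid_to",
                          F.when(F.col("is_current") & F.col("change_date").isNotNull(),
                                 F.col("change_date"))
                           .otherwise(F.col("valid_to")))
              .withColumn("is_current",
                          F.col("is_current") & F.col("change_date").isNull())
              .drop("change_date"))

# 2. Append a new current version for each changed key.
fresh = (src.join(changed, ["customer_id", "change_date"])
            .select("customer_id",
                    "tier",
                    F.col("change_date").alias("valid_from"),
                    F.lit(None).cast("string").alias("valid_to"),
                    F.lit(True).alias("is_current")))

scd2 = expired.unionByName(fresh)
scd2.orderBy("customer_id", "valid_from").show()
```

Running the sketch expires the Gold row for C001, appends a new current Platinum row, and leaves C002 untouched; in a production setting this logic would more likely be expressed as an incremental merge against a governed lakehouse table rather than a full rewrite.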