Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
Purpose of the role:
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.
Job Responsibility:
Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data
Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
Development of processing and analysis algorithms fit for the intended data complexity and volumes
Collaboration with data scientists to build and deploy machine learning models
To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement
Lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources
Partner with other functions and business areas
Take responsibility for embedding new policies/procedures adopted for risk mitigation
Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to
Requirements:
Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs and Spark SQL (see the sketch after this list)
Hands-on experience in developing, testing and maintaining applications on the AWS cloud
Strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena)
Experience designing and implementing scalable and efficient data transformation/storage solutions using Snowflake
Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, CSV, etc.
Experience in using dbt (data build tool) with Snowflake for ELT pipeline development
Experience in writing advanced SQL and PL/SQL programs
Hands-on experience building reusable components using Snowflake and AWS tools/technologies
Should have worked on at least two major project implementations
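For illustration only, a minimal PySpark sketch of the kind of DataFrame and Spark SQL work this role involves; the S3 paths, view name and columns are hypothetical placeholders, not part of the role description:

from pyspark.sql import SparkSession

# Sketch: read Parquet data from S3 and aggregate it with Spark SQL.
# Bucket paths, the "orders" view and its columns are hypothetical.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Register a temporary view so the same data can be queried with Spark SQL
orders.createOrReplaceTempView("orders")

daily_totals = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")

# Write the aggregated result back to S3 as Parquet, partitioned by date
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)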
Nice to have:
Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage
Experience in using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage (see the sketch after this list)
Knowledge of the Ab Initio ETL tool is a plus
Ability to engage with stakeholders, elicit requirements/user stories and translate them into ETL components
Ability to understand the infrastructure setup and provide solutions either individually or by working with teams
Good knowledge of Data Marts and Data Warehousing concepts
Good analytical and interpersonal skills
Experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy
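For illustration only, a minimal orchestration sketch of the kind hinted at above: an Apache Airflow DAG (assuming Airflow 2.x) that schedules a daily dbt run against Snowflake; the DAG id, project and profiles directories are hypothetical placeholders:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Sketch: schedule a daily dbt run against Snowflake via Airflow.
# Paths and the dag_id are hypothetical; connection details would live
# in the dbt profile, not in this DAG.
with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )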