Due to the immense growth of their Data Engineering and cloud portfolio, an exciting opportunity has opened up in one of their Sydney offices. You will join their elite team and play a key role in developing, building and enhancing enterprise-scale Data Engineering solutions on the Google Cloud Platform (GCP).
Job Responsibilities:
Developing, building and enhancing enterprise-scale Data Engineering solutions on the Google Cloud Platform (GCP)
Requirements:
Strong previous experience in Data Engineering and Data Warehousing
Experience building and deploying data warehouse solutions using GCP services, including BigQuery, Dataflow and Pub/Sub
Experience with Big Data and Hadoop, with a main focus on Spark, Hive, Presto (or other query engines); big data storage formats such as Parquet, ORC and Avro; and Bamboo/Ansible (or other CI/CD tools)
Experience working in a cloud-based environment (ideally AWS or GCP)
Strong experience in building ETL pipelines and frameworks using SQL and Python
Exposure to data modeling (Kimball, Data Vault or similar), with event-driven processing and big data processing using GCP Pub/Sub
Strong collaboration and communication skills
Only Australian Permanent Residents and Citizens are eligible for this role