As a Data Engineer, you will design, develop, and implement a cost-effective, scalable, reusable, and secure ingestion framework. You will work with business leaders, various stakeholders, and source system SMEs to understand and define business needs, translate them into technical specifications, and ingest data into Google Cloud Platform (BigQuery). You will design and implement processes for the ingestion, transformation, storage, analysis, modelling, reporting, monitoring, availability, governance, and security of high volumes of structured and unstructured data.
Job Responsibilities:
Pipeline Design & Implementation: Develop and deploy high-throughput data pipelines using the latest GCP technologies
Subject Matter Expertise: Serve as a specialist in data engineering and Google Cloud Platform (GCP) data technologies
Client Communication: Leverage your GCP data engineering experience to engage with clients, understand their requirements, and translate these into technical data solutions
Technical Translation: Analyze business requirements and convert them into technical specifications. Create source-to-target mappings, enhance ingestion frameworks to incorporate internal and external data sources, and transform data according to business rules
Data Cataloging: Develop capabilities to support enterprise-wide data cataloging
Security & Privacy: Design data solutions with a focus on security and privacy
Agile & DataOps: Utilize Agile and DataOps methodologies and implementation strategies in project delivery
Requirements:
Bachelor’s or Master’s degree in Computer Science, Data & Analytics, or a similar relevant discipline
4+ years of hands-on IT experience in a similar role
Proven expertise in SQL – subqueries, aggregations, functions, triggers, indexes, database optimization, and creating/understanding relational data models
Deep experience working with Google data products (e.g. BigQuery, Dataproc, Dataplex, Looker, Cloud Data Fusion, Data Catalog, Dataflow, Cloud Composer, Analytics Hub, Pub/Sub, Dataprep, Cloud Bigtable, Cloud SQL, Cloud IAM, Google Kubernetes Engine, AutoML)
Experience with Qlik Replicate, Spark (Scala/Python/Java), and Kafka
Excellent written and verbal communication skills for conveying technical solutions to business teams
Understanding of trends, new concepts, industry standards, and emerging technologies in the Data and Analytics space
Ability to work with globally distributed teams
Knowledge of statistical methods and data modelling
Working knowledge of designing and creating Tableau/Qlik/Power BI dashboards, as well as Alteryx and Informatica Data Quality
What we offer:
Personal holidays
Healthcare
Pension
Tax saver scheme
Free Onsite Breakfast
Discounted Corporate Gym Membership
Multicultural environment
Learning, professional growth and development in a world-recognized international environment
Access to internal & external training, coaching & certifications
Recognition for innovation and excellence
Access to transportation: Grand Canal Dock is well-connected to public transportation, including DART trains, buses, and bike-sharing services, making it easy to get to and from the area