You will be part of the team responsible for the design, modelling, and development of the entire GCP data ecosystem for one of our clients (Cloud Storage, Cloud Functions, BigQuery)
You will be involved throughout the whole process, starting with gathering, analyzing, modelling, and documenting business/technical requirements
The role will include direct contact with clients
Modelling data from various sources and technologies
Troubleshooting and supporting the most complex, high-impact problems to deliver new features and functionality
Designing and optimizing data storage architectures, including data lakes, data warehouses, or distributed file systems
Implementing techniques like partitioning, compression, or indexing to optimize data storage and retrieval
Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency
Identifying and resolving issues related to data processing, storage, or infrastructure
Monitoring system performance, identifying anomalies, and conducting root cause analysis to ensure smooth and uninterrupted data operations
Training and mentoring junior data engineers, providing guidance and knowledge transfer
Requirements:
8+ years’ experience as a Data Engineer (Python, GCP, BigQuery)
Object-oriented programming, Python, and SQL
Strong knowledge of cloud computing platforms, particularly Google Cloud; the candidate should be able to design, build, and deploy data pipelines in the cloud to ingest data from sources such as databases, APIs, or streaming platforms
Experience implementing data quality (DQ) checks, using frameworks such as CloudDQ or PyDeequ
Good knowledge of data quality dimensions (e.g., completeness, accuracy, consistency, timeliness)
Experience working with GCP cloud-based infrastructure & systems
Programming skills (SQL, Python, other scripting)
Proficient in data modelling techniques and database optimization
Knowledge of query optimization, indexing, and performance tuning is necessary for efficient data retrieval and processing
Proficient in SQL and NoSQL database management systems (BigQuery is a must)
The candidate should be able to design, configure, and manage databases to ensure optimal performance and reliability
Experience with data integration tools and techniques, such as ETL and ELT; the candidate should be able to integrate data from multiple sources and transform it into a format suitable for analysis
Nice to have:
Experience with Cloud Composer or Apache Airflow, and knowledge of Dataplex
What we offer:
Stable employment
“Office as an option” model
Flexibility regarding working hours and your preferred form of contract
Comprehensive online onboarding program with a “Buddy” from day 1
Cooperation with top-tier engineers and experts
Unlimited access to the Udemy learning platform from day 1
Certificate training programs
Upskilling support
Internal Gallup Certified Strengths Coach to support your growth