Responsibilities:
Part of the team accountable for the design, modelling, and development of the entire GCP data ecosystem for one of our Clients (Cloud Storage, Cloud Functions, BigQuery)
Involvement throughout the whole process, starting with gathering, analyzing, modelling, and documenting business/technical requirements
Direct contact with clients
Modelling data from various sources and technologies
Troubleshooting and resolving the most complex, high-impact problems to deliver new features and functionalities
Designing and optimizing data storage architectures, including data lakes, data warehouses, or distributed file systems
Implementing techniques like partitioning, compression, or indexing to optimize data storage and retrieval
Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency
Identifying and resolving issues related to data processing, storage, or infrastructure
Monitoring system performance, identifying anomalies, and conducting root cause analysis to ensure smooth and uninterrupted data operations
Training and mentoring less experienced data engineers, providing guidance and knowledge transfer
Requirements:
At least 4 years of experience as a Data Engineer working with GCP cloud-based infrastructure & systems
Deep knowledge of Google Cloud Platform and cloud computing services
Extensive experience designing, building, and deploying data pipelines in the cloud to ingest data from various sources such as databases, APIs, or streaming platforms
Proficiency with database management systems, both SQL (BigQuery is a must) and NoSQL
Programming skills (SQL, Python, other scripting languages)
Proficiency in data modelling techniques and database optimization
Knowledge of at least one orchestration and scheduling tool (Airflow is a must)
Experience with data integration tools and techniques, such as ETL and ELT
Knowledge of modern data transformation tools (such as dbt, Dataform)
Excellent communication skills
Advanced English level
Ability to actively participate/lead discussions with clients
Tools knowledge: Git, Jira, Confluence, etc.
Openness to learning new technologies and solutions
Experience in multinational environments and distributed teams
Nice to have:
Certifications in big data technologies and/or cloud platforms
Experience with BI solutions (e.g. Looker, Power BI, Tableau)
Experience with ETL tools (e.g. Talend, Alteryx)
Experience with Apache Spark, especially in a GCP environment
Experience with Databricks
Experience with Azure cloud-based infrastructure & systems
What we offer:
Stable employment
Full-time position with an employment contract
Medical Insurance
Grocery Coupons
Saving fund
30-day Christmas bonus
Remote work bonus
Profit sharing
50% vacation premium
Flexibility regarding working hours
Comprehensive online onboarding program with a “Buddy” from day 1
Cooperation with top-tier engineers and experts
Unlimited access to the Udemy learning platform from day 1
Certificate training programs
Upskilling support
Opportunities to grow as the company grows
A diverse, inclusive, and values-driven community
Autonomy to choose the way you work
Create our community together
Activities to support your well-being and health
Plenty of opportunities to donate to charities and support the environment