The Data Platform team at Deel is dedicated to building a secure, high-performance infrastructure that empowers our 100+ data professionals. In 2026, our core mission has expanded: we aren't just moving data; we are ensuring its integrity and security. As we scale, your primary focus will be architecting and implementing a robust, unified permission system across our modern data stack—ensuring the right people have the right access at the right time, while keeping Deel compliant with global standards.
Job Responsibilities:
Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse
Architect Permissions-as-Code: Design and implement scalable Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) models within Snowflake and Looker (a minimal sketch of this approach follows this list)
Unify the Stack: Create a seamless bridge between data warehouse roles and BI tool permissions, ensuring that security policies in Snowflake (like Row-Level Security) are accurately reflected in Looker
Audit & Comply: Support internal and external audits by building automated reporting on data access, usage, and compliance status (GDPR, SOC2, etc.)
Develop and optimize data warehouse schemas and tables to support analytics and reporting needs
Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets
Implement data quality measures (such as validation checks and cleansing routines) to ensure data integrity and reliability (see the validation sketch after this list)
Collaborate with data analysts, data scientists, and other engineers to understand data requirements and deliver appropriate solutions
Document pipeline designs, data flows, and data definitions for transparency and future reference, adhering to team standards
Handle multiple tasks or projects simultaneously, prioritizing work and communicating progress to stakeholders to meet deadlines
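To illustrate the Permissions-as-Code responsibility above, here is a minimal sketch of managing warehouse access declaratively. The role names, schema names, and the ROLE_GRANTS mapping are hypothetical examples, not a description of Deel's actual roles or infrastructure.

    # Minimal Permissions-as-Code sketch. Role and schema names below are
    # illustrative assumptions; the spec would normally live in version control
    # so access changes are code-reviewed like any other change.
    from typing import Dict, List

    ROLE_GRANTS: Dict[str, List[str]] = {
        "ANALYST": ["ANALYTICS.CORE", "ANALYTICS.FINANCE"],
        "DATA_SCIENTIST": ["ANALYTICS.CORE", "ANALYTICS.ML_FEATURES"],
    }

    def render_grants(role: str, schemas: List[str]) -> List[str]:
        """Expand one role's spec into idempotent Snowflake GRANT statements."""
        statements: List[str] = []
        for schema in schemas:
            database = schema.split(".")[0]
            statements += [
                f"GRANT USAGE ON DATABASE {database} TO ROLE {role}",
                f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role}",
                f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role}",
                # FUTURE grants cover tables created after this job runs.
                f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {schema} TO ROLE {role}",
            ]
        return statements

    if __name__ == "__main__":
        for role, schemas in ROLE_GRANTS.items():
            for stmt in render_grants(role, schemas):
                print(stmt + ";")  # in practice, execute against Snowflake instead of printing

In practice each rendered statement would be executed against Snowflake (for example via snowflake-connector-python's cursor.execute) from an automated job, so every access change is versioned, reviewed, and auditable.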
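Similarly, for the data quality responsibility above, here is a small sketch of a post-load validation check. The column names (payment_id, amount, country_code) and the pandas-based approach are illustrative assumptions only.

    # Minimal data-quality validation sketch; table and column names are hypothetical.
    from typing import List
    import pandas as pd

    def validate_payments(df: pd.DataFrame) -> List[str]:
        """Return human-readable failure messages; an empty list means the batch passes."""
        failures: List[str] = []
        if df["payment_id"].duplicated().any():
            failures.append("payment_id contains duplicates")
        if (df["amount"] < 0).any():
            failures.append("amount contains negative values")
        if df["country_code"].isna().any():
            failures.append("country_code has missing values")
        return failures

    if __name__ == "__main__":
        # Small sample batch that trips all three checks.
        batch = pd.DataFrame({
            "payment_id": [1, 2, 2],
            "amount": [100.0, -5.0, 42.0],
            "country_code": ["US", None, "DE"],
        })
        for problem in validate_payments(batch):
            print("DATA QUALITY FAILURE:", problem)

A check like this would typically run as a task in the orchestration layer (e.g., Airflow) immediately after the load step, failing the pipeline before bad data reaches downstream models and reports.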
Requirements:
Bachelor’s or Master’s degree in a relevant field (e.g., Computer Science, Mathematics, Physics)
At least 3 years of experience in a data engineering or similar backend data development role
Strong SQL skills and experience with data modeling and building data warehouse solutions
Proficiency in at least one programming language (e.g., Python) for data processing and pipeline automation
Familiarity with ETL tools and workflow orchestration frameworks (e.g., Apache Airflow or similar)
Experience implementing data quality checks and working with large-scale datasets
Strong problem-solving abilities, along with the communication and teamwork skills needed to work with cross-functional stakeholders
Nice to have:
Advanced LookML: Experience building complex Looker models that integrate with Snowflake’s security layers
Security Tools: Experience with data discovery or cataloging tools (e.g., Select.dev)
dbt Governance: Experience using dbt "tests" and "docs" to enforce data quality and metadata standards
What we offer:
Stock grant opportunities dependent on your role, employment status and location
Additional perks and benefits based on your employment status and country
The flexibility of remote work, including optional WeWork access