This long-term contract position offers the opportunity to work on cutting-edge data engineering projects while collaborating with multidisciplinary teams to deliver high-quality solutions. The ideal candidate will have a strong background in Databricks and big data technologies, along with a passion for optimizing data processes and systems.
Job Responsibilities:
Design, build, and enhance data pipelines using Databricks Runtime, Delta Lake, Autoloader, and Structured Streaming
Implement secure and governed data access protocols utilizing Unity Catalog, workspace controls, and audit configurations
Manage and integrate structured and unstructured data from diverse sources, including APIs and cloud storage
Develop and maintain notebook-based workflows and orchestrate jobs using Databricks Workflows
Apply best practices for performance tuning, scalability, and cost optimization in Databricks environments
Collaborate with data scientists, analysts, and business stakeholders to deliver clean and reliable datasets
Support continuous integration and deployment processes for Databricks jobs and system configurations
Ensure high standards of data quality and security across all engineering tasks
Troubleshoot and resolve issues to maintain operational efficiency in data pipelines
Requirements:
5–7 years of experience in data engineering or software development roles
At least 3 years of hands-on experience with Databricks, Delta Lake, and Spark (PySpark, Spark SQL, Python)
Proven expertise in implementing and managing Unity Catalog in multi-tenant environments
Strong knowledge of big data performance optimization techniques and cloud-based systems
Familiarity with GitHub, CI/CD pipelines, and automated deployment for Databricks assets
Experience with cloud storage services on Azure or AWS (such as Azure Data Lake Storage or Amazon S3), including networking configurations
Excellent problem-solving abilities and a demonstrated capacity to thrive in Agile team environments
Nice to have:
Databricks Certified Data Engineer certification is highly preferred
What we offer:
Medical, vision, dental, life, and disability insurance