We are seeking a Specialized IT Consultant to serve as a technical advisor and subject matter expert in data architecture, warehousing, and modern data lake implementation. The role centers on designing, maintaining, and optimizing sophisticated ETL (Extract, Transform, Load) processes within Azure Databricks. You will be responsible for the end-to-end movement of data, from real-time ingestion via Oracle GoldenGate to curated data layers built on the Medallion Architecture, ensuring high-performance analytics and compliance with the FHIR standard.
Job Responsibilities:
Data Pipeline Engineering: Design and implement robust ETL/ELT pipelines using Azure Databricks, Spark, and Delta Lake
Medallion Architecture Implementation: Structure data into Bronze (raw), Silver (cleansed), and Gold (business-ready) layers to ensure scalability and data quality
Real-Time Ingestion: Manage streaming data and Change Data Capture (CDC) using Oracle GoldenGate and Databricks Autoloader
Orchestration & Workflow: Schedule and manage complex job dependencies using Databricks Workflows and Azure Data Factory
Data Modeling: Define conceptual, logical, and physical models, including mapping from transactional sources to curated data marts
Governance & Security: Implement data governance, auditing, and access control policies using Unity Catalog
Optimization & Troubleshooting: Perform data profiling, performance tuning of Spark jobs, and troubleshoot complex data consistency issues
Knowledge Transfer: Develop comprehensive technical documentation and lead KT sessions for internal teams to ensure long-term system maintainability
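To make the Medallion layering mentioned above concrete, here is a toy, plain-Python sketch of the Bronze (raw) → Silver (cleansed) → Gold (business-ready) flow. A real pipeline would use Spark DataFrames and Delta tables in Databricks; the record shapes and field names below are purely illustrative.

```python
# Toy illustration of Medallion Architecture layers.
# Bronze: data landed as-is, including bad records.
raw_events = [
    {"patient_id": "P1", "heart_rate": "72", "ts": "2024-01-01T10:00"},
    {"patient_id": "P1", "heart_rate": "75", "ts": "2024-01-01T11:00"},
    {"patient_id": "P2", "heart_rate": "bad", "ts": "2024-01-01T10:30"},
]

def to_silver(bronze):
    """Silver: cast types and drop records that fail validation."""
    silver = []
    for rec in bronze:
        try:
            silver.append({**rec, "heart_rate": int(rec["heart_rate"])})
        except ValueError:
            pass  # quarantined in a real pipeline; dropped in this toy
    return silver

def to_gold(silver):
    """Gold: aggregate into a business-ready mart (avg heart rate per patient)."""
    gold = {}
    for rec in silver:
        gold.setdefault(rec["patient_id"], []).append(rec["heart_rate"])
    return {pid: sum(v) / len(v) for pid, v in gold.items()}

silver = to_silver(raw_events)
gold = to_gold(silver)
print(gold)  # {'P1': 73.5}
```

Each layer only ever reads from the layer before it, which is what makes the pattern scalable: bad data is isolated at the Bronze/Silver boundary, and downstream consumers see only validated, aggregated Gold tables.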
Requirements:
Databricks & Delta Lake Mastery: Extensive experience with Delta Lake for ACID transactions, schema evolution, and data versioning
Programming Proficiency: Strong expertise in Python, PySpark, and advanced SQL for data manipulation
Legacy & Modern Integration: Hands-on experience with Microsoft SSIS, T-SQL, and Oracle PL/SQL alongside modern cloud tools
Streaming Architecture: Expert knowledge of Structured Streaming, readStream, and real-time data handling
ETL Methodology: Deep understanding of incremental vs. full load strategies and Delta Live Tables (DLT)
Standards Awareness: Familiarity with FHIR (Fast Healthcare Interoperability Resources) standards within data ecosystems
Analytical Skills: Proven ability to conduct Fit-Gap analysis, system use case reviews, and complex data lineage documentation
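The incremental-vs.-full-load distinction in the requirements above can be sketched in a few lines. A full load reprocesses every source row on each run; an incremental load keeps a high-watermark (here, a modified timestamp) and picks up only rows changed since the last run. In Databricks this bookkeeping is what Auto Loader and Delta Live Tables automate; the plain-Python version below (with invented field names) only makes the contrast explicit.

```python
# Toy contrast of full vs. incremental (watermark-based) load strategies.
source = [
    {"id": 1, "modified": "2024-01-01"},
    {"id": 2, "modified": "2024-01-02"},
    {"id": 3, "modified": "2024-01-03"},
]

def full_load(rows):
    """Full load: reprocess every row, every run."""
    return list(rows)

def incremental_load(rows, watermark):
    """Incremental load: only rows modified after the stored watermark.
    Returns the new batch and the advanced watermark."""
    batch = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in batch), default=watermark)
    return batch, new_watermark

# First run: nothing processed yet, so incremental == full.
batch, wm = incremental_load(source, watermark="")
assert len(batch) == 3 and wm == "2024-01-03"

# Second run: a row changes upstream; only that row comes through.
source.append({"id": 2, "modified": "2024-01-04"})
batch, wm = incremental_load(source, wm)
print([r["id"] for r in batch])  # [2]
```

The trade-off is the usual one: full loads are simple and self-healing but expensive at scale, while incremental loads are cheap per run but require reliable change tracking (the role's CDC work with Oracle GoldenGate serves exactly that purpose).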
What we offer:
Technical Authority: Act as a senior advisor on enabling major business processes through IT enhancements
Modern Stack Exposure: Work at the forefront of data engineering with Unity Catalog, DLT, and real-time streaming technologies
Complex Data Landscapes: Solve challenging architectural puzzles involving the migration of legacy Oracle data into modern Delta Lakes
Strategic Impact: Directly influence the efficiency and structure of organizational I&IT systems through expert technical assistance