We are seeking a highly skilled Senior Data Engineer with strong expertise in data integration, ETL/ELT, and cloud data platforms. The ideal candidate will design and build scalable data pipelines, develop integration strategies, and ensure high-quality, reliable data movement across enterprise systems.
Job Responsibilities:
Define and implement the data integration strategy, architecture, and roadmap (batch vs. real-time, API vs. ETL)
Design, develop, and maintain scalable data pipelines for internal and external data sources
Build and manage ETL/ELT processes using tools such as SSIS and modern data platforms
Develop API-based integrations and data transformation workflows
Ensure data quality through validation, cleansing, and reconciliation processes
Optimize data workflows for performance, scalability, and reliability
Implement monitoring, logging, error-handling, and alerting mechanisms for data pipelines
Maintain documentation for data flows, mappings, and transformation logic
Ensure compliance with data governance, security, and regulatory standards (PII, etc.)
Collaborate with business and technical stakeholders to understand data requirements and deliver solutions
Requirements:
7+ years of experience in Data Engineering and Data Integration
Strong expertise in Data Warehousing concepts and architecture
Hands-on experience with ETL/ELT tools (SSIS preferred)
Strong proficiency in SQL Server or other RDBMS
Experience building and maintaining data pipelines
Experience implementing data quality frameworks (QA/QC)
Hands-on experience with cloud data platforms (e.g., Snowflake)
Experience with BI tools (e.g., Power BI)
Experience with API integrations (e.g., MuleSoft)
Strong communication and stakeholder management skills
Nice to have:
Experience with real-time data pipelines (Kafka, Python)
Knowledge of Data Lakehouse architecture
Experience handling sensitive data (PII, HIPAA compliance)
Exposure to Salesforce integrations
Familiarity with AWS (Lambda), Kubernetes, or containerization tools