Senior data engineering role delivering large-scale data integration solutions across CRM ecosystems using ETL tooling, Python automation, and advanced SQL/NoSQL capabilities.
Job Responsibilities:
Data integration delivery: Design, build, and operate robust batch and near-real-time integration pipelines for CRM data domains (e.g., customers, products, orders, invoices, service interactions)
ETL/ELT engineering: Develop and optimize workflows using enterprise ETL tools; establish reusable patterns for ingestion, transformation, and publish layers
Python engineering: Build Python-based utilities and frameworks for data loads, automation, validation, reconciliation, and operational tooling
SQL excellence: Write and tune complex SQL (query optimization, indexing strategy awareness, incremental loads, CDC patterns, and performance troubleshooting)
NoSQL competence: Apply non-relational data modeling and query patterns where appropriate (document/columnar/key-value/graph), including performance and consistency considerations
Data quality and controls: Implement validation rules, error handling, auditability, and reconciliation controls; create monitoring and alerting to meet SLAs/SLOs
Architecture and design: Contribute to system architecture and integration design decisions (data contracts, schemas, idempotency, versioning, resiliency)
Security and compliance: Ensure data pipelines follow security best practices (encryption, access control, secrets management) and align with retention and privacy requirements
Collaboration: Work closely with CRM application teams, enterprise architects, and analytics consumers to align on data definitions and delivery priorities
Operational ownership: Participate in release planning, production support, incident triage, and continuous improvement to enhance reliability and reduce run-time cost
Requirements:
6-10 years of experience in data engineering, data integration, or platform engineering with a focus on large enterprise programs
Proven track record delivering data integration for ERP and/or CRM platforms in complex environments
Strong hands-on experience with one or more enterprise ETL/ELT tools (e.g., Informatica, DataStage, SSIS, Talend, Azure Data Factory, Matillion, dbt, or similar)
Advanced Python skills for data processing, automation, and scripting (packaging, logging, error handling, and testing discipline)
Expert-level SQL skills, including performance tuning and data warehousing concepts (dimensional modeling awareness a plus)
Working knowledge of NoSQL concepts and at least one NoSQL technology (implementation experience preferred)
Strong understanding of data pipeline fundamentals: incremental loading, late-arriving data, schema evolution, and end-to-end observability
Ability to communicate clearly with both technical and non-technical stakeholders; strong documentation and design skills
Nice to have:
Experience with streaming/event-driven data integration (e.g., Kafka or similar) and CDC-based ingestion patterns
Experience with cloud data platforms and modern lakehouse/warehouse ecosystems
Knowledge of data governance practices (metadata, lineage, data quality frameworks)
Experience implementing DevOps practices for data engineering (CI/CD for pipelines, infrastructure-as-code, automated testing)
What we offer:
medical, dental & vision coverage
401(k)
life, accident, and disability insurance
wellness programs
paid time off packages, including planned time off (vacation), unplanned time off (sick leave), and paid holidays