This position is NOT eligible for visa sponsorship. This role specializes in comprehensive data pipeline development and management, enabling our current Business Intelligence team to focus on analytics and business value while ensuring robust, scalable data integration solutions.
Job Responsibilities:
Develop and maintain ETL/ELT data pipelines leveraging Qlik Data Integration to populate the data warehouse's bronze, silver, and gold layers
Build consumer-facing datamarts, views, and push-down calculations to enable improved analytics by the BI team and Citizen Developers
Implement enterprise data integration patterns supporting batch, real-time, and hybrid processing requirements
Coordinate and monitor pipeline execution to ensure timely reload of the enterprise data warehouse (EDW)
Configure and manage Qlik Data Integration components including pipeline projects, lineage, data catalog, data quality, and data marketplace
Implement data quality rules and monitoring using Qlik and Talend tools
Manage the Qlik tenant, security, and access, and administer the Data Movement Gateway
Monitor and optimize data replication performance, latency, and throughput across all integration points
Implement comprehensive logging, alerting, and performance monitoring
Conduct regular performance audits and capacity planning for integration infrastructure
Establish SLA monitoring and automated recovery procedures for critical data flows
Provide technical expertise on Qlik Data Integration best practices and enterprise patterns
Support database administrators and infrastructure teams with replication and integration architecture
Lead technical discussions on data strategy and platform roadmap decisions
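The layered (bronze/silver/gold) warehouse pattern named in the responsibilities above can be sketched as follows. This is a minimal illustration using plain Python dicts in place of Qlik Data Integration's managed pipeline storage; the record fields (id, amount, region) and function names are assumptions for the example only.

```python
def to_bronze(raw_rows):
    """Bronze: land the raw records as-is, tagging each with its layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: clean and deduplicate -- drop rows missing an id,
    keep the last occurrence of each id."""
    latest = {}
    for row in bronze_rows:
        if row.get("id") is not None:
            latest[row["id"]] = dict(row, _layer="silver")
    return list(latest.values())

def to_gold(silver_rows):
    """Gold: aggregate into a consumer-facing datamart --
    total amount per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
    return totals

raw = [
    {"id": 1, "amount": 100, "region": "EMEA"},
    {"id": None, "amount": 5, "region": "EMEA"},   # rejected in silver
    {"id": 2, "amount": 200, "region": "APAC"},
    {"id": 1, "amount": 150, "region": "EMEA"},    # late correction of id 1
]
gold = to_gold(to_silver(to_bronze(raw)))
```

In a real deployment each layer would be a persisted table or view and the transformations would run inside the integration platform; the sketch only shows the flow of responsibility between layers.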
Requirements:
Bachelor's degree in Computer Science, Information Systems, or related technical field
4+ years of experience in enterprise data integration with at least 2 years of hands-on Qlik or Talend experience
Strong understanding of change data capture (CDC) technologies and real-time data streaming concepts
Strong understanding of data lake and data warehouse strategies, and data modelling
Advanced SQL skills with expertise in database replication, synchronization, and performance tuning
Experience with enterprise ETL/ELT tools and data integration patterns
Proficiency in at least one programming language (Java, Python, or SQL scripting)
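Change data capture, listed in the requirements above, can be illustrated at its simplest by diffing two keyed snapshots of a table to emit insert/update/delete events. Production CDC tools typically read the database transaction log instead of comparing snapshots; this hedged sketch shows only the event model, and all names and fields are illustrative assumptions.

```python
def capture_changes(before, after):
    """Return the CDC events needed to move snapshot `before` to `after`.
    Both snapshots map primary key -> row dict."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))       # new primary key
        elif before[key] != row:
            events.append(("update", key, row))       # changed row
    for key in before:
        if key not in after:
            events.append(("delete", key, None))      # key disappeared
    return events

snap_t0 = {1: {"name": "Ada"}, 2: {"name": "Bob"}}
snap_t1 = {1: {"name": "Ada L."}, 3: {"name": "Cy"}}
events = capture_changes(snap_t0, snap_t1)
```

Downstream, such events drive incremental (rather than full-reload) synchronization of replicas and warehouse layers, which is what keeps latency and throughput manageable across integration points.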
Nice to have:
Qlik Data Integration certification or Talend certification (Data Integration, Data Quality, or Big Data)
Experience with cloud platforms (AWS or Azure) and hybrid integration scenarios
Experience with Snowflake preferred
Understanding of data governance frameworks and regulatory compliance requirements
Experience with API management and microservices architecture