
Talend ETL Data Engineer


Broster Buchanan

Location:
United Kingdom, Greater Manchester


Contract Type:
Not provided

Salary:
Not provided

Job Description:

A major client of ours is building a team of experienced Talend ETL Data Engineers for a long-term project migrating legacy systems and data into a Talend ecosystem.

Job Responsibility:

Migrating legacy systems and data into a Talend ecosystem

Requirements:

  • ETL/ELT pipelines, transformations, and data quality
  • Strong SQL and performance optimisation
  • Experience with data integration patterns (batch / real-time, CDC, APIs)
  • Solid problem-solving and debugging skills in the context of data pipelines
  • Version control and an understanding of CI/CD and deployment of data jobs
  • Must have been a UK resident for a minimum of 5 years to be able to gain BPSS clearance

Nice to have:

Data modelling experience

Additional Information:

Job Posted:
December 28, 2025


Similar Jobs for Talend ETL Data Engineer

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform, such as Ab Initio, Apache Spark, Talend, or Informatica
  • Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Contract Type: Fulltime

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms, such as Ab Initio, Apache Spark, Talend, and Informatica
  • Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Contract Type: Fulltime

Qlik Data Engineer

This position is NOT eligible for visa sponsorship. This role will specialize in...
Location: United States, Easton
Salary: Not provided
Victaulic
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Systems, or related technical field
  • 4+ years of experience in enterprise data integration with at least 2 years of hands-on Qlik or Talend experience
  • Strong understanding of change data capture (CDC) technologies and real-time data streaming concepts
  • Strong understanding of data lake and data warehouse strategies, and data modelling
  • Advanced SQL skills with expertise in database replication, synchronization, and performance tuning
  • Experience with enterprise ETL/ELT tools and data integration patterns
  • Proficiency in at least one programming language (Java, Python, or SQL scripting)
Job Responsibility:
  • Develop and maintain ETL/ELT data pipelines leveraging Qlik Data Integration for data warehouse generation in bronze, silver, gold layers
  • Build consumer facing datamarts, views, and push-down calculations to enable improved analytics by BI team and Citizen Developers
  • Implement enterprise data integration patterns supporting batch, real-time, and hybrid processing requirements
  • Coordinate execution of and monitor pipelines to ensure timely reload of EDW
  • Configure and manage Qlik Data Integration components including pipeline projects, lineage, data catalog, data quality, and data marketplace
  • Implement data quality rules and monitoring using Qlik and Talend tools
  • Manage Qlik Tenant, security, and access, and manage Data Movement Gateway
  • Monitor and optimize data replication performance, latency, and throughput across all integration points
  • Implement comprehensive logging, alerting, and performance monitoring
  • Conduct regular performance audits and capacity planning for integration infrastructure

Data Test Engineer

We are looking for a skilled Data Test Engineer who can design, build, and valid...
Location: India, Chennai
Salary: Not provided
OptiSol Business Solutions
Expiration Date: Until further notice
Requirements:
  • 4+ years of experience in Data Engineering and Data/ETL Testing
  • Strong expertise in writing and optimizing SQL queries (joins, subqueries, window functions, performance tuning)
  • Proficiency in Python or PySpark for data transformation and automation
  • Hands-on experience with ETL tools such as Azure Data Factory, Talend, SSIS, or Informatica
  • Familiarity with cloud platforms, preferably Azure
  • AWS or GCP is a plus
  • Experience working with data lakes, data warehouses (Snowflake, BigQuery, Redshift), and modern data platforms
  • Knowledge of version control systems (Git), issue tracking tools (JIRA), and Agile methodologies
  • Exposure to data testing frameworks like Great Expectations, DBT tests, or custom validation tools
  • Experience integrating data testing into CI/CD pipelines
Job Responsibility:
  • Design, develop, and maintain robust ETL/ELT pipelines to process large volumes of structured and unstructured data using Azure Data Factory, PySpark, and SQL-based tools
  • Collaborate with data architects and analysts to understand transformation requirements and implement business rules correctly
  • Develop and execute complex SQL queries to validate, transform, and performance-tune data workflows
  • Perform rigorous data validation including source-to-target mapping (S2T), data profiling, reconciliation, and transformation rule testing
  • Conduct unit, integration, regression, and performance testing for data pipelines and storage layers
  • Automate data quality checks using Python and frameworks like Great Expectations, DBT, or custom-built tools
  • Monitor data pipeline health and implement observability through logging, alerting, and dashboards
  • Integrate testing into CI/CD workflows using tools like Azure DevOps, Jenkins, or GitHub Actions
  • Troubleshoot and resolve data quality issues, schema changes, and pipeline failures
  • Ensure compliance with data privacy, security, and governance policies
What we offer:
  • Competitive salary aligned with industry standards
  • Hands-on experience with enterprise-scale data platforms and cloud-native tools
  • Opportunities to work on data-centric initiatives across AI, analytics, and enterprise transformation
  • Access to internal learning accelerators, mentorship, and career growth programs
  • Flexible work culture, wellness initiatives, and comprehensive health benefits
Contract Type: Fulltime

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • First Class Degree in Engineering/Technology (4-year graduate course)
  • 4 to 8 years' experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms, such as Ab Initio, Apache Spark, Talend, and Informatica
  • Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
What we offer:
  • Programs and services for physical and mental well-being including access to telehealth options, health advocates, confidential counseling
  • Empowerment to manage financial well-being and help plan for the future
Contract Type: Fulltime

Data Engineer

The IT company Andersen invites a Data Engineer to join its team for working wit...
Location: United Arab Emirates, Abu Dhabi
Salary: Not provided
Andersen
Expiration Date: Until further notice
Requirements:
  • Experience as a Data Engineer for 4+ years
  • Strong experience with SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, Cassandra, Redis)
  • Expertise in big data technologies (Hadoop, Spark, Kafka, Flink)
  • Hands-on experience with ETL/ELT tools (Apache Airflow, dbt, Talend, Informatica)
  • Proficiency in Python for data processing
  • Experience with cloud data platforms (Azure Synapse)
  • Understanding of data modeling, indexing, and partitioning for performance optimization
  • Knowledge of data lake architectures (Delta Lake, Iceberg, Hudi)
  • Familiarity with data security best practices (RBAC, encryption, GDPR, HIPAA compliance)
  • Experience with data lineage, metadata management, and observability tools
Job Responsibility:
  • Designing and managing data pipelines, ETL processes, and data storage solutions
  • Working with SQL, NoSQL, and big data technologies
  • Ensuring data quality, governance, and compliance with security standards
  • Collaborating with AI Engineers to prepare and process datasets for machine learning
  • Developing and optimizing scalable data architectures
  • Automating ETL/ELT workflows
  • Implementing data streaming solutions
  • Ensuring data security and compliance by implementing access controls, encryption, and monitoring
  • Monitoring and maintaining data pipeline performance, troubleshooting issues and optimizing workflows
  • Building and maintaining data warehouses and data lakes
What we offer:
  • Experience in teamwork with leaders in FinTech, Healthcare, Retail, Telecom, and others
  • The opportunity to change the project and/or develop expertise in an interesting business domain
  • Job conditions – you can work fully remotely, from the office, or in a hybrid arrangement
  • Guarantee of professional, financial, and career growth
  • The opportunity to earn up to an additional 1,000 USD per month by participating in the company's activities, depending on your level of expertise; this amount is included in the annual bonus
  • Access to the corporate training portal
  • Bright corporate life (parties / pizza days / PlayStation / fruits / coffee / snacks / movies)
  • Certification compensation (AWS, PMP, etc)
  • Referral program
  • English courses

Data Engineer

The IT company Andersen invites a Data Engineer to join its team for working wit...
Location: United Arab Emirates, Abu Dhabi
Salary: Not provided
Andersen
Expiration Date: Until further notice
Requirements:
  • Experience as a Data Engineer for 3+ years
  • Strong experience with SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, Cassandra, Redis)
  • Expertise in big data technologies (Hadoop, Spark, Kafka, Flink)
  • Hands-on experience with ETL/ELT tools (Apache Airflow, dbt, Talend, Informatica)
  • Proficiency in Python for data processing
  • Experience with cloud data platforms (Azure Synapse)
  • Understanding of data modeling, indexing, and partitioning for performance optimization
  • Knowledge of data lake architectures (Delta Lake, Iceberg, Hudi)
  • Familiarity with data security best practices (RBAC, encryption, GDPR, HIPAA compliance)
  • Experience with data lineage, metadata management, and observability tools
Job Responsibility:
  • Designing and managing data pipelines, ETL processes, and data storage solutions
  • Working with SQL, NoSQL, and big data technologies
  • Ensuring data quality, governance, and compliance with security standards
  • Collaborating with AI Engineers to prepare and process datasets for machine learning
  • Developing and optimizing scalable data architectures
  • Automating ETL/ELT workflows
  • Implementing data streaming solutions
  • Ensuring data security and compliance by implementing access controls, encryption, and monitoring
  • Monitoring and maintaining data pipeline performance, troubleshooting issues and optimizing workflows
  • Building and maintaining data warehouses and data lakes
What we offer:
  • Experience in teamwork with leaders in FinTech, Healthcare, Retail, Telecom, and others
  • The opportunity to change the project and/or develop expertise in an interesting business domain
  • Job conditions – you can work fully remotely, from the office, or in a hybrid arrangement
  • Guarantee of professional, financial, and career growth
  • The opportunity to earn up to an additional 1,000 USD per month by participating in the company's activities, depending on your level of expertise; this amount is included in the annual bonus
  • Access to the corporate training portal
  • Bright corporate life (parties / pizza days / PlayStation / fruits / coffee / snacks / movies)
  • Certification compensation (AWS, PMP, etc)
  • Referral program
  • English courses

Data Architect

We are seeking an experienced Data Architect with deep technical expertise and a...
Location: United States
Salary: Not provided
InData Labs
Expiration Date: Until further notice
Requirements:
  • 7+ years of experience in data architecture, data engineering, or database design
  • Proven experience designing large-scale data systems in cloud environments (AWS, Azure, or GCP)
  • Strong expertise in relational and non-relational databases (e.g., PostgreSQL, SQL Server, MongoDB, Snowflake, Redshift, BigQuery)
  • Proficiency in data modeling tools (e.g., ER/Studio, ERwin, dbt, Lucidchart)
  • Hands-on experience with ETL frameworks, data pipelines, and orchestration tools (e.g., Apache Airflow, Fivetran, Talend)
  • Solid understanding of data governance, metadata management, and data lineage tools
  • Experience working with modern data stack technologies (e.g., Databricks, Kafka, Spark, dbt)
  • Strong SQL and at least one programming language (Python, Scala, or Java)
  • Excellent communication and leadership skills
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field
Job Responsibility:
  • Design and implement enterprise-grade data architectures to support analytics, reporting, and operational needs
  • Define data standards, data flows, and governance frameworks across systems and departments
  • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical data solutions
  • Develop and maintain logical and physical data models using modern modeling tools
  • Oversee data integration strategies including ETL/ELT pipelines, APIs, and real-time data ingestion
  • Evaluate, recommend, and implement new data technologies and tools aligned with industry best practices
  • Ensure data quality, security, and compliance across all platforms
  • Act as a technical mentor to engineering and analytics teams, promoting architectural consistency and knowledge sharing
  • Partner with DevOps and infrastructure teams to ensure optimal deployment, scalability, and performance of data systems
  • Lead initiatives in data warehousing, master data management, and data lakes (on-premise and cloud)
What we offer:
  • 100% remote with flexible hours
  • Work from anywhere in the world
  • Be part of a senior, talented, and supportive team
  • Flat structure – your input is always welcome
  • Clients in the US and Europe, projects with real impact
  • Room to grow and experiment with cutting-edge AI solutions