Senior Data Engineer – Dublin. Hybrid – 3 days in the office.

We are seeking a Senior Data Engineer to join our growing team as part of an exciting Security Innovation program. This is an opportunity to design and deliver scalable, high-performance data solutions that leverage AI and Machine Learning to combat financial fraud. You’ll work with cutting-edge technologies across big data, cloud, and modern data platforms to build pipelines, optimize workflows, and support analytics at scale.
Job Responsibilities:
Build, optimize, and maintain ETL pipelines using Hadoop ecosystem tools (HDFS, Hive, Spark); a minimal pipeline sketch follows this list
Assemble, process, and manage large, complex datasets to support analytics, BI, and AI-driven applications
Collaborate with Software Engineers, Data Scientists, and Architects to deliver efficient, reliable, and scalable data processing solutions
Perform data modelling, quality checks, and system performance tuning to ensure accuracy and efficiency
Design and implement process improvements for automation, workflow orchestration, and data scalability
Support modernization efforts, including cloud adoption and Databricks integration
Take ownership of clarifying requirements and proposing scalable, secure solutions before implementation
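To give a concrete flavour of the first responsibility, here is a minimal PySpark batch ETL sketch: it reads raw events from HDFS, aggregates them per account per day, and writes the result to a Hive table. All paths, table names, and column names are hypothetical placeholders, not details of this role's actual platform.

```python
# Minimal PySpark ETL sketch: HDFS -> transform -> Hive.
# Paths, table names, and columns are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("transaction-etl")   # hypothetical job name
    .enableHiveSupport()          # needed to write managed Hive tables
    .getOrCreate()
)

# Extract: raw transaction events landed on HDFS as Parquet
raw = spark.read.parquet("hdfs:///data/raw/transactions/")  # placeholder path

# Transform: drop malformed rows, then aggregate per account per day
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("event_time"))
       .groupBy("account_id", "txn_date")
       .agg(
           F.count("*").alias("txn_count"),
           F.sum("amount").alias("total_amount"),
       )
)

# Load: write the curated result as a partitioned Hive table
(
    daily.write
         .mode("overwrite")
         .partitionBy("txn_date")
         .saveAsTable("curated.daily_account_activity")  # placeholder table
)

spark.stop()
```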
Requirements:
Strong experience with SQL and relational/NoSQL databases (Postgres, Oracle, Cosmos DB)
Hands-on expertise in big data frameworks such as Spark, Hadoop, Hive, Impala, Oozie, Airflow, and HDFS; see the DAG sketch after this list
Proficiency in Java, Scala, or Python for data engineering and automation
Solid understanding of distributed data processing, stream processing, and message queuing (Kafka, Spark Streaming, Storm); see the streaming sketch after this list
Cloud experience (preferably AWS) with services such as S3, Athena, EMR, Redshift, Glue, Lambda
Experience with Snowflake or similar data warehousing solutions
Familiarity with Databricks and AI/ML integration for data platforms
CI/CD, containerization (Docker), and workflow management experience
Comfortable working in Agile environments (Scrum, SAFe) and collaborating with cross-functional teams
Strong problem-solving skills and ability to extract value from large, complex datasets
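As an illustration of the stream-processing requirement, the following Spark Structured Streaming sketch consumes JSON transaction events from a Kafka topic and writes records matching a trivially simple rule to HDFS. The broker address, topic, schema, and threshold are all assumptions made for the example (a real deployment would use a proper fraud model), and the spark-sql-kafka connector package must be on the classpath.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> filter -> HDFS sink.
# Broker, topic, schema, and threshold are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("fraud-stream").getOrCreate()

# Assumed schema of the JSON payload on the topic
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", StringType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "transactions")               # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# A simple threshold rule stands in for a real fraud-scoring model
flagged = events.filter(F.col("amount") > 10000)

query = (
    flagged.writeStream
           .format("parquet")
           .option("path", "hdfs:///data/flagged/")             # placeholder sink
           .option("checkpointLocation", "hdfs:///chk/flagged/") # required for fault tolerance
           .start()
)
query.awaitTermination()
```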
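And for the workflow-orchestration requirement, a minimal Airflow DAG (2.x style) could schedule the batch ETL job sketched above once a day. The DAG id, schedule, and spark-submit path are placeholders for illustration only.

```python
# Minimal Airflow 2.x DAG sketch: run the batch ETL job daily.
# DAG id, schedule, and the spark-submit path are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_transaction_etl",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="spark_etl",
        # Submits the PySpark job sketched earlier; path is a placeholder
        bash_command="spark-submit /opt/jobs/transaction_etl.py",
    )
```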