
Senior Data Integration Engineer


Levy Professionals


Location:
Netherlands, Amsterdam


Contract Type:
Not provided


Salary:
Not provided

Job Description:

As a Senior Data Integration Engineer, you will be part of a multidisciplinary data team working on a modern Data Lake environment. You will play a key role in designing, developing, and maintaining the data integration layer, ensuring reliable, scalable, and high-quality data delivery for downstream reporting and analytics.

Job Responsibility:

  • Design, build, and maintain a data integration layer within a large-scale Data Lake environment
  • Develop and manage complex ETL flows for batch and near real-time data ingestion
  • Create data mapping specifications from multiple source systems
  • Work extensively with enterprise data storage platforms, including Oracle-based solutions
  • Address data modelling and integration challenges across the data lifecycle
  • Coach and mentor junior engineers from a technical and architectural perspective
  • Contribute to automation, standardisation, and continuous improvement of data delivery
  • Collaborate closely with product teams in an agile, self-steering setup
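The batch ETL and data-mapping responsibilities above can be sketched in miniature. The posting does not prescribe a language or tooling; the following hypothetical Python example uses SQLite as a stand-in for an Oracle target, with invented table and column names, to show the extract → map → load shape of such a flow:

```python
import sqlite3

# Hypothetical mapping specification: source field -> target column
MAPPING = {"cust_id": "customer_id", "nm": "customer_name", "ctry": "country"}

def extract():
    # Stand-in for a source-system read (in practice, e.g. a JDBC/ODBC fetch)
    return [
        {"cust_id": 1, "nm": "Acme BV", "ctry": "NL"},
        {"cust_id": 2, "nm": "Globex", "ctry": "DE"},
    ]

def transform(rows):
    # Apply the mapping spec, renaming source fields to target columns
    return [{MAPPING[k]: v for k, v in row.items()} for row in rows]

def load(rows, conn):
    # Idempotent load into a staging table (INSERT OR REPLACE keyed on the PK)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_customer "
        "(customer_id INTEGER PRIMARY KEY, customer_name TEXT, country TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO stg_customer "
        "VALUES (:customer_id, :customer_name, :country)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0])  # 2
```

A real integration layer would drive the mapping specification from metadata rather than a hard-coded dict, but the separation of extract, transform, and load stages is the same.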

Requirements:

  • Minimum 10 years’ experience as a Data Engineer / Data Integration Engineer
  • Strong experience designing and building integration layers
  • Proven experience with enterprise ETL tooling (e.g. IBM DataStage or similar)
  • Solid knowledge of data modelling (Data Vault, dimensional modelling, or similar)
  • Strong Oracle database knowledge (Oracle Exadata or comparable platforms)
  • Deep understanding of data storage tooling and large-scale data platforms
  • Experience coaching and guiding junior engineers
  • Strong SQL skills
  • Fluent in English

Nice to have:

  • Experience with scheduling tools
  • Version control (Git or similar)
  • CI/CD pipelines (e.g. Azure DevOps pipelines)
  • Reporting or analytics tooling (e.g. Cognos or similar)
  • Agile / DevOps ways of working
  • Exposure to cloud-based data platforms or cloud migration initiatives

Additional Information:

Job Posted:
January 14, 2026


Similar Jobs for Senior Data Integration Engineer

Senior Data Engineer

Binariks is looking for a highly motivated and skilled Senior Data Engineer to j...
Salary:
Not provided
Binariks
Expiration Date:
Until further notice
Requirements:
  • 5+ years as a Data Engineer with experience working with AWS
  • Experience with Terraform
  • Experience with Datalakes, Medallion Architecture
  • Experience with SQL
  • Professional experience with Python
  • Experience with Athena, Glue, Glue Streaming, Kinesis, dbt
  • Able to work as part of a team to deliver against requirements in a collaborative and agile way
  • Excellent communication skills in English
Job Responsibility:
  • Managing and improving our cloud data lakehouse for analytic and aggregate data
  • Building data pipelines
  • Developing foundations for handling larger volumes of streaming data
  • Manage our cloud data infrastructure using IaC tooling
  • Assist teams on integrating with the data lake and consuming from it
  • Developing customer-facing data products, such as APIs and data timelines
  • Help develop our machine learning data capabilities
What we offer:
  • 18 days of paid annual leave
  • 10 sick leaves
  • Additional days off for special occasions
  • Medical Care
  • Health check-up
  • English Class
  • Play Room
  • IT Cluster membership
  • Business Trip
  • Tech Talks

Senior Data Engineer

Beacon Technologies is seeking a Senior Data Engineer for our client partner. Th...
Salary:
Not provided
Beacon Technologies
Expiration Date:
Until further notice
Requirements:
  • Data engineering (SQL)
  • Scripting (Python, PowerShell, Bash)
  • Custom development (API integration, file-based integration)
  • Strategic thinker
  • Mentor juniors
  • Governance aware
Job Responsibility:
  • Design integrations
  • Optimize rule structures
  • Lead analytics initiatives
  • Manage workforce identity (system access) for all corporate and field employees
What we offer:
  • Career advancement opportunities
  • Extensive training
  • Excellent benefits, including paid health and dental premiums for salaried employees
  • Full-time

Senior Data Engineer

As a Senior Data Engineer on our Core Engineering Data Team, you will design and...
Location:
United States, Boston
Salary:
111800.00 - 164000.00 USD / Year
SimpliSafe
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience
  • 4+ years of experience in software engineering, data engineering, or a related field, with at least 2 years focused on data operations or data infrastructure
  • Strong knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
  • Strong SQL knowledge and experience optimizing for data warehousing technologies like AWS Athena
  • Strong knowledge of Python for use in data transformation
  • Hands-on experience with ETL/ELT, schema design, and datalake technologies
  • Hands-on experience with data orchestration tools like Dagster, Airflow, or Prefect
  • Experience with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
  • Love of data and a passion for building reliable data products
Job Responsibility:
  • Collaborate with analysts, engineers, product managers, and stakeholders to design and implement solutions for Product and Engineering data workflows
  • Identify areas for improvement and contribute to centralized data platform
  • Manage data pipeline, orchestration, storage, and analytics infrastructure for Product and Engineering
  • Monitor performance and reliability of data pipelines, implementing solutions for scalability and efficiency
  • Optimize table structures to support query and usage patterns
  • Partner with producers of data across SimpliSafe to develop an understanding of data creation and meaning
  • Support data discovery, catalog, and analytics tooling
  • Implement and maintain data security measures and ensure compliance with data governance policies
  • Design and implement testing strategies and data quality validation to ensure the accuracy, reliability, and integrity of data pipelines
  • Contribute to the formation of our new team, assisting with the development of team norms, practices, and charter
What we offer:
  • A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
  • A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
  • Free SimpliSafe system and professional monitoring for your home
  • Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
  • Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits
  • Full-time

Senior Data Engineer

We are looking for a Data Engineer to join our team and support with designing, ...
Salary:
Not provided
Foundever
Expiration Date:
Until further notice
Requirements:
  • Minimum of 7 years’ experience in data engineering
  • Track record of deploying and maintaining complex data systems at an enterprise level within regulated environments
  • Expertise in implementing robust data security measures, access controls, and monitoring systems
  • Proficiency in data modeling and database management
  • Strong programming skills in Python and SQL
  • Knowledge of big data technologies like Hadoop, Spark, and NoSQL databases
  • Deep experience with ETL processes and data pipeline development
  • Strong understanding of data warehousing concepts and best practices
  • Experience with cloud platforms such as AWS and Azure
  • Excellent problem-solving skills and attention to detail
Job Responsibility:
  • Design and optimize complex data storage solutions, including data warehouses and data lakes
  • Develop, automate, and maintain data pipelines for efficient and scalable ETL processes
  • Ensure data quality and integrity through data validation, cleansing, and error handling
  • Collaborate with data analysts, machine learning engineers, and software engineers to deliver relevant datasets or data APIs for downstream applications
  • Implement data security measures and access controls to protect sensitive information
  • Monitor data infrastructure for performance and reliability, addressing issues promptly
  • Stay abreast of industry trends and emerging technologies in data engineering
  • Document data pipelines, processes, and best practices for knowledge sharing
  • Lead data governance and compliance efforts to meet regulatory requirements
  • Collaborate with cross-functional teams to drive data-driven decision-making within the organization
What we offer:
  • Impactful work
  • Professional growth
  • Competitive compensation
  • Collaborative environment
  • Attractive salary and benefits package
  • Continuous learning and development opportunities
  • A supportive team culture with opportunities for occasional travel for training and industry events

Senior Data Engineer

The Data Engineer will build scalable pipelines and data models, implement ETL w...
Location:
United States, Fort Bragg
Salary:
Not provided
Barbaricum
Expiration Date:
Until further notice
Requirements:
  • Active DoD TS/SCI clearance (required or pending verification)
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience) OR CSSLP / CISSP-ISSAP
  • Strong programming skills in Python, Java, or Scala
  • Strong SQL skills
  • Familiarity with analytics languages/tools such as R
  • Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow)
  • Familiarity with cloud-based data services (e.g., AWS Redshift, Google BigQuery, Azure Data Factory)
  • Experience with data modeling, database design, and data architecture concepts
  • Strong analytical and problem-solving skills with attention to detail
  • Strong written and verbal communication skills
Job Responsibility:
  • Build and maintain scalable, reliable data pipelines to collect, process, and store data from multiple sources
  • Design and implement ETL processes to support analytics, reporting, and operational needs
  • Develop and maintain data models, schemas, and standards to support enterprise data usage
  • Collaborate with data scientists, analysts, and stakeholders to understand requirements and deliver solutions
  • Analyze large datasets to identify trends, patterns, and actionable insights
  • Present findings and recommendations through dashboards, reports, and visualizations
  • Optimize database and pipeline performance for scalability and reliability across large datasets
  • Monitor and troubleshoot pipeline issues to minimize downtime and improve system resilience
  • Implement data quality checks, validation routines, and integrity controls
  • Implement security measures to protect data and systems from unauthorized access
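The data-quality bullet in the list above (validation routines and integrity controls) can be sketched as a simple row-level check. This is a hypothetical Python example; the field names and rules are invented, not taken from the posting:

```python
def validate(rows, required, non_negative=()):
    """Split rows into (good, errors) after basic integrity checks."""
    good, errors = [], []
    for i, row in enumerate(rows):
        # Required fields must be present and non-empty
        missing = [f for f in required if row.get(f) in (None, "")]
        # Numeric fields listed in non_negative must not be negative
        negative = [
            f for f in non_negative
            if isinstance(row.get(f), (int, float)) and row[f] < 0
        ]
        if missing or negative:
            errors.append({"row": i, "missing": missing, "negative": negative})
        else:
            good.append(row)
    return good, errors

rows = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": None, "amount": 10.0},   # fails: missing key
    {"order_id": 3, "amount": -5.0},      # fails: negative amount
]
good, errors = validate(rows, required=["order_id"], non_negative=["amount"])
print(len(good), len(errors))  # 1 2
```

In a production pipeline the error records would typically be routed to a quarantine table with the reason attached, rather than silently dropped.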

Senior Data Engineer II

We are looking for a skilled Data Engineer to join our growing team. You will pl...
Location:
India, Hyderabad
Salary:
Not provided
Alter Domus
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
  • Technical Skills: Cloud & Orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
  • Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
  • AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
  • Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
  • Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
  • dbt or other SQL-based transformation frameworks for modular data processing
  • Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
  • Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Job Responsibility:
  • Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
  • Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability
  • Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
  • Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
  • Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
  • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
  • Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
  • Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
  • Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
  • Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
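The DAG retry logic mentioned above is typically configured in Airflow through task arguments such as `retries` and `retry_delay`; the underlying idea is retry with exponential backoff. A minimal, framework-free Python sketch of that idea (function names and delays are illustrative, not from the posting):

```python
import time

def with_retries(fn, retries=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, 0.04s, ...

# A flaky task that succeeds on its third call
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source error")
    return "ok"

print(with_retries(flaky_extract))  # ok (after 2 retries)
```

In Airflow the scheduler handles this for you per task instance; the wrapper above is only meant to show why retry bounds and backoff matter for transient source errors.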
What we offer:
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Plus additional local benefits depending on your location

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location:
India
Salary:
Not provided
3Pillar Global
Expiration Date:
Until further notice
Requirements:
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis
Job Responsibility:
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs
  • Full-time

Senior Data Engineer

As a Data Software Engineer, you will leverage your skills in data science, mach...
Location:
United States, Arlington; Woburn
Salary:
134000.00 - 184000.00 USD / Year
STR
Expiration Date:
Until further notice
Requirements:
  • Ability to obtain a Top Secret (TS) security clearance, for which the U.S. government requires U.S. citizenship
  • 5+ years of experience in one or more high level programming languages, like Python
  • Experience working with large datasets for machine learning applications
  • Experience in navigating and contributing to complex, large code bases
  • Experience with containerization practices and CI/CD workflows
  • BS, MS, or PhD in a related field or equivalent experience
Job Responsibility:
  • Explore a variety of data sources to build and integrate machine learning models into software
  • Develop machine learning and deep learning algorithms for large-scale multi-modal problems
  • Work with large data sets and develop software solutions for scalable analysis
  • Collaborate to create and maintain software for data pipelines, algorithms, storage, and access
  • Monitor software deployments, create logging frameworks, and design APIs
  • Build analytic tools that utilize data pipelines to provide actionable insights into customer requests
  • Develop and execute plans to improve software robustness and ensure system performance
  • Full-time