Data Warehouse AWS Solution Architect / Data Engineer

Myn

Location:
Belgium, Bornem

Contract Type:
Contract work

Salary:

Not provided

Job Description:

Our important client is currently looking for a Data Warehouse AWS Solution Architect / Data Engineer. This is a 12-month contract based in Bornem; the day rate is negotiable. We are looking for an experienced full-time Data Warehouse AWS Solution Architect / Data Engineer to join the team and lead the migration and setup of AWS cloud-based data warehouse solutions.

Job Responsibility:

  • Drive the migration and implementation of a cloud-based data warehouse platform leveraging AWS services, Snowflake, DBT, and AWS Glue, transitioning from on-prem Oracle and Informatica ETL tools
  • Lead the integration and optimization of cloud-native data pipelines, primarily using AWS Glue and related AWS services
  • Support migration of existing BI data feeds, including Qlik (as a plus), to AWS Cloud, ensuring seamless integration with AWS Quicksight and other analytics tools
  • Collaborate with cross-functional teams and external partners to gather business requirements and coordinate project activities
  • Design, develop, and optimize scalable ETL/ELT pipelines and data models tailored for cloud environments
  • Monitor and enhance the performance, scalability, and data quality of AWS cloud data warehouse solutions
  • Document technical processes and provide user guidance on cloud data warehousing best practices
  • Manage project timelines effectively, particularly in partner collaboration and cloud migration initiatives
  • Stay up-to-date with the latest AWS cloud data warehousing technologies, tools, and best practices

Requirements:

  • Bachelor’s or advanced degree in Computer Science, Information Systems, or a related field
  • Minimum of 8 years’ experience in data warehousing, ETL development, and BI, with strong hands-on expertise in AWS cloud platforms
  • Proven track record of migrating data warehouse environments from on-premises to AWS and Snowflake cloud platforms
  • Deep knowledge of AWS services such as AWS Glue, AWS Quicksight, AWS S3, AWS Lambda, and other relevant cloud technologies
  • Experience with modern data transformation tools like DBT in cloud contexts
  • Strong proficiency in SQL; scripting skills in Python or similar languages are advantageous
  • Solid understanding of data modeling concepts, including star schema, snowflake schema, and dimensional modeling
  • Excellent project management skills with experience coordinating cross-functional and external partner collaborations
  • Familiarity with Agile methodologies and software development lifecycle
  • Excellent communication skills in English; additional languages are a plus
  • Self-motivated, detail-oriented, and able to work independently as well as collaboratively
  • At least 2 days on-site in Bornem or Aartselaar required

Nice to have:

  • Experience with Qlik data feed migration or similar BI tool migrations is a plus, but not mandatory
  • Experience in financial or automotive industries is beneficial

Additional Information:

Job Posted:
February 19, 2026

Expiration:
March 14, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Data Warehouse AWS Solution Architect / Data Engineer

Data Engineer (AWS)

Fyld is a Portuguese consulting company specializing in IT services. We bring hi...
Location:
Portugal, Lisboa
Salary:
Not provided
Fyld
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or related
  • Relevant certifications in AWS, such as AWS Certified Solutions Architect, AWS Certified Developer, or AWS Certified Data Analytics
  • Hands-on experience with AWS services, especially those related to Big Data and data analytics, such as Amazon Redshift, Amazon EMR, Amazon Athena, Amazon Kinesis, Amazon Glue, among others
  • Familiarity with data storage and processing services on AWS, including Amazon S3, Amazon RDS, Amazon DynamoDB, and AWS Lambda
  • Proficiency in programming languages such as Python, Scala, or Java for developing data pipelines and automation scripts
  • Knowledge of distributed data processing frameworks, such as Apache Spark or Apache Flink
  • Experience in data modeling, cleansing, transformation, and preparation for analysis
  • Ability to work with different types of data, including structured, unstructured, and semi-structured data
  • Familiarity with data architecture concepts such as data lakes, data warehouses, and data pipelines (not mandatory)
  • Knowledge of security and compliance practices on AWS, including access control, data encryption, and regulatory compliance
  • Fulltime

Senior Data Engineering Architect

Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables; support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Data Architect - Enterprise Data & AI Solutions

We are looking for a visionary Data Architect who can translate enterprise data ...
Location:
India, Chennai; Madurai; Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong background in RDBMS design, data modeling, and schema optimization
  • Advanced SQL skills, including performance tuning and analytics functions
  • Proven expertise in data warehouses, data lakes, and lakehouse architectures
  • Proficiency in ETL/ELT tools (Informatica, Talend, dbt, Glue)
  • Hands-on with cloud platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake)
  • Familiarity with GenAI frameworks (OpenAI, Vertex AI, Bedrock, Azure OpenAI)
  • Experience with real-time streaming (Kafka, Kinesis, Flink) and big data ecosystems (Hadoop, Spark)
  • Strong communication skills with the ability to present data insights to executives
  • 8+ years in data architecture, enterprise data strategy, or modernization programs
  • Hands-on with AI-driven analytics and GenAI adoption
Job Responsibility:
  • Design scalable data models, warehouses, lakes, and lakehouse solutions
  • Build data pipelines to support advanced analytics, reporting, and predictive insights
  • Integrate GenAI frameworks to enhance data generation, automation, and summarization
  • Define and enforce enterprise-wide data governance, standards, and security practices
  • Drive data modernization initiatives, including cloud migrations
  • Collaborate with stakeholders, engineers, and AI/ML teams to align solutions with business goals
  • Enable real-time and batch insights through dashboards, AI-driven recommendations, and predictive reporting
  • Mentor teams on best practices in data and AI adoption
What we offer:
  • Opportunity to design next-generation enterprise data & AI architectures
  • Exposure to cutting-edge GenAI platforms to accelerate innovation
  • Collaborate with experts across cloud, data engineering, and AI practices
  • Access to learning, certifications, and leadership mentoring
  • Competitive pay with opportunities for career growth and leadership visibility
  • Fulltime

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, AWS DataZone
Job Responsibility:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch
  • Fulltime

Senior Data Architect

We are seeking a highly experienced Senior Data Architect with 12+ years of expe...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date:
Until further notice
Requirements:
  • 12+ years of experience in Data Engineering and Data Architecture
  • Proven experience working as a Data Architect on large-scale AWS platforms
  • Strong experience designing enterprise data lakes and data warehouses
  • Hands-on experience with batch data processing and orchestration frameworks
  • Excellent communication and stakeholder management skills
  • Ability to work onsite in Dubai, UAE
  • AWS Glue (ETL, Data Catalog)
  • Amazon EMR (Batch Processing)
  • AWS Lambda (Serverless Data Processing)
  • Amazon MWAA (Apache Airflow)
Job Responsibility:
  • Design and own end-to-end AWS data architecture for enterprise platforms
  • Define data architecture standards, best practices, and reference models
  • Architect batch and event-driven data pipelines using AWS native services
  • Lead data ingestion, transformation, and orchestration workflows
  • Design and implement solutions using AWS Glue, EMR, Lambda, and MWAA (Airflow)
  • Architect data lakes and data warehouses using Amazon S3 and Amazon Redshift
  • Design NoSQL data solutions using Amazon DynamoDB
  • Implement data governance, metadata management, and access control using AWS DataZone
  • Ensure monitoring, logging, and observability using Amazon CloudWatch
  • Partner with engineering, analytics, and business teams to translate requirements into scalable data solutions
  • Fulltime

Data Architect

We are seeking a highly experienced Data Architect with 12+ years of experience ...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date:
Until further notice
Requirements:
  • 12+ years of experience in Data Engineering and Data Architecture
  • Proven experience working as a Data Architect on large-scale AWS platforms
  • Strong experience designing enterprise data lakes and data warehouses
  • Hands-on experience with batch data processing and orchestration frameworks
  • Excellent communication and stakeholder management skills
  • Ability to work onsite in Dubai, UAE
  • AWS Glue (ETL, Data Catalog)
  • Amazon EMR (Batch Processing)
  • AWS Lambda (Serverless Data Processing)
  • Amazon MWAA (Apache Airflow)
Job Responsibility:
  • Design and own end-to-end AWS data architecture for enterprise platforms
  • Define data architecture standards, best practices, and reference models
  • Architect batch and event-driven data pipelines using AWS native services
  • Lead data ingestion, transformation, and orchestration workflows
  • Design and implement solutions using AWS Glue, EMR, Lambda, and MWAA (Airflow)
  • Architect data lakes and data warehouses using Amazon S3 and Amazon Redshift
  • Design NoSQL data solutions using Amazon DynamoDB
  • Implement data governance, metadata management, and access control using AWS DataZone
  • Ensure monitoring, logging, and observability using Amazon CloudWatch
  • Partner with engineering, analytics, and business teams to translate requirements into scalable data solutions
  • Fulltime

Senior Data Engineer

At Rearc, we're committed to empowering engineers to build awesome products and ...
Location:
India, Bangalore
Salary:
Not provided
Rearc
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience in data engineering, showcasing expertise in diverse architectures, technology stacks, and use cases
  • Strong expertise in designing and implementing data warehouse and data lake architectures, particularly in AWS environments
  • Extensive experience with Python for data engineering tasks, including familiarity with libraries and frameworks commonly used in Python-based data engineering workflows
  • Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, DBT or AWS Glue
  • Hands-on experience with data analysis tools and libraries like Pyspark, NumPy, Pandas, or Dask
  • Proficiency with Spark and Databricks is highly desirable
  • Experience with SQL and NoSQL databases, including PostgreSQL, Amazon Redshift, Delta Lake, Iceberg and DynamoDB
  • In-depth knowledge of data architecture principles and best practices, especially in cloud environments
  • Proven experience with AWS services, including expertise in using AWS CLI, SDK, and Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK
  • Exceptional communication skills, capable of clearly articulating complex technical concepts to both technical and non-technical stakeholders
Job Responsibility:
  • Strategic Data Engineering Leadership: Provide strategic vision and technical leadership in data engineering, guiding the development and execution of advanced data strategies that align with business objectives
  • Architect Data Solutions: Design and architect complex data pipelines and scalable architectures, leveraging advanced tools and frameworks (e.g., Apache Kafka, Kubernetes) to ensure optimal performance and reliability
  • Drive Innovation: Lead the exploration and adoption of new technologies and methodologies in data engineering, driving innovation and continuous improvement across data processes
  • Technical Expertise: Apply deep expertise in ETL processes, data modelling, and data warehousing to optimize data workflows and ensure data integrity and quality
  • Collaboration and Mentorship: Collaborate closely with cross-functional teams to understand requirements and deliver impactful data solutions—mentor and coach junior team members, fostering their growth and development in data engineering practices
  • Thought Leadership: Contribute to thought leadership in the data engineering domain through technical articles, conference presentations, and participation in industry forums

Senior Data Engineer

At Relatient, we’re on a mission to simplify access to care – intelligently. As ...
Location:
India, Pune
Salary:
Not provided
Relatient
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree (B.E./B.Tech) in computer engineering, or equivalent work experience in lieu of a degree, is required; Master's degree preferred
  • 7+ years of experience in database engineering, data warehousing, or data architecture
  • Proven expertise with at least one major data warehouse platform (e.g. Postgres, Snowflake, Redshift, BigQuery)
  • Strong SQL and ETL/ELT development skills
  • Deep understanding of data modeling
  • Experience with cloud data ecosystems (AWS)
  • Hands-on experience with orchestration tools and version control (Git)
  • Experience in data governance, security, and compliance best practices
  • Experience building/generating analytical reports using Power BI
Job Responsibility:
  • Architect, design, and implement robust end-to-end data warehouse (DW) solutions using modern technologies (e.g. Postgres or on-prem solutions)
  • Define data modeling standards (dimensional and normalized) and build ETL/ELT pipelines for efficient data flow and transformation
  • Integrate data from multiple sources (ERP, CRM, APIs, flat files, real-time streams)
  • Develop and maintain scalable and reliable data ingestion, transformation, and storage pipelines
  • Ensure data quality, consistency, and lineage across all data systems
  • Analyze and tune SQL queries, schemas, indexes, and ETL processes to maximize database and warehouse performance
  • Monitor data systems and optimize storage costs and query response times
  • Implement high availability, backup, disaster recovery, and data security strategies
  • Collaborate with DevOps and Infrastructure teams to ensure optimal deployment, scaling, and performance of DW environments
  • Work closely with Data Scientists, Analysts, and Business Teams to translate business needs into technical data solutions
What we offer:
  • INR 5,00,000/- of life insurance coverage for all full-time employees and their immediate family
  • INR 15,00,000/- of group accident insurance
  • Education reimbursement
  • 10 national and state holidays, plus 1 floating holiday
  • Flexible working hours and a hybrid policy
  • Fulltime