
Senior Data Engineering Architect


Lingaro

Location: Poland

Contract Type: Not provided

Salary: Not provided

Job Responsibilities:

  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
  • Stay updated with emerging trends and technologies in data engineering, recommending and implementing innovative solutions as appropriate
  • Conduct performance analysis and optimization of data engineering systems, identifying and resolving bottlenecks and inefficiencies
  • Ensure data quality and integrity throughout the data engineering processes, implementing appropriate validation and monitoring mechanisms
  • Collaborate with cross-functional teams to integrate data engineering solutions with other systems and applications
  • Participate in project planning and estimation, providing technical insights and recommendations
  • Document data architecture, infrastructure, and design decisions, ensuring clear and up-to-date documentation for implementation, reference and knowledge sharing

Requirements:

  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
  • Excellent communication and collaboration skills, with the ability to effectively communicate technical concepts to non-technical stakeholders
  • Experience in leading and mentoring data engineering teams, providing guidance and technical expertise
  • Familiarity with agile methodologies and experience in working in agile development environments
  • Continuous learning mindset, staying updated with the latest advancements and trends in data engineering and related technologies
  • Strong project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Strong understanding of distributed computing principles, including parallel processing, data partitioning, and fault-tolerance
  • Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree may be preferred
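The core tooling bullet above (Python, PySpark, SQL) comes down to building extract-transform-load pipelines. As an illustration only, here is a minimal ETL sketch in plain Python, with an in-memory CSV standing in for the source system and sqlite3 for the warehouse; in a role like this the same pattern would typically run in PySpark against a data lake, and all names here are hypothetical:

```python
import csv
import io
import sqlite3

# Extract: a tiny in-memory CSV stands in for the source system.
RAW = """order_id,customer,amount
1,alice,120.50
2,bob,80.00
3,alice,15.25
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and aggregate revenue per customer.
    totals = {}
    for row in rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + float(row["amount"])
    return sorted(totals.items())

def load(pairs, conn):
    # Load: write the aggregate into a warehouse-style table.
    conn.execute("CREATE TABLE IF NOT EXISTS revenue (customer TEXT PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO revenue VALUES (?, ?)", pairs)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW)), conn)
    print(conn.execute("SELECT customer, total FROM revenue ORDER BY customer").fetchall())
    # → [('alice', 135.75), ('bob', 80.0)]
```

The three-stage split mirrors the ETL processes and data pipelines named throughout the responsibilities above: each stage is independently testable and replaceable.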

What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions
  • A diverse, inclusive, and values-driven community
  • Autonomy to choose the way you work. We trust your ideas
  • Create our community together. Refer your friends to receive bonuses
  • Activities to support your well-being and health
  • Social fund benefits for everyone. All Lingarians can apply for social fund benefits, such as vacation co-financing
  • Plenty of opportunities to donate to charities and support the environment
  • Modern office equipment. Purchased for you or available to borrow, depending on your location

Additional Information:

Job Posted:
December 09, 2025

Similar Jobs for Senior Data Engineering Architect

Senior Data Architect

We are seeking a highly experienced Senior Data Architect with 12+ years of expe...
Location: United Arab Emirates, Dubai
Salary: Not provided
NorthBay
Expiration Date: Until further notice
Requirements:
  • 12+ years of experience in Data Engineering and Data Architecture
  • Proven experience working as a Data Architect on large-scale AWS platforms
  • Strong experience designing enterprise data lakes and data warehouses
  • Hands-on experience with batch data processing and orchestration frameworks
  • Excellent communication and stakeholder management skills
  • Ability to work onsite in Dubai, UAE
  • AWS Glue (ETL, Data Catalog)
  • Amazon EMR (Batch Processing)
  • AWS Lambda (Serverless Data Processing)
  • Amazon MWAA (Apache Airflow)
Job Responsibilities:
  • Design and own end-to-end AWS data architecture for enterprise platforms
  • Define data architecture standards, best practices, and reference models
  • Architect batch and event-driven data pipelines using AWS native services
  • Lead data ingestion, transformation, and orchestration workflows
  • Design and implement solutions using AWS Glue, EMR, Lambda, and MWAA (Airflow)
  • Architect data lakes and data warehouses using Amazon S3 and Amazon Redshift
  • Design NoSQL data solutions using Amazon DynamoDB
  • Implement data governance, metadata management, and access control using AWS DataZone
  • Ensure monitoring, logging, and observability using Amazon CloudWatch
  • Partner with engineering, analytics, and business teams to translate requirements into scalable data solutions
Contract Type: Fulltime

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...
Location: United Arab Emirates, Dubai
Salary: Not provided
NorthBay
Expiration Date: Until further notice
Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue
  • Amazon EMR (Spark)
  • AWS Lambda
  • Apache Airflow (MWAA)
  • Amazon EC2
  • Amazon CloudWatch
  • Amazon Redshift
  • Amazon DynamoDB
  • AWS DataZone
Job Responsibilities:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch
Contract Type: Fulltime
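Several of the listings above center on orchestrating workflows with Apache Airflow (MWAA): a pipeline is declared as tasks plus dependency edges and executed in dependency order. As a conceptual sketch only (task names are hypothetical, and a real deployment would use Airflow operators wired with `>>`), the same ordering can be shown with Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: ingest -> transform -> (load_warehouse, load_nosql)
# -> publish_metrics. Each key's value set lists its upstream dependencies.
dag = {
    "transform": {"ingest"},
    "load_warehouse": {"transform"},
    "load_nosql": {"transform"},
    "publish_metrics": {"load_warehouse", "load_nosql"},
}

def run(dag):
    executed = []
    for task in TopologicalSorter(dag).static_order():
        executed.append(task)  # a real runner would invoke the task here
    return executed

if __name__ == "__main__":
    print(run(dag))  # "ingest" always first, "publish_metrics" always last
```

The point of the sketch is the guarantee an orchestrator provides: no task runs before everything it depends on has completed.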

Senior Data Engineer

The mission of the business intelligence team is to create a data-driven culture...
Location: India, Hyderabad
Salary: Not provided
Randstad
Expiration Date: February 28, 2026
Requirements:
  • Master’s degree in Computer Science / Information Technology or related field, highly preferred
  • Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
  • Extensive Experience with data lakes, ETL and data warehouses
  • Advanced experience of building data pipelines
  • Passion for building quality BI software
  • Project Management and/or process improvement experience highly preferred
  • Polyglot coder with expert-level skills in multiple languages, including Python, R, Java, and SQL, plus experience with relational databases, ERP systems, and data visualization tools such as DOMO or Tableau
  • Advanced, proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure or AWS will be considered
  • Any exposure to Kafka, Spark, and Scala will be an added advantage
  • Should demonstrate a strong understanding of OOP concepts and methodologies
Job Responsibilities:
  • Architect and build complex data pipelines using advanced cloud data technologies
  • Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
  • Define industry best practices for building data pipelines
  • Ensure data security, compliance, and governance standards are met
  • Partner with leadership team to define and implement agile and DevOps methodologies
  • Serve as subject matter expert and define data architecture and infrastructure requirements
  • Partner with business analysts to plan project execution including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
  • Understand data technology trends and identify opportunities to implement new technologies and provide forward-thinking recommendations
  • Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes
  • Design and implement a robust data observability process
Contract Type: Fulltime

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location: United Kingdom, Manchester
Salary: Not provided
ANS Group
Expiration Date: Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Base knowledge of Azure foundry
Job Responsibilities:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
Contract Type: Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States, Lisle
Salary: 84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Two (2)+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI.
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as workflow management tools and build operators to connect, extract and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibilities:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Contract Type: Fulltime

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Location: Poland, Wroclaw
Salary: Not provided
Zoolatech
Expiration Date: Until further notice
Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibilities:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Sick Days
  • Sport/Insurance Compensation
  • English Classes
  • Charity
  • Training Compensation

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Location: Turkey, Istanbul
Salary: Not provided
Zoolatech
Expiration Date: Until further notice
Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibilities:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Hybrid Work (home/office)
  • Sick Days
  • Sport/Insurance Compensation
  • Holidays Day Off
  • English Classes
  • Training Compensation
  • Transportation compensation

Senior Data Engineer

At Relatient, we’re on a mission to simplify access to care – intelligently. As ...
Location: India, Pune
Salary: Not provided
Relatient
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree (B.E./B.Tech) in computer engineering or a related field, or equivalent work experience in lieu of a degree, is required; Master's degree preferred
  • 7+ years of experience in database engineering, data warehousing, or data architecture
  • Proven expertise with at least one major data warehouse platform (e.g. Postgres, Snowflake, Redshift, BigQuery)
  • Strong SQL and ETL/ELT development skills
  • Deep understanding of data modeling
  • Experience with cloud data ecosystems (AWS)
  • Hands-on experience with orchestration tools and version control (Git)
  • Experience in data governance, security, and compliance best practices
  • Experience building/generating analytical reports using Power BI
Job Responsibilities:
  • Architect, design, and implement robust end-to-end data warehouse (DW) solutions using modern technologies (e.g. Postgres or on-prem solutions)
  • Define data modeling standards (dimensional and normalized) and build ETL/ELT pipelines for efficient data flow and transformation
  • Integrate data from multiple sources (ERP, CRM, APIs, flat files, real-time streams)
  • Develop and maintain scalable and reliable data ingestion, transformation, and storage pipelines
  • Ensure data quality, consistency, and lineage across all data systems
  • Analyze and tune SQL queries, schemas, indexes, and ETL processes to maximize database and warehouse performance
  • Monitor data systems and optimize storage costs and query response times
  • Implement high availability, backup, disaster recovery, and data security strategies
  • Collaborate with DevOps and Infrastructure teams to ensure optimal deployment, scaling, and performance of DW environments
  • Work closely with Data Scientists, Analysts, and Business Teams to translate business needs into technical data solutions
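The query-tuning responsibility above (tuning SQL queries, schemas, and indexes) usually starts with reading the query plan before and after adding an index. As a minimal, self-contained demonstration only (table and index names are hypothetical, and a production warehouse such as Postgres or Redshift would use its own EXPLAIN output), SQLite's `EXPLAIN QUERY PLAN` shows the switch from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                 [(i % 10, float(i)) for i in range(100)])

def plan(conn, sql):
    # The `detail` column of EXPLAIN QUERY PLAN says whether SQLite
    # scans the whole table or searches via an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan(conn, "SELECT * FROM orders WHERE customer_id = 3")
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(conn, "SELECT * FROM orders WHERE customer_id = 3")

print(before)  # e.g. "SCAN orders" (full table scan)
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same before/after comparison generalizes to the larger tuning loop the bullet describes: measure the plan, change the schema or index, and confirm the plan actually improved.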
What we offer:
  • INR 5,00,000/- of life insurance coverage for all full-time employees and their immediate family
  • INR 15,00,000/- of group accident insurance
  • Education reimbursement
  • 10 national and state holidays, plus 1 floating holiday
  • Flexible working hours and a hybrid policy
Contract Type: Fulltime