
Senior Data Engineer (DWH)


Scalefocus

Location:
Not provided

Contract Type:
Not provided

Salary:

Not provided

Job Description:

Are you a motivated Data Engineer (DWH), eager to grow as a professional and work on large-scale projects using approaches that are both diverse and cutting-edge? Are you passionate about innovative, top-notch software solutions, and do you want to join a team of like-minded professionals? Do you enjoy working in a fast-paced yet collaborative environment?

Requirements:

  • Experience in designing scalable data warehousing solutions
  • Experience in designing and implementing efficient ETL processes and data pipelines
  • Experience and knowledge in data migration techniques and processes
  • Proven ability to collaborate with technical and business teams and communicate effectively across channels
  • Excellent knowledge of and experience with SQL: developing stored procedures in T-SQL, database performance optimisation, and working with large datasets
  • Understanding of industry database and analytics technologies, including relational databases
  • Data integration from multiple sources: batch data processing, streaming, and API integrations
  • Deep understanding of data warehousing and data architecture concepts
  • Experience with version control systems (e.g., GitHub)
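As a rough illustration of the ETL bullets above, here is a minimal batch-pipeline sketch in plain Python. All names (extract, transform, load, the orders source) are hypothetical and not from the posting; a production DWH pipeline would typically use SQL and dedicated tooling.

```python
# Hypothetical sketch of a batch ETL step; names and data are invented.
# A real pipeline would read from a source system and write to a DWH.

def extract(source_rows):
    """Pull raw records from a source system (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Cleanse and conform: drop incomplete rows, store amounts as cents."""
    return [
        {"order_id": r["order_id"], "amount_cents": round(r["amount"] * 100)}
        for r in rows
        if r.get("order_id") is not None and r.get("amount") is not None
    ]

def load(rows, warehouse):
    """Idempotent upsert keyed on order_id, so re-runs don't duplicate rows."""
    for r in rows:
        warehouse[r["order_id"]] = r
    return warehouse

warehouse = {}
source = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.00},  # incomplete row, rejected
    {"order_id": 2, "amount": 42.50},
]
load(transform(extract(source)), warehouse)
load(transform(extract(source)), warehouse)  # safe re-run: no duplicates
```

The idempotent load (an upsert on the business key) is what makes re-running a failed batch safe, which is the practical core of "efficient ETL processes".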

Nice to have:

  • Experience with SSIS - designing, configuring, and deploying packages
  • Experience with Python scripting for data engineering
  • Experience with Agile software development and DevOps practices
  • Knowledge of both on-premises and cloud big data solutions
  • Experience with streaming technologies such as Kafka

What we offer:
  • Flexible benefits system
  • Generous referral bonuses and awards
  • Multitude of training, certification, and leadership programs
  • Mental health benefits and workshops
  • Frequent teambuilding activities, events and gatherings
  • Opportunity to grow in a multinational environment by working with international teams and clients

Additional Information:

Job Posted:
February 20, 2026

Employment Type:
Full-time

Similar Jobs for Senior Data Engineer (DWH)

Senior Data Engineer

We are looking for a highly skilled Senior Data Engineer to lead the design and ...
Location:
United Kingdom
Salary:
45000.00 - 60000.00 GBP / Year
Activate Group Limited
Expiration Date:
Until further notice
Requirements:
  • Proven experience as a Senior Data Engineer, BI/Data Warehouse Engineer, or similar
  • Strong hands-on expertise with Microsoft Fabric and related services
  • End-to-end DWH development experience, from ingestion to modelling and consumption
  • Strong background in data modelling, including star schema, dimensional modelling and semantic modelling
  • Experience with orchestration, monitoring and optimisation of data pipelines
  • Proficiency in SQL and strong understanding of database principles
  • Ability to design scalable data architectures aligned to business needs
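The star-schema modelling mentioned above can be illustrated with a toy example in plain Python. Table names and data are invented; in Microsoft Fabric this would live in Lakehouse tables and be queried with SQL.

```python
# Invented toy data: one dimension table keyed by a surrogate key,
# and a fact table referencing it, as in a star schema.

dim_customer = {
    101: {"customer_key": 101, "name": "Acme Ltd", "region": "North"},
    102: {"customer_key": 102, "name": "Globex", "region": "South"},
}

fact_sales = [
    {"customer_key": 101, "amount": 250.0},
    {"customer_key": 102, "amount": 100.0},
    {"customer_key": 101, "amount": 50.0},
]

# A typical star-schema query: aggregate facts grouped by a dimension attribute.
sales_by_region = {}
for row in fact_sales:
    region = dim_customer[row["customer_key"]]["region"]
    sales_by_region[region] = sales_by_region.get(region, 0.0) + row["amount"]
```

Keeping descriptive attributes in dimensions and numeric measures in facts is what makes group-by queries like this simple to serve from a semantic model.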
Job Responsibility:
  • Lead the design, architecture and build of a new enterprise data warehouse on Microsoft Fabric
  • Develop robust data pipelines, orchestration processes and monitoring frameworks using Fabric components (Data Factory, Data Engineering, Lakehouse)
  • Create scalable and high-quality data models to support analytics, Power BI reporting and self-service data consumption
  • Establish and enforce data governance, documentation and best practices across the data ecosystem
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
  • Provide technical leadership, mentoring and guidance to junior team members where required
What we offer:
  • 33 days holiday (including bank holidays)
  • Personal health cash plan – claim back the cost of things like dental and optical check-ups
  • Enhanced maternity / paternity / adoption / shared parental pay
  • Life assurance: three times basic salary
  • Free breakfasts and fruit
  • Birthday surprise for everybody

Senior Databricks Data Engineer

To develop, implement, and optimize complex Data Warehouse (DWH) and Data Lakeho...
Location:
Romania, Bucharest
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Proven, expert-level experience with the entire Databricks ecosystem (Workspace, Cluster Management, Notebooks, Databricks SQL)
  • In-depth knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced optimization techniques
  • Expertise in implementing and managing Delta Lake (ACID properties, Time Travel, Merge, Optimize, Vacuum)
  • Advanced/expert-level proficiency in Python (with PySpark) and/or Scala (with Spark)
  • Advanced/expert-level skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault)
  • Solid experience with a major Cloud platform (AWS, Azure, or GCP), especially with storage services (S3, ADLS Gen2, GCS) and networking.
Job Responsibility:
  • Design and implement robust, scalable, and high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform
  • Implement and optimize the Medallion architecture (Bronze, Silver, Gold) using Delta Lake to ensure data quality, consistency, and historical tracking
  • Implement the Lakehouse architecture efficiently on Databricks, combining best practices from DWH and Data Lake
  • Optimize Databricks clusters, Spark operations, and Delta tables to reduce latency and computational costs
  • Design and implement real-time/near-real-time data processing solutions using Spark Structured Streaming and Delta Live Tables
  • Implement and manage Unity Catalog for centralized data governance, data security and data lineage
  • Define and implement data quality standards and rules to maintain data integrity
  • Develop and manage complex workflows using Databricks Workflows or external tools to automate pipelines
  • Integrate Databricks pipelines into CI/CD processes
  • Work closely with Data Scientists, Analysts, and Architects to understand business requirements and deliver optimal technical solutions
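The Medallion architecture named in these responsibilities (Bronze raw, Silver cleansed, Gold business-ready) can be sketched as follows. This is an assumption-laden toy using plain Python lists; in practice each layer would be a Delta Lake table maintained with PySpark.

```python
# Toy Medallion sketch: plain lists stand in for Delta tables.

bronze = [  # raw ingested records, kept as-received
    {"ts": "2026-02-20", "sensor": "a", "reading": "21.5"},
    {"ts": "2026-02-20", "sensor": "a", "reading": "21.5"},  # duplicate
    {"ts": "2026-02-20", "sensor": "b", "reading": "bad"},   # unparseable
]

# Silver: deduplicate and enforce types / quality rules.
seen, silver = set(), []
for rec in bronze:
    key = (rec["ts"], rec["sensor"], rec["reading"])
    if key in seen:
        continue
    seen.add(key)
    try:
        silver.append({**rec, "reading": float(rec["reading"])})
    except ValueError:
        pass  # quarantine rows that fail validation

# Gold: business-level aggregate ready for BI consumption.
gold = {}
for rec in silver:
    gold[rec["sensor"]] = gold.get(rec["sensor"], 0.0) + rec["reading"]
```

The point of the layering is that raw data is never lost (Bronze), quality rules apply exactly once (Silver), and consumers only ever read curated aggregates (Gold).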
What we offer:
  • Full access to foreign language learning platform
  • Personalized access to tech learning platforms
  • Tailored workshops and trainings to sustain your growth
  • Medical insurance
  • Meal tickets
  • Monthly budget to allocate on flexible benefit platform
  • Access to 7 Card services
  • Wellbeing activities and gatherings.

Senior Data / Data Warehouse (DWH) Engineer

We are looking for a Senior Data / Data Warehouse Engineer to design and deliver...
Location:
Czechia, Prague
Salary:
70000.00 - 80000.00 CZK / Year
Algoteque
Expiration Date:
Until further notice
Requirements:
  • Strong experience in data engineering and data warehouse development
  • Hands-on experience with ETL/ELT pipeline design and implementation
  • Proficiency in SQL and experience with PL/pgSQL stored procedures
  • Experience with DBT for data modeling and transformation
  • Experience building APIs or data access layers using FastAPI
  • Strong understanding of data modeling concepts for OLTP and OLAP systems
  • Experience implementing data quality checks, validation, and reconciliation processes
  • Familiarity with Agile methodologies and tools such as Jira
  • Strong communication skills and ability to collaborate with cross-functional and distributed teams
Job Responsibility:
  • Own end-to-end data engineering and DWH delivery, from requirements and design to development, validation, and deployment
  • Design and maintain data models, data marts, and table structures for OLTP and OLAP reporting
  • Develop and maintain ETL/ELT pipelines and DBT-based transformation layers
  • Build automated data pipelines and FastAPI-based data access layers for application integration
  • Implement data validation, reconciliation, and quality checks using SQL
  • Optimize and maintain PL/pgSQL stored procedures for data processing
  • Collaborate with cross-functional and distributed teams to ensure timely and high-quality delivery
  • Work in an Agile environment using Jira and support demos or user training when needed

Azure Data Engineer- Senior Consultant

Location:
Not provided
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field is typically required; additional cloud certifications are advantageous
  • 9+ years of experience in data engineering or a related field
  • Strong technical skills in data engineering, including proficiency in programming languages such as Python, SQL, and PySpark
  • Familiarity with the Azure cloud platform (e.g., Azure Databricks, Data Factory, Data Lake) and experience implementing data solutions in a cloud environment
  • Expertise in working with various data tools and technologies, such as ETL frameworks, data pipelines, and data warehousing solutions
  • In-depth knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
  • Knowledge of data security and privacy regulations, and the ability to ensure compliance within data engineering projects
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams, stakeholders, and senior management
  • Continuous learning mindset, staying updated with the latest advancements and trends in data engineering and related technologies
Job Responsibility:
  • Provide technical expertise and direction in data engineering, guiding the team in selecting appropriate tools, technologies, and methodologies
  • Stay updated with the latest advancements in data engineering and ensure the team follows best practices and industry standards
  • Collaborate with stakeholders to understand project requirements, define scope, and create project plans
  • Support project managers to ensure that projects are executed effectively, meeting timelines, budgets, and quality standards
  • Monitor progress, identify risks, and implement mitigation strategies
  • Oversee the design and architecture of data solutions, collaborating with data architects and other stakeholders
  • Ensure data solutions are scalable, efficient, and aligned with business requirements
  • Provide guidance in areas such as data modeling, database design, and data integration
  • Align coding standards and conduct code reviews to ensure a proper level of code quality
  • Identify and introduce quality assurance processes for data pipelines and workflows
What we offer:
  • Stable employment
  • 100% remote
  • Flexibility regarding working hours
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community

Senior Data Engineer

As a Senior Data Engineer, you’ll design, build, and operate scalable, reliable ...
Location:
Bulgaria, Sofia; Varna
Salary:
Not provided
myPOS
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, or a related technical field (or equivalent practical experience)
  • 6+ years of experience as a Data Engineer, building and maintaining production-grade pipelines and datasets
  • Strong Python and SQL skills with a solid understanding of data structures, performance, and optimization strategies
  • Hands-on experience with orchestration (e.g., Airflow, Dagster, Databricks Workflows) and distributed processing in a cloud environment
  • Experience with analytical data modeling (star and snowflake schemas), DWH, ETL/ELT patterns, and dimensional concepts
  • Experience building reliable incremental data ingestion pipelines from DBs and APIs
  • Familiarity with at least one major cloud provider (GCP, AWS, Azure) and deploying data solutions in the cloud
  • Familiarity with CI/CD for data pipelines, IaC (Terraform), and/or DataOps practices
  • Strong troubleshooting mindset: ability to debug issues across data, infra, pipelines, and deployments
  • Collaborative mindset and clear communication across engineering, analytics, and business stakeholders
Job Responsibility:
  • Build and maintain data pipelines for ingestion, transformation, and export across multiple sources and destinations
  • Develop and evolve scalable data architecture to meet business and performance requirements
  • Partner with analysts and data scientists to deliver curated, analysis-ready datasets and enable self-service analytics
  • Implement best practices for data quality, testing, monitoring, lineage, and reliability
  • Optimize workflows for performance, cost, and scalability (e.g., tuning Spark jobs, query optimization, partitioning strategies)
  • Ensure secure data handling and compliance with relevant data protection standards and internal policies
  • Contribute to documentation, standards, and continuous improvement of the data platform and engineering processes
  • Ensure secure, compliant handling of data and models, including access controls, auditability, and governance practices
  • Build and maintain MLOps automation: CI/CD for ML, environment management, artifact handling, versioning of data/models/code
What we offer:
  • Vibrant international team operating in hi-tech environment
  • Annual salary reviews, promotions and performance bonuses
  • myPOS Academy for upskilling and training
  • Unlimited access to courses on LinkedIn Learning
  • Annual individual training and development budget
  • Refer a friend bonus
  • Teambuilding, social activities and networks on a multi-national level
  • Excellent compensation package
  • 25 days annual paid leave (+1 day per year up to 30)
  • Full “Luxury” package health insurance including dental care and optical glasses

Senior Data Engineer

We are looking for experienced engineers that are willing to solve complex techn...
Location:
Not provided
Salary:
Not provided
Itransition
Expiration Date:
Until further notice
Requirements:
  • 5+ years of Data Engineer experience
  • Understanding of DWH and ETL development principles and methodologies
  • Experience with cloud DWH solutions
  • Hands-on experience with Snowflake
  • Understanding of analytical and transactional processing
  • Advanced experience with RDBMS (MSSQL preferred) or NoSQL databases
  • Advanced in SQL querying (T-SQL preferred)
  • Experience with at least one VCS (Git, SVN, etc.)
  • English skills sufficient for spoken communication (Intermediate level and above)
What we offer:
  • Projects for such clients as PayPal, Wargaming, Xerox, Philips, adidas and Toyota
  • Competitive compensation that depends on your qualification and skills
  • Career development system with clear skill qualifications
  • Flexible working hours aligned to your schedule
  • Options to work remotely
  • Corporate medical insurance covering services of private and public medical centers
  • English courses online
  • Corporate parties and events for employees and their children
  • Internal conferences, workshops and meetups for learning and experience sharing
  • Gym membership compensation, corporate sport competitions (cybersport included)

Data Product Manager

Spendesk is seeking a Data Product Manager to join our data organization. You wi...
Location:
Spain, Barcelona
Salary:
Not provided
Spendesk
Expiration Date:
Until further notice
Requirements:
  • 4–7 years of experience in product management, data product ownership, or analytics product management, ideally from a former senior full-stack data IC role
  • A track record of owning internal data/analytics products (BI tools, semantic layers, data platforms, internal tools) from discovery through adoption
  • Experience working closely with Data/Analytics Engineers and business stakeholders (Product, Finance, RevOps)
  • Experience rolling out or migrating a BI or analytics platform is a strong plus
  • Solid understanding of modern data platforms and BI: data warehouses (ideally Snowflake), dbt, and self‑service analytics tools
  • Strong SQL skills for exploration and validation
  • Good understanding of data modeling concepts (dimensional modeling, semantic layers, shared vs domain models)
  • Familiarity with data quality, lineage, documentation, and governance
  • Ability to discuss trade‑offs on performance, complexity, and maintainability with engineers
  • Strong discovery skills: talking to users, identifying real problems, and defining success metrics
Job Responsibility:
  • Define and drive the strategy for our analytics platform (including the migration from Looker/Metabase to a new, AI-native BI tool) and our core semantic data layer
  • Work hand‑in‑hand with the Datapoints squad (Data Engineering, Analytics Engineering, Software Engineering, Team Lead/EM) to turn this strategy into well‑scoped initiatives and shipped outcomes
  • Act as the “coach” for data product practices across Spendesk, helping product squads and business units (FP&A, RevOps, etc.) become autonomous and data‑savvy
  • Ensure that our data products solve real problems, are discoverable and well‑documented, and are actually used in day‑to‑day decisions
  • Co‑lead (with the squad’s Team Lead & Analytics Engineers) the vision and roadmap for Spendesk’s internal data products: analytics platform, semantic layer, shared business models, and self‑service experience
  • Align stakeholders (Head of Data, Product, FP&A, RevOps, etc.) on a clear 6–12 month plan
  • Turn strategic objectives into concrete initiatives with clear problem statements, success metrics, and business impact
  • Co‑lead (with the EM & AE) the migration from Looker and Metabase to our new BI tool
  • Own company‑wide adoption of the new BI tool
  • Help recruit and onboard 2 Analytics Engineers
What we offer:
  • Flexible on-site and remote policy
  • Alan health insurance (fully covered by Spendesk)
  • Meal vouchers through Edenred (€6 per working day)
  • 100% reimbursement on public transportation subscription
  • Access to Moka.care for emotional and mental health wellbeing
  • 28 days of holidays
  • Latest Apple equipment
  • Great office snacks to fuel your day

Senior Data Solution Developer

At Bombardier, we design, build and maintain the world’s peak-performing aircraf...
Location:
Canada, Dorval
Salary:
Not provided
Bombardier
Expiration Date:
Until further notice
Requirements:
  • You hold a bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
  • You have 8+ years of experience in a Data Engineer / Data Solution Developer role
  • You have knowledge of Agile / SCRUM project delivery, DevOps and CI/CD practices related to data solutions
  • You have advanced knowledge of SQL, query authoring/optimization and relational databases
  • You have experience optimizing ‘big data’ pipelines (storage, file formats, partitioning, Spark, Python, streaming)
  • You are efficient at performing root cause analysis to address data pipeline issues and applying long-term fixes
  • You have experience designing and building data transformation, data structures, metadata frameworks, semantic layer and automated workload management
  • You have experience implementing data protection measures, understanding data privacy and collaborating with Cybersecurity teams
  • You have good knowledge of Azure data services (Azure Data Factory, Azure Data Lake Storage, Event Hub, Databricks, Lakehouse Medallion Architecture) and PowerBI
  • You have good knowledge of object-oriented and functional programming languages: Python, Java, C++, Scala, etc.
Job Responsibility:
  • Administer the enterprise data platforms (DWH, Data Lake, BI)
  • Create and maintain performance- and cost-optimized data pipelines, with high reliability, to meet business needs
  • Define and operate the infrastructure required for optimal extraction, loading and transformation (ELT) of data from a wide variety of data sources using SQL, API and Spark technologies
  • Design and implement improvements to life-cycle management processes (DevOps) enabling continuous integration, testing and deployment (CI/CT/CD) of data systems
  • Integrate data from various sources (including external data sources and IoT) and manage the big data as a key enterprise asset
  • Create and maintain backend data solutions for data analysts and data scientists. Assist them in unlocking insights from enterprise data
  • Identify, design, and implement internal processes and frameworks to improve the data platform (e.g. eliminating manual processes, optimizing data delivery, evolving data infrastructure capabilities, etc.)
  • Work with stakeholders including product, data and architecture SMEs to assist with data-related technical issues and support their data infrastructure needs
  • Ensure compliance with data architecture, data governance principles and security requirements
  • Implement and maintain the data platform’s semantic layer
What we offer:
  • Insurance plans (Dental, medical, life insurance, disability, and more)
  • Competitive base salary
  • Retirement savings plan
  • Employee Assistance Program
  • Tele Health Program