
Senior Data Migration Engineer

Cshark

Location:
Poland, Wrocław

Contract Type:
B2B

Salary:
135.00 - 170.00 PLN / Hour

Job Description:

As a Senior Data Migration Engineer, you will be a core execution member of the dedicated migration squad. You will work hands-on across the full migration lifecycle – from extraction and mapping through conversion, validation, and cutover – while also actively building the tooling and accelerators that make future migrations faster and safer (e.g., artifact observability, version-controlled mapping pipelines, automated validation checks, and reusable migration templates). This is not a maintenance role: you will shape the process, own the tooling, and directly influence delivery velocity.
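The "automated validation checks" mentioned above can take many forms; one common post-load check is record-level reconciliation between the source extract and the target load. The sketch below is purely illustrative – the class, method, and field names are assumptions, not part of this role – assuming records are available on both sides as id-to-field maps:

```java
import java.util.List;
import java.util.Map;
import java.util.zip.CRC32;

/**
 * Illustrative sketch of an automated post-load validation check:
 * reconcile source and target records by per-record checksum.
 */
public class MigrationValidator {

    /** CRC32 over a record's fields in sorted key order, so both sides hash identically. */
    public static long recordChecksum(Map<String, String> record) {
        CRC32 crc = new CRC32();
        record.entrySet().stream()
              .sorted(Map.Entry.comparingByKey())
              .forEach(e -> crc.update((e.getKey() + "=" + e.getValue() + ";").getBytes()));
        return crc.getValue();
    }

    /**
     * Returns ids whose record is missing from the target or whose checksum differs.
     * Ids present only in the target are not flagged in this simplified sketch.
     */
    public static List<String> findMismatches(Map<String, Map<String, String>> source,
                                              Map<String, Map<String, String>> target) {
        return source.keySet().stream()
                     .filter(id -> !target.containsKey(id)
                             || recordChecksum(source.get(id)) != recordChecksum(target.get(id)))
                     .sorted()
                     .toList();
    }
}
```

A real migration would run checks like this per table after each load batch and feed the mismatch list into re-run coordination.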

Job Responsibility:

  • Work hands-on across the full migration lifecycle – from extraction and mapping through conversion, validation, and cutover
  • Actively build the tooling and accelerators that make future migrations faster and safer (e.g., artifact observability, version-controlled mapping pipelines, automated validation checks, and reusable migration templates)
  • Shape the process, own the tooling, and directly influence delivery velocity
  • Mentor and technically lead parallel migration teams

Requirements:

  • Proven experience with large-scale data migration projects: ETL or systems integration on enterprise-scale projects, transformation pipelines, cutover planning
  • Experience working with XML-based data transformation (mapping files, XSLT, or equivalent config-driven ETL)
  • Proven ability to diagnose complex migration errors under time pressure: root-cause analysis, staging data fixes, re-run coordination
  • Experience coordinating with business stakeholders during UAT and cutover
  • Clear written and verbal communication in English
  • Experience with legacy database analysis: reverse-engineering undocumented or under-documented schemas, understanding data semantics and relationships without complete documentation, and dealing with corrupted or ambiguous data or metadata
  • Comfortable working with SQL at an advanced level: complex queries, schema analysis, data profiling, and diagnosing data quality issues
  • Strong proficiency in SQL across at least two of: PostgreSQL, Oracle, SQL Server, DB2
  • Solid Java skills (Java 21 ecosystem preferred)
  • Comfortable building and extending internal tooling – data pipelines, automation scripts, validation frameworks – with clean, testable code
  • Familiarity with Git workflows, CI/CD pipelines, and infrastructure-as-code practices
  • Good exposure to cloud environments, ideally AWS infrastructure and services
  • Docker: confident setup, troubleshooting, and local environment management
  • Rapid domain understanding: ability to quickly absorb unfamiliar, regulated business domains
  • Governance mindset: documenting decisions, maintaining audit trails, getting formal client sign-off
  • Experience working with subject matter experts to define and validate data transformations and mappings
  • Ability to manage multiple concurrent workstreams
  • Clear communicator: able to explain architectural and domain modelling decisions to both technical and non-technical stakeholders
  • Resilient and persistent under pressure: comfortable working with tight deadlines and high client expectations
  • Actively uses AI-assisted tools (e.g., ChatGPT, Claude, GitHub Copilot, or similar) as part of the daily development and problem-solving workflow
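As a concrete illustration of the XML-based transformation requirement above: the JDK ships a built-in XSLT 1.0 processor (`javax.xml.transform`), so a config-driven field mapping can be applied without extra dependencies. The stylesheet and element names below are hypothetical examples, not taken from any real mapping:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

/**
 * Illustrative sketch of an XSLT-driven field mapping:
 * rename a hypothetical legacy <cust_nm> element to a target <customerName>.
 */
public class XmlFieldMapper {

    private static final String MAPPING_XSLT = """
        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:output method="xml" omit-xml-declaration="yes"/>
          <!-- Map the legacy element name onto the target schema -->
          <xsl:template match="/record">
            <record><customerName><xsl:value-of select="cust_nm"/></customerName></record>
          </xsl:template>
        </xsl:stylesheet>""";

    /** Applies the mapping stylesheet to a source XML document. */
    public static String transform(String sourceXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(MAPPING_XSLT)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(sourceXml)), new StreamResult(out));
        return out.toString();
    }
}
```

In practice the stylesheet would live in version control alongside the rest of the mapping pipeline, which is what makes the transformation auditable and repeatable.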

Nice to have:

  • Exposure to Smalltalk or ObjectStudio
  • Familiarity with S3-compatible object storage, artifact lifecycle management, or observability tooling (dashboards, trend analysis)
  • Experience with cloud-native deployments on AWS
  • Understanding of multi-tenant SaaS architectures
  • Debt management, financial services, or accounts receivable domain experience is a significant advantage
  • German language skills (B2+)

What we offer:

  • 100% remote work
  • Flexible hours
  • International projects
  • Business English lessons
  • Participation in charity actions
  • In-house technology workshops

Additional Information:

Job Posted:
March 01, 2026

Employment Type:
Full-time
Work Type:
Remote work