Data Migration Junior Developer

Advisory Group TEST Human Resources

Location:
Poland, Kraków

Contract Type:
Not provided

Salary:

Not provided

Job Description:

As a Data Migration Junior Developer, you’ll play a critical role in ensuring a successful transition from our legacy HCM system to SAP SuccessFactors. Your work will directly support our HR-IT transformation program by enabling clean, validated, and structured data migration, the foundation of any reliable system go-live. You’ll work closely with business analysts, data owners, and technical teams to analyze source data, identify and address data quality issues, implement transformation logic, and execute the migration using ABAP and other tools. Your contribution will be essential not only to technical execution but also to building stakeholder trust in data integrity.

Job Responsibility:

  • Analyzing legacy HCM data structures and other required data sources, and then identifying data elements in scope for migration
  • Developing transformation logic to map source data to the SuccessFactors data model
  • Coding data extraction and transformation routines using ABAP
  • Preparing and maintaining migration templates and conversion scripts
  • Executing test migrations and supporting defect resolution during testing cycles
  • Supporting data cleansing analysis and enrichment in coordination with business stakeholders
  • Validating migrated data against source systems for completeness and accuracy
  • Collaborating with functional consultants and data owners to align on data requirements
  • Supporting cutover planning, final data loads, and go-live activities
  • Performing technical troubleshooting of data load issues, including field mapping errors, transformation logic mismatches, and system performance bottlenecks during migration cycles
  • Documenting all migration logic, field mappings, and technical procedures for audit and reuse
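The mapping and validation responsibilities above can be sketched in Python (which the requirements list as a prerequisite). This is only an illustrative sketch: the field names (`PERNR`, `VORNA`, `NACHN`, `BEGDA`), the date format, and the target field names are hypothetical assumptions, not the project's actual SuccessFactors mapping.

```python
# Hypothetical sketch of field-level mapping, transformation, and validation.
# All field names and formats below are illustrative assumptions.
from datetime import datetime

# Mapping: legacy field -> (target field, transformation function)
FIELD_MAP = {
    "PERNR": ("userId", str.strip),
    "VORNA": ("firstName", str.title),
    "NACHN": ("lastName", str.title),
    "BEGDA": ("hireDate", lambda v: datetime.strptime(v, "%Y%m%d").date().isoformat()),
}

def transform(legacy_row: dict) -> dict:
    """Apply the field-level mapping to one legacy record."""
    return {target: fn(legacy_row[source]) for source, (target, fn) in FIELD_MAP.items()}

def validate(source_rows: list[dict], migrated_rows: list[dict]) -> list[str]:
    """Check completeness (row counts) and accuracy (every source employee arrived)."""
    issues = []
    if len(source_rows) != len(migrated_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(migrated_rows)}")
    migrated_ids = {r["userId"] for r in migrated_rows}
    for row in source_rows:
        if row["PERNR"].strip() not in migrated_ids:
            issues.append(f"missing employee {row['PERNR']}")
    return issues

legacy = [{"PERNR": "00001234 ", "VORNA": "ANNA", "NACHN": "KOWALSKA", "BEGDA": "20190401"}]
migrated = [transform(r) for r in legacy]
print(migrated[0]["hireDate"])   # → 2019-04-01
print(validate(legacy, migrated))  # → []
```

In the role itself this logic would live in ABAP extraction and transformation routines and in migration templates; the sketch only illustrates the mapping-plus-validation pattern those routines follow.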

Requirements:

  • Master’s or Bachelor’s degree
  • Min. 3 years of experience in data analysis using Excel (Power Query, Pivot Tables)
  • Basic experience with Python and/or Java
  • Willingness to work with and learn ABAP
  • Understanding of data mapping, transformation, and validation principles
  • Strong analytical thinking and ability to identify inconsistencies in large data sets
  • Ability to analyze and document data requirements and field-level mappings
  • Very good English skills
  • Independent and systematic approach to work
  • Clear and effective communication with technical and non-technical stakeholders
  • Team player
  • Flexibility and adaptability

Nice to have:

Knowledge of SAP HCM and/or SuccessFactors data structures and infotypes

What we offer:
  • Attractive development opportunities
  • Private medical care for employees and their families
  • Attractive flight-ticket discounts that put faraway journeys within reach
  • Flexibility to adjust your place and hours of work to personal needs
  • Hybrid working model (work in the office one day per week)
  • Option to work remotely from anywhere in Poland
  • Flexible working time
  • New, modern branded office: 200 adjustable desks, several types of meeting rooms, bike amenities, lounge and chill-out spaces, cozy kitchens, and excellent public transport within walking distance
  • Continuous improvement environment
  • Supportive and friendly working atmosphere: we work toward joint success

Additional Information:

Job Posted:
January 20, 2026

Work Type:
Hybrid work

Similar Jobs for Data Migration Junior Developer


Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Efficient in ETL/ELT development using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Communicate proactively with stakeholders and mentor/guide junior resources
Work Type: Fulltime

Data Engineer

We are looking for a Data Engineer with a collaborative, “can-do” attitude who i...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred
  • 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 3+ years of experience with setting up and operating data pipelines using Python or SQL
  • 3+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 3+ years of strong and extensive hands-on experience in Azure, preferably data heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 3+ years of experience in defining and enabling data quality standards for auditing, and monitoring
  • Strong analytical abilities and a strong intellectual curiosity
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
  • Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, Data lakes among other structured and unstructured storage options
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
  • Efficient in ELT/ETL development using Azure cloud services and Snowflake, including testing and operational support (RCA, monitoring, maintenance)
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
  • Build cross-platform data strategy to aggregate multiple sources and process development datasets
Work Type: Fulltime

Senior Data Engineer

We are looking for a Senior Data Engineer with a collaborative, “can-do” attitud...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s Degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
  • Strong analytical abilities and a strong intellectual curiosity
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
  • Demonstrate deep technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
  • Efficient in ETL/ELT development using Azure cloud services and Snowflake, including testing and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
  • Build cross-platform data strategy to aggregate multiple sources and process development datasets
Work Type: Fulltime

Senior Power Platform Solution Architect

The Senior Power Platform Solution Architect / SharePoint Modernization Lead wil...
Location:
United States, Arlington
Salary:
140000.00 - 155000.00 USD / Year
Trilogy Federal
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, Software Engineering, or a related field
  • Minimum 8 years of progressive experience in designing and developing enterprise SharePoint solutions (including SharePoint Online and Office 365)
  • Experience architecting and building custom business applications with Microsoft Power Apps and Power Automate
  • Experience in executing complex data migrations into Microsoft Dataverse, involving design, extraction, transformation, and load, as well as validation
  • Demonstrated mastery of: SharePoint Online (document management, site design, list/library customization, permissions, and REST API integration); Power Apps (Canvas and Model-Driven), Power Automate, and Dataverse schema design; Power Fx, JSON, JavaScript, and HTML5; custom connector development, use of Microsoft Graph API, and Azure integration (Azure Logic Apps, Azure Functions)
  • Exceptional written and verbal communication skills, with a proven ability to explain technical solutions to both highly technical and non-technical audiences
  • Ability to obtain a Public Trust Clearance
Job Responsibility:
  • Architect, design, and deliver integrated solutions using SharePoint Online, Power Platform (Power Apps – Canvas & Model-Driven Apps, Power Automate, Power BI, Dataverse), and Azure Services (Logic Apps, Functions, Key Vault)
  • Lead modernization and automation strategies ensuring scalable, secure, and compliant solutions in alignment with VA and federal standards
  • Direct multidisciplinary teams and guide technical decision-making to bridge business objectives to successful delivery
  • Serve as the principal technical advisor for the team regarding Power Platform capabilities and limitations, and enterprise integration patterns
  • Establish technical best practices and mentor junior developers and power users
  • Facilitate discovery sessions and requirements workshops with business users, process owners, and VA leadership to elicit and document functional, non-functional, and security-related requirements
  • Translate stakeholder requirements into clear technical specifications, process models, and data schemas
  • Develop advanced, data-driven Power Apps and Power Automate flows to automate intake, triage, review, and approval workflows with complex logic and role-based routing
  • Build and maintain custom connectors, reusable controls, and components using Power Fx and available APIs
  • Design and implement integrations between Power Platform, SharePoint, Microsoft Teams, Outlook, and external systems via Dataverse, APIs, and connectors
What we offer:
  • Health, dental, and vision plans
  • Optional FSA
  • Paid parental leave
  • Safe Harbor 401(k) with employer contributions 100% vested from day 1
  • Paid time off and 11 paid holidays
  • No cost group term life/AD&D plan, and optional supplemental coverage
  • Pet insurance
  • Monthly phone and internet stipend
  • Tuition and training reimbursement
Work Type: Fulltime

Senior Software Engineer

The Senior Software Development Engineer specializing in ERP preferably in Workd...
Location:
United States, Englewood
Salary:
49.78 - 74.05 USD / Hour
American Nursing Care
Expiration Date:
Until further notice
Requirements:
  • Bachelor of Science in Computer Science, or equivalent knowledge and skills obtained through a combination of education, training, and experience in a senior-level healthcare environment
  • 5+ years of experience working in IT as an ERP Software Engineer or equivalent software development role
  • 5+ years of Workday integration development (Studio, EIB, RaaS, APIs, Workday Extend)
  • 1-3 years of experience with, and a strong understanding of, HCM, Payroll, Finance, and Supply Chain modules and related data models
Job Responsibility:
  • Business Process Analysis & Optimization: Analyze existing business processes and workflows to identify opportunities for improvement and automation. Develop detailed technical specifications and solution designs to implement these improvements
  • Agile Development & Collaboration: Lead and actively participate in agile ceremonies (sprint planning, daily stand-up, sprint review, retrospective). Collaborate effectively with business analysts, scrum masters, QA analysts, product owners, and other cross-functional teams to define and deliver impactful projects
  • Software Development & Deployment: lead development of ERP/Workday integrations using Workday Studio, EIB, Cloud Connectors (PECI, etc.), RaaS, APIs including WQL, and Workday Extend. Build and maintain secure, scalable integrations with downstream and upstream systems (Payroll, HR, Finance, Supply Chain, Identity Management, Benefits vendors, etc.). Develop calculated fields, condition rules, business processes, and custom reports to meet complex business needs
  • Data Conversion & Migration: collaborate with the Data Conversion team, leveraging GCP BigQuery for centralized transformation of HR, Payroll, Finance, and Supply Chain data. Build automated scripts and validation tools to ensure data accuracy during iterative conversion cycles. Partner with SMEs to reconcile data from multiple source ERPs and ensure high-quality migration into Workday
  • Technical Leadership & Governance: Document and demonstrate solutions through clear and concise documentation, flowcharts, layouts, diagrams, charts, code comments, and code. Communicate technical concepts effectively to both technical and non-technical stakeholders. Work closely with enterprise architects to align with integration standards, security models, and data governance policies. Support vendor engagement by providing technical guidance and reviewing third-party integrations to ensure compliance with our OneERP technical standards. Participate in technical design reviews, ensuring solutions are scalable and aligned with future-state architecture
  • Testing & Quality Assurance: Conduct thorough testing of solutions to ensure accuracy, reliability, and scalability. Debug and resolve issues that arise during testing or production. Partner with testing teams to design and execute Workday performance testing plans for high-volume payroll, recruiting, and finance transactions. Identify and remediate integration bottlenecks to ensure reliability and scalability across markets
  • Security & Compliance: Implement security measures to protect sensitive data and ensure all implementations comply with organizational policies, industry regulations, and security standards
  • Mentorship & Training: Provide guidance and mentorship to junior developers or team members. Conduct training sessions to share best practices and knowledge on used tools and techniques. Help build a Workday engineering playbook to standardize integration development, testing, and deployment
  • Production Support & On-Call: Support deployments, troubleshoot production issues, and participate in on-call rotations as needed
What we offer:
  • Medical, prescription drug, dental, and vision plans
  • Life insurance
  • Paid time off (full-time benefit-eligible team members may receive a minimum of 14 paid days off annually, including holidays)
  • Tuition reimbursement
  • Retirement plan benefits including, but not limited to, 401(k), 403(b), and other defined benefit offerings
Work Type: Fulltime

Salesforce Specialist

You are just a few clicks away from becoming a Salesforce Specialist at a dynami...
Location:
Luxembourg, Bertrange
Salary:
Not provided
CAP4 LAB
Expiration Date:
Until further notice
Requirements:
  • Excellent level of English and French (C2 level)
  • Bachelor's degree in computer science or information technology (Master's and/or PhD an advantage)
  • Relevant experience (5+ years) as a developer/architect on CRM products
  • Skills in object-oriented programming languages (Java, Apex)
  • Relevant experience with CRM specific data models (objects and relationships) as well as good knowledge of database design principles (such as normalization and indexing)
  • Knowledge of Salesforce APIs, such as REST, SOAP and Bulk API
  • Understanding of different integration models and middleware tools for creating robust and scalable integrations
  • Strong troubleshooting skills to resolve issues that arise during development and deployment
  • Relevant experience migrating data from other systems into CRM and designing interfaces for synchronization of data between IT systems
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment
Job Responsibility:
  • Develop customized solutions on the Salesforce platform to meet business and operational requirements
  • Collaborate with stakeholders to gather and document functional needs, translating them into clear technical specifications
  • Design and build custom applications, workflows, and process automations using Apex, Visualforce, and Lightning Components
  • Define and implement the technical architecture for complex Salesforce solutions, ensuring scalability, performance, and security
  • Design and maintain integrations between Salesforce and external systems (e.g., ERP, marketing platforms, support tools) using REST, SOAP, or middleware solutions
  • Ensure data integrity, security, and compliance within the Salesforce environment, including user access and sharing settings
  • Design and refine data models to support extended business needs while optimizing for performance and scalability
  • Identify opportunities for improvement and propose effective, sustainable solutions aligned with organizational goals
  • Perform unit testing, system testing, and bug resolution to ensure robust and reliable deliverables
  • Provide strategic vision and technical guidance across Salesforce projects, contributing to platform governance and best practices
What we offer:
  • Permanent contract with an attractive package based on your current and future potential, with flexibility in your work
  • Training for employees
  • Pleasant working environment; headquarters certified as a great place to work
  • Company with a strong CSR impact: supports several associations, with specific commitments in the sports and charitable sectors
Work Type: Fulltime

Senior Software Engineer, Backend

As a Senior Software Engineer, Backend specializing in database architecture and...
Location:
United States, San Francisco
Salary:
150000.00 - 240000.00 USD / Year
Chef Robotics
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
  • 7+ years of professional experience in backend development roles with demonstrated leadership experience
  • Expert knowledge of relational databases (MySQL, PostgreSQL) including schema design, optimization, and administration
  • Strong proficiency with Python and JavaScript/TypeScript with advanced software engineering skills
  • Extensive experience leading projects with at least two web frameworks: Flask, FastAPI, Django, Node.js, or Next.js
  • Proven experience designing and implementing RESTful and GraphQL APIs at scale
  • Advanced understanding of containerization (Docker) and orchestration (Kubernetes) technologies
  • Experience with cloud infrastructure and deployment (AWS, GCP, or Azure) in production environments
  • Proven experience leading complex backend projects and mentoring junior engineers
  • Understanding of data requirements for robotics or automation systems
Job Responsibility:
  • Lead the design, implementation, and optimization of database schemas to support robot operations, telemetry, recipe management, and system analytics
  • Develop robust data migration strategies and version control for database schema evolution
  • Implement efficient query optimization and indexing strategies to support high-throughput robot operations
  • Establish data integrity protocols and backup systems to ensure operational continuity across customer deployments
  • Create scalable data access layers that balance security, performance, and maintainability
  • Mentor team members on database design patterns and optimization techniques
  • Lead the development and maintenance of scalable APIs to serve robot control systems, dashboards, and monitoring tools
  • Design and implement secure authentication and authorization mechanisms across backend services
  • Develop robust middleware for processing and validating data between robotics subsystems
  • Create service interfaces that enable efficient communication between robotics components and cloud services
What we offer:
  • Medical, dental, and vision insurance
  • Commuter benefits
  • Flexible paid time off (PTO)
  • Catered lunch
  • 401(k) matching
  • Early-stage equity
Work Type: Fulltime