CrawlJobs

Data Platform Engineer (DWH / Data Management)

Randstad

Location:
Japan, Tokyo (23 wards), Kanagawa

Contract Type:
Not provided

Salary:

6,000,000 - 9,000,000 JPY / Year

Job Description:

A global manufacturing company is seeking a Data Platform Engineer to lead the development and management of an enterprise data platform. This role focuses on building a scalable data environment and establishing strong data management practices to support company-wide decision-making.

Job Responsibility:

  • Design, build, and maintain a data warehouse (DWH) integrating multiple internal data sources
  • Develop and implement data management frameworks (data quality, data catalog, governance)
  • Establish data validation, testing, and standardization processes
  • Manage platform operations including monitoring, backup, and recovery
  • Conduct technical research and evaluate new data technologies
  • Define and promote data management standards and best practices across the organization

Requirements:

  • Experience in data platform / DWH / BI infrastructure development or operations
  • Experience with data management (data quality, governance, or cataloging)
  • Understanding of ETL/ELT processes and data integration
  • Knowledge of data architecture and data lifecycle management

What we offer:
  • Health insurance
  • Employees' pension insurance
  • Employment insurance
  • Saturdays off
  • Sundays off
  • Public holidays off
  • Bonus provided

Additional Information:

Job Posted:
May 03, 2026

Expiration:
March 18, 2027

Employment Type:
Full-time

Work Type:
On-site work


Similar Jobs for Data Platform Engineer (DWH / Data Management)

Senior Databricks Data Engineer

To develop, implement, and optimize complex Data Warehouse (DWH) and Data Lakeho...
Location:
Romania, Bucharest

Salary:
Not provided

Inetum

Expiration Date:
Until further notice
Requirements:
  • Proven, expert-level experience with the entire Databricks ecosystem (Workspace, Cluster Management, Notebooks, Databricks SQL)
  • In-depth knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced optimization techniques
  • Expertise in implementing and managing Delta Lake (ACID properties, Time Travel, Merge, Optimize, Vacuum)
  • Advanced/expert-level proficiency in Python (with PySpark) and/or Scala (with Spark)
  • Advanced/expert-level skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault)
  • Solid experience with a major Cloud platform (AWS, Azure, or GCP), especially with storage services (S3, ADLS Gen2, GCS) and networking.
Job Responsibility:
  • Design and implement robust, scalable, and high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform
  • Implement and optimize the Medallion architecture (Bronze, Silver, Gold) using Delta Lake to ensure data quality, consistency, and historical tracking
  • Implement the Lakehouse architecture efficiently on Databricks, combining best practices from DWH and Data Lake
  • Optimize Databricks clusters, Spark operations, and Delta tables to reduce latency and computational costs
  • Design and implement real-time/near-real-time data processing solutions using Spark Structured Streaming and Delta Live Tables
  • Implement and manage Unity Catalog for centralized data governance, data security and data lineage
  • Define and implement data quality standards and rules to maintain data integrity
  • Develop and manage complex workflows using Databricks Workflows or external tools to automate pipelines
  • Integrate Databricks pipelines into CI/CD processes
  • Work closely with Data Scientists, Analysts, and Architects to understand business requirements and deliver optimal technical solutions
What we offer:
  • Full access to foreign language learning platform
  • Personalized access to tech learning platforms
  • Tailored workshops and trainings to sustain your growth
  • Medical insurance
  • Meal tickets
  • Monthly budget to allocate on flexible benefit platform
  • Access to 7 Card services
  • Wellbeing activities and gatherings
  • Full-time

Lead Data Engineer

Our client, a leading global consulting firm specializing in AI, advanced analyt...
Location:
Saudi Arabia, Riyadh

Salary:
Not provided

Gitmax

Expiration Date:
Until further notice
Requirements:
  • Greenplum
  • Teradata
  • Advanced SQL proficiency
  • Enterprise ETL tool experience
  • Fluent in English (spoken and written)
  • 5+ years of experience in Data Warehousing (DWH) projects within fintech, IT consulting, or related industries
  • Proven experience managing a team of 5+ engineers in a technical leadership capacity
  • Strong understanding of financial and banking data management practices
Job Responsibility:
  • Lead the transition of a major credit bureau’s data platform from multiple scattered data warehouses to a unified Greenplum platform
  • Manage ETL process migration and implement CDC and streaming solutions
  • Develop a corporate reporting platform for a major bank
  • Guide and mentor engineering teams
  • Ensure best practices in data architecture, ETL development, and code review processes
  • Work closely with internal and external business stakeholders to define technical solutions
  • Enhance SQL queries, ETL pipelines, and data models for high performance and scalability
  • Troubleshoot data integration issues
  • Apply Agile methodologies to ensure smooth project execution and timely delivery
  • Full-time

Data Engineer II

MU-Data Engineers II located in Costa Mesa, CA will provide direct data engineer...
Location:
United States, Costa Mesa

Salary:
128,676.72 - 130,000.00 USD / Year

T-Mobile

Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree (or foreign equivalent) in computer science, any engineering, data science or closely related field
  • 2 years of relevant experience
  • Advanced level in SQL and Python for data engineering and backend development
  • Understanding of popular code development approaches: Test-driven development & Continuous Integration/Continuous Deployment (CI/CD)
  • Experienced working with AWS services (Lambda, S3, Glue, Redshift etc.) and Cloud Data Warehouse
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Define digital data collection methods and implementations to DWH and customer management platforms
  • Provide guidance on website tagging, digital data integration, and data collection
  • Experienced with QA testing process to verify data is collected & rendered as expected
  • Prior working knowledge of Braze or other Customer Management Platforms
Job Responsibility:
  • Work collaboratively with cross-functional teams to determine data transformation needs, establish data collection requirements, and deploy & maintain accurate data to ensure metrics & dimensions are addressed for analytics/reporting requirements
  • Participate in daily/weekly SCRUM ceremonies
  • Transform required data conditions into performance code logic
  • Develop high-performance code in SQL and Python
  • Develop, automate, and enhance ELT processes for continuous data flow
  • Onboard domain data as custom attributes and events to external systems such as Braze, Customer Journey
  • Maintain data daily updates and core logic integrity
  • Provide support to troubleshoot data anomalies and apply resolutions
  • Create, modify, and deploy code to production via AWS resources including Lambda, S3, Glue, Redshift, etc.
  • Partner with stakeholders to support the implementation and development of data requirements. Troubleshoot data anomalies and provide applicable solutions
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Paid time off
  • Up to 12 paid holidays
  • Paid parental and family leave
  • Full-time

Azure Data Engineer- Senior Consultant

Location:
Not provided

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice
Requirements:
  • A bachelor's or master's degree in Computer Science, Information Systems, or a related field is typically required. Additional cloud certifications are advantageous
  • 9+ years of experience in data engineering or a related field
  • Strong technical skills in data engineering, including proficiency in programming languages such as Python, SQL, and PySpark
  • Familiarity with the Azure cloud platform (Azure Databricks, Data Factory, Data Lake, etc.) and experience implementing data solutions in a cloud environment
  • Expertise in working with various data tools and technologies, such as ETL frameworks, data pipelines, and data warehousing solutions
  • In-depth knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
  • Knowledge of data security and privacy regulations, and the ability to ensure compliance within data engineering projects
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams, stakeholders, and senior management
  • Continuous learning mindset, staying updated with the latest advancements and trends in data engineering and related technologies
Job Responsibility:
  • Provide technical expertise and direction in data engineering, guiding the team in selecting appropriate tools, technologies, and methodologies
  • Stay updated with the latest advancements in data engineering and ensure the team follows best practices and industry standards
  • Collaborate with stakeholders to understand project requirements, define scope, and create project plans
  • Support project managers to ensure that projects are executed effectively, meeting timelines, budgets, and quality standards
  • Monitor progress, identify risks, and implement mitigation strategies
  • Oversee the design and architecture of data solutions, collaborating with data architects and other stakeholders
  • Ensure data solutions are scalable, efficient, and aligned with business requirements
  • Provide guidance in areas such as data modeling, database design, and data integration
  • Align coding standards, conduct code reviews to ensure proper code quality level
  • Identify and introduce quality assurance processes for data pipelines and workflows
What we offer:
  • Stable employment
  • 100% remote
  • Flexibility regarding working hours
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community
  • Full-time

Data Engineering Lead

The next step of your career starts here, where you can bring your own unique mi...
Location:
India, Bangalore

Salary:
Not provided

Metyis

Expiration Date:
Until further notice
Requirements:
  • 7-10 years of professional hands-on experience
  • A broad practice in multiple software engineering fields
  • Experience managing and leading data engineering / warehousing projects
  • Must have experience with Python, SQL, and distributed programming frameworks, preferably Spark
  • Experience working in the cloud; Azure is a plus
  • Experience setting up data lakes using the full Big Data and DWH ecosystem
  • Experience with data workflows and ETL; Apache Airflow is a plus
  • Comfortable with Unix-type operating systems, Bash, and Linux
  • Basic knowledge of web services and APIs
  • Basic knowledge of containers and Docker as a platform
Job Responsibility:
  • Independently lead & manage execution of data engineering projects
  • Engineer complete technical solutions to solve concrete business challenges in the areas of digital marketing, eCommerce, Business Intelligence and self-service analytics
  • Collect functional and non-functional requirements, consider technical environments, business constraints and enterprise organizations
  • Support our clients in executing their Big Data strategies by designing and building operational data platforms: ETL pipelines, data anonymization pipelines, data lakes, near real-time streaming data hubs, web services, training and scoring machine learning models
  • Troubleshoot and quality check work done by team members
  • Collaborate closely with partners, strategy consultants and data scientists in a flat and agile organization where personal initiative is highly valued
  • Share data engineering knowledge by giving technical trainings
  • Communicate and interact with clients at the executive level
  • Guide and mentor team members
What we offer:
  • Interact with C-level stakeholders at our clients on a regular basis to drive their business towards impactful change
  • Lead your team in creating new business solutions
  • Seize opportunities at the client and at Metyis in our entrepreneurial environment
  • Become part of a fast-growing international and diverse team

Data Engineering Lead

The next step of your career starts here, where you can bring your own unique mi...
Location:
India, Gurgaon

Salary:
Not provided

Metyis

Expiration Date:
Until further notice
Requirements:
  • 7-10 years of professional hands-on experience
  • A broad practice in multiple software engineering fields
  • Experience managing and leading data engineering / warehousing projects
  • Must have experience with Python, SQL, and distributed programming frameworks, preferably Spark
  • Experience working in the cloud; Azure is a plus
  • Experience setting up data lakes using the full Big Data and DWH ecosystem
  • Experience with data workflows and ETL; Apache Airflow is a plus
  • Comfortable with Unix-type operating systems, Bash, and Linux
  • Basic knowledge of web services and APIs
  • Basic knowledge of containers and Docker as a platform
Job Responsibility:
  • Independently lead & manage execution of data engineering projects
  • Engineer complete technical solutions to solve concrete business challenges in the areas of digital marketing, eCommerce, Business Intelligence and self-service analytics
  • Collect functional and non-functional requirements, consider technical environments, business constraints and enterprise organizations
  • Support our clients in executing their Big Data strategies by designing and building operational data platforms: ETL pipelines, data anonymization pipelines, data lakes, near real-time streaming data hubs, web services, training and scoring machine learning models
  • Troubleshoot and quality check work done by team members
  • Collaborate closely with partners, strategy consultants and data scientists in a flat and agile organization where personal initiative is highly valued
  • Share data engineering knowledge by giving technical trainings
  • Communicate and interact with clients at the executive level
  • Guide and mentor team members
What we offer:
  • Interact with C-level stakeholders at our clients on a regular basis to drive their business towards impactful change
  • Lead your team in creating new business solutions
  • Seize opportunities at the client and at Metyis in our entrepreneurial environment
  • Become part of a fast-growing international and diverse team

Senior Data Solution Developer

At Bombardier, we design, build and maintain the world’s peak-performing aircraf...
Location:
Canada, Dorval

Salary:
Not provided

Bombardier

Expiration Date:
Until further notice
Requirements:
  • You hold a bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
  • You have 8+ years of experience in a Data Engineer / Data Solution Developer role
  • You have knowledge of Agile / SCRUM project delivery, DevOps and CI/CD practices related to data solutions
  • You have advanced knowledge of SQL, query authoring/optimization and relational databases
  • You have experience optimizing ‘big data’ pipelines (storage, file formats, partitioning, Spark, Python, streaming)
  • You are efficient at performing root cause analysis to address data pipeline issues and applying long-term fixes
  • You have experience designing and building data transformation, data structures, metadata frameworks, semantic layer and automated workload management
  • You have experience implementing data protection measures, understanding data privacy and collaborating with Cybersecurity teams
  • You have good knowledge of Azure data services (Azure Data Factory, Azure Data Lake Storage, Event Hub, Databricks, Lakehouse Medallion Architecture) and PowerBI
  • You have good knowledge of object-oriented and functional programming languages: Python, Java, C++, Scala, etc.
Job Responsibility:
  • Administer the enterprise data platforms (DWH, Data Lake, BI)
  • Create and maintain performance- and cost-optimized data pipelines, with high reliability, to meet business needs
  • Define and operate the infrastructure required for optimal extraction, loading and transformation (ELT) of data from a wide variety of data sources using SQL, API and Spark technologies
  • Design and implement improvements to life-cycle management processes (DevOps) enabling continuous integration, testing and deployment (CI/CT/CD) of data systems
  • Integrate data from various sources (including external data sources and IoT) and manage the big data as a key enterprise asset
  • Create and maintain backend data solutions for data analysts and data scientists. Assist them in unlocking insights from enterprise data
  • Identify, design, and implement internal processes and frameworks to improve the data platform (e.g. eliminating manual processes, optimizing data delivery, evolving data infrastructure capabilities, etc.)
  • Work with stakeholders including product, data and architecture SMEs to assist with data-related technical issues and support their data infrastructure needs
  • Ensure compliance with data architecture, data governance principles and security requirements
  • Implement and maintain the data platform’s semantic layer
What we offer:
  • Insurance plans (Dental, medical, life insurance, disability, and more)
  • Competitive base salary
  • Retirement savings plan
  • Employee Assistance Program
  • Telehealth Program
  • Full-time

Senior Data Solution Developer

Location:
Canada, Administrative Centre

Salary:
Not provided

Bombardier

Expiration Date:
Until further notice
Requirements:
  • You hold a bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
  • You have 8+ years of experience in a Data Engineer / Data Solution Developer role
  • You have knowledge of Agile / SCRUM project delivery, DevOps and CI/CD practices related to data solutions
  • You have advanced knowledge of SQL, query authoring/optimization and relational databases
  • You have experience optimizing ‘big data’ pipelines (storage, file formats, partitioning, Spark, Python, streaming)
  • You are efficient at performing root cause analysis to address data pipeline issues and applying long-term fixes
  • You have experience designing and building data transformation, data structures, metadata frameworks, semantic layer and automated workload management
  • You have experience implementing data protection measures, understanding data privacy and collaborating with Cybersecurity teams
  • You have good knowledge of Azure data services (Azure Data Factory, Azure Data Lake Storage, Event Hub, Databricks, Lakehouse Medallion Architecture) and PowerBI
  • You have good knowledge of object-oriented and functional programming languages: Python, Java, C++, Scala, etc.
Job Responsibility:
  • Administer the enterprise data platforms (DWH, Data Lake, BI)
  • Create and maintain performance- and cost-optimized data pipelines, with high reliability, to meet business needs
  • Define and operate the infrastructure required for optimal extraction, loading and transformation (ELT) of data from a wide variety of data sources using SQL, API and Spark technologies
  • Design and implement improvements to life-cycle management processes (DevOps) enabling continuous integration, testing and deployment (CI/CT/CD) of data systems
  • Integrate data from various sources (including external data sources and IoT) and manage the big data as a key enterprise asset
  • Create and maintain backend data solutions for data analysts and data scientists
  • Assist data analysts and data scientists in unlocking insights from enterprise data
  • Identify, design, and implement internal processes and frameworks to improve the data platform (e.g. eliminating manual processes, optimizing data delivery, evolving data infrastructure capabilities, etc.)
  • Work with stakeholders including product, data and architecture SMEs to assist with data-related technical issues and support their data infrastructure needs
  • Ensure compliance with data architecture, data governance principles and security requirements
What we offer:
  • Insurance plans (Dental, medical, life insurance, disability, and more)
  • Competitive base salary
  • Retirement savings plan
  • Employee Assistance Program
  • Telehealth Program
  • Full-time