Senior Data Engineer role within Data Platform & Strategy at AXA Life Insurance Japan. The role involves technical development (80%), research & innovation (10%), and technical leadership (10%).
Job Responsibilities:
Design and implement scalable, production-grade data platform infrastructure components using AWS, Databricks, and modern data stack architecture patterns
Lead the development of critical data pipeline components and ensure they meet performance, security, reliability, and segregation of duty requirements
Architect and implement complex data transformation workflows that handle large-scale data processing efficiently
Implement advanced monitoring and observability solutions for the entire Data Platform
Develop and maintain Infrastructure as Code to ensure reproducibility and standardization
Conduct research on emerging technologies and architectural patterns in the data engineering space that can help solve pain points related to performance, maintainability, efficiency, and cost
Evaluate and prototype new tools and frameworks that could enhance platform capabilities
Present technical proposals and PoCs to Data Platform Engineering leadership
Author technical documentation and design proposals for Data Platform improvements and to support smooth onboarding across the team
Provide technical mentorship to team members and review critical code contributions
Collaborate with stakeholders to align technical solutions with delivery requirements
Lead technical design discussions and architecture review sessions
Share knowledge through documentation, workshops, and technical presentations
Requirements:
Master’s degree in Computer Science, Data Engineering, or a related technical field
10+ years of experience in data engineering and software development
Deep expertise in Cloud Platforms (AWS preferred) and data processing frameworks (Spark preferred)
Expert-level knowledge of Python (preferred), Scala, or Java
Extensive experience with modern SQL warehouses (Databricks preferred; Snowflake, BigQuery), columnar formats (Parquet/Delta preferred, ORC), and modern data architecture (Lakehouse preferred)
Experience with dbt (data build tool) and data modeling methodologies
Expert-level experience with Infrastructure as Code (IaC) practices and tools (Terraform and AWS CDK preferred)
Strong background in distributed systems and high-performance computing
Proven track record of leading complex technical initiatives
Excellent communication skills, with the ability to explain complex technical concepts to varied audiences
Strong problem-solving abilities and analytical thinking
Demonstrated leadership in technical decision-making
Self-motivated, with the ability to work autonomously
Proven ability to mentor and guide technical team members
Results-oriented, with strong project management skills