The Data Architect role at NTT DATA requires a hands-on professional with 8–12 years of IT experience, focusing on AWS data platforms and financial data modeling. The candidate will design scalable data architectures, manage data models, and collaborate with stakeholders. Strong programming skills in Python and SQL, along with experience in AWS services, are essential.
Job Responsibilities:
Design, implement, and maintain end-to-end data architectures on AWS, supporting analytical, reporting, and modeling workloads
Define and manage data models, schemas, and transformations for structured and semi-structured financial data
Architect and optimize data ingestion, integration, and processing pipelines using AWS Glue, S3, Athena, and Snowflake
Enable analytics and modeling use cases (segmentation, classification, time-series analysis) through well-curated, high-quality datasets
Collaborate with data scientists, analysts, and business stakeholders to ensure data platforms align with mortgage and financial domain requirements
Implement data governance, version control, and CI/CD best practices, ensuring reliability and auditability in a corporate environment
Ensure data solutions are scalable, secure, and cost-efficient, following AWS best practices
Support data consumption via visualization and downstream applications, including integration with Amazon QuickSight and web-based interfaces where required
Requirements:
8–12 years of overall IT experience
5–8 years of relevant experience in Data Architecture or Senior Data Engineering roles within AWS environments
4+ years of hands-on experience with AWS data services
Proven experience designing and operating data architectures using Amazon S3, AWS Glue, Athena, Amazon SageMaker Studio, and Snowflake
Strong programming skills in Python, PySpark, and SQL
Experience integrating and transforming data from multiple data lakes and data warehouses
Solid understanding of financial data modeling, preferably within mortgage or lending use cases
Familiarity with segmentation, classification, and time-series data structures
Proven experience with code management and version control in enterprise environments (GitLab)
Ability to translate complex financial data into actionable, well-structured datasets for analytics and reporting
Nice to have:
Experience with Amazon QuickSight for data visualization and reporting
Exposure to React and JavaScript for data-driven application integration
Experience working in regulated financial services environments