The role is responsible for developing and maintaining the data architecture across data analytics platforms and applications, and for liaising with the Architecture function on activities such as data flow design, data modelling, physical data design, and query performance optimization. The Data Modeler Architect develops business information models by studying the business, our data, and the industry, and creates data models that realize a connected data ecosystem empowering consumers. The Data Modeler Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data.
Job Responsibility:
Develop and maintain conceptual, logical, and physical data models to support business needs
Contribute to and enforce data standards, governance policies, and best practices
Design and manage metadata structures to enhance information retrieval and usability
Maintain comprehensive documentation of the architecture, including principles, standards, and models
Evaluate and recommend technologies and tools that best fit the solution requirements
Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency
Requirements:
Master’s degree in Computer Science, IT, or a related field with 4-6 years of experience
Bachelor’s degree in Computer Science, IT, or a related field with 6-8 years of experience
Diploma in Computer Science, IT, or a related field with 10-12 years of experience
Proficiency in creating conceptual, logical, and physical data models
Ability to interview and communicate with business subject matter experts to develop data models
Knowledge of metadata standards, taxonomies, and ontologies
Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing
Experience implementing data testing and data quality strategies
Nice to have:
Experience with graph technologies such as Stardog, AllegroGraph, and MarkLogic