Lead Data Architect - Common Data Platform

Alleima Precision Tube

Location:
Sweden, Stockholm

Contract Type:
Not provided

Salary:
Not provided

Job Description:

At Alleima, data is a strategic asset and a key enabler for business performance, digitalization, and future innovation. We are building an Alleima Common Data Platform on Microsoft Fabric that will become a central foundation for trusted, scalable, and business-driven data across the company. When fully implemented, the platform will support analytics, reporting, and decision-making across key domains including Finance, Supply Chain, Production, Quality, and Sustainability.

This is an opportunity to take a central role in defining and evolving a platform that connects business needs, architecture principles, and hands-on technical realization. As our Lead Data Architect, you will help shape how Alleima works with modern data architecture, reusable data products, and scalable analytics capabilities.

Job Responsibility:

  • Own the end-to-end architecture for the Common Data Platform, covering source system integration, ingestion, transformation, storage, semantic modeling, visualization, and analytics
  • Lead the design and implementation of scalable data solutions in Microsoft Fabric, including lakehouse patterns, medallion architecture, notebooks, data pipelines, and semantic models
  • Define and maintain target architecture, reference patterns, and engineering standards for Fabric-based solutions
  • Lead the design, implementation, and optimization of ETL/ELT pipelines using SQL, Python/PySpark, notebooks, and related data engineering capabilities
  • Define reusable and governed information models that enable cross-functional analytics across Finance, Supply Chain, Production, Quality, Sustainability, and related domains
  • Drive CI/CD, Git-based ways of working, and deployment practices for Fabric assets, databases, and analytics solutions
  • Ensure the platform is secure, standardized, reusable, and aligned with enterprise architecture principles, access management, and regulatory requirements
  • Lead and coach data engineers, BI developers, and solution teams in best practices for architecture, engineering, governance, code quality, deployment, and platform usage
  • Support the establishment and implementation of data governance, including roles, responsibilities, metadata, lineage, and documentation standards
  • Work closely with business and IT stakeholders to translate needs into sustainable architectural solutions and balanced roadmap decisions

Requirements:

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, Software Engineering, or a related field
  • Extensive experience in data platform, analytics, data engineering, and architecture roles
  • Strong experience with Microsoft Fabric, Power BI and semantic models, Azure data platform services, and modern data engineering practices using SQL, Python, and PySpark
  • Familiarity with ETL/ELT design, orchestration, CI/CD, Git-based development, integration patterns, and secure enterprise data platform design
  • Understanding of enterprise business processes and analytical information needs across domains such as Finance, Supply Chain, Production, Quality, or Sustainability
  • Fluent in English
  • Based in Sweden

Nice to have:

  • Experience with Databricks, Purview, Entra ID, Dataverse/D365 integrations, and Azure integration or cloud architecture patterns
  • Swedish language skills

Additional Information:

Job Posted:
April 10, 2026

Expiration:
April 12, 2026


Similar Jobs for Lead Data Architect - Common Data Platform

Cloud Technical Architect / Data DevOps Engineer

The role involves designing, implementing, and optimizing scalable Big Data and ...
Location:
United Kingdom, Bristol
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements:
  • An organised and methodical approach
  • Excellent time keeping and task prioritisation skills
  • An ability to provide clear and concise updates
  • An ability to convey technical concepts to all levels of audience
  • Data engineering skills – ETL/ELT
  • Technical implementation skills – application of industry best practices & design patterns
  • Technical advisory skills – experience in researching technological products / services with the intent to provide advice on system improvements
  • Experience of working in hybrid environments with both classical and DevOps ways of working
  • Excellent written & spoken English skills
  • Excellent knowledge of Linux operating system administration and implementation
Job Responsibility:
  • Detailed development and implementation of scalable clustered Big Data solutions, with a specific focus on automated dynamic scaling, self-healing systems
  • Participating in the full lifecycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between
  • Providing technical thought-leadership and advisory on technologies and processes at the core of the data domain, as well as data domain adjacent technologies
  • Engaging and collaborating with both internal and external teams and be a confident participant as well as a leader
  • Assisting with solution improvement activities driven either by the project or service
  • Support the design and development of new capabilities, preparing solution options, investigating technology, designing and running proof of concepts, providing assessments, advice and solution options, providing high level and low level design documentation
  • Cloud Engineering capability to leverage Public Cloud platform using automated build processes deployed using Infrastructure as Code
  • Provide technical challenge and assurance throughout development and delivery of work
  • Develop reusable common solutions and patterns to reduce development lead times, improve commonality, and lower Total Cost of Ownership
  • Work independently and/or within a team using a DevOps way of working
What we offer:
  • Extensive social benefits
  • Flexible working hours
  • Competitive salary
  • Shared values
  • Equal opportunities
  • Work-life balance
  • Evolving career opportunities
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • Fulltime

Senior Data Platform Architect

We are seeking a Senior Data Platform Architect to serve as a lead architect and...
Location:
United States, McLean
Salary:
Not provided
Clark Construction Company
Expiration Date
Until further notice
Requirements:
  • 7–10+ years in data engineering, enterprise integration, or platform architecture
  • Hands-on experience working in a hybrid environment that leverages Snowflake or Databricks for its powerful cloud data warehousing, AWS Infrastructure, Microsoft Fabric (OneLake) for unified data mesh capabilities, and Talend, Matillion, Mulesoft (or similar ETL/ELT tools) for complex enterprise integration
  • You are equally comfortable discussing API integration patterns as you are writing complex scripts (e.g., Python, Lambda functions, and SQL)
  • Deep, hands-on expertise in Data Lakehouse technologies like Snowflake, Databricks and ELT/ETL technologies such as Mulesoft, Talend, Matillion, Informatica Cloud services is essential
  • You have a proven track record of solving scaled problems and building platforms that support 1,000+ users
Job Responsibility:
  • Technical Leadership: Serve as the senior technical lead for the platform, establishing the standards for how the platform is architected and managed, and how data is moved, stored, and secured
  • Foundational Data Modeling: Design the "Unified Core" of our data, creating the logic and common connectors that allow data from various sources to be combined into a single, reliable source of truth for enterprise usage
  • Complex Data Integration: Architect and maintain mission-critical data flows between disparate enterprise systems, and democratize data consumption to enable self-serve analytics
  • Cross-platform Governance: Implement cross-platform governance and security, ensuring seamless data movement and "Zero-Copy" sharing across data storage and Data Lakehouse - For example - Connectivity between Snowflake Iceberg tables and Microsoft OneLake, integration of SAP Business Data Cloud with Snowflake and Databricks
  • Platform Modernization: You will develop and maintain long-term roadmaps and drive healthy operating stats for data platforms for security, performance and availability. Advise and execute on continuous innovation that will include feature augmentation and selection of new technologies
  • Data Governance and Catalog: Establish and scale corporate data catalog, ensuring 100% metadata coverage and establishing clear data stewardship roles to improve enterprise data discoverability and trust
  • System Stability & Scalability: Lead high-priority infrastructure initiatives and upgrades, such as Snowflake upgrades and performance tuning to ensure a resilient, 24/7 production environment
  • Collaboration: Partner with business stakeholders to ensure the underlying data architecture supports advanced visualization and machine learning deployments
  • Fulltime

Lead Data Solution Architect

Lead data architects at Thoughtworks play a key part in developing modern data a...
Location:
Singapore, Singapore
Salary:
Not provided
Thoughtworks
Expiration Date
Until further notice
Requirements:
  • Experience in defining and implementing different types of data architecture, analyzing trade-offs and can define technology stacks for different types of data architecture
  • Exposure to designing application system architecture based on big data, artificial intelligence and related technologies
  • Experience in building, maintaining and tuning data platforms, as well as experience in data warehouse design, data modeling, data monitoring and maintenance
  • Experience with common design patterns, application frameworks and foundational/theoretical knowledge (i.e.: distributed systems, data intensive applications, etc.)
  • Proficient in common open-source distributed computing/storage technologies, including but not limited to YARN, Impala, Spark, MapReduce, Kafka and Flink, with practical project architecture experience
  • Good understanding of business and communication, collaboration skills, strong learning and summarizing abilities
  • Exposure to defining, developing and enabling data-driven techniques, advanced analytics, ML/AI and data mining applications in enterprise. This includes technologies such as LLMs, VLMs, vector & graph databases and/or agentic architectures
  • Exposure to developing real-time and low-latency data streaming solutions
  • Exposure to productionizing machine learning models and applying techniques, tools and processes
  • Passionate about data infrastructure and operations, with expertise working in cloud environments
Job Responsibility:
  • Navigate, with support, a data project's architectural concerns, enabling delivery teams to deliver on accepted standards within time and budget
  • Provide client-facing technical leadership and guidance on topics related to data architecture, engineering, and analytics to advise clients on bringing their data strategy to life
  • Interact with client counterparts from the enterprise architecture group to deliver, share, align and sign-off on key architectural decisions, trade-offs and ways of working
  • Communicate both high-level and low-level technical details of data architecture to engineers and business stakeholders
  • Understand and collaborate at the intersection of analytical and operational architectures
  • Lead, with support, the technical design of data governance, data security and data privacy to fulfill compliance requirements
  • Lead, with support, the incorporation of data quality frameworks and processes to address and fulfill requirements as set out in strategy and acceptance criteria
  • Collaborate with sales and pre-sales to clarify requirements and design viable solutions
What we offer:
  • Learning & Development
  • Interactive tools
  • Numerous development programs
  • Teammates who want to help you grow
  • Fulltime

Staff Software Engineer, Data Infrastructure

At Docker, we make app development easier so developers can focus on what matter...
Location:
United States, Seattle
Salary:
195400.00 - 275550.00 USD / Year
Docker
Expiration Date
Until further notice
Requirements:
  • 8+ years of software engineering experience with 3+ years focused on data engineering and analytics systems
  • Expert-level experience with Snowflake including advanced SQL, performance optimization, and cost management
  • Deep proficiency in DBT for data modeling, transformation, and testing with experience in large-scale implementations
  • Strong expertise with Apache Airflow for complex workflow orchestration and pipeline management
  • Hands-on experience with Sigma or similar modern BI platforms for self-service analytics
  • Extensive AWS experience including data services (S3, Redshift, EMR, Glue, Lambda, Kinesis) and infrastructure management
  • Proficiency in Python, SQL, and other programming languages commonly used in data engineering
  • Experience with infrastructure-as-code, CI/CD practices, and modern DevOps tools
  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience
  • Proven track record designing and implementing large-scale distributed data systems
Job Responsibility:
  • Define and drive the technical strategy for Docker's data platform architecture, establishing long-term vision for scalable data systems
  • Lead design and implementation of highly scalable data infrastructure leveraging Snowflake, AWS, Airflow, DBT, and Sigma
  • Architect end-to-end data pipelines supporting real-time and batch analytics across Docker's product ecosystem
  • Drive technical decision-making around data platform technologies, architectural patterns, and engineering best practices
  • Establish technical standards for data quality, testing, monitoring, and operational excellence
  • Design and build robust, scalable data systems that process petabytes of data and support millions of user interactions
  • Implement complex data transformations and modeling using DBT for analytics and business intelligence use cases
  • Develop and maintain sophisticated data orchestration workflows using Apache Airflow
  • Optimize Snowflake performance and cost efficiency while ensuring reliability and scalability
  • Build data APIs and services that enable self-service analytics and integration with downstream systems
What we offer:
  • Freedom & flexibility: fit your work around your life
  • Designated quarterly Whaleness Days plus end of year Whaleness break
  • Home office setup: we want you comfortable while you work
  • 16 weeks of paid Parental leave
  • Technology stipend equivalent to $100 net/month
  • PTO plan that encourages you to take time to do the things you enjoy
  • Training stipend for conferences, courses and classes
  • Equity
  • Fulltime

Senior Data Engineer

At Rearc, we're committed to empowering engineers to build awesome products and ...
Location:
India, Bangalore
Salary:
Not provided
Rearc
Expiration Date
Until further notice
Requirements:
  • 8+ years of experience in data engineering, showcasing expertise in diverse architectures, technology stacks, and use cases
  • Strong expertise in designing and implementing data warehouse and data lake architectures, particularly in AWS environments
  • Extensive experience with Python for data engineering tasks, including familiarity with libraries and frameworks commonly used in Python-based data engineering workflows
  • Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, DBT or AWS Glue
  • Hands-on experience with data analysis tools and libraries like PySpark, NumPy, Pandas, or Dask
  • Proficiency with Spark and Databricks is highly desirable
  • Experience with SQL and NoSQL databases, including PostgreSQL, Amazon Redshift, Delta Lake, Iceberg and DynamoDB
  • In-depth knowledge of data architecture principles and best practices, especially in cloud environments
  • Proven experience with AWS services, including expertise in using AWS CLI, SDK, and Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK
  • Exceptional communication skills, capable of clearly articulating complex technical concepts to both technical and non-technical stakeholders
Job Responsibility:
  • Strategic Data Engineering Leadership: Provide strategic vision and technical leadership in data engineering, guiding the development and execution of advanced data strategies that align with business objectives
  • Architect Data Solutions: Design and architect complex data pipelines and scalable architectures, leveraging advanced tools and frameworks (e.g., Apache Kafka, Kubernetes) to ensure optimal performance and reliability
  • Drive Innovation: Lead the exploration and adoption of new technologies and methodologies in data engineering, driving innovation and continuous improvement across data processes
  • Technical Expertise: Apply deep expertise in ETL processes, data modelling, and data warehousing to optimize data workflows and ensure data integrity and quality
  • Collaboration and Mentorship: Collaborate closely with cross-functional teams to understand requirements and deliver impactful data solutions—mentor and coach junior team members, fostering their growth and development in data engineering practices
  • Thought Leadership: Contribute to thought leadership in the data engineering domain through technical articles, conference presentations, and participation in industry forums

Enterprise Solutions Architect

The individual in this role will require a combination of deep technical experti...
Location:
United States, Trenton
Salary:
Not provided
Dash Technologies
Expiration Date
Until further notice
Requirements:
  • Data Governance
  • Middleware
  • Master Data Management
  • Data Warehousing
  • Platforms
  • Ability to make informed decisions under uncertainty, weigh multiple factors/constraints, consider long-term implications of architectural choices
  • Knowledge of electronic data processing platforms and software
  • Knowledge of database/data dictionary systems and applications
  • Knowledge of application portfolio management, software design patterns, application integration, legacy system modernization, and application lifecycle management
Job Responsibility:
  • Design integration strategies for disparate systems, applications, and data sources across the enterprise
  • Establish common interfaces, architecture principles, data standards, and best practices
  • Support senior management on complex technology initiatives and investment decisions
  • Create an architecture review process to evaluate proposed solutions against standards and strategic objectives
  • Collaborate with the security team to ensure solutions meet organizational and regulatory requirements
  • Fulltime

Data Analyst

The data analyst is responsible for analyzing business requirements and converti...
Location:
United States, Bellevue
Salary:
68000.00 - 122700.00 USD / Year
T-Mobile
Expiration Date
Until further notice
Requirements:
  • Bachelor's Degree
  • 2-4 years Experience in data identification, querying, cleaning, wrangling, checking quality, working with common relational and non-relational databases in big data environments on both on-prem and cloud such as Azure, AWS and Google Cloud
  • 2-4 years Experience articulating and translating business questions and using statistical techniques to arrive at an answer using data
  • 2-4 years Experience writing and speaking about technical concepts to business, technical, and lay audiences and giving data-driven presentations
  • 2-4 years Experience working with relational databases using SQL
  • 2-4 years Experience with data visualization tools such as Power BI, Tableau etc
  • Experience with relational databases, data warehousing, and data architecture computer systems, including but not limited to SQL and Python
  • Strong experience with large language models (LLMs) such as GPT, Claude, Gemini, Llama etc.
  • Technology Experience with Data querying, cleaning, wrangling, working with common relational and non-relational databases in big data environments such as Azure, AWS and Google Cloud
  • Technology Experience articulating and translating business questions and using statistical techniques to arrive at an answer using data
Job Responsibility:
  • Collaborate with Business SMEs, Product owners, Data Stewards, and Data Architects to identify critical data elements, define business terms, capture metadata, define data schema, clean, and prepare data in an appropriate format to perform analysis
  • Analyze data using statistical techniques to identify trends and patterns (e.g. growth/decline in key metrics) and create visualizations and reports to present findings to business stakeholders and/or data scientists for further actions
  • Define analytics tagging requirements for new and enhanced web/app features and ensure alignment with business KPIs
  • Automate the generation of analytics event specifications using AI/LLM tools (e.g., GPT-based models, generative automation platforms)
  • Perform data profiling, assess, and monitor the quality of data by working with the Data stewards, Data quality leads, and internal stakeholders and support the implementation of data quality rules and remediation of issues
  • Collaborate/Partner with product owner and agile delivery teams to ensure that the project/product is delivered with quality and in time
  • Interpret data models (logical and physical) and ensure they align with the solution design
  • Convert user requirements into technical specifications and design documents, and identify new data sources for the data warehouse
  • Responsible for other duties/projects as assigned by business management as needed
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Employee stock grants
  • Paid time off
  • Fulltime

Manager, Enterprise Information Management Platform Admin

Manage, in a highly automated manner, the infrastructure of SHC's Enterprise Inf...
Location:
United States of America, Palo Alto
Salary:
83.98 - 111.27 USD / Hour
Stanford Health Care
Expiration Date
Until further notice
Requirements:
  • BS/BA degree in information technology, information systems, business management, business analytics, business administration or a directly related field from an accredited college or university required
  • Eight (8) or more years of experience as a cloud engineer, data engineer, or in similar technology roles with a recent focus on cloud technologies required
  • Certifications in Azure (e.g., Azure Solutions Architect, Azure Data Engineer) and/or Databricks required
  • Thought Leadership in Cloud Technologies
  • Comprehensive Knowledge of Azure Services
  • Strategic Vision for Data Architecture
  • Expertise in Infrastructure-as-Code and Automation
  • Innovative Solution Design
  • Influential Leadership and Mentorship
  • Expertise in Data Security and Compliance
Job Responsibility:
  • Technical Leadership: Serve as the highest technical authority for a specific operational domain and/or technology
  • Innovation and Research: Drive innovation by researching and implementing cutting-edge technologies and methodologies
  • Strategy & Architecture Design: Design complex data architectures and frameworks to support business needs
  • EIM Environment & Infrastructure Management: Collaborate with Cloud Infrastructure leadership to align and optimize common operational processes
  • Design and enforce platform standards
  • Data Operations Management: Own domains and/or components of the data platform and operational processes
  • Data Pipelines/Ingestion: Design and implement scalable and efficient data pipelines for various data types
  • Security and Compliance: Contribute to risk management with Security and Compliance to define potential solutions
  • FinOps: Contribute to cost governance, usage showback/chargeback reporting, chargeback agreements, etc
  • Documentation and Training: Create comprehensive documentation of solutions and operations processes
  • Fulltime