We are seeking an experienced Data Architect with deep technical expertise and a strategic mindset to lead the design, implementation, and governance of enterprise data architecture. The ideal candidate will have a strong background in data modeling, data governance, cloud-based solutions, and a passion for building scalable, secure, and high-performance data platforms.
Job Responsibilities:
Design and implement enterprise-grade data architectures to support analytics, reporting, and operational needs
Define data standards, data flows, and governance frameworks across systems and departments
Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical data solutions
Develop and maintain logical and physical data models using modern modeling tools
Oversee data integration strategies including ETL/ELT pipelines, APIs, and real-time data ingestion
Evaluate, recommend, and implement new data technologies and tools aligned with industry best practices
Ensure data quality, security, and compliance across all platforms
Act as a technical mentor to engineering and analytics teams, promoting architectural consistency and knowledge sharing
Partner with DevOps and infrastructure teams to ensure optimal deployment, scalability, and performance of data systems
Lead initiatives in data warehousing, master data management, and data lakes (on-premises and cloud)
Requirements:
7+ years of experience in data architecture, data engineering, or database design
Proven experience designing large-scale data systems in cloud environments (AWS, Azure, or GCP)
Strong expertise in relational and non-relational databases (e.g., PostgreSQL, SQL Server, MongoDB, Snowflake, Redshift, BigQuery)
Proficiency in data modeling tools (e.g., ER/Studio, ERwin, dbt, Lucidchart)
Hands-on experience with ETL frameworks, data pipelines, and orchestration tools (e.g., Apache Airflow, Fivetran, Talend)
Solid understanding of data governance, metadata management, and data lineage tools
Experience working with modern data stack technologies (e.g., Databricks, Kafka, Spark, dbt)
Strong SQL skills and proficiency in at least one programming language (Python, Scala, or Java)
Excellent communication and leadership skills
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field
Nice to have:
Experience with real-time analytics and streaming data
Knowledge of machine learning pipelines and model deployment architecture
Familiarity with data privacy laws (e.g., GDPR, CCPA) and security best practices
Certifications in cloud platforms or data architecture (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer)
What we offer:
100% remote with flexible hours
Work from anywhere in the world
Be part of a senior, talented, and supportive team
Flat structure – your input is always welcome
Clients in the US and Europe, projects with real impact
Room to grow and experiment with cutting-edge AI solutions