GCP Data Architect

Lingaro

Location:
Poland

Contract Type:
Not provided

Salary:
Not provided

Job Responsibility:

  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it adheres to the design documents and technical specifications
  • Stay updated with emerging trends and technologies in data engineering, recommending and implementing innovative solutions as appropriate
  • Conduct performance analysis and optimization of data engineering systems, identifying and resolving bottlenecks and inefficiencies
  • Ensure data quality and integrity throughout the data engineering processes, implementing appropriate validation and monitoring mechanisms
  • Collaborate with cross-functional teams to integrate data engineering solutions with other systems and applications
  • Participate in project planning and estimation, providing technical insights and recommendations
  • Document data architecture, infrastructure, and design decisions, ensuring clear and up-to-date documentation for implementation, reference, and knowledge sharing

Requirements:

  • At least 6 years of experience as a Data Architect, including a minimum of 4 years working with GCP cloud-based infrastructure & systems
  • Deep knowledge of Google Cloud Platform and cloud computing services
  • Strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as columnar databases, relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks
  • Knowledge of modern data transformation tools (such as dbt, Dataform)
  • Knowledge of at least one orchestration and scheduling tool
  • Programming skills (SQL, Python, other scripting)
  • Tools knowledge: Git, Jira, Confluence, etc.
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
  • Excellent communication and collaboration skills, with the ability to effectively communicate technical concepts to non-technical stakeholders
  • Experience in leading and mentoring data engineering teams, providing guidance and technical expertise
  • Familiarity with agile methodologies and experience in working in agile development environments
  • Continuous learning mindset, staying updated with the latest advancements and trends in data engineering and related technologies
  • Strong project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Strong understanding of distributed computing principles, including parallel processing, data partitioning, and fault-tolerance

Nice to have:

  • Certifications in big data technologies and/or cloud platforms
  • Experience with BI solutions (e.g. Looker, Power BI, Tableau)
  • Experience with ETL tools (e.g. Talend, Alteryx)
  • Experience with Apache Spark, especially in a GCP environment
  • Experience with Databricks
  • Experience with Azure cloud-based infrastructure & systems

What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • “Office as an option” model. You can choose to work remotely or in the office
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions
  • A diverse, inclusive, and values-driven community
  • Autonomy to choose the way you work. We trust your ideas
  • Create our community together. Refer your friends to receive bonuses
  • Activities to support your well-being and health
  • Social fund benefits for everyone. All Lingarians can apply for social fund benefits, such as vacation co-financing
  • Plenty of opportunities to donate to charities and support the environment
  • Modern office equipment. Purchased for you or available to borrow, depending on your location

Additional Information:

Job Posted:
December 09, 2025

Work Type:
Remote work

Similar Jobs for GCP Data Architect

Data Intelligence Data Architect

We are seeking a visionary Data Architect to lead the design, governance, and op...
Location:
Serbia, Belgrade
Salary:
Not provided
Everseen
Expiration Date:
Until further notice
Requirements:
  • 8+ years in data architecture, data engineering, or enterprise data management
  • Strong experience in data integration architecture across complex systems
  • Expertise in data modeling (conceptual, logical, physical) and database technologies
  • Strong knowledge of cloud data platforms (AWS, Azure, GCP) and integration tools
  • Familiarity with data governance frameworks and regulatory compliance
  • Proficiency in SQL and Python for building data pipelines, performing data transformations, and implementing automation tasks
Job Responsibility:
  • Develop and maintain the enterprise data architecture blueprint aligned with business strategy and AI product roadmaps
  • Define enterprise-wide data models, taxonomies, and standards for consistent data usage
  • Collaborate with business stakeholders to identify data-driven revenue opportunities including APIs, data products, and new service offerings
  • Design and oversee data integration solutions (ETL/ELT, APIs, streaming, event-driven architecture) across applications, platforms, and business units
  • Enable real-time and batch data flows to support operational and analytical systems
  • Ensure data accessibility across business units and external partners while adhering to data sovereignty and compliance laws
  • Implement data governance policies covering metadata management, data lineage, access control, and retention
  • Define data quality metrics and oversee data cleansing and validation initiatives
  • Define data stewardship roles and accountability structures
  • Select, implement, and manage enterprise data platforms (data lakes, API gateways, event streaming platforms)

Work Type: Fulltime

Data Architect - Enterprise Data & AI Solutions

We are looking for a visionary Data Architect who can translate enterprise data ...
Location:
India, Chennai; Madurai; Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong background in RDBMS design, data modeling, and schema optimization
  • Advanced SQL skills, including performance tuning and analytics functions
  • Proven expertise in data warehouses, data lakes, and lakehouse architectures
  • Proficiency in ETL/ELT tools (Informatica, Talend, dbt, Glue)
  • Hands-on with cloud platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake)
  • Familiarity with GenAI frameworks (OpenAI, Vertex AI, Bedrock, Azure OpenAI)
  • Experience with real-time streaming (Kafka, Kinesis, Flink) and big data ecosystems (Hadoop, Spark)
  • Strong communication skills with the ability to present data insights to executives
  • 8+ years in data architecture, enterprise data strategy, or modernization programs
  • Hands-on with AI-driven analytics and GenAI adoption
Job Responsibility:
  • Design scalable data models, warehouses, lakes, and lakehouse solutions
  • Build data pipelines to support advanced analytics, reporting, and predictive insights
  • Integrate GenAI frameworks to enhance data generation, automation, and summarization
  • Define and enforce enterprise-wide data governance, standards, and security practices
  • Drive data modernization initiatives, including cloud migrations
  • Collaborate with stakeholders, engineers, and AI/ML teams to align solutions with business goals
  • Enable real-time and batch insights through dashboards, AI-driven recommendations, and predictive reporting
  • Mentor teams on best practices in data and AI adoption
What we offer:
  • Opportunity to design next-generation enterprise data & AI architectures
  • Exposure to cutting-edge GenAI platforms to accelerate innovation
  • Collaborate with experts across cloud, data engineering, and AI practices
  • Access to learning, certifications, and leadership mentoring
  • Competitive pay with opportunities for career growth and leadership visibility

Work Type: Fulltime

Enterprise Data Architect

The Enterprise Data Architect (EDA) is responsible for defining and advancing th...
Location:
United States, San Mateo
Salary:
200,000.00 - 275,000.00 USD / Year
Verkada
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or related field
  • 10+ years of experience in data architecture, data engineering, or enterprise systems integration
  • Proven track record in designing enterprise-level data architectures in complex, multi-system environments
  • Expertise in data modeling, MDM, metadata, and data governance frameworks
  • Experience with cloud-based data platforms (AWS, Azure, GCP) and modern data stacks (e.g., Redshift, BigQuery, Snowflake)
  • Experience with integration platforms (ETL/ELT, APIs, event streaming, middleware)
  • Knowledge of data security (tokenization, encryption, access controls) and compliance frameworks
  • Strong ability to influence cross-functional stakeholders and drive consensus across departments
  • Excellent communication skills, capable of presenting complex data concepts to both technical and non-technical audiences
  • Strategic thinker with the ability to balance long-term architecture vision with short-term delivery needs
Job Responsibility:
  • Define and own the enterprise data strategy in alignment with corporate goals
  • Design and maintain comprehensive data architecture spanning ERP, HRIS, CRM, ATS, Finance, and other core business systems
  • Ensure the architecture supports scalability, performance, and resilience as the company grows
  • Lead initiatives to break down data silos and establish trusted, unified sources of truth across the organization
  • Champion a company-wide data governance framework
  • Facilitate regular meetings
  • Drive cross-functional collaboration
  • Identify and track data gaps
  • Coordinate remediation efforts
  • Implement monitoring and alerts

Work Type: Fulltime

IT Data Platform Architect

As an IT Data Platform Architect, you will be instrumental in designing and impl...
Location:
United States, Charlotte
Salary:
Not provided
Brightspeed
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field
  • 8+ years of relevant experience in data platform architecture
  • Experience with containerization technologies
  • Demonstrated ability in building and managing data platforms with integrated AI/ML capabilities
  • Strong knowledge and experience in GCP, with familiarity in other cloud platforms
  • Extensive experience in data architecture, especially in supporting AI/ML applications
  • Expertise in infrastructure as code, CI/CD practices, and troubleshooting across various technical domains
  • Strong ability in handling streaming data applications
  • Ability to communicate technical concepts effectively to diverse audiences
  • Proactive problem-solver with a keen eye for improving system architectures
Job Responsibility:
  • Develop a high-performance data architecture to support large-scale data processing and AI/ML analytics
  • Lead effort to implement infrastructure as code using tools like Terraform and Ansible to automate the deployment of both infrastructure and applications
  • Design CI/CD integrations using GitHub, GitHub Actions, Jenkins, ensuring smooth deployment on GCP
  • Create, update, and maintain comprehensive documentation, including procedural/process guides and infrastructure topology diagrams
  • Stay updated with technological advancements, advocating for and implementing necessary changes and updates to our systems
  • Proactively identify improvement areas in infrastructure architecture and develop plans for enhancements
  • Work with streaming data applications, ensuring robust data flow and integration
  • Possess knowledge of containerization technologies and orchestration tools to manage and scale applications effectively
  • Build reusable code, components, and services, focusing on versioning, reconciliation, and robust exception handling
  • Communicate complex technical concepts effectively to both technical and non-technical stakeholders
What we offer:
  • Competitive medical, dental, vision, and life insurance
  • Employee assistance program
  • 401(k) plan with company match
  • A host of voluntary benefits

Work Type: Fulltime

Data Architect

Delivery Centric is seeking a highly skilled Data Architect to design cloud-read...
Location:
Australia, Sydney
Salary:
Not provided
Delivery Centric Technologies
Expiration Date:
Until further notice
Requirements:
  • Proven experience in data architecture, data modelling, and enterprise data platform design
  • Strong expertise in SQL, NoSQL, data warehousing, and major cloud platforms (Azure, AWS, GCP)
  • Hands-on experience with ETL/ELT tooling and big data technologies (Spark, Hadoop)
  • Experience building data pipelines and event-driven workflows
  • Certifications ideal for this role: Azure Data Engineer, AWS Developer, Databricks Data Engineer
  • Exposure to AI/ML environments and advanced analytical use cases
  • Strong analytical and problem-solving capabilities with excellent stakeholder engagement skills
Job Responsibility:
  • Design scalable, secure, and high-performing data architectures aligned to business objectives
  • Develop conceptual, logical, and physical data models for enterprise data platforms
  • Drive data governance practices, ensuring compliance, quality, and security across all data assets
  • Lead integration initiatives and build reliable data pipelines across cloud and on-prem ecosystems
  • Optimize existing data platforms, improving performance, scalability, and operational efficiency
  • Collaborate with business stakeholders to translate requirements into technical solutions
  • Maintain architecture documentation, standards, data dictionaries, and solution diagrams
  • Support big data, analytics, and AI/ML initiatives through scalable data foundations

Work Type: Fulltime

Data Architect

Embark on an exciting journey into the realm of data engineering and architectur...
Location:
India, Noida
Salary:
Not provided
3Pillar Global
Expiration Date:
Until further notice
Requirements:
  • Ability to translate business requirements into data requests, reports, and dashboards
  • Strong Database & modeling concepts with exposure to SQL & NoSQL Databases
  • Strong data architecture patterns & principles, ability to design secure & scalable data lakes, data warehouse, data hubs, and other event-driven architectures
  • Expertise in designing and writing ETL processes in Python / Java / Scala
  • Understanding of Hadoop framework - Exposure to PySpark, Spark, Storm, HDFS, Hive
  • Strong hands-on experience with either Databricks or Snowflake; experience with both is desirable
  • Knowledge of Master Data management and related tools
  • Strong exposure to data security and privacy regulations (GDPR, HIPAA) and best practices
  • Skilled in ensuring data accuracy, consistency, and quality
Job Responsibility:
  • Work closely with business leaders and information management teams to define and implement a roadmap for data architecture, data management, business intelligence, or analytics solutions.
  • Define a reference architecture for our customers.
  • Support our clients to take control of their data and get value out of it.

Work Type: Fulltime

Cloud Data Architect

Work with the team to evaluate business needs and priorities, liaise with key bu...
Location:
United States, Schaumburg
Salary:
Not provided
Gantec Publishing Solutions
Expiration Date:
Until further notice
Requirements:
  • Minimum Education Requirement: This position requires, at a minimum, a bachelor’s degree in computer science, computer information systems, information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects
  • Skills: Scala, PySpark and/or Python; Azure, SQL, GCP
Job Responsibility:
  • Work with the team to evaluate business needs and priorities, liaise with key business partners
  • Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution
  • Develop data pipelines / APIs using Python, SQL, and potentially Spark on AWS, Azure, or Google Cloud Platform
  • Use an analytical, data-driven approach to drive a deep understanding of a fast-changing business
  • Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or Google Cloud Platform

Work Type: Fulltime

Data Architect

We are seeking an experienced Data Architect with deep technical expertise and a...
Location:
United States
Salary:
Not provided
InData Labs
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data architecture, data engineering, or database design
  • Proven experience designing large-scale data systems in cloud environments (AWS, Azure, or GCP)
  • Strong expertise in relational and non-relational databases (e.g., PostgreSQL, SQL Server, MongoDB, Snowflake, Redshift, BigQuery)
  • Proficiency in data modeling tools (e.g., ER/Studio, ERwin, dbt, Lucidchart)
  • Hands-on experience with ETL frameworks, data pipelines, and orchestration tools (e.g., Apache Airflow, Fivetran, Talend)
  • Solid understanding of data governance, metadata management, and data lineage tools
  • Experience working with modern data stack technologies (e.g., Databricks, Kafka, Spark, dbt)
  • Strong SQL and at least one programming language (Python, Scala, or Java)
  • Excellent communication and leadership skills
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field
Job Responsibility:
  • Design and implement enterprise-grade data architectures to support analytics, reporting, and operational needs
  • Define data standards, data flows, and governance frameworks across systems and departments
  • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical data solutions
  • Develop and maintain logical and physical data models using modern modeling tools
  • Oversee data integration strategies including ETL/ELT pipelines, APIs, and real-time data ingestion
  • Evaluate, recommend, and implement new data technologies and tools aligned with industry best practices
  • Ensure data quality, security, and compliance across all platforms
  • Act as a technical mentor to engineering and analytics teams, promoting architectural consistency and knowledge sharing
  • Partner with DevOps and infrastructure teams to ensure optimal deployment, scalability, and performance of data systems
  • Lead initiatives in data warehousing, master data management, and data lakes (on-premise and cloud)
What we offer:
  • 100% remote with flexible hours
  • Work from anywhere in the world
  • Be part of a senior, talented, and supportive team
  • Flat structure – your input is always welcome
  • Clients in the US and Europe, projects with real impact
  • Room to grow and experiment with cutting-edge AI solutions