
GCP Data Analyst - BigQuery


Rackspace


Location:
Canada, Toronto


Contract Type:
Not provided


Salary:

143,700.00 - 245,520.00 USD / Year

Job Description:

We’re seeking a GCP Data Analyst with deep expertise in BigQuery, strong SQL and Python skills, and a sharp analytical mindset to support both data validation initiatives and ongoing analytics work. This role is ideal for someone who can navigate large datasets, build robust queries, and identify inconsistencies with precision and insight. The analyst will work across a variety of data workflows, from validating metrics during system migrations to supporting day-to-day data analysis and reporting needs.

Job Responsibility:

  • Write, optimize, and execute complex SQL queries in BigQuery to validate data accuracy, identify inconsistencies, and support analytics and reporting
  • Analyze large datasets to assess data quality, compare trends across systems, and surface anomalies or unexpected behaviors
  • Utilize advanced BigQuery features such as authorized views, materialized views, UDFs, partitioned tables, and joins to support scalable, high-performance analysis
  • Use Python (including data frames and relevant libraries) for exploratory analysis, data manipulation, and supporting validation workflows
  • Support time series analysis and, where applicable, anomaly detection using SQL or Python-based approaches
  • Assist with load/transform validation to ensure reliability and accuracy in data pipelines
  • Collaborate with engineering teams to understand data pipelines, with basic ability to read and interpret Java or Scala code when needed
  • Perform side-by-side comparisons of data across systems to ensure consistency during and after migrations
  • Maintain basic familiarity with orchestration tools such as Airflow (Composer) to follow pipeline logic and collaborate effectively with engineering
  • Work within the GCP environment, leveraging cloud tools and services to support analysis, troubleshoot issues, and navigate cloud-based workflows
  • Clearly communicate analytical findings and data quality issues to cross-functional stakeholders to support decision-making
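The side-by-side validation work described above can be sketched as a small Python helper that compares the same metrics from a legacy system and a migrated BigQuery system. The metric names and hard-coded values here are hypothetical illustrations; in practice each dict would be populated from a query against the corresponding system.

```python
# Sketch: side-by-side comparison of the same metrics computed by a
# legacy system and a migrated BigQuery system. All names and numbers
# below are hypothetical illustrations.

def compare_metrics(legacy, migrated, tolerance=0.01):
    """Return (metric, legacy_value, migrated_value, rel_diff) tuples for
    metrics missing from `migrated` or differing by more than `tolerance`."""
    discrepancies = []
    for name, old in legacy.items():
        new = migrated.get(name)
        if new is None:
            discrepancies.append((name, old, None, None))
            continue
        baseline = max(abs(old), 1e-9)  # guard against divide-by-zero
        rel_diff = abs(old - new) / baseline
        if rel_diff > tolerance:
            discrepancies.append((name, old, new, rel_diff))
    return discrepancies

# In practice these dicts would come from one aggregate query per system,
# e.g. "SELECT COUNT(*) AS orders, SUM(amount) AS revenue FROM <table>".
legacy = {"orders": 10_000, "revenue": 523_410.50, "active_users": 4_200}
migrated = {"orders": 10_000, "revenue": 498_200.00, "active_users": 4_199}

for name, old, new, diff in compare_metrics(legacy, migrated):
    print(f"{name}: legacy={old} migrated={new} rel_diff={diff:.2%}")
```

A relative tolerance (rather than exact equality) is a common choice for this kind of check, since floating-point aggregates and late-arriving data rarely match to the cent.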

Requirements:

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field
  • 5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills
  • 5+ years of experience building and operating solutions on Google Cloud Platform (GCP)
  • Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies
  • Proficient in Python, including use of data frames and common analytical libraries
  • Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis
  • Strong analytical skills and experience validating data across systems during migrations and ongoing operations
  • Basic ability to read and understand Java or Scala code to support engineering collaboration
  • Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows
  • Proficiency in SQL, BigQuery, and Python
  • Advanced SQL skills in BigQuery for complex data validation, anomaly detection, and trend analysis
  • Experience comparing datasets across systems
  • Proven ability to identify and investigate data discrepancies across platforms
  • Strong analytical intuition to sense-check metrics and flag issues that may not trigger formal alerts
  • Ability to perform side-by-side metric and trend comparisons to confirm post-migration accuracy
  • Skilled in root cause analysis using SQL, domain expertise, and supporting context
  • Effective communicator who can document findings and share insights with both technical and non-technical stakeholders
  • Familiarity with time series analysis to detect unexpected shifts, drops, or spikes in metrics
  • Ability to follow structured validation processes while proactively identifying workflow improvements
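The time series checks listed in the requirements (detecting unexpected shifts, drops, or spikes in metrics) can be sketched with a rolling z-score in pure Python. The window size, threshold, and sample series below are illustrative assumptions, not values from the posting.

```python
# Sketch: flag unexpected spikes or drops in a daily metric using a
# trailing-window z-score. Window, threshold, and data are hypothetical.
import statistics

def rolling_zscore_anomalies(series, window=7, threshold=3.0):
    """Return indices whose value deviates from the trailing-window
    mean by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mean = statistics.fmean(trailing)
        stdev = statistics.stdev(trailing)
        if stdev == 0:
            continue  # flat window: z-score is undefined
        z = (series[i] - mean) / stdev
        if abs(z) > threshold:
            anomalies.append(i)
    return anomalies

daily_orders = [100, 102, 99, 101, 100, 98, 103, 101, 100, 250, 99, 102]
print(rolling_zscore_anomalies(daily_orders))  # → [9], the 250 spike
```

The same logic translates directly to SQL with window functions (`AVG(...) OVER` and `STDDEV(...) OVER` a trailing frame) when the check needs to run inside BigQuery itself.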

Nice to have:

  • Familiarity with Looker or other BI tools for metric validation and reporting support
  • BigQuery ML and Vertex AI
  • Basic familiarity with legacy systems such as Oozie or Pig for reading existing scripts

Additional Information:

Job Posted:
January 05, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for GCP Data Analyst - BigQuery

GCP Data Engineer

We at AlgebraIT are looking for a GCP Data Engineer with 3+ years of experience ...
Location:
United States, Austin
Salary:
Not provided
AlgebraIT
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering with GCP
  • Proficiency in Python, SQL, and GCP services
  • Experience with data pipeline orchestration tools
  • Strong problem-solving abilities and attention to detail
  • Bachelor’s degree in Computer Science or related field
Job Responsibility:
  • Build and maintain scalable data pipelines using GCP tools
  • Ensure data security and governance
  • Monitor, troubleshoot, and optimize data workflows
  • Collaborate with stakeholders to gather requirements and deliver data solutions
  • Implement data quality checks and best practices
  • Develop and maintain ETL processes
  • Create detailed documentation of data processes
  • Work closely with data analysts and business teams for data alignment
  • Ensure high availability and reliability of data services
  • Stay current with GCP data technology advancements
Employment Type:
Full-time
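The data quality checks mentioned in this card's responsibilities might look like the minimal row-level sketch below; the field names and rules are hypothetical stand-ins for whatever a real pipeline would enforce before loading.

```python
# Sketch: minimal row-level data-quality checks of the kind a pipeline
# might run before loading. Field names and rules are hypothetical.

RULES = {
    "order_id": lambda v: v is not None,                  # required
    "amount":   lambda v: v is not None and v >= 0,       # non-negative
    "country":  lambda v: v in {"US", "CA", "IN", "PT"},  # known codes
}

def validate_rows(rows):
    """Return (clean_rows, errors); errors are (row_index, field) pairs."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        failed = [f for f, ok in RULES.items() if not ok(row.get(f))]
        if failed:
            errors.extend((i, f) for f in failed)
        else:
            clean.append(row)
    return clean, errors

rows = [
    {"order_id": 1, "amount": 25.0, "country": "US"},
    {"order_id": 2, "amount": -5.0, "country": "CA"},    # bad amount
    {"order_id": None, "amount": 10.0, "country": "XX"}  # two failures
]
clean, errors = validate_rows(rows)
print(len(clean), errors)
```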

Data Analyst

Location:
India, Pune
Salary:
Not provided
Enormous Enterprise
Expiration Date:
Until further notice
Requirements:
  • 5+ years’ experience in related field preferred
  • Experience with other tools such as GCP BQ, Alteryx, Tableau required
  • Experience with data query, visualization, dashboard and/or scorecard tools preferred
  • Extensive experience performing Analysis and Solutioning
  • Experience in Big Data systems, Data Lakes and Analytics related field preferred
  • Hands-on experience in Data Analysis/validation/data mapping and proficient in SQL queries
  • Experience developing Business/Functional Requirement documentation and facilitating requirement work sessions with Business Units/Clients
  • Excellent verbal/written communication skills in an internal client facing role with ability to present information and findings clearly in analysis/reports
  • Can work on complex projects of large scope
Job Responsibility:
  • Scopes, designs, and develops data analytics solutions/models to solve challenging business problems using Google BigQuery, SQL, SAP, Tableau, AtScale, Looker Studio, Alteryx, and other data-centric technologies
  • Proficiency in data modeling/design and reading/writing/debugging/optimizing advanced SQL queries are firmly required
  • Knowledge of data visualization, dashboard, reporting, and OLAP/dimensional design tools preferred
  • Translates complex business requirements into technical requirements, and ultimately into data solutions that enable the business to make fast, effective, data-driven decisions (BigQuery data models/views, AtScale virtual cubes, Tableau dashboards; supports decision-making, metric management, enterprise performance management)
  • Asks probing questions to understand detailed business processes, requirements, and data architectures
  • Brings a foundation of data analysis experience and best practices to discover and anticipate requirements beyond those that are plainly provided
  • Observes patterns in processes and data and implements end-to-end data-driven intelligent processes
  • Extracts, manipulates, and analyzes varying types of data to find insight with a high degree of accuracy and attention to detail
  • Integrates analysis and underlying work (e.g., data mapping, transformations, validation) back into preexisting data platforms, with an enterprise mentality beyond any single use case
  • Applies hypotheses and an understanding of cause-and-effect from the logical analysis of a complex situation

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84,835.61 - 149,076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, Vertex AI.
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge of Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans and collaborate with business and system owners to fix the quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Employment Type:
Full-time


Team Manager AI & Data Strategy

At Vantage Towers, we're on a mission to power Europe's sustainable digital tran...
Location:
Portugal, Lisboa
Salary:
Not provided
Vodafone
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in business, computer science, data analytics, engineering, or related field (or equivalent experience)
  • Several years of experience in data, analytics, and/or data platform environments with responsibility for business-facing delivery
  • Proven experience leading teams and/or complex delivery streams in stakeholder-driven environments
  • Strong understanding of cloud-based data platforms, ideally GCP, including data lake and scalable analytics platform concepts
  • Familiarity with GCP services such as BigQuery, Cloud Run, Airflow, Dataflow, or comparable services on other cloud platforms
  • Some exposure to ML/GenAI/RAG/agentic implementations (worked alongside teams delivering such solutions; understands key data requirements and common pitfalls)
  • Experience in managing data integration, transformation and reporting environments; ability to steer end-to-end delivery
  • Experience driving business value and adoption from data solutions (self-service analytics, reporting products, datasets)
Job Responsibility:
  • Lead and develop the Data team (data engineers, data analysts and data architects); set clear goals, coach performance, and foster a collaborative delivery culture
  • Drive prioritization based on business value, strategic impact and feasibility; maintain a transparent backlog and delivery roadmap, including AI-enabling priorities where data foundations are critical
  • Ensure key datasets are AI-ready and adopted by driving data quality, documentation and consistent business semantics that support downstream AI and analytics consumption
  • Manage stakeholder demand end-to-end: clarify needs, translate into deliverable requirements, and align expectations across business and IT
  • Oversee the development and operation of a scalable, maintainable and reliable GCP-based data platform and analytics products as the foundation for scalable AI use cases
  • Steer data integration, transformation and reporting delivery across GCP components (e.g., BigQuery, Cloud Run, Airflow, Dataflow) and related services
  • Ensure orchestration, monitoring and operational transparency; establish run-books, KPIs and platform health reporting, and drive cost awareness and optimization of GCP resources
What we offer:
  • A diverse, multicultural setup based on our values – Accountability, Respect, Teamwork, and Trust – and the unique opportunity to shape the organisation
  • An attractive salary package
  • Meal Allowance: Delivered on Pluxee card - €10.20/day
  • Pension Plan
  • Full Health Insurance: For employees and co-payment for family members
  • Life Insurance
  • 7 extra vacation days: 4 flexible, plus 3 fixed — 1 on Carnival, 1 on Christmas, and half a day on Easter and New Year's
  • Parking Slot
Employment Type:
Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85,000.00 - 150,000.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field (preferred)
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow (required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar
  • Excellent organizational, prioritization and analytical abilities
  • Proven experience working in incremental execution through successful launches
Job Responsibility:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure quality of critical data elements, prepare data quality remediation plans and collaborate with business and system owners to fix the quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
Employment Type:
Full-time

D&T Head of Data, BI & Analytics

The Head of Data Sciences, Analytics, AI, and BI is a senior strategic and opera...
Location:
Salary:
Not provided
Aramex
Expiration Date:
Until further notice
Requirements:
  • Leadership Behaviors: Collaborate & break silos
  • Execution & Accountability
  • Growth mindset
  • Innovation
Job Responsibility:
  • Develop and communicate a long-term vision and strategy for Data Sciences, Analytics, AI, and BI, aligned with Aramex’s business strategy and growth objectives.
  • Drive innovation in analytics and AI, exploring emerging technologies, algorithms and methodologies to enhance business decision-making and operational efficiency.
  • Proactively engage business stakeholders to understand complex requirements, desired outcomes, and strategic priorities.
  • Design and deliver fit-for-purpose analytics and AI solutions that optimize operations, improve customer satisfaction, and drive revenue.
  • Use insights from customer feedback, usage analytics, and emerging trends to identify continuous improvement opportunities.
  • Own end to end operations of enterprise analytics platforms, including GCP-based ecosystems (BigQuery, Dataflow, Pub/Sub, Looker) and legacy BI systems (SQL Server, Oracle BI, ETL pipelines).
  • Ensure platform stability, performance, security and compliance through robust ITIL-aligned processes (incident, problem, change, release management).
  • Define and maintain SLAs, SLOs, and operational KPIs for uptime, pipeline reliability, cost efficiency, and user satisfaction.
  • Lead post-project operationalization, including service acceptance, monitoring, and continuous improvement loops.
  • Lead and mentor a high performing team of data scientists, AI engineers, BI analysts, and data operations engineers.

Data Engineer (GCP) - VOIS

We are seeking a skilled Data Engineer to design, build and maintain scalable da...
Location:
India, Pune
Salary:
Not provided
Vodafone
Expiration Date:
Until further notice
Requirements:
  • Hold a Bachelor’s or Master’s degree in Computer Science, Information Technology or a related field
  • Experienced with SQL and NoSQL databases
  • Knowledgeable in data warehousing concepts and industry best practices
  • Familiar with data integration tools and frameworks
  • Demonstrate strong problem-solving and analytical capabilities
  • Communicate clearly and collaborate effectively with diverse teams
  • Comfortable working in a dynamic, fast-paced environment
Job Responsibility:
  • Design, develop and maintain scalable data pipelines and ETL processes using GCP services including BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions and Cloud Run
  • Collaborate with data scientists, analysts and business stakeholders to understand data requirements and deliver robust data solutions
  • Implement data integration solutions to ingest, process and store structured and unstructured data from multiple sources
  • Optimise and tune data pipelines for performance, reliability and cost efficiency
  • Ensure data quality through validation, cleansing and transformation processes
  • Develop and maintain data models, schemas and metadata to support analytics and reporting needs
  • Monitor, troubleshoot and resolve data pipeline issues to minimise disruption
  • Stay current with GCP technologies and best practices, recommending improvements where appropriate
  • Mentor junior data engineers and promote a collaborative, knowledge-sharing culture
What we offer:
  • Opportunity to work on modern, cloud-native data platforms using Google Cloud technologies
  • Exposure to enterprise-scale data engineering challenges across diverse business domains
  • A collaborative environment that values continuous improvement and shared learning
  • The ability to influence data quality, performance and analytics outcomes across teams
Employment Type:
Full-time