We’re seeking a GCP Data Analyst with deep expertise in BigQuery, strong SQL and Python skills, and a sharp analytical mindset to support both data validation initiatives and ongoing analytics work. This role is ideal for someone who can navigate large datasets, build robust queries, and identify inconsistencies with precision and insight. The analyst will work across a variety of data workflows, from validating metrics during system migrations to supporting day-to-day data analysis and reporting needs.
Job Responsibilities:
Write, optimize, and execute complex SQL queries in BigQuery to validate data accuracy, identify inconsistencies, and support analytics and reporting
Analyze large datasets to assess data quality, compare trends across systems, and surface anomalies or unexpected behaviors
Utilize advanced BigQuery features such as authorized views, materialized views, UDFs, partitioned tables, and joins to support scalable, high-performance analysis
Use Python (including data frames and relevant libraries) for exploratory analysis, data manipulation, and supporting validation workflows
Support time series analysis and, where applicable, anomaly detection using SQL or Python-based approaches
Assist with load/transform validation to ensure reliability and accuracy in data pipelines
Collaborate with engineering teams to understand data pipelines, with basic ability to read and interpret Java or Scala code when needed
Perform side-by-side comparisons of data across systems to ensure consistency during and after migrations
Maintain basic familiarity with orchestration tools such as Airflow (Composer) to follow pipeline logic and collaborate effectively with engineering
Work within the GCP environment, leveraging cloud tools and services to support analysis, troubleshoot issues, and navigate cloud-based workflows
Clearly communicate analytical findings and data quality issues to cross-functional stakeholders to support decision-making
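The side-by-side migration validation described above can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed implementation: the metric names and values are hypothetical, and in practice each dict would be populated from SQL queries against the legacy system and the BigQuery target.

```python
# Minimal sketch: side-by-side metric comparison between a legacy system
# and a BigQuery migration target. Metric names and values are hypothetical;
# in practice each dict would be built from query results on each system.

def compare_metrics(legacy: dict, migrated: dict, tolerance: float = 0.001) -> list:
    """Return (metric, legacy_value, migrated_value) tuples whose relative
    difference exceeds the tolerance, plus metrics missing on the target."""
    discrepancies = []
    for metric, legacy_value in legacy.items():
        migrated_value = migrated.get(metric)
        if migrated_value is None:
            discrepancies.append((metric, legacy_value, None))
            continue
        baseline = max(abs(legacy_value), 1e-9)  # guard against division by zero
        if abs(legacy_value - migrated_value) / baseline > tolerance:
            discrepancies.append((metric, legacy_value, migrated_value))
    return discrepancies

# Example: daily_orders matches exactly; revenue drifts beyond tolerance.
legacy_totals = {"daily_orders": 10000, "revenue": 52340.10}
migrated_totals = {"daily_orders": 10000, "revenue": 52110.75}
print(compare_metrics(legacy_totals, migrated_totals))
```

A relative (rather than absolute) tolerance keeps the check meaningful across metrics of very different magnitudes, such as row counts versus revenue sums.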
Requirements:
Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field
5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills
5+ years of experience building and operating solutions on Google Cloud Platform (GCP)
Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies
Proficient in Python, including use of data frames and common analytical libraries
Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis
Strong analytical skills and experience validating data across systems during migrations and ongoing operations
Basic ability to read and understand Java or Scala code to support engineering collaboration
Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows
Advanced proficiency in SQL, BigQuery, and Python, including complex data validation, anomaly detection, and trend analysis
Experience comparing datasets across systems
Proven ability to identify and investigate data discrepancies across platforms
Strong analytical intuition to sense-check metrics and flag issues that may not trigger formal alerts
Ability to perform side-by-side metric and trend comparisons to confirm post-migration accuracy
Skilled in root cause analysis using SQL, domain expertise, and supporting context
Effective communicator who can document findings and share insights with both technical and non-technical stakeholders
Familiarity with time series analysis to detect unexpected shifts, drops, or spikes in metrics
Ability to follow structured validation processes while proactively identifying workflow improvements
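The time-series skill described above, flagging unexpected shifts, drops, or spikes, can be illustrated with a trailing-window z-score, one common Python-based approach. This is a sketch only: the window size, threshold, and sample data are illustrative assumptions, not part of the role's actual tooling.

```python
# Minimal sketch of time-series anomaly detection using a rolling z-score.
# Window size and threshold are illustrative defaults, not fixed requirements.

from statistics import mean, stdev

def flag_anomalies(series: list, window: int = 7, threshold: float = 3.0) -> list:
    """Return indices where a point deviates from the mean of the trailing
    window by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady daily metric with one sudden drop at index 10.
daily_counts = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100, 40, 100]
print(flag_anomalies(daily_counts))  # the drop at index 10 is flagged
```

The same logic translates directly to SQL using window functions (`AVG` and `STDDEV` over a trailing frame), which is often preferable when the series lives in BigQuery.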
Nice to have:
Familiarity with Looker or other BI tools for metric validation and reporting support
Familiarity with BigQuery ML and Vertex AI
Basic familiarity with legacy systems such as Oozie or Pig for reading existing scripts