GCP Data Analyst - BigQuery

Rackspace

Location:
United States, San Antonio

Contract Type:
Not provided

Salary:
Not provided

Requirements:

  • How many years of experience do you have with BigQuery?
  • How many years of experience do you have with SQL?
  • How many years of experience do you have with Python?
  • Have you used advanced BigQuery features (UDFs, partitions, materialized views)? (Yes/No; see the sketch after this list)
  • How many data migration projects have you supported or validated?
  • Are you comfortable reading basic Java or Scala code? (Yes/No)
  • Have you used Airflow (Composer) or similar orchestration tools? (Yes/No)
  • How many years of experience do you have with Oozie?
  • How many years of experience do you have with Pig?
  • Have you done any anomaly detection or trend analysis in your past work? (Yes/No)
  • What is your level of expertise in writing SQL queries? Please elaborate.
  • How much experience do you have in schema design? (1 - 3 years / 3 - 5 years / over 5 years)
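
For concreteness, here is a minimal sketch of the advanced BigQuery features the questions above ask about (a partitioned table, a SQL UDF, and a materialized view), driven from Python with the google-cloud-bigquery client. Every project, dataset, and object name below is a hypothetical placeholder, not anything from this posting.

```python
# Minimal sketch, not the employer's setup: exercises partitions, a UDF, and
# a materialized view via the google-cloud-bigquery client. All names are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

statements = [
    # Partitioned table: date-partitioning prunes scans (and cost) per query.
    """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.events` (
      event_ts TIMESTAMP,
      user_id  STRING,
      amount   NUMERIC
    )
    PARTITION BY DATE(event_ts)
    """,
    # SQL UDF: reusable scalar logic callable from any query in the project.
    """
    CREATE OR REPLACE FUNCTION `my-project.analytics.to_usd`(amount NUMERIC, rate NUMERIC)
    RETURNS NUMERIC AS (amount * rate)
    """,
    # Materialized view: a precomputed daily rollup BigQuery keeps fresh.
    """
    CREATE MATERIALIZED VIEW IF NOT EXISTS `my-project.analytics.daily_totals` AS
    SELECT DATE(event_ts) AS day, SUM(amount) AS total
    FROM `my-project.analytics.events`
    GROUP BY day
    """,
]

for ddl in statements:
    client.query(ddl).result()  # block until each DDL statement completes
```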

Additional Information:

Job Posted:
February 18, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for GCP Data Analyst - BigQuery

GCP Data Engineer

We at AlgebraIT are looking for a GCP Data Engineer with 3+ years of experience ...
Location:
United States, Austin
Salary:
Not provided
AlgebraIT
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering with GCP
  • Proficiency in Python, SQL, and GCP services
  • Experience with data pipeline orchestration tools
  • Strong problem-solving abilities and attention to detail
  • Bachelor’s degree in Computer Science or related field
Job Responsibilities:
  • Build and maintain scalable data pipelines using GCP tools (see the sketch below)
  • Ensure data security and governance
  • Monitor, troubleshoot, and optimize data workflows
  • Collaborate with stakeholders to gather requirements and deliver data solutions
  • Implement data quality checks and best practices
  • Develop and maintain ETL processes
  • Create detailed documentation of data processes
  • Work closely with data analysts and business teams for data alignment
  • Ensure high availability and reliability of data services
  • Stay current with GCP data technology advancements
Employment Type: Full-time
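
As a rough illustration of the orchestration this card asks for, here is a sketch of an Airflow DAG of the kind Cloud Composer runs, chaining an extract task into a BigQuery load task. The DAG id, schedule, bucket, and task bodies are all hypothetical, and the `schedule` argument assumes Airflow 2.4 or newer.

```python
# Minimal sketch of a daily GCP pipeline under Airflow (the engine behind
# Cloud Composer). DAG id, bucket, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: a real task would pull source data into GCS staging.
    print("staging source files to gs://my-bucket/staging/")


def load_to_bigquery():
    # Placeholder: a real task would load staged files with the
    # google-cloud-bigquery client (or a dedicated load operator).
    print("loading gs://my-bucket/staging/*.csv into analytics.events")


with DAG(
    dag_id="daily_gcs_to_bigquery",  # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",               # one run per day (Airflow 2.4+ syntax)
    catchup=False,                   # skip backfilling missed intervals
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load_to_bigquery)

    extract_task >> load_task        # load runs only after extract succeeds
```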

Data Analyst

Location:
India, Pune
Salary:
Not provided
Enormous Enterprise
Expiration Date:
Until further notice
Requirements:
  • 5+ years’ experience in a related field preferred
  • Experience with other tools such as GCP BQ, Alteryx, Tableau required
  • Experience with data query, visualization, dashboard, and/or scorecard tools preferred
  • Extensive experience performing analysis and solutioning
  • Experience in Big Data systems, Data Lakes, and related analytics fields preferred
  • Hands-on experience in data analysis/validation/data mapping, and proficient in SQL queries
  • Experience developing business/functional requirement documentation and facilitating requirement work sessions with business units/clients
  • Excellent verbal/written communication skills in an internal client-facing role, with the ability to present information and findings clearly in analyses/reports
  • Ability to work on complex projects of large scope
Job Responsibilities:
  • Scopes, designs, and develops data analytics solutions/models to solve challenging business problems using Google BigQuery, SQL, SAP, Tableau, AtScale, Looker Studio, Alteryx, and other data-centric technologies
  • Proficiency in data modeling/design and reading/writing/debugging/optimizing advanced SQL queries are firmly required
  • Knowledge of data visualization, dashboard, reporting, and OLAP/dimensional design tools preferred
  • Translates complex business requirements into technical requirements, and ultimately into data solutions that enable the business to make fast, effective, data-driven decisions (BigQuery data models/views, AtScale virtual cubes, Tableau dashboards; supports decision-making, metric management, and enterprise performance management)
  • Asks probing questions to understand detailed business processes, requirements, and data architectures
  • Brings a foundation of data analysis experience and best practices to discover and anticipate requirements beyond those that are plainly provided
  • Observes patterns in processes and data, and implements end-to-end data-driven intelligent processes
  • Extracts, manipulates, and analyzes varying types of data to find insight with a high degree of accuracy and attention to detail
  • Integrates analysis and underlying work (e.g., data mapping, transformations, validation) back into preexisting data platforms, with an enterprise mentality beyond any single use case
  • Applies hypotheses and an understanding of cause-and-effect from the logical analysis of a complex situation

Business Consulting - Technical Analyst with ETL and GCP using PySpark

The Business Consulting-Technical Analyst role focuses on ETL and GCP using PySp...
Location:
India, Pune
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • GCP Professional Data Engineering certification
  • Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc
  • Demonstrated experience in using PySpark for data processing, transformation, and analysis
  • Solid Python programming skills for data manipulation and scripting
  • Experience with data modeling, ETL processes, and data warehousing concepts
  • Proficiency in SQL for querying and manipulating data in relational databases
  • Understanding of big data principles and distributed computing concepts
  • Ability to effectively communicate technical solutions and collaborate with cross-functional teams
  • Systems design experience with the ability to architect and explain complex systems interactions, data flows, common interfaces and APIs
  • Deep understanding of and experience with software development and programming languages such as Java/Kotlin, and Shell scripting
Job Responsibilities:
  • Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing (see the sketch below)
  • Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services
  • Leveraging GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis
  • Utilizing PySpark for data manipulation, cleansing, enrichment, and validation
  • Ensuring the performance and scalability of data processing jobs on GCP
  • Working with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions
  • Implementing and maintaining data quality standards, security measures, and compliance with data governance policies on GCP
  • Diagnosing and resolving issues related to data pipelines and infrastructure
  • Keeping abreast of the latest GCP services, PySpark features, and best practices in data engineering
Employment Type: Full-time
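
As a small, hedged example of the PySpark work described above, the sketch below reads raw files, cleanses and enriches them, and writes partitioned output. All paths, column names, and the app name are invented for illustration.

```python
# Minimal PySpark sketch of an extract-transform-load step. Paths, columns,
# and the app name are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw Parquet (e.g., landed in Cloud Storage).
orders = spark.read.parquet("gs://my-bucket/raw/orders/")

# Transform: cleanse and enrich.
cleaned = (
    orders
    .dropna(subset=["order_id", "amount"])                 # drop incomplete rows
    .withColumn("amount", F.col("amount").cast("double"))  # normalize the type
    .withColumn("order_date", F.to_date("order_ts"))       # derive a partition key
)

# Load: write partitioned output for downstream BigQuery ingestion.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://my-bucket/curated/orders/"
)
```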

Senior Data Engineer

Adtalem is a data-driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field.
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, Vertex AI.
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar (see the sketch below).
  • Expert knowledge of Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services.
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing.
Job Responsibilities:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
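
As a hedged sketch of the real-time ingestion named in the requirements, the snippet below opens a streaming pull on a Pub/Sub subscription and acknowledges each message. The project and subscription ids are hypothetical, and a real pipeline would buffer payloads into BigQuery rather than print them.

```python
# Minimal sketch of real-time ingestion from GCP Pub/Sub. Project and
# subscription ids are hypothetical.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")


def callback(message):
    # A real pipeline would parse message.data and stream rows into BigQuery.
    print(f"received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub stops redelivering


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds in this sketch
except TimeoutError:
    streaming_pull.cancel()            # shut the stream down cleanly
    streaming_pull.result()
```
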
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Employment Type: Full-time

GCP Data Analyst - BigQuery

We’re seeking a GCP Data Analyst with deep expertise in BigQuery, strong SQL and...
Location:
Canada, Toronto; Vancouver; Calgary
Salary:
143700.00 - 245520.00 USD / Year
Rackspace
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field
  • 5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills
  • 5+ years of experience building and operating solutions on Google Cloud Platform (GCP)
  • Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies
  • Proficient in Python, including use of data frames and common analytical libraries
  • Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis
  • Strong analytical skills and experience validating data across systems during migrations and ongoing operations
  • Basic ability to read and understand Java or Scala code to support engineering collaboration
  • Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows
  • Proficiency in SQL, BigQuery, and Python
Job Responsibilities:
  • Write, optimize, and execute complex SQL queries in BigQuery to validate data accuracy, identify inconsistencies, and support analytics and reporting
  • Analyze large datasets to assess data quality, compare trends across systems, and surface anomalies or unexpected behaviors
  • Utilize advanced BigQuery features such as authorized views, materialized views, UDFs, partitioned tables, and joins to support scalable, high-performance analysis
  • Use Python (including data frames and relevant libraries) for exploratory analysis, data manipulation, and supporting validation workflows
  • Support time series analysis and, where applicable, anomaly detection using SQL or Python-based approaches
  • Assist with load/transform validation to ensure reliability and accuracy in data pipelines
  • Collaborate with engineering teams to understand data pipelines, with basic ability to read and interpret Java or Scala code when needed
  • Perform side-by-side comparisons of data across systems to ensure consistency during and after migrations (see the sketch below)
  • Maintain basic familiarity with orchestration tools such as Airflow (Composer) to follow pipeline logic and collaborate effectively with engineering
  • Work within the GCP environment, leveraging cloud tools and services to support analysis, troubleshoot issues, and navigate cloud-based workflows
Employment Type: Full-time
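
To make the side-by-side migration checks concrete, here is a minimal sketch that compares row counts and an order-insensitive checksum between a legacy table and its migrated copy in BigQuery. All table names are hypothetical, and to_dataframe() assumes pandas and db-dtypes are installed.

```python
# Minimal sketch of a side-by-side migration check in BigQuery: compare row
# counts and a BIT_XOR fingerprint checksum across two tables. Table names
# are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
WITH src AS (
  SELECT COUNT(*) AS row_count,
         BIT_XOR(FARM_FINGERPRINT(CAST(order_id AS STRING))) AS checksum
  FROM `my-project.legacy.orders`
),
dst AS (
  SELECT COUNT(*) AS row_count,
         BIT_XOR(FARM_FINGERPRINT(CAST(order_id AS STRING))) AS checksum
  FROM `my-project.migrated.orders`
)
SELECT
  src.row_count = dst.row_count AS counts_match,
  src.checksum  = dst.checksum  AS checksums_match
FROM src CROSS JOIN dst
"""

report = client.query(sql).result().to_dataframe()
print(report)  # both columns should read True after a clean migration
```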

GCP Data Analyst - BigQuery

We’re seeking a GCP Data Analyst with deep expertise in BigQuery, strong SQL and...
Location:
United States, San Antonio
Salary:
143700.00 - 245520.00 USD / Year
Rackspace
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field
  • 5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills
  • 5+ years of experience building and operating solutions on Google Cloud Platform (GCP)
  • Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies
  • Proficient in Python, including use of data frames and common analytical libraries
  • Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis
  • Strong analytical skills and experience validating data across systems during migrations and ongoing operations
  • Basic ability to read and understand Java or Scala code to support engineering collaboration
  • Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows
  • Proficiency in SQL, BigQuery, and Python
Job Responsibilities:
  • Write, optimize, and execute complex SQL queries in BigQuery to validate data accuracy, identify inconsistencies, and support analytics and reporting
  • Analyze large datasets to assess data quality, compare trends across systems, and surface anomalies or unexpected behaviors
  • Utilize advanced BigQuery features such as authorized views, materialized views, UDFs, partitioned tables, and joins to support scalable, high-performance analysis
  • Use Python (including data frames and relevant libraries) for exploratory analysis, data manipulation, and supporting validation workflows
  • Support time series analysis and, where applicable, anomaly detection using SQL or Python-based approaches
  • Assist with load/transform validation to ensure reliability and accuracy in data pipelines
  • Collaborate with engineering teams to understand data pipelines, with basic ability to read and interpret Java or Scala code when needed
  • Perform side-by-side comparisons of data across systems to ensure consistency during and after migrations
  • Maintain basic familiarity with orchestration tools such as Airflow (Composer) to follow pipeline logic and collaborate effectively with engineering
  • Work within the GCP environment, leveraging cloud tools and services to support analysis, troubleshoot issues, and navigate cloud-based workflows
Employment Type: Full-time

Senior Data Engineer

Adtalem is a data-driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85000.00 - 150000.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (Required)
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field (Preferred)
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow (Required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (Required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Excellent organizational, prioritization, and analytical abilities
  • Proven experience working in incremental execution through successful launches
Job Responsibilities:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
Employment Type: Full-time

D&T Head of Data, BI & Analytics

The Head of Data Sciences, Analytics, AI, and BI is a senior strategic and opera...
Location:
Not provided
Salary:
Not provided
Aramex
Expiration Date:
Until further notice
Requirements:
  • Leadership Behaviors: Collaborate & break silos
  • Execution & Accountability
  • Growth mindset
  • Innovation
Job Responsibilities:
  • Develop and communicate a long-term vision and strategy for Data Sciences, Analytics, AI, and BI, aligned with Aramex’s business strategy and growth objectives.
  • Drive innovation in analytics and AI, exploring emerging technologies, algorithms and methodologies to enhance business decision-making and operational efficiency.
  • Proactively engage business stakeholders to understand complex requirements, desired outcomes, and strategic priorities.
  • Design and deliver fit-for-purpose analytics and AI solutions that optimize operations, improve customer satisfaction, and drive revenue.
  • Use insights from customer feedback, usage analytics, and emerging trends to identify continuous improvement opportunities.
  • Own end to end operations of enterprise analytics platforms, including GCP-based ecosystems (BigQuery, Dataflow, Pub/Sub, Looker) and legacy BI systems (SQL Server, Oracle BI, ETL pipelines).
  • Ensure platform stability, performance, security and compliance through robust ITIL-aligned processes (incident, problem, change, release management).
  • Define and maintain SLAs, SLOs, and operational KPIs for uptime, pipeline reliability, cost efficiency, and user satisfaction.
  • Lead post-project operationalization, including service acceptance, monitoring, and continuous improvement loops.
  • Lead and mentor a high-performing team of data scientists, AI engineers, BI analysts, and data operations engineers.