
Senior BigQuery Engineer


NTT DATA

Location:
Romania, Bucharest

Contract Type:
Not provided

Salary:
Not provided

Job Description:

You will join the OneMIS stream, responsible for management, regulatory & risk reporting, and advanced analytics. Our mission includes enhancing data quality via KPIs and migrating data platforms to modern, cloud-native ecosystems. We operate in an agile environment, committed to responsible data practices. We are looking for a Senior Data Engineer to design and deliver scalable data pipelines and high-performance analytical solutions using SQL/BigQuery, Spark/PySpark, and Python on Google Cloud. This role focuses on building reliable, cloud-native data products that enable advanced reporting, analytics, and decision-making across the organization.

Job Responsibilities:

  • Build scalable data pipelines: Design and deliver batch and real-time ETL/ELT pipelines across cloud environments to support analytics and reporting
  • Develop SQL and BigQuery solutions: Write and optimize advanced SQL transformations and build performant, cost‑efficient BigQuery data models
  • Develop Python workflows: Implement scalable data processing solutions using Python and PySpark, ensuring maintainable and high‑quality code
  • Design data models and ensure quality: Build robust data models and apply validation practices to maintain accuracy and reliability
  • Build cloud‑native data solutions: Use GCP services such as BigQuery, Dataflow, Cloud Composer, Pub/Sub, and GCS to build and operate modern data platforms
  • Optimize performance and reliability: Troubleshoot complex pipeline issues and continuously improve compute, storage, and processing performance
  • Collaborate using strong engineering practices: Work with engineering, analytics, and business teams while contributing to CI/CD, code reviews, and testing standards
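The pipeline and data-quality responsibilities above can be illustrated with a minimal, dependency-free sketch of a batch transform with a validation gate. The `Trade` record, `validate_rows`, and `daily_totals` names are hypothetical illustrations, not part of the role or any NTT DATA codebase; in practice this logic would live in PySpark or BigQuery SQL.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Trade:
    """Hypothetical input record for the sketch below."""
    trade_id: str
    amount: float
    booked_on: date

def validate_rows(rows):
    """Split rows into valid and rejected sets: a simple data-quality
    gate of the kind applied before loading a curated table."""
    valid, rejected = [], []
    for r in rows:
        if r.trade_id and r.amount >= 0:
            valid.append(r)
        else:
            rejected.append(r)
    return valid, rejected

def daily_totals(rows):
    """Aggregate amounts per booking date, as a batch ELT transform
    (a GROUP BY in BigQuery) might."""
    totals = {}
    for r in rows:
        totals[r.booked_on] = totals.get(r.booked_on, 0.0) + r.amount
    return totals
```

Rejected rows would typically be routed to a quarantine table and surfaced through data-quality KPIs rather than silently dropped.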

Requirements:

  • University degree in computer science or a comparable qualification
  • At least 5 years of experience as a Data Engineer, building scalable data pipelines and working with cloud-based data ecosystems
  • Strong expertise in SQL and hands‑on experience building performant datasets in BigQuery (or similar cloud data warehouses)
  • Proven experience with Python and PySpark for scalable data processing in distributed environments
  • Solid understanding of data modeling, ELT/ETL patterns, and data quality best practices
  • Experience with Google Cloud Platform, particularly BigQuery, Dataflow, Cloud Composer, GCS, or equivalent cloud data services
  • Hands‑on experience building scalable data pipelines (batch and near real‑time) in a cloud‑native environment
  • Proficiency with version control, CI/CD pipelines, and automated testing frameworks
  • Ability to troubleshoot and optimize performance across compute, storage, and processing layers

Nice to have:

  • Experience with Infrastructure as Code (Terraform, Ansible, Chef)
  • Knowledge of shell scripting
  • Experience in financial services or regulated environments

What we offer:
  • Smooth integration and a supportive mentor
  • Pick your working style: choose from Remote, Hybrid or Office work opportunities
  • Different working hours to suit your needs
  • Sponsored certifications, training, and top e-learning platforms
  • Private Health Insurance custom-made for you
  • Individual coaching sessions
  • Accredited Coaching School
  • Parties or themed events

Additional Information:

Job Posted:
May 14, 2026

Employment Type:
Full-time

Work Type:
Remote work

Similar Jobs for Senior BigQuery Engineer

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85000.00 - 150000.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (preferred)
  • 2+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow (required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar
  • Excellent organizational, prioritization and analytical abilities
  • Proven experience with incremental execution through successful launches
Job Responsibilities:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Full-time

Senior Data Engineer

Senior Data Engineer role in Data & Analytics, Group Digital to build trusted da...
Location:
Spain, Madrid
Salary:
Not provided
IKEA
Expiration Date:
Until further notice
Requirements:
  • 5+ years of hands-on experience building production data systems
  • Experience designing and operating batch and streaming pipelines on cloud platforms (GCP preferred)
  • Proficiency with tools like BigQuery, Dataflow/Beam, Pub/Sub (or Kafka), Cloud Composer/Airflow, and dbt
  • Fluent in SQL and production-grade Python/Scala for data processing and orchestration
  • Understanding of data modeling (star/snowflake, vault), partitioning, clustering, and performance at TB-PB scale
  • Experience turning ambiguous data needs into robust, observable data products with clear SLAs
  • Comfort with messy external data and geospatial datasets
  • Experience partnering with Data Scientists to productionize features, models, and feature stores
  • Ability to automate processes, codify standards, and champion governance and privacy by design (GDPR, PII handling, access controls)
Job Responsibilities:
  • Build Expansion360, the expansion data platform
  • Architect and operate data pipelines on GCP to ingest and harmonize internal and external data
  • Define canonical models, shared schemas, and data contracts as single source of truth
  • Enable interactive maps and location analytics through geospatial processing at scale
  • Deliver curated marts and APIs that power scenario planning and product features
  • Implement CI/CD for data, observability, access policies, and cost controls
  • Contribute to shared libraries, templates, and infrastructure-as-code
What we offer:
  • Intellectually stimulating, diverse, and open atmosphere
  • Collaboration with world-class peers across Data & Analytics, Product, and Engineering
  • Opportunity to create measurable, global impact
  • Modern tooling on Google Cloud Platform
  • Hardware and OS of your choice
  • Continuous learning (aim to spend ~20% of time on learning)
  • Flexible, friendly, values-led working environment
  • Full-time

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Location:
United States
Salary:
102000.00 - 125000.00 USD / Year
Wpromote
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least 1 cloud database (e.g. BigQuery, Snowflake, Redshift, etc.)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g. Airflow, Dagster, AWS Glue, Azure Data Factory)
Job Responsibilities:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio
What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance
  • Full-time

Senior Data Engineer - Catalog

Join the Catalog team and be part of Deezer’s next steps. We ingest, reconcile a...
Location:
France, Paris
Salary:
Not provided
Deezer
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in Data Engineering
  • Familiarity with data quality frameworks and best practices
  • Solid experience with Python, SQL, BigQuery, Spark/Scala, ETL
  • Experience with entity matching, data reconciliation or data science applied to quality problems is a strong plus
  • Proactive, organised and with strong team spirit
  • Passion for music and improving user experience through better metadata
  • Fluent in English
Job Responsibilities:
  • Design and maintain ETL pipelines to ingest and process large volumes of catalog data from disparate sources
  • Ingest and match entities like artists, albums, concerts and lyrics
  • Develop solutions to improve metadata quality at scale
  • Contribute to projects such as artist disambiguation and source-of-truth building, and album grouping via ML models in collaboration with Data Scientists
  • Build analytics and quality KPIs to monitor catalog performance
  • Collaborate closely with Data Scientists, Researchers, Product Managers and partners to implement new solutions
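The entity-matching and reconciliation work described above can be sketched with a minimal, dependency-free example using normalized string similarity. The `match_artists` helper and the 0.85 threshold are illustrative assumptions, not Deezer's implementation, which would operate on far richer metadata.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]: a simple stand-in for
    the fuzzy matching used when reconciling catalog entities."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_artists(incoming, canonical, threshold=0.85):
    """Map each incoming artist name to its best canonical match above
    the threshold; unmatched names are returned for manual review."""
    matched, unmatched = {}, []
    for name in incoming:
        best = max(canonical, key=lambda c: similarity(name, c))
        if similarity(name, best) >= threshold:
            matched[name] = best
        else:
            unmatched.append(name)
    return matched, unmatched
```

Routing low-confidence names to a review queue rather than forcing a match is what keeps a source-of-truth catalog trustworthy.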
What we offer:
  • A Deezer premium family account for free
  • Access to gym classes
  • Join over 70 Deezer Communities
  • Deezer parties several times a year and drinks every Thursday
  • Allowance for sports, travelling and culture
  • Meal vouchers
  • Mental health and well-being support from Moka.Care
  • Great offices
  • Hybrid remote work policy
  • Full-time

Senior Data Engineer

We’re hiring a Senior Data Engineer to build and own critical components of our ...
Location:
Germany, Berlin
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibilities:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects that you find meaningful outside of work - and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Professional massage at the office
  • Health and fitness benefits through Urban Sport Club membership
  • Full-time

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Finland, Helsinki
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibilities:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Extensive Occupational Health Care, Dental Care, as well as sports, culture, massage and lunch benefits
  • Regular office breakfast

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Ireland, Cork
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL
  • Familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibilities:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • The Plankton program recognizes extra contributions to the open source ecosystem, for developers and non-developers alike
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Private medical & dental health insurance
  • Childbirth cash benefit
  • Full-time

Senior Analytics Engineer

We’re looking for a Senior Analytics Engineer to elevate the quality, reliabilit...
Location:
Spain, Barcelona
Salary:
Not provided
Heetch
Expiration Date:
Until further notice
Requirements:
  • Deep experience with SQL, dataform/dbt, and cloud data warehouses (BigQuery, Redshift)
  • Skilled at designing semantic layers, dimensional models, and performant ELT pipelines
  • Understand how to build datasets that serve many stakeholders with different needs, while keeping them consistent and high-quality
  • Can balance long-term architecture with pragmatic, incremental delivery
  • Care deeply about data quality, observability, and operational excellence
  • Communicate clearly and work collaboratively with technical and non-technical teams
  • Have 5+ years of experience in analytics engineering, data engineering, BI engineering, or a similar role
Job Responsibilities:
  • Design, build, and maintain high-quality, scalable data models that power analytics, experimentation, and product decision-making
  • Define standards and best practices for data modelling, documentation, testing, versioning, and CI/CD across the analytics engineering function
  • Collaborate with Product, Engineering, and Data Science to understand business needs and translate them into reliable data structures and pipelines
  • Own and improve the semantic layer (dataform models, metrics definitions, data marts), ensuring consistency across teams and markets
  • Drive data quality initiatives: monitoring, alerting, observability, lineage, and automated testing
  • Partner with Analysts and Data Scientists to enable faster, more reliable insight generation
  • Improve performance and cost-efficiency of our data warehouse, queries, and pipelines
  • Mentor data analysts and scientists on modelling, SQL craftsmanship, and best practices
  • Contribute to the evolution of Heetch’s data platform, advocating for tools, processes, and architectures that improve reliability and developer experience
  • Full-time
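The data-quality and automated-testing responsibilities above can be illustrated with a minimal, dependency-free sketch of column checks in the style of dbt/dataform schema tests. The `not_null`/`unique` names mirror common dbt test conventions; the `run_checks` runner and sample data are hypothetical illustrations, not Heetch's tooling.

```python
def not_null(rows, column):
    """Return rows where the column is missing or None (failures)."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return rows whose column value occurs more than once (failures)."""
    seen, failures = {}, []
    for r in rows:
        seen.setdefault(r.get(column), []).append(r)
    for group in seen.values():
        if len(group) > 1:
            failures.extend(group)
    return failures

def run_checks(rows, checks):
    """Run (check, column) pairs and report failure counts: the kind of
    summary a monitoring/alerting layer might consume."""
    return {f"{check.__name__}({column})": len(check(rows, column))
            for check, column in checks}
```

In a warehouse setting these checks would run as SQL against each model on every pipeline run, with nonzero failure counts feeding alerts and lineage-aware dashboards.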