Role: Data Scientist
Type: Contract (6 months)
Location: London, UK (fully remote)
Pay rate: 412-512 GBP/day on PAYE; 487-587 GBP/day on RUPAYE; 599-699 GBP/day Inside IR35 via Umbrella
The Role: Join our team to drive data quality and empower cross-functional decision-making by building robust pipelines and self-serve data products. You will be essential in defining metrics for platform health, developer productivity, and ML/AI adoption. Responsibilities and requirements are listed below.
This is an urgent vacancy; the hiring manager is shortlisting for interviews immediately. Please apply with a copy of your CV or send it to praveen.sukkala2@randstaddigital.com
Job Responsibilities:
Drive data quality and empower cross-functional decision-making by building robust pipelines and self-serve data products
Define metrics for platform health, developer productivity, and ML/AI adoption
Own end-to-end analytical data modeling in BigQuery using dbt
Build reliable data pipelines using SQL, Python, and distributed processing frameworks (Apache Spark, Scio, Beam, or Flink)
Develop clear, story-driven dashboards using tools like Looker or Tableau
Champion data quality, implement CI/CD for dbt models, and mentor junior team members
Requirements:
5+ years of experience in analytics or data engineering, with deep expertise in SQL
Extensive experience with dbt, a cloud data warehouse, and workflow orchestrators (Airflow, Dagster, Prefect, or Flyte)
Proficiency in Python for data analysis and automation
Nice to have:
Experience with experimentation, ML/AI metrics, or platform productivity