Conversational Analytics Engineer - GCP

Valtech

Location:
United Kingdom, London

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We’re looking for a Conversational Analytics Engineer with strong expertise across the Google Cloud Platform ecosystem to drive forward conversational analytics and AI solutions. The role blends hands-on engineering, AI/ML innovation, and client-facing problem solving. You’ll work across traditional data engineering and emerging LLM and agentic-AI patterns, helping design and deploy scalable, production-ready solutions.

Job Responsibility:

  • Design and build scalable data solutions using BigQuery and GCP-native tooling
  • Develop and deploy ML and AI solutions, including LLM-based and agentic systems
  • Work with tools such as Gemini to power conversational and analytical use cases
  • Build and integrate analytical agents into real-world applications
  • Implement evaluation frameworks (version-controlled via Git) to assess model and system performance
  • Orchestrate APIs and enable seamless integration between models, data pipelines, and applications
  • Apply modern MLOps and engineering best practices to deploy and maintain solutions in production
  • Define and assess the quality and timeliness of available data sources
  • Translate business challenges into scalable data and AI solutions
  • Communicate complex technical concepts clearly to non-technical stakeholders
  • Lead client conversations to shape data and analytics solutions, building trust and guiding them through technical and strategic decisions
  • Work closely with cross-functional teams (engineering, product, business)
  • Help shape opportunities where conversational analytics and AI can drive value
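One responsibility above, implementing Git-versioned evaluation frameworks for model and system performance, can be illustrated with a minimal sketch. Everything here is hypothetical: the questions, expected answers, and the `model_answer` stub are stand-ins, and a real system would call a hosted model such as Gemini and keep the case list in a Git-tracked file.

```python
def model_answer(question: str) -> str:
    # Placeholder "model": a real implementation would query an LLM
    # (e.g. Gemini) over BigQuery data; these answers are illustrative.
    canned = {
        "What was total revenue last quarter?": "12.4M GBP",
        "Which region grew fastest?": "EMEA",
    }
    return canned.get(question, "unknown")

def evaluate(cases: list[dict]) -> dict:
    """Score {"question", "expected"} eval cases by exact match."""
    results = []
    for case in cases:
        got = model_answer(case["question"])
        results.append({**case, "got": got, "pass": got == case["expected"]})
    passed = sum(r["pass"] for r in results)
    return {"passed": passed, "total": len(results), "results": results}

# In practice this list would live in a version-controlled file,
# so every change to the eval set is reviewable in Git.
cases = [
    {"question": "What was total revenue last quarter?", "expected": "12.4M GBP"},
    {"question": "Which region grew fastest?", "expected": "EMEA"},
    {"question": "How many active users?", "expected": "1.2M"},
]

report = evaluate(cases)
print(f"{report['passed']}/{report['total']} passed")  # prints: 2/3 passed
```

Pass rates from a harness like this can be tracked per commit, so regressions in the conversational layer surface during review rather than in production.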

Requirements:

  • Strong experience as a Data Engineer / Tech Lead within GCP
  • Deep expertise in BigQuery and cloud-based data architectures
  • Hands-on experience building and integrating ML / AI models into applications
  • Experience with LLMs, RAG, embeddings, or agentic AI systems
  • Strong programming skills (Python, SQL)
  • Experience with API orchestration and system integration
  • Familiarity with evaluation pipelines and version control (Git)
  • Strong communicator who can bridge technical and business teams
  • Curious, pragmatic problem solver with a focus on impact
  • Comfortable working in ambiguity and shaping new AI-driven solutions
  • Collaborative mindset with the ability to guide and influence others

Nice to have:

  • Experience with Gemini, ADK, or similar AI frameworks
  • Exposure to CDPs, analytics tooling, or customer data ecosystems
  • Experience with MLOps and productionising AI solutions

What we offer:
  • Flexibility, with remote and hybrid work options (country-dependent)
  • Career advancement, with international mobility and professional development programs
  • Learning and development, with access to cutting-edge tools, training and industry experts

Additional Information:

Job Posted:
May 05, 2026

Employment Type:
Fulltime

Work Type:
Hybrid work

Similar Jobs for Conversational Analytics Engineer - GCP

Analytics Engineer

As an analytics engineer, you’ll be an integral part of our data analysis and QA...
Location:
Canada, Vancouver
Salary:
106300.00 - 134350.00 CAD / Year
Dialpad
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or related fields
  • 1 - 3 years of working experience with software engineering or data engineering projects
  • 1 - 3 years of experience with Python, working with GCP, including storage, BigQuery, Compute, Kubernetes, or similar
  • 1 - 3 years of experience with SQL, able to optimize complex SQL queries and build data pipelines
  • Experience with BI tools such as Tableau
  • Experience analyzing the performance of conversational AI systems (e.g., voice bots, chatbots) and collaborating with cross-functional teams (AI, Product) on data-driven improvements
  • Experience working with popular LLM frameworks
  • Strong problem-solving and analytical abilities, with the capacity to handle complex technical and analytical problems
Job Responsibility:
  • Work closely with other Agentic AI teams to test and evaluate our Agentic voice and chat solutions
  • Design and implement data logging and monitoring strategies to capture key analytical insights
  • Utilize data analysis to identify areas for optimization in the voice bot's conversational flows and data pipeline
  • Build and maintain data pipelines on Vertex AI pipelines
  • Work closely with the AI Platform team to build tooling for data science projects
  • Implement automation and processes to improve our workflow
  • Create and maintain dashboards (Tableau) and data pipelines (Dataform) that help drive product and business decisions
  • Contribute to our continuous efforts to enforce data privacy and compliance
  • Collaborate with cross-functional teams, including AI, engineering, and product teams
What we offer:
  • Competitive benefits and perks
  • Robust training program
  • Inclusive office environment
  • Great Place to Work certified culture
Employment Type:
Fulltime

Analytics Engineer

We are seeking an experienced and versatile Analytics Engineer to join our dynam...
Location:
Canada
Salary:
81990.00 - 91100.00 CAD / Year
Tucows
Expiration Date
Until further notice
Requirements:
  • 2+ years of experience in data analytics or a related field, with significant exposure to AI and Machine Learning applications in analytics
  • Advanced SQL skills with experience in writing and optimizing complex queries on large-scale datasets
  • Hands-on experience with dbt (Data Build Tool) and its features for building, testing, and documenting data models
  • Expert-level knowledge of data modeling and data warehouse concepts (e.g., star schema, normalization, slowly changing dimensions)
  • Experience with Snowflake's Data Cloud platform and familiarity with its advanced AI capabilities (Snowflake Intelligence – Cortex Analyst, Cortex Agents, Cortex Search, AISQL, etc.) is highly preferred
  • Strong skills in Looker data visualization and LookML (including familiarity with Looker's conversational AI and data agent capabilities) or similar BI tools
  • Experience with AI agents or generative AI tools to optimize workflows and service delivery (such as creating chatbots or automated analytic assistants) is a plus
  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis, Spark Streaming) for handling continuous data flows
  • Proficient in Python for data analysis and manipulation (pandas, NumPy, etc.), with the ability to write clean, efficient code. Experienced with shell scripting and command-line tools for automating workflows and data processing tasks
  • Familiarity with ETL processes and workflow orchestration tools like Apache Airflow (or similar scheduling tools) for automating data pipelines alongside Docker for local development and testing
Job Responsibility:
  • Design, develop, and maintain complex data models in our Snowflake data warehouse. Utilize dbt (Data Build Tool) to create efficient data pipelines and transformations for our data platform
  • Leverage Snowflake Intelligence features (e.g., Cortex Analyst, Cortex Agents, Cortex Search, AISQL) to implement conversational data queries and AI-driven insights directly within our data environment. Develop AI solutions that harness these capabilities to extract valuable business insights
  • Design and build advanced SQL queries to retrieve and manipulate complex data sets. Dive deep into large datasets to uncover patterns, trends, and opportunities that inform strategic decision-making
  • Develop, maintain, and optimize Looker dashboards and LookML to effectively communicate data insights. Leverage Looker's conversational analytics and data agent features to enable stakeholders to interact with data using natural language queries
  • Communicate effectively with stakeholders to understand business requirements and deliver data-driven solutions. Identify opportunities for implementing AI/ML/NLP technologies in collaboration with product, engineering, and business teams
  • Write efficient Python code for data analysis, data processing, and automation of recurring tasks. Skilled in shell scripting and command-line tools to support data workflows and system tasks. Ensure code is well-tested and integrated into automated workflows (e.g., via Airflow job scheduling)
  • Create compelling visualizations and presentations to deliver analytical insights and actionable recommendations to senior management and cross-functional teams. Tailor communication of complex analyses to diverse audiences
  • Stay up-to-date with industry trends, emerging tools, and best practices in data engineering and analytics (with a focus on dbt features, Snowflake's latest offerings and BI innovations). Develop and implement innovative ideas to continuously improve our analytics stack and practices
What we offer:
  • Fair compensation
  • Generous benefits

Employment Type:
Fulltime

BI Manager

Groupon is a marketplace where customers discover new experiences and services e...
Location:
Salary:
Not provided
Groupon
Expiration Date
Until further notice
Requirements:
  • Leadership: 2+ years managing or mentoring a data team (sprint planning, code reviews, career development)
  • Technical Fluency: Advanced SQL is mandatory
  • You can read/debug Python or Airflow DAGs
  • You understand data modeling (Star Schema, Data Marts) and cloud warehouses (BigQuery)
  • Business Acumen: You can explain 'Revenue Recognition' or 'Conversion Funnels' to an engineer, and 'ETL Latency' to a Sales Director
  • Data analytics: Expert-level proficiency in analytics and data visualisation
  • You know how to design for usability, not just aesthetics
  • AI-first mindset: be a pioneer and lead by example in AI usage, both for personal effectiveness (MCPs, n8n, Cursor, or Claude Code) and for AI-first solutions (text mining, optimisation, innovative solutions)
Job Responsibility:
  • Squad Leadership: Manage the backlog, capacity, and development of ~5 data professionals
  • Translate vague business problems into technical specs
  • Data Product Ownership: Own the end-to-end lifecycle of your domain’s data—from ingestion design (reviewing Engineering plans) to final visualization (Tableau)
  • Stakeholder Management: Act as the single point of truth for your domain
  • Negotiate priorities with Business Leaders to ensure the team works on high-ROI tasks
  • Modernization: Be part of the data platform unification project to decommission legacy systems running on Teradata and Hive and move fully to GCP-native services
  • Quality & Governance: Enforce best practices on data governance and documentation, ensure metric consistency
  • If the data breaks, you lead the fix

Senior Data Engineer

Adswerve is looking for a Senior Data Engineer to join our Adobe Services team. ...
Location:
United States
Salary:
130000.00 - 155000.00 USD / Year
Adswerve, Inc.
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field (or equivalent experience)
  • 5+ years of experience in a data engineering, analytics, or marketing technology role
  • Hands-on expertise in Adobe Experience Platform (AEP), Real-Time CDP, Journey Optimizer, or similar tools is a big plus
  • Strong proficiency in SQL and hands-on experience with data transformation and modeling
  • Understanding of ETL/ELT workflows (e.g., dbt, Fivetran, Airflow, etc.) and cloud data platforms (e.g., GCP, Snowflake, AWS, Azure)
  • Experience with ingress/egress patterns and interacting with APIs to move data
  • Experience with Python, or JavaScript in a data or scripting context
  • Experience with customer data platforms (CDPs), event-based tracking, or customer identity management
  • Understanding of Adobe Experience Cloud integrations (e.g., Adobe Analytics, Target, Campaign) is a plus
  • Strong communication skills with the ability to lead technical conversations and present to both technical and non-technical audiences
Job Responsibility:
  • Lead the end-to-end architecture of data ingestion and transformation in Adobe Experience Platform (AEP) using Adobe Data Collection (Tags), Experience Data Model (XDM), and source connectors
  • Design and optimize data models, identity graphs, and segmentation strategies within Real-Time CDP to enable personalized customer experiences
  • Implement schema mapping, identity resolution, and data governance strategies
  • Collaborate with Data Architects to build scalable, reliable data pipelines across multiple systems
  • Conduct data quality assessments and support QA for new source integrations and activations
  • Write and maintain internal documentation and knowledge bases on AEP best practices and data workflows
  • Simplify complex technical concepts and educate team members and clients in a clear, approachable way
  • Contribute to internal knowledge sharing and mentor junior engineers in best practices around data modeling, pipeline development, and Adobe platform capabilities
  • Stay current on the latest Adobe Experience Platform features and data engineering trends to inform client strategies
What we offer:
  • Medical, dental and vision available for employees
  • Paid time off including vacation, sick leave & company holidays
  • Paid volunteer time
  • Flexible working hours
  • Summer Fridays
  • “Work From Home Light” days between Christmas and New Year’s Day
  • 401(k) Plan with 5% company match and no vesting period
  • Employer Paid Parental Leave
  • Health-care Spending Accounts
  • Dependent-care Spending Accounts
Employment Type:
Fulltime

Product Manager - AI/ML

We are seeking an experienced Technical Product Manager – AI/ML to lead the defi...
Location:
India, Bengaluru
Salary:
Not provided
EvoluteIQ
Expiration Date
Until further notice
Requirements:
  • 15+ years of overall experience
  • Several years in technical/engineering roles (software development, data engineering, or AI/ML)
  • 7+ years in product management
  • Strong understanding of Predictive AI/ML (classification, regression, anomaly detection)
  • Expertise in Natural Language Processing (LLMs, embeddings, conversational AI, text analytics)
  • Experience with Time Series Modeling (forecasting, demand planning, anomaly detection)
  • Knowledge of Generative AI (LLM-based copilots, text-to-X products, prompt engineering, RAG pipelines)
  • Hands-on familiarity with Python/Java, ML frameworks (TensorFlow, PyTorch), and cloud services (AWS, Azure, GCP)
  • Proven track record in building enterprise SaaS products and leading technical product discussions
  • Excellent communication and stakeholder management skills
Job Responsibility:
  • Define and own the product roadmap for AI/ML features across predictive, NLP, time series, and generative AI domains
  • Align AI product strategy with overall platform vision, market trends, and customer needs
  • Identify opportunities for embedding AI capabilities into low-code/no-code workflows
  • Participate in technical design reviews with engineering and data science teams
  • Define API contracts, integration patterns, and deployment considerations for AI/ML features
  • Ensure product features are technically feasible, scalable, and aligned with enterprise architecture principles
  • Act as a bridge between technical and non-technical stakeholders, ensuring clarity of vision and execution
  • Collaborate with engineering, data science, and design teams to deliver scalable AI features
  • Define clear product specifications, use cases, and success metrics
  • Ensure compliance with security, data governance, and responsible AI principles
What we offer:
  • Opportunity to shape the strategy of a next-gen hyper-automation platform
  • Work with a cross-disciplinary team in a fast-growing, innovation-driven environment
  • Competitive compensation and growth opportunities
  • A culture of innovation, ownership, and continuous learning
Employment Type:
Fulltime

Cloud Solutions Architect – Data Platforms

Tier4 Group is seeking a high caliber Data Architect to join our team and play a...
Location:
United States, Alpharetta
Salary:
Not provided
Tier4 Group
Expiration Date
Until further notice
Requirements:
  • Deep hands-on knowledge of modern data platforms such as Snowflake, Databricks, Azure Synapse, BigQuery, or similar technologies
  • Strong experience with ETL/ELT frameworks, data modeling, data integration, and data governance methodologies
  • Familiarity with BI and visualization tools such as Power BI, Tableau, or Qlik, along with foundational advanced analytics concepts
  • Exceptional communication and storytelling abilities, capable of engaging both technical and executive audiences
  • Proven experience developing solution proposals, architecture diagrams, and cost models
  • Strong consulting mindset with the ability to balance technical depth and business outcomes
  • 8+ years of experience in data engineering, analytics, or related disciplines
  • 3+ years in presales, solution architecture, or consulting roles
  • Hands-on exposure to cloud ecosystems (AWS, Azure, GCP) and hybrid data architectures
  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field
Job Responsibility:
  • Engage with clients to understand business objectives and technical requirements, translating them into scalable, secure, and future-ready data architectures
  • Design end-to-end solutions leveraging modern data platforms such as Snowflake, Azure, AWS, and GCP
  • Develop high-level architecture diagrams, reference architectures, and solution blueprints covering data ingestion, integration, analytics, and governance
  • Partner with sales teams to qualify opportunities, define solution scope, and craft winning technical strategies
  • Lead client presentations, demos, and Proofs of Concept (POCs) that clearly articulate value and differentiation
  • Author and contribute to RFP/RFI responses, including architecture narratives, implementation approaches, and cost estimates
  • Stay at the forefront of emerging trends in data engineering, cloud platforms, and AI-driven analytics
  • Provide strategic guidance on data modernization, cloud migration, analytics enablement, and data governance best practices
  • Position the organization as a trusted partner through insight-driven conversations and technical credibility
  • Work closely with delivery teams to ensure a seamless transition from presales to implementation, maintaining architectural integrity and client vision

Employment Type:
Fulltime

Staff Engineer - MarTech (Adobe Experience Platform)

We are looking for a technical leader to shape, scale, and continuously evolve o...
Location:
Salary:
Not provided
Admiral Group Plc
Expiration Date
Until further notice
Requirements:
  • Proven ability to lead through influence, shaping direction, coaching engineers, and strengthening capability across cross‑functional teams
  • Strong communication and stakeholder engagement skills, confident operating with senior leaders in both technical and non‑technical areas
  • Ability to balance long‑term technical strategy with near‑term delivery needs
  • Passion for continuous improvement, experimentation, and innovation
  • Deep hands‑on knowledge of Adobe Experience Cloud, including: AEP Real Time CDP covering data modelling, identity, profile governance, sources and destinations, and segment design
  • Customer Journey Analytics covering workspace design, connections, and cross‑channel analytics
  • Adobe Target covering experimentation frameworks, personalisation strategies, guardrails, and performance optimisation
  • AEM Sites, EDS and Assets covering component architecture, CF and XF patterns, metadata, rights management, and asset workflow
  • Workfront covering marketing workflow design, templates, approvals, and capacity management
  • Broad grounding in engineering including full‑stack fundamentals, distributed systems, API design, cloud platforms (GCP), containers, Kubernetes, and CI/CD
Job Responsibility:
  • Optimise our Real Time Customer Profile foundation so audience segmentation, enrichment, and activation become more relevant and effective
  • Implement and scale cross‑channel customer journeys using Adobe Journey Optimizer, improving time to launch and supporting stronger conversion, retention, and satisfaction
  • Advance Customer Journey Analytics to move away from fragmented reporting and towards unified, self‑serve insights that help Marketing and Digital teams make better decisions
  • Establish strong standards for experimentation and personalisation using Adobe Target, enabling more testing, safer execution, and higher return on investment
  • Enhance how digital content is produced, managed, and deployed. This includes improving campaign speed, reuse, compliance, and consistency across channels
  • Embed and optimise Workfront as the backbone of marketing workflow from brief to delivery, improving visibility, throughput, and alignment across teams
  • Own the MarTech technical roadmap across AEP, ensuring it aligns with business goals and marketing priorities. Define long‑term architectural direction, integrations, and investment choices across Adobe Experience Cloud. Partner with Marketing leadership to translate customer and commercial goals into use cases and platform capabilities
  • Architect and govern AEP implementations including XDM modelling, identity resolution, ingestion patterns, activation, destination architecture, and platform governance. Define standards for segment design, profile policies, real‑time data flows, and edge decisioning
  • Define triggers, schemas, guardrails, and operational patterns for orchestrating scalable cross‑channel journeys. Ensure journeys are testable, observable, and measurable
  • Design and maintain analytics connections, data views, and workspaces. Lead the transition from legacy reporting platforms to CJA with a focus on accuracy, stability, and adoption
What we offer:
  • Eligible for up to £3,600 of free shares each year after one year of service
  • 33 days holiday (including bank holidays) when they join us, increasing the longer you stay with us, up to a maximum of 38 days (including bank holidays)
  • Option to buy or sell up to an additional five days of annual leave
  • Financial & Mortgage Advice
  • 24-Hour Ecare
  • Cycle to Work Scheme
  • Annual Holiday Allowance
  • Flexible Working
  • Simply Health
  • Private Health Cover
Employment Type:
Fulltime

AI & Analytics Workload Specialist

The HPE Worldwide Hybrid Cloud Acceleration Team is seeking a technically skille...
Location:
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements:
  • 2 - 4 years of hands-on experience in building or validating AI and Analytics solutions, with a focus on real-world enterprise use cases
  • Proven experience developing or deploying AI-powered applications such as Retrieval-Augmented Generation (RAG) systems, conversational AI/chatbot solutions, and ML model pipelines for analytics or inference
  • Strong proficiency in Python and familiarity with common AI/ML frameworks (e.g., LangChain, Hugging Face, PyTorch, TensorFlow, OpenAI APIs)
  • Hands-on experience with data manipulation, embedding/vector databases (e.g., FAISS, Chroma, Weaviate), and prompt engineering
  • Experience with virtualization platforms (e.g., VMware, KVM) and containers (e.g., Docker, Kubernetes) is a plus
  • Familiarity with deploying AI workloads in cloud environments (e.g., Azure, AWS, or GCP), particularly using GPU-accelerated instances, is a plus
  • Strong written and verbal communication skills, with the ability to explain complex technical ideas clearly
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
Job Responsibility:
  • Apply technical expertise to design and validate AI/Analytics workload solutions
  • Contribute technical assets such as demos, white papers, videos, blogs, labs, and internal enablement materials
  • Collaborate with stakeholders to define the scope and align solutions with business needs
  • Leverage internal infrastructure and AI-powered tools to accelerate development and validation
  • Create compelling enablement content for the field, partners, and customers
  • Support solution adoption through TekTalks, webinars, Slack forums, and internal events
  • Integrate assets with field tools and help measure asset utilization
  • Invest in technical and personal growth through internal training, certifications, mentorships, and participation in HPE’s Technical Career Path (TCP)
  • Contribute to team goals, mentor peers, and participate in a collaborative engineering environment
What we offer:
  • Health and wellbeing benefits
  • Personal and professional development programs
  • Inclusive workplace environment

Employment Type:
Fulltime