As a Product Expert for Data Control & Data Quality, you will play a key role in the design, build, configuration, monitoring and continuous improvement of Coca-Cola Hellenic’s Data Quality Management products, with a primary focus on the Data Control Center (DCC). You will work closely with the Data Quality & Standards Product Manager, business stakeholders, Business Unit Data Stewards and IT partners (DTPS) to ensure that data quality controls, monitoring and reporting enable reliable, trusted data across the organization. This is a hands-on data engineering role. It requires strong technical skills in SQL, PySpark, Azure-based data platforms and emerging AI and GenAI techniques, combined with full ownership of deliverables, including production support, and the flexibility to handle ad-hoc requests and parallel priorities while translating business requirements into scalable, automated and intelligent data quality solutions.
Job Responsibilities:
Lead the design and end-to-end ownership of Data Quality controls in the Data Control Center (DCC), from implementation through production support and continuous improvement
Translate business requirements into data quality rules, including logic definition, thresholds, exception handling and reduction of false positives
Explore the potential of AI solutions for advanced checks and/or smart remediation of data errors
Ensure DQ controls are aligned with agreed business definitions and standards
Monitor DQ performance and ensure stable DCC operations, including regular refresh cycles and system reliability
Troubleshoot failed checks, investigate data issues and perform root-cause analysis in collaboration with relevant stakeholders
Support incident resolution and contribute to continuous operational improvements
Implement improvements to existing DQ checks and contribute to the continuous enhancement of DCC capabilities
Identify opportunities to improve automation, intelligence, scalability and reusability of controls and monitoring processes
Identify opportunities to automate monitoring and reduce manual effort for stakeholders
Develop and maintain DQ dashboards and scorecards, enabling visibility of DQ KPIs, trends and drilldowns
Support stakeholders with insights and reporting to track performance and drive quality improvement actions
Support data standards operationalization by ensuring alignment between published standards and DQ checks
Support adoption and compliance visibility by enabling reporting and transparency on standards adherence
Work closely with the Product Manager and business stakeholders to clarify requirements, validate solutions and ensure high usability and adoption
Collaborate with DTPS and vendors to deliver end-to-end solutions, including integration dependencies and technical enablement
Run demos and workshops when needed to support rollout, stakeholder understanding and adoption
Support release and testing activities, including UAT support and regression testing
Maintain basic documentation of implemented DQ checks, logic, definitions and changes
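To illustrate the kind of data quality rule described above (logic definition, a threshold, exception handling and false-positive reduction), here is a minimal, self-contained Python sketch; all data, field names, exception lists and thresholds are hypothetical, and a production version would typically be implemented in SQL or PySpark against the DCC's data sources:

```python
# Minimal sketch of a completeness DQ check with a pass/fail threshold
# and an exception list to reduce false positives.
# All names and values below are hypothetical examples.

records = [
    {"id": 1, "vendor": "ACME", "tax_id": "DE123"},
    {"id": 2, "vendor": "Globex", "tax_id": None},      # genuine gap -> should fail
    {"id": 3, "vendor": "ONE-TIME", "tax_id": None},    # exempt vendor type
]

EXCEPTIONS = {"ONE-TIME"}   # known-valid cases excluded to avoid false positives
THRESHOLD = 0.95            # minimum acceptable completeness ratio

def completeness_check(rows, field, exceptions, threshold):
    """Return (passed, ratio, failing_rows) for a non-null check on `field`."""
    in_scope = [r for r in rows if r["vendor"] not in exceptions]
    failing = [r for r in in_scope if r[field] is None]
    ratio = 1 - len(failing) / len(in_scope) if in_scope else 1.0
    return ratio >= threshold, ratio, failing

ok, ratio, failing = completeness_check(records, "tax_id", EXCEPTIONS, THRESHOLD)
```

The same shape (scope filter, rule logic, threshold comparison, exception surfacing) generalizes to other rule types such as validity, uniqueness or referential checks.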
Requirements:
3+ years in data engineering / data science / analytics engineering / data quality engineering
Strong SQL and Python for scalable data validation and monitoring
Experience working with Databricks and modern data pipelines (ADF or similar)
Strong analytical and problem-solving mindset with attention to detail and quality
Ability to translate business requirements into technical implementation and measurable outcomes
Comfortable working across multiple stakeholders in a cross-functional environment
Strong communication skills and ability to explain technical topics to non-technical audiences
Ownership mindset: proactive, reliable and continuously improving solutions
SQL
Python
PySpark
Azure Databricks
Azure Data Factory
Power BI
Machine Learning
AI/GenAI
Nice to have:
Power Platform (Power Automate, etc.)
Microsoft Fabric
Copilot Studio
Experience working in a Data Mesh framework and/or domain data product mindset
Good awareness of key SAP domains (MDG, MM, S/4HANA)