CrawlJobs

Senior Data Developer (Azure)


Coca-Cola HBC

Location: Egypt, Cairo

Contract Type: Not provided

Salary: Not provided

Job Description:

We are looking for a highly skilled and experienced Senior Azure Data Developer with strong technical skills in Azure and solid business acumen to join the BI Reporting team within Coca-Cola Hellenic Digital & Technology Platform Services (DTPS). You will contribute to the strategy, design, and development of our Enterprise and Operational Reporting, providing insights and supporting decision-making for end users. The role involves close collaboration with business and IT teams, integrating and transforming data from various source systems into structures suitable for building analytical solutions, and designing, developing, and implementing Azure-based solutions that connect certified and validated data sources to Power BI front-end visualizations, creating a single source of truth for reporting at CCH.

Job Responsibility:

  • Assist in the design, development, and maintenance of ETL (Extract, Transform, Load) processes using Azure Data Factory
  • Implement data ingestion processes from various data sources into Azure data storage solutions
  • Work with Azure SQL Database, Azure Data Lake Storage, and Hive & Databricks to store and manage large datasets (Terabytes of data)
  • Ensure data integrity and availability by implementing appropriate storage solutions
  • Integrate data from various sources, ensuring accuracy, completeness, and consistency
  • Collaborate with data scientists and analysts to understand data requirements and provide necessary support
  • Contribute to the strategy, design and development of our Enterprise and Operational Reporting, providing insights and supporting decision making for end users
  • Integrate & transform data from various data systems into structures that are suitable for building analytical solutions
  • Design, develop and implement Azure-based solutions connecting certified and validated data sources to Power BI front-end visualizations and creating a single source of truth for reporting

Requirements:

  • Bachelor's or Master’s degree in computer science, information technology, data science, or a related field (or an equivalent of 7+ years of practical experience in a tech-domain)
  • Hands-on experience with MS Power BI, MS PowerApps, MS Azure Data Factory, MS Azure Data Lake Store, SQL Database (T-SQL), DAX & Power Query M (3+ years)
  • Proven track record of implementing proof-of-concept (POC) backend solutions
  • Expert knowledge of cloud platforms (Azure, Google Cloud, AWS), preferably Microsoft Azure, with a focus on data warehousing & backend development
  • Strong programming skills in languages such as C#, .NET, Python, Scala or Java
  • Experience with big data architectures and large data volumes technologies (partitioned tables with billions of records, files with hundreds of GBs)
  • Familiarity with big data technologies such as Hadoop, Spark, and Hive, as well as strong knowledge of data manipulation frameworks such as PySpark
  • Strong analytical & problem-solving abilities, as well as good communication skills
  • Strong willingness to learn and adapt to new technologies and tools
  • Good proficiency in English as a day-to-day business language is a must.

Nice to have:

  • Experience with big data architectures and large-data-volume technologies (partitioned tables with billions of records, files with hundreds of GBs) is a plus
  • Initial proven leadership capabilities with junior employees and a proactive approach to project management are a plus
  • Strong hands-on understanding of data modelling concepts and techniques as well as DevSecOps concepts and tools and practical experience in their implementation
  • Strong understanding of DevOps principles and experience with CI/CD pipelines
  • Working experience within the SAFe Agile framework is appreciated
  • Knowledge of SAP BW (extractors, transformations, queries, data export) and Azure Data Sources would be appreciated

What we offer:
  • Development opportunities
  • Equal opportunity employer
  • IT Equipment
  • Learning programs
  • Work with iconic brands
  • Supportive team
  • Volunteering Opportunities
  • Wellbeing program

Additional Information:

Job Posted: March 21, 2026

Work Type: Hybrid work


Similar Jobs for Senior Data Developer (Azure)

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location: Poland
Salary: Not provided
Lingaro
Expiration Date: Until further notice
Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience in Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in programming languages such as SQL, Python, PySpark is essential (R or Scala nice to have)
  • Very good level of communication including ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Senior Python Big Data Developer

The Applications Development Senior Programmer Analyst is an intermediate level ...
Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice
Requirements:
  • 7 - 12 years of relevant experience
  • Experience in systems analysis and programming of software applications
  • Experience in managing and implementing successful projects
  • Working knowledge of consulting/project management techniques/methods
  • Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
  • Bachelor’s degree/University degree or equivalent experience
  • Strong expertise in Big Data technologies (Spark, Hadoop, Hive, Impala, Kafka, Scala, Cloudera)
  • Design, develop, and maintain robust and scalable data pipelines using Python, SQL, PySpark, and streaming technologies like Kafka
  • Strong SQL and NoSQL experience (Oracle, MongoDB, PostgreSQL) for data extraction, reconciliation, and transformation
  • Proficiency in Python and Shell scripting for data processing and automation
Job Responsibility:
  • Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas
  • Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users
  • Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgement
  • Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality
  • Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems
  • Ensure essential procedures are followed and help define operating standards and processes
  • Serve as advisor or coach to new or lower level analysts
  • Has the ability to operate with a limited level of direct supervision
  • Can exercise independence of judgement and autonomy
  • Acts as SME to senior stakeholders and/or other team members
  • Fulltime

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location: Netherlands, Leiden
Salary: Not provided
IKEA
Expiration Date: Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g., applying cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standards available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility into data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Fulltime

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location: India, Gurugram
Salary: Not provided
Circle K
Expiration Date: Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Develop efficient ETL/ELT solutions using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Proactive in stakeholder communication, mentor/guide junior resources
  • Fulltime

Senior Cloud Data Architect

As a Senior Cloud Architect, your role will focus on supporting users, collabora...
Location: Spain, Barcelona
Salary: Not provided
Allianz
Expiration Date: Until further notice
Requirements:
  • Strong expertise in Azure cloud infrastructure, Data & AI technologies, and data platform management, with proficiency in Azure Synapse Analytics, Azure Machine Learning, Azure Data Lake, and Informatica Intelligent Data Management Cloud (IDMC)
  • Proven experience in modern Data Warehouse architectures (e.g., Lakehouse) and integrating machine learning models and AI capabilities using Azure services like Cognitive Services and Azure Bot Service for predictive analytics and automation
  • In-depth knowledge of data security and compliance practices using Azure AD, Azure Key Vault, and Informatica’s data governance tools, focusing on data privacy and regulatory standards
  • Expertise in optimizing resource usage, performance, and costs across Azure services and IDMC, leveraging tools like Azure Cost Management and Azure Monitor, and skilled in ETL/ELT tools and advanced SQL
  • Proficiency in data integration, machine learning, and generative AI from an architectural perspective, with hands-on experience in Python, SQL, Spark/Scala/PySpark, and container solutions like Docker and Kubernetes
  • Experience with CI/CD pipelines (e.g., GitHub Actions, Jenkins), microservices architectures, and APIs, with knowledge of architecture frameworks like TOGAF or Zachman, adept at managing multiple priorities in fast-paced environments, and excellent communication and presentation skills
  • Over 5 years of experience in cloud architecture focusing on Data & AI infrastructure, particularly in Azure, with expertise in building scalable, secure, and cost-effective solutions for data analytics and AI/ML environments.
Job Responsibility:
  • Define and prioritize new functional and non-functional capabilities for the cloud-based data platform, ensuring alignment with business needs and Allianz's security, compliance, privacy, and architecture standards
  • Act as the platform SME for both potential and existing users, guiding them in the architecture of scalable, high-performance Data & AI solutions
  • Provide leadership and product guidance to engineering teams during the design, development, and implementation of new platform capabilities
  • Ensure all solutions meet defined quality standards and acceptance criteria
  • Work with stakeholders to co-create data solutions, optimizing business models and identifying opportunities for improved data usage
  • Lead the evaluation and selection of technologies and partners to implement data analytics use cases, focusing on proofs of concept and prototypes
  • Stay up to date with emerging trends in Data, Analytics, AI/ML, and cloud technologies
  • Leverage open-source technologies and cloud tools to drive innovation and cost-efficiency
  • Prepare materials for management briefings and public events
  • Represent the team in technical discussions, particularly regarding architecture and platform capabilities.
What we offer:
  • Hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working incl. up to 25 days per year working from abroad
  • Rewarding performance through company bonus scheme, pension, employee shares program, and multiple employee discounts
  • Career development and digital learning programs to international career mobility
  • Flexible working, health and wellbeing offers (including healthcare and parental leave benefits)
  • Support for balancing family and career and helping employees return from career breaks with experience that nothing else can teach.
  • Fulltime

Senior Data Engineer

We are looking for a Senior Data Engineer with a collaborative, “can-do” attitud...
Location: India, Gurugram
Salary: Not provided
Circle K
Expiration Date: Until further notice
Requirements:
  • Bachelor’s Degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
  • Strong analytical abilities and a strong intellectual curiosity
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
  • Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
  • Efficient in ETL/ELT development using Azure cloud services and Snowflake, Testing and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance)
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
  • Build cross-platform data strategy to aggregate multiple sources and process development datasets
  • Fulltime

Senior Data Engineer

Location: United States, Flowood
Salary: Not provided
PhasorSoft Group
Expiration Date: Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Fulltime

Senior Software Developer – Digital Solutions (Microsoft Power Platform & Azure)

We are expanding our Digitalization & IT team and are looking for a Senior Softw...
Location: Germany, Berlin
Salary: Not provided
ib vogt GmbH
Expiration Date: Until further notice
Requirements:
  • 5+ years of software development experience, including solution architecture or senior-level responsibilities
  • Strong expertise in Power Platform and Azure cloud services
  • Proven experience integrating enterprise systems and building APIs or microservices
  • Working knowledge of AI/ML technologies and their practical application
  • Strong communication skills and the ability to turn complex requirements into elegant solutions
Job Responsibility:
  • Lead development of enterprise applications using the Microsoft Power Platform (Power Apps, Power Automate, Power BI)
  • Architect and implement cloud solutions in Azure (Functions, Logic Apps, App Services, API Management)
  • Build a robust data integration layer connecting ERP, SCADA/telemetry, GIS, CMMS, BIM, and other core systems
  • Develop AI-powered tools using Co-Pilot, Azure AI, LLMs, and automation frameworks
  • Drive Digital Twin development for solar PV assets across project development, engineering, construction, and O&M
  • Act as a technical advisor to stakeholders, ensuring scalable, secure, and maintainable solutions
  • Mentor developers as well as Digital Champions and help establish best practices, governance, and development standards
What we offer:
  • Competitive remuneration and motivating benefits
  • Opportunity to shape the data foundation of ib vogt’s digital transformation journey
  • Work on cutting-edge data platforms supporting real-world renewable energy assets
  • A truly international working environment with colleagues from all over the world
  • An open-minded, collaborative, dynamic, and highly motivated team
  • Fulltime