The Data Engineer role involves designing and managing data solutions using cloud technologies and big data frameworks. Candidates should have a strong background in ETL processes, data storage solutions, and data analysis. A bachelor's degree in Computer Science or a related field is preferred, along with a minimum of 5 years of experience. Proficiency in Python and Azure Data Factory is essential.
Job Responsibilities:
Create and manage ETL processes using Azure Data Factory to automate data flow between various systems and services
Implement data storage solutions using Azure Data Lake and Cosmos DB, ensuring high availability and scalability
Design and optimize data processes in Databricks, implementing Spark jobs that utilize Delta Lake for efficient data handling and versioning
Build and maintain analytics models that can harness the capabilities of Delta Lake for better data quality and performance
Develop scripts in Python and PowerShell to automate operational tasks, deployments, and monitoring of data workflows
Collaborate with development teams to integrate CI/CD practices using Azure DevOps and Git for version control
Manage and optimize MS SQL Server databases, writing efficient T-SQL queries for data manipulation and reporting
Ensure data integrity and security through proper database management practices
Implement logging and monitoring solutions to proactively address any issues impacting data applications and workflows
Conduct root cause analysis on data-related problems and develop solutions to mitigate future occurrences
Collaborate with business intelligence teams to create and share reports using Power BI
Provide insights and recommendations based on data analysis to guide business decisions
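As a flavor of the Python automation these responsibilities describe, here is a minimal, hypothetical sketch of a data-quality gate that might run as one step in a monitored ETL workflow. All field names, rules, and thresholds are invented for illustration; a real pipeline in this role would typically orchestrate such checks through Azure Data Factory or Databricks rather than plain Python dictionaries.

```python
# Hypothetical sketch: validate extracted records before loading them
# downstream. Field names ("id", "amount") and rules are illustrative only.

def validate_records(records):
    """Split records into valid rows and rejects, recording a reason for each reject."""
    valid, rejects = [], []
    for row in records:
        if not row.get("id"):
            rejects.append((row, "missing id"))
        elif row.get("amount") is None or row["amount"] < 0:
            rejects.append((row, "invalid amount"))
        else:
            valid.append(row)
    return valid, rejects

# Example batch: one clean row, one missing id, one negative amount.
records = [
    {"id": "a1", "amount": 10.5},
    {"id": None, "amount": 3.0},
    {"id": "a2", "amount": -1.0},
]
valid, rejects = validate_records(records)
print(len(valid), len(rejects))  # → 1 2
```

In a production workflow the reject list would feed the logging and monitoring solutions mentioned above, so data issues surface before they reach reporting.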
Requirements:
Bachelor’s degree in Computer Science, Information Technology, or a related field is preferred
Minimum 5 years of experience in a similar role
Proficiency in Python for data processing and automation
Experience with PowerShell scripting and Azure Data Factory
Strong understanding of MS SQL Server and T-SQL for database scripting
Familiarity with Databricks, Delta Lake, Azure Data Lake, and Cosmos DB
Experience with version control using Git and CI/CD frameworks in Azure DevOps
Solid problem-solving skills and ability to troubleshoot complex data issues
Familiarity with Agile software development methodologies, including Scrum, and with DevOps practices
Experience creating dashboards and reports using Power BI
Knowledge of data governance and data security best practices
Excellent command of both spoken and written English
What we offer:
Smooth integration and a supportive mentor
Choice of remote, hybrid, or office work
Flexible working hours to suit your needs
Sponsored certifications, training courses, and top e-learning platforms
Private Health Insurance
Individual coaching sessions or an accredited coaching school