Randstad Digital is hiring a Data Engineer for direct integration into a company located in Porto. Hybrid work model (3 days on-site, 2 days remote per week).
Job Responsibilities:
Participate in the development and implementation of scalable data engineering solutions across the Medallion architecture (Bronze, Silver, Gold)
Design and maintain efficient data pipelines and integration processes using Azure Data Factory, Synapse, and Azure Data Lake Storage
Develop, test, and deploy SQL and PySpark ETL/ELT workflows ensuring data quality, consistency, and performance
Collaborate with data architects, data scientists, analysts, and business stakeholders to define data requirements and deliver high-quality solutions aligned with governance and modeling standards
Monitor, troubleshoot, and optimize data pipelines and architectures for scalability, reliability, and efficiency
Ensure compliance with data governance, security, and regulatory standards throughout the data lifecycle
Maintain clear technical documentation and promote reusable frameworks, automation, and best practices
Mentor junior engineers and foster a culture of technical excellence, collaboration, and continuous improvement
Requirements:
2+ years of professional experience in data engineering with proven expertise in Azure technologies, including at least 2 years in a leadership or senior technical role
Strong track record in designing, implementing, and managing complex data pipelines, ETL/ELT processes, and integration solutions
Advanced knowledge of data modeling, dimensional design, and data warehousing using SQL, NoSQL, and columnar databases
Hands-on experience with Azure Data Factory, Synapse (dedicated and serverless), and Azure Data Lake Storage
Proficiency in Python and PySpark for scalable data processing
Expert-level SQL skills for complex transformations, optimization, and performance tuning
Familiarity with Agile and DevOps methodologies, including CI/CD implementation using Azure DevOps or GitHub Actions
Working knowledge of data visualization tools such as Power BI, Tableau, or Looker
Solid understanding of data governance, data quality, and security best practices (RBAC, encryption, GDPR)
Strong leadership and communication skills, with proven ability to mentor engineers and collaborate effectively with cross-functional teams
Proficiency in English is essential
Nice to have:
Additional experience with Java or Scala
Exposure to Microsoft Fabric and Cognite Data Fusion (CDF)