This is a remote 4-month engagement with a strong likelihood of extending to 12 months, and a possible extension up to 18 months. Are you ready to make a significant impact at the forefront of technological innovation? Aquent is partnering with a leading global technology company dedicated to shaping the future through groundbreaking advancements. This organization is renowned for its commitment to developing robust, scalable solutions that empower millions worldwide. We are seeking a highly skilled and passionate individual to join a pivotal team responsible for building and maintaining the foundational data platforms and pipelines that drive critical insights and power next-generation AI initiatives. Your expertise will directly contribute to designing, developing, and maintaining the efficient, reliable data infrastructure that underpins the company’s strategic decisions and product evolution, truly unleashing the power of data.
Job Responsibilities:
Design, build, and maintain scalable data platforms and pipelines utilizing cutting-edge tools and technologies
Collaborate closely with diverse stakeholders to meticulously gather business requirements and translate them into robust technical specifications
Develop and implement sophisticated data models that effectively support advanced analytics and comprehensive reporting needs
Champion data quality and governance by implementing rigorous validation, consistency checks, and reliability measures
Partner with cross-functional teams, including data analysts, data scientists, and business leaders, to deliver high-quality data solutions that meet evolving demands
Continuously monitor and optimize data pipelines for peak performance, scalability, and cost-efficiency
Establish and implement comprehensive monitoring and observability metrics to proactively ensure data quality and detect anomalies within data pipelines
Create clear, comprehensive documentation for data processes and effectively communicate complex technical concepts to both technical and non-technical audiences
Requirements:
A Bachelor’s degree in Computer Science, Engineering, Information Systems, or a closely related field
A minimum of 2 years of professional experience in data engineering, including hands-on work with Python, SQL, Kubernetes, Airflow, and Scala
Demonstrated proficiency in data warehouse management, alongside strong experience in building and maintaining robust data pipelines and ETL processes
Excellent verbal and written communication skills, with the ability to clearly convey technical information to diverse audiences
Proven ability to thrive and contribute effectively within a collaborative, cross-functional team environment
Strong analytical capabilities, including the ability to gather complex business requirements and debug intricate issues across various data systems
Nice to have:
Experience with leading cloud platforms such as AWS, GCP, or Azure
Familiarity with various industry-leading data warehousing technologies
Knowledge of data governance and data security best practices