Adtalem Global Education is seeking a Senior Data Engineer to design, develop, and optimize data platforms, pipelines, and governance frameworks that enhance its business intelligence, analytics, and AI capabilities.
Job Responsibilities:
Design, develop, and optimize data platforms, pipelines, and governance frameworks
Enhance business intelligence, analytics, and AI capabilities
Ensure accurate data flows and drive data-driven decision-making across teams
Write production-grade, performant code for data extraction, transformation, and loading (ETL) using SQL and Python
Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL
Build, deploy, and maintain both inbound and outbound data pipelines to integrate diverse data sources
Develop and manage CI/CD pipelines to support continuous deployment of data products
Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, Datastream, and Dataflow, for building scalable data systems
Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
Develop and expose REST APIs for sharing data across teams
Maintain comprehensive documentation of data pipelines, designs, and strategies
Implement data governance policies and ensure data accuracy across all data elements
Manage data stewardship processes, including development and management of bronze, silver, and gold medallion layers within GCP for data quality and consistency
Capture and maintain data lineage and operational metadata for data governance and transparency
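For illustration only, the SQL/Python ETL work described above might look like the following minimal sketch. The table and column names (`raw_enrollments`, `enrollments`, `student_id`, `course`, `score`) are hypothetical and not taken from the posting; SQLite stands in for a production warehouse such as BigQuery.

```python
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw rows, clean them in Python, and load a target table.

    All table and column names here are illustrative assumptions.
    """
    cur = conn.cursor()
    # Extract: pull raw rows with SQL, trimming whitespace at the source.
    rows = cur.execute(
        "SELECT student_id, TRIM(course) AS course, score FROM raw_enrollments"
    ).fetchall()
    # Transform: drop rows with missing or out-of-range scores,
    # and normalize course codes to uppercase.
    clean = [
        (sid, course.upper(), score)
        for sid, course, score in rows
        if score is not None and 0 <= score <= 100
    ]
    # Load: write the cleaned rows into the target table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS enrollments "
        "(student_id INTEGER, course TEXT, score REAL)"
    )
    cur.executemany("INSERT INTO enrollments VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)
```

In a production setup along the lines the posting describes, a step like this would typically be wrapped in an Airflow task or custom operator and scheduled as part of a DAG.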
Requirements:
Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
3 years of experience in data engineering
Experience building data platforms and pipelines
Experience with AWS, GCP or Azure
Experience with SQL and Python for data manipulation, transformation, and automation
Experience with Apache Airflow for workflow orchestration
Experience with data governance, data quality, data lineage, and metadata management
Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
Experience with CI/CD pipelines for continuous deployment and delivery of data products
Experience maintaining technical records and system designs
Experience working in an Agile environment with cross-functional teams
What we offer:
Health, dental, vision, life and disability insurance
401(k) Retirement Program + 6% employer match
Participation in Adtalem’s Flexible Time Off (FTO) Policy