As a Data Engineer at Robert Half, you will be the backbone of our data-driven decision-making process. You won't just be "moving data"; you will be architecting the flow of information that powers our localized market analytics and global recruitment engines. In the DC market, this often means handling high-compliance data environments and integrating cutting-edge AI frameworks into traditional ETL workflows.
Job Responsibilities:
Pipeline Architecture: Design, build, and optimize scalable ETL/ELT pipelines using Python and SQL to move data from various internal and external sources into our central Snowflake warehouse (a minimal pipeline sketch follows this list)
Modern Orchestration: Manage and schedule complex workflows using tools like Apache Airflow or Dagster to ensure 99.9% data availability for our analysts (see the Airflow sketch below)
GenAI Integration: Help build "GenAI Ops" frameworks, ensuring that the data feeding our internal Large Language Models (LLMs) is clean, tokenized, and ethically sourced (see the preprocessing sketch below)
Data Modeling: Develop robust dimensional models (Star/Snowflake schemas) that simplify complex recruitment and financial datasets for executive dashboards (see the schema sketch below)
Performance Tuning: Monitor and optimize query performance and storage costs within AWS or Azure environments (see the monitoring sketch below)
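The following is a minimal sketch of the kind of ETL pipeline this role owns, assuming pandas, requests, and the Snowflake Python connector; the source URL, environment variables, warehouse/database/schema names, column names, and the JOB_ORDERS_RAW staging table are all hypothetical placeholders, not Robert Half systems.

```python
"""Minimal ETL sketch: pull rows from a source API, clean them with pandas,
and bulk-load them into Snowflake. All names are illustrative placeholders."""
import os

import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

SOURCE_URL = "https://example.com/api/job-orders"  # hypothetical endpoint


def extract() -> pd.DataFrame:
    """Extract: fetch raw records from the upstream API."""
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: normalize types and drop obvious duplicates.
    Column names here are assumed for the example."""
    df = df.drop_duplicates(subset=["order_id"])
    df["created_at"] = pd.to_datetime(df["created_at"], utc=True)
    df["salary_usd"] = pd.to_numeric(df["salary_usd"], errors="coerce")
    return df


def load(df: pd.DataFrame) -> None:
    """Load: bulk-write the cleaned frame into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        account=os.environ["SF_ACCOUNT"],
        warehouse="ANALYTICS_WH",  # placeholder warehouse/database/schema
        database="RECRUITING",
        schema="STAGING",
    )
    try:
        # Assumes the staging table already exists in Snowflake.
        write_pandas(conn, df, table_name="JOB_ORDERS_RAW")
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract()))
```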
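Next, a minimal orchestration sketch using Airflow's TaskFlow API, assuming Airflow 2.4+ (where the `schedule` argument replaced `schedule_interval`); retries and a fixed daily schedule are one common lever for pushing toward a 99.9% availability target. The DAG id and task bodies are invented stubs.

```python
"""Minimal Airflow 2.x sketch: wire stubbed extract/load steps into a daily
DAG with retries. DAG and task names are illustrative, not real pipelines."""
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    dag_id="job_orders_daily",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    # Automatic retries help absorb transient source/warehouse failures.
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def job_orders_daily():
    @task
    def extract() -> list[dict]:
        return [{"order_id": 1}]  # stub: call the real source here

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")  # stub: write to Snowflake here

    load(extract())


job_orders_daily()
```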
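For the GenAI side, an illustrative preprocessing sketch: scrub obvious PII with regexes, then count tokens with tiktoken before text reaches an LLM. The patterns and the encoding choice are assumptions for the example, not a production-grade PII detector.

```python
"""Illustrative 'GenAI Ops' preprocessing: redact emails and US-style phone
numbers, then count tokens so prompts stay within a model's context window."""
import re

import tiktoken

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")


def scrub_pii(text: str) -> str:
    """Replace emails and phone numbers with placeholder tags."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    return PHONE_RE.sub("[PHONE]", text)


def token_count(text: str, encoding: str = "cl100k_base") -> int:
    """Count tokens with tiktoken; the encoding name is an assumption."""
    return len(tiktoken.get_encoding(encoding).encode(text))


raw = "Contact Jane at jane.doe@example.com or 555-123-4567."
clean = scrub_pii(raw)
print(clean, token_count(clean))
```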
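A sketch of what a star-schema dimensional model for placement data might look like, expressed as Snowflake DDL inside a small Python deploy helper; every table and column name here is invented for illustration.

```python
"""Star-schema sketch: two dimension tables and one fact table, with
surrogate keys on the dimensions. All names are invented for illustration."""
DDL = """
CREATE TABLE IF NOT EXISTS dim_candidate (
    candidate_key INTEGER IDENTITY PRIMARY KEY,
    candidate_id  VARCHAR NOT NULL,
    skill_group   VARCHAR,
    home_market   VARCHAR
);

CREATE TABLE IF NOT EXISTS dim_date (
    date_key   INTEGER PRIMARY KEY,  -- e.g. 20240115
    full_date  DATE NOT NULL,
    fiscal_qtr VARCHAR
);

CREATE TABLE IF NOT EXISTS fact_placement (
    placement_key INTEGER IDENTITY PRIMARY KEY,
    candidate_key INTEGER REFERENCES dim_candidate(candidate_key),
    date_key      INTEGER REFERENCES dim_date(date_key),
    fee_usd       NUMBER(12, 2),
    days_to_fill  INTEGER
);
"""


def deploy(cursor) -> None:
    """Run each statement against an already-open Snowflake cursor.
    Naive semicolon split; fine here because no literal contains ';'."""
    for stmt in DDL.split(";"):
        if stmt.strip():
            cursor.execute(stmt)
```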
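Finally, a sketch of a performance/cost check against Snowflake's standard ACCOUNT_USAGE.QUERY_HISTORY view. The 60-second threshold and 7-day window are illustrative choices, and note that ACCOUNT_USAGE data can lag real time by up to roughly 45 minutes.

```python
"""Cost/performance sketch: surface the slowest recent queries from
Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view so they can be tuned."""
SLOW_QUERIES_SQL = """
SELECT query_id,
       warehouse_name,
       total_elapsed_time / 1000 AS elapsed_s,
       bytes_scanned
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
  AND total_elapsed_time > 60000  -- longer than 60 seconds (milliseconds)
ORDER BY total_elapsed_time DESC
LIMIT 20
"""


def report_slow_queries(conn) -> None:
    """Print the worst offenders from an open Snowflake connection."""
    cur = conn.cursor()
    for qid, wh, secs, scanned in cur.execute(SLOW_QUERIES_SQL).fetchall():
        print(f"{qid} on {wh}: {secs:.1f}s, {scanned} bytes scanned")
```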
Requirements:
Proven experience designing, building, and optimizing scalable ETL/ELT pipelines with Python and SQL
Hands-on experience managing and scheduling complex workflows with orchestrators such as Apache Airflow or Dagster