We are seeking a Senior Data Engineer to join a highly collaborative, hands-on team in West Hollywood. This is an on-site role for someone who enjoys building, owning, and evolving data platforms end-to-end. You’ll work closely with technical and non-technical stakeholders, take ownership of complex data challenges, and help drive cross-functional initiatives. This role requires someone who is curious, independent, and comfortable rolling up their sleeves to solve foundational and ambiguous problems.
Job Responsibilities:
Design, build, and maintain scalable data pipelines using Python and SQL
Develop and manage cloud-based data infrastructure (GCP, AWS, or Azure)
Own data orchestration workflows (Prefect preferred; Airflow or Dagster experience is transferable)
Implement robust data modeling practices (dbt preferred; strong SQL foundations required)
Reconcile multiple data sources and establish master data management strategies
Partner with stakeholders to define requirements, track deliverables, and run cross-functional projects
Support containerized workloads and data services using Docker and Kubernetes
Evaluate and integrate low-code or no-code data syncing tools where appropriate
Requirements:
Bachelor’s degree or higher in a relevant field
4–7+ years of professional experience in data engineering
Strong Python and SQL skills (including complex joins and performance optimization)
Experience building production-grade data pipelines
Hands-on experience with at least one major cloud platform (GCP, AWS, or Azure)
Excellent communication skills and the ability to work directly with stakeholders
Proven stability in prior roles (a track record of sustained tenure rather than frequent job changes)
Strong critical-thinking skills and comfort tackling foundational data questions
Nice to have:
Experience with AI-driven data projects, agents, or automation frameworks
Exposure to graph or relational graph databases (e.g., Neo4j), ontology layers, or knowledge graphs
Familiarity with sub-agent automation or agent-based system design
Experience designing systems that integrate traditional data engineering with emerging AI workflows